Thursday, April 17, 2014

Google explains technology behind Lens Blur in new Google Camera

With the surprise release of Google Camera yesterday, we got a bunch of features previously unavailable on most Android devices. Photo Sphere, once tied only to Nexus devices, became available to many more phones. But the biggest surprise of all was the brand-new Lens Blur feature, which lets you automatically blur the background of a shot, giving any photo a shallow depth-of-field effect.

Google has posted a technical explanation of how this feature works, and it’s pretty fancy. Lens Blur doesn’t just blur the background and keep the foreground sharp; it blurs each individual pixel by an amount that depends on how far away it was calculated to be. The blur varies with distance, just like the real deal.
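
Google hasn’t shared the app’s code, but to make the idea concrete, here’s a rough sketch of what depth-dependent blurring could look like — not Google’s actual implementation. The image, depth map, focal_depth and max_sigma values are all placeholder inputs; the trick is simply to blur each pixel more the further its estimated depth sits from the plane you want to keep sharp.

```python
# Rough illustration (not Google's code): simulate depth-dependent blur by
# blending several pre-blurred copies of an image according to a depth map.
import numpy as np
from scipy.ndimage import gaussian_filter

def lens_blur(image, depth, focal_depth, max_sigma=8.0, levels=5):
    """Blur each pixel more the farther it sits from the focal plane."""
    # Desired blur radius grows with distance from the chosen focal depth.
    sigma_map = np.clip(np.abs(depth - focal_depth) / depth.max(), 0, 1) * max_sigma

    # Precompute a small stack of uniformly blurred copies of the image...
    sigmas = np.linspace(0, max_sigma, levels)
    stack = np.stack([gaussian_filter(image, s) if s > 0 else image for s in sigmas])

    # ...and pick, per pixel, the level closest to the desired blur amount.
    idx = np.argmin(np.abs(sigmas[:, None, None] - sigma_map[None]), axis=0)
    return np.take_along_axis(stack, idx[None], axis=0)[0]

# Example: a synthetic grayscale image and a depth ramp from near (1 m) to far (10 m).
image = np.random.rand(64, 64)
depth = np.linspace(1.0, 10.0, 64)[None, :].repeat(64, axis=0)
result = lens_blur(image, depth, focal_depth=1.0)
```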

When shooting a photo, you tilt the phone up slightly while keeping the subject centered. That sweep gives the app a series of views of the scene from slightly different positions, so it can tell what shifts between frames and what doesn’t. Because the subject is kept centered, the foreground barely moves, and the more something shifts, the further away it is. The actual process is far more complicated than that, but I’ll let Google explain it better in its blog post. Basically, it builds a sort of 3D model of the scene to estimate how far away each pixel is.
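
To make that parallax idea a bit more concrete, here’s a toy two-view calculation in the same frame of reference the paragraph uses (the subject stays centered, so its own shift is zero). The baseline, focal length and subject distance are invented example numbers; Google’s actual pipeline is a full 3D reconstruction over many frames, not this simple formula.

```python
# Toy version of the parallax intuition: with the subject held centered,
# a point's shift relative to the subject grows with its distance.
# shift = f * b * (1/d_subject - 1/d), so we solve for d.
# All parameter values below are made-up example numbers.

def depth_from_relative_shift(shift_px, subject_depth_m=1.0,
                              baseline_m=0.05, focal_px=2000.0):
    """Estimate depth from how far a point shifted relative to the tracked subject."""
    inv_d = 1.0 / subject_depth_m - shift_px / (focal_px * baseline_m)
    return 1.0 / inv_d

# A point that shifts 0 px sits at the subject's depth (1 m);
# one that shifts 90 px lands at 10 m.
for shift in (0.0, 50.0, 90.0):
    print(shift, "px ->", round(depth_from_relative_shift(shift), 2), "m")
```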


In practice, the feature works quite well, as long as you keep the subject centered. I’ll definitely be enjoying this one, so props to Google for including it in the new Google Camera. The app itself is pretty great, though it still needs a bit of work. Hit the source link to read the full explanation, and tell us whether you like the feature.



source: androidandme
