App developers will be able to capture and use depth data in photos in iOS 11. That should mean super-cool photo filters for the rest of us!
Note: If you're not a registered member of Apple's Developer Program, you won't be able to download the sample code mentioned in this article. That said, you can still use this article to learn a little more about what's to come in iOS 11!
When Apple introduced iPhone 7 Plus and its dual-camera system, it also introduced Portrait Mode. Portrait Mode uses the dual-camera system on iPhone 7 Plus to capture depth data when it takes a photo, then uses that data to add a pleasing, DSLR-style blur effect to the result. Outside of Portrait Mode's nifty blur, though, we haven't been able to do much else with the depth data captured by our phones. Come iOS 11, developers will be able to read embedded depth data and use it in their own apps.
Blurring out the background of a photo is but one option available when you've got access to a photo's depth. An app might let you add a photo filter, adjust the saturation, or make transparent a specific depth layer within a photograph. There are loads of creative possibilities and I'm looking forward to seeing what developers come up with!
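For the developers reading along: here's a minimal sketch of how an app might read the depth data embedded in a photo file under iOS 11, using the ImageIO and AVFoundation APIs Apple covered at WWDC. The file URL is a placeholder, and error handling is pared down for brevity.

```swift
import AVFoundation
import ImageIO

// Sketch: pull embedded depth data out of a photo file in iOS 11.
// Dual-camera photos typically embed *disparity* (inverse depth)
// as an auxiliary image alongside the main photo.
func depthData(forPhotoAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    // AVDepthData wraps the raw map plus its metadata (format, accuracy, etc.).
    return try? AVDepthData(fromDictionaryRepresentation: info)
}
```

Once you have an `AVDepthData`, its `depthDataMap` property hands you a per-pixel buffer you can feed into filters, masks, or whatever creative effect you dream up.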
Apple occasionally offers updates to iOS, watchOS, tvOS, and macOS as closed developer previews or public betas for iPhone, iPad, Apple TV and Mac (sadly, no public beta for the Apple Watch). While the betas contain new features, they also contain pre-release bugs that can prevent the normal use of your iPhone, iPad, Apple Watch, Apple TV, or Mac, and are not intended for everyday use on a primary device. That's why we strongly recommend staying away from developer previews unless you need them for software development, and using the public betas with caution. If you depend on your devices, wait for the final release.
How to capture and visualize photo depth in iOS 11
If you're a registered member of Apple's Developer Program, you can get an early look at what's possible with depth data in iOS 11. Apple's Brad Ford hosted a session at WWDC 2017 called "Capturing Depth in iPhone Photography" where he explained not only how the dual-camera system captures depth in photographs, but also how developers can make use of it.
You should start by checking out the session, which is chock-full of math (gasp!) and groan-worthy depth puns:
After you've gotten your fill, it's time to kick the tires on depth data in iOS 11. Apple has provided a couple of sample apps to help you both capture and visualize depth data in photographs. They'll give you an idea of what depth information is available and the ways it can be manipulated.
First, download AVCam and install it on your developer device running iOS 11. Remember, it's gotta be an iPhone 7 Plus to capture depth data.
Launch AVCam on your iPhone 7 Plus.
Tap Depth Data Delivery: Off so that it says Depth Data Delivery: On. This will include depth data in each photo you take in AVCam.
Snap a few photos by tapping the word Photo at the bottom of the app.
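Under the hood, AVCam's Depth Data Delivery toggle maps to a pair of opt-ins on the capture pipeline. Here's a hedged sketch of the relevant iOS 11 API calls, assuming you've already configured an `AVCaptureSession` with the back dual camera (the only hardware that supports depth delivery):

```swift
import AVFoundation

// Sketch: opting in to depth data delivery when capturing a photo.
let photoOutput = AVCapturePhotoOutput()
// ... add the dual-camera input and this output to your session, then:
if photoOutput.isDepthDataDeliverySupported {
    // Opt in once on the output, before any capture.
    photoOutput.isDepthDataDeliveryEnabled = true
}

let settings = AVCapturePhotoSettings()
// Opt in per photo as well; the AVCapturePhoto handed to your delegate
// will then carry an AVDepthData in its depthData property.
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
// photoOutput.capturePhoto(with: settings, delegate: yourDelegate)
```

Note that both the output-level and settings-level flags have to be set, which is why the toggle in AVCam flips depth on for every photo you take afterward.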
Photos with depth data work a little differently from photos taken in Portrait Mode. You can use the Photos app in iOS 11 to tell the difference.
Launch the Photos app in iOS 11.
Tap a photo that was shot using Portrait Mode or using the AVCam sample app.
If the photo was taken in Portrait Mode, you'll see a label in the top left corner of the screen that says DEPTH EFFECT.
If the photo includes embedded depth data, you'll need to tap the Edit button (looks like three sliders) at the bottom of the app. You'll see a label in the top middle area of the screen that says DEPTH. You can tap it to enable and disable the Portrait Mode effect.
How to use WiggleMe to see depth data in all its glory
Enabling and disabling the Portrait Mode effect is one way to visualize depth data in iOS 11, but there's a better, more hilarious way. Apple has created a sample app called WiggleMe that uses depth data to separate the background of a photo from the foreground and add an interesting 3D wiggle effect.
First, download WiggleMe and install it on your developer device running iOS 11, then launch it.
Tap on one of the photos you took using AVCam (the ones that have embedded depth data).
The app will immediately display a wiggling, depth-adjusted animation of your photograph. It pulls out the foreground from the background and extrapolates the visual info in between to give the image a sort of 3D look. It's pretty awesome! You can also pinch and zoom on the screen while the animation is playing to increase or decrease the intensity of the depth effect.
Tap the photo once to switch off the automatic animation. The app will use your phone's gyroscope to trigger movement.
Tap the photo twice to stop the animation and choose a new photo.
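If you'd rather roll your own visualization than wiggle, the simplest starting point is rendering the depth map itself as a grayscale image. A minimal sketch, assuming `depthData` came from a capture or a photo file as shown earlier:

```swift
import AVFoundation
import CoreImage

// Sketch: turn an AVDepthData map into a viewable grayscale CIImage.
func depthPreview(from depthData: AVDepthData) -> CIImage {
    // Normalize the buffer format first: 32-bit float disparity keeps
    // the per-pixel values easy to reason about and filter.
    let floatDepth = depthData.converting(
        toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    // The map is a CVPixelBuffer; Core Image can render it directly.
    return CIImage(cvPixelBuffer: floatDepth.depthDataMap)
}
```

Nearer objects show up brighter in a disparity map, so the preview doubles as a quick sanity check that your foreground/background separation will land where you expect.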
There's obviously a lot more you can do with depth data in iOS 11, but WiggleMe gives you a quirky first look at the new feature. If you're registered with Apple's Developer Program, consider downloading the sample apps and using them as inspiration for creating your own photo apps that take advantage of depth data!
I'm looking forward to seeing how apps like Instagram and Snapchat use depth data come iOS 11. I'm also looking forward to the dozens of unique and interesting implementations independent developers will dream up for their own apps. The thought of removing a boring background and subbing in an awesome one with just a few taps, making my puppers the center of attention in an otherwise busy photograph, or adding a subtle contrast change to the foreground of a photo has me eagerly awaiting the arrival of iOS 11. What about you?