iOS 12 will unlock TrueDepth camera app magic

Third-party photo and video apps look set to get far more capable portrait and depth-related effects with iOS 12, as Apple significantly refines the data it shares from its iPhone camera arrays. The feature, previewed in a WWDC 2018 session this week in San Jose, CA, is based on iOS 12's new Portrait Matte, a dramatic improvement over the depth data third-party apps can currently access.

iOS 11 offered developers access to the raw depth data from the dual rear cameras on the iPhone 8 Plus and iPhone X, and from the TrueDepth camera array on the front of the iPhone X. That adds depth data to each pixel, alongside its color and brightness. Apps that wanted to replicate something like Apple's Portrait mode photography – which blurs the background while keeping the subject crisp – could use that depth data to separate foreground from background.
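In code, that access runs through AVFoundation's photo capture pipeline. Here's a minimal sketch – assuming a capture session already configured with one of those depth-capable devices – of opting into depth delivery:

```swift
import AVFoundation

// A minimal sketch, assuming `photoOutput` is an AVCapturePhotoOutput already
// attached to a session whose input is a dual-camera or TrueDepth device.
func capturePhotoWithDepth(using photoOutput: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    // Depth delivery has to be switched on at the output level first...
    guard photoOutput.isDepthDataDeliverySupported else { return }
    photoOutput.isDepthDataDeliveryEnabled = true

    // ...and then requested per shot.
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = true
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

// The per-pixel depth then arrives alongside the image in the capture delegate:
//
// func photoOutput(_ output: AVCapturePhotoOutput,
//                  didFinishProcessingPhoto photo: AVCapturePhoto,
//                  error: Error?) {
//     let depth: AVDepthData? = photo.depthData
// }
```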

Portrait Matte versus Depth Data

In practice, though, the results from that raw depth data could be middling. Developers were left to figure out for themselves where the edges of the subject ended and the background began. Feathering around that line could help capture smaller details, like a portrait subject's hair, but would typically pull unwanted background detail in along with them. Adding to the problem was the relatively low resolution of the depth map: around 0.5 megapixels, a long way short of the resolution of the image itself.
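As an illustration of that do-it-yourself masking, here's a rough sketch of the threshold-and-feather approach applied to the raw disparity map – the cutoff and feather values are arbitrary placeholders a developer would have to tune:

```swift
import AVFoundation

// Illustrative sketch of the iOS 11-era approach: pick a disparity cutoff
// and feather a band around it. Cutoff and feather are placeholder values.
func makeMask(from depthData: AVDepthData,
              cutoff: Float = 0.5,
              feather: Float = 0.1) -> [Float] {
    // Normalize to 32-bit disparity so every pixel reads as a Float.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    let map = converted.depthDataMap

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let rowBytes = CVPixelBufferGetBytesPerRow(map)
    let base = CVPixelBufferGetBaseAddress(map)!

    var mask = [Float](repeating: 0, count: width * height)
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            // Higher disparity = closer to the camera. Ramp from 0 to 1
            // across the feather band so the edge fades instead of jagging --
            // at the cost of letting background pixels bleed into the mask.
            mask[y * width + x] = min(max((row[x] - cutoff) / feather, 0), 1)
        }
    }
    return mask
}
```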

Come iOS 12, however, Apple is offering a second route. The raw depth data will still be available from still image captures, but if a person is detected in the frame, developers will also be offered a portrait matte: a mask with far greater detail around those troublesome edges. In Apple's demo, the portrait matte was able to pick out almost individual strands of hair.
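Requesting the matte follows the same opt-in pattern as depth data. A hedged sketch using the AVFoundation additions Apple previewed – note that matte delivery rides on top of depth delivery, so both get enabled:

```swift
import AVFoundation

// Sketch of opting into the iOS 12 portrait matte. Assumes `photoOutput` is
// attached to a session with a depth-capable camera.
func capturePhotoWithMatte(using photoOutput: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    guard photoOutput.isDepthDataDeliverySupported,
          photoOutput.isPortraitEffectsMatteDeliverySupported else { return }
    photoOutput.isDepthDataDeliveryEnabled = true
    photoOutput.isPortraitEffectsMatteDeliveryEnabled = true

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = true
    settings.isPortraitEffectsMatteDeliveryEnabled = true
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

// In didFinishProcessingPhoto, the mask arrives as an AVPortraitEffectsMatte
// whose mattingImage is a single-channel pixel buffer -- but only when a
// person was detected in the frame:
//
// let matte: CVPixelBuffer? = photo.portraitEffectsMatte?.mattingImage
```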

Not only is the matte far more accurate than feathering raw depth data, it's also higher resolution: 2016 x 1512 from the rear cameras, and 1544 x 1160 from the TrueDepth camera. That's still below the resolution of the full image, but only by around half.

With the portrait matte, developers will have far more flexibility in producing interesting portrait-style shots. You could have a brighter subject on a desaturated background, for example – sketched below – or replace the background altogether with something different. iOS 12 won't limit that to just still shots, however.
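As a rough sketch of that first effect before moving on to video: the matte is just a mask, so Core Image's stock mask blend does the heavy lifting. `photoImage` and `matteImage` here are assumed to be CIImages built from the captured photo and its `portraitEffectsMatte.mattingImage`:

```swift
import CoreImage

// Minimal sketch of the "color subject, grayscale background" effect.
func desaturateBackground(of photoImage: CIImage,
                          matte matteImage: CIImage) -> CIImage {
    // The matte is delivered at a lower resolution than the photo,
    // so stretch it up to match.
    let scaleX = photoImage.extent.width / matteImage.extent.width
    let scaleY = photoImage.extent.height / matteImage.extent.height
    let scaledMatte = matteImage.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // A fully desaturated copy of the shot serves as the "background".
    let gray = photoImage.applyingFilter("CIColorControls",
                                         parameters: [kCIInputSaturationKey: 0.0])

    // White areas of the matte keep the original (the subject); black areas
    // fall through to the grayscale version.
    return photoImage.applyingFilter("CIBlendWithMask", parameters: [
        kCIInputBackgroundImageKey: gray,
        kCIInputMaskImageKey: scaledMatte
    ])
}
```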

Bring on the Green Screen

In fact, it will also grant developers access to the TrueDepth camera array on the front of the iPhone X – expected to be part of all the new iPhones Apple is set to launch later this year, and indeed of an upcoming iPad – for video. They'll be offered a higher-fidelity depth map, which codes the closest pixels as red, the furthest as blue, and a spectrum of colors in-between. Pixels with no depth data, such as reflections off shiny surfaces, show up as black.

As with the portrait matte, the depth map for TrueDepth video won't match the maximum Full HD resolution the camera can capture. Instead, it'll be 640 x 480 for 4:3 video, or 640 x 360 for 16:9; either way, it covers the whole frame. Developers will be able to adjust the frame rate from 30fps down: since depth streaming is fairly system-intensive, they'll be responsible for making sure they don't set the frame rate so high that the iPhone slows down or the app crashes altogether.
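Wiring that up means adding the streaming depth output and throttling it. A rough sketch, assuming a running capture session whose input `device` is the TrueDepth camera, with an arbitrary 15fps cap as the placeholder budget:

```swift
import AVFoundation

// Rough sketch: attach a streaming depth output and cap its frame rate.
func addDepthStream(to session: AVCaptureSession,
                    device: AVCaptureDevice,
                    delegate: AVCaptureDepthDataOutputDelegate,
                    queue: DispatchQueue) throws {
    let depthOutput = AVCaptureDepthDataOutput()
    guard session.canAddOutput(depthOutput) else { return }
    session.addOutput(depthOutput)

    // Filtering smooths temporal noise and fills small holes, at extra cost.
    depthOutput.isFilteringEnabled = true
    depthOutput.setDelegate(delegate, callbackQueue: queue)

    // Throttle depth below the video rate so the pipeline doesn't starve
    // the rest of the app -- this is the budgeting Apple leaves to you.
    try device.lockForConfiguration()
    device.activeDepthDataMinFrameDuration = CMTime(value: 1, timescale: 15)
    device.unlockForConfiguration()
}

// Depth frames then land on the delegate:
//
// func depthDataOutput(_ output: AVCaptureDepthDataOutput,
//                      didOutput depthData: AVDepthData,
//                      timestamp: CMTime,
//                      connection: AVCaptureConnection) { ... }
```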

This RGB-D – red/green/blue plus depth – data will open the door to a number of effects. In Apple's demo app, for example, a green-screen effect was possible: the background of the frame was replaced in real-time with a different image, letting you record video of yourself in front of an entirely different backdrop. We've seen Apple use that trick before, in its own Clips 2.0 app.
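A bare-bones take on that composite: squash the streamed disparity map into a grayscale mask on the GPU, then mask-blend the live frame over the replacement backdrop. The scale and bias values below are placeholders you'd tune (or derive from a tap-to-pick-the-subject gesture):

```swift
import AVFoundation
import CoreImage

// Rough green-screen sketch over live RGB-D frames. `frame` is the camera
// image, `backdrop` the replacement background; both are CIImages.
func composite(frame: CIImage,
               depthData: AVDepthData,
               backdrop: CIImage) -> CIImage {
    let disparity = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    var mask = CIImage(cvPixelBuffer: disparity.depthDataMap)

    // Scale and bias the disparity into a rough 0-1 range, then clamp.
    // Placeholder coefficients: near pixels go white, far pixels black.
    mask = mask.applyingFilter("CIColorMatrix", parameters: [
        "inputRVector": CIVector(x: 4, y: 0, z: 0, w: 0),
        "inputGVector": CIVector(x: 4, y: 0, z: 0, w: 0),
        "inputBVector": CIVector(x: 4, y: 0, z: 0, w: 0),
        "inputBiasVector": CIVector(x: -1, y: -1, z: -1, w: 0)
    ]).applyingFilter("CIColorClamp")

    // Stretch the low-res mask to cover the full video frame.
    let sx = frame.extent.width / mask.extent.width
    let sy = frame.extent.height / mask.extent.height
    mask = mask.transformed(by: CGAffineTransform(scaleX: sx, y: sy))

    return frame.applyingFilter("CIBlendWithMask", parameters: [
        kCIInputBackgroundImageKey: backdrop,
        kCIInputMaskImageKey: mask
    ])
}
```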

The power of the Point Cloud

However, as Apple is also supplying 3D point cloud data from the TrueDepth camera, a video app could virtually change the angle of the shot while the iPhone itself remains fixed. There's only a certain degree of freedom before large gaps open in the point cloud – where depth data isn't available because the TrueDepth camera can't see that portion of the space – but the effect can still be dramatic.
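Conceptually, the point cloud falls out of the depth map plus the camera calibration data Apple delivers alongside it: each pixel gets unprojected along its camera ray. A simplified sketch, ignoring lens distortion:

```swift
import AVFoundation
import simd

// Unproject a metric depth map into camera-space 3D points. Assumes the
// AVDepthData carries calibration data; holes (no depth) are skipped.
func pointCloud(from depthData: AVDepthData) -> [simd_float3] {
    guard let calibration = depthData.cameraCalibrationData else { return [] }
    let intrinsics = calibration.intrinsicMatrix
    let refSize = calibration.intrinsicMatrixReferenceDimensions

    // Normalize to 32-bit metric depth so every pixel reads as meters.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = converted.depthDataMap
    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let rowBytes = CVPixelBufferGetBytesPerRow(map)
    let base = CVPixelBufferGetBaseAddress(map)!

    // Intrinsics are specified for the full-resolution frame; rescale the
    // focal length and principal point down to the depth map's resolution.
    let scale = Float(width) / Float(refSize.width)
    let fx = intrinsics[0][0] * scale, fy = intrinsics[1][1] * scale
    let cx = intrinsics[2][0] * scale, cy = intrinsics[2][1] * scale

    var points: [simd_float3] = []
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let z = row[x]
            guard z.isFinite, z > 0 else { continue }   // holes stay holes
            points.append(simd_float3((Float(x) - cx) * z / fx,
                                      (Float(y) - cy) * z / fy,
                                      z))
        }
    }
    return points
}
```

Re-rendering those points from a shifted virtual camera is what produces the angle-change effect; the gaps the article mentions are simply the pixels this function skips.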

While Apple has a couple of sample apps for developers to play with, it'll be up to the coders themselves to make software that actually ends up in the App Store. Still, given the number of apps already available that offer different portrait mode effects, it seems likely that iOS 12 will bring with it a rush of new computational photography apps that take advantage of the new portrait matte and video 3D depth maps.