FaceTime in iOS 13 will use AR to fix video call eye-contact

Video chat like FaceTime is a useful feature for a lot of people, but it can paradoxically end up feeling a little impersonal for one simple reason: you're hardly ever making actual eye contact with the person you're talking to. Making eye contact would require looking into the camera, but most people look at their display instead, which makes their eyes appear off-center.

Apple, it turns out, has a fix for that in iOS 13 beta 3. A new feature called FaceTime Attention Correction is present in that beta, and it does something admittedly a little strange. As discovered by Mike Rundle on Twitter, the feature's description simply says, "Your eye contact with the camera will be more accurate during FaceTime Video calls."

That doesn't really tell us much, but further down in that thread, Rundle says he tested the feature with Will Sigmon and that it actually worked. "Looking at him on-screen (not at the camera) produces a picture of me looking dead at his eyes like I was staring at the camera," Rundle said. "This is insane."

Sigmon posted images of the effect on his own Twitter feed, which you can see above. So, how does it all work? For that we turn to yet another Twitter post, this one by Dave Schukin, who put together a quick video showing that FaceTime Attention Correction "uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly." In the video, Schukin passes one of the arms of a pair of glasses in front of his face, and we can see it warp as it moves in front of his nose and eyes, evidence that the image is being manipulated in real time.
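Apple hasn't documented how the correction is implemented, but ARKit's face-tracking API does expose the raw ingredients Schukin describes: a per-frame, depth-derived face mesh plus eye and gaze transforms. The Swift sketch below is purely illustrative (the FaceTrackingDelegate class is a hypothetical name, not anything Apple ships); it shows how an app could read that data, which a renderer would then need in order to warp the eye regions.

```swift
import ARKit

// Hypothetical sketch: reads the ARKit face-tracking data an eye-contact
// correction could plausibly build on. Not Apple's actual implementation.
class FaceTrackingDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Dense face mesh derived from the depth camera, one vertex
            // per point in the face's local coordinate space.
            let mesh: ARFaceGeometry = face.geometry
            print("mesh vertices:", mesh.vertices.count)

            // Per-eye transforms and an estimated gaze point. The offset
            // between the gaze and the camera axis is exactly what an
            // eye-warping step would need to correct.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint
            print(leftEye, rightEye, gaze)
        }
    }
}
```

The actual warping step, remapping the pixels around each eye so the irises appear to point at the camera, isn't exposed by any public API; Schukin's glasses test simply reveals that some such image-space adjustment is happening.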

So, in the end, it seems Apple is using a bit of augmented reality magic to make it look as if you're gazing into the camera during FaceTime calls even when you're looking at the display. For now the feature is only available to developers running iOS 13 beta 3, but hopefully we'll see it arrive in a public beta soon.