iOS 13.4 and iPadOS 13.4 aren’t the only big software releases Apple rolled out today. As part of its gift to iPad Pro users, it has also announced the availability of ARKit 3.5. Its biggest new feature is support for LiDAR, which, of course, requires the new 2020 iPad Pro hardware. It may, however, also prefigure what’s coming to this year’s iPhones.
Apple did more than just add a 3D time-of-flight (ToF) depth sensor to this year’s line of iPad Pros. It added an actual Light Detection and Ranging (LiDAR) sensor, one that may make that large square bump on the back of the slate worth its presence. And, as expected, that new sensor is being put at the service of augmented reality, which gets a significant feature boost with today’s update.
ARKit 3.5 brings a new Scene Geometry API that uses the sensor to create a 3D representation of the real world in real time, labeling floors, doors, seats, and more, which is essential for properly placing AR objects in front of or behind physical objects. Thanks to the LiDAR sensor, the new Instant AR feature also lets apps place AR objects in a snap, without having to scan the scene first.
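For developers, turning on the new scene reconstruction is a small configuration change. The sketch below is illustrative, not Apple’s sample code; the `arView` parameter is a hypothetical, already-configured RealityKit view assumed to exist elsewhere in the app:

```swift
import ARKit
import RealityKit

// Sketch: enable ARKit 3.5's classified scene mesh on a LiDAR iPad Pro.
// `arView` is a hypothetical, already-configured RealityKit ARView.
func startSceneReconstruction(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction needs the LiDAR sensor, so check support first;
    // .meshWithClassification also labels surfaces (floor, door, seat, ...).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        configuration.sceneReconstruction = .meshWithClassification
    }

    arView.session.run(configuration)
}
```

On LiDAR hardware, raycasts against this reconstructed mesh resolve almost immediately, which is what makes the instant, scan-free placement possible.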
In addition to these two new capabilities, ARKit 3.5 also updates existing features with more information to improve their accuracy. People Occlusion, for example, gets better depth estimation, while Motion Capture’s height estimation has also been improved.
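People Occlusion remains an opt-in, per-session setting. A minimal sketch (again assuming a hypothetical, already-running `ARView` named `arView`):

```swift
import ARKit
import RealityKit

// Sketch: opt into people occlusion with depth, which ARKit 3.5
// improves on LiDAR hardware. `arView` is assumed to exist elsewhere.
func enablePeopleOcclusion(on arView: ARView) {
    guard let configuration = arView.session.configuration as? ARWorldTrackingConfiguration,
          ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth)
    else { return }

    // Hide virtual content behind detected people using estimated depth.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    arView.session.run(configuration)
}
```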
Many of these new and improved ARKit features naturally require hardware that’s currently exclusive to the iPad Pro. This, however, could hint that the iPhones launching later this year will be equipped with the same or similar hardware, finally bearing out the rumors and leaks about 3D ToF sensors coming to iPhones.