Google gives Android depth sensing and object occlusion with ARCore 1.18
Google's Depth API in action. [credit: Google]
The latest version of ARCore, Google's augmented reality developer platform for Android phones, now includes a depth API. The API was released as a preview back in December, but now it's live for everyone in ARCore 1.18.
Previously, ARCore would map out walls and floors and scale AR objects accordingly, but the Depth API enables things like occlusion: letting AR objects appear to be behind objects in the real world. The other big feature enabled by depth sensing is simulated physics, like tossing a virtual object down a real-life staircase and having it bounce around realistically.
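Under the hood, ARCore 1.18 exposes depth through Frame.acquireDepthImage(), which returns an android.media.Image in Android's DEPTH16 format: each 16-bit sample packs a depth value in millimeters into the lower 13 bits and a confidence value into the upper 3 bits. A minimal sketch of decoding one such sample, following the documented DEPTH16 bit layout (the class name and sample value here are illustrative, not from ARCore itself):

```java
public class Depth16Decoder {
    // Lower 13 bits of a DEPTH16 sample hold the distance in millimeters.
    public static int depthMillimeters(short sample) {
        return sample & 0x1FFF;
    }

    // Upper 3 bits hold confidence: 0 means full confidence per the
    // DEPTH16 docs; values 1..7 map linearly onto (value - 1) / 7.
    public static float confidence(short sample) {
        int bits = (sample >> 13) & 0x7; // mask discards sign-extension bits
        return bits == 0 ? 1.0f : (bits - 1) / 7.0f;
    }

    public static void main(String[] args) {
        // Hypothetical sample: confidence bits 111, depth bits 0x7D0 (2000 mm).
        short sample = (short) 0xE7D0;
        System.out.println(depthMillimeters(sample)); // 2000
        System.out.println(confidence(sample));       // about 0.857
    }
}
```

Occlusion then reduces to a per-pixel comparison: if the virtual object's depth at a pixel is greater than the decoded real-world depth, that pixel of the object is hidden.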
3D sensing

While Apple is building more advanced hardware into its devices for augmented reality, namely the lidar sensor in the iPad Pro, ARCore has typically been designed to work with the lowest common denominator in camera hardware. In the past, that has meant ARCore uses only a single camera, even though most Android phones, even cheap ~$100 models, ship with multiple cameras that could help with 3D sensing. (Qualcomm deserves some of the blame here, since its SoCs have often supported running only a single camera at a time.)