
Google gives Android depth sensing and object occlusion with ARCore 1.18

by
Ron Amadeo
from Ars Technica
[Image: Google's Depth API in action. Credit: Google]

The latest version of ARCore, Google's augmented-reality developer platform for Android phones, includes a depth API. The API shipped as a preview back in December, and with ARCore 1.18 it's now live for everyone.

Previously, ARCore would map out walls and floors and scale AR objects accordingly; the Depth API adds occlusion, letting AR objects appear to be behind objects in the real world. The other big feature depth sensing enables is simulated physics: you can toss a virtual object down a real-life staircase and have it bounce around realistically.
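The occlusion idea boils down to a per-pixel depth test: a virtual fragment is drawn only where it is closer to the camera than the real-world surface the depth map reports at that pixel. Here is a minimal, self-contained sketch of that comparison — a hypothetical illustration of the technique, not ARCore's actual rendering code (ARCore's 16-bit depth images do report distances in millimeters, which the sketch assumes):

```java
public class DepthOcclusion {
    // Given a real-world depth map (millimeters, one value per pixel) and the
    // virtual object's depth at the same pixels, return a visibility mask:
    // true where virtual content should be drawn, false where real geometry
    // sits in front of it and occludes it.
    static boolean[] occlusionMask(int[] realDepthMm, int[] virtualDepthMm) {
        boolean[] visible = new boolean[realDepthMm.length];
        for (int i = 0; i < realDepthMm.length; i++) {
            visible[i] = virtualDepthMm[i] < realDepthMm[i];
        }
        return visible;
    }

    public static void main(String[] args) {
        // A real wall at 2000 mm across three pixels; a virtual ball at
        // 1500 mm (in front, visible), 2500 mm (behind, hidden), and
        // 2000 mm (coincident, treated as hidden).
        int[] real = {2000, 2000, 2000};
        int[] virt = {1500, 2500, 2000};
        boolean[] mask = occlusionMask(real, virt);
        System.out.println(java.util.Arrays.toString(mask)); // [true, false, false]
    }
}
```

In a real renderer this comparison happens in the fragment shader against the GPU depth buffer, but the decision per pixel is the same one the sketch makes.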

3D sensing

While Apple is building more advanced hardware into its devices for augmented reality, namely the lidar sensor in the iPad Pro, ARCore has typically been designed to work on the lowest common denominator in camera hardware. In the past that has meant ARCore only uses a single camera, even though most Android phones, including cheap ~$100 models, ship with multiple cameras that could help with 3D sensing. (Qualcomm deserves some of the blame here, since its SoCs have often supported running only a single camera at a time.)

