Snapchat among first to leverage iPhone 12 Pro’s LiDAR Scanner for AR

by Sarah Perez, from Crunch Hype

Apple introduced its latest flagship iPhone models, the iPhone 12 Pro and 12 Pro Max, at its iPhone event on Tuesday. Among other things, the devices sport a new LiDAR Scanner designed to allow for more immersive augmented reality (AR) experiences. Snapchat today confirmed it will be among the first to put the new technology to use, with a lidar-powered Lens in its iOS app.

As Apple explained during the event, the LiDAR (Light Detection And Ranging) Scanner measures how long it takes for light to reach an object and reflect back.
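To make the arithmetic concrete, here is a minimal, illustrative sketch of the time-of-flight idea; the numbers are hypothetical and this is not Apple's actual implementation, just the basic relationship between round-trip time and distance.

```swift
// Illustrative time-of-flight estimate (not Apple's pipeline):
// distance = (speed of light * round-trip time) / 2
let speedOfLight = 299_792_458.0                 // meters per second
let roundTripTime = 20e-9                        // hypothetical round trip: 20 nanoseconds
let distance = speedOfLight * roundTripTime / 2  // halved for the one-way distance, ≈ 3 meters
print("Estimated distance: \(distance) meters")
```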


Along with the iPhone's machine learning capabilities and developer frameworks, lidar helps the iPhone understand the world around you.

Apple adapted this technology for its iPhone 12 Pro models, where it's helping to improve low-light photography, thanks to its ability to "see in the dark."

Image Credits: Apple presentation, screenshot via TechCrunch

App developers can also use the technology to build a precise depth map of the scene, helping AR feel more instantaneous and enabling new kinds of app experiences that use AR.

In practice, this means app developers can use lidar to enable things like object and room scanning - think better AR shopping apps, home design tools or AR games.
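On the developer side, this kind of room scanning surfaces through ARKit's scene reconstruction support, which is only available on LiDAR-equipped devices. A minimal sketch of opting into it might look like the following; the surrounding app setup is assumed, and a real app would use the ARView already on screen rather than creating one here.

```swift
import ARKit
import RealityKit

// Sketch: enable LiDAR-backed scene reconstruction in ARKit.
let arView = ARView(frame: .zero)  // in a real app, the ARView shown on screen
let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction (a live 3D mesh of the room) requires the LiDAR Scanner.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

arView.session.run(configuration)
```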

It can also enable photo and video effects and more exact placement of AR objects, as the iPhone is actually able to "see" a depth map of the room.
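That per-frame depth map is what makes effects like occlusion possible, so virtual objects can disappear behind real ones. A hedged sketch using ARKit's sceneDepth frame semantics and RealityKit's scene-understanding occlusion option (both LiDAR-only features) could look like this:

```swift
import ARKit
import RealityKit

// Sketch: request per-frame depth data and let real geometry occlude virtual content.
let arView = ARView(frame: .zero)  // in a real app, the ARView shown on screen
let configuration = ARWorldTrackingConfiguration()

// sceneDepth delivers a depth map (and confidence map) with every ARFrame.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

// Hide virtual objects when they move behind real-world surfaces.
arView.environment.sceneUnderstanding.options.insert(.occlusion)

arView.session.run(configuration)

// Later, in an ARSessionDelegate callback, the depth data is available as
// frame.sceneDepth?.depthMap (a CVPixelBuffer).
```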

Image Credits: Apple presentation, screenshot via TechCrunch

That can lead to new AR experiences like the one Snapchat is preparing to introduce. Already known for its best-in-class AR photo filters, the company says it will soon launch a lidar-powered Lens specifically for the iPhone 12 Pro models.

Apple gave a brief peek at Snapchat's lidar-powered feature during the lidar portion of the iPhone event today.

Here, you can see an AR Lens in the Snapchat app where flowers and grasses cover the table and floor, and birds fly toward the user's face. The grasses toward the back of the room appear farther away than those closer to the user, and vegetation even climbs up and around the kitchen cabinets - an indication that the Lens knows where those objects are in the physical space.

The birds in the Snapchat Lens disappear as they move behind the person, out of view, and even land precisely in the person's hand.

We understand this is the exact Lens Snapchat has in the works, though the company is holding back further details for the time being. Still, it shows what a lidar-enabled Snapchat experience would feel like.

You can see the Snapchat filter in action at 59:41 in the Apple iPhone Event video.

Updated, 10/13/20, 4:47 PM ET with confirmation that the Lens shown during the event is the one that will launch.
