Google has previously teased single-camera depth mapping for augmented reality. Now the ARCore Depth API is publicly launching on Android, with several third-party apps already taking advantage of the technology.
Google uses depth-from-motion algorithms to generate a depth map, and instead of requiring special sensors or multiple lenses, all that's needed is a single RGB camera.
The depth map is created by capturing multiple images from different angles as you move your phone and comparing them to estimate the distance to every pixel.
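The comparison described above relies on parallax: as the camera moves, nearby points shift more between frames than distant ones. Here is a toy sketch of that idea using a simplified pinhole-camera model; the function name and parameters are illustrative assumptions, not part of the ARCore API.

```python
# Toy illustration of the depth-from-motion principle: with a pinhole
# camera that translates sideways by a known baseline b, a point at
# depth Z shifts between frames by a pixel disparity d = f * b / Z,
# so depth can be recovered as Z = f * b / d.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate depth in metres from the pixel shift between two frames."""
    if disparity_px <= 0:
        raise ValueError("no parallax observed; depth cannot be resolved")
    return focal_px * baseline_m / disparity_px

# A point that shifts 50 px between frames taken 5 cm apart, with a
# 500 px focal length, sits about half a metre from the camera.
print(depth_from_disparity(500.0, 0.05, 50.0))  # 0.5
```

In practice the algorithm solves this densely for every pixel and fuses estimates over many frames, but the geometric core is this triangulation.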
This ultimately allows digital objects to accurately appear in front of or behind real-world ones. This occlusion, which is important for realistic AR, ensures objects are not just floating in space or virtually placed in a physically impossible position.
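Conceptually, occlusion is a per-pixel depth test: a virtual pixel is drawn only if it is closer to the camera than the real surface at that pixel, as reported by the depth map. The following is a minimal sketch of that test, not ARCore's actual renderer, which performs this comparison in a shader.

```python
# Toy per-pixel occlusion test: composite a virtual object over the
# camera image, keeping a virtual pixel only where its depth is less
# than the real-world depth at the same pixel.

def composite(virtual_depth, real_depth, virtual_color, camera_color):
    """Return the final color per pixel; None depth means no virtual content there."""
    out = []
    for vz, rz, vc, cc in zip(virtual_depth, real_depth, virtual_color, camera_color):
        # Draw the virtual pixel only if virtual content exists and is nearer.
        out.append(vc if vz is not None and vz < rz else cc)
    return out

# Virtual object at 2 m; a couch at 1.5 m covers the middle pixels,
# so the object disappears behind it there.
real = [3.0, 1.5, 1.5, 3.0]
virt = [2.0, 2.0, 2.0, 2.0]
print(composite(virt, real, ["obj"] * 4, ["bg"] * 4))  # ['obj', 'bg', 'bg', 'obj']
```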
Rolling out with version 1.18 of ARCore (Google Play Services for AR), the Depth API will be available on “hundreds of millions of compatible Android devices.” Google first demoed AR in Search with 3D objects, including life-sized animals, cars, shoes, and even astronauts.
Snapchat has already made use of the Depth API to create several Snapchat Lenses, including Dancing Hotdog and a new Android-exclusive Undersea World Lens. In the example below, the hotdog disappears behind the couch.
Meanwhile, the Google Creative Lab created a demo (Lines of Play) that “uses depth information to showcase both occlusion and collisions” with domino tiles.