

Published on June 25th, 2020 | by Emergent Enterprise


Triangulating Clues in Apple’s AR Roadmap

Emergent Insight:
Whenever Apple announces its AR product (2021? 2022?), it will already have built a substantial augmented world full of tags, anchors, and experiences that the new device can read. Mike Boland at AR Insider evaluates this groundwork with an overview of Apple’s current AR efforts. It makes sense to leverage Apple Maps and populate a “world” in which Apple devices are already living. Expect all of the tech giants to build augmented worlds that push content at us everywhere, all the time, whether we want it or not.

Original Article:

Announcements at WWDC this week continue to paint a picture of Apple’s long-term AR play. As we examined, that includes AirPods Pro spatial audio — signaling a continued march towards audio AR, and a wearables suite that carries different flavors of sensory augmentation.

But more closely related to AR and its common graphical connotations, Apple announced GeoAnchors for ARKit 4. These evoke AR’s location-based potential by letting users plant and discover spatially-anchored graphics that are persistent across sessions and users.
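For a sense of what this looks like in practice, here is a minimal Swift sketch of planting a geo anchor with ARKit 4’s geo-tracking API. The coordinate and the `arView` instance are placeholders for illustration, not details from Apple’s announcement:

```swift
import ARKit
import RealityKit
import CoreLocation

// Minimal sketch of planting a GeoAnchor; the coordinate is a placeholder (Apple Park).
func startGeoAnchoring(in arView: ARView) {
    // Geo tracking only works on supported devices in regions Apple has mapped.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geo tracking unavailable: \(error?.localizedDescription ?? "unmapped region")")
            return
        }
        DispatchQueue.main.async {
            arView.session.run(ARGeoTrackingConfiguration())

            // Plant an anchor at a fixed real-world coordinate; content attached to it
            // is positioned there for any session that later resolves the same spot.
            let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
            arView.session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        }
    }
}
```

Once the session resolves an anchor against Apple’s mapping data, whatever graphics an app attaches to it appear at that real-world location, which is the persistence across sessions and users described above.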

AR proponents and enthusiasts will recognize the AR cloud underpinnings here: to overlay location-relevant graphics, a device must first understand the scene and localize itself within it. That can happen by mapping the contours of the scene on the fly or by drawing on previously-mapped spatial data.

Google’s answer to this challenge is to utilize imagery from Street View as a visual database for object recognition so that AR devices can localize. That forms the basis for its storefront recognition in Google Lens, as well as AR-infused urban navigation in its Live View feature.

Look Around

Back to GeoAnchors, they’ll similarly tap into Apple’s Street View-like “Look Around” feature, including purpose-built point clouds as we theorized at its launch. GeoAnchors can use this data to localize a device before rendering the right spatially-anchored AR graphics in the right place.

Also similar to Google Lens, Apple will use a combination of data sources. Image recognition via Look Around’s visual database is just one source. Other data sources will likely include a device’s position (via GPS), where it’s pointing (compass), and how it’s moving (IMU).
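ARKit fuses those signals internally, but as a rough illustration of the raw inputs involved, here is a Core Location / Core Motion sketch. It is illustrative only, not Apple’s localization pipeline, and assumes the app has the usual location-permission entries in its Info.plist:

```swift
import CoreLocation
import CoreMotion

// Rough sketch of the raw signals mentioned above: GPS position, compass heading, IMU motion.
final class SensorSampler: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()   // position (GPS)
        locationManager.startUpdatingHeading()    // where it's pointing (compass)
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            print("IMU rotation rate: \(motion.rotationRate)")  // how it's moving (IMU)
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        if let loc = locations.last { print("GPS: \(loc.coordinate)") }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        print("Compass heading: \(newHeading.trueHeading)°")
    }
}
```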

This combination of inputs allows for maximum power and data efficiency. Spatial mapping and point clouds have data-heavy payloads. So an AR device — in this case an iPhone — can selectively access just-in-time data based on where it is. This is a core AR cloud principle.

Speaking of which, GeoAnchors round out AR cloud initiatives from tech giants. Google has the above-mentioned efforts. Facebook has Live Maps and ongoing acquisitions to tie it together. And Snap revealed its AR cloud play last week, which will crowdsource data from snaps.

To continue reading, go here…



About the Author

Emergent Enterprise

The Emergent Enterprise (EE) website brings together current and important news in enterprise mobility and the latest in innovative technologies in the business world. The articles are hand selected by Emergent Enterprise and not the result of automated electronic aggregating. The site is designed to be a one-stop shop for anyone who has an ongoing interest in how technology is changing how the world does business and how it affects the workforce from the shop floor to the top floor. EE encourages visitor contributions and participation through comments, social media activity and ratings.


