
AR/VR

Published on September 8th, 2020 | by Emergent Enterprise


Lidar is Dull on iPads, but Could Go Beyond AR on the iPhone 12 Pro

Emergent Insight:
What makes a new technology successful? The innovation? Affordability? It really comes down to its usefulness. How can it provide value to an everyday user? In this post at VentureBeat, Jeremy Horwitz explains the potential of lidar technology in Apple products and why it still hasn’t resonated with developers. Perhaps the next iPhones will change that, but for now lidar will be used by a select few. It’s possible Apple is intentionally engaging with a smaller audience to learn from some real-world usability before widespread distribution of lidar. Stay tuned.

Original Article:

While many of Apple’s investments in innovative technologies pay off, some just don’t: Think back to the “tremendous amount” of money and engineering time it spent on force-sensitive screens, which are now in the process of disappearing from Apple Watches and iPhones, or its work on Siri, which still feels like it’s in beta nine years after it was first integrated into iOS. In some cases, Apple’s backing is enough to take a new technology into the mainstream; in others, Apple gets a feature into a lot of devices only for the innovation to go nowhere.

Lidar has the potential to be Apple’s next “here today, gone tomorrow” technology. The laser-based depth scanner was the marquee addition to the 2020 iPad Pro that debuted this March, and has been rumored for nearly two years as a 2020 iPhone feature. Recently leaked rear glass panes for the iPhone 12 Pro and Max suggest that lidar scanners will appear in both phones, though they’re unlikely to be in the non-Pro versions of the iPhone 12. Moreover, they may be the only major changes to the new iPhones’ rear camera arrays this year.

If you don’t fully understand lidar, you’re not alone. Think of it as an extra camera that rapidly captures a room’s depth data rather than creating traditional photos or videos. To users, visualizations of lidar look like black-and-white point clouds focused on the edges of objects, but when devices gather lidar data, they know relative depth locations for the individual points and can use that depth information to improve augmented reality, traditional photography, and various computer vision tasks. Unlike a flat photo, a depth scan offers a finely detailed differentiation of what’s close, mid-range, and far away.
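For developers who want to see what that per-point depth actually looks like, here is a minimal Swift sketch using Apple’s public ARKit API (the `sceneDepth` frame semantic introduced with iOS 14 on lidar-equipped devices). It simply reads the distance to whatever surface sits at the center of each camera frame; everything beyond the API names shown is illustrative.

```swift
import ARKit

// Minimal sketch: reading lidar-derived per-pixel depth from ARKit frames.
// Requires a lidar-equipped device running iOS 14 or later.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Ask ARKit to attach a depth map to every camera frame.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let map = depth.depthMap  // CVPixelBuffer of Float32 distances in meters

        CVPixelBufferLockBaseAddress(map, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(map) else { return }
        let width = CVPixelBufferGetWidth(map)
        let height = CVPixelBufferGetHeight(map)
        let rowBytes = CVPixelBufferGetBytesPerRow(map)

        // Sample the center pixel: how far away the surface straight ahead is.
        let row = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let centerDepth = row[width / 2]
        print("Surface at center of frame is \(centerDepth) meters away")
    }
}
```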

Six months after lidar arrived in the iPad Pro, the hardware’s potential hasn’t been matched by Apple software. Rather than releasing a new user-facing app to show off the feature or conspicuously augmenting the iPad’s popular Camera app with depth-sensing tricks, Apple pitched lidar to developers as a way to instantly improve their existing AR software — often without the need for extra coding. Room-scanning and depth features previously implemented in apps would just work faster and more accurately than before. As just one example, AR content composited on real-world camera video could automatically hide partially behind depth-sensed objects, a feature known as occlusion.
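As a rough illustration of how little extra code that pitch involves, the sketch below uses Apple’s public ARKit and RealityKit APIs to turn on lidar-driven scene reconstruction and occlusion in an existing `ARView`. The function name is hypothetical; the surrounding app and view setup are assumed.

```swift
import ARKit
import RealityKit

// Sketch: opting a RealityKit AR view into lidar-backed occlusion so virtual
// content automatically hides behind real-world objects.
func enableLidarOcclusion(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Lidar devices can reconstruct a mesh of the room; RealityKit uses that
    // mesh to decide which real surfaces should block virtual content.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Enable occlusion against the reconstructed real-world geometry.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```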

To continue reading, go here…



About the Author

Emergent Enterprise

The Emergent Enterprise (EE) website brings together current and important news in enterprise mobility and the latest in innovative technologies in the business world. The articles are hand-selected by Emergent Enterprise and not the result of automated electronic aggregating. The site is designed to be a one-stop shop for anyone with an ongoing interest in how technology is changing the way the world does business and how it affects the workforce from the shop floor to the top floor. EE encourages visitor contributions and participation through comments, social media activity and ratings.


