Apple AR glasses. Or maybe an AR/VR headset. Something like that, something coming in 2021, 2022… or 2023. Reports have been everywhere; the pieces seem to be falling into place.
And yet, at this year’s WWDC, in a coronavirus-scrambled 2020, those plans weren’t announced, or even hinted at. Instead, Apple laid down pieces that seem to connect into a larger puzzle. And left unanswered questions.
Apple’s CEO, Tim Cook, has telegraphed his augmented reality ambitions over the past four years. With competition from Facebook and many others, Apple still has the pieces in play to make hardware that could make a tremendous splash: millions of iPhones and iPads, an ecosystem of products running Apple-made processors, and an evolving AR graphics toolkit. And yet, with everyone stuck at home, augmented reality experiences were downplayed at this year’s event.
Last year, Apple unveiled integrated AR creation tools. This year, Apple’s updated ARKit 4 tools in iOS 14 and iPadOS 14 seem to have fewer dynamic new pieces, even if they’re playing key parts. Some other announcements seem to play roles, too. (The new ARKit 4 features require Apple A12 processors or newer.)
Here’s what could factor in, however, if you look at the puzzle pieces: a depth-sensing iPad, spatially aware AirPods, location-based markers tied to a more evolved reality-scanning Apple Maps, and more ways to link virtual worlds to real places.
The iPad Pro’s depth sensing is key
The 2020 iPad Pro, released earlier this spring, has a unique lidar sensor that scans spaces and can create 3D maps. It’s very likely to be the world-scanning feature that will be on future iPhones, and eventually in Apple’s AR headsets, too. Apple’s new ARKit 4 toolkit for developers has a Depth API that will take greater advantage of the sensor and promises more accurate measurements. Already, developers are using lidar to scan homes and spaces and create meshed 3D scans that could be used not just for augmented reality, but for saving models of places in formats like CAD.
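For developers, opting into the new Depth API is a small change to an AR session’s configuration. Here’s a minimal Swift sketch of that setup, with the runtime support check the lidar requirement makes necessary:

```swift
import ARKit

// Sketch: configure an ARKit 4 session to produce per-pixel scene depth.
// .sceneDepth is only available on lidar-equipped devices (such as the
// 2020 iPad Pro), so support must be checked at runtime.
func makeDepthConfiguration() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
        return nil  // no lidar on this device
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.frameSemantics.insert(.sceneDepth)
    return configuration
}

// Once the session is running, each ARFrame exposes frame.sceneDepth:
// a CVPixelBuffer of distances in meters, plus a confidenceMap grading
// how reliable each depth reading is.
```

The confidence map is what makes the “more accurate measurements” claim practical: apps can discard low-confidence pixels before building a mesh.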
Real-world locations and AR, blending further
Much like its competitors, Apple is adding Location Anchors to its iOS 14 AR tools, but with some precision tools lining up GPS and Apple Maps data. These specific geolocation markers will make it easier to pin virtual things in specific places. Microsoft has already offered location-specific anchors since last year. Apple looks ready to move this idea forward more, for shared worlds and maybe even experiences that could be pinned to city maps. Combined with the multi-user possibilities AR already enables, this should lead to more shared, pinned-down things in reality, like location-specific art experiences. One interesting thing, though: Apple says its new geolocation anchors will only work in certain major US cities for now, since they rely on more advanced Apple Maps data to coordinate and fine-tune positioning.
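In code, a Location Anchor is an `ARGeoAnchor` tied to latitude and longitude. A minimal sketch, using a hypothetical San Francisco coordinate (one of the supported cities):

```swift
import ARKit
import CoreLocation

// Sketch: pin an AR anchor to a real-world location with ARKit 4's
// geotracking. This only works where Apple has the high-resolution
// Maps data to fine-tune positioning, so check availability first.
func addGeoAnchor(to session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else { return }
    session.run(ARGeoTrackingConfiguration())

    // Hypothetical example coordinate near San Francisco's Ferry Building.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
    let anchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: anchor)
}
```

Because the anchor is defined by real-world coordinates rather than a device-local origin, every user who runs this near that spot sees the virtual content in the same place, which is what makes shared, pinned-down experiences possible.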
Apple’s new clip-like app snippets could be used to launch AR with a quick scan
A new iOS 14 feature called App Clips promises to bring up fast app snippets when scanning NFC tags or QR codes. Imagine a tap-to-pay location that could offer more, like a menu of options, too. App Clips can also launch AR experiences, which could mean that an NFC tap or QR code scan could then launch an AR experience without needing to download a full app. My mind leaps to stores that could use it to show items that aren’t available in-person, or museums with enhanced exhibits — or who knows what else? This expands on Apple’s existing AR features, which rely on a previously downloaded app or a loaded page. Now, those experiences could launch when scanned at real-world things.
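The plumbing for this is small: an App Clip receives the URL encoded in the scanned tag or code through an `NSUserActivity`, and the clip can route straight into its experience. A sketch of the museum idea, with hypothetical URL and parameter names:

```swift
import SwiftUI

// Sketch: an App Clip that routes an NFC tap or QR scan into an exhibit.
// The scanned tag would encode a URL like https://museum.example/exhibit?id=42
// (hypothetical); iOS hands it to the clip as a browsing user activity.
@main
struct MuseumClipApp: App {
    @State private var exhibitID: String?

    var body: some Scene {
        WindowGroup {
            // In a real clip, a non-nil exhibitID would present the AR view.
            Text(exhibitID.map { "Launching AR exhibit \($0)…" } ?? "Scan a tag to begin")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url, resolvingAgainstBaseURL: false)
                    else { return }
                    exhibitID = components.queryItems?.first(where: { $0.name == "id" })?.value
                }
        }
    }
}
```

Since the clip is a lightweight slice of an app rather than a full download, the scan-to-AR path could feel closer to opening a web page than installing software.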
Spatial audio on AirPods Pro looks like Apple’s audio AR
Last year, I started thinking about audio as pretty key in augmented reality: Instead of lifting a phone or even glasses to see a virtual thing in your world, audio cues could be a casual way of bringing up environmental info with earbuds. It’s a lot less intrusive, and could include an option to launch some kind of visual AR afterward. After all, we already wear headphones all the time and live in audio bubbles. Apple’s AirPods have often seemed like a step in that direction.
iOS 14 enables spatial audio in Apple’s step-up AirPods Pro models, using motion tracking to position audio depending on how your head moves. For now, it’s meant for listening to surround sound on an iPhone or iPad, and Apple hasn’t integrated spatial audio for AirPods Pro into ARKit yet, but this could be applied to audio AR experiences, too. Combined with eventual glasses, it makes perfect sense. Bose abandoned its own audio AR efforts before this year’s push, but Apple could pick up where Bose left off.
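The general idea is already buildable on Apple platforms today, even without AirPods head tracking exposed to developers. A sketch, not Apple’s AirPods implementation, using AVFoundation’s environment node to place a sound in 3D space and rotate the listener:

```swift
import AVFoundation

// Sketch of positional audio with AVAudioEngine: a mono source routed
// through an AVAudioEnvironmentNode is rendered in 3D around a listener.
// This is a general-purpose API, not the private AirPods Pro pipeline.
func makeSpatialEngine() -> (AVAudioEngine, AVAudioEnvironmentNode, AVAudioPlayerNode) {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)
    // Only mono sources get 3D-rendered by the environment node.
    let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Pin the sound two meters in front of the listener…
    player.position = AVAudio3DPoint(x: 0, y: 0, z: -2)
    // …then, as head tracking reports rotation, turn the listener instead,
    // so the sound stays fixed in the room rather than in your ears.
    environment.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: 30, pitch: 0, roll: 0)
    return (engine, environment, player)
}
```

If Apple ever feeds AirPods Pro head-tracking data into something like that listener orientation, audio cues anchored to real-world places become straightforward.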
Apple AR can cast virtual video screens now
An in-the-weeds feature of ARKit 4 called “video textures” does something I’ve seen in other AR headsets: It can project video into AR. This can be used for floating TV screens, or to map moving video avatars onto 3D models. Right now, it may seem silly to use your iPhone or iPad to create a floating virtual TV screen in your living room, when the iPhone or iPad literally is a mini TV screen. But in a pair of glasses, this idea doesn’t seem silly at all.
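On the developer side, this surfaces in RealityKit (the rendering framework that pairs with ARKit) as a material whose texture is an `AVPlayer`’s output. A sketch of the floating-screen idea, with a hypothetical video URL:

```swift
import RealityKit
import AVFoundation

// Sketch: a floating virtual screen via RealityKit's video texture
// support in iOS 14. An AVPlayer's output becomes the surface of a
// 3D plane that can be anchored anywhere in the room.
func makeFloatingScreen() -> ModelEntity {
    let url = URL(string: "https://example.com/show.m3u8")!  // hypothetical stream
    let player = AVPlayer(url: url)

    // A 16:9 plane, roughly one meter wide, textured with live video.
    let screen = ModelEntity(
        mesh: .generatePlane(width: 1.0, height: 0.5625),
        materials: [VideoMaterial(avPlayer: player)]
    )
    player.play()
    return screen
}
```

Swap the plane for a head-shaped mesh and the same material could drive the video-mapped avatars described below.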
The projected video avatars idea is fascinating, too. Right now, AR and VR don’t do a great job of showing people’s real faces in virtual worlds; usually it feels more like living in a cartoon or puppet universe. Even in virtual Zoom-like conference apps like Spatial, avatars look like crude stretched-out approximations of real acquaintances. Maybe video-mapped avatars could be a step toward meeting with holographic friends in future AR FaceTime calls.
What does it all mean?
If you were expecting something big from Apple in AR (or VR), it’s not happening now. No Apple headset. No Apple Glasses. Nor, alas, the ability to plug existing AR headsets into Apple devices. But Apple’s AR tools are getting very advanced, even without a headset. It remains as unclear as ever when actual Apple AR headsets will arrive, but the existing iPad Pro looks like it’ll continue to double as Apple’s AR dev kit for now.