One of the greatest strengths of Apple’s product line is how well its devices play together. Not only do they work closely with each other, but features that begin life on one platform often make their way to others. Touch ID, for example, started on the iPhone before jumping to iPad and Mac models. The same goes for Retina and True Tone displays. Heck, Apple silicon began as part of Apple’s mobile efforts and now powers its entire lineup.
As those examples demonstrate, this shift tends to happen especially with devices at the more cutting-edge end of Apple’s portfolio. No surprise, really, given that those tend to be the places where the company is investing the most time, money, and resources in pushing the technological envelope.
Of course, the latest in cutting-edge tech from Apple these days is none other than the Vision Pro. It’s chock-full of expensive, complex technology and frankly, it would be a bigger shock if none of its advancements ever made their way to other Apple products. Now that the Vision Pro has been in the wild for a couple of weeks, it’s a bit easier to figure out which applications might actually make sense on Apple’s other products.
Virtual reality
One of the biggest hit features of the Vision Pro is the ability to use it as a virtual display for your Mac. This, by all accounts, seems to build upon the new high-performance screen-sharing mode that’s available for Apple silicon-based Macs running macOS Sonoma.
While third-party screen-sharing apps have existed on Apple’s other platforms for a long time, it’s somewhat surprising that the company has never offered its own take on this feature on its mobile devices. The iPad in particular seems like the perfect place to view a Mac’s screen, especially with the benefits of the high-performance mode: stereo audio, HDR video, and so on. On the face of it, those devices would seem to be compatible; after all, they use, in many cases, the same Apple silicon chips that run comparable Macs (not to mention the Vision Pro).
Certainly, such a feature might be a little bit less useful on an iPhone, with its small touch targets and general lack of a keyboard and trackpad, but I’ve definitely run into instances in the past where I’ve needed to screen share into my Mac at home from my smartphone. Bringing the full power of the Mac to the iPad might seem like a concession that the iPad can’t entirely replace a Mac, but after more than a decade of fretting about the disparity, perhaps it’s time to stop worrying about it and embrace the functionality it could bring.
Hands off
Oftentimes a brand-new platform brings a brand-new interface paradigm, and visionOS is no different. With the Vision Pro, much of the interaction is done via eye- and hand-tracking: you look at something to highlight it and use hand gestures to confirm it.
But my question is: why stop at the Vision Pro? Both of those interaction models seem like they could be useful on other Apple devices as well. While there would no doubt be much work needed to adapt them to, say, the Mac or iPad, they could certainly be a boon in accessibility contexts, such as for people with motor disabilities. The Mac has always been a home for alternative input methods, from Mouse Keys to Voice Control, and adding eye- and hand-tracking to the list would make the platform even more versatile.
Even for those who don’t need these particular accessibility affordances, being able to control a Mac or iPad from across the room without touching it has obvious use cases, whether you’re, say, skipping to the next music track during a party or using an iPad while cooking. (Speaking as someone who frequently wants to scroll a recipe on a touchscreen device while my hands are covered in raw meat, I would be all over some sort of hand-tracking.)
Though it would require external hardware and probably some much better cameras, I’d likewise be interested to see both eye-tracking and hand gestures supported on the Apple TV. You know, for those all-too-frequent occasions when you lose the Siri Remote in your couch cushions.
Environmentally conscious
Immersive environments are another one of Vision Pro’s big attractions, and though I can immediately see the limitations of porting them to Apple’s other platforms—the decided lack of immersion being the biggest—there are still elements of the feature that could be great for users of other platforms.
In particular, as someone who often likes working in other places, like coffee shops and libraries, I’d love the ability to channel those environments through my desktop background and perhaps some ambient sound as well. Third-party options to do some of this have existed in the past, but building it into the operating systems feels like it could be a fun way to get a taste of the Vision Pro experience on my MacBook or even my iPad.
Apple’s Aerial screensavers already provide a different but similar vibe on the Mac, and while I don’t need the constant movement that those provide, some subtle animation in the desktop background could go a long way toward helping my Mac transport me to a different headspace.