Apple Vision Pro needs something bigger than a software update

I have been using Apple Vision Pro on and off for the last nine months. When it was released in February, I described it as a device with a ton of potential but with huge challenges to overcome. Now, with its first major operating system update available and plenty of time for developers to get on board, I’m left with a more pessimistic view.

For spatial computing to be a hit, it’s going to need more than the Vision Pro can deliver. The first major update fixed some issues and added welcome features, but too many problems remain—and some simply can’t be fixed in a software update. If Apple is going to make spatial computing (i.e. mixed reality) its next major software platform, it will need to drastically accelerate its rate of progress.

What visionOS 2 brings to the table

The Vision Pro launched with a host of annoyances and visionOS 2 goes a long way to addressing them.

You can more easily get to the Home View with a new gesture—look at your palm and when a circle appears, tap your thumb and finger together. Flip your hand around to look at the back of it to see a little widget that displays time, volume, and battery life. Tap your fingers while this is showing to bring up Control Center.

Speaking of the Home View, you can finally freely rearrange your icons and take iPad and iPhone apps out of the dedicated folder. It also remembers the visual setup of the last Guest user, so your partner doesn’t have to run through setup every time you want them to check out something cool. It’s all a lot easier and more natural than fiddling around with the buttons or looking up at a little dot at the top of your view.

If you’re in a spatial environment, your Magic Keyboard or MacBook keyboard will “punch through” and remain visible (all the time, or only when you bring your hands near it, your choice), which makes it easier to type. And there are a bunch of really good updates to developer tools that will make it easier for developers to build good spatial experiences, too.

The software is still awkward and limiting

Despite the improvements, there are still so many places where visionOS feels unfinished.

How is the Clock app not an actual selection of real 3D clocks you can virtually hang on your wall? How does the Calendar not look and behave like a traditional paper calendar, with live data and the ability to zoom in on dates? This stuff is so obvious, it should have been some of the first apps Apple built. Instead, apps like Clock, Calendar, and Home are just iPad apps, hanging in the air in a virtual window.
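None of this seems technically out of reach, either. As a rough sketch of the wall-clock idea (my own illustration, not Apple code; the view name, sizes, and materials are invented for the example), RealityKit’s plane anchors can already pin a 3D object to a wall inside an immersive space:

```swift
import SwiftUI
import RealityKit

// Sketch: attach a placeholder "clock" to a vertical surface RealityKit
// classifies as a wall. A real app would load a detailed USDZ clock model
// and drive its hands from the current time.
struct WallClockView: View {
    var body: some View {
        RealityView { content in
            // Anchor that tracks a wall plane at least 30cm x 30cm.
            let wallAnchor = AnchorEntity(.plane(.vertical,
                                                 classification: .wall,
                                                 minimumBounds: [0.3, 0.3]))

            // Stand-in clock face: a flat white cylinder, 30cm across.
            let clockFace = ModelEntity(
                mesh: .generateCylinder(height: 0.02, radius: 0.15),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            // Orientation and offset tuning omitted for brevity.
            wallAnchor.addChild(clockFace)
            content.add(wallAnchor)
        }
    }
}
```

Presenting that inside an ImmersiveSpace scene and animating the hands seems like routine work; the building blocks are already in the SDK.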

Speaking of hanging in the air, the spatial tracking has improved (and so has object and hand occlusion), but most Vision Pro apps are still just big windows with simplistic controls that stay anchored in place in your environment. This makes window management a real chore.

You have the entire virtual world, and you still have to literally move everything around in physical space to manage your windows. It should be possible to set an app to remain anchored to the device, so it follows you as you move around. You should be able to stack app windows and swap between them, so you don’t have to find a new area of physical space just to open another app.
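To be fair, a determined developer can approximate follow-the-wearer content today, at least inside an immersive space, using RealityKit’s head-target anchor, but that’s a per-app workaround rather than the system-wide window behavior I’m describing. A minimal sketch (my own, with arbitrary sizes and offsets) looks something like this:

```swift
import SwiftUI
import RealityKit

// Sketch: a stand-in "window" that follows the wearer because it is
// parented to a head anchor. The panel size and forward offset are
// arbitrary values chosen for illustration only.
struct FollowingPanelView: View {
    var body: some View {
        RealityView { content in
            // Content parented to a head anchor moves with the wearer.
            let headAnchor = AnchorEntity(.head)

            // A 40cm x 25cm plane held about a meter in front of the
            // wearer, slightly below eye level.
            let panel = ModelEntity(
                mesh: .generatePlane(width: 0.4, height: 0.25),
                materials: [SimpleMaterial(color: .gray, isMetallic: false)]
            )
            panel.position = [0, -0.1, -1.0]

            headAnchor.addChild(panel)
            content.add(headAnchor)
        }
    }
}
```

Even then, it only applies to an app’s own RealityKit content; ordinary windows stay wherever you pinned them, which is exactly the limitation that makes multitasking such a chore.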

Vision Pro should be a way to expand our space, to have as much room as we want without going anywhere. Instead, it’s the opposite—on my laptop or desktop I can swipe between desktops and layer windows on top of each other, with a dozen things taking up the space of my monitor. With Vision Pro, I have to turn my head and move all over the place to make use of more than three apps.

Imagine if every app on your Mac was a whiteboard on a stand. You can easily reposition them, but you’re limited to one app per whiteboard and they take up a LOT of space. That’s what it feels like to multi-task on Vision Pro. The new ultrawide virtual monitor for Mac will greatly improve the ability of users to get work done on Vision Pro (it’s coming in visionOS 2.2), but that’s really just your Mac being productive.

There’s no doubt that visionOS 2 made the software experience better, but it’s still far too limited, and limiting, for mainstream adoption. At the current pace of improvement, it feels like it won’t come into its own until somewhere around visionOS 5 or 6.

This hardware will never be mainstream

Some parts of the Vision Pro experience just aren’t going to get any better without different hardware.

The headset is too heavy. The field of view is too narrow. The battery and tether are cumbersome. You can’t see the eyes of the person wearing it and the EyeSight feature doesn’t get the job done well. It’s fatiguing to use for more than half an hour at a time. The battery life stinks.

Oh, and it’s priced 3-4x higher than a mainstream tech product.

As I said back in February, Vision Pro’s success will be measured by Apple’s ability to maintain interest—among both the public and developers—to drive momentum to a more mainstream future product that will cost less and do more. Unfortunately, I don’t think visionOS 2 will do nearly enough to succeed at that.

Apple’s new 17-minute immersive film Submerged is very impressive, but it’s driving at the wrong thing. Linear video content is not the future of the medium, and it shouldn’t be the present of a $3,500 headset that only lets you watch it by yourself.

The Vision Pro is a spectacular isolationist video viewing device in short sessions, but should Apple really be highlighting that? Should the message be: Buy this $3,500 thing to have the best solitary video experience for up to 30 minutes?

Apple needs to show people why spatial computing is the future. How it can be faster, easier, more natural, and more efficient than the phones, tablets, and laptops we use today. While visionOS 2 takes a step in that direction, it’s only a step, and the journey from here to there is long.