Vision Pro Check-In

At a recent MacSparky Labs event, the topic of Vision Pro came up. Several members are now getting access to the device as Apple expands the release to more countries.

Side view of the Vision Pro, with the visor on the right and the headband on the left.
Image: Apple

Some Vision Pro owners regularly use the hardware, particularly those who travel and want a big screen and entertainment device in a hotel room. For other folks, the bloom is off the rose, and they’re not exactly sure what to do with their Vision Pro.

I am in the middle.

Productivity

My best-case productivity usage continues to be writing. I’m writing this post sitting in my Vision Pro in my usual spot at Yosemite with a keyboard in my lap. I do that often. It feels like a context change and makes my work easier. I do two to four writing sessions a week.

Other transactional productivity tasks, like email, calendar, and task management, never stuck with me. Maybe I should try to do it exclusively for a month to see if I could build some new workflows, but for now, at least, there’s too much friction.

Content Consumption

Unsurprisingly, this is the most successful implementation of Vision Pro. I’m pretty careful about video consumption, but when I do decide to watch something, I want to give it my entire attention — none of this silly fiddle-with-iPad-while-watching-TV for me. So, Vision Pro is perfect for what I’ll call mindful consumption. I’m out of town this week, but I finally bought The Boy and the Heron recently. I can’t wait to watch it in Vision Pro when I return home.

That said, Apple needs to turn the Vision Pro content machine up to 11. They should produce a lot more immersive content and make deals with artists and sports teams.

I’m also a fan of several art gallery-style apps. There is a huge difference between looking at art in Vision Pro and on any other screen.

Gaming

I’m hardly a serious gamer, but Bombaroom continues to make me giggle as I lay waste to a digital castle across the room.

Putting Vision Pro in context, these are early days. I wish Apple were more aggressive with the software stack and content. If I add all of these uses up, do they justify the significant cost of the Vision Pro for me? I think so, but it isn’t an obvious calculus. Moreover, I want to watch this technology and see what Apple does with it. That is an additional benefit for me, but not for everyone.

Vision’s Future

Mark Gurman reports that Apple is looking at some interesting ideas for a more affordable visionOS headset, including tethering the device to an iPhone or a Mac. Tethering to an iPhone doesn’t seem much different from the device’s current connection to its battery. Tethering to a Mac, however, feels like an entirely different kettle of fish.

Medium close-up of the Vision Pro on its custom stand in an Apple retail store.

Photo by Mylo Kaye

If Apple is rethinking its Vision strategy, my two cents would be to make the currently shipping hardware more attractive:

  • More immersive content
  • More sports
  • More immersive concert footage
  • More environments
  • More productivity software and workflows

There is a lot of blue sky left for Vision and visionOS, but a more compelling software and content story needs to come first.

The State of visionOS Content

We’ve had some nice Vision Pro content announcements over the last few weeks. There is a new adventure experience, Parkour in Paris. I watched it and realized halfway through that I must be developing a fear of heights because many shots terrified me. We also got Demeo, a role-playing game that now works on Vision Pro and has several interesting twists. At some point in the next few days, Disney will release the Vision Pro version of What If…?, an ongoing Marvel series that explores alternative timelines and ideas. The Vision Pro version is supposed to be both immersive and interactive.

I’m calling all this out because it is simultaneously promising and overdue on the Vision Pro. I expected more frequent releases like this when the hardware became available, and there hasn’t been enough of it. People talk about Vision Pro as if it’s a dud, and I don’t see it this way. I regularly watch videos on it and write on it. And yet…

There has been a dearth of content taking advantage of what makes the Vision Pro special. There are many great clips in Apple’s demonstration. I expected more like that to show up sooner on the device. I think a regular diet of content like this (along with more immersive sports and concerts) would help generate excitement for the platform and satisfaction for existing owners. One of the primary reasons to buy the Vision Pro is for content, so more exclusive content that takes advantage of the hardware would be welcome. What’s unclear is how invested Apple is in paying for that kind of content. I’m not sure if WWDC is the place for such an announcement, but a public declaration from Apple, with promises of regular releases of future content for Vision Pro, would be welcome.

Vision Pro Notes: Media Consumption

This is the third part of my series of early notes on the Vision Pro. This one focuses on media consumption; earlier entries covered the hardware and interface, and productivity.

  • Media consumption with photos and videos is fantastic. There’s nothing like it. Watching movies on it is even better. I’ve never had a particularly good home theater system. Now I do.
  • I started by watching Moana in the Disney+ app in their theater environment. It is like having your own movie theater. I got so absorbed in the movie’s climax that I teared up a bit. Since I couldn’t wipe my eyes with the Vision Pro on, my light seal cushions got wet, which was kind of funny.
  • 3D videos are impressive but, at this point, feel more like a demo. Once I have spatial videos of my family, I expect they’ll start ruining light seals the way Moana did.
  • Panoramas look great. I will be shooting a lot more of them going forward. I can tell newer vs. older panoramic photos based on their fidelity. I want to be able to make some of them the equivalent of a background wallpaper so I can put apps in front of them. My guess is Apple is more focused on Environments.
  • I watched a Netflix show in Safari. It was also great, but app-specific media is better.
  • The big asterisk with media consumption is that it is a solitary experience. There are shows I watch without my family, and it’s great for that. The device does not enable any joint viewing experience.

Vision Pro Notes: Productivity

Yesterday, I wrote my notes about the Vision Pro hardware and its interface. Here are my notes on productivity:

  • visionOS has roots in iPadOS, and it shows. You’ll be disappointed if you are looking for a Vision Pro to replace a Mac.
  • Instead, I’ve focused on ways Vision Pro is superior to the Mac for productivity, like my writing cabin.
  • Vision Pro is very good at keeping me isolated for focused work. I can already be productive with the device where that focus matters.
  • We don’t have enough environments to get the most out of that last point.
  • I found a connected Bluetooth keyboard a big help. I use a connected trackpad much less, but it can also come in handy.
  • That said, dictation is much better than it used to be, and don’t forget to use it with the Vision Pro.
  • Fantastical is a stand-out app. Putting up your calendar and making it the size of a wall is pretty damn cool. It works particularly well for the big picture: monthly, quarterly, and yearly views. I’ve got a massive version of my monthly calendar installed on my ceiling. As I think about next month, I can look at the ceiling to see what’s on deck.
  • MindNode Next is also an interesting early entry. It’s a mind-mapping app but also a brainstorming app where you can locate ideas in space.
  • Ideation development (like MindNode) is an excellent use case for Vision Pro. Apple’s Freeform could also serve in this capacity, but it’s not yet there. My experiments continue.
  • If you want to capture a lot of text, try Aiko, an AI-based transcription tool. You just hit the record button, and it converts the recording to text with the Whisper AI engine. I checked with the developer, who reports all work is done on-device.
  • Mac display mode acts as an escape hatch, but I don’t see it replacing monitors for extended work sessions. It makes tons of sense to have a big display attached to a laptop in a hotel room or to give you the ability to move your desktop Mac display to a different room, though.
  • We are in early days for the productivity question on Vision Pro. There are still many workflows to be explored and apps to be released.

Vision Pro Notes: The Hardware and Interface

Now that I’ve logged some serious hours in the Vision Pro, I thought I’d share some thoughts about it. This post focuses on the hardware and interface:

  • Strapping into the Vision Pro does feel a little like getting ready for a spacewalk. I generally charge the battery with it disconnected, letting me store the hardware (along with a keyboard) in a drawer. When it’s time to go into the device, I put the battery in a pocket and run the cable under my shirt to my neck to avoid getting tangled in things if I go mobile.
  • For productivity work, a keyboard is necessary. I had an extra keyboard and trackpad. I’ve combined them into one unit using this gizmo from Amazon. Twelve South also makes one that looks a little nicer.
  • The screens are excellent, and anything rendered by them (apps, icons, environments) is entirely believable. The pass-through cameras, however, are darker and grainier than I expected.
  • The pre-release discussion of it being too heavy was overblown. I’ve worn it for hours without much trouble.
  • The Dual Loop Band is more comfortable for me than the Solo Knit Band, but the Solo Knit Band is more straightforward to configure. I use the Solo Knit band for short sessions and the Dual Loop band for longer ones, like watching movies.
  • The audio on the Vision Pro is much better than I expected. I connected my AirPods earlier today to confirm they work, but I’ve been using the built-in speakers exclusively thus far for everything (including watching movies), and they seem fine to me.
  • You must train yourself to avoid picking it up by the light seal. It’s a light magnetic connection, and it is easy to drop the device.
  • Touch targets on iPad apps are too small. The eye tracking works great with native apps but is sometimes tricky with iPad apps.
  • One of the nice touches: when you grab the handle of a window, it automatically aligns rotationally to where you’re standing in the space in the room. There are so many subtle details with the way it renders windows. The shadows on real-world objects are another of my favorites.
  • If you’re having trouble with tracking, make the object bigger by stretching it or bringing it closer to you. I kept forgetting about that.
  • You can rotate a window by rotating your head.
  • The pinch gesture only works with your palm down. I never got it to work with my palm up.
  • You can long-press the pinch gesture, and you get right-click options. I’d like to know how many other ideas they have for gestures as this product matures. 
  • Strangely, I think I feel things when I touch them: virtual keyboard keys, butterflies, and the like.
  • I struggle a little bit with app management. There aren’t any options except to go through the alphabetical list.
  • It seems silly that you can’t set favorites, have a dock, or otherwise arrange your applications beyond the main screen.
  • With a device so dependent on dictation, there should be an easier way to trigger dictation without resorting to the virtual keyboard.

Contextual Computing with Vision Pro: My Writing Cabin

A widescreen image showing Apple’s Notes app with all panes open, against a virtual Yosemite Valley background, viewed through a Vision Pro device.
Looking at Yosemite Valley while writing in Apple Notes

This entire post was composed on Apple Vision Pro with dictation and a Bluetooth Apple Keyboard attached…in virtual Yosemite Valley.

One of my interests in the visionOS platform is whether or not I can use it to get work done. Apple thinks so and has made that a big part of the marketing push for this device. However, it is a new platform with a fledgling App Store and many questions surrounding whether it is useful for productive work.

Moreover, the types of workflows that lend themselves to the platform are also in question. Don’t forget the Vision Pro operating system is based on the iPad, not the Mac. It’s easy to strap on this new device, thinking you can turn it into a Mac. (The fact that you can mirror a Mac display makes it even more tempting.) That’s the mistake I made with the iPad, and I spent years drilling empty wells, looking for productivity workflows that would allow me to duplicate Mac workflows. It was only after I accepted the iPad as an iPad that it became productive for me.

I’m not going to make that mistake with the Vision Pro. I’m going into this thing with open eyes and a sense of curiosity for where it can be used to get work done.

This is not a Macintosh. It is something else. And that is where the opportunity lies. While Mac workflows don’t work here in visionOS, are there things in visionOS that don’t work on a Mac? That is where we should be looking.

And for me, that starts with the idea of contextual computing. I have long felt that computers put too much interference between you and your work.

If you want to write an email, you need to open an email application, which shows you a bunch of new emails but not a compose window where you can write that email. So many times, you’ll start with the intention of writing that important email but never actually find your way to the compose window. If you want to work on your task list, you often have to wade through screens and screens of existing tasks before you can get to the ones you need. Put simply, computers need to put you in the context of the work with as little interference as possible.

Sadly, most modern software doesn’t do that. Instead, it does the exact opposite. This is partly due to bad design and partly because tech companies have figured out ways to monetize your attention. They are intentionally trying to divert you from the work. That’s how they keep the lights on. One of the easiest ways to be more productive on any platform is to find quick ways to get yourself in the context of the work you seek to do with as little interference as possible.

This is where visionOS and Vision Pro come in. It’s a new platform tightly controlled by one of the only big tech companies interested in user privacy. This new visionOS is where you can work if you are smart about it.

I’m still experimenting and figuring out my workflows, but here’s an easy one I’ve been using in visionOS for several days: my context-based writing space.

It starts in Yosemite Valley. Using visionOS Environments, I’ve placed myself in an immersive rendition of Yosemite Valley in winter. There’s snow on the ground, but I’m sitting here right now, comfortable in just my socks … which is nice.

The main screen in front of me has Apple Notes, where I’m writing this article. To my left is a floating research pane with Safari in it. That’s it. A little research. A place to write. Yosemite Valley. I’ve written about 3,000 words here in the valley over the last few days, which is very comforting. I’ve got a focus mode on, so I don’t get interrupted, and I genuinely feel alone with my words. That’s important. For this to work, I need to be off the grid. This is my cabin in the woods, where I do my writing.

When I’m not writing, I don’t go to Yosemite to watch a visionOS movie, or check email, or play with some other aspect of visionOS. My brain is already figuring out that Yosemite Valley equals writing. My Mac is far away, back at my studio, along with the cognitive load that comes with the work I do on my Mac. That’s all a distant memory here in Yosemite Valley. My brain is successfully duped.

As the context sticks, the work gets easier. This is a form of contextual computing that I’ve never experienced before. I’ve tried it with other headsets, but the poor-quality screens made it unbearable. I expect this writing context will only get easier over time. As the habit sticks and more writing apps and tools start showing up, I’ll consider bringing the better ones with me on future trips to the valley.

When I’m done writing, I leave this place, knowing Yosemite Valley will be there the next time I want to write.

This immersive context is not possible while sitting at a Mac. And for me, it is just the beginning of these explorations. I’m considering building a similar workflow in some other environment for journaling. And I’ve got more ideas after that.

This started simply as a proof-of-concept experiment, but now it’s set for me. I’ll return here the next time I need to do some serious writing. It’s already working: the valley appears, and my brain says, “Okay. Let’s get to it. Let’s start moving that cursor.”

This is a digitally created distraction-free environment made possible by visionOS. And this is the productivity story for Vision Pro. I’m not looking to replace an existing platform but to find new workflows that are only possible on the new platform. The valley proves it’s possible. So now I need to see what else it can do. visionOS isn’t at a place where it can become my only operating system. But that doesn’t mean it can’t be an essential tool in helping me get my work done.

Apple Vision Pro Thoughts

It’s a big week for those contemplating buying a Vision Pro. Apple has always prided itself on only releasing products when they are “done.” While I have no doubt that the Vision Pro is done, I also think the use case for the product is far from done… This is a post for MacSparky Labs Members only.

The Vision Pro Software Question

One of the bigger questions around the looming release of the Vision Pro is software. Specifically, will there be any, and will it be any good? We don’t know yet. We’ve seen some offerings from Apple and some limited offerings from third parties, but now that we have a shipping date on the hardware, announcements are starting to roll out. The Omni Group announced OmniPlan will be on the new hardware. I suspect they’ll be announcing more. This is where the companies that adopted SwiftUI will get their payoff.

But it’s too early to tell whether a healthy software stack will be available to us on day one. The device could skew toward enterprise-style software, given its cost. But I could equally see many of the better developers getting apps on it, despite its small initial adoption, so they can plant their flag early. We’ll see.

Spatial Video Demonstrations

John Gruber spent more time with Vision Pro, focusing on the Photos app, including Spatial Video and panoramic photos. In short, John was impressed, and this is just the first iteration of this stuff.

These things are hard to predict. (It took a pandemic for video chat to get legs.) Nevertheless, as families and friends are spread to the four winds, this holodeck-like experience could be a big deal. Moreover, I’ve lost enough people to appreciate how memories fade. My dad died over 30 years ago, and I’d give a lot to be able to feel his presence again, even if just part of a silly spatial video file.

If this takes off, it could become a killer feature for Apple’s future Vision products. And as John explains, when iOS 17.2 is released, you’ll be able to start recording spatial videos immediately with your iPhone 15 Pro, even if you don’t yet own a Vision Pro headset.