This is the third part of my series of early notes on the Vision Pro. This one focuses on media consumption; earlier entries covered the hardware and interface, and productivity.
Media consumption with photos and videos is fantastic. There’s nothing like it. Watching movies on it is even better. I’ve never had a particularly good home theater system. Now I do.
I started by watching Moana in the Disney+ app in their theater environment. It is like having your own movie theater. I got so absorbed in the movie’s climax that I teared up a bit, and since I couldn’t wipe my eyes with the Vision Pro on, my light seal cushions got a little wet, which was kind of funny.
3D videos are impressive, but at this point they feel more like a demo. Once I have older videos of my family to watch, I expect they’ll start ruining light seals the way Moana did.
Panoramas look great. I will be shooting a lot more of them going forward. I can tell newer vs. older panoramic photos based on their fidelity. I want to be able to make some of them the equivalent of a background wallpaper so I can put apps in front of them. My guess is Apple is more focused on Environments.
I watched a Netflix show in Safari. It was also great, but app-specific media is better.
The big asterisk with media consumption is that it is a solitary experience. There are shows I watch without my family, and it’s great for that. The device does not enable any joint viewing experience.
This week, MacSparky is sponsored by SoundSource, a utility I use daily. The problem with the Mac’s native sound controls is that they are designed for people who need them once every few years. If you need to adjust your sound settings more than that, you need SoundSource.
SoundSource is the sound settings controller that should be built in. Whether you listen to podcasts, blast music, or stream video, SoundSource is for everyone who uses audio on their Mac. It gives you per-app audio control, letting you change the volume of any app and route individual apps to different audio devices. Mute your browser, or send music to one set of speakers and everything else to another.
It doesn’t just give you sources; it also lets you apply effects to any audio on your Mac. Boost volume levels, add an equalizer, and even apply Audio Unit plugins. SoundSource also provides fast access to your Mac’s audio devices, so there’s no need to dig around in System Settings when you need to adjust things.
If you have a DisplayPort or HDMI device that fails to offer volume adjustment, SoundSource can help there, too. It gives those devices a proper volume slider, and the Super Volume Keys feature also makes your keyboard volume controls work. And the best part is you don’t have to click through cryptic preferences boxes. With SoundSource you get all of these tools right in your menu bar.
And don’t forget, SoundSource is from Rogue Amoeba, which is the authority for sound processing on your Mac.
Best of all, through the end of February, MacSparky readers can save 20% by purchasing with coupon code SOUNDSPARK20. Learn more and download a free trial on the SoundSource site.
Vitally: A new era for customer success productivity. Get a free pair of AirPods Pro when you book a qualified meeting.
Nom Nom: Healthy, fresh food for dogs formulated by top Board Certified Veterinary Nutritionists. Prepped in our kitchens with free delivery to your door. Get 50% off.
Squarespace: Save 10% off your first purchase of a website or domain using code FOCUSED.
Indeed: Join more than 3.5 million businesses worldwide using Indeed to hire great talent fast.
visionOS has roots in iPadOS, and it shows. You’ll be disappointed if you are looking for a Vision Pro to replace a Mac.
Instead, I’ve focused on ways Vision Pro is superior to the Mac for productivity, like my writing cabin.
Vision Pro is very good at keeping me isolated for focused work. I can already be productive with the device where that focus matters.
We don’t have enough environments to get the most out of that last point.
I found an attached Bluetooth keyboard a big help. I use a connected trackpad much less, but it can also come in handy.
That said, dictation is much better than it used to be, and don’t forget to use it with the Vision Pro.
Fantastical is a stand-out app. Putting up your calendar and making it the size of a wall is pretty damn cool. It works particularly well for the big picture of monthly, quarterly, and yearly planning. I’ve got a massive version of my monthly calendar installed on my ceiling. As I think about next month, I can look up at the ceiling to see what’s on deck.
MindNode Next is also an interesting early entry. It’s a mind-mapping app but also a brainstorming app where you can locate ideas in space.
Idea development (with apps like MindNode) is an excellent use case for Vision Pro. Apple’s Freeform could also serve in this capacity, but it’s not there yet. My experiments continue.
If you want to capture a lot of text, try Aiko, an AI-based transcription tool. You just hit the record button, and the app converts the recording to text with the Whisper AI engine. I checked with the developer, who reports all work is done on-device.
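For the curious, the Whisper engine behind Aiko is open source, so you can experiment with the same basic technique yourself. Here’s a minimal Python sketch using the openai-whisper package; it only illustrates the engine’s API (it is not Aiko’s actual code), and the file name recording.m4a is a hypothetical example.

```python
# Minimal sketch: transcribing a voice memo with the open-source Whisper model.
# Assumes `pip install openai-whisper` and ffmpeg are available, and that a
# local file named recording.m4a exists (a hypothetical example file).
import whisper

# Smaller models are faster; "medium" or "large" trade speed for accuracy.
model = whisper.load_model("base")

# Transcription runs entirely on the local machine, much like Aiko reports doing.
result = model.transcribe("recording.m4a")
print(result["text"])
```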
The Mac Virtual Display mode acts as an escape hatch, but I don’t see it replacing monitors for extended work sessions. It makes tons of sense for giving a laptop a big display in a hotel room, though, or for moving your desktop Mac’s display to a different room.
We are in early days for the productivity question on Vision Pro. There are still many workflows to be explored and apps to be released.
Now that I’ve logged some serious hours in the Vision Pro, I thought I’d share some thoughts about it. This post focuses on the hardware and interface:
Strapping into the Vision Pro does feel a little bit like getting ready for a spacewalk. I generally charge the battery with it disconnected, which lets me store the hardware (along with a keyboard) in a drawer. When it’s time to go into the device, I put the battery in a pocket and run the cable under my shirt to my neck to avoid getting tangled in things if I go mobile.
For productivity work, a keyboard is necessary. I had an extra keyboard and trackpad. I’ve combined them into one unit using this gizmo from Amazon. Twelve South also makes one that looks a little nicer.
The screens are excellent, and anything rendered by them (apps, icons, environments) is entirely believable. The pass-through cameras, however, are darker and grainier than I expected.
The pre-release discussion of it being too heavy was overblown. I’ve worn it for hours without much trouble.
The Dual Loop Band is more comfortable for me than the Solo Knit Band, but the Solo Knit Band is more straightforward to configure. I use the Solo Knit band for short sessions and the Dual Loop band for longer ones, like watching movies.
The audio on the Vision Pro is much better than I expected. I connected my AirPods earlier today to confirm they work, but I’ve been using the built-in speakers exclusively thus far for everything (including watching movies), and they seem fine to me.
You must train yourself to avoid picking it up by the light seal. It’s a light magnetic connection, and it is easy to drop the device.
Touch targets on iPad apps are too small. The eye tracking works great with native apps but is sometimes tricky with iPad apps.
One of the nice touches: when you grab the handle of a window, it automatically rotates to face where you’re standing in the room. There are so many subtle details in the way it renders windows. The shadows it casts on real-world objects are another of my favorites.
If you’re having trouble with tracking, make the object bigger by stretching it or bringing it closer to you. I kept forgetting about that.
You can rotate a window by rotating your head.
The pinch gesture only works with your palm facing down. I never got it to work with my palm up.
You can long-press the pinch gesture, and you get right-click options. I’d like to know how many other ideas they have for gestures as this product matures.
Strangely, I think I feel things when I touch them: virtual keyboard keys, butterflies, and the like.
I struggle a little bit with app management. There aren’t any options except to go through the alphabetical list.
It seems silly that you can’t set favorites, have a dock, or otherwise arrange your applications beyond the main screen.
With a device so dependent on dictation, there should be an easier way to trigger dictation without resorting to the virtual keyboard.
NetSuite: The leading integrated cloud business software suite. Download NetSuite’s popular KPI Checklist, designed to give you consistently excellent performance.
Indeed: Join more than 3.5 million businesses worldwide using Indeed to hire great talent fast.
This entire post was composed on Apple Vision Pro with dictation and a Bluetooth Apple Keyboard attached…in virtual Yosemite Valley.
One of my interests in the visionOS platform is whether or not I can use it to get work done. Apple thinks so and has made that a big part of the marketing push for this device. However, it is a new platform with a fledgling App Store and many questions surrounding whether it is useful for productive work.
Moreover, the types of workflows that lend themselves to the platform are also in question. Don’t forget the Vision Pro operating system is based on the iPad, not the Mac. It’s easy to strap on this new device, thinking you can turn it into a Mac. (The fact that you can mirror a Mac display makes it even more tempting.) That’s the mistake I made with the iPad, and I spent years drilling empty wells, looking for productivity workflows that would allow me to duplicate Mac workflows. It was only after I accepted the iPad as an iPad that it became productive for me.
I’m not going to make that mistake with the Vision Pro. I’m going into this thing with open eyes and a sense of curiosity for where it can be used to get work done.
This is not a Macintosh. It is something else. And that is where the opportunity lies. While Mac workflows don’t work here in visionOS, are there things in visionOS that don’t work on a Mac? That is where we should be looking.
And for me, that starts with the idea of contextual computing. I have long felt that computers put too much interference between you and your work.
If you want to write an email, you have to open an email application, which shows you a pile of new emails but not a compose window where you can write that email. So many times, you’ll set out to write that important email and never actually find your way to the compose window. If you want to work on your task list, you often have to wade through screens and screens of existing tasks before you can get to the ones you need. Put simply, computers need to put you in the context of the work with as little interference as possible.
Sadly, most modern software doesn’t do that. Instead, it does the exact opposite. This is partly due to bad design and partly because tech companies have figured out ways to monetize your attention. They are intentionally trying to divert you from the work. That’s how they keep the lights on. One of the easiest ways to be more productive on any platform is to find quick ways to get yourself in the context of the work you seek to do with as little interference as possible.
This is where visionOS and Vision Pro come in. It’s a new platform tightly controlled by one of the only big tech companies interested in user privacy. This new visionOS is where you can work if you are smart about it.
I’m still experimenting and figuring out my workflows, but here’s an easy one I’ve been using in visionOS for several days: my context-based writing space.
It starts in Yosemite Valley. Using the visionOS “Environments” space, I have found myself in an immersive rendition of the Yosemite Valley in winter. There’s snow on the ground, but I’m sitting there right now comfortably with just my socks on … which is nice.
The main screen in front of me has Apple Notes, where I’m writing this article. To my left is a floating research pane with Safari in it. That’s it. A little research. A place to write. Yosemite Valley. I’ve written about 3,000 words here in the valley over the last few days, which is very comforting. I’ve got a focus mode on, so I don’t get interrupted, and I genuinely feel alone with my words. That’s important. For this to work, I need to be off the grid. This is my cabin in the woods, where I do my writing.
When I’m not writing, I don’t go to Yosemite to watch a visionOS movie, or check email, or play with some other aspect of visionOS. My brain is already figuring out that Yosemite Valley equals writing. My Mac is far away, back at my studio, along with the cognitive load that comes with the work I do there. That’s all a distant memory here in Yosemite Valley. My brain is successfully duped.
As the context sticks, the work gets easier. This is a form of contextual computing that I’ve never experienced before. I’ve tried it with other headsets, but the poor-quality screens made it unbearable. I expect this writing context will only get easier over time. As the habit sticks and more writing apps and tools start showing up, I’ll consider bringing the better ones with me on future trips to the valley.
When I’m done writing, I leave this place, knowing Yosemite Valley will be there the next time I want to write.
This immersive context is not possible while sitting at a Mac. And for me, it is just the beginning of these explorations. I’m considering building a similar workflow in some other environment for journaling. And I’ve got more ideas after that.
This started simply as a proof-of-concept experiment, but now it’s set for me. I’ll return here the next time I need to do some serious writing. It’s already working: the valley appears, and my brain says, “Okay. Let’s get to it. Let’s start moving that cursor.”
This is a digitally created, distraction-free environment made possible by visionOS. And this is the productivity story for Vision Pro: I’m not looking to replace an existing platform but to find new ways of working that are only possible on the new platform. The valley proves it’s possible. So now I need to see what else it can do. visionOS isn’t at a place where it can become my only operating system. But that doesn’t mean it can’t be an essential tool in helping me get my work done.
This week I welcome a new sponsor to MacSparky: AudiOn, from the makers of Boom3D. It’s an app that lets you record and edit audio easily on iPhone and gives you a lot of great editing options.
I record my voice a lot to capture thoughts, ideas for content, or anything else that comes to mind. AudiOn takes that experience to a whole new level. You can do a lot to bring your recordings to life once they’re captured: add effects like reverb, use the built-in equalizer, blend your voice with music, and edit and merge recordings. I can see AudiOn being used by podcasters, voice actors, or anyone who uses their voice to create content.

What impressed me the most after using the app to record my voice is how clear the recording is. I was also able to use the Voice Isolation feature right within the app. You can, of course, record any audio, not just your voice, and you have access to a nice set of editing tools.

Take your iPhone audio recording to the next level with AudiOn, available for free on the App Store (with optional in-app purchase).
Jeff Richardson returns to Mac Power Users to talk about how AI has changed the way he works with documents, and how technologies like spatial computing and Standby can impact day-to-day workflows.