Private Cloud Compute

I watched the Apple WWDC 2024 keynote again, and one of the sections that went by pretty quickly was the reference to Private Cloud Compute, or PCC. For some of Apple’s AI features, they will need to send your data to the cloud. The keynote wasn’t clear about what factors determine when that’s necessary. Hopefully, they’ll disclose more in the future. Regardless, Apple has built its own server farm using Apple silicon to do that processing. According to Craig Federighi, the servers will use the data, send back a response, and then cryptographically destroy the data after processing.
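
Apple hasn’t published PCC’s internals yet, but “cryptographically destroy” presumably refers to the standard technique of cryptographic erasure: encrypt the data under an ephemeral key and discard the key when processing finishes. Here’s a minimal sketch of that idea in Swift; it’s purely illustrative, not Apple’s actual implementation:

```swift
import CryptoKit
import Foundation

// Illustrative sketch of cryptographic erasure, not Apple's PCC code.
// The request data is encrypted under a one-off key; once the key is
// discarded, every stored copy of the ciphertext becomes unreadable.
func handleRequest(_ request: Data) throws {
    // Generate an ephemeral 256-bit key that lives only in memory.
    let ephemeralKey = SymmetricKey(size: .bits256)

    // Any at-rest copy of the request is stored only in sealed form.
    let sealedCopy = try AES.GCM.seal(request, using: ephemeralKey)
    _ = sealedCopy

    // ... process the request and send the response back ...

    // When this function returns, ephemeralKey goes out of scope and
    // is never persisted. Without it, the sealed data is unreadable
    // for good: that is the "cryptographic destruction."
}
```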

Theoretically, Apple will never be able to know what you did or asked for. This sounds like a tremendous amount of work, and I’m unaware of any other AI provider doing it. It’s also exactly the kind of thing I would like to see Apple do. The entire discussion of PCC was rather short in the keynote, but I expect Apple will disclose more as we get closer to seeing the Apple Intelligence betas.

Hope Springs Eternal for Apple Intelligence

Yesterday, Apple announced its new name for artificial intelligence tools on its platforms: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim made reference to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there.

Official Apple Intelligence text and iPhone image from Apple’s website after the June 10, 2024 announcement.

The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in typical Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service, but to a group of intelligence-related features:

Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations across the Internet and in many apps on our Macs. Apple had to do this.

They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.

Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.
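
For what it’s worth, the “little or no work” claim appears to mean that apps built on the standard system text controls inherit these tools automatically. A minimal SwiftUI sketch, assuming that behavior:

```swift
import SwiftUI

// A minimal sketch. Per Apple's WWDC framing, standard system text
// controls like TextEditor pick up the Writing Tools (proofread,
// rewrite, summarize) automatically on OS versions that support
// Apple Intelligence, with no adoption code from the developer.
struct DraftEditor: View {
    @State private var draft = "Dear team,"

    var body: some View {
        // Nothing Writing Tools-specific here; the system supplies
        // the feature to this standard control on supported devices.
        TextEditor(text: $draft)
            .padding()
    }
}
```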

LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was: would Apple ship something inferior, work with one of the big players, or leave an LLM out of this initiative? We got our answer with the partnership with OpenAI, which incorporates ChatGPT, running on OpenAI’s GPT-4o model, into the operating system.

This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out: the integration doesn’t require an account, and it warns you before sending any data to OpenAI. Again, I think this is a rational implementation.

If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.

Private AI

This was the most important component of Apple Intelligence, and it was underplayed in the keynote. Using the built-in Neural Engine on Apple silicon, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.

That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:

  • “Move this note into my Inactive Projects folder,” requiring access to Apple Notes.
  • “Email this presentation to Zara,” requiring access to Keynote and Apple Mail.
  • “Play the podcast that my wife sent the other day,” requiring access to data in the Podcasts and Messages apps.

While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.
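
Plumbing like this is what the App Intents framework exists for: apps expose their actions and data so the system can act on them. A hypothetical sketch of what exposing a “move note” action might look like (the intent, parameters, and note lookup here are made up for illustration, not a real Apple Notes API):

```swift
import AppIntents

// Hypothetical sketch: commands like "Move this note into my
// Inactive Projects folder" rely on apps exposing actions the
// system can invoke. The names below are illustrative inventions.
struct MoveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Note to Folder"

    @Parameter(title: "Note Title")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folderName: String

    func perform() async throws -> some IntentResult {
        // A real app would find the note in its own store and move
        // it; this placeholder just reports success.
        return .result()
    }
}
```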

“AI for the Rest of Us”

During the WWDC keynote, I only heard the term “Artificial Intelligence” once, at the end, when Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach. I agree with the philosophy.

I’m convinced that Apple has approached AI in a way that makes sense to me, and it’s something I’d like to use. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now, all we have are promises.

Apple’s challenge is the way Siri languished for so long. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t keep up the investment, and Siri never fulfilled its potential.

Looking at the Siri example, I should be skeptical of Apple’s commitment to Apple Intelligence. Yet, I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.

Record any audio on your Mac in seconds, with Audio Hijack (Sponsor)

For over 20 years, Audio Hijack has enabled you to record any audio on your Mac. Capture audio from individual applications like Safari or Zoom, hardware audio devices like microphones and mixers, or even the audio output of the entire system. If you can hear it, you can record it. Whatever you need to do with audio on your Mac, Audio Hijack can help.

  • Record conversations from Zoom, FaceTime, and other VoIP apps
  • Save streaming audio from the web
  • Create podcasts, both remote and in-studio
  • Digitize vinyl
  • And so much more

On macOS 14.4 and newer, Audio Hijack is easier to use than ever before. The new installer-free setup makes getting started a snap, so why not give it a try? Visit the Audio Hijack site to download the free trial.

Audio Hijack’s new installer graphic: a floppy disk with a label showing the Audio Hijack logo and name.

Best of all, through the end of June, MacSparky readers can save 24% by purchasing with coupon code SPARKY24. Act fast!

OmniFocus 4.3

This week, The Omni Group released version 4.3 of OmniFocus with several nice improvements:

  • New Focus Filters: This version introduces device Focus Filters to customize visible app data based on the current Focus mode.
  • Enhanced Automation: A new Favorite Perspective shortcut, plus options to automate which perspective displays using Siri commands or your device’s Focus mode.
  • Apple Watch Improvements: Faster, more reliable syncing with paired iPhones, and automatic sync error resolution.

However, the thing that stands out for me with this release is that it is shipping simultaneously on Mac, iPhone, iPad, Apple Watch, and Vision Pro. I know The Omni Group took heat for the delay in getting to version 4 and rebuilding the app in SwiftUI, but that’s exactly why they can ship new features to all platforms on the same day.
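
To see why, consider that a single SwiftUI view can build for every platform from one source file, with only small conditional tweaks. A generic sketch (not Omni’s code):

```swift
import SwiftUI

// Generic sketch (not The Omni Group's code): one SwiftUI view that
// builds for macOS, iOS, watchOS, and visionOS from a single source
// file, so a new feature can ship on every platform the same day.
struct TaskRow: View {
    let title: String
    @State private var completed = false

    var body: some View {
        Toggle(isOn: $completed) {
            Text(title)
        }
        #if os(watchOS)
        .font(.footnote)  // tighter type for the small screen
        #else
        .font(.body)
        #endif
    }
}
```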

Also, it’s not lost on me that I’m a bit biased.