Apple’s Too Conservative Approach to Text Intelligence

Apple is, understandably, taking a conservative approach to artificial intelligence. Nowhere is this more obvious, or more product-crippling, than in its text intelligence features. I am a fan of using AI for an edit pass on my words. Specifically, I’ve come to rely on Grammarly and its ability to sniff out overused adverbs and other general sloppiness in my writing.

I’ve been around long enough to recall when grammar checkers first started appearing in applications like Microsoft Word. They were useless. It was comical how often their recommendations violated actual grammar rules and made your writing worse. It wasn’t until the arrival of Grammarly that I got back on board with the idea of a grammar checker, and it’s been quite helpful. Note that I’m not using artificial intelligence to write for me; I’m using it to check my work and act as a first-pass editor. The problem I’ve always had with Grammarly is that it sends my words to the cloud whenever I want them checked.

Ideally, I’d like that done privately and locally. That’s why I was so excited about Apple Intelligence and text intelligence. It would presumably all happen on the device or on Apple’s Private Cloud Compute servers. Unfortunately, at least in beta, Apple Intelligence isn’t up to the task. That conservative approach makes Apple’s text intelligence useless to me in this editor role. While Apple’s tools can identify obvious grammatical errors, they fall short in the more nuanced aspects of writing assistance.

A telling example: As a test, I recently gave Apple Intelligence a paragraph where the word “very” appeared in three consecutive sentences — a clear style issue that any modern writing tool would flag. However, Apple’s text intelligence didn’t notice this repetition. That’s very, very, very bad.

This limitation reflects a broader pattern in Apple’s approach to AI tools. While the foundation is solid, the current implementation may be too restrained to compete with existing writing assistance tools that offer more comprehensive feedback on style and readability. The challenge for Apple will be finding the sweet spot between maintaining their caution and delivering genuinely useful writing assistance. I get the big picture here. I know they’re not trying to make a Grammarly competitor, but they need to take several steps away from that conservative baseline if this is going to be useful.

Another problem with the text tools is the implementation of recommended changes. You can have it either replace your text entirely (without any indicator of what exactly was changed) or give you a list of suggested edits, which you must implement manually. Other players in this space, like Grammarly, highlight recommended changes and make it easy to implement or ignore them with a button.

Apple is famous for its ability to create excellent user interfaces, and I suspect they could build something similar, and probably better, if they put their minds to it. Unfortunately, the current version of the text intelligence tools in Apple Intelligence isn’t even close.

Another Siri?

Mark Gurman’s got another AI/Siri report and it’s a doozy. According to the latest rumors, Apple is cooking up an LLM-powered Siri for iOS 19 and macOS 16.

The idea is that this would be yet another Siri reboot, but this time built on Apple’s own AI models. Think ChatGPT or Google’s Gemini but with that special Apple sauce (and privacy-focused access to your on-device data).

Here’s where I get a bit twitchy, though. Apple has been tight-lipped about the details of its AI strategy, and it’s starting to wear thin. If this massive LLM overhaul is really coming next year, what exactly are we getting with the current “Apple Intelligence” features that are supposed to land this year?

If, after all the WWDC and iPhone release hype, we get through all these betas only to find that Siri is still struggling with basic tasks, and then Apple says, “But wait until next year, we’ve got this whole new system that will finally fix everything!” Well, that will be just a little heartbreaking for me.

Apple’s Image Playground: Safety at the Cost of Utility?

As I’ve spent considerable time with Apple’s Image Playground in the recent iOS 18.2 beta, I’m left with more questions than answers about Apple’s approach to AI image generation. The most striking aspect is how deliberately unrealistic the output appears — every image unmistakably reads as AI-generated, which seems to be exactly what Apple intended.

The guardrails are everywhere. Apple has implemented strict boundaries around generating images of real people, and interestingly, even their own intellectual property is off-limits. When I attempted to generate an image of a Mac mini, the system politely declined.

Drawing a Mac mini is a no-go for Image Playground

This protective stance extends beyond the obvious restrictions: Try anything remotely offensive or controversial, and Image Playground simply won’t engage.

Apple’s cautious approach makes sense; its customers expect Apple products to be safe. Moreover, Apple is not aiming to revolutionize AI image generation; rather, they’re working to provide a safe, controlled creative tool for their users. These limitations, however, can significantly impact practical applications. My simple request to generate an image of a friend holding a Mac mini (a seemingly innocent use case) was rejected outright.

I hope Apple is aware of this trade-off and is reconsidering these limits as Image Playground heads toward public launch. At least let it draw your own products, Apple.

An Automation Golden Age

Did you know I have a newsletter? I post some, but not all, of my newsletter’s content to this blog. Here’s a recent one.

I’ve mentioned several times on my podcasts that we’re experiencing a renaissance in automation, particularly on the Mac. This shift isn’t driven by a single tool but rather by the interoperability of a collection of tools.

AppleScript has been available on the Mac for decades, offering significant automation power to anyone willing to learn it. AppleScript lets users connect applications and work with the same data to accomplish unified tasks. For many, however, learning AppleScript was a challenge: programmers found it too different from traditional programming languages, and non-programmers struggled with its syntax. As a result, AppleScript adoption remained relatively limited.

Apple and Sal Soghoian introduced Automator in 2005 to address this, bringing drag-and-drop automation to the Mac. Meanwhile, tools like Keyboard Maestro and Hazel, developed outside of Apple, have been actively filling the gaps in Apple’s automation story for years.

Then came Shortcuts (originally Workflow). Initially developed for iOS, Shortcuts is now firmly embedded in the Mac ecosystem. It’s a spiritual (if not direct) descendant of Automator, and in recent years, these tools have learned to work together. You can run Keyboard Maestro macros from Shortcuts, and Shortcuts can be triggered from within Hazel. Users can now mix and match these tools to create robust automation chains, combining the strengths of each.

For those willing to invest the time to master—or at least gain a working knowledge of—these tools, few tasks on the Mac can’t be automated today.

The next big shift in this process is the integration of artificial intelligence. AI is already proving useful for generating automations, but if Apple Intelligence can fully tap into user data while still protecting user privacy, and if Apple ties it into Shortcuts, we could see a new era of powerful, personalized automation. This leap could be as significant as the jump from AppleScript to Automator. Of course, this depends on Apple getting both Apple Intelligence and the integration right, but I suspect this is already on the big whiteboard in Cupertino.

Shortcuts and Apple Intelligence both use the Intents system to work their magic. Developers who build for Shortcuts benefit from Apple Intelligence and vice versa. With this common architecture, I believe Apple will eventually tighten the connections between Shortcuts and Apple Intelligence. It won’t happen overnight, but over the coming years, I expect this combination to become the next frontier of automation in the Apple ecosystem.
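To make that common architecture a little more concrete, here is a minimal sketch of an App Intents action in Swift. The intent, its parameters, and the filing logic are all hypothetical and for illustration only, but this is the general shape of the code that exposes an app’s actions to Shortcuts and, by extension, to whatever Apple Intelligence integration follows.

```swift
import AppIntents

// A hypothetical intent: file a note into a named folder.
// Declaring it with App Intents makes it appear as a Shortcuts action,
// and the same metadata is what an assistant layer would need to
// invoke it on your behalf.
struct FileNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "File Note"
    static var description = IntentDescription("Moves a note into the chosen folder.")

    @Parameter(title: "Note Title")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folder: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own filing logic would live here.
        return .result(dialog: "Filed \(noteTitle) into \(folder).")
    }
}
```

In Shortcuts, that surfaces as an action with two fields the user can fill in or pass variables to; no extra work is needed to make the app scriptable.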

Apple Intelligence Summarization

Because I’m a little nuts, I’m running the Apple Intelligence betas on all my devices. In my opinion, Apple Intelligence still has a ways to go, and these are early days, but we’re starting to get a peek at what Apple’s been working on. The more powerful Apple Intelligence features haven’t entered beta yet, but there are already some things in this first tier of on-device features that I’m genuinely enjoying.

One of them is notification summaries. In earlier betas, they applied only to the Messages and Apple Mail apps, but now you can extend them to third-party applications. (You can control this in the Notification settings on a per-app basis.) This means that notifications on your phone are now summarized by Apple Intelligence on-device. With less than a week of testing, I already dig it.

The summaries are concise, yet they include more detail than the notifications the apps generate themselves. There are no teasers here. It just tells me what I need. It’s AI-based, so it occasionally gets it wrong, but we’re still in early beta, and I will give it some grace. Also, this is the worst it will ever be, and it’s already pretty good.

The Big Question About Apple Intelligence

Since Apple unveiled its vision for artificial intelligence at WWDC, there have been numerous announcements and rumors about releasing different components of Apple Intelligence. The most recent update suggests that the all-new Siri won’t be available in beta until January of next year, with a general public release in the spring. While in recent years Apple has not hesitated to announce features that won’t ship with the initial release of a new operating system, Apple Intelligence takes it to a new level.

I don’t find fault with the delay, though. It seems Apple would have preferred another year to work on these features, and if the delay is what it takes to get them right, they’re worth waiting for.

The big question about all the Apple Intelligence elements is not “When do we get them?” but “Do they work?” Significant market pressures forced Apple to explain its AI position, and now it has. I generally agree with the thinking behind Apple Intelligence, and if it works as promised, it will be very impressive.

In a few years, people will not remember exactly when the various Apple Intelligence components were released. However, they will remember whether or not they worked.

Beta 2 Updates and iPhone Mirroring

Yesterday, Apple released the second beta of the new operating systems, as announced at WWDC.

There are still no signs of Apple Intelligence. The most interesting addition is the ability to mirror your iPhone to your Mac via Continuity. I’ve just started testing it, and the feature feels solid enough for an early beta. Interestingly, mirroring doesn’t work while the iPhone itself is unlocked and in use, which makes sense.

A Mac desktop with a gray background showing a Drafts window on the left side, and an iPhone visible in the new iPhone Mirroring app available in beta 2 of macOS Sequoia.

There are some limitations. I tried using the Whisper Memos iPhone app to record dictation through the Mac while mirroring, but it didn’t take. Nevertheless, keeping your iPhone on your Mac screen is excellent.

Private Cloud Compute

I watched the Apple WWDC 2024 keynote again, and one of the sections that went by pretty quickly was the reference to Private Cloud Compute, or PCC. For some of its AI features, Apple will need to send your data to the cloud. The explanation wasn’t clear about exactly when that becomes necessary or what factors come into play; hopefully, they disclose more in the future. Regardless, Apple has built its own server farm using Apple silicon to do that processing. According to Craig Federighi, they will use the data, send back a response, and then cryptographically destroy the data after processing.

Theoretically, Apple will never be able to know what you did or asked for. This sounds like a tremendous amount of work, and I’m unaware of any other AI provider doing it. It’s also exactly the kind of thing I would like to see Apple do. The entire discussion of PCC was rather short in the keynote, but I expect Apple will disclose more as we get closer to seeing the Apple Intelligence betas.

Hope Springs Eternal for Apple Intelligence

Yesterday, Apple announced its new name for artificial intelligence tools on its platform: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim made reference to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there with the announcement of Apple Intelligence.

Official Apple Intelligence text and iPhone image from Apple’s website after the June 10, 2024 announcement.

The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service but to a group of intelligence-related features:

Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations across the web and in many of the apps on our Macs. Apple had to do this.

They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.
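As a sense of how little work that is for developers, here’s a minimal sketch. It assumes the writingToolsBehavior property Apple described for iOS 18 text views at WWDC 2024; standard text views reportedly get Writing Tools automatically, and this property just dials the experience up or down. The view controller and its setup are mine, for illustration only.

```swift
import UIKit

// Illustrative only: a bare text view that opts into the full
// Writing Tools experience (assuming the iOS 18 writingToolsBehavior API).
final class DraftViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        if #available(iOS 18.0, *) {
            // .complete allows inline rewrites; .limited keeps suggestions
            // in a panel; .none opts this view out entirely.
            textView.writingToolsBehavior = .complete
        }

        view.addSubview(textView)
    }
}
```

The point is less the specific property than the default: a standard text view gets the system behavior without the developer shipping their own grammar engine.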

Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.

LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was, would Apple ship something inferior, work with one of the big players, or leave an LLM out of this initiative entirely? We got our answer with the partnership with OpenAI, which incorporates OpenAI’s GPT-4o model into the operating system.

This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn’t require an account, and it gives you a warning before it sends data to them. Again, I think this is a rational implementation.

If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.

Private AI

This was the most important component of Apple Intelligence and was underplayed in the keynote. Using the neural engine built into Apple silicon, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.

That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:

“Move this note into my Inactive Projects folder,” which requires access to Apple Notes. “Email this presentation to Zara,” which requires access to Keynote and Apple Mail. “Play the podcast that my wife sent the other day,” which requires access to data in the Podcasts and Messages apps.

While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.

“AI for the Rest of Us”

During the WWDC keynote, I heard the term “artificial intelligence” only once: at the end, when Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach. I agree with the philosophy.

I’m convinced that Apple has thought about AI in a way that makes sense to me, and the result is something I’d like to use. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now, all we have are promises.

Apple’s challenge is the way Siri languished for so long. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t follow through, and Siri never fulfilled its potential.

Looking at the Siri example, I should be skeptical of Apple Intelligence and Apple’s commitment to it. Yet I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.