Another Siri?

Mark Gurman’s got another AI/Siri report and it’s a doozy. According to the latest rumors, Apple is cooking up an LLM-powered Siri for iOS 19 and macOS 16.

The idea is that this would be yet another Siri reboot, but this time built on Apple’s own AI models. Think ChatGPT or Google’s Gemini but with that special Apple sauce (and privacy-focused access to your on-device data).

Here’s where I get a bit twitchy, though. Apple has been tight-lipped about the details of its AI strategy, and it’s starting to wear thin. If this massive LLM overhaul is really coming next year, what exactly are we getting with the current “Apple Intelligence” features that are supposed to land this year?

If, after all the WWDC and iPhone release hype, we get through all these betas only to find that Siri is still struggling with basic tasks, and Apple then says, “But wait until next year; we’ve got this whole new system that will finally fix everything!”, well, that will be just a little heartbreaking for me.

Siri Concerns

Last week, Ryan Christoffel over at 9to5Mac quoted the latest Mark Gurman report about Apple developing an additional AI personality. Gurman reports that Apple is working on “[…] another human-like interface based on generative AI.” Like Ryan, I am confused by this.

[Image: the official Siri icon in use in 2024]

For too long, Apple let Siri linger. It’s been the butt of jokes in tech circles for years. We’re told that this year will be different and Siri will truly get the brain transplant it deserves. But if so, why is Apple working on an entirely different human-like interface? Does this signal that the Siri update isn’t all it should be?

It’s too early for any of us on the outside to tell. There are some Siri updates in 18.1, but they are largely cosmetic. We’re still waiting for the bigger Siri updates to land in later betas.

However, the idea that Apple is already working on the next thing before they fix the current shipping thing does make me a little nervous. I realize that at this point, we’re all just reading tea leaves, and I could be completely off the mark here, but I sincerely hope that the updates to Siri this year get all of the effort that Apple can muster.

Hope Springs Eternal for Apple Intelligence

Yesterday, Apple announced its new name for artificial intelligence tools on its platforms: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim made reference to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there.

[Image: Apple Intelligence branding and iPhone, from Apple’s website after the June 10, 2024 announcement]

The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in typical Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service but to a group of intelligence-related features:

Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations across the Internet and in many apps on our Macs. Apple had to do this.

They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.
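
It’s worth underlining how little that “little or no work” can be: standard system text controls are expected to pick up Writing Tools automatically. As a minimal sketch, assuming the system-supplied behavior, a stock SwiftUI editor like this should get those tools without any Apple Intelligence-specific code:

```swift
import SwiftUI

// A minimal sketch, assuming the system layers Writing Tools
// (proofread, rewrite, summarize) onto standard text controls
// automatically on supported OS releases.
struct NoteEditor: View {
    @State private var draft = ""

    var body: some View {
        // TextEditor is a stock control, so the system can supply
        // its own Writing Tools UI on top of it; nothing here is
        // Apple Intelligence-specific, and that's the point.
        TextEditor(text: $draft)
            .padding()
    }
}
```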

Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.

LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was: would Apple ship something inferior, work with one of the big players, or leave an LLM out of this initiative entirely? We got our answer with the partnership with OpenAI, which incorporates OpenAI’s GPT-4o model into the operating system.

This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn’t require an account, and it gives you a warning before it sends data to them. Again, I think this is a rational implementation.

If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.

Private AI

This was the most important component of Apple Intelligence, and it was underplayed in the keynote. Using the Neural Engine built into Apple silicon, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.

That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:

- “Move this note into my Inactive Projects folder,” which requires access to Apple Notes.
- “Email this presentation to Zara,” which requires access to Keynote and Apple Mail.
- “Play the podcast that my wife sent the other day,” which requires access to data in the Podcasts and Messages apps.

While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.
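
To make those requests concrete, here’s a minimal sketch of the kind of hook an app would expose so the system can act on your data, using the existing App Intents framework. The intent name, parameters, and dialog below are my own hypothetical illustration, not Apple’s actual schema:

```swift
import AppIntents

// Hypothetical sketch: an action an app could expose so the system
// can perform it on the user's behalf ("Move this note into my
// Inactive Projects folder"). Names and parameters are illustrative.
struct MoveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Note to Folder"

    @Parameter(title: "Note Title")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folder: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would look up the note and move it here;
        // this stub only confirms the action.
        return .result(dialog: "Moved “\(noteTitle)” to \(folder).")
    }
}
```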

“AI for the Rest of Us”

During the WWDC keynote, I only heard the term “artificial intelligence” once: at the end, when Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach. I agree with the philosophy.

I’m convinced that Apple has thought about AI in a way that makes sense to me, and I’d like to use it. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now, all we have are promises.

Apple’s challenge is the way Siri lingered for so long. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t keep up with it, and Siri never fulfilled its potential.

Looking at the Siri example, I should be skeptical of Apple Intelligence and Apple’s commitment to it. Yet I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.

Is AI Apple’s Siri Moonshot?

The Information has an article by Wayne Ma reporting that Apple is spending “millions of dollars a day” on artificial intelligence initiatives. The article is paywalled, but The Verge summarizes it nicely.

Apple has multiple teams working on different AI initiatives throughout the company, including Large Language Models (LLMs), image generation, and multi-modal AI, which can recognize and produce “images or video as well as text”.

The Information article reports that Apple’s Ajax GPT was trained on more than 200 billion parameters and is more potent than GPT-3.5.

I have a few points on this.

First, this should be no surprise.

I’m sure folks will start writing about how Apple is now desperately playing catch-up. However, I’ve seen no evidence that Apple got caught with its pants down on AI. They’ve been working on artificial intelligence for years. Apple’s head of AI, John Giannandrea, came from Google, and he’s been with Apple for years. You’d think people would know by now that just because Apple doesn’t talk about something doesn’t mean they’re not working on it.

Second, this should dovetail into Siri and Apple Automation.

If I were calling the shots at Apple, I’d make the Siri, Shortcuts, and AI teams all share the same workspace in Apple Park. Thus far, AI has been smoke and mirrors for most people. If Apple could implement it in a way that directly impacts our lives, people would notice.

Shortcuts, with its Actions, gives Apple an easy way to pull this off. Example: you leave 20 minutes late for work. When you connect to CarPlay, Siri asks, “I see you are running late for work. Do you want me to text Tom?” That seems doable with AI and Shortcuts. The trick would be for it to self-generate. It shouldn’t require me to already have an “I’m running late” shortcut; it should make one dynamically as needed. As reported by 9to5Mac, Apple wants to incorporate language models to generate automated tasks.
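
As a rough sketch of the building blocks, the existing App Intents framework already lets an app expose an action and an invocation phrase to Siri and Shortcuts; the dynamic part would be the system generating the call on its own. Everything below (the intent, the phrase, the behavior) is my hypothetical illustration, not a shipping Apple feature:

```swift
import AppIntents

// Hypothetical "running late" action exposed to Siri and Shortcuts.
// A future system model would need to discover and invoke actions
// like this one on its own; today, a user triggers it by phrase.
struct SendRunningLateText: AppIntent {
    static var title: LocalizedStringResource = "Send Running-Late Text"

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would hand the message off to Messages;
        // this stub only confirms what would be sent.
        return .result(dialog: "Told \(recipient) you’re running late.")
    }
}

struct LateShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SendRunningLateText(),
            phrases: ["Tell \(.applicationName) I’m running late"],
            shortTitle: "Running Late",
            systemImageName: "clock"
        )
    }
}
```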

Similarly, this technology could result in a massive improvement to Siri if done right. Back in reality, however, Siri still fumbles simple requests routinely. There hasn’t been the kind of improvement that users (myself included) want. Could it be that all this behind-the-scenes AI research is Apple’s ultimate answer on improving Siri? I sure hope so.

Apple is Staffing Up Siri

There’s a lot of news lately about Apple staffing up Siri. First, we heard that they are adding something like 100 additional engineers to the product. Now The New York Times is reporting that Apple hired Google’s former artificial intelligence chief, John Giannandrea, to oversee Apple’s machine learning and artificial intelligence efforts. Reportedly, Giannandrea will report directly to Tim Cook.

Speaking at John Gruber’s Daring Fireball party a few years ago, Apple’s Craig Federighi and Phil Schiller both explained that Apple can still make Siri smart without looking at all of its users’ data the way Google does. I don’t remember the exact example, but they said something like they don’t need to look at your pictures of mountains to teach a computer what a mountain looks like. Nevertheless, Siri does lag behind competing virtual assistants. I found their confidence uplifting because I want both protection for my privacy and a smarter Siri.

It looks like Apple is going to try and make Siri better by increasing engineering while maintaining its position on user privacy. I hope this makes a difference because Google and Amazon certainly aren’t standing still. 

Regardless, don’t expect results immediately. I think Siri improvements will be gradual. It’s similar to the way Apple has improved its cloud services: they’ve come a long way with iCloud over the past few years, but that would be easy to miss if you weren’t paying attention.

Getting the Most from the Siri Watch Face


[Image: the Siri watch face]

I have been using the Siri watch face in watchOS 4 as my primary watch face since it shipped alongside iOS 11. Ordinarily, I am not a digital watch face guy. I grew up looking at analog watches, and I’ve primarily used analog faces on the Apple Watch since it first arrived. Nevertheless, I like the idea of a smart watch face giving me more timely information, so I went all in on the Siri watch face. Also, I spend a lot of time at the sharp end of the stick when it comes to Siri, so I had to give it a try.

The idea behind the Siri watch face is to contextually give users the information most relevant to them at the moment. The face itself shows the time with a few complications and, below that, a scrolling list of information boxes you can move through using the Digital Crown. Tapping on any of these boxes brings you into the source application. Tap on an event, for instance, and you go to the Calendar app.

A lot of Apple applications act as data sources for the Siri watch face. Using it, you can see when the sun will rise, the weather forecast, and upcoming appointments. It runs much deeper than that, however. Data sources can also include reminders, alarms, timers, the stopwatch, your wallet, workouts, meditation/breathing reminders, HomeKit notifications, what’s now playing on your media device, photos, and even news.

For the two complications, I use the one on the right to display the current date and the left one for OmniFocus.



With so many applications feeding data into the Siri watch face, one of the first things I did was customize the sources. If you go into the Apple Watch settings application on your iPhone and tap on your Siri watch face, you get a screen with options to turn these data sources on or off. I left most of them on but turned off photos, because pictures on that tiny screen don’t make sense to me, and news, which I found too much of a distraction.

I have had a few pleasant surprises using the Siri watch face. I like the way it displays upcoming appointments: they are easy to read, and they disappear automatically as the day goes on. Rotating the Digital Crown up shows future Siri-chosen events; spinning the opposite direction brings up prior entries and, if you’ve played audio recently, the last-playing audio. This gives you an easy way to restart a podcast or music from your wrist.

I’ve often been tempted to add the timer and alarm complications to my analog faces, but that complication space is so valuable. With the Siri face, timers, the stopwatch, and alarms appear only when in use, so I get them when I need them and only then. Finally, the now-playing entries are great for getting back into whatever audio you played last.

Overall, the convenience of the Siri watch face is enough to get me to stick with it despite my preference for analog faces. I’m going to keep using it for the foreseeable future. If you are going to use it, take the time to go into the settings application and customize the data sources to your preference. 

My biggest wish for the Siri watch face is to see third-party applications get on that data source list. For instance, why can’t I get upcoming OmniFocus deadlines or Carrot Weather reports? Hopefully, that comes with future iterations.

Siri Today and in the Future

Yesterday, Wired published an article about the most recent improvements to Siri. Several prominent Apple executives participated, including Alex Acero, the Siri lead, and Greg Joswiak.

The focus of the article was the improvement to Siri’s voice in iOS 11. Having used the beta for several months now, I can tell you that Siri is most certainly more expressive than in prior versions. The technology behind it, as explained in the article, is quite fascinating. Rather than using recorded words, Apple uses phonemes, the individual sound components of words, which Siri assembles on the fly to be as expressive as possible.
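
Siri’s own engine isn’t something developers can call directly, but Apple’s public text-to-speech API shows the same shape of the pipeline: text goes in, and the system assembles the synthesized speech. A minimal sketch using AVSpeechSynthesizer, which has shipped for years and is not the new Siri voice:

```swift
import AVFoundation

// A minimal sketch of Apple's public text-to-speech API. The new
// Siri voice described above isn't exposed here; this just shows
// the developer-facing pipeline: text in, synthesized speech out.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Here’s what I found on the web.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = AVSpeechUtteranceDefaultSpeechRate  // system default pacing
synthesizer.speak(utterance)
```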

One issue I would take with the article is that it almost implies Apple is only working on making Siri more expressive, not generally smarter. I’m pretty sure Apple can walk and chew gum at the same time, and in my own experience, Siri has continually improved since it was first released.

An example of this is calendar appointments. Up until about a year ago, scheduling calendar appointments was a syntax-heavy task with Siri. That’s not true anymore. Now there are several ways you can naturally ask Siri to schedule an appointment, and she usually gets it right. The “usually” in that sentence is the problem. “Usually” needs to become “always,” or at least “almost always.” For Siri, the make-or-break moment is the first time a user tries something new. If you try to set a calendar appointment and Siri crashes and burns, you probably won’t try again. To get more users to buy in, Apple needs to keep improving that first experience so users are encouraged to dig deeper.

The Wired article also addresses the different philosophies of Apple and Amazon in developing intelligent assistants. Amazon, with the Echo, is opening things up to third-party developers, making the device work with more services but also requiring users to learn the specific syntax needed to use those newly acquired skills. Apple, on the other hand, wants things to be natural language-based, where users don’t have to use a specific syntax to get work done.

For non-nerd users, natural language seems the only approach. I can’t imagine convincing my wife to memorize the appropriate speaking syntax for every service she wants to use through Siri or Alexa.

I think in the short term, the Amazon approach is easier and moves the ball forward faster. In the long term, the Apple approach could be right if properly executed. If Siri incorporates machine learning and artificial intelligence the way Apple wants it to, it could ultimately leapfrog the syntax-driven approach of its competitors.

The Siri Complaint Department

Walt Mossberg wrote an article over at Recode, “Why does Siri seem so Dumb?” In it, Walt points out several failings.

It seems to me that Apple has wasted its lead with Siri. And now Google, Amazon, Microsoft, Facebook, and others are on the march. Apple has made excited announcements each time it added knowledge domains like sports and movies and restaurants to Siri on the iPhone. But it seems like it hasn’t added any major new topic domains in quite a while.

— Walt Mossberg

I understand that Apple has fixed several of these issues since the article was posted, but that’s actually part of the problem. Why does it take an article by a popular journalist to get these things fixed? I feel as if Siri needs more attention. I don’t think the underlying technology is as bad as most people think, but it is these little failures that cause everyone to lose faith. Siri is a cloud-based service and needs to be upgraded and improved every day. While things are better, the rate of improvement needs to accelerate.