As I’ve spent considerable time with Apple’s Image Playground in the recent iOS 18.2 beta, I’m left with more questions than answers about Apple’s approach to AI image generation. The most striking aspect is how deliberately unrealistic the output appears — every image unmistakably reads as AI-generated, which seems to be exactly what Apple intended.
The guardrails are everywhere. Apple has implemented strict boundaries around generating images of real people, and interestingly, even their own intellectual property is off-limits. When I attempted to generate an image of a Mac mini, the system politely declined.
This protective stance extends beyond the obvious restrictions: Try anything remotely offensive or controversial, and Image Playground simply won’t engage.
Apple’s cautious approach makes sense; its customers expect their products to be safe. Moreover, Apple is not aiming to revolutionize AI image generation; rather, they’re working to provide a safe, controlled creative tool for their users. These limitations, however, can significantly impact practical applications. My simple request to generate an image of a friend holding a Mac mini (a seemingly innocent use case) was rejected outright.
I hope Apple is aware of these limits and reconsiders them as Image Playground heads toward public launch. At least let it draw your own products, Apple.
Yesterday, Apple announced its new name for artificial intelligence tools on its platform: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim referred to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there.
The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service, but a group of intelligence-related features:
Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations of these features throughout the Internet and in many apps on our Macs. Apple had to do this.
They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.
Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.
LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was, is Apple going to ship something inferior, work with one of the big players, or not include an LLM as part of this initiative? We got our answer with the partnership with OpenAI, which incorporates OpenAI’s GPT-4o model into the operating system.
This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn’t require an account, and it gives you a warning before it sends data to them. Again, I think this is a rational implementation.
If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.
Private AI
This was the most important component of Apple Intelligence and was underplayed in the keynote. Using the built-in neural engine on Apple silicon combined with Apple Intelligence, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.
That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:
“Move this note into my Inactive Projects folder,” requiring access to Apple Notes.
“Email this presentation to Zara,” requiring access to Keynote and Apple Mail.
“Play the podcast that my wife sent the other day,” requiring access to data in the Podcasts and Messages apps.
While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.
“AI for the Rest of Us”
During the WWDC keynote, I only heard the term “Artificial Intelligence” once. It came at the end, when Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach, and I agree with the philosophy.
I’m convinced that Apple has thought about AI in a way that makes sense to me and that I’d like to use. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now all we have are promises.
Apple’s challenge is the shadow of Siri. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t keep up with it, and Siri never fulfilled its potential.
Looking at the Siri example, I should be skeptical of Apple’s commitment to Apple Intelligence. Yet I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.
I’ve been playing with Bluetooth trackers since long before the AirTag showed up. There have always been two schools of thought around these things:
1. Let Me Track Anything
A lot of the initial trackers had no limitations attached to them. If someone steals your thing, you’ll be able to track it without anyone knowing. But if someone plants a tracker on you (or on your bag, or car, or whatever), they’ll also be able to track you without you knowing.
2. Just Help Me Find Lost Stuff
These trackers aren’t meant to remain a secret. Anyone traveling with an object carrying one of these trackers gets notified, so they are never tracked secretly.
Apple immediately took this second path, which I agree with. I don’t ever see myself chasing down a thief, and even though I don’t have any vindictive stalkers or exes in my life, the idea of someone secretly tracking my location gives me the creeps.
Apple and Google are now officially both on board, per a recent joint press release. (That’s right, a press release by Apple and Google, together.)
“Today Apple and Google jointly submitted a proposed industry specification to help combat the misuse of Bluetooth location-tracking devices for unwanted tracking. The first-of-its-kind specification will allow Bluetooth location-tracking devices to be compatible with unauthorized tracking detection and alerts across iOS and Android platforms.”
Today 9to5Mac ran an article about how Apple’s privacy focus comes at the cost of slower app development and fewer features. That makes sense to me. It is harder to develop with privacy limitations and smaller data sets.
This is an old debate. I used to write about this years ago when Apple refused to process user data with cloud servers. For example, Google Photos, as I understand it, does all of its magic on their servers, which requires them to see your photos. Apple Photos does its magic on your device so Apple doesn’t need to see your photos.
There is always some cost to this, and the extent of that cost depends on how advanced the underlying technologies get. Using the Apple Photos example above: Apple now has rocketship-style Apple silicon with dedicated artificial intelligence components, so my iPhone is more than good enough to do that photo processing locally without requiring me to share my photos with Apple. That’s a win.
At the leading edge, however, Apple will always be a little constrained as it makes privacy a priority. That used to bother me. Now it doesn’t. Constraints often make things better. Apple will figure this out in a way that serves consumers and protects our privacy. The other guys aren’t bothering. This is one more reason why I’m using Apple gear.
There is a story making the rounds today about a secret CIA program for which very few details exist, except for the disclosure that it involves mass surveillance on American soil that included at least some data collection on U.S. citizens. It looks like the Wall Street Journal broke the story, but Fortune has a good summary.
What we do on the Internet has been commoditized for years. If you’ve been paying attention, you shouldn’t be surprised. If advertisers are figuring out when you’re pregnant, don’t you think the government is also taking notes?
At this point, governments (and companies looking to monetize you) are punching holes in the Internet much faster than the folks trying to protect your privacy can patch them. When I was a lawyer, and a client would ask me how to make sure sensitive data was safe “in the cloud”, my stock answer was, “Don’t put it there.” Reading the story about the CIA’s data collection plan is not shocking. It would be surprising if they weren’t doing it. (I expect numerous foreign governments are doing the same things, if not worse.)
Just think about email, for instance. You send an email, and it goes through the Internet pipes to get to your recipient. It has to. No pipes, no email. Clever governments and hackers can snoop in those pipes and capture copies of unencrypted email as it is in transit—we just kind of live with that. If we rewound the clock several decades and discovered that the government was intercepting and making copies of all the mail that arrived in our physical mailboxes, there would have been riots in the streets. Now we just sort of shrug.
All we can do now is try to make smart choices.
Try to deal with companies that have transparent ownership and express an interest in privacy through their actions.
Don’t rely on companies that you suspect will one day need to monetize your data to stay afloat.
If you want to be even more paranoid, don’t trust small start-ups. You never know who will end up buying them and inheriting your data.
Wherever possible, use end-to-end encryption.
Seriously, consider why you’re sending data somewhere else.
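For the technically inclined, the end-to-end idea in that list boils down to this: encrypt locally, keep the key yourself, and any server you use only ever sees ciphertext. Here is a toy Python sketch of the principle using a one-time pad; the `xor_cipher` function is purely illustrative, and real-world use calls for a vetted library, not hand-rolled crypto:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte of the data with a random key byte.
    # With a truly random key as long as the message, only the key
    # holder can recover the plaintext.
    assert len(key) >= len(data), "key must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"notes I never want a cloud provider to read"
key = secrets.token_bytes(len(message))   # generated and kept on your device
ciphertext = xor_cipher(message, key)     # all the server would ever see
assert xor_cipher(ciphertext, key) == message  # XOR is its own inverse
```

The point of the sketch is simply that whoever holds the key holds the data; a cloud provider storing only `ciphertext` has nothing to monetize.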
All that said, I’m not sure how you escape it in the modern world. We live in an age of mass surveillance, whether you realize it or not.
We discuss user data and privacy a lot around here. Here is a Kickstarter project that will actually respect user privacy. Instead of collecting and mining your user data to sell you creepily specific targeted ads, Tim Smith is building Bokeh to be a private, secure, and user-funded social network. For instance, when you post your photos, you get to choose who sees them. Bokeh won’t show who follows you or who you follow. You don’t have to worry about friends of friends seeing your photos. If one of these “friends” has requested to follow you three times and you said no, Bokeh will prompt you to block them.
It’s intended to be a user-funded project. No creepy ad-crawling. I sincerely hope this works.
The contrast Apple is trying to draw is with other Silicon Valley giants whose business model is built on user data (and advertising), namely Facebook and Google.
The question gets interesting when you realize there are tradeoffs. Privacy protects users, but access to mountains of user data helps make better, faster, more responsive cloud services, which also benefits users.
If Apple intends to protect user data, are they going to fall behind on the better/faster end of the equation? Probably. But how much?
Those who follow Apple closely have known about their position on user privacy for years. But lately, Apple is more vocal about their preference to protect user privacy. Nearly every time someone puts a microphone in front of Tim Cook, he raises this point.
When these lines were first drawn years ago, there was a lot more digital ink being spilled on the wisdom of Apple’s position. You don’t hear as much about it lately.
So how is Apple doing? From my experience, Apple is still lagging, but not as much as I feared.
One way to evaluate this is photo search in Apple Photos versus Google Photos. Google pioneered the ability to search the contents of photos with words. They have a massive database of photos to work with, and their algorithms can easily find a “dog” in the “snow” from your library of 42,000 photos. Apple added this feature a few years ago, but the difference is that Apple built its models on purchased photo libraries rather than by looking at all of its users’ photos. Moreover, Apple does the machine learning for these searches not on its cloud servers but on your devices. You too can now find a “dog” in the “snow” with Apple Photos. I am pretty confident the search terms don’t update as quickly in Apple Photos as they do in Google Photos, but that is the cost of that privacy thing.
Photos is just one measure, and I am sure if I thought about it long enough, I could find other examples where the comparison comes out both better and worse. For me, at least, when weighing privacy against cloud services, I would rather err on the side of privacy. So long as Apple’s cloud services are viable, I’m okay if they aren’t the best if in exchange I’m getting a higher degree of privacy.
At first, I tried to quantify it. How close does Apple have to be to Google for me to be happy? 50%? 75%? For me, it is more a question of whether the cloud service is: 1) something I’d use often; and 2) functional. In my case, functionality, even if slower and not quite as good, is good enough. I think Apple gets off easy with my calculus, but everybody gets to set their own threshold, and not everyone is as paranoid as I am when it comes to privacy.
One thing everyone can agree on is that this story isn’t over yet.
For years I was one of those curmudgeons who refused to use Facebook in any capacity. I’ve been turned around on that a little bit because of the success of the Mac Power Users and Free Agents Facebook groups at creating a safe, fun place to talk about shared interests. They are both special communities. Nevertheless, Facebook can be a dangerous place if you care at all about your privacy.
There are a lot of questions about Facebook lately, and I’ve been receiving a lot of email from listeners on the subject. I should preface this post by saying I am hardly a Facebook power user. I log in to participate in the above two groups, but that’s about it.
Nevertheless, even this limited exposure could get me in trouble because Facebook likes to collect data. Between the news of the last few weeks and the recent discovery that they can also collect your call and text history, I decided it was time to spend a little while tuning up my own Facebook settings, and I thought I should share the results with you. So here are a few things you can do today.
1. Delete All Facebook Applications from your Phone (and iPad).
A lot of the trouble arising from Facebook starts with its mobile applications. The problem is that your phone holds a lot of information about you, and Facebook is insatiably hungry for it. Moreover, over the years we’ve had plenty of evidence that Facebook hasn’t been a real team player on the iPhone; they’ve done all sorts of dirty tricks to make sure their app is always front and center. This is creepy, and it also kills your battery faster.
I understand for a lot of people this is asking a lot. Their phone is their primary window into Facebook, and if that is really what you want, I don’t begrudge you. However, if you can live without Facebook on your phone, I think you’re better off. I just use Facebook in the browser on my Mac (or the browser on my iPad), and it’s just fine.
2. Audit your Privacy Settings
One thing Facebook has improved over the years is exposing its privacy settings. Years ago, finding your way to the proper screens felt like playing a video game. Now it’s all combined in your settings screen under the privacy tab. Go through it and make changes to suit your level of comfort. I would recommend erring on the side of caution. You can always go back and make the settings more open if you find that the more conservative settings are getting in the way.
3. Audit your Application Installations
A big part of the recent problems is that the Facebook API is so liberal that apps you authorize can take a lot more information than you may think. I have not authorized any apps to access my Facebook data, and given the limited way in which I use the service, I expect I never will.
You may have some apps that you want to use with Facebook, and that is fine, but make sure it is your conscious decision to opt in. Take a close look at the apps tab in your Facebook settings and make sure you feel comfortable with every app you’ve authorized to access your data.
Note there is also a setting on this screen, Apps Others Use, to edit the amount of information other people’s applications can use when accessing your Facebook data. I recommend tapping the edit button and making appropriate changes. I leave very little data exposed this way.
The U.S. Senate has now voted to remove prior regulations prohibiting Internet Service Providers (ISPs), the folks you pay for your home Internet pipe, from selling your browsing and Internet data to others for fun and profit. This is pretty terrible news if you care at all about your Internet privacy. For a long time now, ISPs have been storing and saving your Internet history data. They know where you go and how long you spend there. This change, assuming it also passes the House and gets signed into law (it will), lets them sell your data.
I hate this.
One of the big arguments by ISPs in favor of this change is that because Google and Facebook are making money from our data, they should get in on the action too. That argument, however, fails. Google and Facebook are services that consumers can use or avoid. Consumers can, in effect, opt out of the madness. That isn’t true of your home Internet connection. Every website you visit and every web service you use is now information available on the open market.
You may be thinking that you don’t do anything particularly nefarious, so it doesn’t matter. I think that is short-sighted. Somebody with a few bucks should not be able to find out that I spend time at certain banking websites, or researching certain medical issues, or even visiting websites about one political belief over another. Future employers, or insurers, or anybody else with a checkbook should not be able to snoop through my browsing records.
This seems to me the kind of thing you’d want to protect no matter where you stand on the political spectrum. Even though the vote went down party lines, I have multiple conservative friends who are up in arms over it.
So what can you do?
1. Complain
I’d encourage you to complain to your congressperson. The House of Representatives hasn’t voted yet, and 5calls.org is a great place to start.
2. Get a VPN
Virtual Private Network services allow you to get on the Internet without your ISP seeing where you are actually going. The VPN company will know, but assuming you use a reputable one, they won’t sell your data. I’ve been using VPNs for years. They’re particularly helpful if you spend a lot of time on the road using Wi-Fi you don’t control. Recently I purchased a one-year subscription from Cloak, and right now I’m feeling pretty good about that. I can turn it on at home any time (or selectively) to hold on to my privacy.
3. Go Elsewhere for your Internet Pipe
For a lot of communities, the options are very limited, but if you have other choices for your Internet service, investigate them. Maybe some providers will make your privacy their selling point.
Before you email me to say I’m being alarmist or to remind me that most of our Internet privacy was already fictional, I understand what you are saying. Nevertheless, I can’t help but feel that on the slippery slope of Internet privacy, we’re about to take a pretty long slide.
Jonathan Zdziarski is a well-respected security and privacy expert. Now he works for Apple. Jonathan’s explanation of why he took the gig pushes all my buttons.
I think Apple is serious when it talks about protecting user privacy, and hiring people like Jonathan shows it. I don’t know if this priority gives Apple much market advantage in a world where most consumers are pretty cavalier about their privacy, but it sure makes me happy to be using Apple products.