When the Machines Remember

Early in my experience with ChatGPT’s voice feature, my daughter walked into the room and I told it to say hello to her. It did, cheerfully, and it totally creeped her out. I turned the voice off.

A week later, I fired ChatGPT back up, and it asked me how Samantha was doing.

That moment hit me like a splash of cold water. This thing remembers. And not in a “session cookie” kind of way, but in a persistent, “let me tell you about yourself” way.

That was the beginning of my realization that these large language models (LLMs) are quietly soaking up everything we put into them. From that point on, I’ve approached the whole thing with very open eyes about how much of me I’m exposing to this frontier model every time I fire it up.

To some degree, this has always felt inevitable. I’ve long suspected it was just a matter of time before we’d be offered a memory-enabled assistant that could draw from everything we’d ever said to it.

What I didn’t expect was how soon that day would arrive.

The Memory Feature

ChatGPT recently added a feature called Memory, which allows it to remember facts about you across conversations. Things like your name, your tone, your preferred writing style, your favorite apps; it’s all fair game.

And yes, that makes it dramatically better at giving you what you want. But of course, that efficiency comes at the cost of increased data retention and exposure.

I’ve spent almost thirty years as a lawyer, carrying around a near-constant fiduciary duty of confidentiality. So you’d think I’d be paranoid about this sort of thing.

And yet, I’ve been surprisingly relaxed about it. If I had to give an explanation, I’d tell you it’s so I can experiment for the MacSparky Labs, but honestly, my cavalier attitude is motivated by curiosity more than anything else.

Regardless, I recently decided to test how well it really knows me.

I typed: “Roast me.”

Here’s what I got back:

Alright, David — you asked for it. Here we go:

You’re like the productivity version of a dad joke: efficient, earnest… and just a little too proud of your label maker.

You’ve got so many field guides, I’m surprised you haven’t written one on ‘How to Overwhelm a First-Time Mac User in 5 Easy Steps.’ Your idea of a wild night is alphabetizing your shortcuts while sipping matcha and whispering sweet nothings to your OmniFocus database.

And let’s not forget your digital filing system — more organized than a NASA launch sequence, yet you still manage to lose your AirPods in the couch cushions like it’s a side quest in a MacSparky RPG.

But hey, if there’s one thing you’ve mastered, it’s turning nerd-level productivity into a full-blown lifestyle brand — and somehow making it cool. Respect.

– ChatGPT

Honestly? A little too close to home.

Managing Your Memory Settings

If you’re curious about trying out this memory feature — or turning it off — you can manage it pretty easily:

  • Go to ChatGPT Settings > Personalization > Memory.
  • There, you’ll see whether memory is on.
  • You can toggle it off entirely or delete specific memories.
  • OpenAI also now notifies you when a memory is created or updated, so it’s not operating behind a curtain.

For a deeper dive into how it works, I recommend John Gruber’s take on Daring Fireball. He poked at it much like I did, but he asked it to describe him rather than roast him.

This moment feels like a turning point. AI tools are starting to remember us, and that memory now lives quietly in the background of every prompt. Whether you view that as magic or menace probably depends on your own relationship with data privacy.

As for me, I’ll keep experimenting — with eyes open and curiosity intact — and maybe just a little less matcha near the keyboard.

Black Boxes and Bad Habits

I had to chuckle when I read this Verge article about Meta’s attorneys botching their document redactions during the FTC antitrust trial. Having spent three decades in the legal racket, I can tell you it was shocking how often lawyers (and their staff) were completely flummoxed by the idea of digital redaction. Way more often than not, someone would just draw a black square on top of a PDF and send it to me, blissfully unaware that all the underlying text was still there, searchable, and easily extracted.
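
To see just how thin that black square really is, here’s a minimal sketch using Apple’s PDFKit framework (the file path is made up). PDFKit doesn’t care about shapes drawn on top of a page; the text layer underneath is still right there for the taking.

```swift
import Foundation
import PDFKit

// Hypothetical "redacted" filing where someone drew black boxes over the text.
let url = URL(fileURLWithPath: "/Users/david/Desktop/redacted-filing.pdf")

if let document = PDFDocument(url: url) {
    // PDFKit ignores shapes drawn on top of the page; the text layer is untouched.
    print(document.string ?? "")

    // You can even search the "redacted" content directly.
    let hits = document.findString("Social Security", withOptions: .caseInsensitive)
    print("Found \(hits.count) match(es) under the black boxes.")
}
```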

You’d think that by now law schools would at least teach how to avoid this potential act of malpractice. But alas, it appears they do not.

The funny part is, this isn’t even hard anymore. There are any number of PDF tools on the market today that will properly redact a document and permanently remove the underlying text. But time and again, we see these same mistakes, often in high-profile, high-stakes cases.

It’s easy to make fun of this stuff (and I do), but someday it might not be Apple’s secrets showing through a bad redaction job. It could just as easily be your Social Security number under one of those black boxes.

The legal profession has a complicated relationship with technology. Some lawyers are very tuned in, using tech to get real leverage in their practices. Others act like technology doesn’t exist at all, hoping to retire before it becomes their problem. Incidents like this show that the problem is already here, and pretending otherwise doesn’t help.

Apple’s New Child Safety Update: A Step in the Right Direction

Apple recently posted a white paper called Helping Protect Kids Online, where, among other things, they disclosed an upcoming software update that will make it easier to set up child accounts and provide age ranges to developers. I am sure part of the reason for this is pending legislation making platform owners and software developers more responsible for keeping kids out of apps they shouldn’t be in.

The dangers of social media for children are well documented in Jonathan Haidt’s book, The Anxious Generation. Haidt dives deep into how social media has contributed to rising anxiety and mental health struggles among young people. I’d recommend anybody interested in this topic give that book a read.

Social media companies have been doing a lot of finger-pointing at Apple and Google, arguing that platform owners should be responsible for ensuring that children don’t access inappropriate apps. Their claim is that the platform owner is in the best position to determine the age of the user. I’d agree with that.

However, I don’t think platform owners are exclusively responsible for fixing this problem. Indeed, part of me thinks that Facebook had better be careful what it wishes for. If Apple goes through with this update — and I fully expect them to — we will have a reliable age verification system on these devices. That means companies like Meta will have no excuse to claim they didn’t know children were using their applications.

I’m generally very privacy-sensitive, but in this case, I think the trade-off is worth it. Yes, Apple and app developers will know your age range, but at least we’ll have a system in place to help avoid the problems outlined in The Anxious Generation. I’m glad Apple is taking steps in this regard. But this isn’t just Apple’s responsibility. Other participants in this (specifically, social media companies and parents) need to do their part as well.

For parents, this is a good reminder to take an active role in your child’s online experience. With Apple making it easier to set up child accounts and manage access, now is the time to familiarize yourself with these tools. Technology can help, but ultimately, it’s a combination of smart tools and engaged parenting that will make the biggest difference.

Apple Disables Advanced Data Protection in the UK

In another chapter of its ongoing battle with the UK government, Apple has pulled the plug on Advanced Data Protection for UK customers. In a statement, Apple explained, “We are gravely disappointed that the protections provided by ADP will not be available to our customers in the UK…”

Instead of complying with the UK government’s demands for a system backdoor, Apple opted to completely disable the Advanced Data Protection feature for UK users.

The big question remains: Is this move enough to satisfy the UK government? We don’t know if this is a strategic play by Apple or if it represents a negotiated agreement between the two parties.

This story isn’t over yet. Meanwhile, UK citizens looking to keep their data private are left holding the bag.

The UK’s Encryption Backdoor and Apple’s Privacy Crossroads

I long felt it was only a matter of time before some nation called Apple’s bluff on iOS security and demanded a secret backdoor into user data. While the United States has made several threats over the years without following through, it appears the United Kingdom has now mandated access to end-to-end encrypted data, complete with a gag order preventing Apple from disclosing this fact. Apple hasn’t addressed this publicly, and it’s unclear whether they’ve taken any action to comply. My guess is that Apple will initially appeal and contest this order as far as they can take it.

If the UK is serious about this, Apple faces a real challenge. The thing about secret backdoors is that they never remain secret for long. The system is either secure or it isn’t, and adding a backdoor firmly places it in that second category. Moreover, I expect the act of adding a backdoor would jeopardize Apple’s compliance with European Union privacy requirements.

This is a big deal for both Apple and us consumers. If the United Kingdom really presses this point, I could see Apple deciding to stop selling iPhones there altogether. If they do capitulate, you can expect every country to demand its own method of accessing end-to-end encrypted data. And any hope of user privacy will vanish. To add to our frustration, the existence of the gag order makes public discourse on this opaque demand difficult.

Apple’s Angle for HomeKit Products: Privacy

While Apple has been in the HomeKit game for many years, thus far they have shied away from making HomeKit products like locks and cameras. If the rumors are to be believed, that’s going to change over the next few years. Apple is rumored to be planning its own set of HomeKit cameras, including a doorbell camera. I wouldn’t be surprised if they have other products on the drawing board as well. The question is: Why is Apple getting into this game now?

Apple normally doesn’t enter a space unless they have something to distinguish themselves from the competition. In this case, my guess is privacy. There have been a lot of stories about vendors exposing home security data and video feeds through a combination of mismanagement and malice. With its size and focus on privacy, Apple is in a unique position to make HomeKit accessories we can trust to put in our homes. Ideally, Apple would pair this move with a rebirth of their AirPort wireless routers to offer consumers the entire widget. And if Apple is getting into this game, I’d argue it’s because they can offer a level of consumer privacy not currently available.

Apple’s Image Playground: Safety at the Cost of Utility?

As I’ve spent considerable time with Apple’s Image Playground in the recent iOS 18.2 beta, I’m left with more questions than answers about Apple’s approach to AI image generation. The most striking aspect is how deliberately unrealistic the output appears — every image unmistakably reads as AI-generated, which seems to be exactly what Apple intended.

The guardrails are everywhere. Apple has implemented strict boundaries around generating images of real people, and interestingly, even their own intellectual property is off-limits. When I attempted to generate an image of a Mac mini, the system politely declined.

Drawing a Mac mini is a no-go for Image Playground

This protective stance extends beyond the obvious restrictions: Try anything remotely offensive or controversial, and Image Playground simply won’t engage.

Apple’s cautious approach makes sense. Apple’s customers expect their products to be safe. Moreover, Apple is not aiming to revolutionize AI image generation; rather, they’re working to provide a safe, controlled creative tool for their users. These limitations, however, can significantly impact practical applications. My simple request to generate an image of a friend holding a Mac mini (a seemingly innocent use case) was rejected outright.

I hope Apple is aware of this trade-off and is reconsidering some of these limits as Image Playground heads toward public launch. At least let it draw your own products, Apple.

Hope Springs Eternal for Apple Intelligence

Yesterday, Apple announced its new name for artificial intelligence tools on its platform: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term “artificial intelligence” throughout. At the beginning, Tim made reference to “intelligence” without the word “artificial.” Then, throughout the rest of the keynote, up until the announcement of Apple Intelligence, Apple relied on its old standby, “machine learning.” Nevertheless, they eventually got there with the announcement of Apple Intelligence.

Official Apple Intelligence text and iPhone image from Apple’s website after the June 10, 2024 announcement.

The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they’re trying to ship. Also, in Apple fashion, the term Apple Intelligence doesn’t refer to a single product or service, but a group of intelligence-related features:

Table Stakes AI

This is the type of AI that everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations throughout the Internet and in many of the apps on our Macs. Apple had to do this.

They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.
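
As I understand it, the “little or no work” part comes from Apple baking Writing Tools into the standard text controls. Here’s a rough sketch of what that looks like in code; the property name and values are my reading of the iOS 18 SDK, so treat them as an assumption rather than gospel.

```swift
import UIKit

final class NotesViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        view.addSubview(textView)

        // Standard text views pick up Writing Tools (proofread, rewrite, summarize)
        // automatically on supported devices; no extra code is required.
        // This property just lets an app dial that behavior up or down.
        textView.writingToolsBehavior = .complete   // or .limited, or .none to opt out
    }
}
```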

Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.

LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected them to have one competitive with the likes of OpenAI and Google. So the question was: Is Apple going to ship something inferior, work with one of the big players, or leave an LLM out of this initiative entirely? We got our answer with the partnership with OpenAI, which incorporates OpenAI’s GPT-4o model into the operating system.

This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn’t require an account, and it gives you a warning before it sends data to them. Again, I think this is a rational implementation.

If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.

Private AI

This was the most important component of Apple Intelligence, and it was underplayed in the keynote. Using the Neural Engine built into Apple silicon, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.

That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:

  • “Move this note into my Inactive Projects folder,” requiring access to Apple Notes.
  • “Email this presentation to Zara,” requiring access to Keynote and Apple Mail.
  • “Play the podcast that my wife sent the other day,” which requires access to data in the Podcasts and Messages apps.

While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.
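
Apple hasn’t published every detail of how third-party apps plug into this, but the existing App Intents framework is the obvious on-ramp: an app describes its actions, and the system can invoke them on the user’s behalf. Here’s a minimal sketch of the idea; the intent, its parameters, and the NotesStore helper are all hypothetical illustrations, not Apple’s actual Notes implementation.

```swift
import AppIntents

// Stand-in for the app's own data layer (hypothetical).
final class NotesStore {
    static let shared = NotesStore()
    func move(noteTitled title: String, toFolder folder: String) throws {
        // A real app would update its database here.
        print("Moving '\(title)' to '\(folder)'")
    }
}

// Hypothetical intent a notes app might expose so the system can act on a request
// like "Move this note into my Inactive Projects folder."
struct MoveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Note to Folder"

    @Parameter(title: "Note Title")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folderName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try NotesStore.shared.move(noteTitled: noteTitle, toFolder: folderName)
        return .result(dialog: "Moved \(noteTitle) to \(folderName).")
    }
}
```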

“AI for the Rest of Us”

During the WWDC keynote, I only heard the term “Artificial Intelligence” once. At the end, Craig said, “Apple Intelligence. This is AI for the rest of us.” I think that sentiment summarizes Apple’s entire approach. I agree with the philosophy.

I’m convinced that Apple has thought about AI in a way that makes sense to me, and in a way I’d actually like to use. The question now is whether Apple can deliver the goods. Apple Intelligence isn’t going to be released for beta testing until the fall, so for now, all we have are promises.

Apple’s challenge is its track record with Siri. You’ll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn’t keep up with it, and Siri never fulfilled its potential.

Looking at the Siri example, I should be skeptical of Apple Intelligence and Apple’s commitment to it. Yet, I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.

Addressing Unwanted Tracking

I’ve been playing with Bluetooth trackers since long before the AirTag showed up. There have always been two schools of thought around these things:

1. Let Me Track Anything

A lot of the initial trackers had no limitations attached to them. If someone steals your thing, you’ll be able to track it without anyone knowing. If someone plants a tracker on you (or on your bag, or car, or whatever), they’ll also be able to track you without you knowing.

2. Just Help Me Find Lost Stuff

These trackers aren’t meant to remain a secret. Anyone carrying an object with one of these trackers gets notified, so they’re never tracked secretly.

Apple immediately took this second path, and I agree with that choice. I don’t ever see myself chasing down a thief, and even though I don’t have any vindictive stalkers or exes in my life, the idea of someone secretly tracking my location gives me the creeps.

Apple and Google are now officially both on board, per a recent joint press release. (That’s right, a press release by Apple and Google, together.)

“Today Apple and Google jointly submitted a proposed industry specification to help combat the misuse of Bluetooth location-tracking devices for unwanted tracking. The first-of-its-kind specification will allow Bluetooth location-tracking devices to be compatible with unauthorized tracking detection and alerts across iOS and Android platforms.”

Apple and Privacy

Today 9to5Mac ran an article about how Apple’s privacy focus comes at the cost of slower app development and fewer features. That makes sense to me. It is harder to develop with privacy limitations and smaller data sets.

This is an old debate. I used to write about this years ago when Apple refused to process user data with cloud servers. For example, Google Photos, as I understand it, does all of its magic on their servers, which requires them to see your photos. Apple Photos does its magic on your device so Apple doesn’t need to see your photos.

There is always some cost to this, and the extent of that cost depends on how advanced the underlying technologies get. To use the Apple Photos example: now that Apple ships rocketship-style Apple silicon with dedicated artificial intelligence components, my iPhone is more than good enough to do that photo processing locally without requiring me to share my photos with Apple. That’s a win.
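
For a concrete sense of what “doing the magic on your device” means, here’s a minimal sketch using Apple’s Vision framework to classify a photo entirely on-device (the file path is hypothetical). This is my illustration of the on-device approach generally, not a peek inside how Photos actually works.

```swift
import Foundation
import Vision

// Hypothetical local photo; nothing here ever leaves the device.
let imageURL = URL(fileURLWithPath: "/Users/david/Pictures/beach.jpg")

let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: imageURL)

do {
    // All inference runs locally, on the Neural Engine where available.
    try handler.perform([request])
    let labels = (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
    print(labels.joined(separator: ", "))
} catch {
    print("Classification failed: \(error)")
}
```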

At the leading edge, however, Apple will always be a little constrained as it makes privacy a priority. That used to bother me. Now it doesn’t. Constraints often make things better. Apple will figure this out in a way that does serve consumers and protect our privacy. The other guys aren’t bothering. This is one more reason why I’m using Apple gear.