Profile: Apple, Privacy... and AI
Purpose is a Differentiator, particularly as it defines what you will and won't do

This week’s Worldwide Developers Conference (WWDC) was highly anticipated.
How had Apple allowed itself to fall behind in AI? And how would they catch up?
While they didn’t announce some AI-specific gadget or new LLM (or kill off Siri), the version they call Apple Intelligence takes practical advantage of their existing ecosystem, blending devices, apps, and data (images, calendars, location, etc.) into a holistic, integrated experience.
In many ways, it amplifies much of what Apple products already do.
But if that’s it, what took them so long?
Privacy as a Differentiator
As Tim Cook begins announcing Apple Intelligence, he lists the non-negotiables for Apple.
“Private” is the last non-negotiable: “Of course, it has to be private from the ground up”. In fact, the words “Private” or “Privacy” are mentioned ~20 times in the hour it takes to announce Apple Intelligence (another 10 times in the previous hour to announce all of the OS updates).
Privacy has long been a core value for Apple*, but it’s taken on new importance in the last 10 years. In 2014, Tim Cook issued an open letter outlining Apple’s point of view, stating that privacy is a human right:
*NOTE: Apple lists 7 Key Values: Accessibility, Environment, Racial Equity and Justice, Education, Supply Chain Innovation, and Inclusion and Diversity, along with Privacy
"A few years ago, users of Internet services began to realize that when an online service is free, you’re not the customer," said Cook. "You’re the product. But at Apple, we believe a great customer experience shouldn’t come at the expense of your privacy."
In the decade since, ensuring its users’ privacy has remained at the core of Apple’s ecosystem of products and services.
For anyone who buys into that ecosystem, privacy is a feature to sell on. From the bits you see (like passkeys and App Tracking Transparency notifications) to the bits you don’t (data protection on the backend), the free flow of data between all of your devices is backed by the security of knowing you control it all.
This kind of protection only becomes more valuable as we grow more aware of the data we generate and how it may be used against us.
Concerns over AI and Privacy
Lost in all of the buzz about AI’s potential as a creation tool is the fact that the vast majority of its benefit will come from how it acts as an assistant, intuitively helping us organize our lives.
To do this, the AI needs to know a lot about you, which requires a lot of data collection.
For example, Microsoft announced its AI platform Copilot would have a feature called Recall, which would take a screenshot of whatever is on your screen every 5 seconds. This gives you the ability to ‘recall’ anything you’ve seen, even if you haven’t saved it. A helpful feature, but one that comes with a big tradeoff: how much do you trust Microsoft to keep all of that information safe?
Microsoft has quickly walked that feature back (it’s now opt-in rather than opt-out). However, most of us don’t realize how much personal knowledge AI already has on each of us. You’re likely interacting with AI without even knowing it.
Consumers believe their usage of AI platforms sits at 33%, whereas actual usage is 77%.
For any of these LLMs (large language models) to run, they require large amounts of data. And to date, how they acquire that data hasn’t been transparent.
How worried is the public about these risks? While many of us don’t even realize when we’re using AI, concern remains, suggesting what we don’t understand can still scare us:
52% of consumers doubt that AI will protect their private information.
57% of consumers globally agree that AI poses a significant threat to their privacy.
It’s not A.I., it’s Apple Intelligence
Not so subtly, Apple is drawing a big distinction in how its systems and devices will use AI to assist you, while investing heavily in ensuring all of this AI stuff stays private. Meaning the AI keeps what it learns about you from your apps strictly between you and your device.
For most questions, the device will be enough. For more complex questions, Siri will consult a smarter AI model that runs on Apple’s servers. So when you search your photos for pictures of your kids, you’ll know those aren’t being shared externally either.
And for even more complex needs, like “Rewrite this email using a more casual tone,” Siri will use ChatGPT only if you give it permission to.
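The tiered approach described above can be sketched as a simple decision function. This is purely illustrative, not Apple’s actual API: the names (`Tier`, `route_request`) and the numeric complexity score are hypothetical stand-ins for whatever heuristics the real system uses. The point is the ordering: each request goes to the least-exposed tier that can handle it, and a third party is reached only with explicit consent.

```python
# Illustrative sketch of Apple's tiered routing, as described in the keynote.
# All names and thresholds here are hypothetical, for illustration only.
from enum import Enum

class Tier(Enum):
    ON_DEVICE = "on-device model"        # private by default, never leaves the device
    APPLE_SERVER = "Apple server model"  # smarter model, still under Apple's control
    CHATGPT = "ChatGPT"                  # third party, used only with permission

def route_request(complexity: int, user_allows_chatgpt: bool) -> Tier:
    """Pick the least-exposed tier capable of handling the request.

    complexity: a rough difficulty score (0 = trivial, higher = harder);
    the real system's heuristics are of course far more involved.
    """
    if complexity <= 3:
        return Tier.ON_DEVICE
    if complexity <= 7 or not user_allows_chatgpt:
        return Tier.APPLE_SERVER
    # Only the hardest requests reach a third party, and only with consent.
    return Tier.CHATGPT
```

Note the design choice this encodes: without permission, even the hardest requests stop at Apple’s own servers rather than falling through to ChatGPT.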
NOTE: Marques Brownlee does a much better job of explaining how this works; watch here
Summary: The Purpose of Purpose
So what took them so long? This: ensuring your privacy.
To do this, Apple had to build out its own cloud-computing solution for everyone’s individual data, when the more expedient solution would have been to sign a deal with an external partner and lose control of the privacy protections around that data.
It’s this core value telling them what to do and, more importantly, what not to do. They aren’t going to compromise on this core value for the sake of keeping up. As Tim Cook put it in one interview, they’re not trying to be first, but they are trying to be the best.
Is this what ‘saving the world’ looks like from a company like Apple?
Yeah, this is what purpose often looks like: addressing a market need, which can mean saving its customers from the market itself.