Essential: Apple’s Craig Federighi on privacy protection
Apple Senior Vice President of Software Engineering Craig Federighi spoke at the European Data Protection and Privacy Conference on December 8. Here’s what he said:
Complete transcript follows
“I’m here today to talk about our work to protect and strengthen privacy—not only in Apple’s products, but across the technology industry.
This topic could not be more important. Like you, I believe that privacy is a fundamental human right. And all of us, in both government and business, have an obligation to protect that right.
At Apple, this is not just a matter of advocacy in capitals across Europe, nor a matter of compliance, or staying on the right side of regulations.
Our commitment goes deeper than that. It’s built right into everything we create.
And I mean that literally. My team and I build the operating systems that power all of Apple’s products—from the underlying frameworks to the applications to the user interfaces. We’re responsible not only for upholding Apple’s commitments to privacy, but for actually embodying those commitments as code.
Never before has the right to privacy—the right to keep personal data under your own control—been under assault like it is today. As external threats to privacy continue to evolve, our work to counter them must, too.
I’d like to share a few insights into the approaches that Apple engineers have taken to maximize privacy and to give users control of their data. And I want to tell you about the direction we hope to go—as a company, and in partnership with policymakers like you.
Privacy has been a core commitment at Apple since the very beginning.
Apple began in the 1970s as a personal computer company. The word “personal” was not incidental. The computer was yours, and the data was, too—stored, in those days, on floppy disks. Data wasn’t on a server somewhere operated by someone else. It was quite literally in a shoebox next to your desk.
It’s hard to overstate how radical that was relative to the reality before us today.
When I first worked for Apple in the nineties, privacy was not a thing that most tech companies really talked about. People in the industry thought we at Apple had some kind of weird fixation on privacy. And you know… they were right.
Privacy was—and remains—one of our deepest values.
People who work at Apple are attracted not just by our products, but by our values. Our culture is built around creating the kinds of products we want for ourselves, for our friends, and for our families. We want our privacy respected, and we want the same for our customers around the world.
When it comes to privacy, our commitment starts at the top—with our CEO, Tim Cook, our Executive Team, and our Board. And it flows through every part of the company.
We live by four key principles, and they build upon each other:
First and foremost—and this is a familiar idea here in Europe—is data minimization.
The mass centralization of data puts privacy at risk—no matter who’s collecting it and what their intentions might be. So we believe Apple should have as little data about our customers as possible.
Now, others take the opposite approach. They gather, sell, and hoard as much of your personal information as they can. The result is a data-industrial complex, where shadowy actors work to infiltrate the most intimate parts of your life and exploit whatever they can find—whether to sell you something, to radicalize your views, or worse.
That’s unacceptable. And the solution has to start with not collecting the data in the first place.
Second, to avoid the risks of moving data off device, Apple processes as much of your data on your device as possible. Data that stays on your own device is data that you control.
Third, when data is collected by Apple, we make that transparent, and we help you control how it’s used.
And fourth, we see security as the foundation for all of our privacy protections. If your data isn’t secure, it’s not going to stay private. And our unique model of integrated hardware and software is key to enabling these strong protections.
Now I’d like to describe how we put these principles into practice.
At Apple, every product and every feature is assigned a lead from our dedicated team of privacy engineers. From day one of the design process, they work alongside the other engineers as part of the core project team.
Now, I imagine many companies can say something like this. But what makes us different is that Apple’s privacy engineers are not trying to find justifications to collect as much data as possible. Quite the opposite. If we can’t say that we’ve ensured the best outcome for privacy and the user’s experience, we won’t ship that software to our customers. Period.
The creation of iPhone was a critical juncture in the expression of these privacy values. And in designing iPhone, we understood we were creating a more personal device than any that had come before. One that would be with you wherever you go; one that would become integral to how you live, work, and communicate. And one that would constantly have a network connection.
We were stepping into an exciting space—but it was also a space where the stakes were exceptionally high. We knew that we’d have to architect an unprecedented level of privacy protection into this new and even more personal personal computer.
Enabling truly private communication was one of the challenges we faced from the beginning. Our customers were not only using iPhone to record some of the most important experiences in their lives, and to store some of their most sensitive information… They were also sharing some of those things with other people. And they rightly expected that those communications would be private—no less so than a conversation that they might have in person, in a closed room.
The problem is that communication over the internet is rarely so direct as that. When you communicate with another person online, your message doesn’t reach them until it has traveled through a number of intermediaries—from the free Wi-Fi you might use to the internet service providers that are the backbone of the Internet. If the data you send is unprotected, those intermediaries and others are able to listen in on your conversation. And they can exploit what you say for their own purposes.
So we chose to build end-to-end encryption into iPhone’s core communication tools—beginning with FaceTime in 2010, and continuing with iMessage in 2011.
The result is that iPhone users don’t have to worry that their private conversations using iMessage and FaceTime will be intercepted. We’ve designed these features so that bad actors can’t listen to these communications, and neither can anyone at Apple.
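iMessage’s actual protocol is far more involved, but the principle behind end-to-end encryption can be illustrated with a toy Diffie-Hellman key agreement: the two endpoints derive a shared secret, while every intermediary relaying the traffic sees only public values. A deliberately simplified sketch (the parameters here are far too small for real use; this is an illustration of the idea, not a secure implementation):

```python
import secrets

# Toy finite-field Diffie-Hellman. Real systems use vetted groups or
# elliptic curves; this Mersenne prime is for illustration only.
P = 2**127 - 1
G = 5

# Each endpoint keeps a secret that never leaves its device.
alice_secret = secrets.randbelow(P - 2) + 1
bob_secret = secrets.randbelow(P - 2) + 1

# Only these public values ever cross the network, so Wi-Fi hotspots,
# ISPs, and other intermediaries see nothing usable...
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# ...yet both endpoints derive the same key and can encrypt with it.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
```

Because the shared key exists only at the two endpoints, nobody in the middle—including the service operator—can decrypt the conversation.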
For quite a while, Apple was the only major technology company to provide that protection. Others developed their own systems—but they were systems that didn’t have privacy foremost in mind. Some, in fact, were designed specifically to collect your personal information, to track it, store it, compile it, and even monetize it.
End-to-end encryption would counteract all of that. So other companies didn’t offer it.
But over time, their customers saw the protections that Apple customers enjoyed—and demanded those protections for themselves. So now, ten years after Apple launched FaceTime, even the most data-hungry technology companies have started building encryption into their communication products.
For those of us at Apple, that’s deeply gratifying. As Tim Cook has said, we want to be the ripple in the pond that creates larger changes. That means showing customers that it’s absolutely possible to design technology that respects their privacy and protects their personal information. So customers should expect that—and demand it. And that’s what we see happening.
In other words, we don’t define success as standing alone. When it comes to privacy protections, we’re very happy to see our competitors copy our work, or develop innovative privacy features of their own that we can learn from.
At Apple, we are passionate advocates for privacy protections for all users. We love to see people buy our products. But we would also love to see robust competition among companies for the best, the strongest, and the most empowering privacy features.
Of course, communication isn’t the only arena where these protections are necessary. Location is another.
Where you go says a lot about who you are. Like whether you go to a particular place of worship. Or a clinic that specializes in a particular illness.
There is an enormous potential for this kind of data to be misused. And the way some apps are designed, users may have no idea that they’re giving it away.
For that reason, Apple’s view is that users deserve to be more aware and in control of their location data. That’s why we’ve said it’s not enough for an app to ask you once whether it’s OK to track your location—and then, years later, even if you’ve never opened that app again, to continue tracking you in the background.
We’ve added new features so that users can allow access only while the app is running, or just for one session, giving users more granular control. We’ve also added reminders for users when they’ve granted an app background access to their location. It is up to you to decide whether an app has a legitimate reason to know your location, and it’s up to you whether you want that access to continue.
Our customers have been happy to use those controls, and third-party data clearly demonstrates this. The ad-tech firm Location Sciences reports that they’ve seen a 68% reduction in background location data available for ad targeting for iOS users since iOS 13 shipped last year.
This year, we saw another opportunity to empower users. Now, all of us are familiar with apps that ask to track your exact location. Some apps need that level of specificity—to provide turn-by-turn directions, for example. But many others do not.
For instance, apps that provide local recommendations don’t need to know precisely where you are, where you live, or where you work—which, again, can tell them exactly who you are. An approximate location—roughly 20 square kilometers around you—is more than sufficient. The same is true for local news apps, which can help you learn about what’s happening in the city you’re in without knowing the street corner you’re standing on.
That’s why, with iOS 14, we launched a feature called Approximate Location to enable users to only provide apps with… an approximate location. This is data minimization in action—apps that only have what’s necessary, and nothing more.
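Apple hasn’t published the exact coarsening algorithm, but a simple way to picture the idea is snapping precise coordinates to a coarse grid, so an app receives only a region rather than a point. A hypothetical sketch in Python (the function name and cell size are illustrative, not Apple’s implementation):

```python
import math

def approximate_location(lat: float, lon: float, cell_km: float = 4.5) -> tuple:
    """Snap precise coordinates to the center of a coarse grid cell.

    A ~4.5 km cell covers roughly 20 square kilometers, so an app
    learns only the general area the user is in, not the exact point.
    """
    km_per_deg_lat = 111.0                     # ~111 km per degree of latitude
    lat_step = cell_km / km_per_deg_lat
    coarse_lat = (math.floor(lat / lat_step) + 0.5) * lat_step

    # Scale longitude by the snapped latitude so nearby points share a grid.
    km_per_deg_lon = 111.0 * math.cos(math.radians(coarse_lat))
    lon_step = cell_km / km_per_deg_lon
    coarse_lon = (math.floor(lon / lon_step) + 0.5) * lon_step
    return round(coarse_lat, 5), round(coarse_lon, 5)

# Two nearby points in Brussels collapse into the same coarse cell,
# so the app can't tell one street corner from another.
a = approximate_location(50.8466, 4.3528)
b = approximate_location(50.8503, 4.3517)
```

This is data minimization expressed as code: the precise coordinates never leave the device; only the cell center does.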
This is another place where I think we’ve raised the bar for the technology industry. Apple led the way with more granular location controls. And now our competitors are beginning to provide them as well.
Now, this brings me to one of the biggest privacy challenges we all face: tracking.
Apple has long been a leader on this issue—even before we launched iPhone, in fact. When we introduced our browser, Safari, in 2003, we were the first to block third-party cookies.
You are well aware that on other browsers, certain ads will follow you from site to site. What that tells you is that data brokers and advertisers have come up with new ways to track you online. And this was our motivation, three years ago, to introduce Intelligent Tracking Prevention—or ITP for short—to detect and block covert tracking on the web.
ITP uses machine learning on your device to distinguish between sites you’re visiting and sites you’re not. When sites you’re not visiting try to load your data, ITP stops them from doing so. And to give you even more awareness of what’s happening, Safari now gives you a privacy report, showing the tracking activity that ITP has detected.
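Apple’s actual ITP classifier uses on-device machine learning over many signals, but the core distinction it draws can be illustrated with a much simpler rule: compare the site the user is visiting with the destination of each subresource request, and flag cross-site domains that keep reappearing across unrelated pages. A simplified, hypothetical sketch (class name and threshold are invented for illustration):

```python
from urllib.parse import urlparse
from collections import defaultdict

class TrackerDetector:
    """Toy stand-in for ITP's classifier: a domain seen as a third party
    across several unrelated first-party sites is treated as a tracker."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.seen_on = defaultdict(set)  # third-party domain -> first-party sites

    def observe(self, page_url: str, resource_url: str) -> bool:
        """Record one subresource load; return True if it should be blocked."""
        page = urlparse(page_url).hostname
        resource = urlparse(resource_url).hostname
        if resource == page:
            return False                 # first party: the site you're visiting
        self.seen_on[resource].add(page)
        # Block once the domain shows up as a third party on enough sites.
        return len(self.seen_on[resource]) >= self.threshold

d = TrackerDetector()
d.observe("https://news.example", "https://ads.tracker.test/pixel")
d.observe("https://shop.example", "https://ads.tracker.test/pixel")
blocked = d.observe("https://blog.example", "https://ads.tracker.test/pixel")
```

Because the bookkeeping happens entirely on the device, the browser can block cross-site trackers without itself reporting your browsing anywhere.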
When we launched ITP, other companies—the ones that had grown very attached to invasive tracking—said that users didn’t deserve to have these protections. And they claimed that ITP would, quote, “sabotage the economic model of the internet.”
Well, that hasn’t happened. Users have far stronger protections than they did before. And instead of collapsing, these companies have adapted. The ad industry as a whole has posted revenue increases every year since ITP launched—even as users’ privacy is now better protected.
Much like our encryption and location privacy features, ITP has also helped users understand that they can and should expect privacy protections in their web browsers. As a result, the technology industry again is following us down this path.
Of course, there’s more to do on this front. And there always will be.
We’re especially excited about one additional feature that for us represents the front line of user privacy today. You can think of it, in effect, as ITP for apps.
The new feature is called App Tracking Transparency, or ATT.
Its aim is to empower our users to decide when or if they want to allow an app to track them in a way that could be shared across other companies’ apps or websites. To do that, early next year, we’ll begin requiring all apps that want to engage in that kind of tracking to obtain their users’ explicit permission. Developers who fail to meet that standard can have their apps taken down from the App Store.
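On iOS the prompt itself goes through Apple’s AppTrackingTransparency framework (`ATTrackingManager.requestTrackingAuthorization` in Swift); the gating logic it enforces can be sketched language-neutrally: the cross-app advertising identifier is simply unavailable until the user explicitly opts in. A hypothetical sketch in Python (class and method names are invented; only the behavior—an all-zero identifier without consent—mirrors ATT):

```python
from enum import Enum

class TrackingAuthorization(Enum):
    NOT_DETERMINED = "notDetermined"   # user hasn't been asked yet
    DENIED = "denied"
    AUTHORIZED = "authorized"

class AdvertisingIdentifier:
    """Hypothetical gate modeled on ATT: the cross-app identifier is
    released only after the user explicitly grants permission."""

    ZEROED = "00000000-0000-0000-0000-000000000000"

    def __init__(self, real_id: str):
        self._real_id = real_id
        self.status = TrackingAuthorization.NOT_DETERMINED

    def request_authorization(self, user_says_yes: bool) -> None:
        # The user is asked once; afterwards their choice stands.
        if self.status is TrackingAuthorization.NOT_DETERMINED:
            self.status = (TrackingAuthorization.AUTHORIZED if user_says_yes
                           else TrackingAuthorization.DENIED)

    @property
    def value(self) -> str:
        # Without explicit opt-in, apps see only an all-zero identifier.
        if self.status is TrackingAuthorization.AUTHORIZED:
            return self._real_id
        return self.ZEROED

idfa = AdvertisingIdentifier("3F2504E0-4F89-11D3-9A0C-0305E82C3301")
before = idfa.value                      # zeroed: user hasn't opted in
idfa.request_authorization(user_says_yes=False)
after = idfa.value                       # still zeroed: user declined
```

The design choice worth noting is the default: tracking is off unless the user affirmatively turns it on, rather than on until the user hunts down a setting.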
Requiring permission is a big change from the world we live in now.
And because it’s a big change, it has to be made in collaboration with the developers themselves. We want to make sure everyone is able to continue to deliver a rich experience for users.
Of course, some advertisers and tech companies would prefer that ATT were never implemented at all. When invasive tracking is your business model, you tend not to welcome transparency and customer choice.
(You’ll find Federighi speaking at around 49:01 in the conference video.)
Just as with ITP, some in the ad industry are lobbying against these efforts—claiming that ATT will dramatically hurt ad-supported businesses. But we expect that the industry will adapt as it did before—providing effective advertising, but this time without invasive tracking.
Getting this right will take time, collaboration, listening—and true partnership across the entire technology ecosystem. But we believe the result will be transformative.
That goal—to constantly raise the bar for privacy across the technology industry—is one that all of us here share. And it is one that we can only achieve together, in collaboration, as leaders in government and business.
Through GDPR and other policies—many of which have been implemented by Commissioner Jourová, Commissioner Reynders, and others here with us today—Europe has shown the world what a privacy-friendly future could look like.
Indeed, Apple has called for an omnibus privacy law in the U.S. that would mirror the European approach: one that empowers consumers to minimize collection of their data; to know when and why it is being collected; to access, correct, or delete that data; and to know that it is truly secure.
Yet even the most visionary laws are not enough on their own. The principles behind the regulation have to find expression in the technology that companies like Apple create. So as policymakers look at the evolving landscape and decide what steps are essential, we do the same, with the unique tools at our disposal.
Speaking again as an engineer: we are never content. Old solutions become out-of-date pretty quickly; the pace of change is relentless. But so, I think, is the pace of progress. Every day, we’re working to expand the frontier of what’s possible. To deliver great product experiences and great privacy, without compromising either.
Of course, the tools available to engineers and policymakers are very different. But our efforts can inform and reinforce one another—as they must. Together, we achieve results that would be impossible alone.
For those reasons and more, we at Apple need your partnership.
It’s already clear that some companies are going to do everything they can to stop the App Tracking Transparency feature I described earlier—or any innovation like it—and to maintain their unfettered access to people’s data. Some have already begun to make outlandish claims, like saying that ATT—which helps users control when they’re tracked—will somehow lead to greater privacy invasions.
To say that we’re skeptical of those claims would be an understatement. But that won’t stop these companies from making false arguments to get what they want. We need the world to see those arguments for what they are: a brazen attempt to maintain the privacy-invasive status quo.
ATT, we believe, reflects both the spirit and the requirements of the ePrivacy Directive and the planned updates in the draft ePrivacy Regulation. ATT, like ePrivacy, is about giving people the power to make informed choices about what happens to their data. I hope that the lawmakers, regulators, and privacy advocates here today will continue to stand up for strong privacy protections like these.
I also hope that you will strengthen Europe’s support for end-to-end encryption. Apple strongly supported the European Parliament when it proposed a requirement that the ePrivacy Regulation support end-to-end encryption, and we will continue to do so.
You can also count on us to keep doing more to empower our users to control their own data. We’ll keep working to raise the bar on what people expect. And keep challenging the entire technology industry to clear that bar.
In technology, evolution can happen so quickly that five or ten years can seem like a very long time. But when I consider Apple’s work on privacy, I take a much longer view. I try to imagine how the work we’re doing now can impact the future decades from now—even a century from now.
I can’t be sure, but I do have a hope.
I hope that we will be remembered not just for the devices we developed, and what they enabled people to do; but also for helping humanity enjoy the benefits of this great technology… without requiring that they give up their privacy to do it.
It’s in our power today to end that false tradeoff… to build, for the long term, not just a foundation of technology, but a foundation of trust.