Tim Cook on Why It’s Time to Fight the “Data-Industrial Complex”
“They’re manipulating people’s behavior,” the Apple CEO tells GQ in a frank conversation about his hope that Apple’s new privacy initiatives will help solve some of the internet’s scariest problems.
By ZACH BARON, January 28, 2021, in GQ
It has become relatively common for tech observers, and even regular everyday citizens, to warn about the insidious threats posed to society by technology companies run amok. But those warnings rarely come from the top of the industry, and more rarely still from someone as powerful and influential as Apple CEO Tim Cook. On Thursday morning, Cook joined the CPDP (Computers, Privacy and Data Protection) conference in Brussels to give a surprisingly blunt and direct speech decrying the emergence of what he called “the data-industrial complex.”
In what was widely interpreted to be a reference to Facebook and other Apple competitors (none of which he named), Cook described a vast and opaque industry that has arisen around the capture of massive amounts of personal data, often without the knowledge of users, which is then aggregated, monetized, and, at times, used for nefarious ends. This practice, Cook said, “degrades our fundamental right to privacy first, and our social fabric by consequence,” and contributes to an ecosystem full of “rampant disinformation and conspiracy theories juiced by algorithms.” It is a world, as he put it, referencing a now nearly ubiquitous idea in tech criticism, in which you are no longer the customer, but the product.
Cook also highlighted two new Apple features. The first is what the company is calling a “privacy nutrition label”: a section on App Store product pages that explains every app’s privacy practices, including what they do with your data. The second, already more controversial, is App Tracking Transparency, a feature that will require apps to get permission before tracking your data, and which will become mandatory in the very near future. ATT, as Apple calls it, has been hailed by privacy advocates around the world as a welcome step in the effort to shore up individual rights against a massive and sometimes unscrupulous tech industry; it has also been harshly criticized by some of Apple’s competitors, like Facebook, which continues to rely on some degree of tracking to target the advertising it sells. In a December full-page ad in the New York Times, the Washington Post, and elsewhere, Facebook alleged that “these changes will be devastating to small businesses” who depend on tracking-based advertising to build their brands and sell their products. (Needless to say, Apple disagrees.)
At the center of all this is Cook, who gave his speech in a suit and tie, but had already replaced them with a cozy-looking vest when we spoke, less than an hour later. Over video chat, Cook is mild-mannered and earnest; he began and ended our conversation by throwing up the peace sign, in greeting and farewell. But he’d also begun the day by castigating, in harsh and unsparing language, unnamed “purveyors of fake news and peddlers of division” and lamenting the loss of “the freedom to be human” — not exactly stock or safe CEO speechifying. “A social dilemma,” he said, “cannot be allowed to become a social catastrophe.” During our conversation, we talked about the principles and trade-offs behind Apple’s push for greater privacy for its users, and just how dire the situation is around data harvesting and its effect on our social fabric; we also talked about whether Tim Cook is as addicted to his iPhone as the rest of us are to ours.
GQ: I thought we could start with your speech this morning: I was struck by the vehemence of it. You talked about “rampant disinformation and conspiracy theories juiced by algorithms.” You talked about the degradation of our very social fabric. This isn’t normally the kind of thing you hear from the CEO of a major company. I’m curious what brought you to this point, to speak with that kind of urgency?
Tim Cook: Yeah, I’ve probably never been a normal CEO. That’s probably good to point out from the word “go.” I feel that way, Zach. I feel very much that we’re in a situation today where the internet’s become too dark a place. It can be so empowering. And yet what has happened is the tracking sort of without our consent — I don’t mind tracking with consent — but I think too many people are just tracking [without our consent], and people either are not aware of it, or they’re not aware of the extent of it. And so what we’re trying to do is sort of bring it back to people, and give people the power, give people the choice. Because you can see what happens when that’s not the case.
I’m curious about exactly that: what happens when we don’t get to choose what happens to our data. Obviously Apple talks a lot about privacy as a value, but what exactly do we have to be worried about? Is there an example that you could give of the kind of alarming stuff you are seeing as a result of what you call the data industrial complex?
Here’s one that I think is not well understood, but that your readers might be interested in: Think for a moment about what would happen if you all of a sudden found out that you’re being surveilled every moment of the day. Your online life is being surveilled. And if somebody develops a 360-degree view of that, what is going to happen to your behavior over time? You’re going to restrict it. You’re going to begin thinking, Well, I don’t really want somebody to know that I’m exploring that, or looking at that, or investigating that. And you’re going to restrict and restrict and restrict. And who wants to be in that world where we’re self-censoring in such a mass way across society that you wind up with people that are thinking less, feeling less, doing less? This is not an environment any of us wants to be a part of. And I worry that that’s where we’re currently headed.
This issue, privacy, has at times seemed very personal to you — is that in fact the case?
Well, it’s personal for Apple in that we’ve been focused on it from the start of the company. We could see that this digital footprint piece could be abused in a way. We weren’t sure exactly how, but we knew that it would not be good. And unfortunately, that has played out in so many areas. You don’t have to look too far to find some recent examples, both in things that are in the news and not in the news. And so, this is not about Tim. This is about Apple and Apple really giving the user, empowering the user to make a choice. Apple’s always been about democratizing things, you know, democratizing technology — you used to have to be a gazillionaire to make a movie. Now you can make a movie on your iPhone. We love that. We love democratizing things like this, and we love democratizing the data down to the individual, where the individual is deciding whether they choose to share it or not.
I wonder if you could make concrete something you just alluded to, in terms of examples in the news of bad consequences stemming from the abuse of digital data. When you say that, is there something specific that you have in mind?
Well, you know, the tools that are being used to develop a 360 degree view of people’s lives are the same tools that are used to target them to form extremist groups, or to organize to ransack the Capitol, or whatever the situation may be. And these are not separate kinds of things. They’re the same thing. They’re manipulating people’s behavior.
There’s been a deep focus on how other tech companies, which provide their services for free, become places where the user is the product, right? That’s a phrase you’ve used before. And they use all this information to build algorithms that addict people, or radicalize them, as you allude to. How much of these new efforts on Apple’s part are an attempt to differentiate you guys from other companies who share the tech space with you?
It’s not about differentiation, Zach, it’s about the user. We are very user-focused. The whole company revolves around the user experience. And one of the key elements of the user experience these days is about privacy. And so, if you were to go back and sort of graph the last 10 years [of Apple], or more than that really, you would see a continual level of tools that are being shipped to help people protect their privacy. A few years ago, we shipped intelligent tracking prevention on Safari. This year, we’re talking about nutrition labels, our privacy nutrition labels and ATT. But each year we’re doing something. And so, just like we have a long term roadmap for iPhones and iPads, et cetera, we try to make a contribution on privacy every year as well.
You mentioned the user, and as a user myself, I think the iPhone is a magical device. I live through it daily. But also increasingly, in part because of some of the stuff that you’re referencing — occasionally it’s like holding a Medusa in my hand. There are all these apps that want to addict me with algorithms, or track me, or whatever. I wonder: do you ever feel that way? And do you ever worry that as a result of some of the stuff you’re talking about, people are developing those kinds of ambivalent feelings around your products, even though it’s not your apps that are doing it?
Yeah. I hope people can see sort of what we’re bringing to the table there. When we learned that people were worried about the amount of time they were spending on their devices, we created Screen Time. When parents were worried about their kids, about what they were looking at, we invented or created a lot of parental controls. And so we’re always trying to be either one step ahead, or, if we see something develop in a bad way, we’re trying to quickly figure out what we can do to contribute to the fix of it. We can’t fix everything. But we can do a lot. We can contribute in a meaningful way. And that’s what we’re trying to do.
I know you’re talking to me as the CEO of Apple, but I’m just so curious: Do you ever feel that way about your own phone? Like: Man, there’s a lot going on here, and I have a complicated relationship to it.
Sometimes I feel like I get interrupted too much. And so I’ve used Screen Time in a way for me to sort of strip out a lot of the notifications that I was getting. I asked myself, “Do I really need to know this in the moment, every moment?” And I wound up plummeting the number of notifications that I was getting. It’s a great tool. I don’t know if you’re using it or not, but I would encourage you to use it. Because what I found was that I thought I was getting a lot less than I was getting. And all of a sudden you see what the facts are and you go, “Well, why do I need all of this?”
You’re surely familiar with the critique of some of the stuff that you guys are doing now, which is that the sites and companies that provide a free internet to the rest of us do so basically by selling ads, and that these changes by Apple are going to threaten the business of all kinds of ad-supported sites and publishers. What’s Apple’s response to that?
Well, I think ads are great. And ads have existed a long time, and they’ve existed in times without this sort of invasive targeting. But we’re not trying to get anybody to change their business model. We have no objective to do that. All we’re trying to do is give the individual the right to say, “I want to be tracked,” or “I don’t.” That’s all we’re trying to do. And then with the privacy nutrition label, we’re just trying to give the user more facts so they can make an educated decision of whether they want to download this app or not. It’s not aimed at anyone. It’s about giving the user more power.
A few weeks ago, when you guys suspended Parler from the App Store, you went on Fox News and defended the decision. But the Fox News point to you was basically: “Well, why should you get to decide?” I imagine people will ask a similar question here. Why do you get to decide what does or doesn’t happen to people’s data?
Well, we don’t. I don’t want to decide, to be clear. Because I think that you and I may make a different choice. And what we want to do is supply people the tools so that they can make the decision themselves. That’s opposed to the environment right now, where companies are deciding. A company should not decide about whether they’re going to vacuum up data or not. It should be a conscious decision by you or I, whether our data, and what data, can be vacuumed and how it can be used.
Speaking about making the decision yourself… I live in California, where Apple has partnered with the state to create CA Notify, which is a COVID-19 exposure notification app. And as you were talking, I was thinking that I still haven’t enabled it, precisely because I had some concerns about being tracked. (The app, as Cook is about to point out, does not in fact track you.) How do you guys think about solving problems where one imperative — incredibly important work related to contact tracing — runs, or at least seems to run, into a second imperative, which is privacy?
Well, as it turns out, they’re not mutually exclusive because that is the most privacy-centric implementation that you could imagine. And we’ve sort of designed the architecture in that kind of way. And so it’s only the individual that decides, just like it is with ATT. It’s the individual that decides whether to share data or not. And so I can commit to you that that’s a very privacy-centric app. And I would encourage you to use it. We’ve worked with many, many states on it now, and we’re hoping that it can actually become a national kind of program, instead of a state by state kind of thing.
I’ve seen you say a couple of times that the current state of things regarding data collection and user privacy is unsustainable. I want to make sure I understand what you mean when you say that: What exactly is unsustainable about the current way things are done?
I think it gets back to the user. The current system is not sustainable because the user doesn’t like it. We’re getting a lot of feedback from users that they want the power to make a choice. That it shouldn’t be made for them. And so that’s what we’re supplying, is the tool to do that. I don’t think that the users — and the bulk of the population are users of technology in some way, not just ours, but the sum total of all of the market — I think that you sort of have a user revolt. And the power is with the people. I’ve always believed that. And I still believe that. And so what we’re trying to do is empower them in this way with data.
Obviously a lot of businesses currently run on this super data harvesting-centric model. Anything provided to you for free means that you are effectively the product, and the money comes when access to you and your data are sold to a third party advertiser. What do you think the future is for businesses like that? Do they have a future?
Yeah, I think so. I don’t think people have to change the business model, the business model being that ads support a product. I think that’s very achievable. It’s just that what I think is probably not going to be achievable is doing that without users agreeing to do it, because I think at some point governments are going to step in and regulate that. I mean, just think about it: Would we accept that in any other part of our life? No. It would be unacceptable. And so that’s the sort of the way I look at it. And I’m not sure the exact road. But I’m really confident and really optimistic about where we will head with this. I think we’ll give power back to the user. I think that the ad business can be very robust without these kinds of invasive, 360 degree profiles. And I think some people will say: “You know, I’m okay with giving my 360 degree profile because I like the targeted [ads].” And so people will be in the driver’s seat. Some will do it. Some will not. The ones that will not will have a different kind of ad experience, but still an ad experience that’s robust.
We’re discussing a decision that is located exclusively on Apple devices and the App Store, but is the hope that this will have a domino effect and change behavior more broadly across tech? And if so, walk me into the utopia we might arrive in, if everything happens the way it should happen in terms of people regaining control of their data.
You know, we try to be the ripple in the pond on this. And so it would be great if other people copied this and said, you know, users should have the right, they should have the control. And it would be great if companies would regulate themselves, so to speak, to get there. I’m not sure whether that’s realistic. But I think governments are going to step in and do it if the companies do not. And I’m not sure what the exact path of that happening will be. It’ll be different in each part of the world probably, but I think it will happen because users will demand it. And that utopia is a great utopia because you’ve lined up with all the great things about the web — the empowerment that you get, the knowledge that you gain, the things you can create — without the drain of the things you don’t want to give, like your data.