Facebook finally agreed, last week, to turn over to Congress ads that appear to be linked to Russia that ran during the 2016 presidential campaign. And Mark Zuckerberg, Facebook’s chief executive, is now talking about protecting the “integrity” of democratic elections. But the last several weeks have been a public relations nightmare for the company, with investigators probing what role it may have played in Russian efforts to “hack” the election, and activists in Myanmar complaining that the social network is censoring posts critical of the country’s government, which is currently perpetrating wide-scale ethnic cleansing. There is also the continuing concern over “fake news” and Facebook’s commitment to monitoring it. Over the weekend, the Washington Post reported that then-President Obama pulled Zuckerberg aside at a 2016 meeting in Peru and begged him to take the threat of fake news more seriously.
To discuss all these subjects and more, I spoke by phone with Zeynep Tufekci, an associate professor at the University of North Carolina–Chapel Hill School of Information and Library Science, a contributing opinion writer at the New York Times, and the author of Twitter and Tear Gas: The Power and Fragility of Networked Protest. During the course of our conversation, which has been edited and condensed for clarity, we discussed why Facebook can’t or won’t fix its problems, why propaganda spreads so easily on the platform, and why 2020 could be even worse.
Isaac Chotiner: How big a deal do you think these targeted ads were?
Zeynep Tufekci: OK, to be honest, I think the fact that there was foreign intervention of this kind is, on principle, a big deal. In reality, I would argue that it was one of the smaller vectors going on. But it should definitely be taken very seriously, because it kind of exposes certain fault lines that I think people have not been paying attention to before: The way that you can peddle conspiracy theories the same way you can peddle shoes, and the fact that the platform is built for virality. The fact that its user interface flattens things so much that the fake Denver Guardian looks the same as the New York Times. The fact that outrageous stuff, or very feel-good stuff, really does well on the platform algorithmically. So there are all these things. There’s the lack of transparency, the power of how the algorithmic programming—which feeds into the business model—promotes certain kinds of discourses and creates certain kinds of public spaces. This is something that I’ve been writing about for a long time.
So in general, I’m happy that [these issues] are getting attention, but it almost doesn’t matter that the Russians, quite likely, jumped on that bandwagon. I’m pretty convinced they jumped on that bandwagon, as one more party, but they’re just one more party, right? The problem is the big structural issue with what the platform easily enables.
I have a colleague at UNC. His name is Daniel Kreiss. His example is how Facebook basically acted, in some ways, as an in-house ad agency for the Trump campaign, which was short-staffed. That’s Facebook’s business model. And I’m not begrudging them their business model. A lot of these things are a natural consequence of the way they want their business, how they make their money, how they automate a lot, and don’t hire too many humans. Humans are expensive, algorithms are cheap. Humans don’t scale, and algorithms scale. Cultural problems don’t scale, and having one set of rules for the whole world scales.
But then you have these consequences. You have a place that’s optimized to make you pliable to ads. Right? It’s a place optimized to sell you certain kinds of messages. Any kind of messages. And by not distinguishing between what those messages are, and also by making considerations that determine how to make us pliable to those ads, Facebook is now also controlling our political information flow, our personal interactions. One set of rules that’s supposed to be good for some things, but it’s used across the board in a very powerful way for politics, for interaction, social interaction. Everybody’s kind of in that space, and I don’t think it’s been good.
We have spent a whole year saying that people just read what they want and ignore what they don’t, and don’t trust what they read. So why should we be worried about Russian propaganda going viral? Why is it so potentially powerful?
The thing is, what doesn’t stick is TV ads. OK? TV ads do not stick. We have ample evidence from scholarship that media effects aren’t very strong. Facebook is a place of socialization. And socialization is the most powerful shaper of humans and their culture. That’s why we have different cultures. We socialize ourselves into different cultures. So whenever we see a true experiment on Facebook, we see enormous real effects.
I find the idea that socialization doesn’t matter, and Facebook is a strong place of socialization, to be ridiculous, because what matters then? Why is there anything called culture if socialization is not a powerful force? The reason Facebook is important, in a way that TV ads are not, is that it’s a place that you interact with your social networks and information from your own information sources, plus pay-for-play stuff. They all mix together in this very flat interface, and that really matters.
You see something for a year, and it’s coming from an environment where your social networks are sharing it. I think it has an effect. Otherwise you’d have to argue that human cultures never change, and humans never change. They clearly do.
So how much can Facebook do to fix this now?
What we see from Facebook as a company is that, on the one hand, they seem to be quiet about the fact that their market capitalization is approaching half a trillion dollars. Like, they’re really good for something. And they’re also like, “Oh, we didn’t do anything.” Either Facebook is a giant con, that half a trillion dollars is a complete con, and they’re not good at the things they claim they’re good at. Or its power of influence really is important. You cannot be that powerful, convince the world and advertisers that you’re that powerful, and have everybody try to use you to influence everyone else, and then, as soon as somebody does use you for exactly what you’re designed for, according to exactly how your business model operates, throw up your hands and say, “Oh, we’re powerless. It’s just the people.”
How do you understand this hesitation or unwillingness? One theory is that this stuff goes against the sort-of ingrained positivity that Silicon Valley prides itself on, and so they are unequipped to deal with downsides.
Right. I’ve argued that for a long time. But I don’t think it’s just optimism. Because I have no problem with being optimistic. I think some of this is just PR, right? I mean, they’re smart people. They either have a very influential platform that’s worth half a trillion dollars or they don’t. This isn’t an optimism-or-not question; this is a very simple point of logic that I don’t think is hard to explain to an engineer, optimist or not.
The problem is that a lot of people in Silicon Valley are really smart at one thing—my geek tribe, right? I love my tribe that way. There are really cool things that they’re really good at. But a big fallacy of a lot of smart people is that they think, they genuinely believe, that because they’re smart in one domain, they’re also equally smart in every other domain. I think that’s kind of the arrogance, the lack of humility, the lack of acceptance that there are a lot of things you’re really not good at. Every time a crisis like this happens, Facebook executives and engineers are like, “How could we have foreseen it?”
I can point them to thousands and thousands of academics. I can point them to first-year grad students who could’ve pointed stuff out to them. Some of this isn’t hard. Some of it isn’t a lack of optimism, or pessimism. It is understanding how the world works. It’s understanding how power works. They’re so homogenous. They’re all smart in such a narrow domain. And also so arrogant that they think their narrow-domain intelligence applies to everything equally under the sun. And they just kind of talk themselves into, “Oh, this is unforeseeable.”
And their business model often depends on them not understanding this. They only react after the fact. So they make their money and they grow by reacting after the fact, and kind of just sort of steamrolling over everything, and they have become a half-trillion-dollar company. Everything we talked about is serving them now. They’re making a lot of money. Their engineers and executives are all in very good financial shape, they have a lot of power, and we don’t really have leverage over them. They’re effectively without competition. They just get occasionally embarrassed, and they kind of react, and then just move on.
How worried are you about this in 2020, and is there anything the government can and should be doing to deal with this as a legislative matter?
I am very worried, and once again, this is the part I want to emphasize. I think the Russian interference getting all this publicity is good in some ways. And I’ve been writing about it for a long time. But it bothers me, in some ways, because what I wish people paid more attention to is what it exposes. To be honest, to the degree there was a Russian operation, it was an amateur operation. It was not some super sophisticated thing. They just used Facebook the way it’s designed. It’s not like some deep understanding of U.S. politics and some very sophisticated spy thing. What happened is the United States has all these big fault lines and issues that are simmering and have been simmering. For me, the bigger point is what this exposes, right? You can just go in with a bunch of stupid memes and join the misinformation campaign that’s already been out there.
But look how pliable Facebook was to become this breeding ground of misinformation, how much its algorithms and business model helped it along, and how big an audience there also was. There are all these political questions, and socio-technical questions, that are being exposed. If the Russians hadn’t done what they had done, and if it’d just been a bunch of political operators in the U.S.—and there were many people doing very similar things—I think it’s just as much of a problem. Of course it’s a problem that another nation-state is intervening in electoral politics. But even if there had been not a single thing that came from that direction—
Can the government do anything about the structural problems?
Of course. We’ve always had issues that needed a combination of government controls, public pressure, company ethics, and, in this case, Facebook’s own workforce, because there are a lot of good people there. There are a lot of good people who are thinking hard about these things. I think these are hard things. I’m sympathetic to how hard these things are.
To be honest, I use Facebook. It’s a great product in many ways. It’s allowed me to do a lot of things. But it has too much power in a way that is unchecked. I don’t want it to become the ministry of truth, either. But it’s such an important issue that—there’s a role for government. There’s a role for public pressure. There’s definitely a role for all the people in Facebook who are trying to take this seriously.
What is the role for government, then?
Find a way to create a situation in which, if I want to get off Facebook because I’m disgusted with some set of [Facebook] policies, it is viable for me to do so without losing access to both personal and civic things that I can, at the moment, only do on Facebook. At the moment, Facebook is effectively without competition, and again, they do some things really well. But there’s no market discipline there. A boycott would be a joke because you’d be cutting yourself off from life.
If there were more effective market competition, and there was more transparency—they got away with not disclosing political ads because the government was like, “Oh, OK. Let’s just let you get away with this,” even though if you do the same thing on a radio station, you’re forced to disclose. You do that on a TV station, you’re forced to disclose. You do this on the most important information conduit of the 21st century, you just get away with it.
I don’t want government to come and try to manipulate the public sphere, either. I want them to help create a situation in which there’s more accountability, more transparency, and more choice, and more sensible things to deal with the complexity of the current public sphere that we’re all dealing with.