Election Meltdown: Professor Brendan Nyhan
S1: Hello, Slate Plus members. I’m Dahlia Lithwick, and I cover the courts and the law for Slate. And this is going to be your third exclusive bonus episode of our Hot, Hot, Hot Election Meltdown series. Election law professor and Election Meltdown author Rick Hasen is here again. Hey, Rick. Hello, Dahlia. This week’s show tries to wrangle this huge and tangled and knotty issue of dirty tricks in U.S. elections.
S2: This week, we’re looking at the question of what kinds of underhanded things people could do to try to swing an election, from misinformation, to hacking into voter registration databases, to old-fashioned ballot stealing.
S1: And a key challenge is how we deal with this kind of disinformation this time, because it threw us for a loop last time, and there’s no reason to believe it’s gonna be better. You wanted to draw on Brendan Nyhan’s experience for this one.
S2: So, you know, people say, well, you know, the Russians spent a hundred thousand dollars on Facebook ads, so they must have swung the election to Trump. And it turns out that, you know, it’s very easy to believe that if you don’t like Donald Trump, but the way in which misinformation and disinformation affects elections is a lot more complicated.
S3: Brendan Nyhan, professor of government at Dartmouth, has done a lot of research on fake news and political influence campaigns on social media. He’s done surveys. He’s tried to figure out what exactly disinformation does. And the thing about Brendan’s work is that he tends to push back on the notion that the Russians somehow controlled or dictated the outcome of the 2016 election. Let’s have a listen.
S4: The 2016 election was notable in a number of respects. One was the unprecedented volume of misinformation and the role played by social media. So I want to break that down. The Russian interference effort was unprecedented in how brazen and open it was, and it seemed to have succeeded at gaining widespread distribution via Facebook and Twitter to many Americans. So the Russian bots and trolls and all the different ways that the Russian government effort tried to reach Americans were successful in the sense that millions of Americans had at least some contact with content that was produced by the Russian government or people associated with it. And the goal typically was to divide Americans. In some cases, they promoted misinformation and disinformation. But in other cases, they simply highlighted real stories or real issues that divided Americans or that they thought might polarize us against each other. The question, though, is what effect that effort had. To answer that question, it’s necessary to think about all the different kinds of information that people are exposed to during a campaign. While it’s true that millions of Americans were exposed to content that was created or amplified by Russian bots and trolls, the best of our research suggests that’s a tiny percentage of the information that anyone was exposed to during the campaign. And everything we know about campaign communications suggests that those brief momentary exposures are very unlikely to have a lasting effect on people’s vote choice or decision to turn out. We can’t rule out very elaborate stories about tens of thousands of votes in pivotal states in the upper Midwest, but there’s no convincing evidence that the Russian interference effort changed the election outcome. The effects were probably quite minimal. In some ways, the most powerful effect of the Russian effort was the way it divided our country against itself in the period since.
In that sense, the Russian interference being detected may have been more of a feature than a bug. In other words, it may have served Russia’s interests more to be detected with this relatively low-cost, low-quality, ham-handed operation than to have done something more sophisticated and covert. Being detected created this ongoing controversy that has divided us in the years since and, in part, helped to undermine the legitimacy of the current president.
S5: Where does the Russian misinformation fit into the misinformation war generally in 2016?
S4: So in 2016, the Russian misinformation was one part of a kind of torrent of misinformation that was directed at Americans, and it seems to have made up a very small part. The same is true for the so-called fake news websites that have received so much coverage in the period since the 2016 election. Those untrustworthy websites, which frequently published false or unsupported claims about the candidates and about contemporary politics, were, like the Russian content, a tiny percentage of most people’s information diets. So it seems as though their effects were quite limited. Both represent very worrisome precedents. If we don’t act to address those concerns, they could become much worse. If the social media platforms gave a base for fake news entrepreneurs to do even more in 2020, that would be bad. If Russia scaled up its efforts in 2020, that would be bad. But in terms of 2016, the critical point to remember is that mainstream news was still the primary source of information for the overwhelming majority of Americans. And anyone following the campaign, therefore, was getting most of their misinformation from domestic political actors, from the conventional campaign itself, and most notably from the candidate who went on to win the presidency, Donald Trump, who has engaged in an unprecedented level of misinformation since the very beginning of his campaign. There was almost no way to avoid being exposed to it if you followed the campaign at all.
S5: And how would you say Fox News compares in terms of its influence, compared to the influence of Russia?
S4: These comparisons are very difficult to make without the kind of granular data that’s not totally available to us. I think it’s fair to say that more Americans are regularly exposed to misinformation via Fox than were regularly exposed to misinformation via, say, Russian bots and trolls. That’s not to say that everything on Fox is misinformation, of course, but it does amplify certain kinds of misinformation, particularly in the primetime shows that are the most partisan. And in that way it plays a critical role in the misinformation delivery mechanisms that are most important on the right now.
S6: One of the things that you’ve been saying a lot, and that I’ve heard from Bobby Chesney and Danielle Citron, is that one of the biggest dangers of misinformation is not that people will necessarily believe the lies, but that they’ll start disbelieving the truth. I think they call this the liar’s dividend.
S7: Is there good evidence that this is a problem we should be worried about?
S4: I don’t know of good evidence to support that claim in contemporary American politics, but I’m worried about it. Observers of authoritarian and semi-authoritarian countries have often commented on the way that their information environments are polluted. In politics these days, the fear is less about propaganda that convinces everybody to believe it and more about the way in which it degrades the information environment and causes people to give up on figuring out what is true. That pattern seems to have recurred in enough countries that the historical evidence indicates we should be worried about it happening here. I don’t know, though, that there’s convincing quantitative evidence of that process happening here since 2015 and 2016 in the way that story might suggest. I think it’s more of a possibility than a documented phenomenon.
S5: But it is very worrisome. So in 2017, in the U.S. Senate race in Alabama, we saw some Americans trying to emulate the Russian tactics, particularly Democrats supporting Doug Jones, although Jones himself was not involved. There were these efforts, also ham-handed, to make it look like the Russians were involved.
S7: Do you think those were successful efforts? And how would we know if they were or weren’t? You know, it was a pretty close race, as Alabama Senate races go.
S4: I don’t think we know how successful they were. It’s very difficult to evaluate the success of campaign influence efforts that take place via Facebook because the platform is so closed, and that problem has become worse since 2016. The project was a failure insofar as the people who funded it, and even some of the operatives who carried it out, have since repudiated it. I think that style of campaign has at least been stigmatized. I don’t think we know, however, how effective it really was. And it does highlight the risk that domestic political actors can use some of the same tactics that the Russians have used. And that really complicates some of the questions we’re dealing with here. There’s a very strong consensus that foreign actors have no place in our elections, but the questions become much more complicated when it comes to people who are part of our political process but are stretching the boundaries of conventional politics. And, you know, our legal and regulatory system hasn’t fully caught up with that problem, and I think our kind of media ecosystem hasn’t either. You know, the platforms certainly have been caught off guard again and again by these sorts of incidents, and there’s no reason to think that they will catch the next one in time. You know, you can’t rerun elections, or at least you couldn’t without great damage to the legitimacy of our democracy. So these kinds of last-minute sneak attacks remain a worrisome threat.
S7: So now the big thing that people are talking about for 2020 are deepfakes and other manipulated video and audio. Do you think they present a greater danger, in the sense that we’re seeing things with our own eyes and maybe it’s harder to determine what counts as misinformation?
S4: I’m in the camp that says the threat from deepfakes is so far overhyped. We have seen no successful mainstream deepfake having an important effect on national politics here in the United States, or really anywhere else in the world. Most of the effort in that area is devoted to pornography and isn’t being used in the political sphere. The threat, I think, is more from what have been called cheap fakes, the kinds of low-quality distortions of video and audio that take relatively little skill to create and can therefore be created at a large enough scale that some of them may break through and get traction before they’re detected and addressed. Right. So examples include a video of Nancy Pelosi that was altered to make her look like she was drunk, which spread in the spring of 2019, as well as a video of CNN correspondent Jim Acosta that was manipulated to make him look like he was pushing a White House staff member or intern, which took place in November 2018. So those are two examples where these low-quality video manipulations spread widely.
S6: That didn’t take almost any sophistication at all to create who are the people most likely to be swayed by misinformation? Does.
S5: Does it matter by age or by political party? Does it matter?
S6: By, you know, education? Do we know anything about that? Because there’s so much micro-targeting now, you know, you can figure out who you want to send your messages to. So what what what do we know about those who are emotionally susceptible? What does that tell us about strategies to counteract misinformation?
S4: I think the ideal target audience for political misinformation is your core supporters, the people who have the strongest reason to believe some misinformation about the other side, which is almost always negative. Right. The misinformation that gets the most traction is overwhelmingly about why the other side is bad, lying, evil, malevolent, etc. And your core supporters are going to be most likely to believe that and to help amplify it via their own social networks, word of mouth, and so forth. That means that the ecosystems that committed partisans live in are an important kind of breeding ground for misinformation that may disseminate from those kinds of digital fever swamps out into the mainstream. And it’s important to address misinformation at that stage, before it spreads more widely. And we’ve seen this transmission process again and again as fringe claims move into mainstream discourse, whether it was the birther myth, or claiming that Barack Obama was a Muslim, or the Pizzagate conspiracy theory. Again and again, these kinds of claims have started with a group of fringe hard-core believers and then have been amplified by elites and refracted out into the mainstream.
S5: I want to switch gears now to the other topic I want to talk to you about. We can talk about misinformation, but I want to turn to how people talk about elections.
S6: And one of the things that Donald Trump has done is talked about stolen or rigged elections. But he’s not the only one. And I’m wondering, you know, what do you think are the dangers of people talking about elections not being legitimate? And how do we even know what this idea of election legitimacy is and how people have confidence in election outcomes?
S4: We could have a whole political science seminar on the concept of election legitimacy, but I think for our purposes, it’s useful to at least recognize that this kind of diffuse sense of legitimacy, that the rules of the game are fair and that the election will go to the deserving winner, is a kind of baseline condition for the peaceful transfer of power that is core to democracy itself. Right. The public needs to believe and affirm that the process that they’re going through, if they choose to participate, is going to be undertaken fairly. And if that legitimacy is sufficiently called into question, then the stability of the political system itself can potentially come into question. Now, the United States is quite far away from that, but we’ve seen that pattern in other countries, and it’s why observers worry about attacks on the legitimacy of the U.S. electoral system. Those may seem harmless until they’re not. All right. And again, that’s a pattern that’s occurred in other countries. When a system loses legitimacy, the commitment to that system can come into question quite suddenly. And so it can be a kind of dangerous rhetoric to call the integrity of the system into question. That doesn’t mean that one shouldn’t criticize what’s wrong with the political system, though. So it’s a delicate balance to acknowledge the flaws of our system and how it can be improved while also affirming the joint commitment of everyone participating in the system to its continued operation and to its baseline fairness, however imperfect.
S6: So final question: how worried are you, on, say, a 1 to 10 scale, about the 2020 election going off successfully or unsuccessfully, 10 being the most worried about things breaking down and 1 being not worried?
S4: As a quantitative social scientist, I struggle to score myself on that scale, but I would say I’m about a seven. It depends how close the election result is. If the margin is sufficiently large, I think it’ll be difficult for actors on either side to fundamentally call the result into question. But if we have a Florida-type situation, all bets are off, and then I become much more worried. We’ve had a series of very close presidential elections, including ones in which candidates win who lose the popular vote. And if we’re really on that kind of a knife edge, where the democratic legitimacy of the result is undermined in part by the structure of the American system, in the way that popular vote losers can win, that’s dangerous stuff. And of course, it’s especially important to note that we have a president with unusually authoritarian tendencies in the White House at a time when this kind of a controversy could occur. That’s an especially worrisome situation. In sum, if the election process moves into unprecedented territory, imagine a 2000-type situation with Trump in the White House, it’s easy to imagine lots of ways that process could go off the rails. I don’t think democracy itself is at stake, but the potential for real democratic erosion is not something to be ignored. The risk is there, and it will depend on everyone defending the norms that our democracy depends on at that key moment.
S3: If we’re at a 7, well, that’s really comforting. I mean, if that’s a 7, you know, who knows what a 9 looks like?
S8: Yeah. Now I want what Brendan Nyhan is having, if that’s a seven. Nevertheless, he’s pretty worried, but not super worried, and maybe that’s the best we’re gonna do this week. I’m just gonna have a long, cold drink or seven between now and next week.
S9: That is a wrap for this exclusive Slate Plus edition of Amicus. Thank you so much for listening in. And thanks to Rick for his hard work on the series. If you want to get in touch, our email, as ever, is Amicus at Slate.com. You can always find us at Facebook dot com slash amicus podcast. We worked really hard on this series and we would love your feedback. Today’s show was produced by Sara Burningham.
S10: Gabriel Roth is editorial director of Slate Podcasts, and June Thomas is senior managing producer of Slate Podcasts. We will be back with you with Part 4 of Election Meltdown next week with a big question: when is it ever okay to say that an election was, quote, stolen? We’re going to talk about the rise of incendiary rhetoric with Professor Carol Anderson, Charles Howard Candler Professor of African American Studies at Emory University, whose book One Person, No Vote: How Voter Suppression Is Destroying Our Democracy is, I think, one of the greatest books written on the topic.
S11: And I really commend it to you as a companion to listening to the series and to Rick’s Book election.