S1: This ad-free podcast is part of your Slate Plus membership.

S2: Welcome to If Then, the show about how technology is changing our lives and our future.

S3: I’m April Glaser, and I’m Will Oremus. Hey, everyone, welcome to If Then. We’re coming to you from Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. We’re recording this on the afternoon of Friday, November 2nd. We’re days away from the midterm elections in the United States. It’s the first major national election cycle since the 2016 presidential election. These elections are, of course, critically important to the country’s future: Democrats have a chance to take back the House, new governors will be elected, and so on. But they’re also important as a test of our election system, and in particular the capacity of major tech platforms to fix the problems of misinformation and foreign interference that so muddled the 2016 elections.


S4: That’s right. So today we’re going to have a special episode of If Then, all about the midterm elections and the role of Silicon Valley, online media, and our beloved democratic process. We’re going to start with a roundtable with two extra tech journalists, as if anybody wanted more tech journalists: Kevin Roose from The New York Times and Paris Martineau of Wired, who have been reporting on issues of online speech, misinformation, and election interference this year.

S5: I’m excited to have both of them here with us in the studio. And then we’ll have an interview with one of the country’s top experts on election security and voting systems: the former White House deputy chief technology officer, Ed Felten. We’ll talk to him about the problems that could rear their heads this cycle, namely with the very outdated tech that many of us are forced to use to cast ballots. Some of the voting machines we rely on are well over a decade old. Some don’t have a paper trail. Some seem to be very vulnerable to hacking.


S6: But here we are. And we’ll end our show with a very special Don’t Close My Tabs edition, where we’re taking a look at the best ways to watch the results as they come in Tuesday night online.

S4: Today, we’re joined by Paris Martineau, a staff writer at Wired covering tech, online extremism, and social media. Welcome, Paris.

S5: Hey. We’re also joined by Kevin Roose, a columnist for Business Day and a writer for The New York Times Magazine. Kevin’s column, The Shift, examines the intersection of technology, business, and culture, which, as far as I can tell, just means you get to cover all the weirdest and most interesting stories in the tech world. Welcome, Kevin. Thank you for having me. Big fan of the podcast.


S7: That’s very nice of you to say. So roundtables are super weird for us, because we usually like to interview people, with people on either side of the net here. But we’ll just have a conversation this time.

S4: And I actually want to start with disinformation, which is a topic I know we’re all really fascinated by. And I think it’d be good to kind of start with some examples to illustrate the types of things we’re talking about, and then we can move into some larger questions.

S8: There’s definitely been some disinformation circulating around social media that has to do with the election. But there’s also been disinformation circulating on social media that doesn’t have to do with the election.


S9: Right. But it has been happening in the proximity of the election, because the elections are happening on Tuesday. And I know that you, Kevin, have been following disinformation about the caravan quite closely. And I’m curious if you could give us some examples of the disinformation you’ve seen and also how it could affect the election. I know I’ve seen it kind of siphoned into some political messaging.


S1: Yeah. So I think we’ve had enough of these events now that take over the news cycle and inspire a wave of disinformation that we kind of know the pattern. So, for example, as you said, this happened with the caravan. The minute that big news stories started breaking on the caravan, you started to see things going viral on social media, and some of them had doctored or mislabeled images. So they would, you know, take an image from a 2012 street protest, with a policeman with his face bloodied from being hit with a rock or something, and they would say, look, the caravan is attacking policemen — even though the images had nothing to do with the caravan. We saw a number of instances like that. We also saw a big wave around the Kavanaugh confirmation hearings. Christine Blasey Ford was sort of turned into this character in right-wing Internet spaces, and a lot of fake and mislabeled images, a lot of viral rumors took hold. You could just go to any large conservative Facebook group or Twitter profile and you would see examples of blatantly false and misleading information. And all this is produced by, from what I can tell, a pretty tight group of people. Sometimes they want to deliberately sow confusion, and sometimes it’s just shared by innocent randos who see it on their Facebook feed, agree with it, and don’t really feel like fact-checking it. So I’ve been doing a lot of that stuff. But as you said, that’s not specifically election-related. It does play into these broader themes, though, that I think do influence how people vote.


S5: Right. Because the idea of a migrant caravan bearing down on us is supposed to scare people into getting out to the polls. I know you’ve also done work on the sources of online misinformation and the amplification of these false stories. What have you found about who’s sharing these? Who’s coming up with them?

S10: Yeah. More often than not, these sorts of stories seem to come from the usual suspects of places, be it 4chan or 8chan, Reddit, Gab before it was shut down. It’ll often start with maybe one person posting a specific meme or making a particular claim, which then ends up being repeated on someone’s YouTube channel or in a Facebook group, which eventually will make it to a larger, more open sphere like a subreddit or Twitter, and then from there it’ll kind of take off. Another good example of this phenomenon is in relation to the recent totally bunk allegations made by Jacob Wohl and co. about Mueller. Gateway Pundit had written an article on these totally false allegations, and you could track, in the minutes and hours afterwards, on CrowdTangle how these things started to spread. At first you saw a bunch of power users on Twitter in the extreme-right sphere pick up the Gateway Pundit article on the allegations, and then it proliferated to more mainstream users on Twitter. Then it went to Reddit and kind of climbed up on popular subreddits like The_Donald.


S11: Right. So these articles get written on, like, you know, Gateway Pundit. And then these kinds of websites that get started just for the purpose of giving Facebook pages somewhere to send traffic will take that Gateway Pundit article, which was premised on something that percolated somewhere in the cobwebby corners of the Internet.


S7: Which is usually where it originated — they see it, like, on Discord, or they see it on the chans somewhere.

S12: And then they just block-quote that Gateway Pundit article, and that becomes the kind of clickbait piece for that kind of click farm. And then we see this stuff proliferate wildly.


S13: One example of how misinformation about the caravan has been siphoned into the campaigns around the upcoming elections: we saw stuff like the Republican Senate candidate in Maine, Eric Brakey — I guess that’s how you say it — spreading the theory that there were ISIS operatives in the refugee caravan. He was tweeting about that. A member of Congress who’s up for re-election in Florida shared a video of the refugee caravan and claimed it might show people getting paid by George Soros. And so I’ve seen a lot of fixation on the caravan, on Nancy Pelosi, on Soros — stuff that’s not necessarily true, and that’s not necessarily about the election either. And to be clear, these people in their House districts should not be running on the caravan. That should not be an election issue. But it’s something they’re pulling into their ads and saying on Twitter.


S1: Well, that’s something that I’ve observed this cycle that I think is new and really interesting. There used to be this adage that all politics is local. You know, if you were running for city council somewhere in Kansas, you wouldn’t campaign on federal health care law, because that’s not your job — you don’t have anything to do with that. But as I’ve been researching and looking into social media this cycle, you see all these fascinating local campaigns running on national issues. I did a big story about Facebook ads and how candidates for small offices, very far down-ballot offices, are advertising on Facebook, and you see people running for school board who are talking about ISIS caravans. You see people who are running for city council and they’re running on the wall, or MS-13, or, you know, Colin Kaepernick, or they’re running on the Republican tax plan. And it just makes no sense logically, right? But it’s sort of what happens when you shove all politics through this algorithmically sorted feed: you have to kind of aim for the broadest issues.


S14: Yeah. The only way that they’re going to be able to get their message seen by the largest number of people is if they speak on these specific divisive topics that everyone is clicking on, that everyone wants to hear about. So the gutting of the local newsroom is part of this, too — there’s just not as much local coverage. But it’s also what these platforms were made to do: the democratizing nature of these platforms gives everyone a voice.

S8: And then, yeah, the only way to kind of get your stuff to percolate up might be to focus on the hot-button issues.

S5: Well, to be fair, I mean, it’s not just the right, right? I mean, like, every Democrat in the country is running against Trump — you’re not running against your Republican opponent. And I don’t think that’s crazy. But I think it does reflect how, when you’re in a country with a charismatic populist leader, that leader just infects everything. You can’t avoid them.

S1: But I think you don’t want to both-sides this too much, because you do see Democrats running on health care, running on local issues in their communities. There’s obviously a lot of anti-Trump stuff that’s going to power people to go to the polls. But it’s really interesting. I mean, I don’t think you would have seen this 10 years ago. I think it’s a real reflection of the times we live in and how people are getting their information.

S15: So why do you think — you’ve been showing on Twitter that some of the pages that have been getting the most clicks on Facebook, that have been popularized the most, are far-right pages, or kind of Republican-leaning — not even far-right, just right-leaning stuff. Why do you think that is more popular on Facebook? I have my own theories, but tell us what you think.

S1: Well, Will is the resident expert on this stuff. But my sort of backseat theory is that it’s a couple of things. One, these far-right websites are better at creating engagement. They’re just better at it — whether it’s finding issues with salient or salacious details, they’re better at playing to people’s, you know, bottom-of-the-brain-stem, raw emotions. But I also think it just reflects Facebook’s user base. I mean, if you look at who is using Facebook the most, their fastest-growing user base is people over 50, and you don’t see that many young people — a lot of them have migrated to Instagram or Snapchat. So the people who are left on Facebook using it very actively, I would guess, although I don’t know this for sure, tend to be older and more conservative.

S5: But then there’s also — I mean, I think part of it is that liberals have the mainstream media, and liberals generally trust the mainstream media. We’re comfortable sharing The New York Times or CNN or Mother Jones or Slate, and conservatives have been trained not to trust, or just don’t trust, the mainstream media. And so there’s this vacuum of information that’s going to cater to their worldview. And I think that’s one reason you see more of the misinformation and the sort of trumped-up false news on the conservative side. But Paris, I wanted to ask you one more question about the way this gets around. You’ve written a little bit about how sometimes it’s bots, but sometimes there are people who act like bots to amplify information, right?


S16: Yeah. I think that the word bot is something that I struggle with — that I struggle with deciding whether or not to use — because it is, yes, a good description of a specific type of action.

S10: But oftentimes we’re talking about large groups that are amplifying information — I’ve particularly studied it in the extreme-right sense. And oftentimes it’s not a bot that is retweeting something 400 times a day in what looks exactly like cyborg-like activity. It’s a human. In one case, I was tracking this network of 400 or 500 accounts that were all retweeting articles, links, and tweets from a specific network. Five or six hundred times a day was the average number of retweets, while they were generally tweeting their own stuff maybe 50 times a day. And as we looked into it more — while it’s easy to call that bot-like activity — the people behind it were mostly just human beings who sat on their phones the entire day and retweeted the same things over and over, often from these specific groups called DM rooms, I think is the unofficial name for it. It’ll be a large DM group that sometimes you have to pay to get into.

S16: Oh, really? Yeah. Where people pay to get in. Yeah. Where people will essentially post an article, a link, or a tweet, and then everyone in the room is expected to retweet it. So it kind of creates these networks of people who are all amplifying each other’s content.
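
A rough sketch of the heuristic Paris is describing — accounts that rebroadcast a small set of sources hundreds of times a day while posting almost nothing of their own. The thresholds below simply echo the numbers she cites, and the account data is invented for illustration; this isn’t any platform’s actual detection logic.

```python
# Illustrative only: flag accounts whose daily activity fits the "DM room"
# amplification pattern described above (hundreds of retweets of tracked
# sources, few original tweets). All names and numbers are made up.

from dataclasses import dataclass

@dataclass
class DailyActivity:
    handle: str
    retweets_of_tracked_sources: int
    original_tweets: int

def looks_like_amplifier(a: DailyActivity,
                         min_retweets: int = 400,
                         max_original_ratio: float = 0.2) -> bool:
    """True if the account mostly rebroadcasts the tracked sources."""
    if a.retweets_of_tracked_sources < min_retweets:
        return False
    return a.original_tweets <= a.retweets_of_tracked_sources * max_original_ratio

accounts = [
    DailyActivity("hypothetical_power_user", 550, 45),
    DailyActivity("ordinary_user", 12, 30),
]

print([a.handle for a in accounts if looks_like_amplifier(a)])
# -> ['hypothetical_power_user']
```

The point is only that the pattern she describes is mechanically easy to spot once you frame it as a volume-and-ratio check; attributing it to actual coordination still takes the kind of manual tracking she did.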

S7: What a weird privilege, or, like, a way to get in. I’ve definitely seen various groups where you have to get screened in some way before they let you in, but never money. So that’s new.


S16: Yeah. It was interesting because — yeah, the group that I was following was around this one account, still active today, called the Bradford File, which tweets, like, insane, very ultra-right-wing content. And essentially his DM group was full of women who, like, wanted to sleep with him — essentially had a crush on him somehow. And so he’d flirt with all of these, like, very old women located in Florida, and in turn, they would retweet all his tweets.

S11: What an economy. I have not heard of that. That’s fascinating.

S17: Yeah, I’ve got to get my own DM group.

S18: I don’t know. Would you like to pay to join a DM group with me? I’m just wondering how much to charge. I’ll do, like, a dollar or less. You’re underselling the market.

S19: You know, one thing I want to bring up, though, about why right-wing content proliferates — I think Will made a good point there. Earlier this summer, I got to interview a trio of guys who run probably one of the largest pro-Trump hubs of Facebook pages.

S12: So they control, like, you know, eight pages, I think, and have millions and millions of followers total. And I had a really long interview with them. I don’t know why they agreed to talk to me; it was very hard to find them, as I know you’ve experienced, too — finding these people.

S20: And I finally got them on the phone and they’re like, yeah, okay, I’ll talk to you. They said they feel like the country’s watching two different movies — that, you know, people who like Trump get the news...

S12: They get CNN, they get The New York Times — people who don’t like Trump, I’m sorry. Yeah. They get all this, what I would consider verified, fact-checked reporting about all of the troubles that his administration has been having, which really needs to be reported on — it’s important. But that really turns people off who like Trump; they just don’t want to see bad news about someone they like. And so because of that, they’ve turned to either Fox News, which they can’t watch all day because they have a job, or Facebook. And Facebook has kind of filled the void for people that don’t want to see bad news, even if it’s true, about someone they like.


S1: Well, and the biggest publisher on Facebook is Fox News.

S6: Right. And so there you go. Yeah, it replicates.

S12: And, you know, I’ve heard that theory before also: that if you believe in something that’s not necessarily based in truth, you have to create a lot of media to kind of prove it’s true. So if you believe the earth is flat, you’re going to make a lot of YouTube videos to prove it’s flat. If you believe the earth is round, you don’t really need to make YouTube videos to prove the earth is round.

S1: Well, it’s also just more interesting, right? It’s more interesting to learn a secret theory that the world is flat than to watch a two-minute video about how the earth is round. So on these platforms that prioritize and reward engagement, there’s this sort of secret-knowledge economy that’s popped up to game that.

S16: Yeah. That’s been kind of my thinking when I’ve been studying, like, the QAnon conspiracy and a lot of these other groups: it gives people this sense of purpose and self-worth to believe in this, because then they’re the only people who truly know what’s going on. They believe that they possess some sort of special information the rest of us are missing out on. They’re the only ones that understand the way the world is really operating.

S10: And they kind of encase themselves in this bubble and just kind of dive deeper and deeper into it.

S21: So here’s a question that I’ve been trying to grapple with. We know that in the 2016 election, a lot of the misinformation was originating in Russia. We didn’t know that at the time of the election; that came out later. Now, in the 2018 midterm cycle, I haven’t seen that much reporting or evidence of Russian interference or interference from other countries. But I’m wondering, does that mean it’s not happening, or is it just that, again, we won’t know until later that it’s happening?


S22: Well, I will say that Facebook has gotten a lot faster about telling us when it has happened. So if it has happened lately, it seems like they tell us within the week. But I don’t know — Kevin, what are your thoughts on that?

S1: Yeah, I have a couple of thoughts. One is that I think there is not as much need for it. I mean, in 2016, the playbook that Russia came up with was about stoking division. And, like, we’ve sort of got that now — it’s sort of mission accomplished.

S11: Polarization exists now, and they don’t need to wedge that in.

S1: Right. Exactly. So, kind of mission accomplished on that one. And then I think we’re also seeing different forms of disinformation. A lot of it’s not happening out in the open — it might be happening over, you know, text messaging platforms. It might be happening in more closed spaces where it’s harder for us to see. We saw that in Brazil just recently with their election: there was a ton of misinformation, and most of it, it seems, was flowing through WhatsApp. And so I think that’ll be something to watch.

S7: So the foreign interference actually might be seeding the disinformation in some of the forums where they work to kind of strategically percolate it.

S16: Yeah. I think one avenue that people are not talking about enough — which I’m not sure how we’d even begin to counter — is Facebook groups, which are huge, especially in those right-wing circles. There’s no one really policing them, because the majority of the ways Facebook goes about policing fake news and things like that rely on reports or some sort of interaction from the people that see the content on Facebook. And within these groups, everyone generally wants to see this sort of content, be it fake stories about the caravan or things like that. And so there’s no one to report it, and then it percolates and ends up on a platform like Twitter or Facebook more generally.


S7: And so, you know, just to change topics, another thing I wanted to talk about while I have you guys here is, for instance, the rise of Bolsonaro and Trump — these people who we wouldn’t have seen gain so much popularity if it weren’t for the democratizing effect of social media, right? Which is something we used to celebrate so much. And now it’s kind of given a platform to people that typically wouldn’t get past the traditional gatekeepers of media — you know, our editors, for example. And so I’m curious what you guys think about that, about Facebook kind of giving people the opportunity, organically or inorganically if they’re using disinformation campaigns, to rise.

S23: Well, the way I’ve been thinking about this — because to say that Trump is purposefully engineering, or Bolsonaro is purposefully engineering, campaigns that are tailor-made to the realities of social media amplification — I don’t think they’re thinking about it in that sophisticated a way. I think they’re just really good at getting attention. And, you know, Trump has been in the business of getting attention since way before the Internet, since way before Facebook. He had a TV show; he was a regular personality in the media. Bolsonaro has also been trying to get attention and amplify his voice for a long time. And to a certain extent, that plus shamelessness gets you a long way in this media environment. And, you know, when Trump had a TV show, when he was on The Apprentice, he had an hour a week, and he could drive big ratings within that hour. But then the show ended and something else came on. What’s happening now is that online, there is no one-hour time limit. If he can keep the attention on himself, he can have it 24 hours a day, seven days a week. And so he’s programming in a way that I think is really useful for him.


S5: The way you put that makes sense to me, because, you know, we used to have a media environment where there were these big corporate gatekeepers that decided what news we would all see, and they would not give attention to somebody like Trump. Now we have an information environment where attention is the currency — like on Facebook, if it gets attention, then that’s what goes to the top of the feed. So it makes sense that attention-seekers would be thriving in that kind of environment. But one thing the Brazil election raised for me is this: I used to blame a lot of this on Facebook’s algorithm, because they’ve written the software to optimize for stuff that’s divisive, that gets engagement — stuff that appeals to your biases, that gets an emotional response out of you. But in Brazil, a lot of the problem was on WhatsApp, and there’s no equivalent algorithm on WhatsApp. And yet we still see huge problems with misinformation. So I’m wondering: was it really Facebook’s algorithm, or was there something deeper all along about, I don’t know, the Internet? Or is it specific to WhatsApp? Could there even be a platform that would not lead to greater polarization and foster misinformation and that kind of thing?

S7: Maybe it’s just sharing. I think people just like to share, like, aggro negative stuff, where they’re like, oh, here’s some bad news you might think is interesting — and maybe it’s not. Yeah, maybe it’s not just people who can game it; it’s also that people like to just share bad news.

S16: I don’t know. Yeah. I think that a lot of the issues we see within Facebook’s algorithm reflect, in a way, the issues with our society and the way we share news and consume content just generally as people. And we can see that even within these kind of closed communities: people are sharing the same sort of polarizing, often untrue things solely because they realize that that is what gets engagement within their own community, whether or not it is being amplified by some sort of algorithm.


S23: One thing I’d be really interested to see — and there probably is no way to run this sort of A/B test.

S1: But if you could see how misinformation would work in a society that had WhatsApp but that didn’t allow people to forward messages to vast numbers of people. Because a lot of what’s happening in Brazil is that a piece of misinformation will spread — you know, people have a group with 100 of their friends and neighbors, and they’ll just forward the message to that entire group, and then maybe 50 of those hundred people will forward it to other groups that they’re in. So the viral mechanics are there. Even though the feed is not algorithmic like Facebook’s, the sharing mechanism is still there. And I’d be really interested to see what happens if you just don’t have that — if you max it out, so you can forward a message to, say, three people. How does that change the ability of disinformation to spread?
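
There’s no way to run Kevin’s experiment for real, but a toy back-of-the-envelope model shows why the cap would matter: if each person who sees a message can forward it to a whole group, the expected reach compounds by a huge factor every round, while capping forwards at a handful of recipients keeps the same process small. This is only an illustrative sketch with invented group sizes and forwarding rates, not a model of how WhatsApp actually works.

```python
# Toy version of the thought experiment above: expected audience of one message
# after a few forwarding rounds, where each viewer forwards it to `cap` people
# with some probability. All numbers are invented for illustration.

def expected_reach(cap: int, forward_prob: float = 0.5, generations: int = 3) -> float:
    """Expected number of people who see the message."""
    current = float(cap)      # the original sender shares with one full group
    total = current
    for _ in range(generations):
        # each viewer forwards to `cap` new people with probability forward_prob
        current *= forward_prob * cap
        total += current
    return total

print(f"cap=100 -> ~{expected_reach(100):,.0f} people reached")
print(f"cap=3   -> ~{expected_reach(3):,.0f} people reached")
```

The only point is the direction of the effect: the cap changes the per-round growth factor from roughly forward_prob × 100 to forward_prob × 3, which is the difference between explosive and contained spread.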

S22: Hey, you just don’t give people as much power.

S21: Is it just me, or does that sound bad?

S8: No. Well, I just think broadcast is what this is about, right? It’s about the ability to broadcast, you know, and should you limit that — localize it — or do you just make it as global and vast as possible?

S22: And I don’t know. I think localization of broadcast kind of keeps it at home and keeps it understandable and makes sure that it doesn’t get as toxic. That doesn’t seem bad to me.


S21: Right. So it’s the centralized platform with the ability to take any one person’s voice and just blow it up.

S22: Yeah. And, you know, we’re just tossing around ideas now. One last thing, though, that I wanted to talk about is Facebook’s war room. Do you guys think that the platforms, particularly Facebook, have done a good job so far?

S11: I mean, clearly, with Brazil and WhatsApp, which is a very big challenge, they were not able to rein that in. And I’m not really asking you guys to assess one of the hardest questions out there, but I’m curious what you think about what they’ve done so far.

S1: I think one change is that a lot more journalists and researchers and NGOs are paying attention to misinformation. And I think that, in turn, has forced more accountability on Facebook, because more people are watching. And in a way, they’ve kind of outsourced a lot of that work. I mean, we — we being journalists — still find a lot of things that break Facebook’s rules and bring them to their attention, and then they crack down on it. But a lot of times it’s not them proactively going out and looking for it; it’s them being responsive to other people finding stuff on the platform. Which is, to be fair, a good change — they could just say, thank you for your reporting, we’re not going to do anything about it. I think they’ve been more responsive than they have been in the past.


S16: Yeah, although I’ve seen a lot in my reporting, or some of my colleagues’ reporting, that oftentimes Facebook will have seen these sorts of things that break their rules multiple times — users will have reported them — and Facebook does nothing. It’s only when a journalist or somebody with the power to call them out on a platform actually brings it to their attention that they will respond in some way, and even then, not always readily.

S21: Paris, you wrote the post for Wired about how the pipe bomber suspect had actually been reported for making death threats on Twitter by a person who was somewhat of a media figure.

S16: Yes. She was on Fox, right. And nothing happened. She had reported him to Twitter; they said nothing, did not suspend his account. And they only, like, corrected that after they found out whose account it was — right after they had identified the bombing suspect in the news.

S21: Yes. I think we can probably agree that the platforms are still pretty bad at all the things that they were bad at two years ago.

S16: They’re just responding whenever they get criticism and have to do something, or else face having to sit in front of Congress for a couple of hours and have a bunch of bad photos of them taken.

S22: To be clear, that’s not real liability, right? I mean, so much of this stems from the fact that they don’t have liability for what users post on their platforms, and because of that, it’s really allowed them to look the other way for, you know, 15 years. And I always bring it back to the Communications Decency Act. I’m not going to render a verdict on whether that’s a good thing or a bad thing, but it has fostered a culture among Internet platforms that is inactive.


S5: I wonder if our listeners play section 230 bingo at home.

S12: Fascinating. It’s something I really think about a lot. You know, I read a lot about the concept of liability. It’s very powerful. And when it’s not there, that’s very powerful, too.

S21: I think you’re right, though. But I mean, certainly they’re trying now. I think they’re trying — they’re just not doing a great job.

S11: They’re still overwhelmed. They don’t want to be regulated, they don’t want to be forced to do it, so they’re trying to do it. Yeah.

S21: And so the question to me now is: are these problems even solvable? I mean, they admit now that it’s an arms race and they’re never going to fix misinformation. But I just wonder if it’s ever really going to get better, or if this is where we’re stuck.

S11: Does it have to be an arms race? Like, will there be a future where one of us will say, oh, remember how messed up 2016 and 2018 and 2020 were? You know, will we ever be able to say that’s not how it’s going to be anymore? It seems like, short of major structural changes — which might involve some scaffolding of policy that forces them to do something —

S20: I’m worried that this is just going to be a fixture of our democracy. And I don’t know if social media, the way it exists now and electoral politics can really coexist in a healthy way.

S21: I’m not sure. There is one small bright spot, which is that I covered an academic study that found there actually has been less false news and misinformation on Facebook in the past year. Compared to Twitter, which hasn’t focused on stamping out misinformation, Facebook has at least moved the needle somewhat. But I wanted to ask the last question: Kevin and Paris, do you think things are getting better or are they getting worse? That’s a great question.


S24: I don’t know.

S25: I think things are at a steady state of bad, and occasionally the needle moves toward OK. I don’t know — you were just saying that Twitter introduced a feature this week for people to be able to report fake news or misinformation, which is insane, that it has taken until November for Twitter to add that. I mean, there are small victories to be had, but they’re much too late.

S26: Yeah. Think of it like if you had a boat that had a huge hole in the hull — you know, that’s a problem, right? And, like, adding more buckets is a good thing.

S18: Yeah. Having more buckets would be good.

S17: But, like, you’ve still got the freakin’ hole in your boat. And at some point you’ve got to patch that. And I think we’re at the adding-more-buckets phase right now.

S11: Yeah. I guess I’ll just say I hope that there is a day in the future — maybe it’s speculative sci-fi — where we look back and say, man, that was a mess, I’m so glad it’s not like that now. I don’t see that day coming, but I hope so.

S17: I hope we look back at now the way you look at TV ads from, like, the 1950s, where everyone’s smoking and, you know, it’s not the healthiest thing. I hope we look back at what’s happening now in the same way that we look back at those ads.

S2: All right. We’re gonna take a quick break. When we come back, we’ll have our interview with former White House deputy chief technology officer Ed Felten.


S6: Just a quick note, we recorded our interview with Ed Felten on Tuesday, October 30th.

S27: Our guest today is Ed Felten, professor of computer science and public affairs at Princeton and the director of the Center for Information Technology Policy. Felten formerly served as the deputy chief technology officer at the White House under President Obama. His research has largely centered on issues of government transparency and cybersecurity, with a special focus on voting and election security.

S15: And he is quite well known for having hacked into many, many voting machines over the years in order to reveal how dangerously porous the technologies that we rely on to tally our votes really are.

S6: And he’s testified to Congress on the issue, too. Professor Felten is also a member of the Privacy and Civil Liberties Oversight Board, which works to ensure that efforts by the executive branch to protect the nation from terrorism are balanced with the need to protect privacy and civil liberties. Professor Felten, thank you so much for joining us. Thanks.

S4: So you first started hacking into voting machines in the late ’90s at Princeton, if I’m correct there. And I’m curious: what were those machines, and what flaws did you see then? And I guess I want to know — that was 20 years ago or so — whether we’re still seeing the same problems today.

S28: We see a lot of the same problems today that we’ve seen in the past, mostly because the machines have not been upgraded in many places. What we found back then was really two things. First, that there were fundamental vulnerabilities because of the use of paperless computer systems in voting — that’s a risky thing to do in itself. And then, on top of that, the systems that were actually out there in the field were not very well secured. In some places in the U.S. there are new machines in use that are more secure, but in a lot of places, including my own home state of New Jersey, we’re still using the same old equipment we have been for a long time.


S6: I mean, you saw that some of these voting machines were actually for sale on eBay back then, right? Is that still the case?

S28: It still is. Yeah. When a state or county switches machines or takes some out of service, they typically will sell them for surplus. And so you can buy them on eBay and other places, and that’s how we got a lot of the early machines that we studied.

S15: And so I remember reading back in 2008, one of the voting machine manufacturers actually threatened to take legal action against you for studying and testing the security of these machines. Has your research led to a hardening of these voting machine technologies?

S28: I think the long-term impact of the research that my team and others have done has been more to get states and counties to switch to more secure systems. But that happens very slowly. We still have something like 30 percent of U.S. voters voting on systems that are suspect by design.

S29: So before we get into the problems with the current machines, I want to ask what’s maybe a really basic question, but what does it look like to hack a voting machine? I mean, is it a person standing there at the ballot box in front of the machine and doing stuff to it? Is it that they’re tapping in somehow remotely? Well, you know, when you hack them, what does it look like and what might it look like if this were to actually happen in an election?

S28: When we study a machine, we first kind of take it apart in our lab to understand everything about it, and then we try to figure out how someone might be able to modify the machine or the results. And that typically involves just changing the software on the machine — literally just installing a software upgrade or update that wasn’t authorized by the manufacturer and that causes the machine to do something else.


S30: And so usually it involves either having hands on the machine physically somewhere — it might be in the warehouse where the machine is kept — or, if the machine has some kind of networking or wireless capability, breaking into it that way.

S15: Have we seen instances of hacked voting machines? I know — and we’ll get into this shortly — that there have been problems with the technology having bugs or not working right. But have we seen instances of hacking?

S30: We don’t have confirmed cases in the U.S. of hacking that affected elections. You know, as you said, we’ve seen quite a few examples of errors or things that shouldn’t have happened happening, but we haven’t seen those sorts of attacks.

S28: But then again, part of the problem is that it would be hard to tell because the vulnerable machines don’t keep the kind of records you would need to keep in order to be sure that there wasn’t a problem.

S29: Yeah, I was going to ask: is it just that we don’t know and it probably has happened, or are there actual barriers that have prevented this from happening? I mean, if it hasn’t happened, what’s the obstacle that has kept it from happening, do you think?

S30: I think the factor that has kept it from happening is that the people who have the capability of doing it have not chosen to manipulate an election. We knew in 2016 — we’ve known before — that there are people who have the capability to mess with voting machines, but they just haven’t so far. And, you know, we can count ourselves lucky. But we shouldn’t stay in this position where we have to rely on the bad guys choosing not to act.


S15: Yeah, that’s quite unsettling. We know that earlier this month there were charges in Texas that votes intended to go to Beto O’Rourke instead went to Ted Cruz — that the voting machines, which are the eSlate machines made by Hart InterCivic, had switched the votes. And I also remember reading with this story that those voting machines were running on something like 2007 software. Is this something that voters should really worry about? I mean, that is such ancient software.

S28: There are a lot of electronic voting machines that run old software. That’s true in Texas, that’s true in Georgia, it’s true in New Jersey and a bunch of other places. Typically, these machines don’t have their software updated very often, and that has something to do with cost and maintenance issues, and also with the fact that software updates in some cases need to be certified through a slow and expensive process, which pushes people away from actually doing that. So, all the more reason not to have to rely on the software being correct.

S29: What was the issue in Texas? I couldn’t get full clarity on that. Do you have a good understanding of the vote-flipping, or vote-switching, bug?

S30: As I understand it, it’s a sort of usability problem, a user-interface problem. This particular voting machine has a strange interface where there’s a sort of wheel that the voter can turn and then a button to press to record their choice. And apparently, if users go faster than the machine anticipates, you can get unexpected results. And this kind of points to another issue that folks have had with electronic voting machines, which is that there are often usability problems that cause more voters to leave the voting booth not having cast the vote they thought they did than we really want.


S29: And that’s the argument, of course, for the paper trail, right?

S30: A paper trail helps. Really, for electronic voting, a paper trail is the most important safeguard, because it creates another record of the vote, which the voters saw. And the thing about paper is that it’s less surprising in how it behaves than computers can be. You kind of know that if you take a pencil or pen and make a mark on a piece of paper and put that paper in a box, and then you come back later and look at the paper again, it will still have the same marks on it. That’s not necessarily the case with a computer, right? If a computer records some information and then you come back later, it might have changed. That’s just the nature of how computers work. So a paper trail is the most important safeguard we need against all of these sorts of problems, whether it be malice or error or usability. A paper trail helps with all of those.

S29: I saw that Delaware just recently approved new voting machines, and they do have a paper trail. But should we be thinking about going all the way back to just pure paper? I mean, the whole push toward voting machines really gained momentum after Bush v. Gore, with the hanging chads in Florida. Paper obviously has its own problems. What’s the optimal solution, do you think, at this point?

S30: From my standpoint, I think the best system is one that keeps both paper and electronic records. You have a paper record which the voters saw and verified, and you also have an electronic record. And the benefit of having both is that each one has its pros and cons from the standpoint of reliability or security. But if you keep them both and then check them for consistency against each other, then you’re in the best position to detect a problem if there is one.


S31: And so a good example of a system like that is an optical-scan system, where the voter marks a paper ballot and then feeds it into a scanner in the polling place, and the scanner keeps an electronic record. So best practice number one in the polling place is to have a voter-verified paper record along with an electronic record, and then best practice number two is to actually compare them with a statistical audit after the election.
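
Felten’s second best practice — auditing the paper record against the electronic tally after the election — can be pictured with a toy consistency check like the one below. Real post-election audits (for example, risk-limiting audits) use more careful statistics; the ballot data, candidate names, sample size, and flag threshold here are invented purely to illustrate the idea of comparing the two records.

```python
# Toy illustration of comparing a voter-verified paper record against the
# machine tally. A real audit uses rigorous statistics (e.g. a risk-limiting
# audit); everything below is invented for illustration.

import random
from collections import Counter

def audit_sample(paper_ballots, electronic_tally, sample_size=200, seed=0):
    """Compare vote shares in a random paper-ballot sample to the reported tally."""
    random.seed(seed)
    sample = Counter(random.sample(paper_ballots, sample_size))
    total_reported = sum(electronic_tally.values())
    for candidate, votes in electronic_tally.items():
        reported_pct = 100 * votes / total_reported
        sampled_pct = 100 * sample[candidate] / sample_size
        flag = "  <-- investigate" if abs(reported_pct - sampled_pct) > 5 else ""
        print(f"{candidate}: reported {reported_pct:.1f}%, paper sample {sampled_pct:.1f}%{flag}")

# Hypothetical race: if the machine count is honest, the two records agree.
paper = ["A"] * 5300 + ["B"] * 4700
reported = {"A": 5300, "B": 4700}
audit_sample(paper, reported)
```

A large gap between the reported and sampled percentages would be the trigger to escalate to a full hand count of the paper ballots — the "something to fall back on" Felten describes.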

S32: So I’m curious, are there federal standards that voting machine companies have to adhere to in any way?

S15: You know, because it seems like something they should have worked out already — that machines shouldn’t be switching votes, and that these usability issues — you can almost think of them as usability bugs — because if the system can’t handle all of the information coming in at once, then it’s not working right.

S32: I’m curious, are there any standards that these companies have to adhere to across the country so that their machines can be used in elections?

S28: There are federal standards, and most of the states have voluntarily adopted the federal standards. But those standards are old, and they’re not very comprehensive. And some of the machines may have been certified against the standard that existed when the machine was new, so those could be standards that are quite old and might not have much of anything about security or usability in them. Back in the day, the standards were really written with the old-fashioned big metal lever machines in mind, and the federal government — the whole policy process — is still kind of catching up in terms of standards.


S4: And so you worked at the White House under the Obama administration. I’m curious, why wasn’t there more progress on this issue then, and when will we see progress on this issue?

S15: I know it was only in January 2017 that election systems were designated as critical infrastructure, like the electrical grid, so that they would get kind of federal protections.

S30: Sure. One of the core challenges here is that elections are really run by the states and counties, as opposed to being run or managed in a centralized way. The federal government can set standards, but at the end of the day, it’s your county clerk, probably, who is the most important person for the operation of voting in the place where you vote. And because it’s so decentralized, and because these things are run by officials who often don’t have a lot of technology expertise available to them, it’s very difficult to get coordinated action across the whole country. And so what we’ve seen over the past, say, 15 years, as the security of voting machines has come into focus as an issue, is slow progress as more and more states and counties adopt more secure practices. But it’s probably going to be quite a while before we’re all the way there. There have been efforts to pass federal legislation in this space — there’s a bill called the Secure Elections Act, which is now pending — but things tend to move slowly.

S15: The voting machine industry, which I’m reading is something like a $300-million-a-year industry — and according to some fantastic reporting from Kim Zetter in The New York Times Magazine, there’s this revolving door between voting machine vendors and election officials. And I’m curious if one of the reasons why we’re not seeing updates on the local level is that there may be a corruption issue.


S30: I don’t know if there is clear corruption, but there is a sort of tight community of people who are involved in election administration, whether on the vendor side or the election official side. And I think the concerns about the cybersecurity of elections have been pretty slow to percolate into that community. And, you know, this is the thing: this is not unique to the voting machine space. You see a lot of different industries and sectors that are slow to catch on to how serious the security problems they face could be, and often it takes someone in a sector getting burned before the sector really wakes up and starts to take cybersecurity more seriously. And we certainly don’t want to be in a situation where someone in the voting or election space has to get burned before we take this more seriously.

S29: I know one thing that election security experts have been concerned about for a long time is that the software in these systems is proprietary. So you have these different private companies making the voting machines, building the software, and when researchers say, hey, can we see your software to make sure it’s safe, make sure it doesn’t have bugs in it, they say, no, you can’t see it. Is that still a problem today, as far as you know? Has there been any progress in getting them to open that up or move toward a more open-source approach?

S30: There have been some efforts to make open-source voting software, but the major vendors are still operating in a closed-source way. And this really comes down to the contracts that states and counties sign when they buy systems, because the degree of freedom that they have to inspect, reverse-engineer, or analyze the systems depends on what’s in the contracts. Sometimes there are terms in there that say thou shalt not examine or do security analysis on a system, and that’s obviously, in my view, not something that a public official should be signing for a technology like this. There are other situations where officials insist on having more ability to inspect, and many of the most useful studies of voting machine security have come about because of officials who put their foot down and insisted on more freedom to have machines tested.


S29: Yeah, it seems like maybe one dimension of this is a problem with technological literacy on the part of the representatives at the state and local levels who maybe don’t have the information needed to evaluate these systems as they’re making these decisions on behalf of the public.

S30: It’s true. There’s not a great deal of information that officials have about how the machines work or about their security. And certainly a lot of decisions have been made in the past that officials might regret now. But budgets being tight, it’s not easy to admit error and spend another pile of money on new systems. The good news in this area is that I think it’s now pretty clear that the goal should not be to have systems that need to be bulletproof in terms of their security. The goal instead should be to have an overall system that is resilient, so that if something goes wrong with the software — if it behaves strangely — you have something to fall back on: you have a paper ballot, you have an audit and recount capability, so that whatever goes wrong, you’ll be able to recover. And at the end of the process, voters will be able to have confidence that you got the result right in the end.

S32: So I’m curious, what’s your biggest concern for the 2018 election? I mean, I feel like I hear this same conversation every two years, every time there’s an election, where we’re talking about the security of these voting machines.

S15: Again, what are you worried about this time around, for next week?


S30: Well, it’s the same worry that we’ve had in past election cycles, unfortunately. It’s partly what happens if somebody tries to manipulate the systems and change the result of the election. But as in 2016, there’s probably greater concern about the possibility that someone will try to undermine confidence in the election, to try to undermine the legitimacy of the process by casting doubt on the result. And that could mean just trying to cause chaos in some way and then trying to spread rumors about misbehavior or spread conspiracy theories. The worst outcome — the thing that I feared in 2016 and the thing that is the biggest concern in this cycle — is that at the end of election day, we genuinely won’t know who the voters wanted to put in charge, because we don’t really have a roadmap for dealing with that kind of uncertainty. The whole point of an election — or the way we should think about election processes and security — is that the goal is to produce convincing evidence as to what the voters wanted to do. And if we’re in a situation where we don’t have convincing evidence pointing in either direction, and yet it’s the end of election day and there really are no do-overs in American elections, then we’re in a difficult situation. And I think that’s the thing that I would worry about the most.

S15: Right — and we’ve seen similar situations turn into quite chaotic ones. Professor Felten, thank you so much for joining us. Thanks.

S5: All right. We’re going to take one last quick break.

S3: When we come back, we’ve managed to cajole our roundtable panelists Kevin Roose and Paris Martineau into sticking around for a special edition of Don’t Close My Tabs.


S19: So it’s time again for Don’t Close My Tabs — only this time we’re not going to talk about what we typically talk about.

S9: We’re going to talk about how to watch, or not watch, the election results as they come in on the Internet. It’s a big question, I think. I kind of want to start with Kevin, because you’re from The New York Times and you guys have that infamous ticking election needle.

S33: Yeah. Oh, yeah. What is this?

S9: Yeah, and it’s, you know, super reliable. Is it not? No, I’ll have it open as a tab.

S1: I believe the word is triggering.

S33: Triggering. OK. That’s because, of course, of 2016. Yeah. Reliable it was not.

S21: It started out showing, oh, Hillary Clinton is going to cakewalk to victory, and then over the course of the night — well, you know what?

S14: Then she started moonwalking away from that.

S17: I saw someone in a Halloween costume the other day that was just the needle. The scariest costume of all.

S7: Yeah, it’s scary.

S21: So are you going to watch the needle and show solidarity with your New York Times colleagues?

S17: You know, I’m going to be working on election night, so I hope I have time to see anything at all. But, yeah, I’m pro-needle — I’ll go out on a limb. I think the needle is widely misunderstood. But, you know, we have an amazing team at the Upshot, and they’ll be doing live returns, and I’m sure they’ll be great. So I’m going to stick with the home team here.


S7: Also, this crew here probably knows better than I do the worst places to look for information on the election. What are you going to be looking at?

S34: My 27 TweetDeck columns — which, to be clear, are a terrible choice, and no one else should do it. Yeah, I shouldn’t do it. I’m going to try to dissuade myself. But I know in my heart that I’m going to stare at them and see a bunch of stuff that is probably untrue, probably not yet confirmed, probably too much information for the moment — but I’m going to do it anyway.

S5: Wait, I’m really curious now. Give me a couple of examples of your 27 TweetDeck columns.

S18: I have, like, so many different lists. My favorite is a column you wouldn’t think Twitter would still have, but I guess I’ve never given it up yet.

S25: My favorite TweetDeck column is the activity column, which just shows you, for everyone you follow or have added to it, everything they like or interact with on Twitter, because that’s good. But then I have TweetDeck columns for a million different lists — like Nazis and other people who I think are interesting to look into. I have a Wired co-workers column. I also have a million different keywords or phrases or particular links at any given time that I’m watching to see if they pop up. It’s definitely very stressful, I think.

S34: Our editor in chief, Nick Thompson, came over to my desk the other day — I have a large screen with just TweetDeck on it; it’s the only reason why I have a second screen — and he went, whoa, TweetDeck, and then walked away. But don’t do it. It’s terrible. It’s wasted my entire life.


S26: Take a walk. You know, go vote, yes, then take a long walk in the sunshine, and don’t do it.

S18: Yeah. Well, the results will come in at some point. Go hug your dog. Yeah. Practice self-care. Yeah. Go be outside in nature.

S8: Politico has really good state-by-state coverage. I’m interested in, like, governors and judges — I’m such a dork — and state ballot initiatives and stuff, because I like to watch the pit of extreme democracy: all the states that are voting on weird ballot measures. Politico’s really good for that, if you like a niche election or you think it’s funny or entertaining. I’m going to be looking, though, for misinformation. So, you know, I’ll obviously be scouring the social media world — I’ll be looking for that. And if you see anything you think is misleading, you could call anyone here at this table.

S18: Darling, call me first. Yeah.

S20: Call whoever you like the most. And, yeah — Will, what are you going to be watching?

S21: Well, one thing that I’m going to miss is something Slate used to have — here I’m being the homer as well — but Slate had this thing called the Instant Spin Room, where it had one column of conservatives on Twitter and one of liberals — like, fairly reasonable, intelligent people on both sides. And you could look at that, and it was such a different experience having a balanced feed between conservatives and liberals, because my feed is heavily, heavily liberal. And I always found that really fascinating, because the conversations are just totally different. We won’t have that this year. So this is a totally unhelpful recommendation: you could make two more TweetDeck columns, and then you’ll have 29 TweetDeck columns. But one other thing I’m going to be watching is Apple News. I’m interested in Apple News because I’m just so tired of covering the depressing morass that is Facebook and Twitter, and Apple is doing things in a different way, where they’re employing human editors with actual journalism experience to decide what the top stories should be. And so I’m sort of following that and rooting for it in some ways, although I’m not sure that a journalism world ruled by Cupertino in the long run is any better than what we have. But in the short run, it’s encouraging, because it’s not full of fake news, and they’re pulling interesting stuff from real sources that do real journalism.

S8: A question about that. Also, one thing I know I do — because I am in my early 30s and therefore I use Google for, like, spelling, as well as to check scores on things or whatever —

S4: So I’m probably just gonna be, like, Googling “election results.” And Google will then pull in, at the top of its search results, something from somewhere, and I don’t know where they’re gonna be getting that from. Do you know where Google’s going to be sourcing its results?

S33: Something we should all probably figure out. I know.

S8: Yeah. And so that’s gonna be interesting to watch, too, ’cause I know a lot of people are like me and just, like, Google for stupid answers.

S33: And I’m going to get, like, five fast facts about...

S16: I think for some elections they pull directly from the ballot results from the specific states or counties, because there was just a security exploit found in one of the counties that allowed Google to pull it — I’m talking about one county, though.

S7: Right. But none of us has mentioned FiveThirtyEight — yeah, I’ll probably have one of those tabs open, too. Well, thank you all so much for joining us. This was a really nice roundtable, and it was great to talk to you both.

S33: Thank you. The table is round — well, it’s an oval table. I don’t know anything. OK. Thanks.

S2: All right. And that’s our show. You can get updates about what’s coming up next by following us on Twitter at @ifthenpod.

S3: You can also email us at ifthen@slate.com. Send us your tech questions, show and guest suggestions, or just say hi. You can follow me and Will on Twitter as well.

S2: So much Twitter. I’m @aprilaser and Will is @WillOremus.

S3: And thanks again to our guests. Kevin Roose — you can find him on Twitter at Kevin Roose; that’s K-E-V-I-N R-O-O-S-E — and Paris Martineau, who is on Twitter at, you guessed it, Paris Martineau. That’s P-A-R-I-S M-A-R-T-I-N-E-A-U.

S35: And thanks to everyone who’s left us a comment or review on Apple Podcasts or whatever platform you use to listen. We appreciate you doing so. And if you haven’t done so, go ahead and try it. It’s a very nice thing to do, and we would appreciate it if you did.

S3: If Then is a production of Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. Our producer is Max Jacobs.

S35: We’ll see you all next week. And more than anything: if you voted, great. If you didn’t vote, I’m sure you have a good reason not to. And good luck out there.