S1: This ad free podcast is part of your slate plus membership.

S2: Hey, listeners, a quick note before our show. So, of course, just hours after we left the studio on Tuesday afternoon, The New York Times broke a front-page story about yet another Facebook breach of trust. This time, it turns out that Facebook was granting other large tech companies special access to users' private data, including lists of all their friends and, in some cases, even their private messages. This was done as part of partnerships that integrated Facebook into those companies' products. For instance, a tool that lets Spotify users message their Facebook friends from within Spotify. This week's episode is actually our last news show before the holidays, although stay tuned, because we have a couple of great holiday episodes coming up with some of our favorite interviews from the year and questions from listeners. But you can be sure we will address this in the new year as part of our ongoing coverage of the social media reckoning we've all been a part of.

S3: Welcome to If Then, the show about how technology is changing our lives and our future. I'm Will Oremus, and I'm April Glaser.

S4: Hey, everyone, welcome to If Then. We're coming to you from Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. We're recording this on the afternoon of Tuesday, December 18th.

S5: On today's show, we'll talk about how Taylor Swift used facial recognition to surveil the crowd at a recent concert, and whether that's smart, scary, or both.

S4: We'll also welcome to the show Renée DiResta. She's an expert on cybersecurity and online misinformation, and she's the lead author of a new report to the Senate Intelligence Committee on exactly how Russian operatives weaponized social media in the 2016 election, the scope of those operations, and why it may be just the beginning of a new era of what she calls global information warfare. That's all coming up on today's If Then.

S6: All right. We want to talk about a story that first made headlines last week, but it feels like a taste of a future that's worth talking about, maybe before it becomes the present. Rolling Stone reported that the pop star Taylor Swift used facial recognition to check for stalkers at a concert she gave at the Rose Bowl in May.

S8: Yes, this came out in an article in Rolling Stone late last week: facial recognition was used during a Rose Bowl performance, and this kind of stirred a lot of drama, because people didn't expect to be surveilled, I suppose, at a Taylor Swift show. I thought it was interesting that people were deeply concerned about this, actually.

S6: Yeah, totally. When I first saw the headline, I thought that they had, like, cameras sweeping over the crowd at the concert, but that actually wasn't quite how it worked. They had a kiosk set up where they were showing scenes from her rehearsals. People would walk by the kiosk as they came into the Rose Bowl and look at the screen. And while they were looking at the screen, there was, unbeknownst to them, a camera looking back at them, taking pictures of their faces, sending them to a command post 2,000 miles away in Nashville, and then running them against a database to see if any of them were one of Taylor Swift's known stalkers. She's had a couple of people arrested in the past for stalking her, so it's understandable that she would want to know if one of her stalkers is in the crowd. But it's also, I think, understandable that people were a little bit nonplussed to have been subjected to this technology without their knowledge or consent at a show that they went to.

S9: I want people, though, to know really clearly that this technology is used without their knowledge and without their consent all the time. And I agree that it should be limited, and that this isn't something that should happen without people's knowledge or consent. But it's by no means the first instance of facial recognition technology being deployed at a stadium, and increasingly we're seeing experimentation with this type of technology, whether it's cameras set up at a concert or a basketball game to read people's facial expressions to better advertise to them. I've read things about this and written things about this. It's certainly something that exists, it's an idea that exists, and if it's technology that exists, it's technology that is probably going to be used in some way. That said, it's disturbing, and I'm glad that this kind of ruffled feathers for folks, because we should not have a frictionless relationship with such penetrative, ubiquitous surveillance, especially considering where it's going. We don't know. Nashville? OK, that's where I'm from. Like, what does that even mean? Who's holding that? What are they doing with that information? Is it something that can be subpoenaed by the police? We don't know.

S6: Right. And, you know, there are states that have laws about how you can't record someone without their permission. But according to our colleague Aaron Mak's post in Slate about this, there are very few states in the United States that have any laws about biometric data, particularly when it comes to private businesses like a concert venue. Jay Stanley from the ACLU said it's kind of a Wild West out there right now: as long as it's private property, they can take your image and do whatever analytics they want with it, including facial recognition. It just seems like it has to change. And it's not just activists who are calling for this. We talked to the CEO of a facial recognition company who says that it needs to be regulated. We have Microsoft out there now saying that facial recognition should be regulated and explicitly calling for Congress to act. But still, nothing from Capitol Hill so far.

S9: Yeah, there's not, like, a bill that I'm aware of sitting on the shelf that Congress members are prepared to rally behind. But there are laws in states that regulate this specifically, particularly in Illinois and Texas. There are facial recognition laws, although companies like Facebook have used NRA-like tactics over the past few years to really try to defeat those laws and bring them down. So, you know, it's something that I don't expect to actually get any better. I expect this technology to become even more ubiquitous and even more widespread and even more frictionless moving forward. And I hope that we continue to interrogate it when we see instances like this that we don't expect, or else it's just going to happen without any interrogation. And that is certainly not a healthy relationship to our technologies.

S6: All right. We’re going to take a quick break.

S3: When we come back, we'll have our interview with Renée DiResta, the author of a new report to the Senate Intelligence Committee about Russian misinformation operations in the 2016 election, and the coming information war.

S6: Our guest today is Renée DiResta. She is the director of research at New Knowledge, head of policy at the nonprofit Data for Democracy, and a Mozilla fellow in media, misinformation, and trust. She regularly writes and speaks about the role that tech platforms and curatorial algorithms play in the proliferation of disinformation and conspiracy theories. DiResta is the lead author of a new report to the Senate Intelligence Committee on exactly how Russian operatives weaponized social media in the 2016 election, the scope of those operations, and why it may be just the beginning of a new era of what she calls global information warfare. Renée DiResta, welcome to If Then. Thanks for having me.

S10: OK, so we'll start with some of the key findings from your report, or the report that you helped lead with a lot of other researchers. One of those is that social media companies underreported, or weren't quite forthright with, the information that they slowly leaked in dribs and drabs to researchers since 2016. And the other is that platforms owned by the large companies, namely Instagram, owned by Facebook, and YouTube, owned by Google, played a much larger role than has really been made out in congressional hearings, in investigations, and in the media.

S8: It seems like these two things may be interconnected in some way, in that people weren't getting the information. Not people, but rather investigators, researchers, Congress, the press weren't getting information from these companies as much as they should have. And then we also really weren't looking into the role of some of their subsidiaries.

S11: I can probably put some of that in context, if you'll let me go back a little bit. So following the 2016 election, there was the beginning of an understanding that perhaps the Russian operation on Twitter, which was sort of known about, had actually extended to other social platforms as well. And so there were outside researchers, myself, a collection of other people. Jonathan Albright was one of the leading figures in this; he was one of the members of my team. We were kind of scouring the Internet looking for evidence of this. At the same time, there were some amazing journalists who were doing the same thing. One of them, I wish I could remember which paper it was, I think maybe The Daily Beast, wrote an article about how what appeared to be Russian trolls were running the largest Texas secessionist page on Facebook. And I think shortly thereafter there was another investigation into BlackMattersUS. And so in dribs and drabs, and then, funny enough, in the Russian press as well, there started to be this emergence of leaks from people who had actually worked for the troll factory, saying, yeah, we totally took on the American election, and here's how we did it. And so those of us who were trying to find these breadcrumbs to get a sense of the scope of the operation... I was also at the time beginning to communicate with Senator Warner and some of the other senators about what became increasingly obviously a structural problem with social networks. So I met up with Tristan Harris, who, of course, talks about the impact of social networks on people, on individuals and their experience; and Guillaume Chaslot, one of the people who created the YouTube recommendation engine; and Sandy Parakilas, who'd been at Facebook and kind of came out as a whistleblower saying, you know, they didn't necessarily always do the best job with their data. So there were these two parallel threads.
One was disinformation research. The other was sort of social network accountability for what increasingly seemed like misinformation, polarization, radicalization. And those threads were, you know, we were really kind of pulling on both of them in 2017. And as the evidence of Russian interference became more and more obvious, we began to say, hey, maybe we should have some hearings on this.

S10: Right. One thing that really stood out in your report was that we're not just talking about memes and, you know, activist groups and DMs here.

S8: We're talking about very, very detailed operations designed to really come off like faux activist or advocacy groups. I mean, you talked in the report about how there were e-commerce sites selling things like sex toys, the recruitment of Americans to work with Russian agents through job listings, free self-defense classes, the solicitation of photos for a calendar of women, the offering of counseling to followers who were struggling with porn addiction on a page called Army of Jesus.

S1: I'm curious why they went through so much work to appear authentic here.

S8: Was this level of posturing necessary for the Russian operation? Because it’s just fascinating how detailed it was.

S11: I think it's actually the detail that makes it popular. So it's, you know, maybe a chicken-and-egg problem. We don't have a ton of insight into how many people converted to follow the pages and when. But if you think about your experience on social media, you engage with what is effectively a media brand, and that's what a lot of these pages were really doing: masquerading as media brands. They had a website, they had merch, they had podcasts. They had all of the things that you would think of when you think, oh, I like this media property, it's independent media. They were very adamant about that. 'We didn't trust the media, so we became it' was the tagline for BlackMatters, even though it's kind of grammatically wonky. They had a huge banner announcing the launch. They had banners announcing 'we had 100,000 subscribers on our site.' You know, it was really growing an ecosystem. And in order to do that, authenticity really wins the day. People follow brands because they feel an emotional resonance, a connection. That level of detail, the way you communicate that stuff, does matter. And so if you think about it as a social media marketing agency building up small brands, that's effectively what was happening here.

S6: And so some of the revelations in these reports have sparked a backlash from the NAACP, which is actually calling for a boycott of Facebook on the grounds that a lot of this recent Russian disinformation targeted the black community in particular. Can you talk about why that might have been and what effect it may have had?

S11: It's important to understand that the topics they picked were based on real, pre-existing social divisions. So they didn't create rifts; they exploited them. And there is no way to deny the deep struggles and challenges that America has had with race for decades, including as this operation was taking place, from 2014 or so till now: things like the Black Lives Matter movement trying to achieve change, the sort of social debates that has led to, the uncomfortable feelings that has brought up. So as this happens, they are able to latch on to that and to increase the feelings of alienation and grievance among people who have legitimate feelings of alienation and grievance, to kind of double down on it. If you look at the far-right content, you see a lot of pre-existing rage, and they really work to amplify that rage. With the black community, it's themes of alienation, really deeply leaning into the alienation: this country isn't for us. Right. And so they take things that are already there, and they work with what they have, because there's a saying that the best propaganda is, you know, 90 percent true. And that's because it has to be; if it's easily discredited, people dismiss it. So it has to feel real, it has to feel resonant. I think the reason that they went so hard for the black community with the suppression narratives is that, candidly, the black community is a powerhouse when it comes to voter turnout. They turn out, they vote, and they have historically voted very strongly in alignment with the Democratic candidate. So it's almost a recognition of the power of the community, and a recognition of the deep underlying rifts in our society, that is the reason why they would lean so hard into targeting the black community.

S6: And so one of the questions that has come up in the wake of your report is, well, did this swing the election? I mean, that's a question that people keep coming back to. But I think you're kind of tired of that question, right?

S11: You know, I don't have an answer to that question. Nothing in the data set that I was provided would give me the answer to that question. So I think there's a couple of things here. First, as far as whether the only thing worth investigating is whether or not Russia flipped an election, I would say the answer is no. I mean, I was talking to somebody in law enforcement, and the way he put it was: attempted murder is still a crime. You still go and investigate it even if they didn't get it done. There was an assault on American democracy. There was a foreign adversary who spent three years manipulating and targeting American individuals, pretending to be American, interfering in social conversations, political conversations. That is something that we need to understand. We need to understand how they did it, if only to detect it faster in the future, because there's no indication that they're going to go away. As far as they're concerned, it was successful; they re-upped their budget. So I wouldn't say frustration, because I understand that everybody would like an answer to that question. When we think about impact, I think it's also important to look at: did this change attitudes within the community? Did it shift the Overton window? Was the propaganda effective? It continues to be propagated in the communities that it targeted. And I think that that's in part naivete. Like, who goes and looks at a meme and thinks, is this Russian propaganda? I do, all the time now, but most people don't. And so this content is still out there, because there is a kernel of truth in a lot of it. And that's what makes it such a challenging conversation.

S10: Yeah. And it’s also like effective propaganda.

S9: It's not something where you really see the whites of its eyes, you know. It's hard to know if something is effective when you put a piece of information out there; it's broadcasting. Right.

S10: So, you know, one question I have is about calling this kind of thing information warfare. It's a phrase that I see batted around a lot in these conversations.

S12: One hesitancy I have with it is that by calling it, you know, war, putting it in that militaristic context, it could open the door to increased surveillance of social media platforms that could potentially affect communities that are already over-surveilled, particularly by police that are closely watching the communications activities of black American communities.

S8: And so what are your thoughts on this kind of framing of information warfare? Is it useful? Are there pitfalls?

S11: You know, I wrote that essay on the digital Maginot Line, and I used the metaphor of war in there. And it took literally six months for me to feel confident releasing that piece, in part because I really worried about the terminology and the war metaphor. What I would say to that is that they think of it as a war. And that is where, if you read the indictments now, or if you go and read the Project Lakhta documents, the way that they describe what they're doing, this is not 'oh, we're just going to mess around with some Americans.' They have real strategic objectives. This is a toolkit that they have, and this is the framing that they use. And so I got to thinking, as I read more of this, or even if you look at kind of domestic trolling groups that use the phrase 'meme war,' right, the great meme wars of 2016. So there's this sense among people who use these tactics, and believe in the power of the outcomes that they can effect, and they are using terms like war. And then the rest of us are kind of over here talking about, like, well, some shitposters on the Internet. And there's kind of a real divide there in how we're thinking about it. You know, we're treating it as, oh, this is just a problem of governance; we just have to do a better job detecting this stuff earlier, as opposed to thinking about ways to deter it, which is the framework that you would use if you were thinking about it more in militaristic terms. So I absolutely understand the reservations, and I feel them acutely myself. At the same time, I don't think that we're well served by pretending that these are just sort of disparate attacks that happen to look the same way, when they really do have, in many countries, the goal of regime change.

S6: Yeah, I also had a few reservations about the use of 'war,' although I can understand it. I mean, one thing that happens in wartime is that you might suspend normal laws or normal civil liberties, that sort of thing. I would worry if that's one of the implications of it. But I see your broader point that this is a long-term thing, that many states are involved in it, that it's just going to keep growing. It's not a one-off, and it's something we have to be prepared for. I wanted to talk a little bit about your other experience studying misinformation and virality and network effects on social media, and what it is about the social networks that made them so vulnerable to this. I'm curious whether you think they were the victims of this or whether they were culpable. What about the social networks enabled this Russian campaign to be so effective?

S11: So I think that there's a structure. You know, our information ecosystem evolved in a certain way, and you can trace back how the platforms grew and acquired other companies, and we really amassed an information ecosystem that's largely controlled by five kind of big entities. I think the interesting challenge of that is that it does kind of create these ready-made, ordered audiences for propagandists. Simultaneously, the platforms know quite a lot about the users on them, because they're serving them ads. And so they are gathering data about those of us who use the platforms constantly, for the purpose of selling ads but then also for the purpose of making recommendations. They have to keep you on site in order to continue to serve you content, and as part of that, this is where curatorial algorithms come into play, some of the stuff that I talk about a lot. We can talk about the tactics of the IRA all day long, but what is the information environment that leads to those tactics? Why do those tactics work? And I think what we have is mass consolidation of audiences, precision targeting, and then gameable algorithms. With these curatorial algorithms, particularly early on, in 2015 and 2016, you might remember what a disaster Twitter trending was prior to Twitter really taking into account things like quality of accounts. Any botnet could make anything trend, and regularly did. And so this is where the architecture of the information ecosystem just lends itself to influence operations in a big way, in part because they are producing content with the goal of virality. They're producing highly emotionally resonant content. They're oftentimes really working hard to kind of own their keywords and make it so that when you search for a term, they're what you find. And this is just how the environment has evolved.
When we talk about the idea of the tech companies as victims, I would say they did not expect this, and this is not necessarily the kind of adversary one would expect. Right. Who's thinking about how Russian intelligence is going to game my platform? But where I do kind of assign some culpability is that, starting in around 2015, we were talking about ISIS, and we were looking at other malign entities, terrorist organizations, that had begun to kind of co-opt the platforms. And if you remember the conversation around that time, people were really like, oh, what if we kick ISIS off Twitter? I mean, who's next? Right. So it was framed as this slippery slope. It was binary: either we let them stay on here, or we've just moved into an environment of mass censorship. And so we really erred on the side of not moderating. And the government and the tech platforms really weren't collaboratively working together in any way. This was relatively soon after the Snowden revelations; they didn't want to be seen as cooperating with the government. So there was this moment, this almost-inflection point, where we could have taken the time to look more deeply at this stuff, but it was just seen as, well, this is one random terrorist organization and not a big deal. If you look back at that now, you'll realize that while that was happening, the Russian operation was already underway. And if you look at DARPA, right, the Defense Advanced Research Projects Agency, whose job is to prevent strategic surprise, they were running studies looking at whether propaganda on social platforms was going to be a problem starting in 2012. So there were indications that maybe we should have been thinking about this, and we weren't.
I think it's really hard, and I'm not really interested in kind of pointing the finger of blame backward. I think where the finger of blame is warranted is actually in how they comported themselves through 2017, when this became abundantly clear and we still had that kind of hedging from Facebook, as opposed to their being immediately transparent about it. I think they've come a long way since then, but that was a particularly tense time. How do you make this behemoth company accept accountability for what happened?

S6: Yes. So what is your recommendation at this point? I mean, as you said, there's been a lot of movement in terms of what social media platforms are willing to do these days, in moderating, in working with the government, and in taking the advice of researchers or experts. But is there more that's needed? I mean, are we in a better place for 2020 than we were for 2016?

S11: I think we are. I think we are, because I think some of the calls for multi-stakeholderism from, you know, myself and others have actually been resonant in some ways. So we had a great example: the investigation project that I did was outside experts working with the government to understand what happened. The sort of third piece of that is really creating this multi-stakeholder system that incorporates the tech companies, and that, rather than being backward-looking, where we're all still trying to suss out what happened in 2016, is informed by these findings. We say, this is how this investigation went, this is how it could have been improved, this is how we can structure it to be forward-looking and to find things that are going to be a problem in 2020. I think the 2018 midterms were almost a pilot project for that. I know I could back-channel things to tech platforms: hey, look at this bot, hey, look at this thing. And they were very receptive to it. So we had moved past that period of 'okay, thanks, whatever,' and we were solidly in the realm of 'this is great, thank you, our team will get right on this.' So that's where I hope that we kind of formalize some structures to continue the positive work of 2018.

S6: All right, Renée DiResta, thank you so much for joining us on If Then. Thanks for having me.

S13: One more quick break, and then we'll do Don't Close My Tabs, some of the best things we've seen on the Web this week.

S6: It's time again for Don't Close My Tabs. April, what tab could you not close this week?

S12: My tab this week is one that I'm not done reading, but I think it still falls into the category of tabs, because it's one that's still open and that I have not closed. It's in Logic magazine, which is one of my favorite imprints for kind of critical technology studies and explorations. It's from their newest issue, Play, and the article is entitled 'My Stepdad's Huge Data Set,' by Gustavo Turner. It is about big data and porn: about how the large amounts of data collected from people who watch porn are reshaping the porn industry, the way porn is presented, the way media talks about sexuality. It's about concentration of power. People pay for subscriptions to the weirdest things, or not just the weirdest, but perhaps, you know, the most fringe things, whether it's stuff about incest and really taboo topics. The fact that people are willing to pay for that is causing the popularity of that type of content to surge, even if it's not what most people necessarily want. So I'm not done reading it, but I super look forward to finishing it, because porn is such a big part of the Internet, and it's something that we don't talk about a lot.

S6: Yeah, it really remains one of the under-covered and under-examined parts of the Internet. And maybe that'll change in the coming years, I don't know. But there is still this sort of taboo in talking about it and writing about it, even though so much Internet activity happens on porn sites.

S10: Will, what is your tab that you cannot close this week?

S6: Mine this week comes from a site I had never heard of before. Somebody flagged it on Twitter, I clicked through, and I'm glad that I did. The site is called The Pudding, and its little tagline is that it creates data-driven visual essays. It has a Patreon that says each story on The Pudding represents a few weeks of the author's life spent researching, analyzing, and coding a culturally rich topic. The one that I bumped into is called Population Mountains. It's by someone named Matt Daniels, and it says: this is a story about how to perceive the population of cities. It fascinated me because I'm somebody who, ever since I was a kid, would read those lists, in encyclopedias or now on Wikipedia, of the world's largest cities. And the lists were evocative, but you couldn't really picture anything about the place just from seeing the raw numbers. The number of megacities in the world, of course, has exploded in the past 20 years or so. And what this visualization does is bring home that population explosion, and the scale and the scope of these cities and population centers around the world, in a way that you can't get just from looking at the raw numbers. Wherever there are more people in a given square mile, you get a taller peak in the visualization. And so downtown New York, Manhattan, looks like skyscrapers, because there's such population density there. If you go to Washington, D.C., or Atlanta, it looks more like rolling hills, because people aren't quite as concentrated. Singapore looks like a profusion of skyscrapers. And you can also compare cities and regions side by side in a way that I haven't seen possible elsewhere. There's one visualization that shows California side by side with the island of Java, and Java just makes California look like an empty desert, like a ghost state, compared to cities like Jakarta and Bandung.
It shows how Kinshasa in the Democratic Republic of Congo dwarfs Paris in population size. But again, it's not just the size: you sort of get a feel for what these cities look like, for how the population density is shaped, that I've never gotten before. It was really striking to me, and there's just a lot to take from it.

S9: Wow, that sounds fascinating. I am really excited to dig into that. Thank you so much.

S6: Well, yes, the URL for that site is pudding.cool. You can also probably find it on Google by searching 'population mountains.'

S13: All right. That does it for our show. You can get updates about what's coming up next by following us on Twitter at @ifthenpod.

S4: You can also email us at ifthen@slate.com. Thanks so much to everybody who wrote in in response to our questions. We'll feature some of your responses on a forthcoming episode.

S5: You can follow me and Will on Twitter as well. I'm @aprilaser, and Will is @WillOremus. Thanks again to our guest, Renée DiResta. You can find her on Twitter at @noUpside. Highly recommend that follow.

S4: And thanks to everyone who has left us a comment or review on Google Podcasts, Apple Podcasts, or whatever platform you use to listen. We really appreciate it, not only because we like to be praised, but because it helps other people discover the show. Without those reviews, nobody would find us. If Then is a production of Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. Our producer is Max Jacobs.

S5: Thanks to Alberto Hernandez and Cody Hamilton for engineering in Berkeley, California.

S4: And thanks to Nick Holmes at Occupy Studios in Newark, Delaware. We'll see you next week.