A New High-Tech Weapon in Ukraine

S1: When the Russian warship Moskva sank last week in the Black Sea south of Ukraine, some 500 crew members were reportedly on board.

S2: On fire, badly damaged, listing, with a huge column of smoke rising above, and the life rafts already deployed.

S1: Ukraine said it sank the warship with two Neptune missiles, which U.S. officials backed up. Russia claimed a fire on board caused ammunition to explode, damaging the ship.

S2: Russia’s Ministry of Defense saying the ship lost its stability as it was towed to port after what they say was a fire on board, adding the ship sank in a stormy sea.

S1: At this point, Russia and Ukraine were locked in familiar positions in the information war. Russia made the next move.

S3: The Russian state held this big kind of ceremony for the surviving sailors and officers and whatever who were on the ship. And there were, you know, 200 or 300 or so people kind of in this big line. I think it was in Sevastopol. You could see them all lined up, like hundreds of soldiers lined up, all getting their medals and being recognized and whatever.

S1: That’s Aric Toler. He directs research and training for Bellingcat, the group that specializes in open source and social media investigations. You can think of him as a professional Internet sleuth, but for journalism.

S3: There are a lot of questions people have about this, you know, especially considering Russia has not been the most truthful when it comes to this sort of thing. Are these actually sailors, or are they just random people right off the street who, you know, put on a uniform? Maybe all 500 crew members are dead, and these are just random young men they grabbed to put in uniforms and have the ceremony.

S1: That question, whether these were actual sailors who survived, and how many, matters. It affects how the Russian public views the war. It affects Russian parents, some of whom have come forward after the sinking of the Moskva demanding answers. That’s where Aric comes in. He used facial recognition software to identify the men in the video through images on Russian social media.

S3: And it pops up either the account of the sailor or, most often, of their wife or girlfriend. And for the most part, they look like they were legit. They were actually sailors from Sevastopol. I don’t know if they were stationed on this ship, like, you know, I didn’t find a lot of selfies taken on the missile cruiser, but I could tell. Most of the guys I searched, maybe, you know, a half dozen or so, were clearly sailors, and they were from the same town where this ship was operating out of.

S1: Aric is the first to admit that on its own, this morsel of information isn’t going to change the course of the war. But it’s a tiny, momentary clearing in the fog of an often obscured picture. And facial recognition, whether it’s used by researchers like Aric, regular citizens, or the Ukrainian government, is changing the way this war is fought. Today on the show: facial recognition on the battlefield, what it can do, what it can’t, and whether its use in Ukraine is a testing ground for other places. I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stick around.

One of the main facial recognition tools Aric uses in his work, the one he was using to investigate that video of the Russian sailors, is a program called FindClone. It’s cheap, he says, costing about $5 a month, and it’s easy to sign up for.

S3: At some point, and I think it was 2018 or so, the guy or guys who run it, they’re based in Moscow, they scraped basically all of VKontakte, or VK for short, which is the Russian Facebook. It has hundreds and hundreds of millions of users. It’s extremely popular. They scraped every single photo from the site, like every one of them, and they then took these hundreds of billions of photos and ran a machine learning algorithm to do really good facial recognition on them. And they were able to link every photo back to its original post and profile. So in short, every single photo on VK was put through the wringer of machine learning facial recognition and put on the site, and you can put a face in there and search it. And the benefit from this is not just that, you know, you search a face and you see the person’s profile, which happens, you know, maybe 50% of the time. What happens more often is you search a face and the person you’re searching, especially, you know, we do a lot of work with Russian spies and security service officers, these people don’t have accounts usually, but their wives do, and their old college buddies still do, and their brothers do, and their moms do, and their kids do. And so you’ll find them in the background of photos. They’re sitting there at, you know, a birthday party they had, right, and you can see their face behind a cake. And then you look at the name and identity of the person uploading the photo, and you can figure out, oh, that’s them, because they have the same last name, or they live in the same town with the same last name, or you can see their wife in the photo or whatever.
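
What Aric is describing, scrape photos, compute a face “embedding” for each one, and answer queries by finding the nearest stored embedding, is a standard reverse face search pipeline. Below is a minimal sketch of that idea in Python using the open source face_recognition library. FindClone’s actual code and models are not public, so everything here, the function names, the data structures, the thresholds, is illustrative, not their implementation.

```python
# Hypothetical sketch of a reverse face search pipeline, in the spirit of
# what Aric describes. FindClone's real system is not public; this only
# illustrates the general embed-and-search technique.
import face_recognition
import numpy as np

index_encodings = []  # one 128-dimensional embedding per detected face
index_profiles = []   # profile URL the photo came from (parallel list)

def index_photo(path, profile_url):
    """Add every face found in one scraped photo to the index."""
    image = face_recognition.load_image_file(path)
    for encoding in face_recognition.face_encodings(image):
        index_encodings.append(encoding)
        index_profiles.append(profile_url)

def search(query_path, top_k=5):
    """Return the profiles whose indexed faces are closest to the query face."""
    image = face_recognition.load_image_file(query_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings or not index_encodings:
        return []  # no face in the query image, or nothing indexed yet
    distances = face_recognition.face_distance(
        np.array(index_encodings), encodings[0]
    )
    best = np.argsort(distances)[:top_k]
    return [(index_profiles[i], float(distances[i])) for i in best]
```

At FindClone’s scale, the linear scan in search would be replaced by an approximate nearest-neighbor index, but the principle is the same: embed every face once, then answer any query by distance.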

S1: What is it, I don’t know, what’s it like to be using these kinds of tools? Like, is it exciting? Scary? I’m sort of curious what internal response it provokes in you.

S3: Eventually you get kind of desensitized to it, because you’re using it so much. But the first time you use it, it feels like your mind’s blown, because you’re like, how is this possible? How could this be so easy? You just plug in a face and there you go, there’s the profile, there’s the family. We have some spy that we’ve been, you know, researching for months and months, and we have their passport photo, and we put the passport photo in, and all of a sudden the whole world is opened up. It’s also very creepy, because you’re doing the searches and you’re pulling up all these kind of innocent bystanders, right? You’re finding their family, people who have no idea who these people are that we’re investigating, or have a limited understanding. You feel like you’re getting a view into their life that you probably shouldn’t be able to. But because of this tech, it’s possible.

S1: But it’s not just the tech. It’s also possible because of the specific environment Aric is operating in. The Russian Internet, especially when it comes to data and privacy, is worlds away from the American one.

S3: You have constant, constant, constant data leaks coming out of Russia, like massive leaks, not just like, oh, some consumer data leaked from some website you signed up for. It’s like the entire government registry of cars registered in Moscow for three years was leaked, right? So if you own a car in Moscow, then your passport number, your date of birth, your address, your phone number is public and online. Russia is kind of a wild, wild East, I guess you could say, of data privacy and data legislation, because it’s basically lawless. About a year ago, if you had about 50 bucks or so, you could buy someone’s cell phone records. There are a lot of stories about, you know, wives who think their husbands are cheating on them, and they buy their husband’s cell phone data, and they see that they’re calling some woman at 2 a.m., right, and they know that they’re cheating on them. This doesn’t exist in the U.S. at all. So hand in hand with these Russian tools is kind of the background of the general culture, I guess you could say, the lack of privacy with online data in Russia in the first place. It kind of goes hand in hand with that.

S1: Yeah. I think in the U.S., you know, a lot of the conversations, particularly around facial recognition tech, have been about domestic policing and data gathering to be used in that vein. I wonder how you would describe the role that same tech has been playing in war over the past few years.

S3: Again, Russia’s kind of its own thing here. It’s not just FindClone; there are lots of facial recognition services out there for Russia, and it’s been used in the war for, you know, for years now. Basically, since these services started popping up around 2016, 2017, people have been using them to identify soldiers, fighters, especially back in the earlier stage of the war, when Russia denied, you know, ever being involved in the war at all. And people would, you know, run these on fighters and soldiers to kind of prove that they’re, you know, Russian mercenaries or soldiers or whatever. So it’s been used quite a bit. It’s apples and oranges when you compare how this is used by, you know, U.S. police and government forces in the U.S., and probably the U.K. and Europe.

S1: I mean, I’m listening to you talk about this, and it sounds like this kind of, I hate to use the phrase perfect storm, but in some ways it seems applicable, because you’re talking about a place with a very different data culture and the bottom-up ability to use things like FindClone. There are just so many different permutations of it going on.

S3: Yeah, it’s like four or five things all hitting at once, right? And one more thing you have there is just petty corruption, right? This is a big reason why so much of this data is out there: people leak this information just because their salaries are not high enough, you know, cops who don’t get paid enough, so on the side they go to, you know, databases and sell data. And the same thing here, that’s why you can uncover all these, you know, spies with Russian military intelligence and the FSB, because of this perfect storm of information. And, you know, a lot of people ask, why don’t you do this for the CIA? Why don’t you do this for Mossad or MI6 or whatever? Well, you know, it’d be great if we could, because it’d be fascinating to see, you know, what people are doing in different operations all over the world, and, you know, the dirty work and all that stuff. But these circumstances don’t exist basically anywhere else on earth, this perfect storm of data. In Russia you have ubiquitous Internet connections, everyone uses the Internet, everyone uses social media, just like in the West, and you’ve got this petty corruption with data everywhere. Just because of all these different parameters and factors, it’s just not possible to do the same sort of work basically anywhere else in the world.

S1: Recently, the circumstances got a bit more complicated. The Ukrainian government started working with Clearview AI, the controversial American facial recognition company. Clearview is famous, or perhaps infamous, for working with law enforcement and for scraping people’s photos from the Internet without their consent. According to multiple news reports, the company says it offered its services to five Ukrainian government agencies free of charge. That’s led to Ukrainian soldiers scanning the faces of dead Russians to find their identities, then contacting their families.

S3: They published videos about this, so they actually publish videos showing the chats that they have with the mothers of the dead soldiers. And this is kind of like schadenfreude, right? It’s supposed to be like, you know, look at them suffering, look at the family, you know, look at the mother being shocked and horrified about her son being dead, right?

S4: Next, to notify their loved ones about the death of a soldier and attach a photo of the body. As of now, we have managed to identify 582 abandoned corpses and informed relatives.

S3: That first example, I know, is kind of like the most twisted and worst version of this. But I think there are some cases that could be called, you know, benign, good cases. It’s a little more limited, though, to journalistic investigations, things like that, where there is an editorial process involved and fact checking and all that stuff, and not just, I guess, you know, some dude who’s on Twitter harassing families.

S1: I think there is, though, a narrative, at least that I have seen in the press, about the sort of, quote unquote, good use case of facial recognition in this war, maybe on the side of Ukraine. And I want to unpack that a little bit, because I think that’s one of the difficult things about technology. You will hear a story about the nefarious use case and then a story about the good use case. And as someone who deals in the kind of gray areas, I wonder how you react to kind of seeing all those different pieces of information come out.

S3: Yeah, it’s really hard to unpack. So maybe the more nefarious and dubious stuff is Ukrainian soldiers contacting the families of dead soldiers in order to, like, maybe harass them and mock them or whatever, right? Maybe the more benevolent or good use case, if there is such a thing, is the same thing, but it’s a Ukrainian or Russian journalist trying to contact the family to talk about, okay, have you received compensation for the death? Have you been told to hush it up by the government? You know, that sort of thing, looking at the government policy relating to dead soldiers.

S1: How much do you worry, at all, about the risk of misidentification with these tools, particularly being used by civilians and not people who are subjecting what they do to fact checking and a multi-step verification process?

S3: I mean, the classic case everyone talks about with this is Charlottesville.

S1: Aric is talking about the white supremacist rally in Charlottesville, Virginia, in August of 2017. When videos of the rally were shared across the Internet, lots of amateur sleuths tried to figure out who was there, but they didn’t always get it right.

S3: There weren’t facial recognition tools people used for this. People just eyeballed it, and they said, this person is this person, and they tried to get them fired from their job or whatever, right? And sometimes they were right, and very often they were not right. There was a famous example of a guy who was a professor at the University of Arkansas, I think, who was falsely accused of being at Charlottesville when he was just some random guy. So you can see how very easily you can go down those rabbit holes.

S1: I mean, listening to you, it actually almost makes it sound like the Charlottesville misidentification would have been lessened by people using facial recognition tech instead of just eyeballing it.

S3: Maybe. I could see it, just because people were so rabid about this, right? People were so fired up. I think it could go the other way, too, just because people would have a false sense of security, of confidence, in using these platforms. All it takes is one person.

S1: When we come back: what happens if Americans can get a cheap facial recognition tool, too? As Aric said earlier in the show, there are a lot of circumstances in and around Russia that make it easy for average people to use facial recognition. That’s something that hasn’t really happened here, at least not yet.

S3: All it takes is one big dump of photos from Instagram or Facebook, and then it could be applied. It’ll never be to the degree of Russia and Ukraine, just because both countries are so corrupt and rotten with data leaks and corruption and all that stuff. It will never get to that level here. Fingers crossed, hopefully it doesn’t. But we have our own American version of this. We have our American exceptionalism with this, in that we have lots of access to commercial apps and providers that could be linked at some point. You know, you have Facebook and all these places working in conjunction with a lot of different apps that also scrape data. There are a million examples of people buying commercial data. This is legal. It shouldn’t be, but it’s legal. You can buy people’s geolocation data from different apps that track it. Most famously, or maybe infamously, there was an example of someone who was able to buy the geolocation data that came from Grindr, and they used it to out a priest, I think it was a Catholic priest. So I don’t know if we’re going to have something exactly like this on the facial recognition side, but if we ever do get to that singularity of data being available to everyone, it’ll probably be through something like that, some kind of leak or sale of data through commercial apps, somehow related to your face. Like maybe FaceApp, that was around a while ago, gets hacked or sold, or maybe a big data dump from Facebook or LinkedIn or something. If this happened with Facebook, I think the world would burn down with how people would use it.

S1: How do you think about how race fits into this picture? Because obviously there’s been work by, you know, Timnit Gebru, Joy Buolamwini, and others that shows that facial recognition works less well on darker skin tones. That’s something that I think about a lot if we’re thinking about, say, an American application in a more diverse country.

S3: Yeah, that’s definitely an issue, because these things are tested most often on white people, right? And this is something I run into a lot when I run FindClone on people who are not ethnically Russian or white. Like, for example, you know, Russia’s a very diverse country, with, you know, hundreds of races and ethnicities and religions and all that stuff. And sometimes I run facial recognition on people who are from the Far East, like Buryats, for example, this is an ethnic group that’s near Mongolia. So I run facial recognition on them, and the results are much worse than if I were to run it on an ethnically Russian person. If I run facial recognition on a Black person on FindClone, it’ll bring up, you know, Barack Obama or an NBA player, right? But that’s mostly because it doesn’t have a ton of, you know, there are not a lot of Black people in Russia.

S1: That’s what the site was trained on.

S3: Exactly. So the data is just not as good.

S1: I’m struck by the tension between two things you said. Number one is the sort of, it’s going to happen, these tools are going to be more widespread around the world. And then the, if it happened with Facebook, the world would burn down.

S3: It already exists behind closed doors, right? Google, Facebook, Amazon, Microsoft, all these places already have extremely powerful facial recognition that works. If you go to Google Photos, it has built-in facial recognition that’s extremely powerful. But it’s within your own closed data set. If I have, you know, 5,000 photos of me and my family and my friends, it’ll say, okay, here’s you, here’s your buddy, here’s your son, here’s your wife, and it can pick them out. It’s kind of like a dam, right? What could burst through the dam is if this stuff is made commercially available. All it takes is one person on GitHub plus a data leak. And once that happens, you know, it’s fair game.
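
That closed-door capability, grouping the faces in one person’s own library, is typically implemented as unsupervised clustering of the same kind of face embeddings, rather than a search against strangers’ photos. Here is a hedged sketch using the face_recognition embeddings from the earlier example plus scikit-learn’s DBSCAN; Google Photos’ actual pipeline is proprietary, and the eps threshold here is just a commonly used starting point.

```python
# Hypothetical sketch of private-library face grouping, along the lines of
# the Google Photos feature Aric mentions. Google's pipeline is proprietary;
# this only illustrates clustering embeddings within a closed data set.
import face_recognition
from sklearn.cluster import DBSCAN

def group_faces(photo_paths):
    """Cluster the faces in a personal photo library; each cluster is roughly one person."""
    encodings, sources = [], []
    for path in photo_paths:
        image = face_recognition.load_image_file(path)
        for enc in face_recognition.face_encodings(image):
            encodings.append(enc)
            sources.append(path)
    if not encodings:
        return {}
    # DBSCAN needs no preset number of people; label -1 means unclustered noise.
    labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(encodings)
    groups = {}
    for label, path in zip(labels, sources):
        if label != -1:
            groups.setdefault(label, set()).add(path)
    return groups
```

The difference between this and something like FindClone is not the math. It’s the scope of the data: clustering your own 5,000 photos versus indexing a scraped copy of an entire social network.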

S1: Aric Toler. Thank you very much.

S3: Sure. Thanks for having me.

S1: Aric Toler is the director of training and research at Bellingcat. That is it for the show today. TBD is produced by Ethan Brookes. We’re edited by Torie Bosch. Joanne Levine is the executive producer for What Next. Alicia Montgomery is the executive producer for Slate podcasts. TBD is part of the larger What Next family, and it’s also part of Future Tense, a partnership of Slate, Arizona State University, and New America. I want to take a moment and recommend that you listen to Tuesday’s episode. It’s a conversation with a caseworker who quit his job after having to investigate families of trans kids in Texas. We will be back next week with more episodes. I’m Lizzie O’Leary. Thanks for listening.