Future Tense

Scammers Are Using Deepfake Videos Now

Paul Scharre, vice president and director of studies at the Center for a New American Security, views a deepfake video of Barack Obama created by filmmaker Jordan Peele, one of the most famous examples of the technology. Robert Lever/AFP via Getty Images

Highly realistic deepfake videos didn’t quite make the splash some feared they would during the 2020 presidential election. (Less sophisticated “cheapfake” videos certainly did make the rounds, though.)

Nevertheless, deepfakes are causing trouble—for regular people.

In March, the Federal Bureau of Investigation warned that it expected fraudsters to leverage “synthetic content for cyber … operations in the next 12-18 months.”

In deepfake videos, which first appeared in 2017, a computer-generated face (often of a real person) is superimposed on someone else's. After the swap, the fraudsters can make the target appear to say or do just about anything. To get a realistic image, scammers use artificial intelligence that studies photos and videos of the person from different angles. Producing a deepfake video does not require being a professional hacker: Many popular face-swap programs are free, and it is possible to create one even on an iPhone. (High-quality content still requires a powerful computer, however.)


As the technology becomes more widely available, fraudsters have started to use it in different spheres of life. On Wednesday, the Times of India reported on a new type of cybercrime in India: sex extortion involving deepfake porn videos not of the victim but of the fraudster, at least at first. Usually, a scammer (using a profile photo of a woman, almost certainly a fake one) sends a friend request to the target on social media. The scammer then exchanges text messages with the victim to build some trust and finally makes a video call. The fraudster uses a computer-generated video of a woman to entice the victim to masturbate, which the scammers record and use afterward to blackmail the person. At least two men in the western Indian city of Ahmedabad reported such extortion calls to the police. One of them complained that fraudsters demanded about $3,000. The men themselves reportedly didn't realize the woman on video was a deepfake; that was revealed only after the police investigation. (One of the men said he "did not indulge in any obscenity," so it's unclear what he was being blackmailed with.)


When the victim doesn't fall for a computer-generated video and refuses to participate in virtual sex, the scammers may then create a second deepfake video, making it look as if the target engaged in a sexual act. According to the Indian Express, a resident of Delhi was blackmailed in this way. The woman who had befriended him on Instagram appeared naked during a video call. The man ended the call as soon as he realized what was happening. Yet the fraudsters still tried to blackmail him. He refused to pay up because, according to him, he had not done anything obscene. But that didn't stop the criminals. "The scammers had taken a picture of my face from the video call I had with them and superimposed it on someone else's body. In the video they shared, it appears as if I am having a sex chat," the newspaper quoted the victim as saying. The manipulated video was sent to his family and friends.


Scammers are using deepfake videos in other ways, too. In Russia, criminals have learned how to collect personal data with this technology. On Sept. 6, scammers posted on social media a deepfake video featuring Oleg Tinkov, the founder of Tinkoff Bank, one of the 15 largest banks in Russia. The "clone" of Tinkov called on people to use the bank's investment tools, promising to give every client a bonus of 50 percent of the amount invested. Once users clicked on the link in the video, they were taken to a fake version of the bank's website and asked to share their names, emails, and phone numbers.

In the U.S., deepfakes have also started showing up on dating apps. According to the Daily Beast, in 2019 a woman from California was scammed out of about $300,000 after being led on by manipulated clips. The elaborate scheme involved two scammers (or one fraudster using two different accounts). The first criminal, whom she met on an online dating site, pretended to be Sean Buck, a U.S. Navy vice admiral and the superintendent of the Naval Academy. The woman talked to him on Skype regularly. The second scammer posed as an American who, after months of communication, told her that he had been imprisoned overseas. The woman asked "Buck" to help release the prisoner, and "the admiral" had her transfer thousands of dollars "to pay the lawyer." Law enforcement later discovered that every time the victim spoke to "Buck" on Skype, she was actually watching "manipulated clips of preexisting publicly-available video of the real Admiral Buck."

There might be many more cybercrimes using deepfakes; not all of them appear to be reported, since victims may be embarrassed about being scammed. Besides the universal recommendations not to answer video calls from strangers and not to send them money, experts have specific tips for recognizing deepfake videos and protecting yourself from this kind of cybercrime. As the FBI pointed out in its March warning, look for too much space between the subject's eyes, visual distortions around pupils and earlobes, syncing issues between face and lip movement, and a blurry background in the video.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
