Future Tense

A Woman Frustrated by Unsolicited Dick Pics Decided to Make Her Own Filter

Why couldn’t the big social media companies do this?

Photo illustration by Slate. Photo by Khosrork/iStock/Getty Images Plus.

Last week, Kelsey Bressler woke up, checked Twitter, and found a photo of a stranger’s dick in her DMs. Understandably, she was not pleased—“Nothing like waking up to an unsolicited dick pic,” she tweeted before messaging the stranger back to tell him his antics were not OK. A friend she’d met online through activism work saw the tweet and offered to make something that Bressler, and every other person who’s experienced cyberflashing, would want: a filter that can recognize and automatically remove penis photos.

Bressler, like many women, is intimately familiar with this type of online harassment. According to research from Pew, young women in particular are about twice as likely as young men to experience sexual harassment online; 53 percent of women 18 to 29 years old said they’d received an explicit image they did not ask for. Another survey by market research company YouGov found that 78 percent of millennial women said they’d received an unsolicited dick pic. As men send their willies willy-nilly via dating apps and social media messages and abuse iPhones’ AirDrop feature to penis-spam their fellow passengers on the subway, Texas and New York City are working to criminalize “cyberflashing.”

In theory, the major social media platforms and some dating apps have filters in place to prevent strangers’ dicks from sliding into your DMs. But Bressler says that in her experience, they don’t work. On Twitter, for example, she’d already selected the option to prevent “sensitive images” from automatically appearing in her feed, but it’s unclear whether that setting also filters content in direct messages. “I don’t think it’s working as intended, or maybe it’s not meant to block out pictures of penises,” she says.

At any rate, the enforcement isn’t consistent. In 2017, another Twitter user ran into a similar problem when her harasser created three accounts, all with the same display name and avatar, and messaged her the same dick picture from each account. When she reported each account, Twitter gave her three different responses: One told her to report the dick pic directly, another locked the reported account, and a third found “no violation of Twitter’s Rules regarding abusive behavior.”

So Bressler decided to take matters into her own hands. After her friend built an A.I. and trained it on a database of not-safe-for-work images to recognize penises, Bressler turned her experience on its head: She went on Twitter to ask for dick pics to test the filter’s accuracy.
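
The pair hasn’t shared its code publicly, but the training step Bressler describes (teaching a classifier on a labeled set of explicit and nonexplicit images) is standard transfer learning. Here’s a rough sketch of what that could look like in Python; the dataset folder, model choice, and file names are illustrative assumptions, not details from the project:

    # Illustrative transfer-learning sketch; NOT Bressler's actual code.
    # Assumes a labeled folder "nsfw_dataset/" with one subfolder per class
    # (e.g., "penis/" and "other/"), which is a guess at the setup.
    import tensorflow as tf

    train_ds = tf.keras.utils.image_dataset_from_directory(
        "nsfw_dataset",
        image_size=(224, 224),
        batch_size=32,
        label_mode="binary",
    )

    # Reuse a pretrained backbone and train only a small binary "head."
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights="imagenet"
    )
    base.trainable = False

    model = tf.keras.Sequential([
        # MobileNetV2 expects pixel values rescaled to [-1, 1].
        tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # P(image is a penis)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=5)
    model.save("penis_classifier.h5")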

In response, Bressler has received at least 300 pictures. Not all are dicks, Bressler says; some are of people putting their fingers through their pant zippers, emulating a dick. She’s gotten some photos of Trump, too. But in general, she says the filter’s done an excellent job of identifying “vanilla penis pictures,” catching and screening out more than 95 percent of your run-of-the-mill dicks.

The current filter is set up as a Twitter plugin; it crawls your messages and compares any photos with what it’s learned from its library of dicks. If it determines a photo is a penis, it deletes the photo. On the user’s end, all that appears is a blank message.
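
Bressler hasn’t detailed how the plugin hooks into Twitter, so the following is a guess at the mechanics rather than the project’s actual code: a minimal Python sketch of the scanning loop, with the platform calls stubbed out and the model file carried over from the training sketch above.

    # Hypothetical sketch of the scanning loop; the platform hooks
    # (fetch_new_dms, delete_dm) are stubs, since the plugin's actual
    # Twitter integration isn't public.
    from dataclasses import dataclass, field

    import numpy as np
    from PIL import Image
    import tensorflow as tf

    THRESHOLD = 0.5  # assumed decision cutoff; tune on a validation set
    model = tf.keras.models.load_model("penis_classifier.h5")

    @dataclass
    class DirectMessage:
        id: str
        image_paths: list = field(default_factory=list)

    def fetch_new_dms():
        """Stub: wire this to the messaging platform's DM API."""
        return []

    def delete_dm(dm_id):
        """Stub: the platform call that removes the flagged message."""
        print(f"deleted DM {dm_id}")

    def looks_explicit(image_path):
        """Resize one image and apply the classifier's cutoff."""
        img = Image.open(image_path).convert("RGB").resize((224, 224))
        # Raw [0, 255] pixels: the saved model rescales internally.
        batch = np.expand_dims(np.asarray(img, dtype=np.float32), axis=0)
        score = float(model.predict(batch, verbose=0)[0][0])
        return score >= THRESHOLD

    def scan_inbox():
        """Poll for new DMs and delete any with a flagged attachment,
        leaving the recipient a blank message."""
        for dm in fetch_new_dms():
            if any(looks_explicit(p) for p in dm.image_paths):
                delete_dm(dm.id)

    if __name__ == "__main__":
        scan_inbox()

One wrinkle any such design has to manage: The 95 percent figure Bressler cites describes how many dicks get caught, but a deployed filter also has to keep false positives low, since silently deleting legitimate photos is its own failure mode. The THRESHOLD value above is where that trade-off would be tuned.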

Edge cases, though, have fooled the A.I. “There was an entire penis coated in glitter, so it looked metallic,” says Bressler, as an example of one that slipped through the filter. “Someone submitted a picture of their penis in a little penis cage—didn’t catch that one. And someone else put their penis in a hot dog bun.”

As someone who has never sent an unsolicited dick pic, I have always wondered about the privacy implications of doing so. Are penis-spammers unconcerned about their identities being linked back to their appendages? In this case, I can see how someone might be inspired to contribute their Johnson to science—but I’d also be deeply uncomfortable with any researcher having photos of me, let alone any photos of body parts I usually keep clothed. I asked Bressler about the privacy implications, and she says they’re hoping to keep things as anonymous as possible. “The pictures are being deleted immediately; we’re not saving them anywhere, and the only ones sticking around in my inbox are the ones not being caught,” she says. But still, as a victim of revenge porn, Bressler is all too familiar with what can happen when intimate photos fall into the wrong hands. “If someone is really concerned about it, I’d say just don’t participate in the project.”

In the long term, Bressler hopes that the tool can be made more widely available, whether that’s through sharing code with others or working with tech companies to implement filters directly on their platforms. “Our intent was to make something to prove it can be made,” says Bressler. “If it takes two random people coming in and making a solution, then we should ask, why aren’t companies taking this more seriously?”

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
