Work

Tell This Bot About Your Experience of Harassment. It Might Actually Help.

Photo illustration by Slate. Photos by Thinkstock.

Last summer, while psychological scientist Julia Shaw was visiting San Francisco with three friends for July Fourth, inspiration struck. They had been discussing the harassment-related firings at Uber. In some cases, complicity extended beyond the perpetrators to the human resources department, which dismissed or ignored the employees who came forward about the abuse instead of supporting them. Other tech companies had followed the same old script: As a society, we tend to disbelieve, blame, or retaliate against victims of sexual harassment and abuse.

It’s no wonder, then, that 75 percent of sexual harassment incidents go unreported, according to the EEOC. That data point intrigued Shaw. She wondered: Could her research on memory, combined with her friends’ skills in artificial intelligence and software development, make it easier for victims to report abuse?

While other San Franciscans were celebrating Independence Day, Shaw’s team got to work on the concept that would eventually become Talk to Spot, a platform that allows users to anonymously report abuse or harassment they experienced at work to an A.I. chatbot named Spot. Bouncing between co-working spaces across the city, Shaw and her team finally took a break to watch fireworks from a rooftop. “It felt like the beginning of a new era,” she recalls.

Indeed, it was: People have been bungling some of the messiest sexual harassment-related problems for years, in part because those tasked with the problem-solving, HR representatives, are torn between advocating for vulnerable employees and guarding against a potential lawsuit. Two recent examples: Former Uber engineer Susan Fowler reportedly took her experience to HR only to be dismissed because her harasser was a “star performer,” while the complaints filed against Harvey Weinstein were allegedly channeled back to him.

Could a suite of technical tools designed to override human biases have done a better job?

In February, Shaw’s team launched their bot, which uses a cognitive interviewing style, the kind police use when they’re trying to get information from victims, suspects, or witnesses in a neutral, non-leading way. Instead of making assumptions about an interviewee’s credibility or their story, as an unscripted human might, this style simply gathers information without pushing the interviewee or asking leading questions. With 40 years of research behind it, cognitive interviewing is also “the best practice for interviewing for highly emotional memories,” Shaw says. “People’s memories can be questioned if they don’t have supporting evidence.” It’s a way of gathering evidence that’s more standardized than an untrained HR representative asking questions, which can re-traumatize victims or distort the retelling of their stories. Once Spot gathers the story, users can decide whether they want to keep it for themselves or forward a PDF of it to an HR representative or someone else. Spot also allows users to stay anonymous by sending the file from a separate email address.

Shaw is part of a larger group of entrepreneurs who have launched harassment- and discrimination-specific platforms over the past year to empower victims and to capitalize on many HR departments’ ineptitude. Although many entrepreneurs like Shaw characterize their products as complementary to people in HR, these tools still raise big questions: How much can tech replace humans? And even if it can, should it?

The phenomenon of underreporting assault is also part of what inspired Jess Ladd to launch her nonprofit Callisto back in 2015.

“What we’ve found from our work with victims and survivors is that not everyone wants to go in and talk to a human,” she says. “As a first step that often feels big and scary.”

Callisto began as a platform for college students to find information about assault, and to determine whether their assailant was a repeat offender: Through the platform, a victim could submit their assailant’s name and a unique identifier, and if another person had also submitted the same unique identifier, the contact information of both victims would be sent to the school for follow-up and investigation.
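The matching mechanic is simple to picture. Here is a minimal sketch of that escrow idea in Python; the function name and plain-dictionary storage are hypothetical illustrations, and the real Callisto system layers on encryption and legal safeguards that this sketch omits.

```python
import hashlib

# Hypothetical sketch of a matching escrow like the one described above:
# each report names an assailant via a unique identifier, and nothing is
# released until a second, independent report names the same identifier.

_reports = {}  # hashed identifier -> list of reporter contacts


def submit_report(assailant_id, reporter_contact):
    """File a report; return the contacts to forward if a match now exists."""
    # Hash the identifier so the escrow never stores it in the clear.
    key = hashlib.sha256(assailant_id.strip().lower().encode()).hexdigest()
    contacts = _reports.setdefault(key, [])
    contacts.append(reporter_contact)
    # Two or more reports naming the same person: release all contacts
    # to the school for follow-up and investigation.
    return list(contacts) if len(contacts) >= 2 else []
```

In this sketch, the first submission returns nothing and stays sealed; only when a second victim names the same identifier do both contacts come back for the school to pursue.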

To Ladd, this matching is a key innovation: Knowing they aren’t alone lowers the perceived risk of reporting and can help motivate victims to come forward, especially when they’re worried about retaliation. In March 2018, Callisto announced plans to expand that system to tech company founders.

She sees a future in which electronic case recording proves more effective than in-person investigations or reporting processes. Citing research suggesting that people disclose more to, and sometimes feel more open with, computer platforms, Ladd says that a victim describing harassment may be less likely to fear judgment when talking to an automated program.

But experts say the lack of judgment may come with a trade-off: an empathy deficit.

“There’s an emotional dimension many times to the harassment that victims have experienced, and they can’t have an empathetic relationship with a website,” explains Safiya Noble, a University of Southern California professor and the author of the book Algorithms of Oppression.

Indeed, says Michelle Miller, co-founder and co-director of the workplace organizing and problem-solving platform coworker.org, reliving a trauma in the presence of a robot rather than a human could “further isolate a person from a system of support.” It’s not just about the data and the legal liability, she says, it’s about “creating a space for support and comfort.”

Miller and others I spoke to also worried about users and companies assuming that tech tools would automatically be free of bias.

“No matter how ‘unbiased’ a technology seems, chances are it has still imported the biases of its founders in the way it was designed,” says Miller. “Data collection and reporting through tech could have a really positive impact. But the idea that an algorithm has less bias than a human is tremendously dangerous. At least if you have a human making a decision, we know enough to really be thoughtful about second guessing, checking, having some backup,” whereas users of a tech tool may not take that extra step.

Still, many experts see potential in platforms that hope to augment, rather than replace, human interactions. Take, for instance, Empower Work, a nonprofit founded by Jaime-Alexis Fowler. It allows users to talk through a workplace problem with actual trained professionals via a text and web-chat platform, rather than risk being overheard on a harassment hotline. Since launching the platform in June 2017, Fowler says she’s been surprised by what users seem to value most: conversation with another human. “I thought people would want additional resources, and for us to connect them to more information,” she says. “In most cases, that’s not what people want.” Fowler acknowledges that this could be partly due to the way Empower Work advertises itself: as a place for people having challenges at work to “connect with one of our peer-trained counselors.”

Ladd, the Callisto founder, has found something different. She says that the most visited part of the Callisto site besides the home page is the information on reporting options and how to help a friend. “People just want clear information about what their options are, along with information about what sexual misconduct is, and what will happen if they come forward to report it,” she says.

Tech as simply a source of information isn’t sexy, she acknowledges. But it’s crucial. And it points to a larger truth: Perhaps the things that workplaces and victims need most are ones that won’t present a lucrative market opportunity.

“Maybe there are things that we need in order to keep us safe that are never going to turn a profit, and that’s OK,” says Miller.

In the end, any targeted or narrow approach will only do so much: A problem as complex as sexual harassment must be solved systemically, through a suite of reforms and policies. These could include changing policies or norms that reward (or implicitly condone) aggressive or inappropriate behavior, revamping management practices so that inappropriate behavior counts against promotions and raises, and developing and consistently analyzing metrics for change (say, a lower attrition rate for female employees) to hold everyone accountable. What’s most likely to make workplaces safer is an interplay between smart tech tools and well-trained humans.

In the meantime, Ladd suggests that the influx of new tools and resources on the market will help guide us towards that augmented future. “By giving victims a sense of what options are out there other than going to HR, HR will suddenly have to exist in a competitive market,” she says. “It will face economic and social pressure to improve.”

Elizabeth Weingarten is the director of the Global Gender Parity Initiative at New America and a senior fellow in the Better Life Lab program.