Nextdoor, a Social Network for Neighborhoods, Has a Racial Profiling Problem. Will This Change Fix It?

Nirav Tolia, chief executive officer of Nextdoor.

Drew Angerer/Getty Images

This post originally appeared on Inc.

This is something any company that maintains an online community will want to pay attention to. Neighborhood social network Nextdoor faced intense criticism last year over instances of racial profiling on its platform. The issues reached a point where officials in Oakland, California, debated whether to ban use of the platform by city departments. Users were reporting everyday activities of members of minority groups in their neighborhoods as “suspicious.” Some users allegedly threatened to report others who raised concerns about racist posts. One user who had raised such concerns was said to have been removed from her neighborhood group.

“I’m a person of color so it really cut deep,” Nextdoor CEO Nirav Tolia told Fusion recently of the reports, which Fusion had first written about in 2015 and which the East Bay Express had also reported on. “We hated the idea that something we built would be viewed as racist. … I hadn’t seen it in my own neighborhood’s Nextdoor and so didn’t realize it was an issue for us. Once I got past that, I was powered by the challenge to do something about it.”

Now Tolia says in a blog post that his San Francisco–based startup, last valued at $1.1 billion, has found a solution: a form-based process that takes users through an extended series of steps before they can post to the social network’s Crime and Safety section. Nextdoor says the new process reduced instances of racial profiling in test markets by 75 percent. The company rolled the process out nationwide on Thursday.

“As Nextdoor has become one of the places where neighbors talk about how to make their local communities better, it is natural for the issue of race to be discussed and debated,” Tolia writes in the post.
“But it’s not acceptable when mentions of race take the form of racial profiling.”

Other changes the company has made in the past year include adding an option to flag posts as racial profiling and updating member guidelines. Nextdoor says in a statement that the changes reflect the company’s mission: “We can’t speak for other platforms, but for us it was about staying true to our mission, which is to build stronger and safer neighborhoods.”

Tolia writes that activist groups and city officials in Oakland played critical roles in designing changes to the platform, including the new reporting process. Activists told Inc. earlier this year that they had requested a multipart reporting process in which a user could not provide a physical description of an individual suspected of a crime or suspicious activity without first describing the activity itself. Moreover, physical descriptors would require more information than just the perceived race of the person being described.

Shikira Porter, a representative of Oakland activist group Neighbors for Racial Justice, one of the groups Tolia mentioned, tells Inc. the recent changes are a “step in the right direction.” Still, she adds, her group hopes to see further changes to address what she describes as apparent loopholes in Nextdoor’s mobile app that allow users to continue racially profiling their neighbors by posting “reports like ‘Black male hoodie making a U-turn.’ ”

Porter raised additional concerns about language in the posting tools that lets users report “suspicious activity,” a category she views as overly broad, rather than restricting reports to notifications of specific crimes that were witnessed. “Now, we have a form that expects a full description, which is helpful,” she says. “However, it does not address the deeply rooted and embedded racism/bias that neighbors carry about ‘suspicious’ people of color.”