Snapchat is apologizing, taking action, and opening up about its advertisement-review process Thursday, following a backlash over an offensive ad that ran on the social app earlier this week.
The company confirmed to me that the ad, which made light of domestic violence, was reviewed by a human at Snap, and said the company is investigating exactly how it got approved. It also said that it has blocked the advertiser—a mobile game called Would You Rather—from its platform. And it clarified its policies around which of Snapchat’s “self-serve” ads undergo human review and which ones are approved automatically.
The ad asked Snapchat users if they would rather “slap Rihanna” or “punch Chris Brown,” a reference to Brown’s assault of then-girlfriend Rihanna in 2009. Snap took the ad down on Monday and issued an apology. But Rihanna responded Thursday with an Instagram post calling out Snapchat for “bring(ing) shame to DV victims.” She wrote: “I’d love to call it ignorance, but I know you ain’t that dumb!”
The renewed criticism sent Snap’s stock sliding Thursday, and prompted the company to issue a more sincere-sounding apology. “This advertisement is disgusting and never should have appeared on our service,” a Snap spokesman said via email. “We are so sorry we made the terrible mistake of allowing it through our review process. We are investigating how that happened so that we can make sure it never happens again.”
The company also provided some insight into its advertising review process when I asked about exactly how this happened and how it plans to avoid similar mistakes in the future. The vast majority of ads placed on Snapchat are now “self-serve,” meaning that advertisers follow an automated process to design and purchase their own ads on the platform rather than working directly with Snapchat representatives. This is akin to how advertising typically works on Google, Facebook, and other large internet platforms.
Snapchat earned an early reputation for relatively high-quality ads, which it has struggled to maintain since launching its self-serve platform in May 2017. The company has sought to differentiate itself partly by employing human reviewers to vet these ads before they appear on the app. Larger and more established platforms are increasingly adopting the same approach in response to criticism of their automated processes, but at their scale it is difficult to do without a massive workforce. For instance, Facebook announced in October that it would hire 1,000 more people to manually review ads on sensitive topics.
Yet the Rihanna ad is a reminder that human review does not guarantee perfect results. Snapchat’s advertising policies prohibit “shocking, sensational, or disrespectful content,” and the company said the Would You Rather ad should have been rejected on those grounds.
Asked whether all ads on Snapchat undergo human review, a spokesperson told me that the vast majority do, although there are exceptions. The company maintains what it describes as a small whitelist of previously vetted advertisers who have free rein to place ads without further human review. It also allows advertisers to automatically republish ads that the company’s human reviewers have already manually approved, provided both the ad and the targeted audience are the same. Neither of those was the case with the Would You Rather ad, however, which is why it was reviewed by a human.
The company noted that it supports the nonprofit National Network to End Domestic Violence, whose executive vice president sits on Snap’s safety advisory board.
It seems clear that a competent, professional human reviewer with sufficient awareness of pop culture and the discourse around domestic violence in the United States should have rejected the ad. What’s less clear is whether big social media firms have taken the steps needed to ensure that all of those preconditions are in place when they hire, train, and support their human moderators. Various media reports over the years have highlighted disturbing working conditions for human content reviewers, who tend to be contractors rather than employees and sometimes work in countries far from the ones whose content they’re evaluating. Snap did not immediately respond to further questions about its labor practices for ad reviewers.