As journalists have worked to report on and analyze the shooting of Walter Scott by a North Charleston, South Carolina police officer, a warning has surfaced on Facebook. When news organizations upload footage of the shooting to the social network, some of the clips are prefaced with a black screen from Facebook that reads, “Warning – Graphic Video. Videos that contain graphic content can shock, offend, and upset. Are you sure you want to see this?”
Videos uploaded by NBC Nightly News and by the Post and Courier, Charleston’s local daily newspaper, both received this Facebook warning. Facebook says it has been using these types of messages since late 2014. “We … ask that people warn their audience about what they are about to see if it includes graphic violence,” a Facebook representative wrote in a statement. “In instances when people report graphic content to us that should include warnings or is not appropriate for people under the age of 18, we may add a warning for adults and prevent young people from viewing the content.”
Other social networks are dealing with similar concerns. In August, for example, Twitter made the controversial decision to suspend accounts that shared images or video footage of journalist James Foley’s beheading by ISIS.
In the case of the Post and Courier video, the newspaper had already added a warning, which read, “The following contains unedited, graphic footage of the April 4, 2015, shooting of Walter Scott,” but this was apparently deemed inadequate by Facebook. After the newspaper received word that a Facebook user had flagged its video as inappropriate, the black-screen warning started showing up, and the video no longer autoplayed in newsfeeds.
The warning seems to be Facebook’s effort at placating both sides in debates about the appropriateness of video content. If someone complains about footage that doesn’t actually go so far as to violate Facebook Community Standards, the solution is to slap a warning on it. The system works on a case-by-case basis and is subjective. If two news organizations post the same footage, but only one gets flagged by a user, Facebook will put the warning only on the video connected to the complaint. And if a video already has a disclaimer, like the Post and Courier’s, Facebook can still make the internal decision to add its own warning on top. Similarly, the determination of whether the footage violates Community Standards in the first place seems to rest on case-by-case assessments of a number of factors related to how the post is constructed and what it says.
Laura Gaton, the Post and Courier’s digital editor, says she protested Facebook’s warning because the newspaper had added one of its own. She reports that the social network didn’t directly respond, but said, “Your video was reported for violating Facebook’s Community Standards. Since it doesn’t violate this community standard, it was not removed.”
Facebook’s message seems to emphasize its delicate position as arbiter between offended users and those trying to disseminate information. But as Facebook lays the groundwork to host more news media directly on its site, beginning with video, it will become more difficult for the company to walk this line. The warning as it exists now doesn’t just contain a statement about graphic content; it also poses the question, “Are you sure you want to see this?”
“It seems pretty ridiculous that Facebook, which is trying to insert itself more and more into the news business, would essentially censor a major national news video,” Gaton said.