The Industry

Facebook Announces New Policy Banning Misinformation That Leads to Violence

Facebook content from Buddhist extremists in Sri Lanka has been linked to violence against the Muslim minority in the country. ISHARA S. KODIKARA/AFP/Getty Images

Facebook on Wednesday announced a new policy for removing from the platform misinformation, including altered imagery, that is intended to cause or exacerbate violence. Misleading and inaccurate information spread through the social media site has been linked to violence in Myanmar and Sri Lanka. U.N. investigators reported that extremist Buddhists in Myanmar used Facebook to spread public messages in their ethnic cleansing campaign against the Rohingya Muslim minority. Extremist Buddhists in Sri Lanka have also taken to Facebook to post misinformation denigrating the Muslim minority, which has led to communal attacks.

A Facebook spokesperson wrote in a statement, “Reducing the distribution of misinformation—rather than removing it outright—strikes the right balance between free expression and a safe and authentic community. There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.”

Facebook is beginning to work with local organizations and international partners, who can often detect misinformation campaigns on the ground more quickly, to help it enforce the policy and conduct investigations. These partners, which include civil society organizations and threat intelligence agencies, are encouraged to flag misleading content to the platform’s moderators, along with proof that the information is false. The moderators must then determine whether there is an imminent threat of violence stemming from the misinformation. If all the criteria are met, the moderators will remove the content from the platform. Facebook also plans to use media-matching technology to help identify potentially dangerous misinformation.

Facebook claims it implemented this policy last month in Sri Lanka, where users were posting allegations that Muslims were poisoning the Buddhist majority’s food. A local partner informed moderators that the content had the potential to spur violence. Even though there were no signs that violence had occurred as a result, the moderators took down the posts because similar content had previously been associated with violence in the country.

The new policy comes the same day that CEO Mark Zuckerberg landed in hot water during an interview with Recode’s Kara Swisher for saying that Holocaust deniers who spread misinformation on the platform are not “intentionally getting it wrong.” He later clarified, “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”
