Last week, United Nations investigators briefing the press about the humanitarian crisis in Myanmar pointed their fingers at a surprising perpetrator: Facebook. The officials said that hate-filled Facebook posts have helped amplify deadly ethnic tensions in the country, where military strikes since last August have spurred about 700,000 Rohingya Muslims to flee to Bangladesh from what the U.N. suspects may be genocide. One of the U.N. investigators said Facebook has “turned into a beast,” alleging that it has been used to incite “a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities” in Myanmar.
As it happened, last week’s guest on the Slate podcast I co-host was Facebook’s news feed chief, Adam Mosseri, so we asked him about Facebook’s role in Myanmar and what responsibility it has to its users there. He was quite candid, even if, as he allowed, Facebook is still working toward finding the right approach to the situation. “Connecting the world isn’t always going to be a good thing,” Mosseri told us, acknowledging that at Facebook, “we lose some sleep over this.”
Mosseri said that in Myanmar, Facebook has had trouble finding third-party fact checkers to partner with to help curb the spread of fake news there (which is one approach the company started trying in the U.S. after the 2016 election), and he stressed that the company “would look heavily” for bad actors violating the company’s terms of service in Myanmar as a way of addressing some of the deeply troublesome hate speech that has gone viral on the platform there. But ultimately, he admitted, what’s happening on Facebook in Myanmar has been a challenge for the company, which is still figuring out how to address it.
Kevin Roose, a tech columnist at the New York Times, wondered on Twitter whether Facebook ought to do something more dramatic: pull out of Myanmar entirely. “If Facebook is losing sleep over how it’s being used in Myanmar and can’t find a solution, why not just…shut it down there?” he asked.
It’s an interesting question: If Facebook is boosting hate speech in Myanmar, causing real-world harm in the process, why should it be there in the first place? I decided to put it to some people who are familiar with the situation on the ground, and in some cases currently have boots on it.
They stressed that there are myriad important ways that people do use Facebook in Myanmar, as well as significant ways in which the social network is failing its users in the region—for example, by not sufficiently responding to reports of hate speech and by forcing the local activists who do that reporting to use their real names, a dangerous act in a country that sanctions violence against an ethnic minority. But instead of Facebook exiting Myanmar, all the activists and academics I spoke to insisted that people in Myanmar would be far better served by the platform if it took a more nuanced, region-tailored, and faster approach to its community moderation.
“Essentially, what people do on Facebook in Myanmar is what people do on Facebook all over the world,” said Jes Kaliebe Petersen, the CEO of Phandeeyar, a technology community center in Yangon, Myanmar’s largest city, that hosts trainings and events for activists, startups, and NGOs working with technology for social change and local entrepreneurship. “Online shopping on Facebook is really big in Myanmar. People wish each other a happy birthday on Facebook just like anywhere else.” In other words, they shouldn’t have to go without what’s become an essential service because Facebook is struggling with one (extremely important) part of its job.
“Facebook is the main social media platform here,” said Tin Htet Paing, a freelance journalist based in Yangon.* “People spend most of their waking hours on Facebook, and it has become the primary way people get news.” Tin told me that if Facebook pulled out of Myanmar, “It would not make sense at all.” Despite the fake news that spreads on the social network in Myanmar, news outlets and journalists rely on Facebook to spread accurate information, too.
But, just as the U.N. investigation concluded, nearly everyone I interviewed for this story also acknowledged that Facebook is a dangerous mess in Myanmar and felt the company needs to do something about it fast.
Part of Facebook’s task involves being more culturally competent. I spoke to Aye Aye Dun, the co-founder of Saddha: Buddhists for Peace, a small group of Burmese American Buddhists that works to counter hate speech online both in Myanmar and among diaspora communities. Dun told me that her group has reported content on Facebook, including statements in favor of military violence and against the Rohingya. “I don’t know if any of it has been taken down. For American moderators of Facebook to see these things as hate speech requires an understanding of the political environment in Burma,” she said, adding that it’s often not easy to determine what is and isn’t hate speech in the region, since such speech is sanctioned by the military. “Nationalism and pro-military sentiment has been very, very high,” and people share graphics saying that they support what the military is doing, she said. But in the case of the Rohingya in Myanmar, that could mean supporting violence against an ethnic minority.
“When we see highly problematic content online reported on Facebook, it has taken too long to take it down,” said Petersen. “When it takes 48 to 72 hours, hundreds of thousands or millions of people could have been exposed.”
Dun said that even speaking out against hate speech on Facebook can trigger threats of violence. “I’ve had friends who have been sent death threats and accusatory sentiments from strangers. You can be branded a traitor from friends and family,” Dun said. That’s one reason why many people who do take action to flag hate speech on Facebook in Myanmar have decided to do so under names that aren’t their own, despite Facebook’s rule requiring the use of real names, a policy that doesn’t jibe with the needs of human rights activists in the region.
There’s also a law in Myanmar against online defamation that has been used to silence criticism of the government and the military on social media. At least 60 people have been arrested under the online defamation law, according to Gerard McCarthy, associate director of the Myanmar Research Centre at Australian National University. People who are trying to flag hate speech or counter posts inciting violence often end up getting reported as promoting hate speech themselves, says McCarthy. Facebook then locks their accounts, and to contest the allegations, the company has required these activists to provide their real names and proof of identity to get back online. “Then they have to use their real name on Facebook if they want to post anything, and everyone is scared to try and curtail the kind of hate speech that’s coming from pages like Wirathu, who is that very virulent monk who was on the front page of Time magazine.” Wirathu, who has used his social media presence to disseminate anti-Muslim hate speech for years, only had his Facebook page removed at the end of February.
And since Facebook has been slow to remove hate speech and has forced activists to use their real names, the people who do use their Facebook accounts to promote violence have learned how to make sure their messages spread far. “Now, instead of pressing share, people are copying and pasting the post to their own account to duplicate it,” said Petersen, which means the process to flag and remove the hate speech often has to start all over again.
Facebook doesn’t have an office in Myanmar, which may be one reason the company has had a hard time grappling with the local complexities of how hate speech spreads online in the country. “There’s a responsibility of Facebook’s moderators to recognize who is exploiting their social media tools to stoke violence that materializes in hate crimes and widespread systematic violence,” said Dun, but that’s something that might best be handled by Burmese people who have a deeper understanding of political and social life in Myanmar.
“Having people who can do content moderation that’s much faster right now would go a long way,” said Petersen. “They need to have people on the ground to better understand the subtleties.”
But even if Facebook did hire local staff to remove hate speech in Myanmar, there’s no guarantee it would be successful. Facebook has, after all, failed repeatedly to remove hate speech in its home country, too. In the U.S., the company has been accused of repeatedly suspending the accounts of racial justice activists while letting posts calling for violence stay up. In May of last year, for example, the company suspended the account of a Black Lives Matter activist who wrote that “all white people” are racist in the context of learning about racial justice, while a post made a few weeks later by Louisiana Republican Rep. Clay Higgins after the London terrorist attack, calling for hunting and murdering “radicalized” Muslim suspects, is still live on the site. For Facebook to be successful in making the internet in Myanmar safer, the company would likely have to be open to changing its current content moderation strategies and building capacity in the region with people who understand the nuances of how hate speech and violent messages are communicated there.
In the meantime, Dun says it’s best that the social media network remains in the country. “Without Facebook, the people who do not advance the military’s propaganda might be prevented from speaking out more. Some people say it’s easy to be a social media activist, but in the context of Burma, if Facebook is shut down, since a lot of the allies we’ve made are through the internet, it would be a lot harder to contact people. It would be like trying to contact people in Burma like it was a closed country again.”
Correction, March 22, 2018: This article originally misstated that journalist Tin Htet Paing is based in Hpakant. She’s actually based in Yangon, though she was reporting from Hpakant at the time of the interview.