Angry mobs of Buddhists in Sri Lanka last month attacked minority Muslims, burning mosques and killing at least one. Those riots appear to have been triggered in part by false stories spread on Facebook and WhatsApp. And despite efforts by governments and nonprofits to alert them to the mounting risk, Facebook is accused of doing next to nothing to remove clear incitements to violence in the weeks leading up to the attacks.
The sequence of events in Sri Lanka is detailed in a bruising new report by the New York Times, one that threatens to undermine Facebook’s longstanding claim to be a force for good in the world. At the heart of the violence were allegations of a plot by Muslim Sri Lankans to sterilize the country’s Sinhalese-speaking Buddhist majority, fueled by a false story on Facebook claiming that police had seized 23,000 sterilization pills from a Muslim pharmacist in the town of Ampara.
In an episode eerily reminiscent of reactions to the Hillary Clinton Pizzagate conspiracy theory, those stories led a mob of Buddhists to storm a Muslim-owned restaurant in Ampara, falsely claiming its food was laced with drugs. The confrontation exploded into beatings, rioting, and mosque-burning. Video of those events was also uploaded to Facebook, feeding further violence, including the death of a 27-year-old aspiring journalist.
Aside from the brutal violence itself, the most disturbing part of the Times report is the allegation that Facebook, which has no offices in Sri Lanka, ignored or deflected repeated attempts by government officials and nonprofit monitors to intervene in a growing storm of hatred. As early as October of 2017, Sri Lankan officials pleaded with Facebook to better police hate speech, hire more Sinhalese-speaking content screeners, and establish a direct point of contact with local authorities.
Instead, Facebook insisted its content-flagging tool would be enough to alert the company to dangerous content. Members of a Sri Lankan group called the Center for Policy Alternatives did as recommended, repeatedly flagging posts that included messages such as “Kill all Muslims, don’t even save an infant.” But “nearly every report,” according to the Times, was deemed not to violate Facebook’s standards. Facebook still has not filled roughly 25 positions for Sinhalese-speaking screeners that have been open since June.
The violence in Sri Lanka mirrors similar events in Myanmar, India, Mexico, and even the United States. These episodes strike at the heart of Facebook’s utopian promise to connect people, showing that such connections can spread violent hatred as quickly as cute baby pictures.
Ethnic and religious resentments are not created by Facebook. But as the Times points out, Facebook’s core structure, including an algorithm that prioritizes content that gets the most engagement, may help foment outrage and tribalism. And in nations with weak legal systems, where citizens are more likely to take justice into their own hands, that outrage can turn deadly.
“The germs are ours,” as one Sri Lankan official told the Times, referring to the sectarian divisions in Sri Lanka, “but Facebook is the wind, you know?”
David Z. Morris