WASHINGTON - Facebook's mission statement is to "bring the world closer together." But its failure to rein in rampant misinformation and violent content could be the Achilles' heel of its global ambitions.
When the Sri Lankan government decided to temporarily block access to Facebook, WhatsApp and Google's YouTube in the aftermath of deadly bombings on Easter Sunday, April 21, it made a unilateral decision: The risks from rampant misinformation and fake news on these platforms were greater than the communications benefits these channels could bring during a crisis.
The episode proves Facebook's "combination of unwillingness and inability to deal with the misinformation problem is one of their great weaknesses," said Paul Barrett, deputy director of New York University's Stern Center for Business and Human Rights.
The Sri Lanka shutdown offers a glimpse at one potential stark future for social media companies such as Facebook, which might see their operations severely curtailed more often in certain corners of the world if they don't quickly improve their playbook for policing misinformation. This risk is especially pronounced in emerging markets, where Facebook is trying to rapidly attract new users - even though it may not yet have the resources to combat harmful content in these regions.
"This is an extreme step in terms of regulation, but one that I think will be becoming more common in the future," said Barrett, whose center seeks to train business leaders to make decisions on human rights issues. "We may be at a breaking point in terms of how people around the world view big social media companies."
Facebook and other technology giants are facing this dilemma because they haven't been proactive enough in investing in content moderation technology and policy staff in smaller markets, such as in Sri Lanka, where Sinhala is the predominant language, said Dipayan Ghosh, co-director of the Platform Accountability Project at the Harvard Kennedy School.
Ghosh said Facebook was not designed for the needs of the cultural groups in Sri Lanka, and the algorithms it is developing to detect harmful content are most effective in languages such as English and are not equipped to deal with rampant misinformation in Sinhala, Sri Lanka's predominant language.
In a time of crisis, when government leaders are worried misinformation could exacerbate on-the-ground violence, "there's no other choice for the government but to shut it down temporarily," Ghosh said.
And the governments most likely to follow Sri Lanka's lead in blocking social media would be those in "fragile societies where the value of free speech is subordinated to other concerns," Barrett said.
Sri Lanka's shutdown should put pressure on Silicon Valley companies to make greater investments in content moderation technology and policy resources in all of their international markets, Barrett said.
Emerging markets might not be the biggest moneymakers for Silicon Valley tech giants in terms of ad revenue, but they are essential to Facebook's expansion as it tries to add users around the world and show Wall Street it is still growing at a fast clip.
"I don't think the social media companies can do it all from Silicon Valley anymore," Barrett said.
Facebook over the weekend pushed back on the idea that a ban was necessary. "People rely on our services to communicate with their loved ones and we are committed to maintaining our services and helping the community and the country during this tragic time," the company said in a statement.
Yet the Sri Lankan government's move is bad optics for Silicon Valley at a time when policymakers around the world are increasingly considering measures to regulate technology companies' handling of harmful content. The decision to block access - even for a matter of days - is "terrible from a policy perspective" for Facebook and other tech giants, Ghosh said.
Already, authoritarian countries like China broadly block their citizens' access to American Internet services that foster free expression, but as my colleagues Tony Romm, Elizabeth Dwoskin and Craig Timberg wrote yesterday, more democratic governments are increasingly targeting social media companies. The United Kingdom has been considering a broad range of actions to take against social media companies that host misinformation, and Australia recently passed a law that would allow the government to fine companies that do not swiftly remove violent content.
"I think that these kinds of incidents really add fuel to the fire," Ghosh said.
This article was written by Cat Zakrzewski, a reporter for The Washington Post.