November 2021 | Opinion

Facebook’s Complicity in Regional Violence Must End

Elsie Tierney
Staff Writer 

The international social network Facebook currently hosts more than 2.8 billion monthly users worldwide, spanning more than 190 countries and 160 languages. With this much diversity among its members, it is difficult for Facebook to thoroughly review posts coming from every corner of the globe.

Facebook employees have warned the public for years that the company consistently fails to effectively police abusive content and hate speech, particularly in high-tension regions, according to Reuters. Community standards are currently unavailable in more than half of the languages used on Facebook, including the local languages of Myanmar and Ethiopia, two high-tension areas. Internal Facebook documents analyzed by Reuters showed that the company has not hired enough people with knowledge of local languages or current events to monitor harmful posts, and that its artificial intelligence systems are not thorough enough either.

Other internal documents obtained by the Wall Street Journal showed that Facebook’s screening system keeps a list of high-profile users who are exempt from punitive action if they post something that violates the terms of service. The system, called “cross-check” or “XCheck,” was meant to be an additional layer of protection; however, as Amnesty International notes, it has evolved into a system that fuels problematic content. Facebook’s algorithm promotes this misinformation because it keeps readers engaged and maintains activity on the site.

Facebook’s complicity in spreading misinformation was first highlighted during the violence between the Buddhist and Rohingya Muslim populations in Myanmar. According to BBC News, in 2014 an extremist anti-Muslim monk shared false claims that a Buddhist girl had been raped by Muslim men. A few days later, the accused men were attacked, and two of them were killed. Facebook itself admitted its platform was complicit in inciting this violence, and UN human rights investigators came to the same conclusion.

Another example of Facebook’s complicity in genocide is Ethiopia’s civil war, a conflict between the country’s new and old rulers unfolding in the Tigray region. In a report from NPR, Zecharias Zelalem, one of the main journalists covering the conflict, noted that “prominent Facebook posters would post unverified, often inflammatory posts or rhetoric that would then go on to incite mob violence, ethnic clashes, crackdowns on independent press or outspoken voices.”

On Sept. 27, Zelalem saw a post from a media outlet that blamed members of an ethnic minority group in Ethiopia for a series of murders and kidnappings. On Sept. 28, the village cited in the post was burnt to the ground and its residents were murdered. The post remains live as of Oct. 15.

Facebook cites Ethiopia’s crisis as a company priority and has denied allegations that it fueled some of the violence, yet many question the company’s claims. “I can quite honestly say that Facebook has [done] not nearly enough,” Zelalem told NPR.

Facebook continues to exhibit complacency when online misinformation turns into offline violence, offering only vague claims of taking action.

In the case of Myanmar, Facebook has been working with a UN team of investigators called the Independent Investigative Mechanism for Myanmar (IIMM) since 2019. While the two groups hold regular meetings, the head of the IIMM has stated that he does not feel Facebook has provided all of the information the team has requested, and that the company has reconsidered sharing material in response to certain requests.

In October, Facebook appealed a Washington judge’s order to turn over internal documents related to accounts that helped fuel the genocidal violence against the Rohingya, according to Bloomberg. The documents were requested by The Gambia, which filed charges against Myanmar in the International Court of Justice, accusing it of perpetrating genocide against the Rohingya.

While Facebook said it was willing to work with The Gambia, it also argued that the judge went too far in ordering the broad release of non-public information, citing a federal law protecting the privacy of its users. Upon further review, however, The Gambia’s requests were found to be permissible.

Facebook has a habit of ignoring the very things it claims to hold as top priorities while continuously making excuses for the results of its complacency. The company needs a stronger screening system for the hate speech continuously spread on its website, or some form of government regulation to hold it accountable for its empty commitments to stopping the spread of misinformation.
