Focus on Social Media Mobilization: Myanmar’s genocide
By Alyssa Tolentino
Once one of the least connected countries in the world, Myanmar was eager to catch up to the ever-connected, globalized world when it emerged from decades of military rule in 2011.
This eagerness, coupled with ethnic tensions between the Buddhist Rakhine and the predominantly Muslim Rohingya, resulted in the mobilization of social media on a grand scale to spread hate speech and propagate a genocide that has taken over the narrative of Myanmar’s transition to democracy.
Before 2011, the junta of Myanmar kept its citizens as isolated from the rest of the world as possible, says the Council on Foreign Relations. According to the International Telecommunication Union, a UN agency, few people had telephones and only 1.1 percent of the population used the internet in 2012.
Everything changed in 2013 when a quasi-civilian government oversaw the deregulation of telecommunications, says Reuters. All of a sudden, the state-owned phone company faced competition from two foreign mobile-phone entrants from Norway and Qatar.
Many saw Facebook as the complete package; one can access news, videos, other entertainment, and even message others all in one place. Being on Facebook became a status symbol. Even the government itself uses it to make major announcements.
With its increasing ubiquity, however, hatemongers have taken advantage of the social network to spread anti-Muslim sentiments. Human rights activists from inside the country tell CNN that posts range from recirculated news articles from pro-government outlets to misrepresented or faked photos and anti-Rohingya cartoons.
Meanwhile, the Myanmar government and military have been using the platform to present their own narrative of the Rohingya crisis. According to WIRED, the office of the Commander-in-Chief posted photographs of dismembered children and dead babies, claiming they were attacked by Rohingya terrorists, to counter criticisms from Western countries.
Tensions reached a boiling point on July 2, 2014, when a mob of hundreds of angry residents surrounded the Sun Teashop in Mandalay, Myanmar’s second-largest city, after rape accusations against the teashop’s Muslim owner went viral on Facebook.
WIRED reports two casualties—one Muslim and one Buddhist—and about 20 others injured during the multi-day melee. Five people, including a woman who admitted she was paid to make the false rape claim, were sentenced to 21 years in prison for their roles in instigating the riots.
After the Mandalay riots, a panel discussion in Yangon, Myanmar was arranged where Mia Garlick, Facebook’s director of policy for the Asia-Pacific region, told the audience that the company planned to speed up translation of the site’s user guidelines and code of conduct into Burmese. According to WIRED, the Burmese language community standards did not launch until September 2015, 14 months later.
Facebook’s internal community standards enforcement guidelines define hate speech as “violent or dehumanizing speech, statements of inferiority, or calls for exclusion or segregation” against people based on their race, ethnicity, religious affiliation, and other characteristics.
However, Reuters spoke to former content monitors who said that the rules were inconsistent and that training instructs them to err on the side of keeping content on Facebook. One former employee said, “Most of the time, you try to give the user the benefit of the doubt.”
For example, Reuters found this post in Burmese from August of last year: “Kill all the kalars that you see in Myanmar; none of them should be left alive.” Facebook’s translation into English reads: “I shouldn’t have a rainbow in Myanmar.”
Many tech companies and NGOs have joined the effort in translating and monitoring posts on Facebook. In response to the flood of hate-filled posts, a group of these firms and NGOs wrote an open letter to Facebook CEO Mark Zuckerberg, condemning the “inadequate response of the Facebook team” to escalating rhetoric on the platform in Myanmar, CNN reports.
In May 2015, David Madden, founder of the tech firm Phandeeyar based in Yangon, warned Facebook executives that the company risked being a platform used to foment widespread violence, akin to the way radio broadcasts incited killings during the Rwandan genocide.
Under international law, incitement to genocide is a crime. Musicians have even been prosecuted before the International Criminal Tribunal for Rwanda, as in the case of the singer Simon Bikindi, says the New York Times.
In March, the United Nations accused Facebook of “substantively contributing” to the “level of acrimony” against Rohingya Muslims in Myanmar. In his Senate testimony, Zuckerberg said that the social media site was hiring dozens more Burmese speakers to review hate speech posted in Myanmar.
Four months after Zuckerberg’s pledge to act, Reuters uncovered hundreds of posts that call the Rohingya or other Muslims dogs, maggots, and rapists, suggesting they be fed to pigs or simply exterminated. One such post, commenting on a news article from an army-controlled publication about attacks on police stations by Rohingya militants, read: “These non-human kalar dogs, the Bengalis, are killing and destroying our land, our water, and our ethnic people. We need to destroy their race.”