
Storytime: Social Media Leads to Violence

By Sohaa Khan

Media consumption has shaped public opinion since its conception; it is often where people find inspiration for their beliefs and ideals and feel empowered to share their opinions, good or bad. It is therefore no surprise that social media platforms have been used to spread hate speech, misinformation, and extremist content, inciting online tensions that lead to real-life violence. The problem is exacerbated by platforms such as Facebook, which do little to block hateful content until it is too late and the damage is already done. Unregulated fake news spreads like wildfire, jumping from one platform to another, and rather than promptly removing such content or the users behind it, Facebook has shown again and again that it values clicks and shares over the lives of innocent people.

Techno-determinism holds that technology determines the cultural direction and opinions of society; in practice, the influence runs both ways, with technology and society entangled and constantly shaping each other. This makes it difficult to determine whether people who commit violence are first influenced by social media or whether social media merely reflects what is happening in the world. Unfortunately, the answer can be both, which makes the problem harder to tackle and complicates deciding which solutions to implement first.

In 2019, a Facebook researcher set up an account to experience social media in India, "liking" every recommended page that came across the feed. That experiment, among many similar ones compiled into the Facebook Papers, showed that Facebook is largely indifferent to the misinformation and hate speech that spread easily on its platform, claiming it lacks the resources to do anything about it. With 87% of the company's budget focused on North America, Facebook spends only 13% of its budget on the rest of the world, despite having only 10% of its userbase located in North America. In other words, while the majority of Facebook's users are outside North America, the company has allocated few resources to draw on when crises aggravated by its platform arise. Facebook needs to reallocate its budget so that it can manage content and stop the spread of misinformation in other countries. That would also allow it to set up regional offices in countries where misinformation runs high, so they can be the first alerted to a growing threat and respond to it.

Facebook’s mishandling of Myanmar

The infamous 2021 military coup in Myanmar is a prime example of how Facebook is complicit in the loss of lives and how the situation was grossly mishandled. Much as in the India experiment, Facebook's algorithm promoted posts that praised the military and pushed for violence. Many people in Myanmar rely on Facebook not only as a social media site but as a news source, even though shared articles are often biased or framed by individual users' commentary. Sharing articles with hostile comments makes others with similar views more prominent and emboldens them to voice their opinions. In these echo chambers, where people are free to spread extremist views and receive validation from like-minded users, extremist opinions come to be regarded as fact while the real narrative is disregarded. When posts that instigate violence gain likes and shares, the problem spills over into real life, producing radicalized people who commit violence, which is what happened in Myanmar.

Facebook responded to the violence by taking down content that praised the coup, and the platform was then blocked by internet providers at the military's request. However, in an experiment similar to the one in India, a clean Facebook account was set up and the algorithm was tested. The results showed that even though Facebook removes posts that praise the coup, the algorithm still suggests pages and groups expressing the same pro-coup sentiments. By continuing to promote these echo chambers and allowing them to exist, the platform remains complicit in the violence that has ensued.

What can be done?

International organizations, such as the United Nations, need to set international standards to address this issue. Because social media is transnational by nature, agreed-upon international standards for handling misinformation and incendiary posts would allow these organizations to provide guidelines for dealing with social media platforms and individual users who do not comply.

Facebook, along with other social media platforms, needs to realize that it holds more power than it thinks. Social media's effects on society are too strong, and too integrated into everyday life, for this reality to be ignored. Platforms need to allocate enough funds to handle crises like the one in Myanmar and others across the globe. By focusing on Western users, they leave the rest of the world to deal with the causes and effects of the misinformation that runs rampant on their sites and the violence it brings. If international standards are set on the funds that must be set aside for dealing with the consequences of online misinformation and hate speech, platforms can be held accountable when they choose to take no action. In addition, regional offices need to be set up as first responders in these situations so they can help stop the spread of misinformation and hate speech. The world of social media will continue to expand, and it is imperative that this issue be resolved before more lives are lost over likes and shares.


Sohaa Khan is a second-year M.A. candidate at the School of Diplomacy and International Relations at Seton Hall, specializing in International Law and Human Rights and Asian Studies. She is the Deputy Editor-in-Chief for the Journal of Diplomacy and International Relations, Director of Communications for the Graduate Diplomacy Council, and a member of the Sigma Iota Rho Honor Society. She hopes to continue studying the relationship between social media and how online hate speech and misinformation lead to violence.


One thought on “Storytime: Social Media Leads to Violence”

  • Great article!
    I agree with various points about Facebook needing to handle misinformation better in other parts of the world.
    I’m no fan of big tech either, but it might be a little unfair to fault Facebook’s emphasis on North America, because most of its revenue comes from that region — similar to the 80-20 rule, where 20% of the customers bring in 80% of the revenue and vice versa — so from a business standpoint the focus might make sense. Which brings me to another great point you mentioned: how social media has more power than it knows.
    For example, in the war between Russia and Ukraine, it seems we’re getting more information and insights from social media than from the news channels.
    Misinformation about war can easily spread and cause confusion and even panic in society. So from a benefit-to-the-world perspective, it might make sense to spend the effort to improve the platform for ALL users.
