
X Struggles With How To Deal With Misinformation

Ethan Kassai
Staff Writer

In the ever-changing world of technology, news, and media, one problem seems constant: misinformation. COVID-19, the 2020 presidential election, and, most recently, the conflict in the Middle East have pushed misinformation to the front of many people’s minds. Social media has proven especially troublesome in this regard, with spam bots and ordinary users alike spreading inaccurate and sometimes outright false information across multiple platforms. Platforms such as Instagram, Facebook, and X have borne the brunt of these incidents.

When Elon Musk acquired X, formerly Twitter, in October 2022, his stated priorities were protecting free speech and combating misinformation on the platform. Soon after the acquisition closed, he removed several of the platform’s existing moderation tools and features to make room for his proposed solution, Community Notes. As written on twitter.com, “Community Notes aim to create a better-informed world by empowering people on X to collaboratively add context to potentially misleading posts.” Approved contributors leave notes on posts, providing context, clarification, or other relevant information, and contributors from various points of view then rate those notes as helpful or unhelpful. If a note draws enough helpful ratings from differing viewpoints, it is approved and displayed alongside the post so that the post is presented and interpreted accurately. Until recently, this system proved largely effective at curbing misinformation, with some arguing it works better than the measures it replaced, and it offers a less biased, more community-driven way of getting people accurate information.

Community Notes have been the prevailing way to combat misinformation during the Musk era (Photo courtesy of Medium)
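
For readers curious about how such a gate might look in code, the snippet below is a minimal, illustrative sketch of the idea described above: a note is shown only once enough raters mark it helpful and those helpful ratings span more than one viewpoint group. This is not X’s actual algorithm (the real Community Notes scorer is open source and uses a far more sophisticated, bridging-based ranking model); the Rating class, the viewpoint_group labels, and the thresholds here are assumptions made purely for illustration.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical data model for illustration only; X's real Community Notes
# scorer uses matrix factorization over rating history, not a simple tally.
@dataclass
class Rating:
    rater_id: str
    viewpoint_group: str  # assumed label for a rater's inferred viewpoint cluster
    helpful: bool

def note_is_shown(ratings: list[Rating],
                  min_helpful: int = 5,
                  min_groups: int = 2) -> bool:
    """Show a note only if enough raters found it helpful AND those
    helpful ratings come from more than one viewpoint group."""
    helpful_by_group = defaultdict(int)
    for r in ratings:
        if r.helpful:
            helpful_by_group[r.viewpoint_group] += 1

    total_helpful = sum(helpful_by_group.values())
    diverse_groups = len(helpful_by_group)
    return total_helpful >= min_helpful and diverse_groups >= min_groups

# Example: helpful ratings concentrated in a single group do not qualify.
ratings = [Rating("a", "group_1", True), Rating("b", "group_1", True),
           Rating("c", "group_1", True), Rating("d", "group_1", True),
           Rating("e", "group_1", True)]
print(note_is_shown(ratings))  # False: no cross-viewpoint agreement

ratings.append(Rating("f", "group_2", True))
print(note_is_shown(ratings))  # True: 6 helpful ratings across 2 groups
```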

The system has proven especially useful in the world of news. Many outlets, including prominent ones, post headlines with attached articles but include only select key words or claims in the headline itself. Stripped of context, those headlines can mislead readers and sow confusion among X users. The context provided by a Community Note clears up that confusion and helps ensure the article is understood as intended.

Though this approach had worked well up to that point, the Hamas attacks and the ensuing conflict in the Middle East have brought a surge of spam bots and users spreading propaganda and inaccurate information, leaving much confusion about what is really going on. The surge has overwhelmed Community Notes, rendering the system nearly ineffective and prompting calls for Musk and X to do more. Some even say X’s efforts are making the problem worse. According to an audit by NBC News, roughly two-thirds of the top posts about Hamas and Israel carried no Community Notes, and only 8% of the remaining posts did, meaning that many posts, even those containing inaccurate information, were still reaching the public without proper vetting. Speaking anonymously to Wired, one Community Notes contributor said the vetting process is too slow and cumbersome to keep up with the growing demand, and that the tool is highly vulnerable to manipulation, meaning the notes themselves could contain misinformation. So what, ultimately, is going to be done about this?

In Europe, the EU plans to open an investigation into misinformation on X under its new Digital Services Act (DSA), a law intended to check the power of large tech companies and hold X more accountable for the content on its platform. In the United States, there is little the government can do to X because of Section 230 protections, which limit platforms’ liability for the content posted on their sites.

Investors, users, and the public at large should keep a close eye on the EU’s developing investigation of X. Ever since the birth of the internet, there has been an ongoing debate over where corporate autonomy ends and protected free speech begins. Given the tech giant’s scope and influence, X’s case in Europe could become a guiding precedent in that debate. How social media companies, and the government bodies that regulate them, balance free speech claims against private platform autonomy has the potential to drastically shape tomorrow’s business, regulatory, and social fabric. So, ultimately, if Musk doesn’t come up with a viable solution to this rampant misinformation problem in a timely manner, fines from the EU may be the least of his and X’s worries.

 

Contact Ethan at ethan.kassai@student.shu.edu
