
The Concerning Political Implications of Elon Musk’s “Absolutist” Free Speech Policies

Catherine Anderson
Staff Writer

Elon Musk took control of X, formerly known as Twitter, in October 2022, and in the time since, he has made staffing and policy changes that have directly contributed to a rise in both misinformation and disinformation on social media. These changes have been echoed across the social media and tech industries, a shift that could affect the 2024 U.S. presidential election. Unless social media companies implement more aggressive approaches to targeted misinformation, the ease with which voters are exposed to it may directly skew voting patterns and voter turnout in 2024.

It is important to note that, even though the terms are often used interchangeably, there is a difference between misinformation and disinformation. Misinformation is simply “false information that is spread, regardless of intent to mislead.” Disinformation, on the other hand, refers to false information that is created and spread intentionally. This includes propaganda, manipulated facts, and intentionally misleading or biased information. While misinformation is not malicious in the way that disinformation is, both types of false information are dangerous, and both can spread rapidly on social media. 

The spread of disinformation on X has worsened under Musk. According to The Washington Post, studies show that changes Musk made to the company directly led to increases in the spread of Russian propaganda. Musk, for example, dropped the state-affiliated labels that Twitter had used for Kremlin-backed accounts. He also implemented a system that allows X users, including Kremlin-backed accounts that spread propaganda, to pay for verification, which makes their posts more prominent.

This problem is not unique to X; according to The Washington Post, the same study showed that pro-Kremlin accounts reach their largest audience on Meta’s platforms, such as Facebook and Instagram. The propaganda posts often contained hate speech, boosted extremists, and posed direct threats to national security. However, even though a large portion of this disinformation is spreading on other social media platforms, experts believe that Musk’s policy changes contributed to the changes in the policies of other social media giants. Shortly after he took over Twitter, Musk fired the director of the department that protected Twitter users from fraud, harassment, and offensive content, reports The Washington Post. In the weeks that followed, the company laid off over half of its staff, “crippling” the teams responsible for deciding how to handle misinformation on the app. He made other policy changes as well, such as allowing cause-based ads and, later, political ads, both of which had previously been banned, writes Reuters.

When Musk made these changes, he showed other major social media companies, like Meta, that they could do the same. According to The Washington Post, Meta employees have said the company was planning to announce a ban on political ads, as Twitter had done previously. When Musk began marketing his plans for Twitter as a “safe haven” for free speech, internal support at Meta for the political ad policy dwindled, and Twitter’s reversal of its own ban appears to have discouraged Meta from ever implementing it.

Additionally, Mark Zuckerberg said in a podcast interview that when Musk initiated massive layoffs, it encouraged other large tech companies to do the same thing. Since that interview, Meta has laid off over 20,000 workers, The Washington Post continues. It also now gives users the option to opt out of fact checking services by hiding misinformation warning labels. These policy changes and workforce cuts, which have become an industry wide trend, have left the major social media companies completely unprepared to combat misinformation, which may lead to dire consequences for the 2024 U.S. presidential election. 

In order to determine what misinformation affecting elections looks like, one needs to look no further than the 2020 election. As reported at the time, disinformation leading up to the 2020 campaign was often tailored to target specific communities, such as people of color or immigrants. Groups seeking to spread disinformation preyed on the specific fears and vulnerabilities of a given group and tailored their messages accordingly. Immigrants who previously lived under authoritarian regimes, for example, are more vulnerable to misinformation claiming that American politicians want to turn the U.S. into a socialist state.

In addition to messaging that targets these fears, immigrants whose first language is not English are vulnerable to misinformation that can arise from mistranslation; this can be unintentional, but it is also an opportunity for “bad actors” to take advantage of the situation and inject disinformation into translated news pieces, reports the Associated Press. An example of this, according to Politico, occurred in several WhatsApp groups, where the word “progressive” was being translated to “progresista,” which in Spanish has “far-left” connotations and is closer to words like “socialista” and “comunista.” Whether intentional or not, such a mistranslation can play on the fears of people who have lived under authoritarian regimes, because the word reaffirms concerns that left-leaning politicians are not merely progressive but outright socialist. In the context of elections, this kind of misinformation could discourage its targets from voting for “progressive” politicians and push them toward more conservative candidates they might not otherwise have supported.

An additional difficulty for minority groups is that they often do not see their communities’ issues represented in mainstream media, so they turn to smaller platforms. As the Associated Press reports, disinformation spreads rapidly on social media apps like WhatsApp or WeChat, both of which are used heavily by communities of color. Disinformation targeting these communities was already a massive problem prior to the 2020 elections, and experts expect it to become worse in the upcoming 2024 election. 

As voting policies have changed and given more political power to minority groups, such as the Asian American, Black, and Latino communities, disinformation efforts targeting these groups have also increased, reports the Associated Press. Politico reports that in many cases, these disinformation campaigns manifest as attempts at voter suppression, and experts expect claims of fraud in the 2020 and 2022 elections to remain a part of them. Such claims may be aimed at communities predisposed to distrust voting processes, such as people from countries whose elections were not free and fair. Because of this distrust, they are more likely to be skeptical of voting processes and more susceptible to disinformation of this nature. If actors spreading disinformation can convince people that the election is rigged and that their votes do not really matter, they may stop them from voting altogether.

Claims of election fraud in 2020 originated with President Donald Trump, who claimed that he was the rightful winner of the election. Disinformation that repeats these claims, when targeted at minority groups, could have the effect of suppressing voters who would vote against Trump in the 2024 elections. In 2020, according to a New York Times poll, 87 percent of Black voters, 65 percent of Hispanic and Latino voters, 61 percent of Asian voters, and 55 percent of other, non-White voters, voted for President Biden. Combined, these minority groups made up about 33 percent of American voters in the 2020 election. Only 41 percent of White voters, who accounted for 67 percent of American voters, voted for Biden, according to the poll. This means that Biden won the votes of minority groups by enough of a margin that he was able to overcome the fact that he did not win a majority of White voters. 

If propagandists could successfully influence a large enough portion of these minority votes, either swaying them to vote against Biden or simply not to vote at all, the Republican nominee in 2024 would be more likely to win the election. This is dangerous primarily because the president would have been elected by voters who based their decisions on false information and who, had they been accurately informed before the election, would have voted differently. It is especially dangerous if, as Politico reports, disinformation efforts continue to be more effectively targeted at specific groups. If propagandists can tailor their messages to prey on the individual fears of certain groups, they are more likely to succeed in influencing their targets. And due to significant layoffs, as the Associated Press points out, social media companies may not be equipped to combat this level of disinformation.

Considering that disinformation is spreading at a faster rate than it ever has, reaching a broader audience than it ever has, and is clearly targeted at certain vulnerable groups, social media companies must take measures to combat its spread. Because social media is the medium by which disinformation is spreading, and because social media is the primary source of news for a growing number of people, as the Washington Post reports, it is the responsibility of the social media companies to put safeguards in place. Instead, as experts call for more aggressive measures against misinformation, these companies are actually rolling back safeguards. For Elon Musk and other industry leaders, the defense is actually quite simple: free speech. 

According to Al Jazeera, conservatives view many of these fact-checking policies as a “heavy-handed” approach to policing information. Elon Musk, as Al Jazeera notes, calls himself a “free speech absolutist,” a stance that underpins his controversial policies. Certainly, users can now say far more on the platform than they previously could without facing consequences. Musk, for example, reinstated President Donald Trump’s account, which had been suspended following the January 2021 attack on the Capitol due to the risk of inciting further violence. To Musk, free speech seems to be an absolute liberty to express oneself that should never be limited, whether by private social media companies, the government, or other actors. The problem with this “absolutist” approach, however, is that it rests on a fundamental misunderstanding of what the right actually is. Freedom of speech is not the freedom to say whatever one wants without facing any consequences, and a private company cannot violate it.

If a social media user makes a statement that is harmful, whether it is offensive or simply false, the social media company has the right to limit that statement’s reach. If a user has repeatedly made offensive remarks or intentionally spread harmful disinformation, the company has the right to take action to prevent that individual from continuing to spread these messages. Misinformation cannot be allowed to spread at the rate it has; it has already harmed vulnerable communities and will continue to harm the general public. As artificial intelligence technology becomes more sophisticated, bad actors gain access to more tools to mislead voters. These advances could produce more convincing disinformation, putting even those who were not previously vulnerable at increased risk.

Elon Musk has set an industry standard. He has made it okay to take minimal efforts to curb disinformation, and companies like Meta have followed his lead. If the industry does not change course, the consequences for the 2024 election, and the broader future, will be disastrous. 
