As constant access to information dilutes the mass public's ability to consider and address vital issues, misinformation will pose a unique problem in the future. Regulating the distribution of information is a challenge that governments around the world are still grappling with.
Reuters reports that a federal judge has prohibited some agencies and officials in the administration of U.S. President Joe Biden from communicating with tech companies about content moderation. Conservatives have applauded the ruling as a step toward promoting free speech and limiting the government's ability to pull the strings of big tech companies. Liberals, by contrast, see it as a dangerous precedent ahead of the 2024 presidential election, as misinformation often runs rampant during the final months of an election season.
The only certainty surrounding misinformation is that it won’t go away. Following the involvement of Cambridge Analytica in the 2016 election, COVID-19 misinformation, and the 2020 election misinformation purveyed by former President Donald Trump, it is a logical short-term solution for the Biden administration to communicate with big tech about misinformation. In the long term, however, this approach must be abandoned, as misinformation can be fought most effectively at the individual level by fostering digital literacy.
It is understandable why the Biden administration seeks to curb misinformation. In 2024, Biden will face an opponent in Trump who is currently polling at 57 percent against his Republican rivals, according to The Wall Street Journal, and who has a voter base that is vocal about its skepticism of the current administration's public health policies. Additionally, the current indictments surrounding Trump's alleged attempt to overturn the 2020 election must certainly rouse concerns in the White House. Finally, Trump's track record of leveraging resources like Cambridge Analytica gives the current president further reason to ensure a level playing field in the information space.
Dr. Renee Hobbs is a professor at the University of Rhode Island (URI) who specializes in teaching digital media literacy through the Media Education Lab, which seeks to improve media literacy through scholarship and public service. In her book, “Media Literacy in Action,” Hobbs argues that blending entertainment and education creates a muddle of information when paired with individual interests that don’t align with those of the public and private sectors. Hobbs also recognizes the role that algorithms and social media addiction play in the spread of misinformation.
Hobbs does not, however, seek to remedy the disinformation issue through policy recommendations alone. Instead, she attempts to coach individuals into developing the lifelong learning competencies and habits of mind needed to navigate an increasingly complex media environment, as described by her Media Education Lab. This bottom-up approach is optimistic: it assumes individuals will take the initiative to determine the veracity of the media they consume by asking the right questions.
In Hobbs’s 2020 book, “Mind Over Media: Propaganda Education for a Digital Age,” she encourages individuals to ask who created a digital message, what techniques were used in its creation, and what its creator’s goals are. Hobbs believes that by integrating lessons about propaganda into K-12 education, Americans can develop the habit of thinking critically when consuming media in the modern age. Hobbs’s approach is the long-term solution to misinformation online, as it sidesteps the ethical concerns surrounding public-private coordination by encouraging Americans to engage in thoughtful analysis of the media they consume.
The extent to which AI will harm businesses, governments, and civil societies is unknown. These technologies can spread deepfake images, video, and audio, as well as automate the distribution of malicious material online, all of which can be extremely dangerous to democracy. To rekindle faith in institutions, slow polarization, and spark meaningful dialogue, it is the responsibility of individuals to critically engage with the content they find online.
Image courtesy of Unsplash