Navigating the complexity of disinformation: moving beyond simplistic narratives


Today, we often fail to meaningfully acknowledge the nuances between disinformation and its subtler counterpart, misinformation. While disinformation involves the deliberate spread of false information, misinformation often stems from inadvertent errors or misunderstandings. We also tend to overlook the fact that many instruments in the toolbox of information manipulation are not new inventions but are rooted in propaganda practices long embedded in the statecraft of many countries and governments.

To fully grasp the evolution of disinformation, we must look to history. Manipulation of information is not a new tactic but a longstanding tool of political warfare. From the propaganda machines of Nazi Germany to the McCarthy-era witch hunts in the US, the manipulation of truth has been a constant throughout history. By understanding these historical precedents and mechanisms, and what triggered them, we can better contextualise the current disinformation landscape and develop more effective mitigation strategies.

Moreover, our fixation on external actors, especially in light of events like the alleged Russian interference in the 2016 US elections, often blinds us to the homegrown disinformation within Western societies. Political entities, interest groups, and even media outlets exploit misinformation to shape narratives and gain advantages. This phenomenon transcends borders and thrives on polarisation, posing a significant challenge to truth-seeking efforts.

It is crucial to recognise that oversimplifying disinformation as the ultimate threat to democracy serves certain political agendas. By dismissing dissenting voices as purveyors of disinformation, political actors seek to control the narrative and delegitimise opposition, thereby stifling meaningful discourse and undermining democratic principles.

The interim staff report released in late 2023 by the Committee on the Judiciary of the US Congress sheds light on the dangers of a one-size-fits-all approach to tackling disinformation, particularly in the context of the federal government’s involvement in the Election Integrity Partnership (EIP) – a consortium of ‘disinformation’ academics led by Stanford University’s Internet Observatory.

The report exposes many heavy-handed tactics employed by the government, including the censorship of political speech of US citizens. It also highlights how social media companies were pressured to censor information and political opinions, with a bias towards one side of the political aisle.

Here one might start wondering whether all disinformation is created equal, or whether it is all relative to how various narratives from various stakeholders fit the existing political discourse and serve the goals of respective political adversaries.

“Truthfulness has never been counted among the political virtues, and lies have always been regarded as justifiable tools in political dealings,” Hannah Arendt reminded us in her essay Lying in Politics.

‘Super elections’ and disinformation battles

We can only anticipate that similar approaches will reverberate in the months to come.

In 2024, when more than half of the world’s population will cast ballots in elections in more than 50 countries, information distortions have the potential to become the most disruptive force, with electorates manipulated through a variety of disinformation and propaganda tools, as well as censorship instruments, available to both state and non-state actors and applied both domestically and internationally.

Moreover, the rise of participatory misinformation, facilitated by the proliferation of social media and citizen journalism, adds another layer of complexity to the disinformation debate in this context. On the one hand, the democratisation of information dissemination has empowered individuals to participate in shaping narratives. On the other hand, it has created a context in which audiences retreat into their information bubbles, leaving few options for alternative opinions to be heard or considered in these self-sufficient information ecosystems.

When information goes uncorroborated by alternative voices, the risk of it turning into mis- and disinformation is considerably higher. However, if a simplistic punitive approach is applied to how users engage with each other, the end result could be as catastrophic for democratic discourse as the proliferation of disinformation itself.

Punitive measures targeting disinformation may inadvertently suppress marginalised voices or legitimate forms of activism. Algorithms designed to flag or remove content deemed as disinformation may disproportionately target certain communities or viewpoints, further exacerbating existing inequalities in access to information and amplifying the voices of those in power.

Germany’s law targeting social media platforms like Facebook and Twitter (now X), implemented in 2017, aimed to combat hate speech and fake information by mandating the removal of offending posts within 24 hours, under threat of hefty fines. However, the unintended consequences of this approach have become increasingly apparent. The government has since amended the law after overzealous blocking of content prompted concerns from organisations like the Association of German Journalists.

Brazil’s proposed Internet Freedom, Responsibility, and Transparency Bill, dubbed the “fake news” bill, also presents a stark illustration of the perils of oversimplified approaches to combatting disinformation. While ostensibly aimed at curbing the spread of false information on social media and messaging platforms, the bill has sparked widespread concern among civil society organisations. Critics argue that the legislation undermines fundamental rights to freedom of expression and privacy while failing to effectively address the root causes of disinformation.

To mitigate these global dangers, a nuanced approach is imperative. Tackling disinformation requires a coordinated approach involving governments, tech companies, civil society organisations, and individual citizens. However, when the debate is framed in black and white terms, it becomes harder to find common ground and work together towards solutions.

Thus, rather than resorting to broad and indiscriminate measures, responses should be tailored to the specific nature of the problem and address the underlying factors driving information distortions.

“In the midst of chaos, there is also opportunity,” wrote Sun Tzu, author of The Art of War in the 5th century BC.

In today’s landscape, the stakes have never been higher to seize that opportunity and turn information chaos into information order, in which societies can function productively without waging digital battles from within.

Julia Savchenko is a Policy Leader Fellow at the Florence School of Transnational Governance, where she examines the challenges facing journalism and mass media in times of information disorder. She is a journalist, TV anchor, and media manager with two decades of international experience. Her previous roles include producing TV and radio shows for the BBC World Service in London, Moscow, and Washington, as well as at Voice of America and Radio Free Europe in Washington and Prague. She is currently a Senior Editor at Voice of America.