Disinformation: As the dust unsettles
As we await the results of the American presidential election and its ramifications for democracies across the world, one factor overshadows debates on good governance, climate change, containment of the pandemic, universal healthcare, immigration, unemployment, the international order and an impending economic recession: the influence of disinformation.
It seems almost surreal that psychologically driven campaigns, deliberately designed to mislead people by tapping into their existing biases, divisions and fears, have become the kingmaker in a world that once seemed open to endless possibilities with the digital revolution.
So in a world ruled by information wars, who holds the reins? Like mercenaries trained for real wars, online disinformation campaigns are carried out by individuals who are professionally trained, organized and well financed.
There are many players, ranging from international political propagandists to domestic hate groups. In the case of Russian meddling in the 2016 US elections, a major player was Yevgeny Prigozhin, who financed the operation through the Internet Research Agency (IRA). In addition to running troll armies that attack democracies in the digital space, Prigozhin, a businessman allied with President Vladimir Putin, also runs mercenaries in conflict zones such as Syria, Sudan and Libya, among other countries.
Countering the offence
The counter-approach to disinformation is disorganized and disproportionately focused on the role of social platforms; international agencies and national governments have taken few stringent legal actions against those actively propagating disinformation. Earlier this month, the EU imposed a travel ban and asset freeze on Prigozhin for undermining peace in Libya and breaking the UN arms embargo, but we have yet to see concrete punitive measures for his disinformation-related crimes. While a US grand jury indicted Prigozhin in 2018 for interfering in the elections, the charges filed against him by Robert Mueller were dropped before trial in 2020.
Social platforms are faulty arbiters
Since social platforms are equipped with better resources, expertise, artificial intelligence and other technical tools, they are expected to do more: to identify and stop the spread of hateful content that could have real-world consequences. This tendency to delegate responsibility to them is as dangerous as it is problematic. The problem arises when platforms unilaterally decide which content should stay and which should go. We now have clear evidence that even when content is flagged for violating community standards, it stays up on certain platforms for business and political reasons. This is precisely why the accountability and transparency of platforms' content moderation are being questioned.
Governments themselves, lacking technical expertise and reluctant to take a firm stand on issues of freedom of speech, have failed to mount a concerted effort to address disinformation.
The case of India: a government reluctant to intervene
In India, disinformation is mostly manufactured by domestic players and propagated through mainstream news channels and social media platforms such as WhatsApp, Facebook and Twitter. Most of the disinformation on social media is spread by political parties with dedicated teams for attacking political opponents and minorities. This has led to an unprecedented wave of violence, even to targeted murders. According to Microsoft’s Digital Civility Index 2019, Indians were the population most likely to encounter fake news and internet hoaxes.
While India has no specific laws to tackle fake news, IPC sections 153A and 295A can be invoked when fake news amounts to hate speech. A recent notification issued by the Information and Broadcasting ministry, cancelling the Press Information Bureau (PIB) accreditation of journalists promoting fake news, was immediately withdrawn after massive outrage over what was seen as an attack on the freedom of the press. The government then passed the buck to the Press Council of India (PCI) and the News Broadcasters Association (NBA) to deal with fake news. Social media platforms, for their part, are exempted from liability for third-party content under Section 79 of the Information Technology Act, 2000. Amendments are underway to make the platforms responsible for non-user-generated content.
New solutions at different levels of governance
Given the reluctance of governments to intervene, Governance Innovation Labs is engaging with civil society to tackle disinformation by building a framework for testing different strategies in collaboration with various stakeholders. In 2017, we conducted the first public consultation on fighting fake news in Kerala, at the Trivandrum Press Club, with the participation of citizens, journalists, lawyers and AI experts. A survey on the impact of fake news, along with a report on the consultation, was published to initiate public discourse on the need for corrective action. Through digital awareness campaigns and literacy programs, we worked with various agencies to initiate and test solutions to the disinformation problem by curating and debunking major fake news, and by drafting and publishing a disinformation bill for Kerala.
During the pandemic, in response to the rise in disinformation, we ran a campaign calling on advertisers to stop funding fake news. In addition to nudging the advertising industry to dissociate from fake news, we set up a legal team to help citizens sue media outlets for broadcasting fake news. The legal and monetary consequences of propagating disinformation provide a strong incentive for media firms to adopt stringent fact-checking measures. Complementing these efforts, the Kerala government recently instituted a fact-check division to identify fake news and take time-bound legal measures, along with setting up internet information cells in every district. The journalist community in Kerala also came together and published a series on the need to improve the quality of journalism. At Governance Innovation Labs, we continue designing, testing and implementing (and at times failing at) new solutions to the disinformation crisis.
Delivering the internet’s potential
The divisive digital world in which we find ourselves is far from the promised land we expected: one of technological advancement and an open, equal and inclusive internet. To build that future, we must act together to find solutions and be brave enough to cross lines and redefine what we have accepted as norms. If we delay, the opponents of that vision will inevitably take us to a point of no return.
Asha Jomis is a Policy Leader Fellow at the EUI’s School of Transnational Governance and founder of Governance Innovation Labs, where she leads public policy initiatives aimed at improving policy outcomes and designing new solutions that encourage governments to deliver better public services.