
Disinformation, security and democracy

18 July 2025


Reported by Christian Neubacher, Policy Engagement Planning Coordinator, Centre for Science and Policy


As governments navigate an increasingly complicated and interlinked policy landscape, policy institutions face the added challenge of growing disinformation and misinformation, which together obscure what constitutes truth. To explore this topic, the 2025 CSaP Annual Conference brought together leading experts from across the field to examine the current state of disinformation and the options available for addressing it. Chaired by Jill Rutter, Senior Fellow at the Institute for Government, the panel discussed how disinformation taps into human emotions, the top-down production and broader economy of disinformation, and efforts by the UK government and NGOs to tackle it.
The panel included Kevin Armstrong, Policy and External Relations Manager, Full Fact; Dr Jon Roozenbeek, Affiliate Lecturer, Department of Psychology, University of Cambridge; and Dr Ann Kristin Glenster, Deputy Director, Minderoo Centre for Technology and Democracy, University of Cambridge.

Listen to a recording of the panel discussion.

Kevin Armstrong began the session by starkly laying out today’s disinformation context, arguing that we are living through a crisis of facts. Public trust is low, he noted, particularly trust in politicians. Compounding this is the failure of government and parliament to take policy action on the disinformation crisis, and to own up to their own mistakes when they misstate or misrepresent facts. To help address this, Full Fact, which fact-checks claims made by politicians and public institutions, is working to adjust incentive structures to encourage more corrections of inaccurate or misleading claims.

Despite the efforts of organisations such as Full Fact, fact-checking faces an uphill battle against the powerful disinformation economy that has emerged. As Dr Jon Roozenbeek outlined, social media impressions cost very little to purchase, making it easy for even small actors to engage in disinformation. Adding to this pressure are large investments by major countries in the information ecosystem, which can amplify and exacerbate existing misinformation. This process of “borrowing legitimacy” allows disinformation actors to reach beyond their traditional audiences to the broader public.

Roozenbeek stressed that disinformation and misinformation are most often strategic rather than ideological. Actors aim to confuse the public about where the truth lies rather than to state clear falsehoods; over time, however, repeated exposure to disinformation can lead people to form false beliefs. Countering it therefore requires policymakers to approach it from a strategic rather than an ideological perspective, identifying and disrupting key parts of the disinformation industry, such as disinformation marketplaces. One intervention Roozenbeek highlighted is raising the effective cost of SIM cards, which would make it more expensive to create fake accounts.

However, as Dr Ann Kristin Glenster argued, attempts to regulate disinformation have been largely ineffective. Measures such as the Online Safety Act 2023 stem from the idea of protecting consumers but can conflict with freedom of speech protections. Regulators also face jurisdictional challenges, a rapid pace of technological development, and a lack of strong enforcement mechanisms, all of which impede their ability to tackle disinformation effectively. Glenster stressed that a large part of combatting disinformation lies in increasing transparency, and that efforts should be structured around this.

Compounding the already pernicious challenges governments face in dealing with disinformation and misinformation, the proliferation of generative AI is making it easier to produce persuasive false or misleading content. Add to this a broader backlash against fact-checking services and the substantial funding provided to lobbying organisations opposing regulation in this field, and the outlook becomes even more daunting.

Despite the challenges posed by disinformation, there are initiatives and policies working to address these problems. They include governments creating economic deterrents in cyberspace against disinformation actors, efforts to build a European “tech stack”, and media literacy programmes for the broader population. Over time, developments such as a “Right to Authenticity”, which would help people know whether information was generated by a human or a bot, could come to fruition.

Given disinformation’s entanglement with the economy, human psychology, and technological development, academic evidence can play a pivotal role in helping the UK government understand the scope and impacts of disinformation and decide how to respond.
