Countering Disinformation Effectively: An Evidence-Based Policy Guide
Carnegie Endowment for International Peace
ABOUT THE AUTHORS
JON BATEMAN is a senior fellow in the Technology and International Affairs Program at the Carnegie Endowment for International Peace. His research areas include disinformation, cyber operations, artificial intelligence, and techno-nationalism. Bateman previously was special assistant to Chairman of the Joint Chiefs of Staff General Joseph F. Dunford, Jr., serving as a speechwriter and the lead strategic analyst in the chairman’s internal think tank. He has also helped craft policy for military cyber operations in the Office of the Secretary of Defense, and was a senior intelligence analyst at the Defense Intelligence Agency, where he led teams responsible for assessing Iran’s internal stability, senior-level decisionmaking, and cyber activities. Bateman is a graduate of Harvard Law School and Johns Hopkins University.
DEAN JACKSON is principal of Public Circle Research & Consulting and a specialist in democracy, media, and technology. In 2023, he was named an inaugural Tech Policy Press reporting fellow and an affiliate fellow with the Propaganda Research Lab at the University of Texas at Austin. Previously, he was an investigative analyst with the Select Committee to Investigate the January 6th Attack on the U.S. Capitol and project manager of the Influence Operations Researchers’ Guild at the Carnegie Endowment for International Peace. From 2013 to 2021, Jackson managed research and program coordination activities related to media and technology at the National Endowment for Democracy. He holds an MA in international relations from the University of Chicago and a BA in political science from Wright State University in Dayton, OH.
SUMMARY
Disinformation is widely seen as a pressing challenge for democracies worldwide. Many policymakers are grasping for quick, effective ways to dissuade people from adopting and spreading false beliefs that degrade democratic discourse and can inspire violent or dangerous actions. Yet disinformation has proven difficult to define, understand, and measure, let alone address.
Even when leaders know what they want to achieve in countering disinformation, they struggle to make an impact and often do not realize how little is known about the effectiveness of policies commonly recommended by experts. Policymakers also sometimes fixate on a few pieces of the disinformation puzzle—including novel technologies like social media and artificial intelligence (AI)—without considering the full range of possible responses in realms such as education, journalism, and political institutions.
This report offers a high-level, evidence-informed guide to some of the major proposals for how democratic governments, platforms, and others can counter disinformation. It distills core insights from empirical research and real-world data on ten diverse kinds of policy interventions, including fact-checking, foreign sanctions, algorithmic adjustments, and counter-messaging campaigns. For each case study, we aim to give policymakers an informed sense of the prospects for success—bridging the gap between the mostly meager scientific understanding and the perceived need to act. This means answering three core questions: How much is known about an intervention? How effective does the intervention seem, given current knowledge? And how easy is it to implement at scale?
OVERALL FINDINGS
• There is no silver bullet or “best” policy option. None of the interventions considered in this report were simultaneously well-studied, very effective, and easy to scale. Rather, the utility of most interventions seems quite uncertain and likely depends on myriad factors that researchers have barely begun to probe. For example, the precise wording and presentation of social media labels and fact-checks can matter a lot, while counter-messaging campaigns depend on a delicate match of receptive audiences with credible speakers. Bold claims that any one policy is the singular, urgent solution to disinformation should be treated with caution.
• Policymakers should set realistic expectations. Disinformation is a chronic historical phenomenon with deep roots in complex social, political, and economic structures. It can be seen as jointly driven by forces of supply and demand. On the supply side, there are powerful political and commercial incentives for some actors to engage in, encourage, or tolerate deception, while on the demand side, psychological needs often draw people into believing false narratives. Credible options exist to curb both supply and demand, but technocratic solutionism still has serious limits against disinformation. Finite resources, knowledge, political will, legal authority, and civic trust constrain what is possible, at least in the near- to medium-term.