The Integrity Project


The Diffusion and Reach of (Mis)Information on Facebook During the U.S. 2020 Election

PUBLISHED BY
Sociological Science

AUTHORS AND RESEARCHERS
Sandra González-Bailón, University of Pennsylvania
David Lazer, Northeastern University
Pablo Barberá, Meta
William Godel, Meta
Hunt Allcott, Stanford University
Taylor Brown, Meta
Adriana Crespo-Tenorio, Meta
Deen Freelon, University of Pennsylvania
Matthew Gentzkow, Stanford University
Andrew M. Guess, Princeton University
Shanto Iyengar, Stanford University
Young Mie Kim, University of Wisconsin-Madison
Neil Malhotra, Stanford University
Devra Moehler, Meta
Brendan Nyhan, Dartmouth College
Jennifer Pan, Stanford University
Carlos Velasco Rivera, Meta
Jaime Settle, William & Mary
Emily Thorson, Syracuse University
Rebekah Tromble, The George Washington University
Arjun Wilkins, Meta
Magdalena Wojcieszak, University of California, Davis
Chad Kiewiet de Jonge, Meta
Annie Franco, Meta
Winter Mason, Meta
Natalie Jomini Stroud, University of Texas at Austin
Joshua A. Tucker, New York University

ABSTRACT
Social media creates the possibility for rapid, viral spread of content, but how many posts actually reach millions? And is misinformation special in how it propagates? We answer these questions by analyzing the virality of and exposure to information on Facebook during the U.S. 2020 presidential election. We examine the diffusion trees of the approximately 1 billion posts that were re-shared at least once by U.S.-based adults from July 1, 2020, to February 1, 2021. We differentiate misinformation from non-misinformation posts to show that (1) misinformation diffused more slowly, relying on a small number of active users who spread misinformation via long chains of peer-to-peer diffusion that reached millions, whereas non-misinformation spread primarily through one-to-many affordances (mainly, Pages); (2) the relative importance of peer-to-peer spread for misinformation was likely due to an enforcement gap in content moderation policies, which were designed to target mostly Pages and Groups; and (3) periods of aggressive content moderation proximate to the election coincided with dramatic drops in the spread and reach of misinformation and (to a lesser extent) political content.

READ THE WHITE PAPER IN ITS ENTIRETY.