Differences in misinformation sharing can lead to politically asymmetric sanctions
PUBLISHED BY
Nature
AUTHORS AND RESEARCHERS
Mohsen Mosleh, University of Oxford
Qi Yang, Massachusetts Institute of Technology
Tauhid Zaman, Yale University
Gordon Pennycook, Cornell University
David G. Rand, Massachusetts Institute of Technology
ABSTRACT
In response to intense pressure, technology companies have enacted policies to combat misinformation. The enforcement of these policies has, however, led to technology companies being regularly accused of political bias. We argue that differential sharing of misinformation by people identifying with different political groups could lead to political asymmetries in enforcement, even by unbiased policies. We first analysed 9,000 politically active Twitter users during the US 2020 presidential election. Although users estimated to be pro-Trump/conservative were indeed substantially more likely to be suspended than those estimated to be pro-Biden/liberal, users who were pro-Trump/conservative also shared far more links to various sets of low-quality news sites—even when news quality was determined by politically balanced groups of laypeople, or groups of only Republican laypeople—and had higher estimated likelihoods of being bots. We find similar associations between stated or inferred conservatism and low-quality news sharing (on the basis of both expert and politically balanced layperson ratings) in 7 other datasets of sharing from Twitter, Facebook and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. Thus, even under politically neutral anti-misinformation policies, political asymmetries in enforcement should be expected. Political imbalance in enforcement need not imply bias on the part of social media companies implementing anti-misinformation policies.
READ THE PAPER IN ITS ENTIRETY.