The biggest problem with social media isn’t content moderation

Meta CEO Mark Zuckerberg speaks in Denver on July 29. Photo by David Zalubowski, The Associated Press

The Washington Post
Meta CEO Mark Zuckerberg has taken a lot of heat since announcing that he is pulling his company out of the fact-checking business and curtailing content moderation on its platforms. The criticism is understandable, given the uncertainty over how Meta’s new rules will handle misinformation and otherwise harmful material.

Keep in mind, however, that the company’s content-moderation strategies — and indeed those of practically all social media platforms — have not worked as intended. As Zuckerberg noted in a video about the changes, Meta’s automated content screening often got things wrong. And even when it correctly identified “misinformation” — a nebulous term that’s far more difficult to define than many people want to admit — it struggled to remove the stuff, given the volume and persistence of bad actors.

In any case, the problems that social media poses for its users run much deeper than content moderation. Bigger concerns stem from how platforms disseminate content. Tech companies should be helping address these worries by doing far more to reveal their algorithms to the public, allowing for greater scrutiny of their operations. The companies should also grant access to their data so that researchers and policymakers alike can study the effects that social media networks have on users and society.
