How Covid misinformation stayed one step ahead of Facebook


September 16, 2023


The work of trying to minimize the influence of harmful misinformation is both exhausting and essential. Big pushes, like the one Meta undertook in late 2020 to begin removing more misinformation about Covid-19 vaccines while promoting content from authoritative public health and scientific sources, always seem to come too late, undertaken only in response to public or institutional pressure. And they require a sustained effort that platforms don’t always seem willing to maintain. A question has always lingered in the background of these big public moments when major platforms get tough on online harms: Did these efforts actually work?

A new study, published this week in Science Advances, argues that Meta’s Covid-19 policies may not have been effective. Though Meta’s decision to remove more content did result in the overall volume of anti-vaccine content on Facebook decreasing, the study found that engagement may have “shifted, rather than decreased” outright.

“There’s a broader ecosystem and there’s demand for this content,” said David Broniatowski, one of the study’s authors and an associate professor of engineering at George Washington University. “You can remove specific articles or posts or instances of the content. But the fact that you didn’t see a change in the engagement for the content that remained [on Facebook] goes to show that people are out there and they want this stuff.”
