Facebook’s Vaccine Misinformation Policy Reduces Anti-Vax Information

New study shows that social media companies have the tools to reduce the impact of vaccine misinformation on their platforms

March 2, 2022


WASHINGTON (March 2, 2022)— Following years of growing vaccine opposition and several outbreaks of measles, a vaccine-preventable disease, Facebook in 2019 established its first policy to stop the spread of misinformation about vaccines. Researchers at the George Washington University wanted to know whether the new policy actually worked. Jiayan Gu, a PhD student, and Lorien Abroms, professor of prevention and community health, along with their colleagues, developed a new framework for evaluating the policy. The team found that Facebook’s policy did reduce people’s interactions with vaccine misinformation.

“There is a growing consensus that health misinformation on social media platforms presents a critical challenge for public health, especially during the COVID-19 pandemic,” Abroms said. “While new policies by social media companies are an important first step to stopping the spread of misinformation, it is also necessary to rigorously evaluate these policies to ensure they are effective.”

Gu, Abroms and their team identified 172 anti- and pro-vaccine Facebook Pages and collected posts from these Pages during the six months before and after the policy went into effect. The study found that Facebook’s March 2019 vaccine misinformation policy moderately reduced the number of “likes” on anti-vaccine content posted to Pages on the platform.
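The study’s title indicates an interrupted time series design, which compares the level and trend of an outcome before and after a policy takes effect. As a rough illustration only, the sketch below shows a simplified segmented regression of this kind in Python; the weekly data file, column names, and ordinary least squares specification are assumptions made for the example and do not reproduce the study’s actual data or model.

    # Minimal sketch of an interrupted time series (segmented regression) analysis.
    # Assumes a hypothetical CSV of weekly "like" counts on anti-vaccine Page posts
    # with columns: week_index, likes, post_policy (0 before the policy, 1 after).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("weekly_likes.csv")  # hypothetical file, not from the study

    # Weeks elapsed since the policy took effect (0 for all pre-policy weeks).
    policy_week = df.loc[df["post_policy"] == 1, "week_index"].min()
    df["time_since_policy"] = (df["week_index"] - policy_week).clip(lower=0)

    # Segmented regression: baseline trend (week_index), level change at the
    # policy date (post_policy), and trend change after it (time_since_policy).
    model = smf.ols("likes ~ week_index + post_policy + time_since_policy", data=df).fit()
    print(model.summary())

In this setup, a negative coefficient on post_policy or time_since_policy would suggest a drop in the level or trend of “likes” after the policy, which is the general logic of an interrupted time series evaluation.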

Gu and her colleagues conclude that social media companies can take measures to limit the popularity of anti-vaccine content on their platforms.

“This research is a good first step in developing a process to evaluate the effectiveness of social media policies that are created to stop the spread of misinformation,” Gu said. “We are excited to continue this work and grow our understanding of how social media policy interventions can positively change online information sharing ecosystems.”

The study was funded by the GW Institute for Data, Democracy and Politics (IDDP), which launched in 2019 with the support of the John S. and James L. Knight Foundation. IDDP's mission is to help the public, journalists and policymakers understand digital media’s influence on public dialogue and opinion, and to develop sound solutions to disinformation and other ills that arise in these spaces.

The study, “The Impact of Facebook’s Vaccine Misinformation Policy on User Endorsements of Vaccine Content: An Interrupted Time Series Analysis,” was published on March 2 in Vaccine.