Government Role in Regulating Vaccine Misinformation on Social Media Platforms


September 3, 2019

JAMA: The Journal of the American Medical Association

Recognizing that interactive computer services offer a forum for a true diversity of political discourse, Congress enacted the Communications Decency Act to preserve the vibrant and competitive free market that exists on the internet.3 The law insulates interactive computer services, such as social media platforms, from state tort liability; under it, a platform is not obligated to remove even content that a court has determined to be defamatory. Although this shield protects social media platforms from having to engage in massive self-censorship and from being sued out of existence, it also protects platforms that facilitate misinformation.

Technology companies have traditionally been reluctant to regulate the content users share on their platforms because of the risk of backlash that meddling with content would create. However, as efforts to spread misinformation have moved from sporadic and haphazard to organized and systematic, these companies have come under increasing pressure to act. In November 2016, Facebook banned misinformation from advertisements on the site. One study found that this ban led to a 75% decrease in the sharing of antivaccination misinformation on Facebook relative to Twitter, which made no change to its advertising policy during this period.4 Other platforms have taken similar action: Pinterest banned all antivaccination content outright, and YouTube demonetized antivaccine channels and videos, eliminating the financial incentives of those who profit from misinformation.