Why Is Facebook So Afraid of Checking Facts?

The biggest social network in the world has the wrong idea about how to fight Covid-19 conspiracies.

May 14, 2020

WIRED

A video laden with falsehoods about Covid-19 emerged on Facebook last week and has now been viewed many millions of times. The company has taken steps to minimize the video's reach, but its fact-checks, in particular, appear to have been applied with a curious, if not dangerous, reticence. The reason for that reticence should alarm you: It seems that the biggest social network in the world is, at least in part, basing its response to pandemic-related misinformation on a misreading of the academic literature.

At issue is the company’s long-standing wariness of so-called “backfire effects,” the worry that the mere act of trying to debunk a bogus claim may only help the lie grow stronger. Founder and CEO Mark Zuckerberg expressed this precise concern back in February 2017: “Research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization,” he said. The company would later cite the same theory to explain why it had stopped applying “red flag” warnings to fallacious headlines: “Academic research on correcting misinformation,” a Facebook product manager wrote, has shown that such warnings “may actually entrench deeply held beliefs.”
