Right-leaning pages also produce more misinformation, the forthcoming study found.
A new study of user behavior on Facebook around the 2020 election is likely to bolster critics’ long-standing arguments that the company’s algorithms fuel the spread of misinformation over more trustworthy sources.
The forthcoming peer-reviewed study by researchers at New York University and the Université Grenoble Alpes in France found that from August 2020 to January 2021, news publishers known for putting out misinformation got six times as many likes, shares, and other interactions on the platform as trustworthy news sources, such as CNN or the World Health Organization.
Ever since “fake news” on Facebook became a public concern following the 2016 presidential election, publishers who traffic in misinformation have repeatedly been shown to gain major audiences on the platform. But the NYU study is one of the few comprehensive attempts to measure and isolate the misinformation effect across a wide group of publishers on Facebook, experts said, and its conclusions support the criticism that Facebook’s platform rewards publishers that put out misleading accounts.
The study “helps add to the growing body of evidence that, despite a variety of mitigation efforts, misinformation has found a comfortable home — and an engaged audience — on Facebook,” said Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the study’s findings.