The overarching takeaway from the Facebook Papers is that Facebook knows. The company monitors just about everything, as the whistleblower Frances Haugen revealed by providing 17 news organizations with documents about the social-media company’s internal research and discussions. Facebook and its tech-industry peers employ armies of exceptional research scientists who evaluate how their platforms shape social behavior. Those researchers agree to a Faustian bargain: in exchange for limitless data, they sign nondisclosure agreements. And as the Facebook Papers document, these employees have discovered a range of disturbing problems that, if not for Haugen, might never have become publicly known. Even when employees of Facebook (which officially renamed itself Meta on Thursday) have privately objected to the company’s decisions to put profit over public safety, they’ve in many cases been overruled by Mark Zuckerberg and other executives.
I am working with other researchers, including J. Nathan Matias at Cornell and Rebekah Tromble and David Karpf at George Washington University, on initiatives that would guarantee the sharing of key information. (Matias, Tromble, and Karpf all contributed to this article.) We have observed that when independent researchers have tried to develop public-interest evidence on digital harms, tech companies have regularly obstructed their efforts. For years, Facebook has blocked transparency research from ProPublica, the Markup, New York University, AlgorithmWatch, and others.