Navigating Data Integrity and Digital Policy: An Interview with IDDP Fellow Jeff Allen


October 5, 2023

Jeff Allen

We’re excited to introduce Jeff Allen, the latest fellow at the Institute for Data, Democracy & Politics (IDDP). With an academic foundation in physics and an illustrious journey through the world of data science, Allen brings a unique blend of technical expertise and policy understanding to the table. Initially a researcher in New York University's physics department, Allen transitioned to the field of data science in 2013. From then on, his career has centered around understanding and influencing how online platforms leverage data, which ultimately led him to IDDP.

Allen's tenure in data science began at About.com, an online publisher, where he was tasked with reverse-engineering Google's search algorithm. This role gave him insight into how platforms create incentives for entire industries. His interest in the inner workings of platforms took a new turn in 2016, when he joined Facebook. Initially part of the local search team, Allen moved to the integrity team as the Russian Internet Research Agency and Cambridge Analytica scandals unfolded in the years following the 2016 U.S. election. Allen's commitment to identifying and curtailing unethical practices on platforms continued with his move to Instagram's integrity team, where he focused on tackling content that violated the platform's guidelines.

Recognizing the harms that online platforms could cause to societies and democracies, Allen left Facebook and Instagram at the end of 2019 with a vision of creating an open-source integrity team. This idea took shape when he connected with Sahar Massachi in early 2021. They envisioned a public organization that would serve as a common platform for tackling integrity issues across all digital platforms. This vision eventually led to the establishment of the Integrity Institute. Now, as a fellow at IDDP, Allen aims to use his profound experience to influence and guide policy discussions, while also connecting various stakeholders in the social internet ecosystem.

As we delve deeper into our conversation with Allen, he sheds light on his work, his views on data integrity, and his plan to contribute to IDDP's ongoing mission.


Q: What are the primary challenges in the United States regarding policy changes around social media platforms?

Jeff Allen: The main challenge lies in raising awareness of the different solutions available to policymakers. Many people are genuinely concerned about the impact of social media on teenagers, for instance, and they want to directly tell platforms how to redesign themselves. It's tempting to focus the anger and energy people have in one direction, but it's equally important to realize that if we create the right incentive structures, changes can come from inside. Furthermore, by better understanding the nuanced complexity of these platforms, we can work towards more sustainable and effective solutions. Educating policymakers on what would actually have an impact inside these platforms is a crucial thing to get right, and it's a task that requires both dedication and expertise.

Q: Can you elaborate on the pros and cons of increased transparency from social media platforms?

Jeff Allen: Transparency is the most important tool we have to change platform incentives, and on its own it can dramatically improve how social media companies build their platforms. The more companies have to be transparent about the scale and cause of harms on their platforms, the more likely they are to build responsibly and safely. Transparency is complex, though, because while data releases can help research, they also carry risks. First, any data that platforms release can be utilized by bad actors, although bad actors are already pretty skilled at reverse engineering. Second, we have to consider privacy: transparency reports may contain sensitive data, and it's a challenge to determine what can be included while respecting users' privacy. Transparency might also compromise the competitive edge of these companies. Lastly, there's the issue of trade secrets; companies might be reluctant to give away information about their algorithms. Striking the right balance between openness and protection is a tricky process, and it's something that policymakers, alongside platform managers, have to keep in mind. But these issues can all be managed, and transparency will be a crucial component of any successful social media regulation.

Q: How can IDDP help navigate issues around social media platform regulations?

Jeff Allen: What makes IDDP special is our insider expertise from people like myself who have worked within these platforms. If I put myself back in my previous roles at Facebook or Instagram, I can reflect on whether a particular policy would have helped me do my job better. IDDP hosts a community of experts who can determine whether certain policies would facilitate or hinder efforts to make platforms more responsible. We leverage our collective experiences and insights to guide policy decisions. This unique perspective from industry insiders makes IDDP a valuable player in the ongoing dialogue about social media regulation.

Q: Were there any pivotal moments in your career that influenced the path you're on now?

Jeff Allen: One moment that stands out was when we first worked with a senator's office before a hearing. It became clear that policymakers wanted the kind of insider expertise Sahar and I could offer, and it was a reminder that the world had not yet heard from people like us. That experience solidified our conviction to bridge the gap between policymakers and tech insiders, and it led us to the work we are engaged in today.

Q: Have you noticed any patterns or improvements in hearings related to social media platforms?

Jeff Allen: I've observed that hearings keep getting better. People on Capitol Hill genuinely care about these issues and they're leveling up really quickly. Despite some low moments, the overall trajectory has been quite promising. It is encouraging to see more nuanced and informed discussions emerging. There's still a long way to go, but the progress made so far gives me hope for a future where policy and technology can work hand in hand to address the challenges posed by social media.

Q: What's your view on how the current wave of Generative Artificial Intelligence is influencing the social media landscape?

Jeff Allen: It's an exciting time, but platforms will need to ensure a responsible transition. As AI starts to produce more content, it's crucial for platforms to decide how to respond. For instance, will Google penalize publishers who rely on AI to write articles, or will YouTube accept AI-generated videos? These are the kinds of questions platforms will need to address, and their decisions will greatly influence norms for both the publishing world and individual content creators. It's a brave new world, and platforms will need to strike a balance between embracing innovation and managing its potential risks.

Q: What implications might AI advancements have on traditional forms of content creation?

Jeff Allen: AI is becoming a tool in the content creator's toolbox, and this could drastically shift the power dynamics. For instance, if you're a publisher, video creator, or photographer, will your work be sustainable if you're not using AI? The way platforms respond to this will shape the future of content creation. There is a serious risk that platforms create a 'death spiral' for online publishing and content creation: if platforms reward low-effort, AI-generated content with high rankings, more views, and more traffic, then outlets that do original reporting, original research, and novel content creation will all struggle. The adoption of AI in content creation will also influence the job market, forcing professionals to adapt to a changing landscape. It's an exciting time, but it's essential to ensure this transition is handled responsibly.