Bringing Evidence-Based Social Considerations into Tech Policy: An Interview with IDDP Fellow Anna Lenhart


July 14, 2023

Anna Lenhart

Anna Lenhart, a seasoned expert in technology policy and the Institute for Data, Democracy, and Politics’ (IDDP) inaugural Knight Policy Fellow, has devoted a decade to navigating the intersection of technological advancement and social impact. Launching her career as a civil and environmental engineer, she made her initial strides into the world of data and prediction through environmental modeling in the biofuels industry. However, her most noteworthy work emerged after a transformative move to Washington, D.C., where she brought her technical and data science expertise into the heart of policymaking.

On the Hill, Lenhart served as the technologist on the House Judiciary Committee’s Investigation of Competition in Digital Markets. Following the investigation, she served as a policy aide to Congresswoman Trahan, a leader on technology policy issues on the Energy and Commerce Committee. There, Lenhart worked extensively on key tech policy issues, including online disinformation, child safety on digital platforms, and fostering better access to data for researchers studying online platforms.

Lenhart is also a PhD candidate in the Ethics and Values in Design Lab at the University of Maryland’s College of Information Studies, where her work focuses on engaging the public in some of the most challenging questions lawmakers face regarding online regulation. In 2021, she led the development of Contentr, a card game designed to mimic some of the challenges associated with moderating user-generated online content. The game was later used as part of a citizen panel centered on platform regulation and Section 230 reform. Her past work also includes designing frameworks for balancing freedom from discrimination in automated decision systems with privacy, and studying how advanced smart home users protect their data. Her ongoing research examines young TikTok users’ understanding and perceptions of researcher access to TikTok data, alongside a comparison of how data protection policies are shaped in the U.S. and Europe.

Lenhart’s experience, expertise, and interest in publicly minded, evidence-based policymaking align perfectly with IDDP’s mission. In her new role, Lenhart works to translate findings from IDDP research for policymakers, supports ongoing IDDP consultations with policymakers and their staff worldwide, and facilitates new connections between the Institute and those interested in evidence-based policymaking related to social media platforms, AI, and much more. Lenhart gave us a closer look at her work and shared her thoughts on the current obstacles and possibilities in tech policy.

 

Q: What led you to the field of technology policy, and what fuels your work?

Anna Lenhart: I don’t love being pigeonholed as a “technology policy” person; D.C. loves to do that. I am deeply committed to fostering a better society, one without poverty, with comprehensive healthcare and robust communities. I learned early on that my technology degree could be used to make progress on these and other issues I care about. I spent my twenties designing data systems for leading nonprofits in San Diego to help them serve their clients better, and like many Americans, I’ve seen the corrosive impact of wealth concentration on democracy. Every industry these days can use data to strengthen its monopolies, making it an important time to be a technologist in the antitrust movement. Lately, I am immersed in how algorithmically fed content is playing into our democracy, eroding faith in governmental institutions, and creating barriers to consumer protection against fraud, climate change, and public health threats. Essentially, my involvement in tech policy stems from its role as a cornerstone of today's most pressing policy and social issues.
 

Q: Can you elaborate on your approach to crafting or advocating for a specific tech policy?

Anna Lenhart: When advocating for a particular tech policy, my primary objective is to make the policy relatable and tangible for the widest possible audience. For instance, when pushing for data protection laws, I've found that illustrating their direct impact on the issues people care about most, such as climate change, truly resonates. While organizing a coalition around the Digital Services Oversight and Safety Act (H.R. 6796, 117th Congress), we were able to connect with climate change organizations and researchers who have been monitoring the way climate misinformation spreads online and how it obstructs building momentum for the climate movement. Remarkably, 15 climate organizations, groups that usually back energy and natural resources legislation, came forward to support a tech policy bill. This kind of support is what we need to see more of.


Q: Do you think there is enough understanding about the importance of data protection and digital rights?

Anna Lenhart: While awareness around data protection and digital rights is certainly growing, we are still a significant distance from the levels we need to reach. These are complex, abstract issues that don't always visibly impact your day-to-day life, which makes it easy to underestimate their importance. However, when people realize how their personal data can be manipulated and misused, the implications become more concrete and the urgency to act becomes clearer.


Q: How significant is the influence of tech companies in shaping tech policy?

Anna Lenhart: Tech companies wield considerable influence over tech policy, but it's a complex dynamic. It’s easy to point to the massive amount of money spent on lobbying Congress to stop bills aimed at regulating data processing and market power, but there are subtler forms of influence as well. The technology industry benefits immensely from its opacity. The complete lack of transparency into the breadth of data these companies collect, and how that data is used to determine what we see online, the services we have access to, and so on, sets up a situation where (a) policymakers and consumer advocates identify a problem but lack the information needed for evidence-based policymaking, and (b) tech leaders and the media exploit this lack of insight and frame it as ignorance, enabling them to run with the “oh, policymakers, you don’t understand us well enough to regulate us” narrative. In reality, key congressional committee members do possess a considerable understanding, despite companies’ efforts to make their products hard to understand. So yes, it's about the money, but it’s also about masterfully tapping into the U.S. cultural narrative favoring innovators over regulators.

 

Q: What role does public opinion play in shaping tech policy?

Anna Lenhart: Public opinion plays a pivotal role. Tech policies affect everyone, so it's important that they reflect the interests and needs of the people. Translating public sentiment and user experience into tangible policy is a delicate process, requiring the ability to connect tech policy issues to topics that people care about. This is one of the key challenges that we navigate in our work.

 

Q: Do we need a significant event to spur change in tech policy?

Anna Lenhart: While it’s unfortunate, it often takes a catastrophic event to bring about significant change. We've seen this with instances like Cambridge Analytica and the Facebook Papers, where only in the aftermath of a story that broke into the mainstream was the need for policy change truly recognized. It's crucial in these moments to be ready with the right legislative responses and to have key players within Congress prepared, and aligned, to act swiftly.

 

Q: How can we get more people involved in the conversation about tech policy?

Anna Lenhart: We start by making tech policy relatable. The more people understand how tech policy impacts their daily lives, the more they'll feel invested in these discussions. We aim to break down complex ideas into digestible pieces, to make the conversation around tech policy more accessible. By doing so, we not only invite more voices into the conversation but also ensure a diversity of perspectives. We also have to dispense with the narrative that you need to be a technologist or have a computer science degree to participate in these conversations. That’s not true: everyone is impacted by technology, and their lived experience is a form of expertise we have to make space for.

 

Q: Can you elaborate on the relationship between high-quality research and evidence-based policymaking?

Anna Lenhart: Good policy depends on deeply understanding the problems that plague society (particularly those affecting people with less power in a society) and understanding the incentive structures in place that make solving those problems unlikely without government action. Methodologically rigorous research provides the best information for understanding the nuances of a problem and the range of stakeholders involved. While working on policies related to social media transparency, I used studies from academics, civil society, congressional investigations, and former trust and safety staff to understand what information is locked inside these companies and why. We delved into the specifics of what data, even down to specific metadata, researchers need to answer our most pressing questions: How are recommender systems affecting the spread of content? What are they optimized for? What is the potential for discrimination and other human rights violations? I also used research from the field of Human-Computer Interaction to better understand the perspective of users and users' expectations of privacy.

 

Q: What is your vision for the future of tech policy?

Anna Lenhart: Good technology policy centers on human rights and democracy. It incentivizes companies to prioritize human autonomy, freedom from discrimination, and freedom of expression above their quarterly earnings. The right mix of a comprehensive privacy law, transparency and safety audit mandates, and competition reform will provide the policy levers and protections needed to continue the progress toward a sustainable, healthy, and free society.