Symposium Keynote

Hello everyone, and thank you for inviting me to speak to you all. I’m encouraged by the enthusiasm, diligence, and thoughtfulness each of you brings to this very timely and important topic of comprehensive privacy. The research you continue to produce is instrumental in guiding elected officials like me who are working to craft and pass well-informed, sensible legislation for issues as nuanced as this.

As a strong supporter of the American Data Privacy and Protection Act,[1] which passed out of the Energy and Commerce Committee by an overwhelmingly positive margin last year, I want to stress the importance of Congress’ role in ensuring that all Americans enjoy a strong data privacy standard — one that emphasizes data minimization and allows individuals to take action in response to privacy violations. It’s up to Congress to enshrine these rights into law, so it’s disappointing to me that the federal laws governing our privacy today have been on the books for decades. These laws desperately — and I want to underscore desperately — need an update to account for current data collection practices, and they need to be more resilient to future developments in technology.

A key entity in enforcing consumers’ data rights is the Federal Trade Commission. In recent years, the agency has stepped up to bring cases against big tech companies — from headline-grabbers like Meta[2] and Epic Games[3] to obscure data brokers like Kochava[4] — and hold them accountable for their invasive, anti-consumer practices. But it’s an abdication of our responsibility if we rely solely on the FTC to enact comprehensive data privacy rules. Congress needs not only to equip the FTC with the capacity and authority to address issues of data privacy but also to craft clear consumer protection regulations that place even stronger protections on the most sensitive types of data and on the privacy of our children and teens.

That’s why we must push to get the American Data Privacy and Protection Act across the finish line. It would ensure that companies are engaging in consumer-friendly data practices and finally give consumers control over their data. It’s most important to codify strong data minimization practices, especially for sensitive data like medical symptoms, location history, and financial and biometric information. Unfortunately, data is also collected on our children in schools every second, by technologies advertised as helping them learn and grow — and that collection has led to disproportionately punitive actions against marginalized youth, a trend that concerns me as a mom. And most recently, with the overturning of Roe v. Wade,[5] millions of women are worried that an ad tech company’s dataset or even their own period tracker app could be used to infer their choice to seek an abortion.

As technology evolves, the capacity for online platforms to capture even more of our sensitive data, such as biometrics, will accelerate — which emphasizes how important it is for us to get ahead of these issues by guaranteeing robust privacy protections now.

One of these emerging technologies that’s been in the spotlight as of late is AI, and I know you are interested in the threats and opportunities for privacy advocates in this space. We have already seen reports of people’s sensitive medical data being hoovered up into large AI training sets, which poses a real risk to those people if and when that data is exposed in a chatbot’s answer or an image generator’s creation. Even companies like Amazon are aware of these risks, urging their engineers to abstain from using ChatGPT for fear that proprietary code could leak into future public-facing chatbot answers.[6] Many consumers, eager to try out these shiny new chatbots, are unaware of how much of their questions and interactions are being folded back into the AI’s training — which goes to show how important data minimization principles are across all technology. This moment is an opportunity for data privacy advocates to guide the vitally important development of responsible standards for the AI industry, and I urge you to participate in that process.

In addition to privacy, I’d like to highlight complementary policies that are just as important to ensure that big tech works for the public interest, rather than just for their bottom line. I have long been an advocate for transparency into these large online platforms whose practices and algorithms have been intentionally opaque for far too long. Last year, I drafted and introduced the Digital Services Oversight and Safety Act[7] — DSOSA for short — to fulfill the need for comprehensive transparency legislation and to hold companies accountable for the promises they make to consumers, advertisers, and parents. A comprehensive privacy law will put guardrails on the way an individual’s data can be used by internet services, but without comprehensive transparency, we still lack insight into the impactful and opaque decisions that big tech companies, and their algorithms, make every day.

We need a codified framework for vetted researchers to securely and responsibly access the data they need to do their work. Independent research is necessary to understand the ways companies are using our data and how their algorithms are targeting content to users. You asked how researchers’ efforts to develop techniques for observational audits of big tech’s data practices could contribute to the future of privacy enforcement. I believe researcher data access will be crucial to these efforts, because watchdogs need to be able to back up their claims with clear evidence of violations. That is why recent developments at companies like Twitter — which closed its free API access and introduced an egregiously expensive payment model instead — are deeply concerning to me.[8]

We need to demand that companies provide researcher APIs, on terms that reflect how researchers actually access and use data, so they can bring to light how these companies really operate — from the effectiveness of their content moderation enforcement to the impacts of their algorithmic decision making.

If you tuned into March 2023’s blockbuster TikTok hearing,[9] you may have missed my line of questioning in between some of my colleagues asking about TikTok interfering with their home WiFi and how filters work. Needless to say, I took a different approach. I pressed the company’s CEO on this very issue because I truly believe that you can’t actually say you’re embracing transparency if researchers aren’t able to take a serious, independent look under the hood of your company. His answer didn’t impress — it just illustrated that the industry is too afraid of actual accountability to provide the transparency we need.

I want to conclude by emphasizing once again that many of these issues enjoy bipartisan support in Washington. Last summer’s passage of the American Data Privacy and Protection Act in the Energy and Commerce Committee was huge. And now states are taking notice. That includes my home state of Massachusetts, right where we are having this conference, which is using the text of our federal bill as a model for what it can do at the state level.[10] And if they pass the strong provisions we’re working to advance — a duty of loyalty, robust data minimization principles, and a strong private right of action — I will be looking to, at a minimum, defend the state law protecting my constituents — just as various other states did last year.[11]

The work happening in states right now is so important, but it’s critical that we return to the table and continue to advocate for federal comprehensive privacy laws so we can finally get something done to ensure the data rights of all Americans. Thank you for your work in this space and for the opportunity for me to speak on this important issue, and I look forward to hearing about the productive discussions and ideas that come from this symposium.

[1] American Data Privacy and Protection Act, H.R. 8152, 117th Cong. (2022).

[2] See generally Meta/Zuckerberg/Within, In the Matter of, Fed. Trade Comm’n, https://www.ftc.gov/legal-libr... [https://perma.cc/394E-62QZ].

[3] See generally Epic Games, In the Matter of, Fed. Trade Comm’n, https://www.ftc.gov/legal-libr... [https://perma.cc/PT9B-2ECN].

[4] See generally Kochava, In the Matter of, Fed. Trade Comm’n, https://www.ftc.gov/legal-libr... [https://perma.cc/5V5H-W3G9].

[5] 410 U.S. 113 (1973), overruled by Dobbs v. Jackson Women’s Health Org., 142 S. Ct. 2228 (2022).

[6] Eugene Kim, Amazon Warns Employees Not to Share Confidential Information with ChatGPT After Seeing Cases Where Its Answer ‘Closely Matches Existing Material’ from Inside the Company, Bus. Insider (Jan. 24, 2023), https://www.businessinsider.co... [https://perma.cc/ZEN5-U66N].

[7] Digital Services Oversight and Safety Act of 2022, H.R. 6796, 117th Cong. (2022).

[8] Jon Porter, Twitter Announces New API Pricing, Posing a Challenge for Small Developers, The Verge (Mar. 30, 2023, 5:35 AM), https://www.theverge.com/2023/... [https://perma.cc/Q7XZ-5GRM].

[9] TikTok: How Congress Can Safeguard American Data Privacy and Protect Children from Online Harms: Hearing Before the H. Comm. on Energy & Com., 118th Cong. (2023).

[10] Robyn Mohr, Trends in 2023 State Privacy Legislation, Loeb & Loeb Insights (Mar. 2023), https://www.loeb.com/en/insigh... [https://perma.cc/GF35-PF6P].

[11] Grace Bendik, Mapping the Current State of Federal Consumer Privacy Legislation, Colum. Sci. & Tech. L. Rev. Blog (Oct. 11, 2022), https://journals.library.colum... [https://perma.cc/8QJB-JM2G].