
Beyond the FTC: The Future of Privacy Enforcement

This is a lovely conference, and it is such a delight to be back in person with folks again. Thank you so much for including me and for thinking so hard about how to actually improve privacy enforcement for users of technology. There have been so many ideas here, and I love learning about cutting-edge legal ideas to make privacy better.

I appreciate that the scholars here are discussing and thinking through so many cases and issues where the Electronic Frontier Foundation (“EFF”) has long been involved. We were amicus in the TransUnion[1] and Spokeo[2] cases in the Supreme Court. We weren’t formally involved in the Equifax case,[3] but I’ve blogged about it more times than I can count at this point. The Carrier IQ case[4] made an appearance as well; EFF directly represented the coder who found the problem in the Carrier IQ software that led to the consumer privacy litigation. To get there, we fended off a Computer Fraud and Abuse Act threat.

We also heard about efforts to track and bring meaning to Terms of Service, something EFF tried to do a while ago with a tool called TOSBack. The original idea of TOSBack was that we would have links to judicial opinions and other legal materials that could tell you what the law had said about each particular term in a terms-of-service or clickwrap agreement, letting users know, for instance, whether a term was enforceable. But ultimately, we threw our hands up because tracking and analyzing the terms was just way too hard. So thank you all for continuing to try to help consumers understand what the terms of service on websites actually mean for them. It was nice to hear that large language models might be helpful for parsing terms of service, something we mere humans struggle with.
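As an illustration of that last point, here is a minimal sketch of what LLM-assisted parsing of terms of service might look like. This is purely hypothetical, not a TOSBack or EFF tool: it assumes access to OpenAI’s chat-completions API via the openai Python package, and the clause categories and prompt are invented for the example.

```python
# Hypothetical sketch: label a single terms-of-service clause with an LLM.
# Assumes the `openai` package (v1+) and an OPENAI_API_KEY environment
# variable; the model name and categories are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CATEGORIES = ["arbitration", "data sharing", "license grant",
              "termination", "liability waiver", "other"]

def classify_clause(clause: str) -> str:
    """Ask the model to assign exactly one category to one clause."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": ("You label clauses from website terms of service. "
                         f"Reply with exactly one of: {', '.join(CATEGORIES)}.")},
            {"role": "user", "content": clause},
        ],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(classify_clause(
        "Any dispute arising under these Terms shall be resolved by "
        "binding arbitration on an individual basis."))
```

Even a sketch like this shows both the promise and the limits: a model can triage clauses at scale, but a human still has to decide what each label means legally, which was exactly the part TOSBack found hard.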

I want to talk a little about some of the ideas I found interesting in the papers, some concerns I have, and some broader observations about privacy protections from my vantage point at EFF.

As a litigator at heart, I am very interested in class actions and reviewed with interest Professor Cofone’s article about how we might be able to better harness them. I liked the idea of treating the harm from the loss of obscurity as the flip side of what happens when the company gets information to target you better. I’m wondering if there’s something like a loss there; I’m not sure, but the idea is useful to consider.

I also loved the discussions about the critical importance of private rights of action. The lack of serious private rights of action is one of the key reasons that EFF has been the skunk at the garden party on a lot of the privacy laws that have been proposed federally and in the states. We insist that any serious notion of enforceability has to directly empower the users whose privacy has been violated. The remedies have to be real to them. An insufficient private right of action is one of the things we complained about in last year’s proposed American Data Privacy and Protection Act (“ADPPA”).[5] One thing that might not be as apparent to those who don’t litigate is how the ADPPA, combined with arbitration clauses, would prevent user privacy litigation. This isn’t theoretical: it already happened to us in Scott v. AT&T,[6] our lawsuit against AT&T for violating user privacy by selling user location data.[7] We were also not excited about the ADPPA depriving the Federal Communications Commission (“FCC”) of privacy jurisdiction, given how important broadband providers are to protecting people’s privacy. The Scott case demonstrated that, too. That is another reason we unhappily differed from many of our friends in the privacy community about that proposed law.

We still have work to do to make sure users — real people — are empowered to demand accountability and protect their own privacy. To that end, I loved Professor Elvy’s idea of using the implied warranty of merchantability. I was in private practice for ten years in California before I joined EFF and did some warranty practice. It could be wonderful to try to bring that old doctrine into this newer context where we can. I thought that Professor Cooper’s idea of taxing data was also interesting but a bit outside my wheelhouse as a civil liberties attorney.

Among the papers, I was especially happy to see that the data loyalty doctrine is continuing to grow and develop. I am just waiting to get it to a place where we can bring it to bear in cases. I appreciate the work that Professors Neil Richards and Woodrow Hartzog, along with their student Jordan Francis, are doing to try to turn the data loyalty idea into a workable piece of legislation. I am hopeful that we can support it through the legislative process and that, ultimately, EFF and other organizations can use it to bring a real sea change to the privacy of users of digital services.

There were some other ideas I liked as well. I will be using the term “regulatory dodges,” which I learned from the paper by Professors Helen Nissenbaum, Katherine Strandburg, and Salome Viljoen. I also appreciate the work done to look at Software Development Kits (“SDKs”). We have considered a couple of potential cases that had to do with the privacy issues with SDKs. Overall, trying to get the legal doctrines to deal seriously with the complexity of our modern internet experience is something that is going to continue to be very interesting and important. I also liked the effort to break out of the binary that equates privacy with secrecy, and the exploration of contextual integrity. It just can’t be the case that if you tell or reveal something to one person or entity, you’ve lost all right to privacy. I think there is a lot to be developed to get to the place where we can argue this before a judge. But it captures something important, and I appreciate the work that has to happen to make it so.

When it works well, what we get to do at EFF is use smart ideas from academia, or even partner with the academics who are trying to make privacy better for users. My favorite example is how we took Professor Strandburg’s idea about the right of association and NSA spying and turned it into our First Unitarian Church v. NSA case.[8] Other times we are proud to partner with those in the academy to try to make your ideas manifest in other branches of government like legislatures and agencies. Academic thought and analysis is a critically important engine in moving privacy forward, and we truly appreciate the work being done in the academy right now. So thank you very much.

At the same time, there are a couple of ideas in the papers that I think are worth thinking hard about before we push forward on them. Honestly, they make me a little nervous. I figured I would just articulate them.

The first one is the idea of platforms as primary enforcers and protectors of people’s rights. We’ve now seen too many ways in which that expectation has made it harder to dislodge the predominant platforms. It is true that one big thing you can do to try to make privacy happen is to be Apple and have the App Store give users a check box against tracking. But I’m nervous about embracing that decision-making as a substitute for governance. Apple may be our friend now, but they’re not going to be our friend forever. That’s just the nature of the corporate, for-profit world, where the financial incentives change over time. Putting the responsibility on corporations to protect our privacy will end up being too flimsy for what we need. It’s also dangerous in terms of the big picture of building a healthy ecosystem for users. We really need to get away from the big tech giants controlling everything we do, even if occasionally they use that control in ways we like.

Another thing that makes me nervous is the idea of canonical identifiers. At EFF, we work with people who are under threat from repressive governments and companies for what they say and do online. They need anonymity and the ability to move around both online and offline without being easily identified. Minimizing their canonical identifiers is a better goal for us as a society than maximizing them. Maybe you’ll talk me into it, but it makes me nervous.

A third thing to be careful about is technology mandates. While it can be easy on paper to just ban certain tools and services, it is very tricky to do that while still making sure that technologists can continue to innovate and that technology continues to move forward and doesn’t get captured by today’s big players. To be clear, what I mean by tech mandates is not telling technology companies what to do — like protect data security — so much as laws or policies requiring or forbidding access to, development of, or implementation of particular technology. We need to preserve the ability for our tools and services to change, for developers to have new ideas and implement them on top of old ideas. Too often, proposed tech mandates also include limitations on access to code, which blocks follow-on innovation. We need to protect our tinkerers — people who take things apart to see how they work. We need to protect our reverse engineers. All too often, tech mandates include both problems: enforced blindness and locking things down. They don’t have to, but they often do. So we need to continue to be vigilant as we get more serious about regulating technology as part of protecting privacy.

I’ll give you an example. The broadcast flag[9] is something from about 20 years ago. The content industry convinced the FCC to require that anybody who built a technology that could play content (TVs, computers, etc.) include a technical flag that let the content owner turn off your ability to play or record the content, or otherwise limit what you could do with it even after you paid for it. It was a version of digital rights management (“DRM”). EFF went to court along with our friends at the American Library Association, Public Knowledge, and others and got the broadcast flag thrown out.[10] DRM in other contexts has similar issues right now, and it is worth making sure that we are not suggesting mandates in the context of privacy that are going to create that kind of lock-in and reduce the ability of users to control their own devices.

A tech mandate that EFF worries about is different from saying “if somebody has a signal, then you have to follow it,” like the “Do Not Track” flag,[11] or from building tools that let people send more signals about what they want. The key question is this: who are you centering? These issues, like when a requirement for technology becomes a kind of troubling tech mandate, are tricky at the edges. But I think it’s important to articulate, for those of you thinking about new ways to protect privacy, the kinds of things that EFF is really excited about and the kinds of things that we are more nervous about.
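To make the “signal” idea concrete: Do Not Track is just an HTTP request header, DNT: 1, that a browser can send with every request. A minimal sketch of a server that honors it might look like the following; Flask is used purely for illustration, and what “honoring” means here (no analytics, no ad cookies) is an assumption for the example.

```python
# Minimal sketch: honoring the Do Not Track ("DNT") request header.
# Flask is illustrative only; the handler logic is a hypothetical
# stand-in for real analytics or advertising code.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Browsers with Do Not Track enabled send the header "DNT: 1".
    if request.headers.get("DNT") == "1":
        # Respect the signal: skip tracking pixels, ad cookies, profiling.
        return "Welcome. Your Do Not Track signal was received and honored."
    return "Welcome."

if __name__ == "__main__":
    app.run()
```

The technical part is trivial; the policy question is whether anyone is required to run the if-branch. That is the difference between a signal-following rule and a tech mandate.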

The other thing I normally say when I am talking to people about privacy is that we must always recognize that there is no serious dividing line between privacy from companies and privacy from governments. I am very happy that I did not see that argued in the papers presented here. The bottom line is that if we’re going to give people privacy, then we’re going to give them privacy, period. The right and ability to have a private conversation means it does not matter whether Facebook is listening or the government is listening. You’re not having a private conversation if either one of those is happening. And of course, if you think that giving information to a company does not mean it goes to the government, then you are not understanding what is going on right now.

I used to say this simple thing — that any attempt to draw a line between privacy from corporations and privacy from governments is a mistake — and people would look at me blankly. So I am very pleased to see people here today nodding along with me. As somebody who spent a lot of time trying to get the NSA to stop spying on everybody, it is really important that we recognize that a major vector for government spying right now is through corporations that gather our information. There are large NSA programs, like PRISM and Upstream, that gather information from our telecommunications and Internet companies.[12] But there’s also the FBI and local police just buying data from data brokers.[13] This does not mean we don’t have different rules for private companies and governments, but it does mean that we have to think about both vectors of violating our privacy, especially when we talk about designing tools and technological systems that gather information about users.

Here are three key things that EFF believes should be a baseline for any federal privacy law. I told you that we were not fans of the ADPPA, and the failure to include these three is partially why. The first baseline is that we believe that private rights of action and strong enforcement are crucial. A second is that we reject privacy preemption. A federal law must be a floor, not a ceiling. If we were considering a very good federal law, we might reconsider this because we are sensitive to the fact that developers need a clear set of rules and that complying with fifty-plus different jurisdictional rules is expensive and problematic, plus it favors bigger competitors. But I don’t think we have a “best-in-class” federal privacy law coming in our current or any foreseeable future Congress. So ensuring that there is no preemption is really important.

Third, we also take a strong stance on anti-discrimination, which in this context means we reject the idea of poor privacy for poor people. The amount of money you have should not determine how much privacy you get from your tools and services. Yet we have seen all of these approaches. AT&T was charging $29 a month for you not to be tracked by them.[14] Twitter was going to charge people to turn on two-factor authentication.[15] We also see that people who get cheaper phones have a lot more pre-installed spyware or apps that spy on them.[16] We already live in a world where poor people get less privacy than someone like me, and I get less privacy than someone like Mark Zuckerberg. That should stop.

Those are the three EFF bottom lines around privacy legislation: no preemption, a private right of action, and non-discrimination.

There are other things we’d like to see in privacy legislation too. We are great fans of data minimization: requiring companies and governments to collect only the data they actually need and to keep it only as long as they need it. My colleague Shirin Mori has done a lot of work around dark patterns and misleading interfaces.[17] EFF has even taken the position that we support banning behavioral ads, and we think that they can be safely outlawed consistent with the First Amendment. There’s a great blog post by my colleagues Bennett Cyphers and Adam Schwartz about that.[18]

We also care about transparency. The transparency papers in this symposium are fabulous.

Yet another issue that EFF thinks about in the context of both competition and privacy is interoperability, specifically what we call adversarial interoperability. Adversarial interoperability means you don’t have to go beg permission to be able to interoperate with another service. This is a different idea from Application Programming Interfaces (“APIs”). Interoperability through an API generally means the first company is really in charge of how a second developer interoperates with them. Our goal as a society ought to be a world that does not require the permission of the data holder in order to foster competition, innovation, and a race to the top on privacy along with other user-friendly innovations. API-enforced privacy leaves privacy at the mercy of the corporate business model of the API creator, which, as I noted above, is just leaving privacy to the goodwill of for-profit corporations that haven’t demonstrated they are worthy of our trust. A better answer is to pass a strong baseline privacy law that applies to everyone and let everyone then innovate in compliance with it.

Another thing your papers made me think about is the flimsiness of privacy in public policy debates. It seems that every time someone says we’re going to balance privacy against some other value, that is your sign that privacy is going down. How can we tie privacy to some other value that has stronger standing in the law or otherwise in people’s lives? How do we tie it to speech rights? We know that Geofeedia gave information to the police about the location of Black Lives Matter protesters.[19] That tied privacy to Fourth and First Amendment issues and the right to participate in a political protest.

This is the same basic argument, developed by Professor Strandburg, that we used in our First Unitarian Church v. NSA case, where we represented a dozen or so organizations that were chilled from using the telephone to organize when they learned of the NSA collecting telephone data en masse. We weren’t successful in the case, but that wasn’t because the theory was wrong; it was because we had a judge who was so wobbly in the national security context that he wouldn’t even make an initial decision in the case, sitting on it for many years. But the idea is solid: tying location and other metadata to the right of association is an area that I think is underdeveloped in the caselaw and underdeveloped in theory. We know that so much of the spying the government is doing is based on association: people are tracked based upon where they are, who they talk to, and other non-content material. Governments do this in both national security and domestic contexts. I think there is still work we could be doing, and I’m looking for cases where we can push this concept forward.

We are also working to tie privacy into more traditional civil rights work. EFF just gave comments to the National Telecommunications and Information Administration (“NTIA”) on this topic, pointing out how civil rights require privacy and how those facing discrimination are often the most surveilled by authorities.[20] We are keeping track of, for instance, how law enforcement use of facial recognition is misidentifying people, almost always Black people. Sadly, we hear about more of these examples every day. Telling those kinds of stories, tying the issues that we’re concerned about to other values that might have a better chance of making it through the political branches or the judiciary, is something we try to do at EFF. It’s part of why I love the Uniform Commercial Code (“UCC”) idea: taking an established body of law and bringing it to bear on privacy.

Another issue we all need to address is that right now many in our society believe that surveillance brings safety. One example: pretty much every person who has their property burglarized is advised — by law enforcement and others — to install cameras. This advice (and expense) comes regardless of whether cameras actually work to deter or solve crimes. In San Francisco, where I live, it’s rare that a home burglary is solved by the police at all, much less that cameras provide significant help.[21] I think that’s probably true if you look at other places too. Yet, the advice is still the same, and it’s taken as a matter of common sense by most people. We need to ask whether all this surveillance is worth the investment — not to mention the impacts on marginalized communities who are often over-policed and over-tracked by these cameras. We’ve seen this problem in the work we have done around Ring cameras.[22]

We need to think about how we break that frame of surveillance bringing safety because it is very powerful right now, and it means that we’re losing many privacy fights. We have to begin to bring some critical and scientific attention to that claim. If I am wrong, and the truth is all the surveillance is making us safer, then I will step back and think about something else to do. But my intuition is that we’ll find that much of the increased surveillance that we’re seeing on both the consumer and the governmental side isn’t making us safer but instead is simply enriching the many people selling us these increasingly profitable surveillance tools.

Let me close with a few more hopeful thoughts. First of all, please don’t give up. This is really important work, and having the time, space, and ability to engage in dialogue and get student help to develop these ideas is, from where I sit, an amazing benefit to all of us who want and need privacy. I appreciate the framing of this Symposium as looking beyond the FTC. Rather than wallowing in problems, it focuses us on looking for solutions.

In a similar vein, I recently started a podcast series called “How to Fix the Internet.”[23] Part of the reason I named it that was because I needed to force myself to imagine a world where we fix some of these problems; to imagine a world where we have technology that serves us, rather than technology that doesn’t. I want to help move us from a world where every technological innovation is greeted with “Oh god, what is it going to do to us now” to instead “My god, look at the cool and important things we can do with this to support our rights and our communities.”

I lived in a time when joy and possibilities were the predominant response to new technologies, and I think we can get to that place again as long as we build in the protections and accountability that are needed. You can’t build a better world unless you can envision a better world, though. In the past few years, I have had so many conversations with lawmakers and thought leaders where they cannot even envision technology bringing a better world. The conceit of my podcast is that not only do I have to think about a better world, I get to bring in smart people and ask them about it. The central question I ask is: What does it look like if we get it right? I urge all of you to think about this too.

Overall, I’m delighted to get to listen in to this conference and to address you directly. I do think what we’re doing at this conference is trying to build a better technological world, one law review article at a time. With smart thinking about law and policy, we’re trying to get us back to a place where our technology serves all of us. Thank you for doing that work.

[1] TransUnion LLC v. Ramirez, 141 S. Ct. 2190 (2021).

[2] Spokeo, Inc. v. Robins, 578 U.S. 330 (2016).

[3] See Equifax to Pay $575 Million as Part of Settlement with FTC, CFPB, and States Related to 2017 Data Breach, Fed. Trade Comm’n (July 22, 2019), https://www.ftc.gov/news-event... [https://perma.cc/DZA4-YRHE].

[4] In re Carrier IQ, Inc., 78 F. Supp. 3d 1051 (N.D. Cal. 2015).

[5] American Data Privacy and Protection Act, H.R. 8152, 117th Cong. § 403 (2022).

[6] Scott v. AT&T Inc., No. 19-CV-04063-SK, 2021 WL 2839959, at *6 (N.D. Cal. Feb. 16, 2021).

[7] Aaron Mackey, Forced Arbitration Thwarts Legal Challenge to AT&T’s Disclosure of Customer Location Data, Elec. Frontier Found. (Apr. 14, 2021), https://www.eff.org/deeplinks/... [https://perma.cc/2QZW-Q3MN].

[8] First Unitarian Church of L.A. v. Nat’l Sec. Agency, No. C 13-03287 JSW (N.D. Cal. Mar. 21, 2014).

[9] Broadcast Flag, Elec. Frontier Found., https://www.eff.org/broadcastf... [https://perma.cc/VKC9-CQEL].

[10] ALA v. FCC, Elec. Frontier Found., https://www.eff.org/cases/ala-... [https://perma.cc/4XAH-MQ8N].

[11] Do Not Track, Elec. Frontier Found., https://www.eff.org/issues/do-... [https://perma.cc/7BWZ-WP75].

[12] Upstream vs. PRISM, Elec. Frontier Found., https://www.eff.org/pages/upst... [https://perma.cc/T4Q9-ZKFU].

[13] Bennett Cyphers, Inside Fog Data Science, the Secretive Company Selling Mass Surveillance to Local Police, Elec. Frontier Found. (Aug. 31, 2022), https://www.eff.org/deeplinks/... [https://perma.cc/8H6A-JDTV]; Bennett Cyphers, How the Federal Government Buys Our Cell Phone Location Data, Elec. Frontier Found. (June 13, 2022), https://www.eff.org/deeplinks/... [https://perma.cc/E89R-2Y49].

[14] Chris Neiger, AT&T Inc. Now Wants You to Pay for Privacy — Here’s How Much, Motley Fool (Feb. 22, 2015, 12:36 PM), https://www.fool.com/investing... [https://perma.cc/4YTU-J3JR].

[15] Twitter to Charge Users for Text-Message Authentication, BBC (Feb. 20, 2023), https://www.bbc.com/news/techn... [https://perma.cc/4YD5-4QSU].

[16] Sara Morrison, “Privacy Shouldn’t Be a Luxury”: Advocates Want Google To Do More To Secure Cheap Android Phones, Vox (Jan. 17, 2020, 9:30 AM), https://www.vox.com/recode/202... [https://perma.cc/P2XV-9NZ3].

[17] Shirin Mori, Help Bring Dark Patterns To Light, Elec. Frontier Found. (May 19, 2021), https://www.eff.org/deeplinks/... [https://perma.cc/TZ95-RB8K].

[18] See Bennett Cyphers & Adam Schwartz, Ban Online Behavioral Advertising, Elec. Frontier Found. (Mar. 21, 2022), https://www.eff.org/deeplinks/... [https://perma.cc/SR67-39NX].

[19] Sam Levin, ACLU Finds Social Media Sites Gave Data to Company Tracking Black Protesters, Guardian (Oct. 11, 2016, 4:07 PM), https://www.theguardian.com/te... [https://perma.cc/ZVX7-92YP].

[20] Paige Collins & Adam Schwartz, EFF Comments to NTIA on Privacy and Civil Rights, Elec. Frontier Found. (Mar. 7, 2023), https://www.eff.org/deeplinks/... [https://perma.cc/TAB8-APML].

[21] Brian X. Chen, Security Cameras Make Us Feel Safe, but Are They Worth It?, N.Y. Times (Nov. 15, 2022), https://www.nytimes.com/2022/1... [https://perma.cc/4MQV-7DQB].

[22] Beryl Lipton, Neighborhood Watch Out: Cops Are Incorporating Private Cameras Into Their Real-Time Surveillance Networks, Elec. Frontier Found. (May 11, 2023), https://www.eff.org/deeplinks/... [https://perma.cc/HL7W-YX2J].

[23] How to Fix the Internet: Podcast, Elec. Frontier Found., https://www.eff.org/how-to-fix... [https://perma.cc/KYG9-4H77].