
Balancing Privacy and Speech in the Right to Be Forgotten

Written By: Michael Hoven
Edited By: Albert Wang

Introduction

When the European Commission recently proposed a “right to be forgotten,” U.S. commentators rushed to criticize it. “More Crap from the EU,” said Jane Yakowitz at the Info/Law blog. At Techdirt, Mike Masnick called it a “ridiculous idea.” Granting people the right to erase information about themselves would give them the power to stamp on the speech rights of others. Allowing this in the aggregate could produce profound social costs: increased costs of doing business could stunt innovation; research data could be lost; history could be erased.

This comment takes a different position. I argue that the right to be forgotten attempts to solve a privacy problem that is serious and deserves our attention. However, the social costs of establishing such erasure rights in data are nonetheless real. Individual privacy rights should not be allowed to decimate our networked information environment, or our ability to study the data within it and learn about ourselves. The right to be forgotten, and analogous privacy frameworks, contain exceptions — for example, for journalism or free expression — but additional measures should be taken to provide sufficient protection for expression and research.

In any privacy regime that incorporates erasure rights, there are two partial solutions that should be instituted to preserve some (if not all) of the social value of personal information. The first partial solution, data anonymization, has many skeptics in the law review literature, but it has already reaped many benefits and imposes less of a privacy cost than many other privacy risks that we already tolerate. The second partial solution, eventual opening of suppressed information, is inspired by archival practice and rests on the premise that remembering, not forgetting, is crucial to the democratic process.[i] As a remedy for privacy harms, forgetting is overbroad.
Information that was once available but was removed should not permanently vanish, but rather should be restored once the potential for harm is no longer substantial enough to justify the suppression of information.

The Right to Be Forgotten

As people live more of their lives online, they expose an increasing amount of personal and potentially sensitive information. Two challenges to privacy result. The first is the “database problem.”[ii] The amount of personal information stored in databases makes possible tracking, surveillance, or other misuse by governments or corporations.[iii] The second is violations of what Helen Nissenbaum calls “contextual integrity”: information disclosed or known to some can still threaten privacy if it is further distributed and made known in another context.

In January, the European Commission announced a proposal to address these problems through a sweeping revision of Europe’s current rules on information privacy. The most notable and controversial feature of the proposed General Data Protection Regulation was the “right to be forgotten,” and it was aimed directly at the problems of networked privacy. The proposal still needs to pass through the European Parliament and is subject to change, as the New York Times reports, but in its current form, the right to be forgotten would establish a broad right of individuals to force online businesses to remove all information about them, even information created and placed online by third parties.

Shortly after the European Commission announced its proposed privacy reform, the White House advanced a privacy initiative that, as Adam Thierer noted with dismay, would bring U.S. privacy policy closer to that of Europe.
The White House proposed a privacy bill of rights (“White House Privacy Bill of Rights”) that, in part, would require companies to “provide access, correction, deletion, or suppression capabilities to consumers.”[iv] Like the right to be forgotten, this could let users change their minds and remove previously shared data, or restrict their data from being further disclosed.[v]

Independently of the White House’s privacy initiative, the Federal Trade Commission (FTC) issued its final report on privacy recommendations (“FTC Privacy Report”) in March, responding to comments on its 2010 preliminary privacy report, according to the Electronic Frontier Foundation. The FTC Privacy Report explicitly invoked the European proposal, saying “there are aspects of the proposed ‘right to be forgotten’ in the [FTC’s proposed] final framework, which . . . allow[s] consumers to access their data and in appropriate cases suppress or delete it.”[vi] Additionally, the FTC endorsed further “exploration of the idea of an ‘eraser button’” introduced in the proposed Do Not Track Kids Act of 2011.[vii]

Privacy and the First Amendment

As the White House and FTC recommendations show, the right to be forgotten (or an eraser button or other form of data suppression) is not an unthinkable measure to protect privacy in the United States. However, it would have to overcome the powerful objection that it conflicts with the freedom of speech. Jeffrey Rosen, writing in the Stanford Law Review Online, articulates this concern in his analysis of the right to be forgotten, which he interprets to give people a right to request takedowns of information originally posted by someone else. In the U.S. context, giving an individual the power to force the removal of information posted by someone else can be and has been characterized as a violation of First Amendment rights.
Eugene Volokh exemplifies this position, arguing that information privacy rules are likely unconstitutional because “the right to information privacy . . . is a right to have the government stop you from speaking about me.”[viii]

However, it is not clear that the privacy skeptics are right as a doctrinal matter. Many things that would be called speech in the ordinary sense of the word are restricted without troubling the First Amendment. Frederick Schauer describes a substantial domain of speech that simply does not implicate the First Amendment at all. Insider trading regulations, professional regulation, and sexual harassment laws all constrain how people talk, yet courts have rejected or refused to entertain free speech arguments against them.

Erasure rights remain distinct in two ways: they give power over speech to another individual, and speech that was once protected can become unprotected. These features are pronounced in erasure rights, but they have analogues in other regulations that restrain speech. In Harris v. Forklift Systems, Inc., the Supreme Court clarified that hostile work environment claims are conditioned, in part, on the subjective perception of the victim (and did so without mentioning the First Amendment, even though each party briefed the Court on the issue).[ix] In trademark law, the rights of the trademark holder wax and wane according to use and public perception. A descriptive mark can gain trademark protection if it acquires secondary meaning, and a trademark can lose protection through abandonment or through public usage of the mark as a generic term.[x]

While Volokh finds privacy protections most troublesome when they give me a “right to stop you from speaking about me,” the challenge of online privacy is that many perceived privacy violations emerge from precisely this situation—you talking about me.
Rumors about sexual activity can be posted to college gossip sites; photos may be uploaded to social networks against someone’s wishes;[xi] information about alcohol and drug use intended for friends may be seen by potential employers.[xii] This information may be true, providing the speaker with a defense against defamation. But the harm does not result from the falseness of the information; it results from placing the information in a visible context. If these privacy violations constitute a real problem, then individuals should have a remedy.

The Limits of Free Expression Exceptions to Erasure Rights in Data

The proposed right to be forgotten and the White House’s privacy bill of rights put forth broad exceptions to the individual’s general right to control information about them, including an exception based on the freedom of expression.[xiii] The coverage of the exceptions is uncertain, but the proposals and statements of policymakers indicate that the exceptions will be applied according to a distinction between old media and new media, which puts at risk a body of material whose importance is as yet unrealized.

Viviane Reding, Vice-President of the European Commission, stated that “the right to be forgotten cannot amount to a right of the total erasure of history,” and named newspaper archives as a place plainly safe from the right to be forgotten. This is an easy case. But if newspaper archives are destined to be preserved, then what is in danger of erasure? The apparent answer is the online spaces where young people express themselves. In both the United States and Europe, policymakers are moved by the fear that young people are disclosing information about themselves online that they will later regret.
The text of the proposed right to be forgotten says it has special force with regard to “personal data which are made available by the data subject while he or she was a child.”[xiv] The FTC Privacy Report noted that an eraser button would be most appealing to protect “[t]eens [who] . . . may voluntarily disclose more information online than they should.”[xv] If youthful oversharing is the paradigmatic concern animating erasure rights, then erasure rights will target social networks.

Although social media and similar forums are sometimes thought of as comprising mere idle chatter,[xvi] these networks are used by many people for many purposes. A facile distinction along new media/old media lines could end up suppressing information whose worth is only beginning to be understood. At least four valuable purposes can be identified. The first two, political activism and citizen journalism, have been closely linked in recent political movements: in the Arab Spring and the Occupy Wall Street protests, activists used social media to coordinate activity, persuade audiences, and document events.[xvii] Third, user-generated content also has value as individual expression (a concern that is well attended to by Volokh, among others). Fourth, and least appreciated, user-generated content can be rich source material for researchers.[xviii] Information that initially seems of little significance to anyone but the speaker can nonetheless yield valuable insights. Researchers at OkCupid, a dating site, analyzed the responses of its members to initial contacts within its community and concluded that its members responded to black women at lower rates than to other women at the same level of compatibility.[xix] The Library of Congress acquired the entire Twitter archive in 2010 to ensure its preservation for researchers, and the New York Times reported on a linguist making use of Twitter data to analyze regionalisms in U.S. English.
Two Partial Solutions

There are two ways to preserve some of the value of this material while allowing its suppression on privacy grounds. The first is data anonymization. The second is the archival solution: the release of suppressed information after a closed period. If the balance between privacy and free speech seems a close call, then these solutions may offer a useful compromise that makes privacy-protective measures more appealing. If an eraser button is an undesirable policy outcome, then these solutions should be implemented as a backstop to curb potential abuse of the erasure right.

Jane Yakowitz has argued persuasively that by attacking data anonymization, privacy advocates threaten to forestall valuable research.[xx] Not only is data anonymization effective, but the risk of reidentification of data subjects is smaller than other privacy risks we tolerate.[xxi] She discusses numerous examples of research questions answered by taking advantage of anonymized data: patterns of racial segregation in housing, the effects of smoking on fetuses, and possible discrimination in the allocation of police resources on a neighborhood-by-neighborhood basis.[xxii]

But some information is valuable for its individually identifiable content, not its anonymous place in an aggregate of data.[xxiii] Can we extend Yakowitz’s intuition, that preserving privacy need not require sacrificing the full data record, to the case of individually identifiable data? We can, if we introduce a temporal dimension to the privacy interest at stake. The informational privacy models of the right to be forgotten and the eraser button recommend making data vanish from the universe if it harms privacy. But we see a very different approach to privacy if we look at archives, institutions that have long wrestled with the balance between privacy and openness in their preservation of our collective cultural output.
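To make the first partial solution concrete, the sketch below illustrates one common anonymization technique, k-anonymity-style generalization. It is a hypothetical example, not drawn from Yakowitz or any source cited here; the field names and bucketing choices are assumptions. Quasi-identifiers such as age and ZIP code are coarsened until each record is indistinguishable from at least k − 1 others, so aggregate research questions (say, smoking rates by neighborhood) can still be answered without identifying anyone.

```python
from collections import Counter

def generalize(record):
    """Coarsen quasi-identifiers: bucket age into decades, truncate the ZIP code."""
    return (record["age"] // 10 * 10, record["zip"][:3] + "**")

def is_k_anonymous(records, k):
    """True if every combination of generalized quasi-identifiers
    appears at least k times in the dataset."""
    counts = Counter(generalize(r) for r in records)
    return all(n >= k for n in counts.values())

def anonymize(records):
    """Drop direct identifiers, keeping the data useful in aggregate."""
    return [
        {"age_range": f"{age}-{age + 9}", "zip_prefix": zp, "smoker": r["smoker"]}
        for r, (age, zp) in ((r, generalize(r)) for r in records)
    ]

# Hypothetical records; names and values are invented for illustration.
patients = [
    {"name": "Alice", "age": 34, "zip": "02138", "smoker": True},
    {"name": "Bob",   "age": 36, "zip": "02139", "smoker": False},
    {"name": "Carol", "age": 31, "zip": "02134", "smoker": True},
]

print(is_k_anonymous(patients, 2))  # True: all three fall in bucket (30, "021**")
print(anonymize(patients))
```

The research value survives (one can still compute smoking rates by age range and area), which is the trade Yakowitz argues we should be willing to make.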
The code of ethics of the Society of American Archivists (SAA) indicates that, when archivists are confronted with material that implicates a privacy interest, the appropriate response is to restrict access to and use of the material, not destroy it. From this perspective, the tendency of policymakers to advocate for the disappearance of data in a digital “poof” is a needlessly extreme solution. We should restrict data to the extent necessary to protect privacy, but no further. The archival solution would build on this premise and extend it in a temporal dimension. Instead of an eraser button, it would be a “hide for a limited time” button. Data that would otherwise be forgotten or deleted could rejoin the public record once it no longer poses a risk to privacy.

There is no precise way to determine when the harm to privacy ceases, but current practices provide us with some guideposts. U.S. census records are closed for seventy-two years.[xxiv] The Code of Federal Regulations (CFR) states that documents relating to human intelligence or the design of weapons of mass destruction will be classified for a default of seventy-five years. But restrictions need not be so long. The CFR calls for most classified government material to be declassified at “the lapse of the information’s national security sensitivity,” and in any event within twenty-five years of its date of classification. An even shorter period may be appropriate in the case of data suppression, since the suppression is so contested and the information has already been public once.

All of this presumes that such information has any value at all. This is the reverse of the problem facing erasure rights, which have to overcome the objection that what is being said is too valuable to be suppressed. Here, the objection is that what was said was valueless, and if it was suppressed there is no need to restore it. What the value of any suppressed data will be is uncertain, and would depend on how the erasure rights were used.
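The “hide for a limited time” button described above can be sketched as a simple data structure. This is a hypothetical illustration, not a design from any cited source: each suppressed record carries a release date (the default closed period here is modeled on the seventy-two-year census restriction, with a shorter period available, echoing the CFR's twenty-five-year declassification ceiling), and queries of the public record return only material whose restriction has lapsed. Nothing is ever deleted.

```python
from datetime import date

# Default closed period, modeled on the seventy-two-year U.S. census restriction.
DEFAULT_CLOSURE_YEARS = 72

class SuppressedRecord:
    """A record that is hidden rather than erased: it rejoins the
    public record once its closed period lapses."""

    def __init__(self, content, suppressed_on, closure_years=DEFAULT_CLOSURE_YEARS):
        self.content = content
        self.release_date = suppressed_on.replace(
            year=suppressed_on.year + closure_years
        )

    def visible(self, today):
        return today >= self.release_date

# Hypothetical archive entries for illustration.
archive = [
    SuppressedRecord("1940 census entry", date(1940, 4, 1)),
    SuppressedRecord("2012 social media post", date(2012, 4, 1), closure_years=25),
]

def public_record(archive, today):
    """Everything whose restriction has lapsed; suppressed items remain stored."""
    return [r.content for r in archive if r.visible(today)]

print(public_record(archive, date(2012, 4, 2)))  # ['1940 census entry']
```

The design choice mirrors the archival premise: the privacy remedy is a restriction on access with an expiration, not destruction of the underlying record.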
But historians and other scholars routinely find unanticipated uses for data,[xxv] and some information remains in demand decades (or longer) after its collection. The Associated Press reports that when the 1940 U.S. census records, “rich with long-veiled personal details,” were recently placed online after the usual seventy-two-year confidentiality restriction expired, the site nearly froze as it received thirty-seven million hits in a matter of hours.

Conclusion

The First Amendment provides some free speech backstop against privacy protections, but it is not clear where that backstop lies. It may block the current crop of proposed remedies to the problem of networked privacy, in which case new solutions must be developed. If informational privacy does earn greater protection, it should be structured in a way that preserves the social benefits of the free flow of information as much as possible. Limiting any constraints on information flow to a finite period of time is one means to preserve the social value of the information. Casting information into oblivion goes beyond restoring privacy to the level of protection that existed before so much personal information was made available on the Internet. Building on practices already developed by archives, we can ensure that information, even information harmful to privacy, is merely restricted rather than lost, and that it returns to the public record once any privacy harm has dissipated.

[i] Elizabeth Jelin, The Politics of Memory: The Human Rights Movement and the Construction of Democracy in Argentina, 21 Latin Am. Persps. 38, 53 (1994), available at (discussing the relationship between memory and democracy in post-dictatorship Argentina).
[ii] Daniel Solove, Privacy and Power: Computer Databases and Metaphors for Information Privacy, 53 Stan. L. Rev. 1393, 1398 (2001).
[iii] See Solove, supra note 2, at 1416–18, 1426.
[iv] White House Privacy Bill of Rights at 19.
[vi] FTC Privacy Report at 24.
[vii] Id. at 70.
[viii] Eugene Volokh, Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop People From Speaking About You, 52 Stan. L. Rev. 1049, 1050–51 (2000).
[ix] 510 U.S. 17 (1993); Brief for Respondent at 31–33, Harris v. Forklift Systems, Inc., 510 U.S. 17 (1993) (No. 92-1168); Reply Brief of Petitioner at 10–11, Harris v. Forklift Systems, Inc., 510 U.S. 17 (1993); Frederick Schauer, The Boundaries of the First Amendment: A Preliminary Exploration of Constitutional Salience, 117 Harv. L. Rev. 1765, 1783 n.95 (2004).
[x] J. Thomas McCarthy, 2–3 McCarthy on Trademarks and Unfair Competition §§ 11.15, 17.8–17.9 (4th ed. 2012), available at Westlaw.
[xi] See James Grimmelmann, Saving Facebook, 94 Iowa L. Rev. 1137, 1171–72 (2009).
[xii] See id. at 1165.
[xiii] White House Privacy Bill of Rights at 48; Proposal for a Regulation of The European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation), at 52, COM (2012) 11 final (Jan. 25, 2012) [hereinafter Proposed EU Regulation].
[xiv] Proposed EU Regulation at 51.
[xv] FTC Privacy Report at 70.
[xvi] See, e.g., Ivor Tossell, Why Some Ache to Tweet, and Others Couldn’t Care Less, Globe and Mail (Sept. 13, 2011, 10:14 AM), (discussing the conception that “Twitter is a place to find out what people had for lunch”).
[xvii] See, e.g., Phoebe Connelly, Curating the Revolution: Building a Real-Time News Feed about Egypt, Atlantic (Feb. 10, 2011, 10:28 AM), (discussing NPR reporter Andy Carvin’s curation of social media to analyze the Egyptian revolution); Jay Rosen, Occupy PressThink: Tim Pool, PressThink (Nov. 20, 2011, 1:26 AM), (discussing use of Ustream to broadcast Occupy Wall Street).
[xviii] See Jane Yakowitz, Tragedy of the Data Commons, 25 Harv. J.L. & Tech. 1 (2011).
[xix] Id. at 11–12.
[xx] Id. at 3–4; see also Derek Bambauer, The Myth of Perfection, 2 Wake Forest L. Rev. 22, 24–25 (2012), available at
[xxi] Yakowitz, supra note 18, at 39–42.
[xxii] Id. at 10 (collecting examples).
[xxiii] Heather MacNeil, Without Consent: The Ethics of Disclosing Personal Information in Public Archives 136–37 (1992).
[xxiv] See 1940 US Census Viewable Online after Near Freeze, Wall Street Journal (Apr. 2, 2012, 5:27 PM),
[xxv] In the early modern period, commonplace books—private scrapbooks of quotations and aphorisms—were widely used, and these private documents have become objects of historical study. See, e.g., Anthony Grafton, The Republic of Letters in the American Colonies: Francis Daniel Pastorius Makes a Notebook, 117 Am. Hist. Rev. 1, 6 (2012) (discussing varied approaches to studying the commonplace books of a Pennsylvania lawyer in the late seventeenth and early eighteenth century). Some commentators have characterized Tumblr blogs as the contemporary equivalent of commonplace books. See, e.g., Shaj Mathew, Tumblr as a Commonplace Book, The Millions (Mar. 21, 2012, 6:00 AM),