As Section 230 Debate Continues, Congress Finds Reform Opportunities in Child Online Safety Statutes

Two weeks ago, the Supreme Court heard oral argument in Gonzalez v. Google, a case out of the Ninth Circuit involving a tort claim against Google for algorithmically promoting ISIS-produced content. Much of the argument centered on Section 230 of the Communications Decency Act (“CDA”), 47 U.S.C. § 230, a 1996 statute designed to shield tech companies from liability for third-party content hosted on their platforms. Needless to say, the implications of Gonzalez for the tech industry are significant––and the technical details complex. For nearly three hours, the Court struggled to draw a line between so-called “neutral” algorithmic tools and the targeted recommendations those tools generate.

In the midst of questioning, Justice Elena Kagan paused to wonder how well-equipped the Court actually is to address the issue at hand. As she observed, “These are not, like, the nine greatest experts on the [I]nternet.” Even if Section 230 requires reform, suggested Justice Kagan, the legislature might be better suited to address this problem.

Whether or not Justice Kagan is correct, one thing is certain: as cases like Gonzalez spark debate about the utility of Section 230 in the modern Internet and social media era, Congress has shown considerable bipartisan interest in regulating big tech. For the moment, child online safety statutes appear to be among the highest-profile of these efforts.

Since last year, senators have proposed several pieces of legislation designed to protect children’s safety online, with support from lawmakers on both sides of the political aisle. Discussion of these bills and the issues they seek to address has raised important questions about the future of Section 230 and of Internet norms more broadly.

Protecting children online is not a new issue in tech regulation. However, legislators have frequently run into a range of problems––legal and practical––that limit the effectiveness of the laws they pass.

In the mid-1990s, concerned about the exposure of minors to explicit content online, lawmakers passed the CDA. The statute made it a crime to knowingly transmit or display sexually explicit material on the Internet to anyone under the age of 18. The Supreme Court, however, later struck down most of the statute as unconstitutional: though the CDA sought to protect children from explicit content, the Court held, the statute violated the First Amendment’s protection of free speech for adults. Reno v. ACLU, 521 U.S. 844, 849 (1997). Today, only one part of the CDA remains: Section 230, which has since come to protect the very kind of content that the CDA sought to restrict.

Shortly thereafter, in 1998, Congress passed the Children’s Online Privacy Protection Act (“COPPA”), 15 U.S.C. §§ 6501–6506, which limits the collection and retention of personal data from children under 13 years old. In practice, though, COPPA has proved difficult to enforce in the United States: lacking a more reliable age-verification mechanism, websites typically ask users to self-report their ages. Manifestly ineffective as that method is, more stringent solutions raise privacy and data collection concerns of their own. For instance, the United Kingdom recently sought to enact its own child online safety legislation––the Online Safety Bill––which would require users to supply credit card or passport information in order to verify their age on adult content websites.

So, as the new Congress turns its attention back to making the Internet a safe place for children, it will have to contend with these persistent normative and practical limits on the effectiveness of tech regulation. A number of bills first introduced in the Senate during the previous session are set to be reintroduced.

The Kids Online Safety Act (“KOSA”), S. 3663, 117th Cong. (2022), sponsored by Sen. Marsha Blackburn (R-TN) and Sen. Richard Blumenthal (D-CT), is among the most prominent of these bills. Designed to impose a standard of care on certain “covered platforms” toward their underage users, KOSA would require tech companies falling within that definition to “prevent and mitigate” harms to minors arising on their platforms, including mental health disorders, addiction, physical violence, and sexual exploitation.

If enacted, KOSA would quite likely implicate current Section 230 protections. The last time Congress made a similar effort was in passing FOSTA/SESTA, an explicit amendment to Section 230 designed to hold tech platforms accountable for online content promoting or facilitating sex trafficking.

However, since its enactment in 2018, FOSTA/SESTA has become a cautionary tale about the unintended consequences of abridging Internet speech. A 2020 report suggested that the imposition of new liability on tech companies led platforms to preemptively crack down on all sex-related content––making the work of legal adult sex workers more difficult and dangerous. At the same time, a recent report from the U.S. Government Accountability Office (“GAO”) shows that FOSTA/SESTA is almost never used to prosecute sex trafficking charges.

With this experience still in the rearview mirror, Congress is well aware of the dangers of an ill-considered amendment to Section 230. During a recent Senate Judiciary Committee hearing on children’s safety online, lawmakers voiced differing concerns about the implications of new children’s safety legislation for Section 230 immunity. Sen. Sheldon Whitehouse (D-RI), for instance, expressed a desire to see class action lawsuits against tech companies proceed rather than be dismissed on immunity grounds. Sen. Mazie Hirono (D-HI), on the other hand, cautioned that significantly altering Section 230 to allow such lawsuits would be ill-advised.

The sponsors of KOSA themselves seem leery of repeating the mistakes of FOSTA/SESTA. Notably, the text of KOSA does not mention Section 230 at all. Rather, the bill imposes only a “duty of care” requirement that tech companies “shall act in the best interests” of minors using their platforms.

While this strategic drafting may not send tech companies on another liability-quashing campaign, it is much less clear how such a statute would affect existing Section 230 protections. Assuming KOSA passes with this language unchanged, its practical impact may well become a question of statutory interpretation. Thus, even if the Supreme Court decides in Gonzalez to leave the amendment of Section 230 to Congress, the nine justices may soon find that Congress has put the ball right back in their court.