Lemmon v. Snap, Inc.: Ninth Circuit Chips Away at Tech Companies’ Section 230 Immunity

This summer, Snapchat removed its built-in speedometer, known as the “Speed Filter,” from the platform. A recent Ninth Circuit ruling was likely a major reason why, and the court’s decision could have significant implications for the broad immunity from civil liability traditionally enjoyed by internet companies.

Lemmon v. Snap, Inc. was filed in the aftermath of a fiery high-speed car crash that killed three young men in 2017. The accident occurred as the boys sped down a Wisconsin road at over 100 mph while documenting their speed with Snapchat’s Speed Filter. The parents of the two passengers brought a negligent design lawsuit, claiming that the interplay between the Speed Filter and Snapchat’s reward system of “trophies, streaks, and social recognitions” constituted a design defect that caused their sons’ deaths by encouraging dangerous high-speed driving.

The district court dismissed the action as barred by Section 230 of the Communications Decency Act. Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Described by one scholar as “the twenty-six words that created the internet,” Section 230, its supporters argue, has shaped the largely user-generated internet we know today by shielding internet platforms from most forms of civil liability for the content their users post. Without it, the argument goes, internet companies would be forced to censor user content excessively out of fear of lawsuits, imposing prohibitively high content moderation costs on both established companies and innovative startups.

However, this liability shield is not absolute. Section 230 contains various carve-outs for violations of federal criminal, privacy, sex-trafficking, and intellectual property laws. Still, courts have largely held that internet companies are immune from liability when they host, archive, advertise, or fail to remove harmful third-party content.

Lemmon v. Snap, Inc. raises the less-settled question of how Section 230 should apply where the alleged harm results from something other than third-party user content. Here, that harm was a car accident allegedly caused by the Speed Filter’s influence on adolescent drivers rather than by the Snap message the boys sent before their deaths. The question for the court, therefore, was whether the plaintiffs’ claim nevertheless treated Snap as the publisher or speaker of the boys’ messages. The Ninth Circuit answered no, holding that Section 230 did not bar the claim and reversing the district court’s grant of Snap’s motion to dismiss.

The court reasoned that the plaintiffs’ claim did not treat Snap as a “publisher or speaker” of the boys’ Snap messages, but rather as the designer of an unsafe product, one whose speedometer content Snap itself created. The court pointed to its 2008 decision in Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, which held that the company was not immune under Section 230 for a company-created survey that collected data enabling illegal housing discrimination. Just as Roommates.com could not invoke Section 230 against alleged housing-law violations arising from its own content, the court reasoned, Snap could not invoke it against tort claims arising from content created by its Speed Filter. Moreover, the Ninth Circuit held, the plaintiffs did not allege a violation of any duty related to the content or publishing of the victims’ Snaps. Rather, they alleged a violation of Snap’s duty to design a reasonably safe product, a duty the court found to be independent of Snap’s role in monitoring and publishing third-party content.

One unanswered question is where courts will draw the line on what constitutes a product defect claim. Another is whether courts will be as deferential to plaintiffs’ pleadings as the Lemmon court was. Depending on how broadly courts interpret the holding, it is not difficult to imagine Lemmon implicating common social media features such as hashtag or group recommendations, which at least partially involve company-created content with the potential to encourage dangerous or illegal behavior.

A clearer implication of Lemmon is that Snapchat and other platforms like Facebook and TikTok, which offer in-app filters and augmented reality tools, are on notice that Section 230 will not necessarily immunize them from liability for harmful actions encouraged by those products. Similarly, advertisers who sponsor filters on these platforms should exercise caution and consider the behaviors their filters might encourage.

The response to Lemmon v. Snap, Inc. has so far been mixed. The Electronic Frontier Foundation, a prominent supporter of Section 230, praised the court’s decision. By contrast, Professor Eric Goldman, another influential Section 230 proponent, criticized the decision as “confusing” and questioned the wisdom of narrowing Section 230 immunity in a case he predicts is unlikely to succeed on the merits and will ultimately have limited impact.

Yet Snapchat’s decision to remove a feature linked to multiple deaths from a platform widely used by teenagers is a reminder that the courts’ Section 230 jurisprudence is about more than just the ability to recover damages from internet companies. A plaintiff’s ability to survive a motion to dismiss can shape company behavior by opening the door not only to the threat of an eventual trial, but also to discovery that could subject tech company products and practices to greater public scrutiny. As Congress and the courts continue to grapple with the future of Section 230, the Ninth Circuit’s ruling in Lemmon v. Snap, Inc. represents, at a minimum, another potential avenue for plaintiffs to chip away at social media companies’ most powerful protection against civil liability.