State AGs’ TikTok Investigation Part of Growing Concern over Social Media’s Adverse Effects on Youth

TikTok’s increasing popularity has made it a frequent news item, from its influence on fashion and dance trends to school vandalism and threats of violence. On March 2, it drew the attention of eight state attorneys general, who announced a joint investigation into how TikTok’s design, operation, and promotion affect the mental health of children and teens. The investigation will focus on TikTok’s efforts to boost user engagement and increase the amount of time users spend on the app.

This is not the only inquiry into TikTok’s impact on its users’ mental health. In September 2021, the Wall Street Journal published an investigative report that analyzed the content TikTok serves to its users, including those under the age of 13, by tracking what was served to dozens of fake accounts. It reported that TikTok’s recommendation algorithm is driven largely by how long users linger on each video, and that TikTok can quickly flood a feed with videos on any given topic, none of which are restricted by age. For example, one of the Journal’s accounts, registered as a 13-year-old, was served 569 videos about drug use, including “promotional videos for online sales of drug products.” A follow-up report focused on TikTok’s role in fostering eating disorders in young adults. The investigators encountered videos encouraging behaviors like eating fewer than 300 calories a day, consuming only water, or using laxatives after eating. These videos all violate TikTok’s rules but were not taken down before being served to the Wall Street Journal’s accounts.
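The dynamic the Journal describes, a feedback loop in which watch time alone steers the feed, can be illustrated with a toy simulation. The sketch below is an invented simplification (the `ToyLingerFeed` class and its update rule are assumptions for illustration, not TikTok’s actual code): each extra second a user spends on a topic raises the odds that topic is served again, so a feed can converge on a single subject in a small number of videos.

```python
import random
from collections import defaultdict

class ToyLingerFeed:
    """Toy recommender that weights topics by cumulative watch time.

    An invented illustration of the reported feedback loop,
    not TikTok's actual system.
    """

    def __init__(self, topics):
        # Start with a uniform interest weight for every topic.
        self.weights = {t: 1.0 for t in topics}

    def next_video(self):
        # Sample a topic in proportion to its current weight.
        topics = list(self.weights)
        return random.choices(
            topics, weights=[self.weights[t] for t in topics]
        )[0]

    def record_watch(self, topic, seconds_watched):
        # Lingering is the only signal: the longer the watch,
        # the more that topic's weight grows.
        self.weights[topic] += seconds_watched

if __name__ == "__main__":
    random.seed(0)
    feed = ToyLingerFeed(["cooking", "sports", "dieting", "music"])
    # Simulate a user who lingers on "dieting" videos (30 seconds)
    # and skips everything else after 1 second.
    served = defaultdict(int)
    for _ in range(200):
        topic = feed.next_video()
        served[topic] += 1
        feed.record_watch(topic, 30 if topic == "dieting" else 1)
    print(dict(served))  # the feed rapidly skews toward "dieting"
```

Even in this crude model, the served counts skew heavily toward the lingered-on topic within a few dozen videos, which mirrors how quickly the Journal’s fake accounts were funneled into narrow content niches.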

Former TikTok executives stated that the company employs algorithms and around 10,000 human moderators to take down videos that violate TikTok’s rules. But tens of thousands of videos are posted every minute, so human moderators focus only on the most popular content. The company has claimed it is testing adjustments to its recommendation algorithm to “avoid pushing too much content from a certain topic to individual users—such as extreme dieting, sadness or breakups—to protect [younger users’] mental well-being.” In response to the state AG investigation, a TikTok spokesperson wrote that the company limits some of the app’s features by age and enables users to “enjoy content based on age-appropriateness or family comfort.”

Investigations into TikTok are part of a broader wave of concern over social media’s adverse effects on young people. In May 2021, 44 attorneys general wrote an open letter to Facebook, petitioning it to cancel plans for a new version of Instagram aimed at children under 13. The attorneys general cited a concern for children’s well-being based on the “strong data and research that has shown a link between young people’s use of social media and an increase in mental distress, self-injurious behavior, and suicidality.” Subsequently, a whistleblower leaked internal documents to the Wall Street Journal that revealed Facebook knew about Instagram’s adverse mental health effects on teenagers. The company’s internal research showed that 32% of teen girls who felt bad about their bodies felt worse because of Instagram, that 13% of British and 6% of American users with suicidal thoughts traced those thoughts to Instagram, and that teens often attribute increased anxiety and depression to Instagram—a response that was “unprompted and consistent across all groups.” After the Wall Street Journal’s exposé, Facebook “paused” its plans to introduce the kids’ version of Instagram. The new AG investigation into TikTok echoes the concern that tech companies know—and ignore—the adverse effects their products have on children. As part of his announcement of the investigation, California Attorney General Rob Bonta stated that “[w]e know this takes a devastating toll on children’s mental health and well-being. But we don’t know what social media companies knew about these harms and when.”

Social media companies have not escaped federal scrutiny, either. In October 2021, the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security held a hearing on a proposed bill that would shield people under 16 from attention-capturing features like autoplay, push alerts, like buttons, and “influencer marketing” such as unboxing videos. Subcommittee members questioned representatives from Snapchat, TikTok, and YouTube. A more recent Senate proposal would create a duty for social media platforms to “prevent and mitigate harms to minors” and introduce more parental controls over social media usage. The White House, meanwhile, has promised $5 million toward researching the harms of social media and will create a new center within the Department of Health and Human Services responsible for research and education about “the full impact of adolescent social media use [and] especially the risks these services pose to their mental health.”

The impact of these overlapping government initiatives remains to be seen. However, given the mounting evidence of social media’s negative effects on young people, the pressure on social media companies to address those harms will only increase over time.