AG Sues TikTok for Harming Children’s Mental Health

By Hank Russell

New York Attorney General Letitia James and California Attorney General Rob Bonta co-led a bipartisan coalition of 14 attorneys general in filing lawsuits against the social media platform TikTok for misleading the public about the safety of its platform and harming young people’s mental health. The lawsuits, filed individually by each member of the coalition, allege that TikTok violated state laws by falsely claiming its platform is safe for young people. In fact, many young users are struggling with poor mental health and body image issues due to the platform’s addictive features and are getting injured, hospitalized, or dying because of dangerous TikTok “challenges” that are created and promoted on the platform.

Some of James’ requests in the lawsuit include ordering the social media company to pay a civil penalty of $5,000 for each violation of state law, plus any punitive damages; requiring repayment of revenue from ads targeting New York State’s teen and pre-teen users “as a result of TikTok’s fraudulent, deceptive and illegal acts”; awarding the plaintiff costs of $2,000; and having TikTok cover any and all court costs.

“Young people are struggling with their mental health because of addictive social media platforms like TikTok,” James said. “TikTok claims that their platform is safe for young people, but that is far from true. In New York and across the country, young people have died or gotten injured doing dangerous TikTok challenges and many more are feeling more sad, anxious, and depressed because of TikTok’s addictive features. Today, we are suing TikTok to protect young people and help combat the nationwide youth mental health crisis. Kids and families across the country are desperate for help to address this crisis, and we are doing everything in our power to protect them.”

“Our investigation has revealed that TikTok cultivates social media addiction to boost corporate profits. TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content,” Bonta said. “When we look at the youth mental health crisis and the revenue machine TikTok has created, fueled by the time and attention of our young people, it’s devastatingly obvious: our children and teens never stood a chance against these social media behemoths. TikTok must be held accountable for the harms it created in taking away the time — and childhoods — of American children.”

According to the lawsuits filed by James and the bipartisan coalition, TikTok’s underlying business model allegedly focuses on maximizing young users’ time on the platform so the company can boost revenue from selling targeted ads. TikTok also allegedly uses an addictive content-recommendation system designed to keep minors on the platform as long as possible and as often as possible, despite the dangers of compulsive use.

According to the lawsuit, TikTok allegedly uses a variety of addictive features to keep users on its platform longer, which leads to poorer mental health outcomes. Multiple studies have found a link between excessive social media use, poor sleep quality, and poor mental health among young people. According to the U.S. Surgeon General, young people who spend more than three hours per day on social media face double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety.

Some of these addictive features include:

  • Around-the-clock notifications that can lead to poor sleep patterns for young users
  • Autoplay of an endless stream of videos, with no option to disable it, which manipulates users into compulsively spending more time on the platform
  • Attention-grabbing content that keeps young users on the platform longer
  • TikTok “stories” and TikTok live content that are available only temporarily, enticing users to tune in immediately or lose the opportunity to interact
  • A highlighted “likes” and comments section as a form of social validation, which can impact young users’ self-esteem
  • Beauty filters that alter one’s appearance and can lower young users’ self-esteem

Beauty filters have been especially harmful to young girls, with studies reporting that 50 percent of girls believe they do not look good without editing their features and 77 percent saying they try to change or hide at least one part of their body using these filters. Beauty filters can cause body image issues and encourage eating disorders, body dysmorphia, and other health-related problems.

TikTok challenges encourage users to perform certain activities, some of which have been harmful and sometimes deadly for young users, the lawsuit states. In one example, a 15-year-old boy died in Manhattan while “subway surfing,” a trend where people ride or “surf” on top of a moving subway car. After he passed away, his mother found videos on his TikTok account about subway surfing.

Another example of a dangerous TikTok challenge is the Kia Challenge, a series of videos showing users how to hack the ignition to start and steal Kia and Hyundai models, which has led to thousands of car thefts. In October 2022, four teenagers were killed in a car crash in Buffalo that police suspect was the result of the TikTok Kia Challenge. A Kia Forte was also stolen in New York City and crashed into a house in Greenwich, causing significant damage to both the car and the residence; the ignition was damaged in a manner consistent with the method shown in the Kia Challenge videos.

James and the other attorneys general also claim TikTok violates the Children’s Online Privacy Protection Act (COPPA), a federal law designed to protect children’s data on the internet. The lawsuits allege that TikTok actively collects and monetizes data on users under 13 years old without parental consent, in violation of COPPA. Researchers estimate that 35 percent of TikTok’s U.S. ad revenue is derived from children and teenagers. While TikTok claims that only users age 13 and older can access all of its features, TikTok’s deficient policies and practices have knowingly permitted children under the age of 13 to create and maintain accounts on the platform.

According to the lawsuits, TikTok falsely claims that its platform is safe for young users and misrepresents the effectiveness of its so-called safety tools intended to address some of these concerns. James’ lawsuit alleges that TikTok also violated New York’s consumer protection laws by misrepresenting its safety measures, including:

  • Misleading users about the 60-minute screen time limit it adopted to address concerns about compulsive use of its platform. TikTok deceptively advertised that teens can have a 60-minute screen time limit on the app; however, after using TikTok for 60 minutes, teens are simply prompted to enter a passcode to continue watching videos.
  • Falsely presenting the effectiveness of its “Refresh” and “Restricted Mode” features. TikTok claims that users can “Refresh” the content the recommendation system feeds them and that they can limit inappropriate content through “Restricted Mode.” However, those features do not work as TikTok claims.
  • Failing to warn young users about the dangers of its beauty filter.
  • Misrepresenting that its platform is not directed toward children. TikTok publicly claims that it is not for children under 13; however, the platform features child-directed subject matter, characters, activities, music, and other content, as well as advertisements directed at children.

Through these lawsuits, James and the bipartisan coalition of attorneys general are using state laws to stop TikTok from using these harmful and exploitative tactics. In addition, the lawsuits seek to impose financial penalties, including disgorgement of all profits resulting from the fraudulent and illegal practices, and to collect damages for users who have been harmed.

Long Island Life & Politics attempted to reach out to TikTok, but could not find any contact information for the business. However, the company issued press releases last month that they worked with the World Health Organization (WHO) to create “evidence-based content” and donated $3 million to “support WHO’s global work in de-stigmatizing mental health conditions and creating an informed, empathetic, and supportive online community.” In addition, TikTok announced last month their partnership with the Family Online Safety Institute to launch the Digital Safety Partnership for Families.