TikTok Faces Legal Battle: 13 US States Sue Over Allegations of Harming Children


TikTok, a popular social media platform with about 120 million active users in the United States, is facing a major legal challenge. On Tuesday, 13 US states and Washington, D.C. filed lawsuits against the platform, accusing it of harming young users and failing to implement adequate protective measures. The lawsuits claim that TikTok’s addictive features have a detrimental impact on children’s mental health and that the platform has been negligent in addressing these concerns. The legal filings are the latest in a series of lawsuits aimed at social media giants for their handling of younger audiences, highlighting growing concerns about the role of technology in children’s lives.

The lawsuits, led by 14 attorneys general, center around the notion that TikTok is exploiting vulnerable users, particularly children and adolescents, through features designed to keep them engaged for extended periods. These features include constant notifications that disrupt sleep patterns and the platform’s autoplay function, which automatically queues up videos, making it difficult for users to disengage. According to the lawsuits, the constant barrage of content can overwhelm young users, creating harmful impacts on their mental and emotional well-being.

A particularly alarming claim relates to TikTok’s role in promoting dangerous viral challenges. In some cases, children have sustained injuries or worse after participating in these challenges, which spread rapidly on the platform. These activities have raised serious questions about the adequacy of TikTok’s content moderation systems and its ability to protect its youngest users from harm.

In a statement issued in response to the lawsuits, TikTok firmly denied the allegations, calling them “inaccurate and misleading.” The platform emphasized its commitment to user safety, pointing out that it has implemented various features aimed at protecting minors. These include default screen time limits, family pairing options, and automatic privacy settings for users under 16 years old. TikTok also stated that it has worked closely with attorneys general and other stakeholders over the past two years to address industry-wide challenges.

Despite these efforts, TikTok expressed disappointment that the states have opted for legal action rather than continuing to work collaboratively on solutions. The company maintains that it has implemented “robust safeguards” and is proactively removing suspected underage users from the platform, countering the claims that it has been negligent in protecting children.

TikTok’s rise in popularity has been meteoric, with 1.6 billion active users worldwide as of July 2024. It is the fifth most popular social media platform globally, trailing only Facebook, YouTube, WhatsApp, and Instagram. In the United States, about a quarter of TikTok’s users fall between the ages of 10 and 19, making the platform particularly attractive to younger audiences. Furthermore, nearly half of its US user base is under the age of 30, highlighting the platform’s deep penetration into youth culture.

However, the amount of time young people spend on TikTok has raised alarm bells among parents, educators, and health professionals. In 2024, the average user spent 58.4 minutes per day on the platform, up from 55.4 minutes in 2023. Among users aged 18 to 24, that figure jumps to 76 minutes per day. The immersive nature of the platform, driven by its algorithm, creates an environment where young users can spend hours scrolling through videos.

According to the Office of the Attorney General in Washington, D.C., TikTok’s algorithm is particularly concerning because it capitalizes on the reward systems in adolescent brains. These young users are more susceptible to addictive behavior, as they lack the impulse control of adults. The algorithm continuously presents highly personalized content, generating a dopamine-driven cycle that keeps users hooked.

Numerous studies have pointed to the potential negative effects of social media use on mental health, particularly among adolescents. A Harvard study cited in the lawsuits also quantified TikTok's financial stake in young audiences, finding that the platform generated $2 billion in advertising revenue in 2022, with a significant portion of that income coming from ads targeted at teenagers aged 13 to 17. By 2023, the platform’s net advertising revenue had surged to $8.75 billion, indicating the scale of its influence among young audiences.

While TikTok is not the only platform facing scrutiny over its impact on youth, it has become a focal point for legal actions. In October 2023, Meta, the parent company of Facebook and Instagram, was sued by 33 US states for allegedly targeting children with addictive features. Meta’s platforms were accused of using psychologically manipulative tactics to increase the time younger users spent on their apps. Although Meta has implemented features to address these concerns, such as removing harmful content related to self-harm and eating disorders, the lawsuits underscore the broader challenges posed by social media platforms in safeguarding vulnerable users.

The recent lawsuits against TikTok go beyond the platform’s addictive qualities. They accuse TikTok of introducing two new features—TikTok LIVE and TikTok Coins—that pose additional risks to minors. TikTok LIVE, a live-streaming feature, allows users to interact in real-time, while TikTok Coins, a virtual currency system, enables users to purchase and send virtual gifts during live sessions. According to the lawsuit, these features have led to significant concerns over child exploitation and lax age verification measures.

The complaint asserts that TikTok’s age verification processes are not stringent enough, allowing minors to gain access to these features by falsifying their birth dates. The combination of real-time interaction and the ability to monetize content through virtual gifts has created opportunities for sexual exploitation and other harms to minors. The lawsuits argue that TikTok’s design choices have incentivized minors to engage in these harmful practices.

In a separate lawsuit filed by Texas Attorney General Ken Paxton in October 2024, TikTok was accused of violating the state’s Securing Children Online Through Parental Empowerment Act. Paxton argued that TikTok has failed to provide adequate tools for parents to manage and control their children’s privacy settings, making it difficult for guardians to ensure their children’s online safety. While TikTok has reiterated its commitment to safeguarding minors through features like family pairing, the lawsuits indicate a growing dissatisfaction with the platform’s approach.

The lawsuits against TikTok have elicited a range of reactions both in the United States and abroad. In China, where TikTok’s parent company ByteDance is based, some commentators have supported the legal actions, viewing them as necessary to protect users’ rights. Others, however, have pointed out that similar scrutiny should be directed toward other platforms like Facebook and YouTube, which also feature short-form videos that can be highly addictive.

Medical researchers have long raised concerns about the overstimulation of young users’ brains due to excessive social media exposure. Studies show that overuse of platforms like TikTok can lead to sleep disturbances, problems with attention, and feelings of exclusion. These effects are particularly pronounced in adolescents, who are still developing emotionally and cognitively. Some experts have called for stricter regulation of social media platforms, advocating for more robust measures to protect children from the potentially harmful effects of online content.

The lawsuits against TikTok are part of a broader conversation about the responsibility of tech companies to protect vulnerable users, especially children. Social media platforms have become an integral part of modern life, providing users with unprecedented access to information and entertainment. However, the risks associated with these platforms—particularly for younger users—are becoming increasingly clear.

As technology continues to evolve, there is a growing need for companies to prioritize user safety over profit. The legal actions against TikTok, Meta, and other social media platforms highlight the urgent need for greater accountability in the tech industry. Governments, regulators, and tech companies must work together to develop solutions that protect children from harm while allowing them to benefit from the positive aspects of social media.

TikTok’s legal battle with 13 US states and Washington, D.C. represents a critical moment in the ongoing debate over the role of social media in children’s lives. While TikTok has taken steps to address concerns about user safety, the lawsuits underscore the need for more comprehensive solutions to the challenges posed by digital platforms. As the legal proceedings unfold, they will likely set important precedents for how tech companies are held accountable for protecting their youngest users, shaping the future of online safety for generations to come.
