Meta Platforms Inc., Google, TikTok, and Snap Inc. are facing mounting legal challenges from U.S. school districts, which accuse the tech giants of contributing to a youth mental health crisis through the addictive design of their apps. A ruling by U.S. District Judge Yvonne Gonzalez Rogers on Thursday in Oakland, California, marks a critical juncture in the growing legal battle, setting the stage for the companies to defend themselves in federal court. The decision contrasts sharply with a June 7 ruling in Los Angeles Superior Court, where similar claims were dismissed, highlighting a split in judicial opinion.
The stakes are high: more than 150 cases are moving forward under Judge Rogers’ oversight, even as the tech companies could avoid liability in over 600 related cases filed in Los Angeles. The lawsuits stem from allegations that the platforms have exacerbated the mental health crisis among students by fostering addictive behavior through features such as algorithms and the “like” button.
The ruling in Oakland could have profound implications for the tech industry. Judge Rogers denied the companies’ request to dismiss the negligence claims, but she narrowed the scope of the allegations, finding that some remain barred by Section 230 of the Communications Decency Act (CDA). That law has long shielded internet platforms from liability for user-generated content, but it does not grant immunity for claims related to the design of their products. By allowing certain claims to proceed, the judge opened the door for school districts to argue that the companies deliberately designed their platforms to be addictive, leaving schools to bear the brunt of the ensuing mental health crisis.
This development comes on the heels of another significant ruling by Rogers, in which she allowed a lawsuit by dozens of state attorneys general to proceed against Meta. That lawsuit accuses Meta of knowingly exploiting children’s use of Facebook and Instagram for profit. Similar lawsuits have been filed against TikTok by a coalition of states, further intensifying the legal pressure on the social media giants. Meta and TikTok have both denied any wrongdoing.
At the core of the school districts’ claims is the contention that these companies knowingly designed their platforms to foster compulsive use among children and teens. Features such as “likes,” notification algorithms, and infinite scrolling allegedly encourage students to spend unhealthy amounts of time on social media, with damaging psychological effects. The legal theory recalls past litigation against the tobacco industry, in which cigarette manufacturers were held liable for designing products that caused addiction and broader societal harm.
In a statement, Google spokesperson Jose Castaneda rejected the allegations, stating, “In collaboration with youth, mental health, and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls.” Meta echoed a similar sentiment, emphasizing its efforts to create safer online environments for teens. “We’ve developed numerous tools to support parents and teens, and we recently announced that we’re significantly changing the Instagram experience for tens of millions of teens with new Teen Accounts, a protected experience for teens that automatically limits who can contact them and the content they see,” a Meta spokesperson said.
Despite these defenses, the plaintiffs, represented by lawyers Lexi Hazam and Previn Warren, view the ruling as a victory. They claim that the addictive nature of platforms like Instagram, Snapchat, TikTok, and YouTube has created a public health crisis, with schools bearing the responsibility of addressing the fallout. “Because of the addictive design of Instagram, Snapchat, TikTok, and YouTube, students are struggling. That means schools are struggling — their budgets are stretched, and their educational missions are diverted as they shoulder the added responsibility of supporting kids in crisis,” Hazam and Warren said in a statement.
The lawsuits against the tech giants come at a time of growing concern about the impact of social media on mental health, particularly among young people. Numerous studies have shown a link between excessive social media use and rising rates of depression, anxiety, and even suicide among teens. While the platforms claim to have implemented measures to safeguard their users, the plaintiffs argue that these efforts have been insufficient and are often undermined by the platforms’ core business models, which rely on keeping users engaged for as long as possible.
Judge Rogers’ ruling could pave the way for significant monetary damages, as the school districts are seeking to recover costs associated with addressing the mental health crisis they attribute to these platforms. Each district is asking for compensation for resources spent on counseling services, mental health programs, and other interventions aimed at mitigating the effects of social media addiction.
Moreover, the potential for widespread financial liability grows with every additional lawsuit. Hundreds of personal injury suits have also been filed against Meta, Google, TikTok, and Snap, with many accusing the companies of contributing to individual cases of psychological distress, self-harm, and even suicide among young users. While these cases focus on personal damages, the school districts’ claims raise the stakes, as they could lead to institutional-level payouts to cover the costs of managing the broader social impact of these platforms.
An important aspect of the legal battle revolves around the concept of public nuisance, a legal theory that has been used successfully in other industries. This strategy was notably employed against Juul, the manufacturer of nicotine vape pens, in lawsuits accusing the company of intentionally marketing its products to teenagers. School districts are hoping to apply the same strategy to social media companies by arguing that their platforms have caused widespread harm to public health, particularly the mental health of young people.
University of Florida law professor Clay Calvert points out that while the public nuisance argument worked in cases like Juul’s, it might face significant challenges when applied to social media companies. One major obstacle is the First Amendment, which protects free speech, including content posted on platforms like Facebook, Instagram, Snapchat, and YouTube. Unlike vape pens, which are tangible products with clear health risks, social media platforms are primarily vehicles for user-generated content, making it harder to draw direct parallels to traditional public nuisance cases.
Still, the prospect of large settlements, similar to those reached in the Juul litigation, has encouraged plaintiffs’ lawyers to pursue this avenue. “I think they see there’s potential out there from large settlements,” Calvert said, but he cautioned that the differences between vape pens and social media platforms could complicate matters.
As the legal battle unfolds, the tech companies will likely continue to deny the allegations, pointing to their efforts to improve safety and mental health features on their platforms. But the mounting lawsuits and intensifying public scrutiny suggest that these measures may not be enough to stave off further legal challenges or financial repercussions.
For Meta, Google, TikTok, and Snap, the ongoing litigation presents a threat not only to their financial stability but also to their reputations. The allegations of contributing to a mental health crisis among youth could damage their standing with parents, educators, and policymakers, potentially leading to regulatory changes that would force the companies to overhaul their business practices.
In addition to the lawsuits from school districts and states, federal regulators may soon get involved. The Federal Trade Commission (FTC) and other regulatory bodies have already been examining the role of social media in youth mental health, and the outcomes of these lawsuits could spur further action at the national level. This could include stricter regulations on how platforms can target young users, greater transparency in algorithmic design, and more robust age-verification measures to prevent underage users from accessing harmful content.