A federal judge in Oakland has ruled that hundreds of school districts and governmental units can continue their lawsuit alleging social media companies designed their platforms to be addictive to younger users, with the resulting fallout placing a heavy burden on educators and administrators.
U.S. District Judge Yvonne Gonzalez Rogers filed an opinion Oct. 24 that partially denied a motion to dismiss “the consolidated claims of hundreds of actions brought on behalf of children and adolescents, school districts and local government entities” against Meta, which operates Facebook and Instagram; Snap, which operates Snapchat; YouTube owner Google; and ByteDance, the parent company of TikTok.
According to Rogers, her opinion addressed only threshold questions relating to negligence and public nuisance claims. She said she only considered the substantive elements of the negligence claim and pledged a separate order on the merits of the public nuisance claim.
In arguing for dismissal, the tech companies insisted the claimed injuries are legally too far removed from the allegedly illegal conduct. While Rogers said she mostly disagrees with that position, she did note the First Amendment and Section 230 of the federal Communications Act “impose a fairly significant limitation” on recovery theories and also said “certain sets of plaintiffs’ allegations involve the non-foreseeable intervening conduct of third parties that breaks the chain of legally attributable causation between the plaintiffs and defendants.”
Rogers’ ruling came nine days after she refused to dismiss a similar complaint from more than 30 state attorneys general. It also lands while a consolidated action from thousands of California school districts proceeds through a state court in Los Angeles.
School districts across the country have signed on to other lawsuits making the same general allegations: that social media companies knew or should have known their products were addictive, particularly to younger users, and failed to issue proper warnings, resulting in societal damages that fall on public entities to manage.
The plaintiffs filed the current version of their amended complaint in March, incorporating allegations under laws of 19 states. The school districts say they “diverted resources to manage the impact of their students’ compulsive use and accompanying mental health issues,” Rogers wrote. She quoted the complaint as alleging student overuse “results in significant disruption to schools’ operations” and “greatly frustrates their ability to achieve their mandate of educating students in a safe and healthy environment.”
Schools also alleged the companies directly target their student populations, citing as evidence corporate presentations that address the percentage of a student body using particular social media platforms and how that affects the amount of time each user spends on their account; geolocation used to identify specific schools; how the school day interrupts platform use; and the classroom impact of certain app features.
Rogers said an early, platform-specific assessment of each defendant’s conduct and features, which arose during proceedings for the personal injury plaintiffs, also applies to the consolidated school district lawsuit. She also said the defendants’ arguments that the claims are too derivative of third-party harms ultimately “are fact-intensive questions which do not counsel dismissal at this early stage.”
As one example, Rogers wrote, “the school districts allege they have had to hire mental health personnel and develop further mental health resources to mitigate the negative in-school consequences of their students’ addiction to and compulsive use of defendants’ platforms.” She said those allegations are distinct from what the individual students could claim in their own litigation.
Whether the tech companies could have foreseen the issues the school districts alleged they encountered “is bolstered by allegations of defendants’ own knowledge,” Rogers wrote. As an example, she added: “On the one hand, many of TikTok’s and Snap’s filters may facilitate entirely innocuous challenges and thus may not be the proximate cause of any harm; on the other, some filters may foreseeably facilitate dangerous challenges. This is a question of fact, which depends largely on the foreseeable uses of filters developed by TikTok or Snap. Plaintiffs’ allegations, while perhaps deliberately generalized, are plausible and so survive defendants’ motion to dismiss to the extent that any filters created by defendants generated specific dangerous challenges.”
She stressed that the core theory of the allegations is “the impact of compulsive use itself,” not the specifics of third-party content, and therefore not subject to protections for publishing activity and certain speech. Even if the platforms were launched in good faith, the plaintiffs allege the companies gradually became more sophisticated in attracting and keeping minors as users, and those allegations include what the companies actually knew about user behavior.
Rogers reviewed the economic loss laws of the affected states and said the defendants did not successfully argue that those laws bar the negligence claims.
Meta and ByteDance did not respond to a request for comment from The Record.
Google responded with an emailed statement.
“Providing young people with a safer, healthier experience has always been core to our work,” said José Castañeda, Google spokesperson. “In collaboration with youth, mental health and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls. The allegations in these complaints are simply not true.”