Jury: Meta harms kids’ mental health, safety
A New Mexico jury determined Tuesday that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its social media platforms, a verdict that signals a changing tide against tech companies and a growing governmental willingness to crack down.
The landmark decision comes after a nearly seven-week trial, and as jurors in a federal court in California have been sequestered in deliberations for more than a week about whether Meta and YouTube should be liable in a similar case.
New Mexico jurors sided with state prosecutors who argued that Meta — which owns Instagram, Facebook and WhatsApp — prioritized profits over safety, and violated parts of the state’s Unfair Practices Act.
The jury agreed with allegations that Meta made false or misleading statements, and also agreed that Meta engaged in “unconscionable” trade practices that unfairly took advantage of the vulnerabilities and inexperience of children.
Jurors found there were thousands of violations, each counting separately toward a total penalty of $375 million. That’s less than one-fifth of what prosecutors were seeking.
Meta is valued at about $1.5 trillion, and the company’s stock was up 5% in early after-hours trading following the verdict, a signal that shareholders were shrugging off the news.
Juror Linda Payton, 38, said the jury reached a compromise on the estimated number of teenagers affected by Meta’s platforms while opting for the maximum $5,000 penalty per violation. Each child, she said, was worth the maximum amount.
The social media conglomerate won’t be forced to change its practices right away. It will be up to a judge — not a jury — to determine whether Meta’s social media platforms created a public nuisance and whether the company should pay for public programs to address the harms. That second phase of the trial will happen in May.
A Meta spokesperson said the company disagrees with the verdict and will appeal.
“We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the spokesperson said. “We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”
Attorneys for Meta said the company discloses risks and makes efforts to weed out harmful content and experiences, while acknowledging that some bad material gets through its safety net.
New Mexico’s case was among the first to reach trial in a wave of litigation involving social media platforms and their impacts on children.
More than 40 state attorneys general have filed lawsuits against Meta, claiming it’s contributing to a mental health crisis among young people by deliberately designing Instagram and Facebook features that are addictive.
“Meta’s house of cards is beginning to fall,” said Sacha Haworth, executive director of watchdog group The Tech Oversight Project. “For years, it’s been glaringly obvious that Meta has failed to stop sexual predators from turning online interactions into real world harm.”