A jury in New Mexico has ordered Meta to pay $375 million in damages, the company's first legal defeat in a series of child safety trials expected across the United States in 2026. The jury found that the company misled users about the safety of its platforms and exposed minors to sexual exploitation and harmful content. The decision carries immediate consequences: it opens the door to further lawsuits and to potential regulatory changes affecting how social media platforms operate.
The verdict: misleading users and exposing minors
The ruling came from a state district court jury in Santa Fe after nearly seven weeks of trial. Jurors sided with New Mexico Attorney General Raúl Torrez, who filed the lawsuit in 2023, arguing that Meta failed to protect children on Facebook, Instagram and WhatsApp.
The jury found that Meta violated the state’s consumer protection laws, specifically the Unfair Practices Act, by making false or misleading statements about platform safety and engaging in “unconscionable” business practices that exploited children’s vulnerability and inexperience.
The case documented thousands of violations, each contributing to the total penalty of $375 million.
“A historic victory for children”
Attorney General Raúl Torrez described the outcome in stark terms:
“The jury’s verdict is a historic victory for every child and every family that has paid the price for Meta’s decision to put profits over children’s safety.”
He added:
“Meta executives knew their products harmed children, ignored warnings from their own employees, and lied to the public about what they knew.”
Torrez has already indicated that he will seek additional financial penalties and push for structural changes to Meta’s platforms, including stricter age verification and more aggressive removal of harmful actors.
Meta’s response: appeal and defense strategy
Meta rejected the verdict and confirmed it will appeal. A company spokesperson stated:
“We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”
The company has consistently denied allegations that its platforms are inherently harmful or intentionally designed to exploit users, especially minors.
The broader legal landscape: a turning point
This case is the first in a series of lawsuits brought by:
- state and federal authorities;
- school districts;
- thousands of families.
All of them aim to hold social media companies accountable for harm linked to their platforms, including:
- exposure to sexual predators;
- addictive design features;
- mental health impacts such as depression, eating disorders and suicide.
The verdict reflects a shift in public perception. While the $375 million award represents only a small fraction of Meta’s $201 billion in 2025 revenue, its symbolic weight is significant.
Inside the investigation: how the case was built
Prosecutors built their case by creating accounts that posed as minors on Meta’s platforms, documenting the sexual solicitations those accounts received and analyzing how the company’s response mechanisms handled them.
According to the prosecution, the company engineered its algorithms to maximize user engagement, even when aware of risks to children.
During the trial, prosecuting attorney Donald Migliori argued that Meta knowingly prioritized keeping young users online over protecting them from exploitation.
The Los Angeles case: addiction at the center
Attention now shifts to Los Angeles, where another major case is underway. A jury is deliberating whether Meta and YouTube designed their platforms to be addictive, particularly for young users.
At the center of the case is a 20-year-old plaintiff identified as “KGM,” whose lawsuit could influence thousands of similar claims.
The legal argument focuses on platform design and dopamine-driven engagement, drawing parallels with substance addiction.
Attorney Jayne Conroy, involved in the case, stated:
“With the social media case, we’re focused primarily on children and their developing brains and how addiction is such a threat to their well-being and … the harms that are caused to children – how much they’re watching and what kind of targeting is being done.”
She added:
“The medical science is not really all that different, surprisingly, from an opioid or a heroin addiction. We are all talking about the dopamine reaction.”
The regulatory implications: Section 230 under pressure
These lawsuits could challenge key legal protections for tech companies, including:
- the First Amendment shield;
- Section 230 of the Communications Decency Act.
Section 230 currently protects platforms from liability for user-generated content. A shift in its interpretation could fundamentally change the business model of social media companies.
A long legal road ahead
Despite the landmark verdict, the legal battle is far from over. Appeals, settlements and further trials could take years.
At the same time, regulation in the United States remains slow compared to Europe and Australia, where stricter rules on tech platforms are already in place.