The Guardian - US
Technology
Katie McQue

Meta ordered to pay $375m after being found liable in child exploitation case

Mark Zuckerberg, the Meta chief executive. Photograph: Kyle Grillot/Bloomberg via Getty Images

A New Mexico jury on Tuesday ordered Meta to pay $375m in civil penalties after it found the company misled consumers about the safety of its platforms and enabled harm, including child sexual exploitation, against its users.

This is the first jury trial to find Meta liable for acts committed on its platform.

“The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” said New Mexico attorney general Raúl Torrez.

“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

The lawsuit was brought by Torrez’s office in December 2023. The lawsuit followed a two-year Guardian investigation published in April of that year revealing how Facebook and Instagram had become marketplaces for child sex trafficking. That investigation was cited several times in the complaint.

The jury ordered Meta to pay the maximum penalty under the law of $5,000 per violation, totaling $375m in civil penalties for violating New Mexico’s consumer protection laws. The jury found Meta liable for both claims brought by the state of New Mexico under the Unfair Practices Act.

Meta has said it will appeal the ruling, and accused Torrez of making “sensationalist, irrelevant arguments by cherrypicking select documents”.

“We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” said a Meta spokesperson. “We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”

Internal Meta documents and testimony obtained by the New Mexico department of justice during the litigation revealed that both company employees and external child safety experts repeatedly warned about risks and harmful conditions on Meta’s platforms.

Evidence presented to the jury included details of the 2024 arrest of three men charged with sexually preying on children through Meta’s platforms, and attempting to meet up with them. This was part of a sting investigation operated by undercover agents and dubbed “Operation MetaPhile” by the attorney general’s office.

The New Mexico court heard how Meta’s 2023 decision to encrypt Facebook Messenger – its direct messaging platform, which predators have used as a tool to groom minors and exchange child abuse imagery – blocked access to crucial evidence of these crimes.

Witnesses from law enforcement and the National Center for Missing and Exploited Children (NCMEC) testified about deficiencies in Meta’s reporting of crimes taking place on its platforms, including the exchange of child sexual abuse material (CSAM). Meta has generated high volumes of “junk” reports by overly relying on AI to moderate its platforms, investigators said. These reports were useless to law enforcement, and meant crimes could not be investigated, they said.

In the next phase of the legal proceedings, due to begin on 4 May, the attorney general’s office will seek additional financial penalties and court-mandated changes to Meta’s platforms that “offer stronger protections for children”, said Torrez.

The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

In taped depositions played at the trial, Meta chief executive Mark Zuckerberg and Instagram head Adam Mosseri said harms to children, such as sexual exploitation and detriments to mental health, were inevitable on the company’s platforms due to their vast user bases. Company executives also testified that the company has invested billions in technology updates to keep children safe on its platforms. These include Instagram Teen Accounts, which debuted in 2024 and set default protections for users aged between 13 and 17.

Social media companies have long maintained they are not responsible for crimes committed via their networks because of a US federal law that generally protects platforms from legal liability for content created by their users: section 230 of the Communications Decency Act. Meta’s attempts to invoke section 230 and the first amendment to get the case dismissed were denied in a judge’s ruling in June 2024, due to the lawsuit’s focus on Meta’s platform product design and other non-speech issues, such as internal decisions about content and curation.

The trial lasted almost seven weeks, with both the company and the state calling witnesses that ranged from child safety experts to current and former employees of the company. The jury deliberated its verdict for about one day.

“It’s a huge win for the New Mexico attorney general. His jury didn’t even deliberate very long,” former New Mexico deputy district attorney and current criminal defense lawyer John W Day told the Guardian.

“This wasn’t surprising, as there’s an undercurrent of resentment and fear and concern among not just families but the community in general, about the invasiveness of social media, and this one certainly opens the floodgates to lots of other litigation and reforms and regulation.”

Meta is also the subject of a separate lawsuit in Los Angeles, as hundreds of families and school districts accuse several big tech platforms of harming children. Plaintiffs in this case allege that Meta, along with Snap, TikTok and YouTube, knowingly designed their platforms to be addictive for young users, contributing to issues such as depression, eating disorders, self-harm and other mental health challenges.

Snap and TikTok have reached settlements, while Meta and YouTube continue to contest the claims in court. All companies deny wrongdoing. The jury is currently deliberating a verdict.

• This article was amended on 25 March 2026. The New Mexico case was the first jury trial to find Meta liable for acts committed on its platform, not the “first bench [juryless] trial” to do so, as an earlier version said.
