On Tuesday, a New Mexico jury found that social media giant Meta violated state consumer protection laws and purposefully misled users about the safety of its platforms for minors. The verdict carries a civil penalty of $375 million, based on New Mexico's maximum statutory penalty of $5,000 per violation.
After a nearly seven-week trial, the jury agreed with New Mexico Attorney General Raúl Torrez’s argument that Meta, which owns Instagram, Facebook, and WhatsApp, “knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew.”
The original lawsuit was filed in 2023 in New Mexico, accusing Meta of violating the state’s Unfair Practices Act by engaging in “unconscionable trade practices” that exploited its minor users. According to Torrez’s initial complaint, Meta knowingly failed to “identify and report” child sexual abuse material while “creating and sending harmful notifications that encourage addictive use of its platforms.”
Torrez plainly states that Meta’s algorithms are focused on finding and disseminating “sexually exploitative and explicit materials” for the purpose of creating its own “social network of users looking to buy and sell the images,” adding that it’s “the children who are its casualties and its currency.”
To prove that Meta’s algorithm deliberately steered children toward exploitative material and potential traffickers, State of New Mexico investigators “created numerous accounts for minors” on Facebook and Instagram. Investigators then documented each instance of sexual solicitation and Meta’s response to it.
In one instance, investigators created a fictional mother-daughter pair: Cereceres and her 13-year-old daughter, Issa Bee. Cereceres’ profile included signs and symbols connected to human trafficking, and her posts included language suggesting an interest in trafficking Issa.
Three days after investigators established the profile, Cereceres’ account reached Facebook’s maximum limit of 5,000 friends, with over 3,000 followers. Meanwhile, Issa’s account included posts that alleged physical and sexual assault, mental health issues, and physical abuse. She also suggested friends and relatives had trafficked her.
According to New Mexico authorities, Facebook never flagged the accounts. Its algorithm, however, did register Issa’s content, using it to push ads and notifications for mental health treatment and for law firms representing victims of human trafficking.
Attorneys for Meta argued that its apps are designed for connection, not predators, and that the company has invested heavily in safety, disclosing risks and working to weed out harmful content posted on its platforms.
In its 2025 Community Standards Enforcement Report, Meta states that it finds over 98 percent of the child nudity and sexual exploitation content it acts on before users report it, with less than 1 percent of the material restored after investigation.
While Meta reportedly took action on over 9.9 million pieces of child sexual exploitation content on Facebook and Instagram, New Mexico’s investigation identified thousands of accounts and pieces of content that Meta never removed, despite their being clear violations of the company’s own policies.
Prosecutors were able to get around the liability shield of Section 230 of the U.S. Communications Decency Act, which protects platforms from liability for third-party content, by arguing that Meta’s own algorithm is responsible for pushing harmful content to minors.
Social media use disorder is not recognized as an official disorder by any major diagnostic system, including the Diagnostic and Statistical Manual of Mental Disorders. That hasn’t stopped state prosecutors across the country from using novel legal theories to pursue claims of social media addiction and harm caused by the algorithms of tech companies like YouTube, Google, and Meta.
On Wednesday, a California jury found that Meta and YouTube purposely created addictive design features that caused a young user’s mental health distress. With prosecutors citing design features such as infinite scroll and for-you recommendations as evidence of intent to harm, the verdicts in New Mexico and California could shape the outcomes of thousands of similar ongoing lawsuits in over 40 states.
At least 16 states have passed legislation to restrict minors’ access to social media over concerns of “addictive behaviors, mental health problems, and other harmful effects,” according to the Harvard Law Review.
Next is a bench trial scheduled to begin in New Mexico on May 4. Torrez has stated he will “seek injunctive relief that requires Meta to pay additional damages and make specific changes to its platforms and company operations.”
Meta is appealing the New Mexico ruling while evaluating its legal options for the decision in California, according to the New York Times.
READ MORE by Tosin Akintola:
Supreme Court Hears Arguments to Overturn Mississippi Law Allowing Late-Arriving Mail-In Ballots
Pentagon and Intelligence Officials Update Nation on Iran War
Partial Government Shutdown Pushes Airport Security to Its Limits