A court in New Mexico has ordered Meta to pay $375 million in damages after a jury found the company misled users about the safety of its platforms for children.
The ruling follows a seven-week trial in which jurors concluded that Meta, which owns Facebook, Instagram and WhatsApp, exposed young users to harmful content, including sexually explicit material and contact with predators.
New Mexico Attorney General Raul Torrez described the decision as “historic,” noting it is the first time a US state has successfully held Meta accountable in court over child safety concerns.
The jury determined that the company violated consumer protection laws by misleading the public about safeguards for minors. Evidence presented during the trial included internal documents and testimony from former employees indicating that Meta was aware of risks posed to younger users.
Among the witnesses was former Meta engineer Arturo Béjar, who testified that internal testing showed underage users were frequently exposed to inappropriate content. He also recounted personal concerns about unsafe interactions involving minors on the platform.
Attorneys for the state argued that Meta’s recommendation algorithms played a central role by promoting harmful material to young users. They cited internal research suggesting that a significant share of users had encountered unwanted explicit content within a short period.
In response, Meta said it disagrees with the verdict and plans to appeal. The company maintains that it has invested heavily in safety features, including tools designed to protect teenagers and give parents more oversight.
The financial penalty reflects the thousands of individual violations identified by the jury, each carrying a potential fine under the state’s consumer protection law.
The case is part of a broader wave of legal challenges facing major tech companies in the United States over the impact of social media on children. Similar lawsuits are ongoing, including cases involving other platforms accused of designing features that may contribute to harmful user experiences.
The lawsuit was originally filed in 2023, with state authorities alleging that Meta knowingly directed young users toward inappropriate content and failed to adequately address known risks.