A high-profile court case has opened in California, putting major social media platforms under scrutiny over claims that their products negatively affect the mental health of children and teenagers.
The trial, taking place at the Los Angeles Superior Court, focuses on allegations against Instagram and YouTube, which are accused of deliberately designing features that encourage excessive use among young users. The case was brought by a plaintiff identified as K.G.M., whose identity is protected because the alleged harm occurred while she was a minor.
In his opening statement, the plaintiff’s lawyer told the jury that the platforms were intentionally engineered to keep children engaged for as long as possible. He argued that design choices such as algorithms, notifications and autoplay functions were created with the goal of maximizing user attention, without sufficient consideration for the psychological risks to young people.
According to the plaintiff, this prolonged exposure contributed to serious mental health challenges. Her legal team also accused the companies of failing to adequately warn users and parents about the potential dangers linked to these design features.
To support these claims, the lawyer cited internal communications and business targets that allegedly prioritized increased time spent on the platforms. He also argued that YouTube, owned by Google, steered young users toward its main platform because it generated higher advertising revenue than its child-focused version.
Meta and YouTube strongly rejected the accusations. Their lawyers told the court that the plaintiff’s difficulties were rooted in complex personal and family circumstances, not in the platforms themselves. They argued that there is no clear proof that Instagram or YouTube was a decisive factor in her mental health struggles.
Meta’s defense highlighted the plaintiff’s challenging upbringing, including family instability and a long history of psychological support. The company maintained that these factors played a central role in her condition and stressed that it cannot be held responsible for how individuals use social media services.
The case is expected to last around six weeks and could have far-reaching consequences. Legal observers say the verdict may influence thousands of similar lawsuits filed by families, school districts and public authorities across the United States.
During the trial, jurors are expected to hear testimony from mental health experts, former employees who have raised concerns about social media design, and top executives, including the heads of Instagram and YouTube.
At the same time, Meta is facing growing pressure from state authorities. A coalition of 29 U.S. state attorneys general has asked a federal judge to order significant changes to the company’s platforms, including removing accounts belonging to children under 13, deleting data collected from minors, and limiting the use of certain algorithms.
Some states are also calling for restrictions on screen time for young users, the removal of features considered addictive, and limits on appearance-altering filters. While Meta says it has introduced safety measures for teenage accounts, regulators argue these steps offer limited protection.
The courtroom hearings have drawn strong public interest, including from parents who believe social media design decisions contributed to serious harm suffered by their children. Other companies originally named in the case, including Snapchat and TikTok, have already reached settlements and are no longer involved.