Tech Giants Face Trial Over Social Media’s Harm to Kids

LOS ANGELES, CA – The digital world is clashing with the legal world as landmark trials begin to determine if major social media companies are responsible for harm to children using their platforms. The first of these pivotal cases kicked off Monday in Los Angeles County Superior Court.

At the heart of the litigation are claims that Instagram parent company Meta and Google’s YouTube intentionally design their platforms to addict and harm young users. While TikTok and Snap were initially named in the suit, both companies have since settled for undisclosed amounts.

Jurors in downtown Los Angeles’ Spring Street Courthouse were introduced to what promises to be a lengthy trial, featuring starkly different narratives from the plaintiffs and the two tech giants still standing as defendants.

Mark Lanier, representing the plaintiffs, delivered an energetic opening statement, declaring the case “easy as ABC,” which he defined as “addicting the brains of children.” He accused Meta and Google, two of the “richest corporations in history,” of having “engineered addiction in children’s brains.”

A 19-year-old identified only as “KGM” is central to the Los Angeles case. Her experience could set a precedent for thousands of similar lawsuits against social media companies. KGM and two other plaintiffs have been chosen for “bellwether trials,” essentially test cases to gauge how arguments resonate with a jury and what, if any, damages might be awarded, explained Clay Calvert, a nonresident senior fellow of technology policy studies at the American Enterprise Institute.

This marks the first time these companies will argue their case before a jury, and the outcome could significantly impact their business models and how they manage child users.

Lanier asserted that defense attorneys would “try to blame the little girl and her parents for the trap they built,” referring to KGM, who was a minor when she claims she became addicted to social media, leading to detrimental effects on her mental health.

According to Lanier, while Meta and YouTube publicly state their commitment to child protection and safeguards, internal documents reveal a different story, explicitly targeting young children as audiences.

Drawing parallels to the tobacco industry, Lanier highlighted internal Meta communications showing employee concerns about the company’s inaction regarding potential harm to children and teens.

“For a teenager, social validation is survival,” Lanier stated, arguing that defendants “engineered a feature that caters to a minor’s craving for social validation,” specifically referencing “like” buttons and similar features.

Lanier concluded his opening statement before the court recessed for lunch, with attorneys for Meta and Google set to deliver their arguments later.

“This was only the first case – there are hundreds of parents and school districts in the social media addiction trials that start today, and sadly, new families every day who are speaking out and bringing Big Tech to court for its deliberately harmful products,” commented Sacha Haworth, executive director of the nonprofit Tech Oversight Project.

Judge Carolyn B. Kuhl has instructed jurors not to alter their social media habits during the estimated eight-week trial, including by changing settings or creating new accounts. They are also tasked with deciding the liability of Meta and YouTube independently.

A separate trial with opening arguments also began Monday in New Mexico.

KGM’s lawsuit alleges that early social media use led to addiction, exacerbating depression and suicidal thoughts. Crucially, the suit claims these effects stem from deliberate design choices aimed at maximizing child engagement for profit. If successful, this argument could bypass legal protections like the First Amendment and Section 230, which shields tech companies from liability for third-party content.

“Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, Defendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue,” the lawsuit states.

High-profile executives, including Meta CEO Mark Zuckerberg, are expected to testify during the six- to eight-week trial. Experts have noted similarities to the "Big Tobacco" trials, which resulted in a 1998 settlement requiring billions in healthcare payments and restrictions on marketing to minors.

The tech companies dispute these claims, citing numerous safeguards implemented over the years and arguing they are not liable for user-generated content.

A Meta spokesperson recently stated the company strongly disagrees with the allegations and is “confident the evidence will show our longstanding commitment to supporting young people.”

José Castañeda, a Google spokesperson, called the allegations against YouTube “simply not true,” adding that “Providing young people with a safer, healthier experience has always been core to our work.”

This case is the first in a series of trials commencing this year to address social media’s impact on children’s mental well-being.

In New Mexico, a trial began Monday with allegations that Meta failed to protect young users from sexual exploitation, following an undercover online investigation. Attorney General Raúl Torrez filed the lawsuit against Meta and Zuckerberg in late 2023, though Zuckerberg was later dropped from the suit.

A federal bellwether trial in Oakland, California, scheduled for June, will be the first to represent school districts suing social media platforms over harm to children.

Additionally, over 40 state attorneys general have filed lawsuits against Meta, claiming its design of Instagram and Facebook features deliberately addict children and contribute to the youth mental health crisis. Most cases are in federal court, with some in state courts.

TikTok also faces similar lawsuits across more than a dozen states.

Internationally, governments are enacting new laws to regulate social media for children. French lawmakers approved a bill in January banning social media for children under 15, aiming for implementation by September.

Australia has already banned the platforms for those under 16, leading to the deactivation of 4.7 million child accounts and sparking debates about technology use, privacy, child safety, and mental health. The British government also announced last month that it would consider banning young teenagers from social media to better protect children from harmful content and excessive screen time.
