The jury held Meta liable for $US4.2 million in damages and Google for $US1.8 million. Though the sums are modest for these financially robust companies, whose annual capital expenditures each exceed $US100 billion, the verdict is significant.
This trial in Los Angeles is intended to act as a bellwether, or test case, for thousands of comparable lawsuits consolidated in California state courts.
‘ACCOUNTABILITY HAS ARRIVED’
The case centers on a 20-year-old woman, referred to in court as Kaley, who was a minor when the lawsuit began. She claimed she became addicted to Google’s YouTube and Meta’s Instagram because of design features such as “infinite scroll,” which encourages continuous use.
The jury found both Google and Meta negligent in the design of their apps and in failing to warn users of potential hazards.
“Today’s verdict is a referendum — from a jury, to an entire industry — that accountability has arrived,” stated the plaintiff’s lead attorney.
Meta and Google have expressed disagreement with the verdict and plan to appeal, according to their spokespeople.
Following the verdict, Meta’s shares rose by 0.3%, while Google parent Alphabet’s shares increased by 0.2%.
US law generally shields social media companies from liability for content posted on their platforms. This lawsuit in Los Angeles, however, targeted the design of the platforms rather than their content.
Gil Luria, a technology analyst at D.A. Davidson, described the verdict as a “setback” for Meta and Google.
“This process will likely get dragged out through future cases and appeals, but eventually may cause these companies to put in consumer safeguards that may dampen growth,” he commented.
Snap and TikTok, initially defendants in the trial, settled with the plaintiff before proceedings began. The terms of these settlements remain undisclosed.
MOUNTING CRITICISM
Over the past decade, large tech companies in the US have faced increasing criticism concerning the safety of children and teenagers. This issue has shifted from public debate to legal and governmental scrutiny. The US Congress has not yet enacted comprehensive social media regulations.
According to the nonpartisan National Conference of State Legislatures, at least 20 states have enacted laws in the past year addressing social media usage by children.
Some of these laws regulate cellphone use in schools and require age verification for social media account creation. NetChoice, a trade association supported by companies like Meta and Google, is challenging the age verification mandates in court.
After the verdict, US senators Marsha Blackburn and Richard Blumenthal urged Congress to pass legislation mandating social media platforms be designed with children’s safety in mind.
A separate social media addiction lawsuit, involving several states and school districts, is scheduled for trial this summer in federal court in Oakland, California.
Another state trial is set to commence in Los Angeles in July, involving platforms including Instagram, YouTube, TikTok, and Snapchat, according to attorney Matthew Bergman, who is leading the plaintiffs’ cases.
In a related development, a New Mexico jury found Meta in violation of state law in a lawsuit filed by the state’s attorney general. The lawsuit accused Meta of misleading users about the safety of Facebook, Instagram, and WhatsApp, and facilitating child sexual exploitation on these platforms.
TRIAL ARGUMENTS
During the trial, the plaintiff’s attorneys argued that Meta and Google deliberately targeted younger users and prioritized profit over safety. Meta’s defense highlighted the plaintiff’s challenging childhood as a factor in her mental health issues, while YouTube claimed her use of the platform was minimal.
Jurors reviewed internal documents showing how Meta and Google sought to engage younger audiences. Additionally, executives, including Meta CEO Mark Zuckerberg, testified to defend their corporate strategies.
When questioned about Meta’s decision to rescind a temporary ban on beauty filters, which some within the company had warned could harm teenage girls, Zuckerberg defended the choice as a matter of allowing user expression.
“I felt like the evidence wasn’t clear enough to support limiting people’s expression,” he stated.
How considerations of free speech and content moderation influenced these companies’ decisions is likely to be significant in any appeal.

