
US Jury Slams Meta And Google For Harmful Platform Design, Awards $6 Million In Case Led by Young Users’ Addiction Claims

A Los Angeles jury held Meta and Google responsible for harmful social media design, awarding $6 million in damages.

Published By: NewsX Web Desk
Last updated: March 26, 2026 05:16:24 IST

A Los Angeles jury has found tech giants Meta and Alphabet’s Google negligent in designing social media platforms that can harm young users, delivering a $6 million verdict in a closely watched case that could influence thousands of similar lawsuits across the United States.

The jury awarded $4.2 million in damages against Meta and $1.8 million against Google. While the amounts are small for companies that spend over $100 billion annually, the ruling is significant because it is being treated as a "bellwether" case, a test trial meant to set the tone for many similar lawsuits consolidated in California courts.

‘Accountability Has Arrived’

The case centers on a 20-year-old woman identified in court as Kaley, who said she became addicted to Instagram and YouTube at a young age. She argued that features like infinite scroll were deliberately designed to keep users engaged for longer periods. The jury agreed, finding that both companies were negligent in how they designed their platforms and failed to adequately warn users about potential harms.

Reacting to the verdict, the plaintiff’s lead counsel said, “Today’s verdict is a referendum from a jury, to an entire industry, that accountability has arrived.”

Meta and Google, however, strongly disagreed with the outcome and have said they plan to appeal. Despite the ruling, investor confidence appeared largely unaffected, with Meta shares closing up 0.3% and Alphabet rising 0.2%.

Focus on Platform Design

The case is notable because it focused on platform design rather than content. U.S. law typically protects tech companies from liability for user-generated content, but this lawsuit argued that the very structure of these platforms encourages addictive behavior, especially among children and teenagers.

Technology analyst Gil Luria described the ruling as a setback, saying, “This process will likely get dragged out through future cases and appeals, but eventually may cause these companies to put in consumer safeguards that may dampen growth.”

Other major platforms like Snap and TikTok were also initially part of the case but chose to settle with the plaintiff before the trial began. The terms of those settlements were not made public.

Wider Legal and Policy Impact

The verdict comes amid growing scrutiny of big tech companies over the safety of children online. Over the past decade, concerns about mental health, addiction, and exposure to harmful content have intensified. While the U.S. Congress has yet to pass comprehensive legislation regulating social media, many states have started taking action. At least 20 states introduced laws last year addressing issues such as cellphone use in schools and age verification for social media accounts.

Lawmakers have also weighed in following the verdict. U.S. senators Marsha Blackburn and Richard Blumenthal called on Congress to act, urging new laws that would require platforms to prioritize children’s safety in their design.

Several other significant cases involving school districts and state officials are set to follow. One, brought by multiple states and school districts, is expected to go to trial at the federal courthouse in Oakland, California, this summer. Another, against Instagram, YouTube, TikTok, and Snapchat, is scheduled for trial in Los Angeles in July.

Zuckerberg Defends Decisions

Lawyers representing the plaintiffs argued at trial that Meta and Google had knowingly marketed to and targeted children and teenagers, putting profit over safety. Jurors were shown internal company documents obtained as evidence and heard testimony from senior executives, including Meta CEO Mark Zuckerberg. Questioned about his decision to lift the ban on some of Meta's beauty filters, Zuckerberg responded, "I believe the evidence wasn't strong enough to conclude that we should limit their ability to express themselves."

(With Inputs From Reuters)
