Some of the world’s largest social media companies are now defending themselves in a major US court battle over allegations that their platforms were designed in ways that intensify engagement among children while disregarding foreseeable psychological risks. Plaintiffs claim that features such as algorithmic feeds, streak systems, and constant notifications were deliberately engineered to maximise time spent on the apps, even when that engagement was known to cause anxiety, compulsive use patterns, and emotional distress. The companies in question are Instagram, TikTok, Snapchat and YouTube. So far they deny that their products are addictive, reject claims that they are liable for user behaviour, and maintain that they simply provide tools for connection and creativity.

The Case Brought Against Social Media
The litigation forms part of the consolidated federal case known as the Social Media Adolescent Addiction / Personal Injury Products Liability Litigation (MDL No. 3047), currently proceeding in the U.S. District Court for the Northern District of California.
Multidistrict litigation (MDL) is a procedural mechanism used in U.S. federal courts to consolidate similar cases for pretrial proceedings. MDL No. 3047 brings together hundreds of lawsuits filed by families, school districts, and state entities. Plaintiffs allege that platforms including Meta Platforms, TikTok, Snap Inc., and YouTube were designed in ways that contribute to compulsive use among minors, leading to anxiety, depression, eating disorders, and other psychological harms, and that the companies knowingly built features that increase engagement among adolescents while failing to warn users of the associated risks.
The complaints focus on product design rather than isolated incidents. Plaintiffs argue that algorithmic feeds, infinite scrolling, autoplay functions, push notifications, and streak-based reward systems were engineered to maximise time spent on the platforms. The legal theory resembles earlier product liability cases involving tobacco or opioids, where companies were accused of designing products with foreseeable dependency risks.
The defendants argue that their platforms are protected by Section 230 of the Communications Decency Act and that they cannot be held liable for user-generated content. The litigation is expected to test the boundaries of that protection.
State-Level Lawsuits Add Pressure
In addition to the MDL, multiple U.S. state attorneys general have filed separate lawsuits. In 2023, more than 40 states sued Meta, alleging that Instagram and Facebook were designed to exploit young users’ psychological vulnerabilities. The suits reference internal research disclosed by whistleblower Frances Haugen, which suggested that Meta was aware of negative mental health effects among teenage users.
Other states have brought actions against TikTok, claiming that the app’s algorithmic design promotes excessive use among minors and exposes them to harmful content.
These actions increase legal and regulatory scrutiny beyond the federal court proceedings.
How Each Social Media Platform Exploits Children
TikTok, Snapchat, and Instagram are some of the platforms being scrutinised. Here’s how they trap young people:
TikTok
TikTok has become emblematic of algorithm-driven consumption. Its “For You” feed learns from minimal user interaction and rapidly personalises content streams. The speed at which the algorithm adapts is central to its appeal.
Critics argue that this model produces compulsive viewing cycles, especially among adolescents whose reward systems are highly sensitive to novelty and social validation. Studies have shown that TikTok’s recommendation engine can quickly funnel users toward emotionally charged or body-focused content once interest is detected.
The company says it provides screen-time management tools and moderation systems. However, these safeguards exist within a business model that rewards extended engagement.
Instagram
Instagram, owned by Meta Platforms, has faced sustained criticism over its impact on teenage mental health. Internal research disclosed in 2021 indicated that Instagram worsened body image concerns for a segment of teenage girls.
Features such as visible follower counts, algorithmic ranking, and performance-based metrics reinforce social comparison. For adolescents navigating identity formation, these systems can turn self-presentation into a continuous public evaluation exercise.
Meta has introduced tools such as private accounts for minors and limits on messaging from adults. Whether these changes meaningfully address the underlying engagement incentives remains contested.
Snapchat
Snapchat operates differently but faces similar criticism. Its “streak” feature rewards consecutive days of messaging between users, creating a perceived obligation to maintain daily contact.
While marketed as playful, streak systems introduce a gamified pressure dynamic. Teenagers report anxiety over losing streak counts, which can feel socially punitive.
Snap states that its platform is focused on close friend communication and supports wellbeing initiatives. Yet the behavioural hooks remain built into the product.
Head of Instagram Calls It “Problematic”
Adam Mosseri, who has led Instagram for eight years, is the first high-profile executive to testify in the trial, which began this week in Los Angeles. Defence lawyers have argued that the lead plaintiff, known as KGM, was harmed by other circumstances in their life rather than by Instagram.
YouTube is also named in the suit, while Snapchat and TikTok both reached settlements ahead of the trial.
Mosseri agreed early on in his testimony that Instagram should do everything within its power to help keep users safe on the platform, especially young people. However, he then said he did not think it was possible to say how much Instagram use was “too much”.
“It’s important to differentiate between clinical addiction and problematic use,” he added. “I’m sure I’ve said that I’ve been addicted to a Netflix show when I binged it really late one night, but I don’t think it’s the same thing as clinical addiction”. He then repeatedly admitted he was not an expert in addiction.
When Mosseri was asked about KGM’s longest single day of use on the platform, a shocking 16 hours, he said “that sounds like problematic use” but declined to characterise it as addiction.
Meta, which owns Instagram, and other social media companies, including YouTube, Snapchat, and TikTok, are facing thousands of other cases brought by families, state prosecutors, and school districts across the US.
The Key Legal Framing: Does It Count as Addiction?
A central issue in the litigation is terminology. Plaintiffs frequently use the language of addiction, while platform representatives consistently avoid describing their products as addictive, speaking instead of excessive or “problematic” use. The MDL title itself includes the term “Adolescent Addiction”, but the addiction alleged is framed as a behavioural outcome of product design rather than as a medically classified substance dependency.
The science of behavioural addiction is complex. Unlike substance dependency, it does not involve ingestible chemicals. However, neuroscientific research demonstrates that social media engagement activates dopamine-related reward pathways, particularly in adolescents whose impulse control mechanisms are still developing.
The World Health Organization has formally recognised gaming disorder as a behavioural addiction. Social media use has no equivalent diagnostic classification, but studies published in peer-reviewed journals have linked heavy usage to increased anxiety, depression, disrupted sleep, and body dissatisfaction. Correlation does not prove causation, but plaintiffs argue that the companies’ own research demonstrates awareness of the risks.
The legal reluctance to use the word addiction does not eliminate the behavioural patterns associated with compulsive use. The question before the courts is whether the companies designed their platforms in ways that foreseeably intensified psychological harm.
Their Business Models Depend on This Exact Behavioural Outcome
Social media platforms are funded by advertising revenue. That revenue increases when user engagement increases. Time spent on the platform translates directly into monetisable data and impressions.
Incentives therefore align with attention capture rather than moderation. Features such as infinite scroll, autoplay video, and push notifications are not accidental. They are optimisation tools.
The legal case now unfolding asks whether companies can continue to rely on this model while disclaiming responsibility for its foreseeable psychological effects on minors.
Final Thought
The Social Media Adolescent Addiction MDL represents one of the most significant legal challenges the industry has faced. The outcome will not eliminate social media use among young people, but it could reshape how platforms design products for minors.
At stake is whether engagement-optimised business models can coexist with meaningful child protection. The court will ultimately decide whether design choices intended to maximise attention also created foreseeable harm.
For an industry built on monetising user engagement, that question cuts to the core of its operating model.