Who Are the Social Media Companies Involved?
The litigation targets the largest and most widely used social media platforms in the United States — companies that collectively reach tens of millions of minors every day:
Meta Platforms — the parent company of Facebook, Instagram, WhatsApp, and Messenger — is the most prominent defendant in the litigation. Internal research leaked in 2021 by former employee Frances Haugen revealed that the company’s own scientists found Instagram worsened body image issues for one in three teenage girls, yet the company took no meaningful corrective action. In March 2026, two separate juries — in New Mexico and Los Angeles — found Meta liable for endangering children and designing addictive products.
YouTube is the single most popular platform among U.S. teens, used by an estimated 92% of adolescents according to Pew Research Center data. Its recommendation algorithm and autoplay features create a never-ending stream of content. A Google executive acknowledged that users spend approximately 70% of their time on the platform watching algorithmically recommended videos rather than content they actively sought out. In March 2026, a California jury found YouTube liable alongside Meta for negligence in the first personal injury bellwether trial.
TikTok’s short-form video platform has become one of the fastest-growing apps among young users. Reports have indicated that more than a third of U.S. TikTok users were under the age of 14 as of 2020. The platform’s auto-playing For You Page, continuous scroll design, and hidden clock features are specifically cited in litigation as mechanisms that distort users’ sense of time and promote compulsive engagement. TikTok has also faced wrongful death lawsuits after minors died attempting dangerous viral challenges promoted by its algorithm.
Snapchat’s ephemeral messaging design — where Snaps disappear after 10 seconds and Stories expire after 24 hours — creates a constant sense of urgency that drives repeated engagement. Features like Snapstreaks, Snap Map, Snapscore, and Spotlight have been alleged to exploit adolescent psychology by gamifying social interaction. Snapchat’s disappearing message format has also been linked to predatory contact with minors and exposure to drug sales, including fentanyl distribution. In early 2026, Snap reached a confidential settlement in a bellwether case set for trial in California state court.
Each of these companies generates billions of dollars in annual advertising revenue. Plaintiffs allege that this profit motive drove the companies to prioritize engagement and screen time over the safety and well-being of their youngest and most vulnerable users.
What Are the Injuries Associated with Excessive Social Media Use?
The harms linked to compulsive social media use among young people are serious, well-documented, and in many cases life-altering. The U.S. Surgeon General has warned that teens who spend more than three hours per day on social media face double the risk of depression and anxiety. A study published in JAMA tracked over 4,000 children ages 9 and 10 over four years and found that roughly one-third developed patterns of addictive social media use — and those children were two to three times more likely to experience suicidal thoughts and behaviors.
Mental Health Injuries
- Depression and anxiety — Among the most commonly reported conditions, often requiring therapy, medication, and in severe cases, hospitalization.
- Eating disorders — Including anorexia nervosa, bulimia nervosa, binge-eating disorder, and body dysmorphia (sometimes called “Snapchat dysmorphia”), which can cause lasting damage to physical and reproductive health.
- Self-harm — Cutting, burning, and other forms of deliberate self-injury frequently linked to social comparison, cyberbullying, and exposure to self-harm content on platforms.
- Suicidal ideation and attempts — Persistent thoughts of suicide and actual attempts, with studies showing a direct correlation between heavy social media use and elevated suicide risk, particularly among teenage girls.
- Compulsive and addictive use — Inability to stop or reduce usage despite negative consequences, withdrawal-like symptoms when disconnected (irritability, mood swings, agitation), and tolerance requiring increasing amounts of time on platforms to achieve the same effect.
- Attention and focus disorders — Including ADHD symptoms, reduced attention span, slower decision-making, and academic decline.
Physical Health Consequences
- Chronic sleep disruption and insomnia
- Headaches, migraines, and eye strain
- Unhealthy dietary behaviors and disordered eating
- Reduced physical activity
- Elevated markers of chronic inflammation (C-reactive protein)
Broader Impacts
- Social withdrawal and isolation from family and peers
- Cyberbullying and online harassment
- Exposure to sexual predators and sexually explicit material
- Exposure to dangerous challenges and harmful content
- Academic performance decline
- Substance use linked to online drug solicitation
Research published in the Archives of Disease in Childhood found that teenage girls’ smartphone and social media use was significantly linked to anxiety, tiredness, loneliness, and negative feelings about body image. A separate study published in eBioMedicine tracked 168 children and found that early excessive screen time was associated with slower cognitive decision-making and higher anxiety by age 13.
How Are Social Media Algorithms Designed to Promote Addiction?
At the core of these lawsuits is a central allegation: the defendant companies intentionally designed their platforms to be addictive, particularly to adolescents whose brains are still developing. The prefrontal cortex — responsible for impulse control, risk assessment, and decision-making — does not fully mature until the mid-20s, making young users especially susceptible to manipulative design.
The Dopamine Feedback Loop
Social media platforms exploit the brain’s reward system by delivering unpredictable, intermittent rewards — a “like,” a new follower, a comment, a viral post. This mirrors the same variable reinforcement schedule used in slot machines and gambling, triggering dopamine release that reinforces habitual checking and scrolling. Research has shown that these digital rewards can hijack dopaminergic pathways more quickly and reliably than naturally earned rewards, such as academic achievement or physical exercise.
Platform-Specific Design Features
Meta (Instagram and Facebook)
- The Meaningful Social Interaction (MSI) algorithm collects extensive on- and off-platform behavioral data to prioritize high-engagement content.
- Intermittent variable rewards through unpredictable notification of “likes” and comments.
- Features like Instagram Live and disappearing Stories create urgency and fear of missing out.
- Beauty filters and algorithmic amplification of appearance-related content drive body image distortion.
TikTok
- Auto-playing For You Page that cannot be disabled, delivering an endless stream of algorithmically curated short videos.
- Continuous scroll with no natural stopping point.
- A deliberately hidden clock that prevents users from tracking time spent on the app.
- Push notifications (e.g., “TikTok Now”) designed to pull users back at intervals throughout the day.
YouTube
- Autoplay and recommended video features create a never-ending content pipeline — users spend an estimated 70% of watch time on algorithmically recommended content rather than videos they searched for.
- YouTube Shorts further distort time perception with rapid short-form content.
Snapchat
- Ephemeral design where content disappears, compelling constant re-engagement.
- Gamification through Snapstreaks (consecutive days of messaging), Snapscore, Trophies, and Charms.
- Spotlight feature, which reportedly drove a 200% increase in user time spent on the platform.
- Disruptive push notifications timed to maximize re-engagement.
Additional Manipulative Mechanisms
- Infinite scrolling — Eliminates natural stopping cues, encouraging open-ended browsing sessions.
- Push notifications — Timed to interrupt daily life and trigger habitual app-opening.
- No built-in usage limits — Platforms provide no meaningful session timers or prompts to stop.
- Inadequate age verification — Allowing children well below stated minimum ages to create accounts.
- Barriers to disconnecting — Making it difficult to deactivate or delete accounts.
- Absence of parental controls — Failing to offer effective tools for parents to monitor or restrict usage.
Internal documents and executive testimony have revealed that these companies were fully aware of the addictive properties of their products and their outsized impact on minors — yet consistently chose engagement and revenue over user safety.
What Are the Lawsuits About, and What Are the Latest Litigation Updates?
The social media addiction lawsuits assert that the defendant companies designed, manufactured, and marketed defective products that are unreasonably dangerous to minors. The cases are not about the content that individual users post — which is generally protected under Section 230 of the Communications Decency Act — but about the companies’ own design decisions, algorithms, and product features.
The principal legal theories include:
- Design defect — The platforms’ design features, including recommendation algorithms, autoplay, infinite scroll, and notification systems, render them defective products that are unreasonably dangerous to young users.
- Failure to warn — The companies failed to adequately disclose the known risks of their products to users and their parents.
- Negligence — The companies knew or should have known that their products caused harm to a significant percentage of minor users and failed to take reasonable steps to redesign their products or mitigate the danger.
- Fraudulent concealment — The companies actively concealed internal research documenting the harmful effects of their platforms on youth mental health.
- Public nuisance — The companies’ conduct created a widespread public health crisis affecting communities, schools, and government resources.
- Wrongful death — Claims brought in cases where a minor died by suicide or from complications of an eating disorder linked to platform use.
Who Is Filing Suit?
- Families — Parents and guardians filing on behalf of minor children, as well as young adults who were minors when they were harmed, seeking compensation for mental health treatment, emotional distress, and related damages.
- School districts — More than 2,000 school districts nationwide have filed claims alleging that social media’s impact on students has forced schools to spend significantly more on mental health counselors, psychologists, security officers, and educational programming about online safety.
- State attorneys general — At least 42 state attorneys general have filed lawsuits against Meta and other social media companies under consumer protection and unfair trade practices statutes.
- Tribal nations — Native American sovereign nations, including the Choctaw and Chickasaw Nations, have joined the litigation.
The federal lawsuits have been consolidated into a multidistrict litigation (MDL No. 3047), titled In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, in the U.S. District Court for the Northern District of California before Judge Yvonne Gonzalez Rogers. As of March 2026, more than 2,400 cases are pending in the MDL — a number that has grown rapidly from fewer than 500 cases in mid-2024. A parallel coordinated proceeding (JCCP 5255) is underway in California state court in Los Angeles.
Do You Qualify for a Social Media Addiction Lawsuit?
Individuals and families may qualify to file a Social Media Addiction lawsuit. To pursue a claim, potential claimants generally must meet the following criteria:
Who May Be Eligible
- A child, teen, or young adult who used one or more of the named platforms (Instagram, Facebook, TikTok, YouTube, and/or Snapchat)
- Parents or legal guardians filing on behalf of a minor child
- Young adults (typically under age 25) who began using the platforms as minors
Usage Threshold Qualifying Factors
- The individual regularly used one or more social media platforms, typically three or more hours per day
- The individual began using the platform(s) before the age of 18
Qualifying Injuries
The individual must have developed, before the age of 21, one or more of the following conditions connected to their social media use:
- Eating disorder (anorexia, bulimia, binge-eating disorder)
- Body dysmorphia or severe body image issues
- Self-harm behaviors
- Suicidal ideation (persistent thoughts of suicide) or suicide attempt
- Severe depression or anxiety requiring professional treatment
- Sleep disorders or insomnia
- Addiction-like symptoms (inability to stop using, withdrawal, mood changes when unable to access platforms)
Treatment Requirement
The individual received medical treatment, therapy, or counseling for the qualifying condition.
Important Notes
- The statute of limitations varies by state. In many jurisdictions, the clock begins when the injury is discovered or reasonably should have been discovered — not necessarily when the social media use began.
- Claims are evaluated on a case-by-case basis, and the strength of each case depends on the specific facts and documentation available.
Many families do not fully recognize the connection between their child’s social media habits and their mental health struggles until they speak with an attorney who understands this litigation. It’s important to contact a lawyer as soon as possible to evaluate your legal options.
Contact Us Today
Your child’s well-being is not a product to be monetized. If your family has been affected by the harms of social media addiction, our firm is ready to listen and to fight for the justice and accountability your family deserves.
At LexLegal, we are actively investigating and filing claims on behalf of children, teens, young adults, and their families who have been harmed by addictive social media platforms. If you or your child suffered serious psychological or physical harm from compulsive social media use, contact us today. LexLegal offers free, confidential case evaluations to determine whether you qualify for a Social Media Addiction lawsuit. Complete our instant case evaluation form. We’ll review your information and promptly respond about your legal options. Every Social Media Addiction lawsuit we handle is taken on a contingency fee basis, which means there are no upfront costs for clients. Our firm is experienced in handling product liability cases, and we welcome any questions you may have.
Complete our instant case evaluation today to learn whether you may be eligible to file a Social Media Addiction lawsuit.