Can You Sue Social Media For Addiction?

February 18, 2026. Mark Zuckerberg sits in a Los Angeles courtroom and tells a jury he’s “not trying to maximize time spent” on Instagram. This contradicts internal Meta documents already submitted as evidence. “We make money from ads,” an internal presentation reads. “The more time people spend, the more ads they see.” Another document describes Instagram as “like a drug” and Meta employees as “basically pushers.”

The case is KGM v. Meta Platforms, and it asks a question with billions of dollars hanging on the answer: is social media addiction something you can actually sue for? Kaley G.M., now 20, started using YouTube at age 6 and Instagram at 8 or 9. She’s suing both platforms, claiming they deliberately designed features to be addictive and caused her depression, body dysmorphia, and suicidal thoughts. Meta and YouTube say she’s blaming them for personal problems that existed before she ever scrolled a feed.

The trial started February 9 as a bellwether case. The outcome affects 1,500 similar lawsuits pending against social media platforms. TikTok and Snap looked at the evidence and settled for undisclosed amounts before trial. Meta and YouTube chose to fight. The question isn’t just whether they’ll lose. It’s whether social media addiction is a legitimate legal claim or a cynical attempt to blame platforms for user choices.

The Product Liability Gambit

Plaintiffs aren’t suing Meta and YouTube for hosting harmful content. That would trigger Section 230 of the Communications Decency Act, which protects platforms from liability for third-party posts. Instead, they’re arguing the platforms themselves are defective products. Infinite scroll, recommendation algorithms, notification systems, all designed to maximize engagement. The design is the defect.

This legal theory has worked before. In Lemmon v. Snap, the Ninth Circuit allowed a lawsuit to proceed after teenagers died in a high-speed car crash while using Snapchat’s “Speed Filter.” The filter overlaid your current speed on posts, essentially gamifying reckless driving. Snap argued Section 230 protected them. The court disagreed. The harm came from Snap’s product design, not user-generated content.

The social media addiction cases use identical logic. Instagram’s recommendation algorithm isn’t third-party content. Neither are push notifications, infinite scroll, or deliberate friction in the account deletion process. These are features Meta and YouTube designed, implemented, and constantly refined to keep users engaged. If those features are addictive by design, they’re defective products that cause harm.

Section 230’s Biggest Threat

Section 230 has protected platforms from almost every lawsuit for 30 years. The product liability workaround threatens to bypass that protection entirely. If successful, every platform feature becomes potential litigation. Recommendation algorithms, autoplay, notification timing, UI choices that make leaving harder than staying. All of it could be challenged as defectively designed.

Meta and YouTube aren’t just defending this case. They’re defending the legal framework that enabled their business models. Lose here, and 1,500 more cases proceed with precedent against them. Win here, and they preserve Section 230’s broad protections. The stakes explain why they’re fighting rather than settling like TikTok and Snap did.

The Internal Documents Are Damning

“We make money from ads, and ads are shown next to content. The more time they spend, the more money we make.” That’s from Meta’s internal documents. Another presentation compared YouTube to a “casino.” A Meta employee described Instagram as “like a drug” and employees as “basically pushers.”

Project Myst is worse. Meta researched which users experienced adverse mental health effects from Instagram. The study found that users suffering negative impacts were more likely to become addicted. Rather than addressing this, Meta used the data to identify vulnerable users. The internal logic: kids having bad experiences get more addicted, so there’s less business risk to causing harm.

These documents create massive liability exposure. They prove Meta and YouTube knew their platforms caused harm to specific populations. They prove the companies studied the problem. And they prove the companies did nothing, because addiction was profitable. That’s the tobacco playbook.

The difference is cigarettes have no safe use level. Every cigarette increases cancer risk. Social media presumably has safe use cases. Checking directions, messaging friends, posting vacation photos, none of that is inherently harmful. The addiction risk comes from heavy use driven by features designed to maximize engagement. But how do you separate “engaging product” from “addictive by design”?

Courts Can’t Agree on the Line

Courts have struggled with this question. In November 2023, Judge Yvonne Gonzalez Rogers ruled that many of the design features plaintiffs identified qualify as products or product components. Algorithmically curated feeds, notifications engineered to re-engage users, deliberately complex account deletion processes: all are design choices that could be defective.

The judge allowed misrepresentation and failure-to-warn claims to proceed, along with claims targeting a narrow class of design features. But she dismissed large portions of the consumer protection claims on Section 230 grounds, finding they faulted platforms for publishing decisions rather than product design.

The legal battle is essentially definitional. Is curating a feed publishing or product design? Are recommendations editorial decisions or automated features? The answers determine whether Section 230 applies. Plaintiffs need courts to see these features as product architecture. Platforms need courts to see them as publishing functions. The line keeps shifting case by case.

Did Instagram Cause Her Problems?

Kaley G.M.’s case illustrates the central challenge: how do you prove social media caused the harm? She had significant problems before heavy Instagram use. Her home life was abusive. Her father was absent. She struggled in school and experienced real-life bullying.

She started therapy at age 11, before her heaviest social media use. Her therapists testified that social media was not the “throughline of main issues.” One said KGM never reported feeling addicted. Another testified KGM told her she was only in the lawsuit because “her mother wanted her to” and “there might be compensation.”

YouTube data shows she averaged 29 minutes per day. Less than one TV episode. The defense argues these usage patterns don’t support addiction claims.

But plaintiffs counter with expert testimony about vulnerable populations. Kids with adverse home situations turn to social media as escape. The escape becomes dependency. KGM fit this pattern. Abusive home drives her to Instagram. Instagram’s algorithm shows content designed to maximize engagement. The content triggers body dysmorphia. She can’t stop using because the platform is designed to prevent stopping.

The causation question: did Instagram cause her problems, or did her problems lead her to Instagram where existing issues intensified? This causation problem exists in every addiction lawsuit. Millions use Instagram without developing eating disorders. If vulnerability comes from pre-existing mental health issues, can you blame the platform for exploiting those vulnerabilities?

Is Social Media Addiction Even Real?

Kaley G.M. testified she used Instagram “first thing when I woke up, right after school, late at night.” She felt compelled to check it constantly. Her mother tried blocking software. It didn’t work. These behaviors sound like addiction.

But the American Psychiatric Association doesn’t recognize “social media addiction” as a diagnosable disorder. Without clinical recognition, how do juries evaluate addiction claims?

Plaintiffs rely on brain science showing social media triggers dopamine responses similar to gambling. Notifications create anticipation. Likes provide variable rewards. Internal Meta documents compared Instagram to a “drug” and YouTube to a “casino.” The companies understood they were using addictive design patterns.
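The "variable rewards" pattern plaintiffs describe is the same reinforcement schedule behavioral psychologists study in slot machines: the payoff arrives after an unpredictable number of attempts. A toy simulation makes the mechanism concrete (this is purely illustrative; it is not Meta's code, and the 30% payoff rate is an invented parameter):

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def variable_ratio_checks(p_reward=0.3, n_checks=20):
    """Simulate n_checks app opens, where each open pays off
    (a like, a comment, a new follower) with probability p_reward.
    Returns the number of opens between consecutive rewards --
    a variable-ratio schedule, like a slot machine."""
    gaps, since_last = [], 0
    for _ in range(n_checks):
        since_last += 1
        if random.random() < p_reward:
            gaps.append(since_last)  # opens elapsed since last reward
            since_last = 0
    return gaps

print("App opens between rewards:", variable_ratio_checks())
```

The point of the sketch is the output's irregularity: the user never knows which check will pay off, and in reinforcement research that unpredictability is exactly what makes the checking habit hardest to extinguish.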

The tobacco comparison helps plaintiffs. Tobacco companies knew cigarettes were addictive and harmful. When internal documents proved they knew and concealed the harm, juries found them liable. Social media companies have similar documents.

The Business Model Is On Trial

Meta generated $134.9 billion in revenue in 2023. YouTube contributed roughly $31.5 billion to Google’s total. Both business models depend on maximizing user engagement to sell more ads. Every feature exists to increase engagement. Recommendation algorithms, infinite scroll, autoplay, push notifications, all serve one goal: keep users scrolling.

If engagement-maximizing design is defective by definition, the entire business model collapses. The product liability theory threatens every platform that monetizes through advertising.

This explains why TikTok and Snap settled while Meta and YouTube fight. Smaller platforms looked at the documents and decided settlement was cheaper. Meta and YouTube face bigger exposure from 1,500 lawsuits, but fighting this case protects the broader principle. One case becomes a firewall against hundreds more.

When Users Make Choices vs When Products Control Them

The defense argues Kaley G.M. made choices. She chose to open Instagram. She chose to keep scrolling. Her parents gave her devices with internet access. At some point, personal responsibility matters.

Except the question is whether addiction is a choice or a disease. Tobacco companies lost on these grounds. They knew cigarettes were addictive. They deliberately engineered nicotine delivery to maximize addiction. Courts held them responsible.

Social media companies engineered engagement optimization. Project Myst shows they knew vulnerable users became addicted. They targeted those users anyway because addiction drove revenue.

But there’s a key difference. Kaley G.M. started at age 8 or 9. Can children be held responsible for an addiction that began before age 10? The law generally says no: children lack the capacity to consent. The failure-to-warn claims hinge on the same point. Did Meta and YouTube adequately warn users about addiction risks? The documents suggest not. The companies knew. They didn’t warn.

The Billion-Dollar Question

If plaintiffs win, the immediate damages might be small. But Kaley G.M. is one of 1,500 plaintiffs. If each case settles for $1 million to $5 million, that’s $1.5 billion to $7.5 billion in total exposure.

Winning this case creates precedent. Every teenager who developed an eating disorder while using Instagram becomes a potential plaintiff. Meta reports 3.96 billion monthly active users across its apps. YouTube has 2.7 billion. Even if only 1% develop addiction-related harms, that’s tens of millions of potential claims. Tobacco companies paid $246 billion under the 1998 Master Settlement Agreement.
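The back-of-the-envelope math above can be checked in a few lines of Python. The per-case settlement range and user counts are the figures cited in this article; the 1% harm rate is illustrative, not a measured number:

```python
# Aggregate exposure if all 1,500 pending cases settled
# in the $1M-$5M range cited above.
cases = 1_500
low_exposure = cases * 1_000_000    # $1.5 billion
high_exposure = cases * 5_000_000   # $7.5 billion
print(f"Settlement exposure: ${low_exposure/1e9:.1f}B to ${high_exposure/1e9:.1f}B")

# Potential claim pool if 1% of monthly active users
# (an assumed rate, for scale only) alleged harm.
meta_mau, youtube_mau = 3.96e9, 2.7e9
harm_rate = 0.01
print(f"Meta:    {meta_mau * harm_rate / 1e6:.1f}M potential claims")
print(f"YouTube: {youtube_mau * harm_rate / 1e6:.1f}M potential claims")
# -> 39.6M and 27.0M: tens of millions of claims at even a 1% rate
```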

Why TikTok Settled and Meta Fights

The math explains the settlement calculation. TikTok and Snap settled these cases for undisclosed amounts, likely millions per case, possibly tens of millions for the strongest claims. Small enough to avoid setting precedent. Large enough to make the claims go away. Smart business.

Meta and YouTube chose different strategies. Fight the case. Win on causation. Win on Section 230. Establish that engagement optimization isn’t defective design. Protect the business model for all platforms. The gamble is expensive in the short term but protects billions in the long term.

The verdict will either validate 30 years of Section 230 protection or open a new era of platform liability. Billions of dollars in market value hang on whether 12 jurors believe Instagram made Kaley G.M. addicted or whether she’s blaming a platform for problems that existed before she ever opened the app.

Can you sue social media for addiction? The legal answer is maybe. The practical answer depends on whether plaintiffs can prove causation and whether juries believe engagement optimization crosses the line into defective design. We’re about to find out.

Sources

Reuters – Woman Suing Meta YouTube Trial

NBC News – Kaley GM Testimony

Tech Policy – Section 230 Product Liability

Benesch Law – Social Media Product Liability

Dynamis LLP – Section 230 Workarounds



About Author

Conor Healy

Conor Timothy Healy is a Brand Specialist at Tokyo Design Studio Australia and contributor to Ex Nihilo Magazine and Design Magazine.
