
Internal Messages May Doom Meta At Social Media Addiction Trial (arstechnica.com) 54

An anonymous reader quotes a report from Ars Technica: This week, the first high-profile lawsuit -- considered a "bellwether" case that could set meaningful precedent in the hundreds of other complaints -- goes to trial. That lawsuit documents the case of a 19-year-old, K.G.M., who hopes the jury will agree that Meta and YouTube caused psychological harm by designing features like infinite scroll and autoplay to push her down a path that she alleges triggered depression, anxiety, self-harm, and suicidality. TikTok and Snapchat were also targeted by the lawsuit, but both have settled. The Snapchat settlement came last week, while TikTok settled on Tuesday just hours before the trial started, Bloomberg reported. For now, YouTube and Meta remain in the fight. K.G.M. allegedly started watching YouTube when she was 6 years old and joined Instagram by age 11. She's fighting to claim untold damages -- including potentially punitive damages -- to help her family recoup losses from her pain and suffering and to punish social media companies and deter them from promoting harmful features to kids. She also wants the court to require prominent safety warnings on platforms to help parents be aware of the risks. [...]

To win, K.G.M.'s lawyers will need to "parcel out" how much harm is attributable to each platform's design features, not the content that was targeted to K.G.M., Clay Calvert, a technology policy expert and senior fellow at the American Enterprise Institute think tank, wrote. Internet law expert Eric Goldman told The Washington Post that detailing those harms will likely be K.G.M.'s biggest struggle, since social media addiction has yet to be legally recognized, and tracing who caused what harms may not be straightforward. However, Matthew Bergman, founder of the Social Media Victims Law Center and one of K.G.M.'s lawyers, told the Post that K.G.M. is prepared to put up this fight. "She is going to be able to explain in a very real sense what social media did to her over the course of her life and how in so many ways it robbed her of her childhood and her adolescence," Bergman said.

The research is unclear on whether social media is harmful for kids or whether social media addiction exists, Tamar Mendelson, a professor at Johns Hopkins Bloomberg School of Public Health, told the Post. And so far, research only shows a correlation between Internet use and mental health, Mendelson noted, which could doom K.G.M.'s case and others'. However, social media companies' internal research might concern a jury, Bergman told the Post. On Monday, the Tech Oversight Project, a nonprofit working to rein in Big Tech, published a report analyzing recently unsealed documents in K.G.M.'s case that supposedly provide "smoking-gun evidence" that platforms "purposefully designed their social media products to addict children and teens with no regard for known harms to their wellbeing" -- while putting increased engagement from young users at the center of their business models.
Most of the unsealed documents came from Meta. An internal email shows Mark Zuckerberg decided Meta's top strategic priority was getting teens "locked in" to Meta's family of apps. Another damning document discusses allowing "tweens" to use a private mode inspired by fake Instagram accounts ("finstas"). The same document includes an admission that internal data showed Facebook use correlated with lower well-being.

Internal communications showed Meta seemingly bragging that "teens can't switch off from Instagram even if they want to" and an employee declaring, "oh my gosh yall IG is a drug," likening all social media platforms to "pushers."
  • Doom? (Score:5, Insightful)

    by SlashbotAgent ( 6477336 ) on Tuesday January 27, 2026 @06:12PM (#65953342)

Absolute worst-case scenario: a no-fault settlement agreement.

  • weak (Score:2, Interesting)

    by hdyoung ( 5182939 )
    Meta’s goal was to lock users into their ecosystem. Um, yeah. That’s basically a rephrasing of the statement “businesses want to keep their customers”. The wording is slightly more psychopathic, but that’s about it.

    Except they utterly failed. As soon as a slightly fresher social media stream showed up (TikTok), a billion young people transferred their attention to the new ecosystem. The effort to move involves creating a new login and password. Meta’s “lock-in
    • Re:weak (Score:5, Informative)

      by dskoll ( 99328 ) on Tuesday January 27, 2026 @06:29PM (#65953390) Homepage

Meta's damage to society goes far beyond damaging kids' mental health. It also spreads disinformation, enables scam artists to defraud people, winks at exposing minors to sexually-explicit chatbots, and this is reflected in a whole list of media stories [skoll.ca] showing how shitty Meta is.

      • by piojo ( 995934 )

        Isn't a lot of that just the consequence of communication? Where do we draw the line about blaming the tool versus the content?

        (Yes, this is the Section 230 argument again.)

        • by kmoser ( 1469707 )
          This. Unlimited messages to scroll through, scammers, pedophiles, weirdos, etc. all existed decades ago on BBSes and Usenet, and yet those never got legislated out of existence.

          Two words: Personal responsibility
          • by dskoll ( 99328 )

            Individuals cannot use "personal responsibility" to combat a massive organization that hires psychologists who know exactly how to prey on human nature. It's not a fair fight.

            • by kmoser ( 1469707 )
              For decades Madison Avenue has been using psychology to get you to buy products. Just say no.
        • by dskoll ( 99328 )

          We draw the line when the tool enables content that demonstrably causes massive harm to society and keeps enabling said harm because to do otherwise would hurt profits. Sec. 230 doesn't apply to Facebook because it does in fact moderate its content, so it's not a neutral communication provider.

    • Re:weak (Score:5, Insightful)

      by NotEmmanuelGoldstein ( 6423622 ) on Tuesday January 27, 2026 @06:52PM (#65953464)
      Translation:

      It's okay to be a violent racist: It's the victim's job to be bulletproof.

This is the Chicago ideology of "corporations are naturally immoral" applied to everything. In truth, the social contract stopped being "corporations must serve the public good" 120 years ago.

      Strangely, most people believe leaving your door unlocked does not excuse burglary of your home. Likewise, Facebook/Meta should be forgiven when they willfully abuse people.

      • Facebook/Meta should not be forgiven when they willfully abuse people.
    • Re:weak (Score:5, Insightful)

      by fropenn ( 1116699 ) on Tuesday January 27, 2026 @07:20PM (#65953552)

      This kid has my heartfelt sympathy, but they should lose the case

If the kid has your sympathy, doesn't that suggest that they should win the case? Not to make this particular young person rich, but to convince the tech companies that it is in their best interest to keep their products from harming more kids like the claimant. They knew it was harmful to children. They actively worked to make it more addictive and more harmful. And they did nothing to warn parents, limit children's access to the harmful product, or change the product in any way to make it safer for children. (It's not a healthy product for adults, either, but it is a different standard when it comes to adults.)

    • That's like saying you can't sue Marlboro because Camel is just as bad for you. These companies knowingly harmed children with their products and they should have to pay a little, what's wrong with that?
    • But, it’s a stretch for a psychiatrically-diagnosed 19 year old kid to blame their problems on “all the social media” and then claim that they’re entitled to “all teh $$$$$”. This kid has my heartfelt sympathy, but they should lose the case.

And yet we have ample evidence that influence on children has a massive negative effect on society. We ban all sorts of mildly addictive practices done in the name of keeping a customer when they are focused on kids: advertising McDonald's during children's shows, gambling, etc. Why are you giving this specific one a free pass, given that they are using the same playbook as gambling to target an especially vulnerable group?

      Now are you upset that this person may get money? Why would you

    • Meta’s goal was to lock users into their ecosystem. Um, yeah. That’s basically a rephrasing of the statement “businesses want to keep their customers”. The wording is slightly more psychopathic, but that’s about it.

      Yes that's indeed the goal of every business, but what matters is, how they go about achieving that goal. If businesses try to keep customers by making better and better products, that's a socially beneficial thing, because people will have better and better stuff. If they try to keep their customers by getting them addicted and exploiting their psychological triggers, that's socially negative, because you get psychologically damaged people out of this. You can argue about the extent of the damage done, but

    • > The effort to move involves creating a new login and password.

      I guess you have no social circle, otherwise you'd know that's not the case.

      > it’s a stretch for a psychiatrically-diagnosed 19 year old kid to blame their problems on “all the social media”

      No, it's not. The algorithms suggesting content are designed to keep you hooked and scrolling. If the social network was without suggestion algorithms, I'd probably agree with you. Algorithmically curated feeds are a big problem.

  • Anxiety, depression, and self-harm seem to be ingrained in human nature, but the way Meta manipulated them to cause addictive damage for profit certainly deserves whatever fines the courts can hand out, and more. Most psych experiments (even on animal subjects) would undergo an ethics review to prevent harm before being allowed; I doubt Zuck and his team ever considered ethics at all.
  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday January 27, 2026 @06:27PM (#65953382) Homepage Journal

    Infinite? Their shitty JavaScript will crash your browser long before forever

    • > Infinite [scroll]? Their shitty JavaScript will crash your browser long before forever

      For small values of infinity.

Errr, no. They may crash your shitty browser, which is failing to garbage-collect out-of-view rubbish, but you can very much keep scrolling the likes of Facebook forever and a day. If not in your shitty browser, then definitely in the Facebook app. (I stopped using Firefox to browse Reddit for the same reason; switching browsers fixed the problem. Hardly JavaScript's fault.)
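      For what it's worth, the usual fix for this is "windowing" (list virtualization): keep only the items near the viewport mounted and recycle the rest, so memory stays bounded no matter how far you scroll. A minimal sketch below — `VirtualFeed`, its fields, and the window size of 50 are all hypothetical illustration, not anything Facebook actually ships:

      ```javascript
      // Sketch of list virtualization: the feed can grow without bound,
      // but only a fixed window of items is kept "mounted" (stand-in for
      // live DOM nodes). Items scrolled far out of view are recycled.
      class VirtualFeed {
        constructor(windowSize) {
          this.windowSize = windowSize; // max items mounted at once
          this.mounted = [];            // stand-in for live DOM nodes
          this.totalLoaded = 0;         // everything ever scrolled past
        }
        append(item) {
          this.mounted.push(item);
          this.totalLoaded++;
          // Recycle: drop items that scrolled far above the viewport.
          while (this.mounted.length > this.windowSize) this.mounted.shift();
        }
      }

      const feed = new VirtualFeed(50);
      for (let i = 0; i < 100000; i++) feed.append({ id: i });
      console.log(feed.totalLoaded);    // 100000 items scrolled past...
      console.log(feed.mounted.length); // ...but only 50 kept in memory
      ```

      A real implementation would also recycle downward (scrolling back up) and use something like `IntersectionObserver` to decide what is near the viewport, but the memory argument is the same: naive infinite scroll that never unmounts nodes grows without bound; windowed scroll does not.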

  • Q. anyone here ever met an addict?
    A. yes you have, it's you

    Ha ha! Funny, right?
    Nope ... you're actually an addict.

    The product is designed to be addictive. So funny that the only ones who don't know are the end users.
    "users". You're a "user". I hear the denials coming so loudly.. the first sign of addiction.

    Note: I get to be smug and laugh at you because I'm not addicted. I have had to deal with addicts in my family and elsewhere. I've been addicted before to various things. No hard stuff, just regular stuff
    • by Anonymous Coward

      I'm addicted to banging your mom, and coffee, and weed, but not slashdot.

  • by 93 Escort Wagon ( 326346 ) on Tuesday January 27, 2026 @07:42PM (#65953612)

    And that's the fact that Meta didn't delete all those internal messages - court orders be damned.

  • by sdinfoserv ( 1793266 ) on Tuesday January 27, 2026 @08:33PM (#65953686)
    Because we've never ever heard companies lying about their products not being harmful...
    https://www.youtube.com/watch?... [youtube.com]
    let alone vomit up some company sponsored research to support their lies. https://www.eurekalert.org/new... [eurekalert.org]
  • Perhaps freedom of choice is too much to handle by some humans, and we should just classify all citizens into 2 classes able to handle different amounts of free will. The ones wanting a paternalistic government to protect them - told exactly what they can or cannot do by their government - so no activity allowed that is correlated with lower well-being, like having a facebook account or a credit card, or buying whatever thing is bad for you (e.g. cigarettes, burgers or other foods known to be unhealthy), t
  • supposedly provide "smoking-gun evidence" that platforms "purposefully designed their social media products to addict children and teens with no regard for known harms to their wellbeing" -- while putting increased engagement from young users at the center of their business models.

    I wonder how the social media algorithms are treated by Section 230 of the Communications Act of 1934 (47 U.S.C. 230)? The section was designed to shield the internet companies from being treated as the responsible party for, i.e. the publisher of, comments made by their users. But when you throw in an algorithm specifically designed to promote engagement by picking and choosing which posts a user will see, wouldn't this make the social media site a publisher? Or do these algorithms simply count as a form

  • "Good thing I'm not susceptible to addiction to social media!" says the guy who has had a Slashdot account for the last 21 years...
