Internal Messages May Doom Meta At Social Media Addiction Trial (arstechnica.com) 54
An anonymous reader quotes a report from Ars Technica: This week, the first high-profile lawsuit -- considered a "bellwether" case that could set meaningful precedent in the hundreds of other complaints -- goes to trial. That lawsuit documents the case of a 19-year-old, K.G.M., who hopes the jury will agree that Meta and YouTube caused psychological harm by designing features like infinite scroll and autoplay to push her down a path that she alleges triggered depression, anxiety, self-harm, and suicidality. TikTok and Snapchat were also targeted by the lawsuit, but both have settled. The Snapchat settlement came last week, while TikTok settled on Tuesday just hours before the trial started, Bloomberg reported. For now, YouTube and Meta remain in the fight. K.G.M. allegedly started watching YouTube when she was 6 years old and joined Instagram by age 11. She's fighting to claim untold damages -- including potentially punitive damages -- to help her family recoup losses from her pain and suffering, to punish social media companies, and to deter them from promoting harmful features to kids. She also wants the court to require prominent safety warnings on platforms to help parents be aware of the risks. [...]
To win, K.G.M.'s lawyers will need to "parcel out" how much harm is attributable to each platform's design features, not to the content that was targeted at K.G.M., wrote Clay Calvert, a technology policy expert and senior fellow at the American Enterprise Institute, a think tank. Internet law expert Eric Goldman told The Washington Post that detailing those harms will likely be K.G.M.'s biggest struggle, since social media addiction has yet to be legally recognized, and tracing who caused what harm may not be straightforward. However, Matthew Bergman, founder of the Social Media Victims Law Center and one of K.G.M.'s lawyers, told the Post that K.G.M. is prepared to put up this fight. "She is going to be able to explain in a very real sense what social media did to her over the course of her life and how in so many ways it robbed her of her childhood and her adolescence," Bergman said.
The research is unclear on whether social media is harmful for kids or whether social media addiction exists, Tamar Mendelson, a professor at Johns Hopkins Bloomberg School of Public Health, told the Post. And so far, research only shows a correlation between Internet use and mental health, Mendelson noted, which could doom K.G.M.'s case and others'. However, social media companies' internal research might concern a jury, Bergman told the Post. On Monday, the Tech Oversight Project, a nonprofit working to rein in Big Tech, published a report analyzing recently unsealed documents in K.G.M.'s case that supposedly provide "smoking-gun evidence" that platforms "purposefully designed their social media products to addict children and teens with no regard for known harms to their wellbeing" -- while putting increased engagement from young users at the center of their business models. Most of the unsealed documents came from Meta. An internal email shows Mark Zuckerberg decided Meta's top strategic priority was getting teens "locked in" to Meta's family of apps. Another damning document discusses allowing "tweens" to use a private mode inspired by fake Instagram accounts ("finstas"). The same document includes an admission that internal data showed Facebook use correlated with lower well-being.
Internal communications showed Meta seemingly bragging that "teens can't switch off from Instagram even if they want to" and an employee declaring, "oh my gosh yall IG is a drug," likening all social media platforms to "pushers."
Doom? (Score:5, Insightful)
Absolute worst-case scenario: a no-fault settlement agreement.
Re: (Score:2)
Re: (Score:3)
$5 discounts on a subscription for everyone...
I beg your pardon sir, is that a premium subscription?
Re: (Score:2)
weak (Score:2, Interesting)
Except they utterly failed. As soon as a slightly fresher social media stream showed up (TikTok), a billion young people transferred their attention to the new ecosystem. The effort to move involves creating a new login and password. Meta’s “lock-in
Re:weak (Score:5, Informative)
Meta's damage to society goes far beyond damaging kids' mental health. It also spreads disinformation, enables scam artists to defraud people, winks at exposing minors to sexually-explicit chatbots, and this is reflected in a whole list of media stories [skoll.ca] showing how shitty Meta is.
Re: (Score:1)
Isn't a lot of that just the consequence of communication? Where do we draw the line about blaming the tool versus the content?
(Yes, this is the Section 230 argument again.)
Re: (Score:1)
Two words: Personal responsibility
Re: (Score:2)
Individuals cannot use "personal responsibility" to combat a massive organization that hires psychologists who know exactly how to prey on human nature. It's not a fair fight.
Re: (Score:2)
Re: (Score:2)
We draw the line when the tool enables content that demonstrably causes massive harm to society and keeps enabling said harm because to do otherwise would hurt profits. Sec. 230 doesn't apply to Facebook because it does in fact moderate its content, so it's not a neutral communication provider.
Re:weak (Score:5, Insightful)
It's okay to be a violent racist: It's the victim's job to be bulletproof.
This is the Chicago ideology of "corporations are naturally immoral" applied to everything. In truth, the social contract stopped being "corporations must serve the public good", 120 years ago.
Strangely, most people believe leaving your door unlocked does not excuse burglary of your home. Likewise, Facebook/Meta should not be forgiven when they willfully abuse people.
Correction (Score:3)
Re: weak (Score:2)
Like the one that your parents signed to raise you?
Re: (Score:2)
While corporations were, in fact, good for the market (until they created a cartel/monopo
Re:weak (Score:5, Insightful)
This kid has my heartfelt sympathy, but they should lose the case
If the kid has your sympathy, doesn't that suggest that they should win the case? Not to make this particular young person rich, but to convince the tech companies that it is in their best interest to keep their products from producing more kids like the claimant. They knew it was harmful to children. They actively worked to make it more addictive and more harmful. And they did nothing to warn parents, limit children's access to the harmful product, or change the product in any way to make it safer for children. (It's not a healthy product for adults, either, but a different standard applies when it comes to adults.)
Re: (Score:2)
Re: (Score:2)
But, it’s a stretch for a psychiatrically-diagnosed 19 year old kid to blame their problems on “all the social media” and then claim that they’re entitled to “all teh $$$$$”. This kid has my heartfelt sympathy, but they should lose the case.
And yet we have ample evidence that influencing children has massive negative effects on society. We ban all sorts of mildly addictive customer-retention tactics when they are aimed at kids -- advertising McDonald's during children's shows, gambling, etc. So why give this specific one a free pass, when it uses the same gambling playbook to target an especially vulnerable group?
Now are you upset that this person may get money? Why would you
Re: (Score:2)
Meta’s goal was to lock users into their ecosystem. Um, yeah. That’s basically a rephrasing of the statement “businesses want to keep their customers”. The wording is slightly more psychopathic, but that’s about it.
Yes that's indeed the goal of every business, but what matters is, how they go about achieving that goal. If businesses try to keep customers by making better and better products, that's a socially beneficial thing, because people will have better and better stuff. If they try to keep their customers by getting them addicted and exploiting their psychological triggers, that's socially negative, because you get psychologically damaged people out of this. You can argue about the extent of the damage done, but
Re: (Score:2)
> The effort to move involves creating a new login and password.
I guess you have no social circle, otherwise you'd know that's not the case.
> it’s a stretch for a psychiatrically-diagnosed 19 year old kid to blame their problems on “all the social media”
No, it's not. The algorithms suggesting content are designed to keep you hooked and scrolling. If the social network was without suggestion algorithms, I'd probably agree with you. Algorithmically curated feeds are a big problem.
Karma (Score:2)
infinite scroll (Score:4, Funny)
Infinite? Their shitty JavaScript will crash your browser long before forever
Re: (Score:2)
> Infinite [scroll]? Their shitty JavaScript will crash your browser long before forever
For small values of infinity.
Re: (Score:2)
Errr, no, they may crash your shitty browser, which is failing to garbage-collect out-of-view rubbish, but you can very much keep scrolling the likes of Facebook forever and a day. If not in your shitty browser, then definitely in the Facebook app. (I stopped using Firefox to browse Reddit for the same reason; switching browsers fixed the problem. Hardly JavaScript's fault.)
Re: (Score:2)
I'm trying to resist Chrome because I really don't want to be even more dependent on Google.
just came here to poke my thumb in your eye (Score:1)
A. yes you have, it's you
Ha ha ! Funny right?
Nope
The product is designed to be addictive. So funny that the only ones who don't know are the end users.
"users". You're a "user". I hear the denials coming so loudly.. the first sign of addiction.
Note: I get to be smug and laugh at you because I'm not addicted. I have had to deal with addicts in my family and elsewhere. I've been addicted before to various things. No hard stuff, just regular stuff
Re: (Score:1)
I'm addicted to banging your mom, and coffee, and weed, but not slashdot.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
As an aside, like your tagline, I enjoyed 3rd rock a great deal.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
There's only one surprising thing here (Score:4, Insightful)
And that's the fact that Meta didn't delete all those internal messages - court orders be damned.
Re: (Score:2)
I believe that nicotine is not addictive (Score:3)
https://www.youtube.com/watch?... [youtube.com]
let alone vomit up some company sponsored research to support their lies. https://www.eurekalert.org/new... [eurekalert.org]
Re: (Score:2)
Re: (Score:2)
Yea, don't blame the drug pushers for your abuse
So most humans should not have free will? (Score:2)
Section 230 (Score:1)
supposedly provide "smoking-gun evidence" that platforms "purposefully designed their social media products to addict children and teens with no regard for known harms to their wellbeing" -- while putting increased engagement from young users at the center of their business models.
I wonder how social media algorithms are treated under Section 230 of the Communications Act of 1934 (47 U.S.C. 230)? The section was designed to shield internet companies from being treated as the responsible party for -- i.e., the publisher of -- comments made by their users. But when you throw in an algorithm specifically designed to promote engagement by picking and choosing which posts a user will see, wouldn't that make the social media site a publisher? Or do these algorithms simply count as a form
Addiction? (Score:2)