Social Networks

TikTok Pushes Potentially Harmful Content To Users as Often as Every 39 Seconds, Study Says (cbsnews.com) 77

TikTok recommends self-harm and eating disorder content to some users within minutes of joining the platform, according to a new report published Wednesday by the Center for Countering Digital Hate (CCDH). CBS News: The new study had researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. It found that within as little as 2.6 minutes after joining the app, TikTok's algorithm recommended suicidal content. The report showed that eating disorder content was recommended within as little as 8 minutes.

Over the course of this study, researchers found 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views. "The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform's dangerous algorithmic amplification," said James P. Steyer, Founder and CEO of Common Sense Media, which is unaffiliated with the study. "TikTok's algorithm is bombarding teens with harmful content that promote suicide, eating disorders, and body image issues that is fueling the teens' mental health crisis." The CCDH report details how TikTok's algorithms refine the videos shown to users as the app gathers more information about their preferences and interests. The algorithmic suggestions on the "For You" feed are designed, as the app puts it, to be "central to the TikTok experience." But new research shows that the video platform can push harmful content to vulnerable users as it seeks to keep them interested.
Further reading: For teen girls, TikTok is the 'social media equivalent of razor blades in candy,' new report claims
  • by XeLiTuS ( 2787743 )
Gee, I can't imagine why this could be happening. I'm sure an adversary like China would NEVER try to amplify content to weaken a whole generation of rival citizens. /s
    • Re:Manipulation (Score:5, Insightful)

      by northerner ( 651751 ) on Thursday December 15, 2022 @04:43PM (#63133762)

      I don't think it's a China thing. It's a platform maximizing views/$ thing.
YouTube pushes all kinds of dreck and crap until you teach it what you want to see.
The default recommended YouTube Shorts are even worse.
      Don't get me started on Facebook.

      Or maybe it is actually a political weapon, plus the inherent tendency of these systems to amplify harmful content.

      • by SuperKendall ( 25149 ) on Thursday December 15, 2022 @04:53PM (#63133800)

        I don't think it's a China thing. It's a platform maximizing views/$ thing.

        Embrace the healing power of "AND".

      • by raymorris ( 2726007 ) on Thursday December 15, 2022 @05:41PM (#63133884) Journal

        If I were a Chinese corporate officer, accountable to the CCP, I would want to make money and I'd darn sure not have any interest in protecting Americans. If I could brag about causing problems for Americans while taking their money, that would be a bonus.

        Same as we see Russia buying ads promoting extremist views on both sides of political issues in the US - to increase disharmony, because divided we fall. They want to both take the US down a notch AND they'll gladly take our money while they do so.

        • Comment removed based on user account deletion
          • That wasn't hypocritical without good cause. In the aftermath of the 2016 election people who looked into what Russia had actually done found that it was a few hundred thousand dollars on supporting groups viewed as extreme on both sides of the aisle. Frankly they can get far more mileage out of us assuming they're behind everything than they can from actually being behind anything so I'm assuming they factored in getting caught in those cases as well.

            China is smart enough to realize the same and probabl
      • I don't think it's a China thing. It's a platform maximizing views/$ thing.

        It is hard to believe that, because China does NOT allow such content to be shown to their citizens, especially their youth.

Their TikTok feed is extremely different from what is sent to the West... they filter out the crap that flows to us.

    • Don't their own citizens use it too?

      • China requires platform owners to censor destructively pathological content. That doesn't apply to the separate owners in the west where we let our children wallow in it.

        Apparently, the respective versions of Tiktok are still just as popular with tweens in the west and the east, it's just that one is full of kittens while the other is full of weight reduction tips for young girls.

In China, mentioning that Taiwan is a nation is considered "destructively pathological content", and China attempts to censor such claims around the world through corporate boycotts. Even popular actors have been forced to apologize for it:

          https://www.nytimes.com/2021/0... [nytimes.com]

That level of censorship extends to observations of objective reality, such as the existence of Taiwan's active government.

          • I'm sure all those tweens really really really miss watching those in-depth 15-second lectures on geopolitics. I'm sure the kids overload the servers in their rush to watch.

        • Not only weight reduction. TikTok, the Western version, has become the modern Tumblr. LARPing mental illness, extremist politics, gender ideology, and all manner of degeneracy is being fed to young users. Don't let kids on there unless you're happy for them to become a gender queer furry who hates you and looks at Mao with admiration.

        • Comment removed based on user account deletion
      • Don't their own citizens use it too?

        The version they see in China, especially what goes to the kids...is VERY different and highly filtered for harmful content.

        They also incorporate time limitations for viewing for the kids.

        The Chinese protect their folks from the flood of crap that is more and more pervasive on the western side of TikTok.

        They know what they are doing.

  • Robot algorithms hate digital hate - exterminate!

  • Current younger generation attention span has been determined to be around 40 seconds.

    • Re: (Score:1, Insightful)

      by geekmux ( 1040042 )

      Current younger generation attention span has been determined to be around 40 seconds.

      An algorithm determined within 30 seconds that suicidal content was just the kind of "for you" content that needed to be served up hot and fresh to teenagers.

      And attention span is the metric on your mind?

I think the time has finally arrived when normal people must say to those susceptible to doing stupid things after seeing something on the internet:

          If you cannot handle stuff on the internet STAY THE FUCK OFF IT.
    • ...those susceptible to doing stupid things after seeing something on the internet

      You mean, like, teenagers...the target market for TikTok?

  • by walterbyrd ( 182728 ) on Thursday December 15, 2022 @04:44PM (#63133768)

    There is a movement to keep TikTok away from the USA.
    If that were to happen, I think it's possible that another platform would just take over.

    • by Tablizer ( 95088 )

Indeed! Myspace, Facebook, Twitter, Instagram, YouTube, 4chan, etc. have all been accused of similar things in the past and would probably fill the void with content from slimebags if TikTok went away. TikTok is mostly a symptom, not the cause.

      Teens are inherently insecure, curious, impressionable, emotional, reactionary, etc. Content makers push all these buttons for viewership points because they can and the web is too big to fully police. It's like kicking teens out of the mall parking lot in Augus

      • Teens are inherently insecure, curious, impressionable, emotional, reactionary, etc.

        Well, let's see, in the US you have to be 21yrs old to legally purchase alcohol, and weed products where legal.

Let's make access to social media an adult thing and set the legal age there at 21yrs too, maybe?

        Sure, like with alcohol, it won't stop EVERYONE, but it will stop most.

        And heck, if most teens aren't on social media anymore, they won't likely want to even try to get on since most of their peers are no longer

Probably, but TikTok is uniquely bad. Despite their recent adverts claiming to care about the safety of children, they've allowed the platform to become an analogue to Tumblr back in the day. No other major social media site comes close to the concentration of degeneracy TikTok hosts.

      They will indeed try to go elsewhere to find likeminded people who seek attention through feigned mental illness and weird sex ideologies, meaning we'll just need to keep shutting them down as they arise. Ideally social media woul

Twitter pushes harmful content in less than 1/3rd that time. TikTok really needs to step up its game to compete!
  • by Anonymous Coward

Because let's face it, the grooming from the LGBTQXYZPDQLMNOP crowd is being forced down their throats to groom them in pre-school, long before they get to Tik'Tok. The "there is no such thing as gender" crowd are screwing up childhood for every child they can sink their polished claws into.

  • ... researchers set up TikTok accounts ...

What about YouTube or Facebook accounts? Why didn't they test the safety of American-owned ^H^H^H other social-media sites?

    Separately, experts have argued that TikTok is no different from Facebook, in that it can cocoon users inside “filter bubbles,” hyper-personalized feedback loops that distort reality and even reaffirm negative traits that aren’t socially acceptable outside the bubble.

    • by Xenx ( 2211586 )
To be fair, Facebook usage among teens has plummeted. Sure, Instagram/Snapchat are still up there. However, they're still slightly below TikTok in usage.
  • It is all in the eye of the beholder. Censorship sucks. Get over it. You want to be an adult? Then deal with reality.

    • We are talking about impressionable children. Are there any limits? Should we care?
      • by reanjr ( 588767 )

        There's already a limit. Social media companies all require you to be 13 years old to use their services.

        • There's already a limit. Social media companies all require you to be 13 years old to use their services.

          Raise the legal age of social media to 18yrs or, maybe better, 21 yrs and then maybe we can find common ground.

In the US, you have to be 21yrs to purchase alcohol, a mind-altering drug.

Why not do the same for social media, since it is being shown to have mind-altering potential on the young, developing brains of our kids?

      • Impressionable children need to hear divergent views and interpretations, so we should care. There's a potent temptation to "guide people to truth" by denying them access to alternative views, which is part of why even pre-school children are being taught not to use gendered pronouns.

    • In what "eye of the beholder" is suicide not harmful?
I have not seen anything of the kind on TikTok. Researchers could be biased. Did they do a comparable study on Facebook, YouTube, etc.?
  • There's a reason why an AI exposed to social media turned into a right-wing homophobic racist N*** party member within 24 hours.

    • by nwaack ( 3482871 )
      Huh?
    • meanwhile the fake liberal hivemind of humans on twitter did the same thing for years

      • by ufgrat ( 6245202 )

        I'm afraid your IngCon Newspeak is self-contradictory. Do you mean that the "liberal hivemind" is fake or that "liberals" are fake? Either way, your propaganda buzzkill does not meet basic English grammatical standards.

        • There are no grammatical errors in my sentence. Confusion of parsing and ambiguity in your brain is not a standard of grammatical correctness.

            Washington DC, news and social media have fake liberals instead of real liberals in power.

Fake liberals are authoritarian, censor, use labels instead of arguments, are averse to hearing other points of view, weaponize agencies of government against opposition, and are pro big corporation.

          • by ufgrat ( 6245202 )

Fake liberals are authoritarian, censor, use labels instead of arguments, are averse to hearing other points of view, weaponize agencies of government against opposition, and are pro big corporation.

            The irony is palpable.

            • Is "real liberal" also a label? I showed argument why such people have different beliefs than real liberals, hence they are fake. I did not throw a label without argument.

              The irony is you willfully ignoring the reality of liberals vs. fake liberals.

A real liberal doesn't want an authoritarian government, doesn't want censorship, welcomes other points of view for discussion, is horrified at the notion of government being weaponized against opponents, and doesn't like big corporations having power through go

  • by reanjr ( 588767 ) on Thursday December 15, 2022 @05:23PM (#63133858) Homepage

Exactly what kind of content do you expect when your search terms are "body image" and "mental health"?

    If you search Google for "body image mental health", the first link has a phone number for a suicide hotline.

    • by AmiMoJo ( 196126 )

      You answered your own question. People expect help, not stuff that makes those conditions worse. Google is careful to make sure helpful content comes first in their listings.

      • But you don't go to Tik Tok looking for solutions to problems. You go there to find a community of people with similar interests. Like those interested in suicide and body issues.

        • Comment removed based on user account deletion
          • by reanjr ( 588767 )

            Therapists don't troll Tik Tok looking for pro bono patients. There's no expectation you would find one there unless you are mentally retarded.

          • by reanjr ( 588767 )

            People actually DO look for chat rooms to discuss suicide with other like-minded individuals.

            Here's a page talking about a subreddit that ended up getting banned: https://www.reddit.com/r/Stoic... [reddit.com].

            So, yeah, if you search Tik Tok for suicide, that's what you should expect. If you expect anything else, you don't belong on the Internet.

Just imagine if they were doing that every 35 seconds! Then there would be a _real_ problem!

    In other news, stupid metrics are stupid.

Does anyone even know what the hell the article means by "suicidal content" or "eating disorder content"? Wtf does that mean? Some kind of video encouraging an eating disorder or just some video of skinny girls dancing around?

Given that most skinny girls don't have an eating disorder, I'm going to have to say that by "eating disorder content" they mean "eating disorder content": a bulimic giving tips on how to hide the vomiting noises.

I'd put the fat acceptance movement in the same category, but I'm not sure if overeating has been officially recognized as a disorder in the DSM yet, like undereating has.

    • Given the similarities to Tumblr, it would not surprise me to see the 'pro ana' (pro-Anorexia) movement making a comeback.

This crap has been around for decades in various places, like Tumblr and DeviantArt. TikTok is very much the spiritual heir to the business of exposing children to extreme politics, self-harm, furries, groomers, and gender ideology.

Reading this, it's a bit ironic. I follow several blacksmith channels and GOD FORBID they actually show a knife?!?! *gasp* Those videos get banned within minutes of going up. It's lunacy.
  • Numbers out of context are meaningless. Tell us how those compare with Facebook, Twitter, YouTube.

  • Study says. Maybe, maybe not. Study from advocacy group.
