TikTok Pushes Potentially Harmful Content To Users as Often as Every 39 Seconds, Study Says (cbsnews.com) 77
TikTok recommends self-harm and eating disorder content to some users within minutes of joining the platform, according to a new report published Wednesday by the Center for Countering Digital Hate (CCDH). CBS News: The new study had researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. It found that within as little as 2.6 minutes after joining the app, TikTok's algorithm recommended suicidal content. The report showed that eating disorder content was recommended within as little as 8 minutes.
Over the course of this study, researchers found 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views. "The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform's dangerous algorithmic amplification," said James P. Steyer, Founder and CEO of Common Sense Media, which is unaffiliated with the study. "TikTok's algorithm is bombarding teens with harmful content that promote suicide, eating disorders, and body image issues that is fueling the teens' mental health crisis." The CCDH report details how TikTok's algorithms refine the videos shown to users as the app gathers more information about their preferences and interests. The algorithmic suggestions on the "For You" feed are designed, as the app puts it, to be "central to the TikTok experience." But new research shows that the video platform can push harmful content to vulnerable users as it seeks to keep them interested. Further reading: For teen girls, TikTok is the 'social media equivalent of razor blades in candy,' new report claims
Manipulation (Score:2, Funny)
Re:Manipulation (Score:5, Insightful)
I don't think it's a China thing. It's a platform maximizing views/$ thing.
Youtube pushes all kinds of dreck and crap until you teach it what you want to see.
The default recommended Youtube Shorts are even worse.
Don't get me started on Facebook.
Or maybe it is actually a political weapon, plus the inherent tendency of these systems to amplify harmful content.
Re:Manipulation (Score:4, Funny)
I don't think it's a China thing. It's a platform maximizing views/$ thing.
Embrace the healing power of "AND".
Both. China not trying to protect Americans! (Score:4, Interesting)
If I were a Chinese corporate officer, accountable to the CCP, I would want to make money and I'd darn sure not have any interest in protecting Americans. If I could brag about causing problems for Americans while taking their money, that would be a bonus.
Same as we see Russia buying ads promoting extremist views on both sides of political issues in the US - to increase disharmony, because divided we fall. They want to both take the US down a notch AND they'll gladly take our money while they do so.
Re: (Score:1)
Re: (Score:2)
China is smart enough to realize the same and probabl
Re: (Score:2)
It is hard to believe that, because China does NOT allow such content to be shown to their citizens, especially their youth.
Their TikTok feed is extremely different from what is sent to the West... they filter out the crap that flows to us.
Re: (Score:2)
Don't their own citizens use it too?
Re: (Score:2)
China requires platform owners to censor destructively pathological content. That doesn't apply to the separate owners in the west where we let our children wallow in it.
Apparently, the respective versions of TikTok are still just as popular with tweens in the West and the East; it's just that one is full of kittens while the other is full of weight-reduction tips for young girls.
Re: (Score:2)
In China, mentioning that Taiwan is a nation is considered "destructively pathological content", and China attempts to censor such claims around the world through corporate boycotts. Even popular actors have been forced to apologize for it:
https://www.nytimes.com/2021/0... [nytimes.com]
That level of censorship targets observations of objective reality, such as the existence of Taiwan's active government.
Re: (Score:2)
I'm sure all those tweens really really really miss watching those in-depth 15-second lectures on geopolitics. I'm sure the kids overload the servers in their rush to watch.
Re: Manipulation (Score:3)
Not only weight reduction. TikTok, the Western version, has become the modern Tumblr. LARPing mental illness, extremist politics, gender ideology, and all manner of degeneracy are being fed to young users. Don't let kids on there unless you're happy for them to become a gender queer furry who hates you and looks at Mao with admiration.
Re: (Score:1)
Re: (Score:2)
The version they see in China, especially what goes to the kids...is VERY different and highly filtered for harmful content.
They also incorporate time limitations for viewing for the kids.
The Chinese protect their folks from the flood of crap that is more and more pervasive on the western side of TikTok.
They know what they are doing.
Digital Hate (Score:1)
Robot algorithms hate digital hate - exterminate!
e.g. (Score:2)
The current younger generation's attention span has been determined to be around 40 seconds.
Re: (Score:1, Insightful)
The current younger generation's attention span has been determined to be around 40 seconds.
An algorithm determined within 30 seconds that suicidal content was just the kind of "for you" content that needed to be served up hot and fresh to teenagers.
And attention span is the metric on your mind?
Time for the norms to lay it out straight. (Score:2, Insightful)
If you cannot handle stuff on the internet STAY THE FUCK OFF IT.
Re: (Score:3)
...those susceptible to doing stupid things after seeing something on the internet
You mean, like, teenagers...the target market for TikTok?
Re: (Score:3)
Re: (Score:2)
No, susceptible teenagers.
From this link - https://www.newportacademy.com... [newportacademy.com]:
"Lack of frontal lobe maturity catalyzes a variety of teen behaviors. That’s because the prefrontal cortex is involved with a wide range of functions, known as executive functions. These include the following: Complex decision-making Planning skills Impulse control Emotional reactions Focusing attention Prioritizing competing information received all at once The ability to ignore external distractions. Therefore, children
Re: (Score:2)
NOPE.
2. No superiority was implied.
My position is that a wide range of behaviors are due to something wrong in a person's brain and not their choice.
I feel that some murderers, for example, are just not able to control themselves. So hate - or superiority - is misplaced.
But that does not mean they should not be locked up. Quite the opposite.
Of course, if psychology were a science it could tell us this.
But it's a shitty, touchy-feely social art and can tell us nothing.
Re: (Score:2)
1. Is psychology a science now? NOPE.
I think the link I posted falls into the 'settled science' portion of neurology, rather than into the realm of psychology.
Re: (Score:2)
The statement quoted makes the claim that because some things are done in the frontal lobe and the frontal lobe matures later, those things "result in risky teenage behavior".
That is pseudoscience. That "consequently" is bullshit.
And the fact that they tag on "Moreover, teens are also dealing with hormones as a result of puberty" further shows up their nonsense pseudoscience.
TikTok closed, another platform takes over (Score:4, Interesting)
There is a movement to keep TikTok away from the USA.
If that were to happen, I think it's possible that another platform would just take over.
Re: (Score:2)
Indeed! Myspace, Facebook, Twitter, Instagram, YouTube, 4chan, etc. have all been accused of similar things in the past and would probably fill the void of having content from slimebags if TikTok went away. TikTok is mostly a symptom, not the cause.
Teens are inherently insecure, curious, impressionable, emotional, reactionary, etc. Content makers push all these buttons for viewership points because they can and the web is too big to fully police. It's like kicking teens out of the mall parking lot in Augus
Re: (Score:2)
Well, let's see, in the US you have to be 21 yrs old to legally purchase alcohol, and weed products where legal.
Let's make access to social media an adult thing and set the legal age there to 21 yrs too, maybe?
Sure, like with alcohol, it won't stop EVERYONE, but it will stop most.
And heck, if most teens aren't on social media anymore, they won't likely want to even try to get on since most of their peers are no longer
Re: TikTok closed, another platform takes over (Score:2)
Probably, but TikTok is uniquely bad. Despite their recent adverts claiming to care about the safety of children, they've allowed the platform to become an analogue of Tumblr back in the day. No other major social media site comes close to the concentration of degeneracy TikTok hosts.
They will indeed try to go elsewhere to find like-minded people who seek attention through feigned mental illness and weird sex ideologies, meaning we'll just need to keep shutting them down as they arise. Ideally social media woul
That's terrible (Score:1)
Do they include gender swapping body image issues? (Score:2, Interesting)
Because let's face it, the grooming from the LGBTQXYZPDQLMNOP crowd is being forced down their throats to groom them in pre-school, long before they get to Tik'Tok. The "there is no such thing as gender" crowd are screwing up childhood for every child they can sink their polished claws into.
Re: Do they include gender swapping body image iss (Score:2)
Re: Do they include gender swapping body image is (Score:3)
Sure, yet we can't ignore the similarities to Munchausen syndrome by proxy, and how gender confusion appears to arise more among left-wing people. People who eat meat aren't the ones who end up with vegan cats.
The medical community is still groping in the dark, using subjective and poorly tested approaches to treatment. Better that the child grow to adulthood, at which point the decision is theirs - assuming it wasn't simply a phase or a symptom of an underlying condition.
Facebook and Instagram does more than that (Score:2)
Center for Countering Digital Hate? (Score:1)
All about the USA (Score:2)
What about Youtube or Facebook accounts? Why didn't they test the safety of American-owned ^H^H^H other social-media sites?
Separately, experts have argued that TikTok is no different from Facebook, in that it can cocoon users inside “filter bubbles,” hyper-personalized feedback loops that distort reality and even reaffirm negative traits that aren’t socially acceptable outside the bubble.
Re: (Score:2)
Isn't Everything Potentially Harmful? (Score:2)
It is all in the eye of the beholder. Censorship sucks. Get over it. You want to be an adult? Then deal with reality.
Re: (Score:3)
Re: (Score:2)
There's already a limit. Social media companies all require you to be 13 years old to use their services.
Re: (Score:2)
Raise the legal age of social media to 18 yrs or, maybe better, 21 yrs and then maybe we can find common ground.
In the US, you have to be 21 yrs old to purchase alcohol, a mind-altering drug.
Why not do the same for social media, since it is being shown to have mind-altering potential on the young, developing brains of our kids.
Re: (Score:3)
Impressionable children need to hear divergent views and interpretations, so we should care. There's a potent temptation to "guide people to truth" by denying them access to alternative views, which is part of why even pre-school children are being taught not to use gendered pronouns.
Re: (Score:1)
Liar.
Re: (Score:2)
Thank you for challenging claims, and allowing them to be examined and verified. There is an article about the practice at https://www.bbc.com/worklife/a... [bbc.com] .
Re: (Score:3)
Re: (Score:2)
In a suicider's one.
Re: (Score:1)
people don't have the right to off themselves? Is that your religion?
Re: (Score:2)
Canada runs suicide ads on TV. (Score:2)
Social Media (Score:2)
There's a reason why an AI exposed to social media turned into a right-wing homophobic racist N*** party member within 24 hours.
Re: (Score:2)
Re: (Score:2)
Oh. Story from a few years ago.
https://www.cbsnews.com/news/m... [cbsnews.com]
Microsoft turned Twitter loose on a defenseless chat bot, and the results were educational.
Re: (Score:1)
Meanwhile the fake liberal hivemind of humans on Twitter did the same thing for years.
Re: (Score:2)
I'm afraid your IngCon Newspeak is self-contradictory. Do you mean that the "liberal hivemind" is fake or that "liberals" are fake? Either way, your propaganda buzzkill does not meet basic English grammatical standards.
Re: (Score:1)
There are no grammatical errors in my sentence. Confusion of parsing and ambiguity in your brain is not a standard of grammatical correctness.
Washington DC, the news media, and social media have fake liberals instead of real liberals in power.
Fake liberals are authoritarian, censor, use labels instead of arguments, are averse to hearing other points of view, weaponize agencies of government against opposition, and are pro big corporation.
Re: (Score:2)
Fake liberals are authoritarian, censor, use labels instead of arguments, are averse to hearing other points of view, weaponize agencies of government against opposition, and are pro big corporation.
The irony is palpable.
Re: (Score:1)
Is "real liberal" also a label? I showed argument why such people have different beliefs than real liberals, hence they are fake. I did not throw a label without argument.
The irony is you willfully ignoring the reality of liberals vs. fake liberals.
A real liberal doesn't want an authoritarian government, doesn't want censorship, welcomes other points of view for discussion, is horrified at the notion of government being weaponized against opponents, and doesn't like big corporations having power through go
works as expected (Score:3)
Exactly what kind of content do you expect when your search terms are "body image" and "mental health"?
If you search Google for "body image mental health", the first link has a phone number for a suicide hotline.
Re: (Score:2)
You answered your own question. People expect help, not stuff that makes those conditions worse. Google is careful to make sure helpful content comes first in their listings.
Re: works as expected (Score:2)
But you don't go to Tik Tok looking for solutions to problems. You go there to find a community of people with similar interests. Like those interested in suicide and body issues.
Re: (Score:2)
Re: (Score:2)
Therapists don't troll Tik Tok looking for pro bono patients. There's no expectation you would find one there unless you are mentally retarded.
Re: (Score:2)
People actually DO look for chat rooms to discuss suicide with other like-minded individuals.
Here's a page talking about a subreddit that ended up getting banned: https://www.reddit.com/r/Stoic... [reddit.com].
So, yeah, if you search Tik Tok for suicide, that's what you should expect. If you expect anything else, you don't belong on the Internet.
Dodged a bullet there! (Score:2)
Just imagine if they were doing that every 35 seconds! Now that would be a _real_ problem!
In other news, stupid metrics are stupid.
Wtf (Score:2)
Does anyone even know what the hell the article means by "suicidal content" or "eating disorder content"? Wtf does that mean? Some kind of video encouraging an eating disorder or just some video of skinny girls dancing around?
Re: (Score:2)
Given that most skinny girls don't have an eating disorder, I'm going to have to say that by "eating disorder content" they mean "eating disorder content." A bulimic giving tips on how to hide the vomiting noises.
I'd put the fat acceptance movement in the same category, but I'm not sure if overeating has been officially recognized as a disorder in the DSM yet, like undereating has.
Re: Wtf (Score:2)
Given the similarities to Tumblr, it would not surprise me to see the 'pro ana' (pro-Anorexia) movement making a comeback.
This crap has been around for decades in various places, like Tumblr and Deviant Art. TikTok is very much the spiritual heir to the business of exposing children to extreme politics, self-harm, furries, groomers, and gender ideology.
Genuinely surprising (Score:2)
Relative (Score:1)
Numbers out of context are meaningless. Tell us how those compare with Facebook, Twitter, and YouTube.
Study says. (Score:2)
Study says. Maybe, maybe not. Study from advocacy group.