Instagram's Recommendation Algorithms Are Promoting Pedophile Networks (theverge.com) 61
According to a joint investigation from The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst, Instagram's algorithms are actively promoting networks of pedophiles who commission and sell child sexual abuse content on the app. The Verge reports: Accounts found by the researchers are advertised using blatant and explicit hashtags like #pedowhore, #preteensex, and #pedobait. They offer "menus" of content for users to buy or commission, including videos and imagery of self-harm and bestiality. When researchers set up a test account and viewed content shared by these networks, they were immediately recommended more accounts to follow. As the WSJ reports: "Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children."
In addition to problems with Instagram's recommendation algorithms, the investigation also found that the site's moderation practices frequently ignored or rejected reports of child abuse material. The WSJ recounts incidents where users reported posts and accounts containing suspect content (including one account that advertised underage abuse material with the caption "this teen is ready for you pervs") only for the content to be cleared by Instagram's review team or for the reporter to be told in an automated message [...]. The report also looked at other platforms but found them less amenable to growing such networks. According to the WSJ, the Stanford investigators found "128 accounts offering to sell child-sex-abuse material on Twitter, less than a third the number they found on Instagram" despite Twitter having far fewer users, and that such content "does not appear to proliferate" on TikTok. The report noted that Snapchat did not actively promote such networks as it's mainly used for direct messaging.
In response to the report, Meta said it was setting up an internal task force to address the issues raised by the investigation. "Child exploitation is a horrific crime," the company said. "We're continuously investigating ways to actively defend against this behavior." Meta noted that in January alone it took down 490,000 accounts that violated its child safety policies and over the last two years has removed 27 pedophile networks. The company, which also owns Facebook and WhatsApp, said it's also blocked thousands of hashtags associated with the sexualization of children and restricted these terms from user searches.
SNAFU? (Score:4, Insightful)
Re: (Score:2)
Feminism and critical theory are both aspects of modernist philosophy, not post-modern philosophy. They are actually quite opposed to post-modernism. But you don't actually know what any of those words mean. You're just a parrot repeating things you heard on the TV that was left on in the living room to distract you from the fact that you don't have any meaningful social structure around you.
terrible but hey cda-230 so (Score:3)
consequence free!
Um... you know the FBI lets these swim (Score:5, Interesting)
The only problem here is their algorithm is dumber than the pedos and didn't hide them after the FBI was done with them.
Re: (Score:1)
These people are the 1% who are all in a child sex slave canal intent on keeping incomes down so they can buy our children! Right? /s
Re: (Score:2)
These people are the 1% who are all in a child sex slave canal
Fortunately, barges move pretty slowly. Should be easy enough to catch them.
Re: (Score:2)
children are likely to be taken advantage of even by adults we might describe as 'dumb'.
So no I don't want dumb pedos. I don't want them to have ANY platform to publish their filth or get inspired to rape some kid.
This shit needs to be shut down. CDA-230 needs to go and people hosting this stuff need to take responsibility - yes even if it breaks the entire net.
Re: (Score:3)
I disagree with this. I don't think they need inspiration (or at least I don't know that they do); it happened before the internet, and historically it was even socially acceptable to have sex with 12 year olds. Where is the proof that this increases the instances of abuse?
Without proof all we are doing are stating a hypothesis and then going into a panic trying to stop something that may or may not increase the rate of pedophilia by some unknown amount.
Here are some unsupported inferences you can make: It decreases actual a
Re: (Score:2)
because you can catch them and toss 'em in jail.
Ummm... these people are obviously not wanted by society... so why imprison them? Why not just execute them? Obviously, they can never be let out of jail due to their proclivities, so seriously, why not just execute them and save everyone the hassle?
I can hear your response, "we are a civilized society. we don't just kill people", and my response to that is, "you are not a civilized society, you can't even provide food and shelter to people who actively WANT to participate positively in society", so fuck th
Re: (Score:3)
and that such content "does not appear to proliferate" on TikTok.
Yes, but TikTok is run by godless heathen Commies while Instagram is run by honest god-fearing law-abiding Muricans, so it's all right then.
Re: (Score:2)
As it should be. We don't want platforms to have to police their content so carefully in case some reference to something illegal slips by. That would be devastating for internet freedoms
Re: (Score:1)
Sounds like a woke leftist paradise.
Gay, Trans, Bi and now MAPs.
Yup. All those gay [imgur.com], trans [imgur.com], bi [imgur.com], and MAP [imgur.com] folks from the "left".
Re: (Score:1)
4 links about nobodies with 3 of unknown political beliefs is evidence that right wingers are all child rapists.
Lol, can't argue with that "logic"!
Re:Not surprised. (Score:4, Insightful)
Let's see. Two are avowed magas, one is a priest, and the last is a police officer. Clearly these people are "leftists".
It's always amazing, when accusations are pointed out as being confessions, how people will jump through hoops to escape being hoist by their own petard.
Re: (Score:1)
Let's see, 1 is a member of a conservative group. The others are of unknown political beliefs.
There are liberal cops, priests, and so on. You can not determine someone's political beliefs solely from their career path.
So, you have 1 anecdotal right winger, 3 unknowns, and no data.
Wow, amazingly timed? (Score:1)
Re: (Score:1)
Certain parties were possibly sitting on this info for a while, but let's be honest, Meta could have stepped in to fix the problem before they were exposed publicly.
Re: Wow, amazingly timed? (Score:1)
Re: (Score:1)
Producer or provider? Are you saying the FBI creates original child porn for the internet?
Re: (Score:2)
It doesn't matter. If someone were running a honeypot on your social media network would you cooperate with it?
Would you even know it was a honeypot?
Article is paywalled (Score:2)
If you see a story about something scary the 1st thing you should do is find out what the source is and the second thing you should do is find out when the source 1st reported. If it was a while ago then you're probably being rage baited.
"Child exploitation is a horrible crime" (Score:5, Informative)
NAMBLA says (Score:1)
Re: (Score:1)
Ah, the right projecting again, truth is here:
https://ips-dc.org/the-global-... [ips-dc.org]
Re: NAMBLA says (Score:2)
They're obsessed because there's so much child diddling going on in uptight conservative communities. Get diddled and you either end up a diddler or an anti-diddling activist.
Re: (Score:2)
I think the people have a bias of desire in favor of young. We've also decided that this is something that should be suppressed. (Lots of reasons.) Some folks, though, think that the profit from exploiting the desire is more important.
This is a tension that will probably exist as long as we try to protect children from exploitation. That seems, to me, a worthy goal, so it's not unexpected that there will be the need to enforce the rules.
There's a real problem at the transition ages, where we keep making
Re: (Score:1)
Surveys of American men (sorry I've never seen a global survey) say men of all ages have highest preference by far for women 22-24. Not under 18. Adults wanting to fuck children is not the norm.
Many places already protect the 18-17 year olds. That's not what this is about.
Fucking children is just wrong and the people doing that shit need to be stopped.
Re: (Score:2)
In other news, a survey of secret alcoholics suggests that they've got their drinking totally under control, and a survey of everyone in the elevator confirms that nobody farted.
Re: (Score:1)
Context: META ignoring online Child Sexual Abuse Material (CSAM)
So is this really a problem? [...] FWIW just because I wank to videos of MILFs doesn't mean I'm gonna rape or harass women in real life.
Translation:
So is this really a problem? [...] FWIW just because I wank to videos of children doesn't mean I'm gonna rape or harass children in real life.
To answer the pedo AC's question: Yes, it is a problem. Easy access to CSAM normalizes child sexual abuse. That puts more children at risk. e.g. "Boys will be boys" has excused a lot of sexual assault, empowering abusers while dismissing and shaming victims.
Re: (Score:1)
Also, demand creates supply.
If there wasn't an audience of buyers there'd be a lot less child rape on video getting produced and posted.
Instead they shut down (Score:2)
and harass glass pipe makers
Hm. (Score:2)
Makes you think... (Score:2)
If a recommendation algo recommends prohibited content, one has to wonder what the service is generally used for.
Hashtags seem like a great way to find Pedos (Score:1)
Re: (Score:2)
Religious Backlash (Score:1)
Is Instagram going to ban Christianity?
simplistic (Score:1)
Re: (Score:1)
TFS kinda smells like teen yoga or something, I don't see the word porn or illegal. Unsavory, and arguably something the site should Do Something About, but not yet a legal matter.
Then again it's hard to tell through the mediaspeak.