
Instagram's Recommendation Algorithms Are Promoting Pedophile Networks (theverge.com)

According to a joint investigation from The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst, Instagram's algorithms are actively promoting networks of pedophiles who commission and sell child sexual abuse content on the app. The Verge reports: Accounts found by the researchers are advertised using blatant and explicit hashtags like #pedowhore, #preteensex, and #pedobait. They offer "menus" of content for users to buy or commission, including videos and imagery of self-harm and bestiality. When researchers set up a test account and viewed content shared by these networks, they were immediately recommended more accounts to follow. As the WSJ reports: "Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children."

In addition to problems with Instagram's recommendation algorithms, the investigation also found that the site's moderation practices frequently ignored or rejected reports of child abuse material. The WSJ recounts incidents where users reported posts and accounts containing suspect content (including one account that advertised underage abuse material with the caption "this teen is ready for you pervs"), only for the content to be cleared by Instagram's review team or for the reporter to be told in an automated message [...]. The report also looked at other platforms but found them less amenable to growing such networks. According to the WSJ, the Stanford investigators found "128 accounts offering to sell child-sex-abuse material on Twitter, less than a third the number they found on Instagram" despite Twitter having far fewer users, and that such content "does not appear to proliferate" on TikTok. The report noted that Snapchat did not actively promote such networks as it's mainly used for direct messaging.

In response to the report, Meta said it was setting up an internal task force to address the issues raised by the investigation. "Child exploitation is a horrific crime," the company said. "We're continuously investigating ways to actively defend against this behavior." Meta noted that in January alone it took down 490,000 accounts that violated its child safety policies, and that over the last two years it has removed 27 pedophile networks. The company, which also owns Facebook and WhatsApp, said it has also blocked thousands of hashtags associated with the sexualization of children and restricted these terms from user searches.


Comments:
  • SNAFU? (Score:4, Insightful)

    by gosso920 ( 6330142 ) on Wednesday June 07, 2023 @09:30PM (#63584822)
    No, it's working like normal.
  • by DarkOx ( 621550 ) on Wednesday June 07, 2023 @09:33PM (#63584830) Journal

    consequence free!

    • by rsilvergun ( 571051 ) on Thursday June 08, 2023 @12:33AM (#63585014)
      so they can go arrest the people who join, right? You don't want these to be invisible. Dumb pedos are the kind you want because you can catch them and toss 'em in jail.

      The only problem here is that their algorithm is dumber than the pedos and didn't hide them after the FBI was done with them.
      • These people are the 1% who are all in a child sex slave canal intent on keeping incomes down so they can buy our children! Right? /s

        • These people are the 1% who are all in a child sex slave canal

          Fortunately, barges move pretty slowly. Should be easy enough to catch them.

      • by DarkOx ( 621550 )

        Children are likely to be taken advantage of even by adults we might describe as 'dumb'.

        So no, I don't want dumb pedos. I don't want them to have ANY platform to publish their filth or get inspired to rape some kid.

        This shit needs to be shut down. CDA-230 needs to go and people hosting this stuff need to take responsibility - yes even if it breaks the entire net.

        • I disagree with this. I don't think they need inspiration (or at least we don't know that they do); it happened before the internet, and historically it was even socially acceptable to have sex with 12-year-olds. Where is the proof that this increases the instances of abuse?

          Without proof, all we are doing is stating a hypothesis and then going into a panic trying to stop something that may or may not increase the rate of pedophilia by some unknown amount.

          Here are some unsupported inferences you can make: It decreases actual a

      • because you can catch them and toss 'em in jail.

        Ummm... these people are obviously not wanted by society... so why imprison them? Why not just execute them? Obviously, they can never be let out of jail due to their proclivities, so seriously, why not just execute them and save everyone the hassle?

        I can hear your response, "we are a civilized society. we don't just kill people", and my response to that is, "you are not a civilized society, you can't even provide food and shelter to people who actively WANT to participate positively in society", so fuck th

    • and that such content "does not appear to proliferate" on TikTok.

      Yes, but TikTok is run by godless heathen Commies while Instagram is run by honest god-fearing law-abiding Muricans, so it's all right then.

    • by Ksevio ( 865461 )

      As it should be. We don't want platforms to have to police their content so carefully in case some reference to something illegal slips by. That would be devastating for internet freedoms.

  • Right after the Twitter and Google cases, this comes out. One has to think there is an agenda to be pushed.
    • Certain parties were possibly sitting on this info for a while, but let's be honest: Meta could have stepped in to fix the problem before they were exposed publicly.

    • but I'd like to know what the studies are and if they're recent. There's been a slew of news stories about scary things... from 8-10 years ago.

      If you see a story about something scary, the first thing you should do is find out what the source is, and the second thing you should do is find out when the source first reported it. If that was a while ago, then you're probably being rage-baited.
  • by Pierre Pants ( 6554598 ) on Wednesday June 07, 2023 @09:41PM (#63584842)
    says the company that sells personal data of children and teenagers and fails to comply with an FTC privacy order. https://www.ftc.gov/news-event... [ftc.gov]
  • Love Is Love
  • and harass glass pipe makers

  • Algos promoting what people seem to keep looking for. Doesn't reveal much good about people, does it?
  • If a recommendation algo recommends prohibited content, one has to wonder what the service is generally used for.

  • If someone uses a hashtag like #preteensex or other pedo-related hashtags, couldn't Instagram forward this information to the FBI?
  • by Anonymous Coward

    Is Instagram going to ban Christianity?

  • It would seem to me, pardon my naivete, that this would be someplace the law enforcement community could then look to apprehend these people who promote this sort of thing. I mean, promoting breaking a law, isn't that illegal? And if you are in possession of or distributing child porn, isn't that something that is actionable and will get you locked up? If Instagram makes algorithms that promote child porn networks, shouldn't law enforcement be able to take data off those networks and use it to prosecute the
    • by Anonymous Coward

      TFS kinda smells like teen yoga or something; I don't see the word 'porn' or 'illegal'. Unsavory, and arguably something the site should Do Something About, but not yet a legal matter.

      Then again it's hard to tell through the mediaspeak.
