How Twitter's Child Porn Problem Ruined Its Plans For an OnlyFans Competitor (theverge.com)

An anonymous reader quotes a report from The Verge: In the spring of 2022, Twitter considered making a radical change to the platform. After years of quietly allowing adult content on the service, the company would monetize it. The proposal: give adult content creators the ability to begin selling OnlyFans-style paid subscriptions, with Twitter keeping a share of the revenue. Had the project been approved, Twitter would have risked a massive backlash from advertisers, who generate the vast majority of the company's revenues. But the service could have generated more than enough to compensate for losses. OnlyFans, the most popular by far of the adult creator sites, is projecting $2.5 billion in revenue this year -- about half of Twitter's 2021 revenue -- and is already a profitable company.

Some executives thought Twitter could easily begin capturing a share of that money since the service is already the primary marketing channel for most OnlyFans creators. And so resources were pushed to a new project called ACM: Adult Content Monetization. Before the final go-ahead to launch, though, Twitter convened 84 employees to form what it called a "Red Team." The goal was "to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly," according to documents obtained by The Verge and interviews with current and former Twitter employees. What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company was not -- and still is not -- effectively policing harmful sexual content on the platform.

"Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale," the Red Team concluded in April 2022. The company also lacked tools to verify that creators and consumers of adult content were of legal age, the team found. As a result, in May -- weeks after Elon Musk agreed to purchase the company for $44 billion -- the company delayed the project indefinitely. If Twitter couldn't consistently remove child sexual exploitative content on the platform today, how would it even begin to monetize porn? Launching ACM would worsen the problem, the team found. Allowing creators to begin putting their content behind a paywall would mean that even more illegal material would make its way to Twitter -- and more of it would slip out of view. Twitter had few effective tools available to find it. Taking the Red Team report seriously, leadership decided it would not launch Adult Content Monetization until Twitter put more health and safety measures in place.
"Twitter still has a problem with content that sexually exploits children," reports The Verge, citing interviews with current and former staffers, as well as 58 pages of internal documents. "Executives are apparently well-informed about the issue, and the company is doing little to fix it."

"While the amount of [child sexual exploitation (CSE)] online has grown exponentially, Twitter's investment in technologies to detect and manage the growth has not," begins a February 2021 report from the company's Health team. "Teams are managing the workload using legacy tools with known broken windows. In short, [content moderators] are keeping the ship afloat with limited-to-no-support from Health."

Part of the problem is scale while the other part is mismanagement, says the report. "Meanwhile, the system that Twitter heavily relied on to discover CSE had begun to break..."
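For context on how this kind of detection usually works: the industry-standard approach to finding *known* CSE material is perceptual-hash matching (Microsoft's PhotoDNA is the best-known example), where each upload is reduced to a compact hash that survives resizing and re-encoding, then compared against hash lists of already-identified images. Below is a minimal sketch of the idea only, using the open-source pHash algorithm (via Python's imagehash library) as a stand-in for the proprietary hashes; known_bad_hashes, MAX_DISTANCE, and matches_known_material are hypothetical names standing in for the real clearinghouse databases and thresholds.

```python
# Sketch of perceptual-hash matching, the PhotoDNA-style technique used to
# detect *known* abuse imagery. Assumptions: pHash stands in for proprietary
# robust hashes; known_bad_hashes is a hypothetical list standing in for the
# hash databases that clearinghouses distribute.
import imagehash
from PIL import Image

# Hypothetical 64-bit perceptual hashes of previously identified images.
known_bad_hashes = [imagehash.hex_to_hash("d1d1b1a1c1e1f101")]

MAX_DISTANCE = 6  # Hamming-distance threshold; lower = fewer false positives


def matches_known_material(path: str) -> bool:
    """True if the image is a near-duplicate of something already catalogued."""
    candidate = imagehash.phash(Image.open(path))
    # imagehash overloads subtraction to return the Hamming distance.
    return any(candidate - known <= MAX_DISTANCE for known in known_bad_hashes)
```

The structural limitation is that hash matching only catches near-duplicates of material someone has already found and catalogued; newly produced content matches nothing. That is one reason a paid-subscription product, which would attract a flood of fresh uploads behind paywalls, would strain a detection pipeline that was reportedly already struggling.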

Comments Filter:
  • AI is not ready to successfully identify CSE... hell, given the massive issues with YooToob filters over-aggressively flagging copyright "infringement", AI isn't even very good at getting THAT right

    Unless and until AI gets much better at it, the only way to police exploitation is to have humans reviewing, and... that doesn't scale, and humans burn out very quickly

    You see this on FacePlace all the time - automated filters attempting to stop hate speech/abuse are aggressively policing completely innocuous speech

    • but they filter out Republican Senators too. Twitter accidentally admitted it years ago. [businessinsider.com]

      It's not that the speech is innocuous, it's just the volume on the dog whistle was low enough it got past your ears. I'm dialed into politics to 11 and I miss 'em all the time unless someone dialed into 15 points 'em out to me.
  • Twitter did it right by looking to see what security holes existed in their business plan and if they could be fixed. There were and they couldn't so Twitter did the right thing and shelved it.

    Responsible business behavior on the Internet. Who knew that was a thing?
    • by gweihir ( 88907 )

      Well, yes. Somewhat. Because the fact that Twitter cannot fix these basically says they are really, really incompetent.

      • Or that the problem is really, really hard. I'm the last person to support Twitter. I think it is the dumpster fire of the Internet, but the problem is really hard, and people will remain incredibly creative in finding ways to work around any limitations, restrictions or protections put in by the system. On the flip side, I don't think Twitter is really all that interested in stopping the CSE, if you get down to it. Too much money in it if they clutch their pearls and wail that there's nothing they can

        • I agree on this. I'm not a twitter fan, nor do I give a rat's behind about whether or not they have an onlyfans clone, but... automatically detecting and censoring (or punishing, whatever) age-related crime in uploaded videos is a pretty damned hard problem to solve. Especially since it would have to always know the difference between someone posting a picture of their kid playing in the sprinkler in swim trunks vs. them involved in something more risqué. For a human, that's usually easy, we can usually agr
  • by stabiesoft ( 733417 ) on Thursday September 01, 2022 @08:51AM (#62843133) Homepage
    about the plan. He is buds with Jack. jack and him could have been smokin some 420 and Jack says hey, we are going to monetize p0rn. Elon goes, whoa, sounds profitable, maybe I'll buy twitter and cash in. Jack nods. After the red team reports back "no go", jack calls elon and says nope on the p0rn. Elon goes, changed my mind about buying then. Need to cancel asap.
  • by joe_frisch ( 1366229 ) on Thursday September 01, 2022 @09:21AM (#62843219)
    Pedophilia is not claimed to be all that popular, and the risks/penalties for buying or selling are extremely high. Why does it seem such a common issue on the internet? Is this really child porn, or are they conflating content involving children with content of unverified ages, or slightly underage porn?
    • by Anonymous Coward on Thursday September 01, 2022 @09:50AM (#62843301)

      I had a friend from long ago wind up in those circles. One of those "he was always the nicest guy, married, kind, fun. WTF Happened?!?!?!" kind of things.

      We looked up the newspaper reports and they showed some details of what you are asking about.
      Apparently he became a member of a set of sites/forums that focused on CSE, and those into it gather vast libraries of the stuff. There's a lot of trading, and exchanges work like a currency between them: "Want to know about more boards? Want to join our group? Send me 1,000 files I don't already have" (both to prove you are "one of us" as well as to expand their collection). So I'm thinking it's a low number of individuals, but they are seeking a lot of material, with a constant need for new stuff...

      In the end he got caught in a chat room with an undercover officer posing as a 13 year old, trying to make arrangements to visit her. He'll be in jail for a pretty long time as it is, and that's just on state charges; federal is still pending.

      It's a weird thing where you think about an old friend you spent a lot of time around for a decade and if you saw them passing by on the street today, half of you wants to say "Hey, how's it going, good to see ya!" and the other half wants to kill him on sight.

    • by Knightman ( 142928 ) on Thursday September 01, 2022 @10:04AM (#62843333)

      It's an interesting question. It would be great if there were a breakdown of what counts as child porn, especially if we consider that when a 15yo sends a naked pic to their gf/bf, it counts as child porn in many places.

      Lumping everything together under the heading of child porn actually undermines efforts to combat exploitation of children, for the simple reason that there are cases like the example I gave above that aren't child exploitation, just teenagers doing what teenagers have always done. It also allows some politicians and Karens to wield the "for the children" cry as a bat in an effort to wreck things they don't like, because there isn't a perfect solution available for whatever is upsetting them.

      There isn't one public internet service (social media or whatever) that doesn't combat CP and sexual exploitation when detected, but when they don't even have a good solution for detecting spam, plagiarism, bullying and whatnot (including handling false positives), expecting them to have a perfect solution for CP and sexual exploitation is just plain stupid. And considering that those who peddle CP and sexual exploitation tend to be quite savvy at hiding their activities, using Twitter as a paid service to do it seems kind of self-defeating in the end - you have a service saving all the evidence of what you are doing where law enforcement can get to it, plus all the financial records of the transactions from those consuming said content.

      • All porn is upsetting to these people. The child porn angle is just a means to an end. Take out the platform for the adult content while going after the child content.

        The whole thing seems very backwards when you consider most of the biggest sex scandals that deal with children are by far those from a religious background. The biggest non-religious child exploitation was Epstein in recent memory, and depending on what country some of this stuff took place in, no law was likely broken anyway. I mean, she wa

          the biggest sex scandals that deal with children are by far those from a religious background

          I don't suppose you've ever heard of the U.S. education system?

          https://www.edweek.org/leaders... [edweek.org]

          The biggest non-religious child exploitation was Epstein in recent memory

          I guess you missed all those significant child-trafficking busts that were being made a few years ago. You know, the ones that we stopped hearing about when Biden came into office.

          https://www.washingtonpost.com... [washingtonpost.com]

          https://www.usatoday.com/story.. [usatoday.com]

      • If the interest is in finding an excuse for increasing surveillance, then lumping all of those together and calling them "child porn" is an efficient tactic.
    • I had this thought as well. There can't be more than a tiny number of pedos in the world. A ridiculously small number like 1 in a million works out to only about 8,000 pedos worldwide. Then I remembered the amazing amount of porn you and I consume in a day, and the problem becomes clear.
      • Wikipedia cites that less than 5% of adult men are pedophiles. I'm somewhat skeptical, but let's say 1% of adult males.

        125 million adult males in the United States? 1% of 125m = 1,250,000

        That seems crazy...

    • by AmiMoJo ( 196126 ) on Thursday September 01, 2022 @10:15AM (#62843367) Homepage Journal

      Paedophilia, an attraction to pre-pubescent children, is fairly rare. But once they are pubescent and capable of reproduction, humans are wired to desire them. Before modern medicine, it increased the chances of genes surviving by passing them on as early as possible.

      Now we are civilized, we recognize that children engaging in the somewhat risky business of sex is not a good idea. Most countries require them to be in their mid teens at least before engaging in sex, and 18 before engaging in sex work like taking suggestive photos or having sex on camera.

      So yes, it's completely normal to find humans under the age of 18 sexually attractive. It's just that we expect people to abide by the laws protecting them anyway, like we expect people to control all sorts of other urges they may have.

      • Surprised and gratified to find a rational point of view from you, so thanks. On this topic, the real reason for a lot of CP laws isn't helping children, and I'm not sure CP laws even do that. These laws push evidence underground, make it harder to find and prosecute abusers, ruin the lives of people who, for example, publish perfectly innocent pictures of their children on the beach, encourage government overreach and the ignoring of rights, and in general cause more harm than they prevent.
        • by narcc ( 412956 )

          Sure... it's the laws against child porn that are the problem. Why, if it was just legal to distribute and possess, they'd catch all the producers right away, yeah?

          Get real. This is practically an admission that you want easy access to kiddie porn.

          • "Distribute and possess" is actually a good example. If you're driven by child protection, there's a very large difference between them, distribution is analogous to speech, whereas possession is analogous to thought, it doesn't affect the outside world in any way. If you're driven by disgust, then there's little or no difference.
            • by narcc ( 412956 )

              Neither possession nor distribution is in any way harmless. The production of that kind of content is necessarily harmful. It is only produced because there is demand. Distribution creates availability, and availability creates demand. Distribution, then, is necessarily harmful. Possession is at the end of that chain, so it is also necessarily harmful.

              This is always about child protection. This has absolutely nothing to do with free speech. It's more than a little disgusting that you'd even try to equ

              • The production of that kind of content is necessarily harmful.

                Not true; I have definitely read articles stating that cartoon versions are also illegal in some places. And with CGI it may become very realistic. I don't see even that type becoming acceptable.

              • So when I pirate a game I'm actually producing more demand for it? That's something to tell the game studios.
              • >By normalizing that sort of content, you normalize child sexual abuse.

                Don't be absurd. By your logic we need to ban small-breasted porn stars to save the children. And while we're at it we need to ban even disagreeing with CP laws. The more rage and disgust the better right? This purity spiral has gone too far already.
                • by narcc ( 412956 )

                  This purity spiral has gone too far already.

                  We're talking about child sexual abuse. Normal people have a serious problem with that. There's something seriously wrong with you pedo freaks.

                  • >ad hom

                    meh

                    >Normal people have a serious problem with (child sexual abuse)

                    I have a real problem with it too, but I have a different perspective. Instead of hating pedos, I love children and want them to be safe. And the best way to ensure their safety is not a witch hunt. I know it feels good to have an out-group, but the policies you support are ineffective for your claimed goal. So which is it? Would you rather protect a child or destroy a pedo? If you want to protect children the best thing to do
                    • by narcc ( 412956 )

                      Fuck off, pedo scum. We all read your post. Try to walk it back all you want, but your pro-kiddie-porn manifesto is just a few posts up thread for all to see.

                    • > We all read your post.

                      I stand by every word.

                      >bile

                      Hatred is a sword with a poisoned haft.
          • by dfghjk ( 711126 )

            "This is practically and admission that you want easy access to kiddie porn."

            Or, equally likely, that you do. The most righteous are frequently the biggest offenders.

        • by AmiMoJo ( 196126 )

          There is an interesting debate here. Some years ago an artist in the UK had an exhibition of photographs they took, one of which was of their own children at the beach with nothing on. The police decided to take no action, and it's hardly the first time nude children have been used in art, from album covers to classical paintings.

          I think there is a good argument that people shouldn't publish photos of their children that their children may object to, either at the time or later in life. While guardians do h

      • Paedophilia, an attraction to pre-pubescent children, is fairly rare. But once they are pubescent and capable of reproduction, humans are wired to desire them. Before modern medicine, it increased the chances of genes surviving by passing them on as early as possible.

        Now we are civilized, we recognize that children engaging in the somewhat risky business of sex is not a good idea. Most countries require them to be in their mid teens at least before engaging in sex, and 18 before engaging in sex work like taking suggestive photos or having sex on camera.

        So yes, it's completely normal to find humans under the age of 18 sexually attractive. It's just that we expect people to abide by the laws protecting them anyway, like we expect people to control all sorts of other urges they may have.

        I thought intelligent posts had long since been banned on slashdot...

        • That type of post would have 100 downvotes on Reddit. On Reddit, if you're dating someone who's 1 year younger than you, they will scream about "age gap" and that you must be a pedo.
    • It might not be all that popular, but its fanbase is steady and unwavering.

      • by HiThere ( 15173 )

        How do you know?

        • The same way you do. It's not that complicated, nor is it much of a stretch to reach that conclusion. You can't walk away from your sexuality, even if it's "deviant".

          • by HiThere ( 15173 )

            IIUC, that was not Kinsey's conclusion. His conclusion (as I understood it) was that most traits are on a range, and many people could alter their expression for convenience. It's true that in at least some cases he found extreme groups that were not flexible, but they were a small percentage.

      • by dfghjk ( 711126 )

        but not more so than any other form of porn - or do you have evidence that says otherwise?

        You want steady and unwavering? Look at men who talk about tits and ass. Half our Super Bowl commercials are inspired by it.

        • I claimed nothing outside of what I explicitly stated. The question was, "Why is there so much child porn?"

          If you want to muddy the waters, feel free to do so. But that's your discussion. I'm not involved.

        • I would think it is - no real evidence, of course, just conjecture. I think it is more unwavering because it is so socially unacceptable that the only people who do it are the ones who can't help themselves. I think it's like being gay was in the past: it was totally unacceptable; now people may simply give it a go, and nobody cares.

    • Is there? I don't know; never looked, never plan to look. But even if only 0.01% (1 in 10 thousand) of the approximately 4.7 billion people on the internet are doing it, that's still 470,000 people doing it.
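For what it's worth, here is a quick back-of-the-envelope check of the figures being tossed around in this thread; the rates are the commenters' guesses, not data:

```python
# Back-of-the-envelope prevalence estimates using numbers cited in this thread.
internet_users = 4.7e9   # approximate internet population cited above
us_adult_males = 125e6   # US adult male population cited above

print(f"0.01% of internet users: {0.0001 * internet_users:,.0f}")  # 470,000
print(f"1% of US adult males:    {0.01 * us_adult_males:,.0f}")    # 1,250,000
```

Even the most conservative rate in the thread puts the number in the hundreds of thousands, not the thousands.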

  • Yeah, the wife won't be upset when she sees this stuff on the c.c. statement.
  • by Shaitan ( 22585 ) on Thursday September 01, 2022 @10:57AM (#62843477)

    "While the amount of [child sexual exploitation (CSE)] online has grown exponentially"

    Citation needed - and a citation that can't be explained by improved detection methods finding activity that was already out there, isn't merely a projected or otherwise hypothesized increase in material, and is corrected for population growth.

    There doesn't seem to be any sign of this on user-driven content sites. There might be some teenagers slipping through among the younger and barely-legal content (mostly posting themselves), but I've seen very little indication of actual children being posted, or of it being tolerated by other users. Is this just more of the pretending-PH-was-a-child-porn-haven thing again?

    • I assume they are talking about anyone under 18 posting risqué photos of themselves online. Which I am sure has been skyrocketing.

    • by dfghjk ( 711126 )

      All you need for that statement to be true is a proper exponent. Easily achieved, given "exponential growth" of the internet itself. It's a provocative statement that says nothing and is certainly claimed without evidence.

      Yes, teens have been contributing by "victimizing" themselves for a while now and distributing the content to others of similar age. This might meet the legal standard for CP in some jurisdictions, but it is hardly "child exploitation". For that there needs to be exploitation.

  • by gosso920 ( 6330142 ) on Thursday September 01, 2022 @11:57AM (#62843691)
    Foiled again!" - Jack Dorsey
  • Isn't a core property of pedophilia that you try to stay away from attention and controversy?

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...