How Twitter's Child Porn Problem Ruined Its Plans For an OnlyFans Competitor (theverge.com) 100
An anonymous reader quotes a report from The Verge: In the spring of 2022, Twitter considered making a radical change to the platform. After years of quietly allowing adult content on the service, the company would monetize it. The proposal: give adult content creators the ability to begin selling OnlyFans-style paid subscriptions, with Twitter keeping a share of the revenue. Had the project been approved, Twitter would have risked a massive backlash from advertisers, who generate the vast majority of the company's revenues. But the service could have generated more than enough to compensate for losses. OnlyFans, the most popular by far of the adult creator sites, is projecting $2.5 billion in revenue this year -- about half of Twitter's 2021 revenue -- and is already a profitable company.
Some executives thought Twitter could easily begin capturing a share of that money since the service is already the primary marketing channel for most OnlyFans creators. And so resources were pushed to a new project called ACM: Adult Content Monetization. Before the final go-ahead to launch, though, Twitter convened 84 employees to form what it called a "Red Team." The goal was "to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly," according to documents obtained by The Verge and interviews with current and former Twitter employees. What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company was not -- and still is not -- effectively policing harmful sexual content on the platform.
"Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale," the Red Team concluded in April 2022. The company also lacked tools to verify that creators and consumers of adult content were of legal age, the team found. As a result, in May -- weeks after Elon Musk agreed to purchase the company for $44 billion -- the company delayed the project indefinitely. If Twitter couldn't consistently remove child sexual exploitative content on the platform today, how would it even begin to monetize porn? Launching ACM would worsen the problem, the team found. Allowing creators to begin putting their content behind a paywall would mean that even more illegal material would make its way to Twitter -- and more of it would slip out of view. Twitter had few effective tools available to find it. Taking the Red Team report seriously, leadership decided it would not launch Adult Content Monetization until Twitter put more health and safety measures in place. "Twitter still has a problem with content that sexually exploits children," reports The Verge, citing interviews with current and former staffers, as well as 58 pages of internal documents. "Executives are apparently well-informed about the issue, and the company is doing little to fix it."
"While the amount of [child sexual exploitation (CSE)] online has grown exponentially, Twitter's investment in technologies to detect and manage the growth has not," begins a February 2021 report from the company's Health team. "Teams are managing the workload using legacy tools with known broken windows. In short, [content moderators] are keeping the ship afloat with limited-to-no-support from Health."
Part of the problem is scale; the other part is mismanagement, says the report. "Meanwhile, the system that Twitter heavily relied on to discover CSE had begun to break..."
Re: (Score:2)
Nah, Qanon is more a bunch of useful, gullible fools for me and my colleagues. And we can even tell them by now that they're useful, gullible fools and they'll just double down on it and believe even more bullshit we feed them.
I love my job.
Re: (Score:1)
qanon who famously missed the boat on epstein entirely lol what a joke
qanon and you don't actually care about kids, trafficked or not. the worst type of craven political goons.
Re: (Score:2)
Hey! I may be a lowlife that destroys society, but I still have standards!
Think about the criminals! And the trolls! (Score:2)
You're feeding a troll and propagating a stupid Subject.
Re: (Score:2)
Hey, everyone deserves to pursue a hobby.
Re: (Score:2)
Now I think I should have tried to make a joke along the lines of "What do you get when you cross a with a troll?" landing on the "Think of the children" punchline.
Re: (Score:2)
I don't quite get that one. I only know what you get when you cross the Queen of England and Prince Charles.
You get killed in a tunnel.
Re:Think [of] the criminals! And the trolls! (Score:2)
Too soon?
(Plus I blame the paparazzi and the general malaise of journalism.)
But I'm thinking the new punchline is something like "Think of the (troll) children" or "Eat the children"? (In a tunnel or cave?)
But as I've written before, I wouldn't recognize I'd written a good joke if it bit me in the... I wish I could write humor, but the evidence is pretty clear.
Re: (Score:2)
QANON is quite real. The things that QANON traffics in are conspiracy theories, and Epstein [yahoo.com] was basically absent from them.
Things that make you go "Hmmm...."
Re:Common problem (Score:5, Funny)
So, your theory is Ghislaine Maxwell was Q and the entire QAnon thing was one big smokescreen to direct attention away from Jeffrey Epstein and his Illuminati buddies, Donald Trump, Prince Andrew, Bill Gates, and Bill Clinton? And that once she was captured, Plan B was put into action and the virus was released from China as Q's number 2 -- Dr. Anthony Fauci -- stepped up in a bold coup attempt?
Sir, your ideas are intriguing to me and I would like to subscribe to your newsletter.
Qanon doesn't cover Epstein because Trump (Score:4, Informative)
The Q guy is the guy that runs 8chan. There's a lengthy documentary covering why, but the short answer is that Q does a bunch of stuff over on 8chan that wouldn't be possible unless he was the owner (e.g. various admin-level stuff). The 8chan guy is a huge simp for Trump, so he's not going to let his cult cover anything bad for the guy. It's that simple.
Re:Qanon doesn't cover Epstein because Trump (Score:4, Interesting)
Pretty much. Epstein is only useful as a secondary character in an attempt to tie Democrats, liberal elites, and the like into a shadowy global child exploitation ring that can't quite ever be caught.
The moment that Epstein and Maxwell begin to connect to Trump [newsweek.com], the QANON God-King, suddenly Q cries "NO, LOOK OVER THERE!" while QANON ties itself in knots from cognitive dissonance.
Re: (Score:2)
No, Ghislaine Maxwell was obviously a Reddit supermod. (I honestly don't know if I am being ironic or not.)
https://www.reddit.com/r/Drama/comments/hnbohd/ghislaine_maxwell_was_a_reddit_poweruser_and/ [reddit.com]
Re: (Score:2)
> So, your theory is Ghislaine Maxwell was Q
I thought Q was either Desmond Llewelyn ( https://en.wikipedia.org/wiki/... [wikipedia.org] ) or John de Lancie ( https://en.wikipedia.org/wiki/... [wikipedia.org] ).
Re:So how does onlyfans handle those issues. (Score:5, Informative)
OnlyFans requires the models to prove their age by sending in copies of their ID. It's not perfect, but they do at least have a system that is fairly accurate.
Twitter has no such system. It struggles just with verified account badges, which are a much lower bar. The real core of the issue is that Twitter is a much more open platform that lets anyone make an account without verification, whereas OnlyFans started as a platform where users monetize their accounts and so had to handle things like age verification and payments from day one.
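To make the gap concrete: whatever the document-checking front end looks like, an ID-based system ultimately enforces a birthday-based age gate. Here's a minimal sketch of that final check (the function name and the 18-year threshold are assumptions for illustration; real systems also have to validate the ID document itself and match it to the account holder, which is the hard part):

```python
from datetime import date

ADULT_AGE = 18  # assumed legal threshold; varies by jurisdiction

def is_of_age(date_of_birth, on_date=None):
    """Return True if the person is at least ADULT_AGE on `on_date`.

    Only models the final age comparison; document validation and
    identity matching are out of scope for this sketch.
    """
    on_date = on_date or date.today()
    # Birthday-based age, the way ID checks compute it: you are N years
    # old only once your birthday in the current year has passed.
    had_birthday = (on_date.month, on_date.day) >= (date_of_birth.month, date_of_birth.day)
    age = on_date.year - date_of_birth.year - (0 if had_birthday else 1)
    return age >= ADULT_AGE
```

Note the check pins an explicit `on_date` rather than sampling the clock inside the comparison, so a verification decision can be recorded and audited against the date it was made.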
Re: (Score:2)
Well, this really should not be a problem. Just require more verification from anybody who wants to offer adult content. Also, if such material is found, there is a clear money trail to follow. While most police forces are incompetent online, follow-the-money is pretty easy because of all the hurdles to tax evasion and money laundering already in place, and because the police have a century or so of experience with it.
Re: (Score:3)
Sure, they could set it up and make sure it works, but they haven't done that yet. It's not trivial either; fake ID is a thing. There are companies that offer verification as a service.
Re: (Score:2)
Even if they get the ID thing sorted, they still have to deal with the backlash from advertisers and current users abandoning the platform in protest.
Combining a shady business like porn with a "normal" business is a bad idea. OnlyFans is only porn so avoids problems with advertisers and moralist users. When OnlyFans is occasionally accused of hosting CP, they benefit from the resulting publicity.
Re: (Score:2)
Twitter already has plenty of porn. They require users to mark such posts as adult, which makes viewers have to click through a warning before they are shown. Lots of sex workers use Twitter to communicate with clients too.
As they discovered, even at the current level of tolerating it but not actively promoting it as a feature, they are not able to do adequate verification or response when alerted to non-consensual images.
Re: (Score:2)
Since when is porn "shady"? Are you stuck in the last century?
Re: (Score:2)
"Shady" is probably a legitimate characterization. People tend to hide their porm habits. It's not the same as illegal, but it's the opposite of displayed. To me that seems a legitimate use of the word.
Re: (Score:2)
The vendors also hide, for various understandable and some illegal reasons. Republishing copyrighted material illegally is one of them. So is failing to pay their bills: during the dotcom era, many online services were very excited at the prospect of making money from the very high overall traffic of pornography. Many of those online services suffered as they discovered how few porn vendors paid their monthly bills, were terminated, and re-opened from another post office box with another name.
I've done some
Re: (Score:3, Insightful)
Since when is porn "shady"? Are you stuck in the last century?
Do you watch porn with your mom? Or with your kids?
Re: (Score:3)
FFS!
I don't play rugby with my mother or children either. That doesn't mean rugby is a dodgy sport!
Re: (Score:2)
FFS!
I don't play rugby with my mother or children either. That doesn't mean rugby is a dodgy sport!
On the contrary. I'm not super knowledgeable about rugby, but I'm pretty sure if you don't dodge, you're going to lose.
Re: (Score:2)
FFS!
I don't play rugby with my mother or children either. That doesn't mean rugby is a dodgy sport!
On the contrary. I'm not super knowledgeable about rugby, but I'm pretty sure if you don't dodge, you're going to lose.
But in all seriousness, it's not just about whether you'd play it with them. The question is whether you would admit to them that you watch/play rugby or not.
Re: (Score:3)
Is sex shady then? Work? Sleep? Anything you don't do with mom or the kids is shady?
Re: (Score:2)
Well, this really should not be a problem. Just require more verification if anybody wants to offer adult content. Also, if such material is found, there is a clear money-trail to follow. While most police-forces are incompetent online, follow-the-money is pretty easy to do because of all the hurdles to tax-evasion and money-laundering already in place and because the police has a century or so of experience with it.
There are a few who would post for purely exhibitionistic satisfaction, but I don't see why they would turn down the money, especially if there were a no-porn-without-validated-bank-info rule anyway. It shouldn't be that hard to teach a real AI to look for porn outside of porn accounts.
Re: (Score:2)
OnlyFans requires the models to prove their age by sending them copies of ID. It's not perfect but they do at least have a system that is fairly accurate.
That only works for the immediate transaction, and it doesn't protect the subjects of the content if they go through a middleman: someone may be perfectly comfortable agreeing now, later change their mind and no longer wish to continue publishing the content, while the middleman doesn't agree. Then it becomes a situation where the subject has to figure out how to contact the service provider, prove that they are the subject, and convince the service provider that the consent on file links to some other method of contact...
AI is not ready and humans burn out (Score:2)
AI is not ready to successfully identify CSE... hell, given the massive issues with YooToob filters over-aggressively flagging copyright "infringement", AI isn't even very good at getting THAT right
Unless and until AI gets much better at it the only way to police exploitation is to have humans reviewing and .. that doesn't scale and humans burn out very quickly
You see this on FacePlace all the time - automated filters attempting to stop hate speech/abuse are aggressively policing completely innocuous speech.
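For *known* material, the industry workaround for exactly this gap is hash matching rather than AI classification: uploads are compared against databases of perceptual hashes of previously identified images (PhotoDNA being the well-known example). Here's a toy difference-hash ("dHash") sketch of the idea -- purely illustrative, with made-up function names, and nothing like a production-grade matching pipeline:

```python
def dhash(pixels):
    """Difference hash of a grayscale grid given as 8 rows of 9
    brightness values: each bit records whether a pixel is darker than
    its right-hand neighbour, yielding a 64-bit fingerprint."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(candidate, known_hashes, max_distance=10):
    """True if the candidate hash is within max_distance bits of any
    hash in a (hypothetical) database of known-material hashes."""
    return any(hamming(candidate, k) <= max_distance for k in known_hashes)
```

The point of perceptual hashing is that small edits (recompression, resizing) flip only a few bits, so near-duplicates of known material get flagged without any AI judgment call. The burnout problem above applies mainly to never-before-seen content, which hash matching cannot catch.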
Hilariously the filters for hate speech work (Score:1)
It's not that the speech is innocuous, it's just the volume on the dog whistle was low enough it got past your ears. I'm dialed into politics to 11 and I miss 'em all the time unless someone dialed into 15 points 'em out to me.
Re: (Score:2, Flamebait)
Gotta protect those Christian values!
Re: (Score:2)
It's both. Some feminists are very anti-porn and don't feel that any woman who had a real, honest choice would ever want to do porn, and therefore no woman should ever be allowed to make porn because it perpetuates violence against women in general.
Then you got your religious (left and right, think Jews/Muslims and Christians) folks that see pornography as evil and want it gone. By going after child porn, you can take down the platforms that also host all the adult stuff which is t
Re: (Score:2)
LOL, whut? The Internet's financing was kick-started by the military-industrial complex. It was pushed along by academics and wasn't commercialized heavily until the early 90s, when AOL came in. While AOL almost certainly had some porn, and rogue porn was around well before "eternal September", the Internet did fine with porn being a rounding error in commerce.
Re: (Score:2)
I mean, you needed many extra drives to carry alt.binaries.
Good for Twitter (Score:2)
Responsible business behavior on the Internet. Who knew that was a thing?
Re: (Score:3)
Well, yes. Somewhat. Because the fact that Twitter cannot fix these things basically says they are really, really incompetent.
Re: (Score:2)
Or that the problem is really, really hard. I'm the last person to support Twitter. I think it is the dumpster fire of the Internet, but the problem is really hard, and people will remain incredibly creative in finding ways to work around any limitations, restrictions or protections put in by the system. On the flip side, I don't think Twitter is really all that interested in stopping the CSE, if you get down to it. Too much money in it if they clutch their pearls and wail that there's nothing they can
Re: (Score:2)
Could musk have known (Score:4, Interesting)
Re: (Score:2)
That's not the craziest idea, especially because it never came up publicly.
Re: (Score:2)
Well, I'm rather anti-christian myself, but I'm also against the powerful exploiting the weak. (Actually, that's one of the reasons I'm against christianity. If christians lived up to their proclaimed ideals, I'd be in favor of it, but they have a *long* history of claiming to do so while actually doing the opposite. Long enough and varied enough to cause me to believe that it's systematic.)
Why is there so much child porn? (Score:4, Insightful)
Re:Why is there so much child porn? (Score:4, Interesting)
I had a friend from long ago wind up in those circles. One of those "he was always the nicest guy, married, kind, fun. WTF Happened?!?!?!" kind of things.
We looked up the newspaper reports and they showed some details of what you are asking about.
Apparently he became a member of a set of sites/forums that focused on CSE, whose members gather vast libraries of the stuff. There's a lot of trading, with exchanges working like a currency between them: "Want to know about more boards? Want to join our group? Send me 1,000 files I don't already have" (both to prove you are "one of us" and to expand their collection). So I'm thinking it's a low number of individuals, but they seek a lot of material, with a constant need for new stuff.
In the end he got caught in a chat room with an undercover officer posing as a 13 year old, trying to make arrangements to visit her. He'll be in jail for a pretty long time as it is and that's just on state charges, federal is still pending.
It's a weird thing where you think about an old friend you spent a lot of time around for a decade and if you saw them passing by on the street today, half of you wants to say "Hey, how's it going, good to see ya!" and the other half wants to kill him on sight.
Re:Why is there so much child porn? (Score:4, Insightful)
It's an interesting question. It would be great if there were a breakdown of what counts as child porn, especially if we consider that when a 15-year-old sends a naked pic to their gf/bf, it counts as child porn in many places.
Lumping everything together under the heading child porn actually undermines efforts to combat exploitation of children for the simple reason that there are cases like the example I gave above that isn't child exploitation, just teenagers doing what teenagers have always done. It also allows some politicians and Karens to wield the "for the children" cry as a bat in an effort to wreck things they don't like because there isn't a perfect solution available for what is upsetting them.
There isn't one public internet service (social media or whatever) that doesn't combat CP and sexual exploitation when detected. But when they don't even have a good solution for detecting spam, plagiarism, bullying and whatnot (including handling false positives), expecting them to have a perfect solution for CP and sexual exploitation is just plain stupid. And those who peddle CP and sexual exploitation tend to be quite savvy at hiding their activities; using Twitter as a paid service to do it seems self-defeating in the end -- you have a service saving all the evidence of what you are doing, which law enforcement can get to, plus all the financial records of the transactions from those consuming said content.
Re: (Score:2)
All porn is upsetting to these people. The child porn angle is just a means to an end. Take out the platform for the adult content while going after the child content.
The whole thing seems very backwards when you consider that most of the biggest sex scandals dealing with children are by far those from a religious background. The biggest non-religious child exploitation in recent memory was Epstein, and depending on what country some of this stuff happened in, no law was likely broken anyway. I mean, she wa
Re: (Score:1)
the biggest sex scandals that deal with children are by and far those from a religious background
I don't suppose you've ever heard of the U.S. education system?
https://www.edweek.org/leaders... [edweek.org]
The biggest non-religious child exploitation was Epstein in recent memory
I guess you missed all those significant child-trafficking busts that were being made a few years ago. You know, the ones that we stopped hearing about when Biden came into office.
https://www.washingtonpost.com... [washingtonpost.com]
https://www.usatoday.com/story.. [usatoday.com]
Re: (Score:2)
Here, folks, we have a prime example of the right's attempt to destroy public education by equating teachers with "groomers", and thus public schools with child exploitation/child porn.
Notice how they pretend to be left of center, with their positive reference to the #metoo movement, before diving straight down the rabbit hole of insane right-wing conspiracy nonsense.
Never forget that this has absolutely nothing to do with protecting children, and everything to do with destroying public education.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Wikipedia cites estimates that less than 5% of adult men are pedophiles. I'm somewhat skeptical, but let's say 1% of adult males.
125 million adult males in the United States? 1% of 125m = 1,250,000
That seems crazy...
Re:Why is there so much child porn? (Score:5, Informative)
Paedophilia, an attraction to pre-pubescent children, is fairly rare. But once they are pubescent and capable of reproduction, humans are wired to desire them. Before modern medicine, it increased the chances of genes surviving by passing them on as early as possible.
Now we are civilized, we recognize that children engaging in the somewhat risky business of sex is not a good idea. Most countries require them to be in their mid teens at least before engaging in sex, and 18 before engaging in sex work like taking suggestive photos or having sex on camera.
So yes, it's completely normal to find humans under the age of 18 sexually attractive. It's just that we expect people to abide by the laws protecting them anyway, like we expect people to control all sorts of other urges they may have.
The internet is really, really great (Score:1)
Re: (Score:2)
Sure... it's the laws against child porn that are the problem. Why, if it was just legal to distribute and possess they'd catch all the producers, right away, yeah?
Get real. This is practically an admission that you want easy access to kiddie porn.
Re: (Score:2)
Re: (Score:2)
Neither possession nor distribution are in any way harmless. The production of that kind of content is necessarily harmful. It is only produced because there is demand. Distribution creates availability and availability creates demand. Distribution, then, is necessarily harmful. Possession is at the end of that chain, so it is also necessarily harmful.
This is always about child protection. This has absolutely nothing to do with free speech. It's more than a little disgusting that you'd even try to equ
Re: (Score:2)
The production of that kind of content is necessarily harmful.
Not true; I have definitely read articles that state cartoon versions are also illegal in some places. And with CGI it may become very realistic. I don't see even that type becoming acceptable.
Re: (Score:2)
Yes, it is. I address that very thing in my post.
Re: (Score:1)
Re: (Score:1)
Don't be absurd. By your logic we need to ban small-breasted porn stars to save the children. And while we're at it we need to ban even disagreeing with CP laws. The more rage and disgust the better right? This purity spiral has gone too far already.
Re: (Score:2)
This purity spiral has gone too far already.
We're talking about child sexual abuse. Normal people have a serious problem with that. There's something seriously wrong with you pedo freaks.
Re: (Score:1)
meh
>Normal people have a serious problem with (child sexual abuse)
I have a real problem with it too, but I have a different perspective. Instead of hating pedos I love children, and want them to be safe. And the best way to assure their safety is not a witch hunt. I know it feels good to have an out group, but the policies you support are ineffective to your claimed goal. So which is it? Would you rather protect a child or destroy a pedo? If you want to protect children the best thing to do
Re: (Score:2)
Fuck off, pedo scum. We all read your post. Try to walk it back all you want, but your pro-kiddie-porn manifesto is just a few posts up thread for all to see.
Re: (Score:1)
I stand by every word.
>bile
Hatred is a sword with a poisoned haft.
Re: (Score:2)
"This is practically and admission that you want easy access to kiddie porn."
Or, equally likely, that you do. The most righteous are frequently the biggest offenders.
Re: (Score:2)
There is an interesting debate here. Some years ago an artist in the UK had an exhibition of photographs they took, one of which was of their own children at the beach with nothing on. The police decided to take no action, and it's hardly the first time nude children have been used in art, from album covers to classical paintings.
I think there is a good argument that people shouldn't publish photos of their children that their children may object to, either at the time or later in life. While guardians do h
Re: (Score:2)
Paedophilia, an attraction to pre-pubescent children, is fairly rare. But once they are pubescent and capable of reproduction, humans are wired to desire them. Before modern medicine, it increased the chances of genes surviving by passing them on as early as possible.
Now we are civilized, we recognize that children engaging in the somewhat risky business of sex is not a good idea. Most countries require them to be in their mid teens at least before engaging in sex, and 18 before engaging in sex work like taking suggestive photos or having sex on camera.
So yes, it's completely normal to find humans under the age of 18 sexually attractive. It's just that we expect people to abide by the laws protecting them anyway, like we expect people to control all sorts of other urges they may have.
i thought intelligent posts have long since been banned on slashdot...
Re: (Score:2)
Re: (Score:3)
It might not be all that popular, but its fanbase is steady and unwavering.
Re: (Score:2)
How do you know?
Re: (Score:2)
The same way you do. It's not that complicated, nor is it much of a stretch to reach that conclusion. You can't walk away from your sexuality, even if it's "deviant".
Re: (Score:2)
IIUC, that was not Kinsey's conclusion. His conclusion (as I understood it) was that most traits are on a range, and that many people could alter their expression for convenience. It's true that in at least some cases he found extreme groups that were not flexible, but they were a small percentage.
Re: (Score:2)
but not moreso than any other form of porn, or do you have evidence that says otherwise?
You want steady and unwavering, look at men who talk about tits and ass. Half our Super Bowl commercials are inspired by it.
Re: (Score:2)
I claimed nothing outside of what I explicitly stated. The question was, "Why is there so much child porn?"
If you want to muddy the waters, feel free to do so. But that's your discussion. I'm not involved.
Re: (Score:2)
I would think it is; no real evidence of course, just conjecture. I think it is more unwavering because it is so socially unacceptable that the only people who do it are the ones who can't help themselves. I think it's like being gay was in the past: it was totally unacceptable; now people may simply give it a go, and nobody cares.
Re: (Score:2)
Is there? I don't know; never looked, never plan to look. Even if only 0.01% (1 in 10 thousand) of the approximately 4.7 billion people on the internet were doing it, that's still 470,000 people.
OnlyFans is sports, right? (Score:2)
Re: (Score:3)
Is there proof of this? (Score:3)
"While the amount of [child sexual exploitation (CSE)] online has grown exponentially"
Citation needed and a citation that can't be explained by improved detection methods finding activity that was already out there or the projected or otherwise hypothesized increase of material and which is corrected for population growth.
There doesn't seem to be any sign of this on user-driven content sites. There might be some teenagers slipping through among the younger and barely-legal content (mostly posting themselves), but there is very little indication I've seen of actual children being posted, or of it being tolerated by other users. Is this just more of the pretending-PH-was-a-child-porn-haven thing again?
Re: (Score:2)
I assume they are talking about anyone under 18 posting risqué photos of themselves online. Which I am sure has probably been skyrocketing.
Re: (Score:2)
All you need for that statement to be true is a proper exponent. Easily achieved, given "exponential growth" of the internet itself. It's a provocative statement that says nothing and is certainly claimed without evidence.
Yes, teens have been contributing by "victimizing" themselves for a while now and distributing the content to others of similar age. This might meet the legal standard for CP in some jurisdictions, but it is hardly "child exploitation". For that there needs to be exploitation.
Please tell me it was going to be called "Twatter" (Score:2)
Re: (Score:2)
"Curses! (Score:3)
Child porn on Twitter? (Score:2)
Isn't a core property of pedophilia that you try to stay away from attention and controversy?