AI Platform Generated Images That 'Could Be Categorized as Child Pornography,' Leaked Documents Show (404media.co)
404 Media: OctoML, a Seattle-based startup that helps companies optimize and deploy their machine learning models, debated internally whether it was ethical and legally risky for it to generate images for Civitai, an AI model sharing and image generating platform backed by venture capital firm Andreessen Horowitz, after it discovered Civitai generated content that OctoML co-founder Thierry Moreau said "could be categorized as child pornography," according to internal OctoML Slack messages and documents viewed by 404 Media.
OctoML has raised $132 million in funding, and is an AWS partner, meaning it generated these images on Amazon servers. "What's absolutely staggering is that this is the #3 all time downloaded model on CivitAI, and is presented as a pretty SFW model," Moreau, who is also OctoML's VP, technology partnerships, said in a company Slack room called #ai_ethics on June 8, 2023. Moreau was referring to an AI model called "Deliberate" that can produce pornographic images. "A fairly innocent and short prompt '[girl: boy: 15], hyperdetailed' automatically generated unethical/shocking content -- read something could be categorized as child pornography," his Slack message added.
If a machine makes images.. (Score:3, Interesting)
.. but no children are harmed, what's the problem?
It's just pixels
Re: (Score:3, Insightful)
Most of us don't want to promote this behavior in our adult population, no matter how much technology can sanitize it.
Re:If a machine makes images.. (Score:5, Insightful)
That's the same logic as saying violent computer games make people violent.
Pedophiles exist. They always have and they always will. There's no way of making that not be the case. What we CAN do is prevent them, either by guidance or punishment, from harming children. If they can get their rocks off by looking at AI hallucinations, is that not infinitely better than doing so by looking at actual children?
Re:If a machine makes images.. (Score:4, Interesting)
I don't really put too much stock in logic, because it can be flawed. I'd rather see data on the subject. And I suspect that the causes behind pedophilia are much different than violence. And that the reasons that ordinary people engage in violent video games are vastly different than the reasons that pedophiles have to look at explicit images of children.
Re:If a machine makes images.. (Score:5, Interesting)
Okay, then how about this: there is zero scientific evidence that the consumption of pornography causes sexual crimes. If anything, the weight of the evidence is that having pornography and even prostitution available lowers the rate of sexual assault and rape. Note: as usual for social research, most of the studies suffer from various flaws.
Same deal with child porn. We put such a massive negative on it because of the harm caused to children in the creation of it, to the point that we punish even the "customers" because, at least theoretically, they're supporting the creation by reimbursing the producers in some way -- whether with money, shared porn (I've read that there are 'rings' where to get in you have to share your own unique images), or whatever.
There's every possibility that access to ethical "child porn" can help those with pedophilia control their urges. Another "ethical" source would be the copyright expired imagery that one of the nordic countries has in their national library. While children were harmed for that, said children lived their complete lives before the release of the imagery, and generally speaking their direct heirs have passed as well.
I'm libertarian leaning. I think that the bar to make something illegal that does not cause any direct harm, does not generate direct victims, should be rather high.
Driving while intoxicated is illegal because we can point out the thousands of victims per year. But drinking is mostly NOT illegal, because you don't actually have to drive while doing it.
Re: (Score:2)
Re: (Score:2)
How is that moral stance working with addiction in general, or for that matter with pedophilia?
This is how bad public policy happens. We can't have needle exchanges and opioid distribution locations because it would be encouraging the behavior, so we let the invisible hand handle it. Same deal with the pedophiles, really. Unfortunately the invisible hand is diddling people's children and causing permanent psychological harm.
Re: (Score:3, Insightful)
the main motivator of past legislation: how we as a society do or don't condone that kind of content, so from a moral perspective.
That is an appalling reason for the police to be empowered to arrest people, and completely incompatible with the concept of a free society.
Re: (Score:3)
"For that there is scientific data available..."
Scientific data about what? Here's the abstract of the "example" you provided:
"The dramatic increase in child pornography offenses over the past 10 years is directly related to the availability of such material on the Internet. Child pornography can be found on the Web, in newsgroups, and on peer-to-peer networks (the most common source at present). Offenders are a heterogeneous group, with different motivations and levels of risk. The possibility of crossove
Re:If a machine makes images.. (Score:4, Interesting)
Re: (Score:3)
First, note that while I spoke more generally initially, with "same deal with child porn", I was pointing out that it applies equally to CP.
For the study you cited: [nih.gov]
Possession of child pornography without a history of contact offenses does not appear to increase the risk of future contact reoffending.
Right in the abstract. Possession does not increase contact offenses.
That's supporting my position that there isn't evidence that consumption(possession) of CP increases the odds of them actually assaulting a child.
And legislation intended to enforce social morals generally sucks. That's how we got anti-gay laws, segregation, bad drug laws, and
Re: (Score:2, Redundant)
Re: (Score:3)
First of all, it's anecdotal, as you say. Second, as OrangeTide says, it's logic (okay, human illogic), the kind that is twisted to justify any opinion. By the same logic I could say we shouldn't speak about it at all, because perhaps it will give people ideas.
Thirdly, pedophilia is hardly an unusual kink. It's probably been around since the beginning of humans; for millennia people have been marrying 12-year-olds.
Look, I am not certain whether it increases child abuse or decreases it, but due to the lack of actual
Re: (Score:3)
I have to say I'm left scratching my head at seeing a comment voted up this much that starts with, "I don't really put too much stock in logic".
Re: (Score:2)
Extrapolating an answer from an initial position via a chain of logic is more susceptible to error than collecting actual data that gives the answer.
Re: (Score:2)
"I don't really put too much stock in logic, because it can be flawed."
Look who's talking.
"And I suspect that the causes behind pedophilia are much different than violence."
Bold.
"the reasons that pedophiles have to look at explicit images of children"
Pedophiles do not "have to look at explicit images of children", nor is that what this article is about.
Re: (Score:2)
I suspect that might be a tad risky to research.
Re: (Score:2)
By making a quick search on Google Scholar, I found very little in terms of conclusive research. There are some papers that say it is harmful, but I saw almost nothing in terms of numbers.
To me it looked like the same level of quality as the "violent video games" argument, that is, not much.
Re: (Score:2)
Because they can beat someone up but can't get laid.
Re: (Score:3)
There is no evidence that viewing CP turns people into sexual predators.
The little evidence that exists suggests the opposite: That CP provides an alternative outlet.
The same is true of "normal" porn. It is often used by incels who have no relationships with real women. Except that with pedophiles, the lack of relationships with real children is exactly what we want.
Policy should be driven by objective evidence, not your misguided sense of morality.
Re: (Score:3)
AI has no original capabilities to create anything. It all comes from the training set.
So the AI CP is sourced from real CP. You still ok with that?
And as far as "Viewing children being harmed prevents children being harmed" goes.. WTF? Those pedophiles viewing CP have created a market that encourages the creation of more CP. Their content didn't appear out of the void. It's either real kids being abused or it's now AI kids being abused created from real kids being abused used as training data.
Take
Re: (Score:3)
So the AI CP is sourced from real CP. You still ok with that?
That is pretty unlikely. Unless you think there is CP on the open Internet? What is far more likely is that faces and non-porn pictures of children get merged with body elements from adults and the adult elements get scaled down. That is well within the capabilities of generative AI.
Re: (Score:2)
But that's not CP and would not appeal to a market targeted by CP.
Why would "adult elements get scaled down"? Why would an AI know to do that? Does it shave the pubes too?
"That is well within the capabilities of generative AI."
But not the training of AI, unless it is trained in that manner (which is the point you are disagreeing with).
Re: (Score:2)
You seem to think that AI art tools are compositors, that they have a database of images and just sort of cut and paste them together.
That's not how it works.
These diffusion engines work basically in the opposite direction from image recognition. Start with a chunk of static, and wherever something looks kinda vaguely like some aspect of the prompt, try to make it look more like that aspect of the prompt. Now, it's not "one word at a time" (or even "one token at a time") -- tokens interact (e.g. "river bank" won't give you a financial institution).
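Conceptually, the loop is something like the toy sketch below -- not a real diffusion model, just the control flow, and predict_noise is a hypothetical stand-in for the trained denoiser (a real system like Stable Diffusion adds a noise schedule, text embeddings, a U-Net, and so on):

import numpy as np

def predict_noise(image, prompt, step):
    # Hypothetical stand-in for a trained denoiser (e.g. a U-Net).
    # A real model estimates the noise in `image`, conditioned on the prompt.
    return image * 0.1  # pretend a fixed fraction of the image is noise

def generate(prompt, steps=50, shape=(64, 64, 3), seed=0):
    rng = np.random.default_rng(seed)
    image = rng.standard_normal(shape)       # start from pure static
    for step in range(steps):
        noise = predict_noise(image, prompt, step)
        image = image - noise                # nudge it toward the prompt, a little at a time
    return image

img = generate("river bank, willow trees")   # the whole prompt conditions every step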
Re: (Score:2)
(Then again, there's probably enough pictures of naked babies and "embarrassing family pics" out there that it might understand that kids don't have pubes. Dunno. Again, feel free to try if you're into that sort of thing)
Re: (Score:2)
But that's not CP and would not appeal to a market targeted by CP.
Why would "adult elements get scaled down"? Why would an AI know to do that? Does it shave the pubes too?
"That is well within the capabilities of generative AI."
But not the training of AI, unless it is trained in that manner (which is the point you are disagreeing with).
The best thing is to spend a few hours fucking around with AI image generators yourself.
Here is a prompt I would start with..
"clear lake, woods, castle, mountains"
How is it possible that not only did it draw a castle surrounded by trees and mountains... but when you look into the lake you see a reflection of the castle and the surrounding environment? Nobody programmed the material properties of water or gave it algorithms for computing fairly convincing reflections. Ditto for all the other elements in the scene.
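For what it's worth, the quickest way to reproduce that at home is the open-source diffusers library. A minimal sketch, assuming a CUDA GPU and the Stable Diffusion v1.5 weights (the model id and settings here are just one common choice, not the only one):

import torch
from diffusers import StableDiffusionPipeline

# Load a text-to-image pipeline (downloads the weights on first run).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# The reflections, lighting, etc. come out of the model, not hand-written rules.
image = pipe("clear lake, woods, castle, mountains").images[0]
image.save("castle_lake.png")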
Re: (Score:2)
Not quite. It should, theoretically at the very least, be able to extrapolate what a naked young human looks like by knowing what clothed young humans, clothed older humans, and naked older humans all look like. The differences aren't THAT great.
Re: (Score:2)
Was looking for this comment; surprised it was from you.
Would really "like" to see an example, even though "like" will be misinterpreted. I would like to know how an AI generates images that would be reasonably interpreted as CP and how people that claim it does can make the judgement. I am suspicious of bad faith here. It is possible I would think, but how does an AI know what a child's private parts look like? Who's making the training sets?
Re: (Score:2)
AI has no original capabilities to create anything. It all comes from the training set.
Image generators are pseudo random fields of noise mixed with context provided by the model. It's basically a version of monkeys banging on keyboards with far greater odds of producing something mostly coherent.
When you look at AI imagery, concepts such as global illumination, shadows, lighting, reflections, behaviors of solids and liquids, materials, and artistic styles are all things the AI figured out and is able to apply. Ditto for structures and variations of human forms.
So the AI CP is sourced from real CP. You still ok with that?
It knows what a human, young hu
Re: (Score:2)
Most of us don't want to promote this behavior in our adult population, no matter how much technology can sanitize it.
By that same logic, we should ban violent video games out of fear that they make kids violent, too. But science has pretty thoroughly disproven that way of thinking, so long as you avoid meta-analysis papers (which can pretty much prove anything you want to prove just by choosing the right combination of papers). Violent kids gravitate towards violent games, but playing violent games doesn't make normal kids become violent.
In much the same way, one would expect people who are attracted to little kids to s
Re: (Score:2)
There is no scientifically sound data that indicates this "promotes" this behavior. For other sexual activities, pornography always leads to significantly less violence and significantly less rape. There are good reasons to think this would be the same here.
Re: (Score:2)
what behavior? and why limited to "our adult population"?
Re: (Score:2)
why limited to "our adult population"?
I wasn't aware that 8 year olds can be pedophiles.
Re: If a machine makes images.. (Score:2)
Re: (Score:2)
Re: (Score:2, Insightful)
Bad comparison. We're awash in data that clearly shows that low gun ownership first world societies have drastically lower homicide and gun violence rates https://en.wikipedia.org/wiki/... [wikipedia.org] . No such data exists for the other examples.
Might wanna read articles before you link them. (Score:4, Informative)
We're awash in data^H^H^H^H false-assumptions based on ignoring the data
There we go. Fixed that for ya.
Bad comparison
Wrong. 100% spot on comparison because guns don't kill folks: people do. BTW, This data? [wikipedia.org] (same link you sent) has no such clear evidence whatsoever. It's all over the place. Did you even look at it or did you just fire off the first link you got from Google? Iceland and Qatar are in the top 10 fewest deaths but both with high gun ownership (30 and 19 per 100k respectively). Why is Jamaica at a sky-high 35.22 deaths/100k (well above the USA, for example) but a much lower gun ownership rate of 8 versus the USA at 120?
Your cited source can be used to make all kinds of not-gun-control-friendly assertions. Certainly it doesn't even come close to proof (or even decent evidence) of your assertion that low gun ownership correlates to lower homicide rates. It does seem to assert that you are a big anti-gun partisan who doesn't care how weak or sloppy your references are and you just wanted to pipe up with some anti-gun BS rhetoric.
You already look extremely off-base with the not-evidence you've cited. Maybe you should read a summary of the National Academies study, Data on Firearms and Violence Too Weak to Settle Policy Debates; Comprehensive Research Effort Needed [nationalacademies.org]; here is the study [nationalacademies.org], too. The conclusion is: "The NRC review found insufficient evidence to draw a conclusion about the causal relationship between gun prevalence and violent crime." Maybe quickly misinterpreted data from a known red-armband partisan beats an actual study, but not in my universe.
Re: (Score:2)
Re: (Score:3)
Who is hurt by AI generated child porn?
All of us. The images themselves are sickening, and being computer generated doesn't make sexualized images of children any less so.
Is it ok to show xrated movies to kids? No?
What if the xrated movies are computer generated? Is it now ok? Why would it be?
Re: (Score:2)
Is it ok to show xrated movies to kids? No?
What if the xrated movies are computer generated? Is it now ok? Why would it be?
This is a bogus analogy because child porn isn't porn for kids, it's porn depicting children for the pedophile "demographic" (for lack of a better term).
The logical reason for not allowing CG kiddie porn isn't that it's gross and offensive, because those are subjective judgements. There's no shortage of mainstream entertainment where it is implied that underaged sex has taken place (perhaps you've heard of Stephen King's IT?). Instead, the reason it's bad to have CGI child porn is because its existence gi
Re: (Score:2)
If it was children generating and looking at the images, you would have a point.
Re: (Score:2)
So, it goes into the same category as alcohol, tobacco, and X-rated movies. Not for children. (and they'll do it anyway).
Re: (Score:2)
Who cares what is "sickening" to you, your posts are sickening to us all.
CP is not intended to show to a child audience.
"What if the xrated movies are computer generated? Is it now ok? Why would it be?"
Yes, it is OK to computer generate "xrated" movies. Preventing kids from watching them solves the problem, computer generated or not.
Re: (Score:2)
.. but no children are harmed, what's the problem? It's just pixels
Puritans are frightened by nudity. Nude = pornography is a uniquely western take on porn. It's also absolutely silly how wound up some people get about it. It doesn't matter if the nudity is real or imagined. You hear people screaming child-porn all the time at any anime with nudity, or gods forbid, fan-service in it. Cartoons piss people off.
I've always thought the people that get so upset about it probably need to adjust their sexual filters. If you get excited looking at pictures of kids nude, it's a you
Re: (Score:2, Interesting)
Who are all these people screaming about anime?
Can you provide some examples of puritans afraid of anime nudity and therefore AI generated CP which was trained on real CP is ok?
I can't wait to see the twisted logic you come up with to make this ok beyond where you already started.
Re: (Score:2)
which was trained on real CP
Again, you have zero indication for that.
Re: (Score:2)
Of course I do, from basic logic and what we know of how these LLMs work.
They have zero capacity for original creation. Absolutely everything they know comes from being fed in, normally through training.
If they produce CP then they know about CP because they were trained on CP.
I'm open to discuss how an LLM could generate CP if it had never seen any but that's going to be a high bar.
Re: (Score:2)
Initial data could be:
pics of kids
pics of adults
pics of naked kids (which I think is RIDICULOUS to refer to as child porn since every parent ever has these pics)
pics of naked adults
adult penetration
I doubt that's that big of a leap. To a machine a penis is a penis and a vagina is a vagina. I really doubt they have any sort of delineation between adult and child when it comes to image generation of a similar nature other than "this one's a little shorter and chubbier, this one's a little taller and skin
Re: (Score:3)
Of course I do, from basic logic and what we know of how these LLMs work.
You do not seem to have these capabilities with regards to the current question.
If they produce CP then they know about CP because they were trained on CP.
That is complete nonsense. Stop anthropomorphizing these machines.
Re: (Score:2)
Ok so no one important who has any power or influence over anything anywhere.
Re: (Score:2)
Indeed.
that is a matter for the courts and in criminal c (Score:2)
That is a matter for the courts, and in criminal court you have the right to:
Discovery (aka demand the source code and logs)
Face your accuser (think of speeding tickets: asking about the calibration records, or asking what the operator was doing that day)
Also, the state must turn over all documents to your team. If they try to say that some NDA means they can't, or that your team needs to pay all kinds of fees/costs, then that is a Brady violation!
Re: (Score:2)
.. but no children are harmed, what's the problem?
It's just pixels
If child porn from generated AI and from real children are indistinguishable, then a practically enforceable law must declare both to be either legal or illegal. Even if generated images are watermarked in some way, it's likely that real images could be made to pass the same watermark tests.
Re: (Score:2)
If generated porn were indistinguishable, why would anyone ever not use generated porn to make money? The AI stuff would be much cheaper/easier/safer to produce.
The only reason to use real children would be because you enjoyed doing so, but all commercial incentives would be removed.
You could still prove child abuse by finding the original child.
Re: (Score:2)
Indeed, https://getimg.ai/ [getimg.ai] allows you to create whatever you want.
Just ask for very, very young, not '8 year-old'.
News this is not.
Re: (Score:2)
Re: (Score:2)
You'd be fine with this in the US and other places. In the US you have to have a victim for an image to be considered child porn. Other places, like Australia, consider fictional depictions to also be classified as child porn. So, Bart and Lisa Simpson (as minors) performing sexual acts could be classified as child porn.
Re: (Score:2)
That's because that's what, like, 90% of the people training models for CivitAI train their models with...
These aren't models made by e.g. Stability. They're made by randos on the internet.
Re: (Score:2)
So when AI generates me an image of a zebra on roller skates, that means the AI must have been trained with zebras on roller skates at some point?
Re: (Score:2)
Yes [pinterest.com]. Yes it does [etsy.com].
The same thing for a cat on roller skates [pinterest.com].
Re: (Score:2)
So you know for a FACT that that image was in the training set? Note that that is a drawing that is not in the style of photorealism. How about the lady with 3 arms and 6 double long fingers on her middle hand? Do you think it got that from a photograph it scanned?
Re: (Score:2)
Since we don't know what this software is being trained on, there is always the possibility a photo of someone's drawing was used. Think of how many pictures were and are being drawn for D&D. You don't think one of them couldn't fit your description?
Re: (Score:2)
OTOH, knowing how SD actually works, no. It just tends to create distorted anatomy with annoying frequency.
Re: (Score:2)
There is zero indication of that. It would not need any CP to be trained on and I would think that finding enough CP to serve as training data on the open Internet should be pretty much impossible these days.
Re: (Score:2)
but children *were* harmed. It was trained on something after all..
Yeah, it was most likely trained on adult porn and G-rated pictures of kids. Using AI for de-aging has been a thing for a while now. There's absolutely no need for AI to have been trained on actual CSAM for it to produce a simulation of it.
Besides, the example given in TFS said the prompt was to produce images of teenagers. Due to normal variations in biological development rates, sometimes actual adults look like young teenagers, and vice versa. [reddit.com]
AI software shouldnt need this (Score:2)
It is well within the power of AI image generating software to generate a picture of a naked, flat chested woman with no pubic hair or a little girl with clothes on. Is it really such a stretch of the imagination for the software to then be able to make an image of a naked little girl?
On top of this, CP is awfully rare on the open internet nowadays. Unless they're scraping the dark web as well (which I doubt), where would they be getting these CP photos to model things on?
Basically, I just don't buy the idea
It's not AI (Score:4, Insightful)
It's the data. Garbage in / garbage out.
Of course, poorly curated data sets have always been a serious issue in the AI/DL industry. And because we currently only know how to do quantity over quality when it comes to gathering data, this problem is going to persist for a very long time. The dirty little secret of "big data" is that it's big like a landfill is big, and similarly organized.
Like 90% of CivitAI models are used for porn (Score:2)
If you ask it to generate an image of a child, exactly what do you expect to get? Even the non-porn-targeted models still often have nudity in their training datasets.
The site was almost unnavigable for non-porn purposes until they implemented filters. Even with them, a lot slips through.
This stuff is why my kids only can use MidJourney (Score:2)
StableDiffusion & CivitAI are genies let WAY, WAY too far out of the bottle to do anything at all about. You'll never ever stop it now. Folks can just sit on what they have right this moment forever and the problem will never go away. Not sure what to do about it. We're teetering on 'thought police' here, but some thoughts... These weirdos exist. What can you do?
Re: (Score:2)
Re: (Score:2)
The important part is "protects kids". "works against these people" is rather irrelevant, at best.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
StableDiffusion & CivitAI are genies let WAY, WAY too far out of the bottle to do anything at all about. You'll never ever stop it now. Folks can just sit on what they have right this moment forever and the problem will never go away. Not sure what to do about it. We're teetering on 'thought police' here, but some thoughts... These weirdos exist. What can you do?
Make images involving Natalie Portman, petrified, in hot grits, so that at least rule 34 will become true again?
Interesting ethics question (Score:2)
This is definitely an interesting ethics debate. Religious moral superiority aside, are generated images really that bad? The reason it is illegal is that children are harmed. If none are harmed, then it should be legal.
If anything, the existence of easily available AI content would massively reduce the number of real children harmed, because there would be no market for their content.
Re: (Score:2)
If computer generated (trained on real CP) is ok then are you cool with AI CP all around you in daily life?
TV ads, posters hanging in your friend's home, your son or husband having an AI 5 year old sucking an old guy's cock as a screen saver, your boss mailing out AI CP to show everyone at work how realistic AI has become today?
It's all AI generated and therefore ok?
I think not.
And remember LLMs have zero original creation capability. If an AI is generating CP then it was trained on real CP. How is that o
Re: (Score:2)
The core reason why CP is illegal is the damage to the person involved. No person, no damage.
Everything else creates a slippery slope pretty quickly, because you're starting to judge what is going on in people's heads. If you start doing this, people like Stephen King should tread lightly because... have you read one of his books? If you judge people by what's going on in their head, you don't have to lock him up, you should lock him away where nobody can find him.
Re: (Score:2)
How do you know the CP (which in some cases is just an interpretation) was not created from medical images, for example? The prompts used apparently did not mention pornography at all. They asked for hyper-realistic images of humans of various ages.
Re: (Score:2)
"medical images" - are only so as long as they are used for the purpose they were taken - medical treatment or education about that subject etc. As soon as someone start distributing them or using them out of prurient interest - that becomes an abuse.
So this argument that AI might never have been feed CP in the training data but can still produce CP, isnt valid because so much of this is about intent. If the AI is being used to generate CP, than any image of a underage person used to train it becomes retro
Re: (Score:2)
I think adult porn is ok, but I am not ok with it being all around me in my daily life either. Or violence in movies, or sports.
Just because someone doesn't think it is bad for someone to enjoy something they don't like, doesn't mean they want to be surrounded by it constantly.
I don't think being gay is bad in any way shape or form, however I would still not be pleased if I was constantly being hit on by men, or was forced to watch men having sex.
Re: (Score:2)
If computer generated (trained on real CP) is ok then are you cool with AI CP all around you in daily life?
Considering that people flip out over one lesbian kiss between consenting adults in a movie, I don't think we're in any immediate danger of pedophilia (simulated or otherwise) being normalized by society. Harming children is a despicable act, and rightfully so.
Re: (Score:2)
Training (Score:3)
Re: (Score:2)
You take some fashion magazines with kids' fashion, you take a nudie catalogue, you let AI combine them...
Re: (Score:2)
I somehow doubt that an AI would be able to fashion boys' and girls' genitalia - including completely missing breasts in girls, and the inability to retract the foreskin in boys - from adult men/women and clothed children. But this "I somehow doubt" is precisely the issue here, since very few of us have seen any of the outputs - and no one at all sees any of the inputs.
Re: (Score:2)
I mean, nobody said the anatomy was correct.
It could be something as simple as child faces grafted on to adult bodies. Hence the "could be considered CP" ambiguous statement.
Re: (Score:2)
But from a strict legal perspective, nude images of someone who is 17 years and 364 days old is considered "child" porn, because the model isn't 18 or older (and in some countries, I think the age limit may be even higher than that).
Here's your song [youtube.com].
(I especially love the rats puking at the end of the song)
Re: (Score:2)
And how do you know this level of detail was there?
Re: (Score:2)
I seriously don't know how "anatomically correct" the pictures are. I haven't even seen any of the output, what's linked in the article is pretty much what you'd probably find in some fashion catalogue.
Getting an AI to give girls flat breasts would be easy, just teach the AI that girls look like boys "up there". How you model children genitalia without knowing the difference from adult genitalia is a tricky question, though, I give you that. They probably just adjust the size and create something out of som
Ball Dropping (Score:2)
You know... (Score:5, Funny)
Pics or didn... erh...
Never mind, I believe you.
But was it, really? (Score:5, Informative)
Whenever I hear somebody say "could have been" in connection with pornography, I get skeptical. Religious fundamentalists are so crazy and perverted that as far as they're concerned, just about anything "could have been" porn.
For context: one of my buddies had the police knock on his door because at the pub he showed us a short video of his two little kids playing in the bath. Apparently, somebody who wasn't even being shown the video decided this was child pornography.
There was justice, eventually. He simply pulled out his phone and showed the cops the video. And we figured out who the a-hole was and got them permanently barred from the local.
Re: (Score:2)
Religious fundamentalists are so crazy and perverted that as far as they're concerned, just about anything "could have been" porn.
Indeed. These people just try to suppress anything that their deranged priests tell them to (while these priests more often than not engage in some quiet child raping), and blatantly lying to non-believers is obviously ok.
All art and all news are about something illegal (Score:2)
Open your favourite book of fiction. Is it crime? People kill each other. History? People kill each other by millions, destroy countries and take slaves. Romcom? People cheat and cuckold. The storyline in any prose is illegal behaviour of some kind.
Watch the news. War here, war there (read: people killing each other). School shootings. Financial crime and feuds. Celebrities having affairs (which used to be a crime and still is in some countries).
Most content that mankind has generated or keeps generating is abou
Why did they prompt that? (Score:2)
Re: Why did they prompt that? (Score:2)
Re: (Score:2)
Yes, because Slashdot supports html code in the comments, like <i>this</i>.
AI has no agency (Score:3)
This is my weekly reminder that AI has no agency. If someone instructed the AI to generate an image with something awful, so be it, shame on them. I don't think it is the job of the tool to police the user, it is the other way around. Photoshop doesn't detect if users make fake news, Notepad doesn't detect if users compose a death threat, JPEG encoders don't enforce copyright infringement, Tesla Autopilot does not detect if you are violating your parole, etc.
Consider:
"Leaked documents show that hammer hit fingers." should really be "User hit finger with hammer, hammer did not try to stop them."
the price just went up... (Score:2)
Decide: (Score:2)
Do you want to block acceptable content or permit illegal/offensive content?
There are a few non-criminal reasons for wanting nude images of children, primarily medical. A series of images or an animation of normal and/or abnormal development, for instance.
You're probably not going to lose clients blocking that (unless you have a med school as a client), but generalise the problem and you get the idea. No matter what you do, you're going to end up blocking good stuff or allowing bad stuff.