OpenAI Considers Allowing Users To Create AI-Generated Pornography (theguardian.com)
OpenAI, the company behind ChatGPT, is exploring whether users should be allowed to create AI-generated pornography and other explicit content with its products. From a report: While the company stressed that its ban on deepfakes would continue to apply to adult material, campaigners suggested the proposal undermined its mission statement to produce "safe and beneficial" AI. OpenAI, which is also the developer of the DALL-E image generator, revealed it was considering letting developers and users "responsibly" create what it termed not-safe-for-work (NSFW) content through its products. OpenAI said this could include "erotica, extreme gore, slurs, and unsolicited profanity."
It said: "We're exploring whether we can responsibly provide the ability to generate NSFW content in age-appropriate contexts ... We look forward to better understanding user and societal expectations of model behaviour in this area." The proposal was published as part of an OpenAI document discussing how it develops its AI tools. Joanne Jang, an employee at the San Francisco-based company who worked on the document, told the US news organisation NPR that OpenAI wanted to start a discussion about whether the generation of erotic text and nude images should always be banned from its products. However, she stressed that deepfakes would not be allowed.
It said: "We're exploring whether we can responsibly provide the ability to generate NSFW content in age-appropriate contexts ... We look forward to better understanding user and societal expectations of model behaviour in this area." The proposal was published as part of an OpenAI document discussing how it develops its AI tools. Joanne Jang, an employee at the San Francisco-based company who worked on the document, told the US news organisation NPR that OpenAI wanted to start a discussion about whether the generation of erotic text and nude images should always be banned from its products. However, she stressed that deepfakes would not be allowed.
Because Money (Score:4, Funny)
There was too much money left on the table. Isn't this also why Sony finally gave in to porn on Blu-Ray?
Re: (Score:2, Insightful)
It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.
Re: (Score:3)
It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.
Maybe. But if it costs $100 million to train a really good AI, that's not something an open source project is likely to pull off.
Re:Because Money (Score:4, Insightful)
There are many models for this already that work fine. And porn as a field has a hilarious number of IT nerds experimenting with how to make it.
Re:Because Money (Score:5, Interesting)
I dabble in home-made Generative AI, mainly Stable Diffusion. Emphasis on fantasy/sci-fi fusion images (animals, military vehicles, spaceships, Steampunk, Cyberpunk, etc.)
It's become increasingly difficult to avoid free models which contain NSFW stuff. You stumble upon them everywhere.
I found myself having to add more and more negative prompts to make sure images don't contain scantily-clad (or outright naked) females.
They're that pervasive.
So, no, not difficult at all to find and generate.
OpenAI is reading the writing on the wall and trying to stay ahead of the wave. If it doesn't, less prudish entities will make money hand over fist from AI pr0n.
Re:Because Money (Score:5, Insightful)
Yeah, it's honestly annoying how pervasive porn models are. I think like 90% of people using SD are using it to make porn or waifus.
As I was driving home last night... you know how Szilard, walking home after hearing Rutherford be dismissive of nuclear chain reactions, stopped at a stoplight and suddenly the idea of neutron chain reactions hit him like a ton of bricks, like the world peeled away around him? I had one of those moments when I realized that all of the components are now out there:
Story Diffusion (e.g. "Open Sora")**: generate realistic videos of anything from input commands.
MotionCtrl (ability to direct objects and camera positions in AI vids)
Open LLMs (countless), trained to roleplay multiple characters, control scenes, and to issue external commands
VR headsets (with a mic + voice recognition and/or hand controls; or alternatively, LMMs can accept voice natively)
Optional: FPV cameras on the headset, plus depth map model (Midas, etc) background stripping, plus vid2vid to integrate the user's body into the scene
And... remote-control sex toys that accept external commands.
You can see what someone is surely going to put together from these pieces over subsequent years: a porn holodeck. Where an open LLM or LMM takes user inputs (such as speech or motion) once every 1-2 seconds, determines character reactions, directs the next 1-2 seconds of video generation, generates audio, and when appropriate, issues commands to control... peripherals. It'll take quite a bit of compute power to handle the video generation all in realtime, but you can fan it out to multiple cards, so it's doable if you keep the resolution and steps down & just AI upscale the outputs and motion-interpolate between frames. The LLM/LMM wouldn't need a huge number of parameters for such a role - even something like LLaMA 8B as a base model should suffice; there are no issues with running that in realtime on any semi-modern GPU.
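Just to make the loop concrete, here's a rough Python sketch of the kind of per-tick "director" I'm imagining. Every class and callback in it (SceneLLM, render_video, speak, send_to_device) is a hypothetical stand-in I made up for illustration, not a real library:

import time
from dataclasses import dataclass

@dataclass
class Directive:
    # What the director model decides for the next 1-2 second window.
    scene_prompt: str      # text handed to the video model
    dialogue: str          # line to synthesize as character speech
    device_command: dict   # e.g. {"intensity": 0.4} for a peripheral

class SceneLLM:
    # Stand-in for a small local LLM (an 8B-class model would do) that turns
    # the latest user input plus scene history into the next directive.
    def next_directive(self, user_input: str, history: list) -> Directive:
        # A real implementation would call a local inference server here.
        return Directive(
            scene_prompt=f"continue scene; user said: {user_input}",
            dialogue="(generated reply)",
            device_command={},
        )

def run_session(llm, get_user_input, render_video, speak, send_to_device,
                tick_seconds=1.5):
    # Fixed-rate loop: read input, ask the LLM for a directive, then fan the
    # pieces out to the video, audio, and peripheral backends.
    history = []
    while True:
        user_input = get_user_input()         # speech-to-text or controller state
        if user_input == "quit":
            break
        d = llm.next_directive(user_input, history)
        render_video(d.scene_prompt)          # low-res frames; upscale/interpolate elsewhere
        speak(d.dialogue)                     # TTS
        if d.device_command:
            send_to_device(d.device_command)  # the... peripherals
        history.append((user_input, d))
        time.sleep(tick_seconds)              # crude pacing; real code would pipeline

if __name__ == "__main__":
    # Dry run with stubbed backends just to show the flow.
    inputs = iter(["hello", "quit"])
    run_session(SceneLLM(), lambda: next(inputs),
                render_video=print, speak=print, send_to_device=print)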
I'm like 80% ACE so the whole thing sounds rather gross to me, but knowing how people are using SD already, you know this is... sigh, I walked right into the pun, didn't I?.... "you know this is coming." :P I guess it's good that the perpetually horny will have such an outlet.
** - Story Diffusion is "out", but they haven't released the video-generation half of the model yet, only the keyframe-generation "comic book mode"; they reportedly plan to release the whole thing, though. There are also less advanced models like Stable Video Diffusion and the like (which MotionCtrl was designed for).
Re: (Score:2)
I'm like 80% ACE
I'm sorry... I don't know what that means.
Maybe this is applicable: "a person who does not experience sexual attraction"?
Re: (Score:2)
Ace is short for asexual.
Re: (Score:1)
Anonymous Coward Extraordinaire
Re: (Score:2)
Odd, I haven't run into this at all. Granted, other than OpenAI, I've only used three AI/LLMs, and only one was for image generation, which forbade any kind of adult content.
I'm sure there are ones that do, but I haven't run into any.
Re: (Score:2)
Drop by the user-created models at, say, civitai at some point and start browsing. We're talking user-created stuff, not things made by megacorps.
Re: (Score:2)
If only they had done the same thing when the first sexually explicit cave paintings appeared... or statues... or drawings on paper... or photos... or radio broadcasts... or video footage... or online websites... /sarcasm
Re: (Score:2)
It doesn't.
Also, there are already quite a few good open source AI models.
Re: (Score:2)
It already exists, and there is a plethora of models out there, some of them on the darker parts of the web because they tread into the realm of punishable felonies. So far it has been static images, but movies are already coming.
Seems like you have been under a boulder the size of Mount Washington for the last two years.
It's just a question of time until a single person makes a movie rivaling the great classics like "The Ten Commandments" and "2001" for almost nothing. That would render actors if no
Re: (Score:2)
You're confusing foundational models and finetunes. You can make a finetune in a day or so for any specific task. Including porn.
Re: (Score:2)
It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.
We didn't need computers to generate porn before. People were creating lifelike pornographic images with paint and canvas, pencil and paper, stone and chisel, and various other media. If this is about keeping children out of this, then there are the same means to produce it as with adults: find someone with the ability and willingness to produce it. There are also cases of adults with child-like proportions posing for cameras to create the illusion of children in situations not socially acceptable for childre
Re: (Score:2)
It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.
I doubt they were banning it out of a sense of morals; whether they're willing to do it depends on whether they think the extra cash outweighs the negative press/perception.
Re: (Score:1)
I guess they learned from Betamax.
Re: Because Money (Score:2)
Generating these things locally has almost always been possible. There are also companies that specialize in generating adult images and text. (Likely with their own local models.)
Re:Because Money (Score:5, Insightful)
There was too much money left on the table. Isn't this also why Sony finally gave in to porn on Blu-Ray?
You say that like it's a bad thing. That there's money to be made means you're satisfying a desire lots of people have (and are willing to pay for).
I get why companies do what they do. YouTube wants to appeal to a broad market and doesn't (or didn't) have a good way to cordon off racy content. OpenAI might have a similar problem. Personally, I'd prefer the LLM companies released G-, PG-, R-, and X-rated versions of their LLMs and let users select what they want. I'm selecting the G or PG version, but that's just me.
On a pragmatic level, I have to wonder which is worse: actually hiring young women to have actual sex on camera, or letting AIs generate the video. This is the same argument we've had about AI-generated kiddie porn. Do we cause more harm by normalizing artificial kiddie porn or by outright banning it, knowing some people will just create it the old-fashioned way?
Re: (Score:2)
By at least banning artificial kiddie porn, you are sending the message that this is a socially unacceptable thing. If you don't ban it, it's essentially giving the OKAY for this behavior. Sure, those are somewhat extreme outlooks, but do you see a middle ground to be had here?
As for hiring actual young people to have sex on camera versus AI-generated, both probably have their place. Hiring adults to have sex on camera is a contract among consenting adults. AI-generated of course doesn't hurt anyone (may
Re:Because Money (Score:4, Interesting)
By at least banning artificial kiddie porn, you are sending the message that this is a socially unacceptable thing. If you don't ban it, it's essentially giving the OKAY for this behavior.
Okay to what behaviour? Kiddy porn is banned because its creation involves the abuse of children, who by legal definition cannot consent. Who is being abused by generating a text prompt that spits out a kiddy image? WON'T SOMEONE THINK OF THE PIXELS!
People jack off to all sorts of weird things. I'm not one to judge. If someone wants to beat off to fake AI-generated pictures of kids, then seriously, more power to them. If anything, AI competes with ... potentially even reducing ... the actual abusive content out there. If these people's cravings are satisfied in this way, it's one less abused-kid video being traded online.
Just for fun, look up child abuse stats in Japan and note how much lower they are despite the very VERY widespread prevalence and legality of underage hentai. By saying that you think algorithmically generated pictures of underage people should be banned, you're effectively creating a thoughtcrime.
Re: (Score:2)
I get what you are saying. Completely. You may even be correct that by allowing a "fake" gives someone an outlet and that sates the urge.
It could also lead to them wanting the real thing. I could see either scenario taking place.
Regarding Japan, just look up any criminal stat and it's likely to be lower than in the USA. We're #1 for incarceration, after all.
Re: (Score:1)
It could also lead to them wanting the real thing.
Other media doesn't work that way, so why would this?
Re: (Score:2)
By at least banning artificial kiddie porn, you are sending the message that this is a socially unacceptable thing. If you don't ban it, it's essentially giving the OKAY for this behavior.
Okay to what behaviour?
You're okay'ing the behavior of creating and consuming pornographic images of children. Most people's reaction to that isn't "oh, well if you're only looking at porn involving artificial children, that's totally within society's norms!".
I don't have a well-formed opinion on whether such artificial images should be illegal (On one hand, there's the argument you make about it being victimless. On the other hand, "ick".) but it shouldn't be surprising that a business doesn't want their product to appear to
Re: (Score:2)
You're okay'ing the behavior of creating and consuming pornographic images of children.
Yes I am. Providing they involve no actual children, it should be 100% okay because... again... it's a thought crime. You're literally punishing someone's imagination. Why is an imagination that has no further impact on others a crime?
Most people's reaction to that isn't "oh, well if you're only looking at porn involving artificial children
No. Most religious nutjobs who think their moral opinion should be imposed on others despite no actual impact on others think that. The same backwards fuckwits who would support obscenity laws that prevent consenting adults from doing what they want to each other in their own home
Re: (Score:2)
You seem to have misunderstood the point of my post. To my knowledge creating such images isn't generally illegal. I don't believe OpenAI or its users would be breaking any laws by doing so in most jurisdictions.
Despite that, it seems pretty obvious why OpenAI wouldn't want their model to be known as the go-to place for people looking to get their fill of (artificial) kiddie porn. I see what you're saying about the seeming contradiction between the taboo around pornographic material when ultra-violence
Re: (Score:2)
Kiddy porn is banned because its creation involves the abuse of children, who by legal definition cannot consent.
It's the only thing that I can think of where having media of the crime is also a crime.
I guess you don't live in the UK (or maybe the US either): possession of various 'how-to' manuals (e.g. how to build a bomb) is also illegal, and grounds for arrest and conviction.
Please note, I'm not saying this is 'right' (although I'll admit it's more 'complicated' than 'all knowledge should be freely available to everyone who asks for it'), just that this is the current state of affairs.
Re: (Score:2)
Just pointing out that sooner or later most companies will choose money even if it conflicts with whatever ethics, morals, or values they espouse. Like so many companies that do business with monarchies and dictatorships with terrible human-rights records, they just can't say no to the money.
Re: (Score:2)
You're throwing around schoolyard insults and I'm the one who needs to grow up?
Present a cogent argument for your position and maybe someone will listen. Otherwise, go back to wanking off in your mom's basement.
Re: (Score:1)
There was too much money left on the table. Isn't this also why Sony finally gave in to porn on Blu-Ray?
Because the capability to easily have the AI create images of Hermione decked out in complete fetish slave gear and under the influence of whatever potion Harry managed to slip into her Butterbeer is just too tempting for some people. It might actually be hilarious if you could get Radcliffe out of the frame. /s
Re: (Score:2)
Obviously. I am surprised it took so long. Well, maybe AI finally can make something that is actually reasonably good. Or not.
Re: (Score:2)
You talk about it as if this is hypothetical. In the real world, on the other hand, AI-generated fake porn is already plentiful. And sold commercially.
Their issue is doing it better than the competition, which they probably can.
Re: (Score:1)
Well, not needing to choose angles that hide fingers and toes is a great start. No idea why your mind went to illegal pornography instead. Projection?
Re: (Score:1)
In other news, how is MindGeek doing financially after actually supporting illegal porn of existing human beings?
Re: Really? ... (Score:3)
I don't get the whole revenge porn thing. If you released a photo of yourself on FB or other social media, someone could already make porn of you, for malicious reasons, for lustful reasons, for random reasons. It feels like that has been possible for at least a few years.
Re: (Score:2)
I think it's more the lowering of requirements. In the past, making a deepfake required some amount of editing knowledge and skill, and even then it was difficult to make it convincing under any kind of critical eye.
Right now AI images are not that hard to pick out for someone who knows what to look for, but as the tech improves, the ability to just download that FB photo, punch in a prompt, and a couple of minutes later have something convincing to 80% of the populace is a scary proposition. Don't e
Re: (Score:2)
While it does lower the requirements, it also lowers the believability that it is you. If any idiot can create this, then why would anybody believe anything is real? Hopefully that removes the social stigma of being accused of making porn. Hopefully it will also make real revenge porn meaningless, because it will all be plausibly deniable.
Re: (Score:2)
If any idiot can create this then why would anybody believe anything is real
Hopefully it will also make real revenge porn meaningless because it will all be plausibly deniable.
Is there a genre of "positive dystopia"? This will all be interesting soon.
Re: (Score:2)
It's because the financial tradeoff potential is huge.
The principal cost of porn production is the women ('actresses'). If you could generate AI women, you eliminate almost all the cost.
OnlyFans, which has even mediocre women making millions.
Personalized avatars + sexbots for lonely people, or those who've lost a loved one.
Generating esoteric fetish material - as mentioned in the OP, things like ultraviolent and niche porn.
The market there is huge. It would almost sublimate the existing porn market if the
Re: (Score:2)
Right. Faking the first moon landing took Stanley Kubrick and tens of millions of dollars, and even there they didn't get it quite right. The next fake moon landing, in five years or whatever, will just need a laptop computer and maybe a few Photoshop experts to polish it up.
Re: (Score:2)
The tsunami of legal actions over revenge porn, or deepfaked porn of real people, would be enormous.
It should be pretty easy to engineer the ChatGPT prompts to refuse to take an existing image (never mind the porn, think of copyright issues) or a request to generate an image based upon some real person.
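As a rough sketch of what that kind of gate could look like in Python (the name list and the real-person check here are placeholders for illustration, not anything OpenAI actually ships):

from typing import Optional

REFUSAL = "Sorry, I can't generate images of real people or modify uploaded photos."

# Placeholder; a production system would use a trained likeness/named-entity
# classifier, not a hard-coded set of names.
KNOWN_REAL_PEOPLE = {"<celebrity name>", "<politician name>"}

def mentions_real_person(prompt: str) -> bool:
    lowered = prompt.lower()
    return any(name in lowered for name in KNOWN_REAL_PEOPLE)

def gate_request(prompt: str, uploaded_image: Optional[bytes]) -> Optional[str]:
    # Return a refusal message, or None if the request may proceed.
    if uploaded_image is not None:
        return REFUSAL               # no img2img on user-supplied photos at all
    if mentions_real_person(prompt):
        return REFUSAL               # no likenesses of identifiable people
    return None

# Example:
print(gate_request("a cyberpunk cityscape at night", None))   # None -> allowed
print(gate_request("<celebrity name> at the beach", None))    # refusal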
Too hard to regulate appropriately? (Score:5, Insightful)
Re: (Score:3)
They'd be better off lowering enforcement. It's a potential disaster to officially allow this because of US politics and the religious right.
As a matter of fact, the very first issue about the Internet that Congress wanted to regulate was obscenity. Remember the CDA, Section 230 [cornell.edu]?
It's by sheer accident that the CDA protects a lot of speech online and shields providers for what they choose to allow. It was originally written with the goal of strongly incentivizing service providers to ban obscenity and the li
Re: (Score:3)
At which time the genie is already out of the bottle and it will be impossible to stop.
Well you can't beat that (Score:2)
What does "NSFW" mean when ... (Score:3)
... your job is producing pornography?
Asking for a friend.
Re: (Score:2)
Anything that risks causing your head to be bashed in.
Remember to not disable the safety protocols on the holodeck or you might get your head bashed in with a giant rubber toy.
Re: (Score:3)
Worked for a data center catering almost entirely to adult websites. I got an earful when my boss walked in on me looking at a dress-shopping website instead of porn, until I was able to show him that the dress-shopping website was also a customer of ours.
Re: (Score:2)
Exactly the same as it did before, unless you're exclusively sharing it with colleagues in your field.
Only harm is to sex workers (Score:4, Insightful)
It will also greatly reduce the risk of "revenge" porn because a video of what appears to be a real person engaging in sexual activity will be assumed to be AI generated, rather than assumed to be real.
Re: (Score:2)
It will also greatly reduce the risk of "revenge" porn because a video of what appears to be a real person engaging in sexual activity will be assumed to be AI generated, rather than assumed to be real.
I don't disagree with you, especially on the first points in terms of displacing real abuse, but this could be a hard statement to accept if you are personally the victim of this and deepfakes of you are being passed around your social circles. I can imagine this being a real issue in high schools and colleges. Even if the people know it's not real, it can still be humiliating; it's a tricky issue.
Re:Only harm is to sex workers (Score:4, Insightful)
Deepfakes are a crime separate from the production of pornography, namely the illegal use of a person's likeness without permission, and in particular, in a disparaging way. That's what needs to be attacked, not the issue of whether it is pornographic. Do you want a deep fake of you eating worms? Or chopping off the heads of babies? Those aren't pornography.
Re: (Score:3)
True, a very good point, but that is a legal one. If the tools are freely and easily available, this line does get blurred quite a bit. Revenge porn was not particularly a thing either until the advent of cheap and abundant video devices and a means to share it.
I remember the Pam Anderson sex tape was such a big deal not because it was pornographic but because at that time it wasn't really possible for such a leak to spread so far and wide. Before the internet someone would have to mail you a
Re: (Score:2)
I wasn't recommending any particular policy, and I think part of the issue is how you would even craft a policy or law around this; it hearkens back to the older pornography discussion of "I'll know it when I see it".
s/porn claim/libel slander etc claim, something I could always pass around, was always not-real-yet-humiliating
Yes, those are recourses that exist, but do they really help the victim? Libel and slander already have roadblocks (for good reasons), and they assume you know the perpetrator and are able to take them to court and cover the fees, and even at the end of that, if you win, what is the recourse? You could
Re: (Score:2)
While the internet is forever, it is also very big, and any porn made of an average person will quickly be lost among a sea of other porn. This will only add to that quantity; in fact, why would you even need to look for "real" porn at all when you could simply generate the porn you want yourself? It will be safer to get porn: no need to worry whether that person is underage or forced into it, because you know they aren't; they are not real. The only harm it is causing is the fact that you believe it is causing you harm.
Re: (Score:2)
The only harm it is causing is the fact that you believe it is causing you harm.
Ehhh, if this is something that gets into your social circle, even if it does get lost in the sea of the internet, it's still not a good outcome to me; somebody, in my opinion, is harmed. Probably not go-to-prison harm, but we should discourage it. Like most other crimes, it's usually someone you know, not just a rando.
But I can't see the genie going back in, so it'll be an interesting social experiment of sorts. I wonder if AI-generated porn will in fact take off bigger than real porn or just remain a niche like hentai.
Re: (Score:2)
At some point, you've got to look at the systemic effects of these things. This has broad reaching social and economic implications.
That said, there's no keeping this cat in the bag at this point... it's going to happen one way or the other.
Re: (Score:2)
The systemic effects are clear: the more porn available in any variant, the less sexual violence. This has been shown to happen time and again, and it is entirely reasonable to expect this to work for _all_ types of porn.
Re: (Score:2)
That said, there's no keeping this cat in the bag
Please leave the poor cat alone. Here's some nice AI porn for you.
Re: (Score:3)
Obviously. Availability of porn is strongly linked to decreased rape numbers. This makes it very plausible that any type of AI porn will reduce the real thing being made. However, the cave-men that have a deep need to apply violence to any problem need to be convinced first that the problem is when this stuff is being made with real people, and that the availability is actually beneficial. And that is not very likely to happen because these people are not accessible to rational argument.
allowing users (Score:2)
That's all you need to know: the overlords are deciding what they will allow us to do.
So more juvenile deep fakes? (Score:2)
It's already become a problem where a juvenile will superimpose a real person's head (typically a female classmate's) onto a naked body. Sometimes it's a completely fake nude body and other times it's someone else's real nude body.
The point being, this is increasingly making headlines and laws are slowly catching up to punish people that decide this is a good idea.
With that said, it's surprising to see OpenAI think this is somehow a good idea. Pretty sure them getting sued by a victim or their parents will be com
Re: (Score:2)
The cat is out of the bag, privacy is dead. We can either deal with it responsibly as a society (ha!) or put our heads in the sand.
Not all nudity is pornographic (Score:2)
So, I'm watching some YouTube videos and among the suggestions was this documentary on cosmetic surgery. I believe it was about 90 minutes long, and from some news outlet I recognized, so I thought this might have been something shown on prime time TV. It started like a typical broadcast TV news program where it introduced a family as a kind of example for the broader issue. The patient of this cosmetic surgery was a teenage girl, but she wasn't looking for breast implants or a nose job like I first thou
Re: (Score:2)
I was exposed to a lot of computer files with people in the nude, some of which were clearly children. But bring this to the public for little cost and somehow this is a problem.
Even discussing this subject exposes you to smears, like a selective, out-of-context quote I used to illustrate this point.
A modest proposal (Score:2)
Someone needs to generate AI porn of the board and execs of OpenAI.
Re: (Score:2)
Evidence, please. I would say the amount of porn available in 1990 was much lower than in 2019, but the suicide rate for the USA has been going down. Yeah, I know there could be other factors; I probably wouldn't accept it going up as direct evidence of more porn causing more suicide either, but that is kind of a broad statement with no actual supporting evidence.
https://ourworldindata.org/gra... [ourworldindata.org]
They simply don't know how to stop it (Score:1)
Certain types of images are illegal. (Score:2)
I'm sure they'll try to block it, but creeps out there will find ways around their blocking. AI porn sounds way too dangerous for them to get involved in.
Re: (Score:2)
Why do you think people who look at porn are creeps? I think the overwhelming majority of males and a large proportion of women have watched it.
That is a way to dehumanize people: take a perfectly normal, natural part of human nature, in this case attraction to the opposite sex, and call it something bad.
It really amazes me that a desire to see someone naked is considered "creepy" while watching movies that depict people dying in horrible ways is OK; nobody says people going to watch a horror or an action mo
Re: (Score:2)
Huh? Read what I said. I am all in favor of REGULAR porn. Go look up what kind of porn is ILLEGAL, that's what I was talking about.
Doesn't Porn, like Gaming, Push the Technology? (Score:2)
Good. (Score:2)
I've always maintained that - given history - the first place AI would generate photorealistic images and video would be porn.
The only ones who lose in this are women.
Re: (Score:2)
The only ones who lose in this are women.
Actually, no. More porn availability is linked to less sexual violence.
Re: (Score:2)
The only ones who lose in this are women.
Deep fakes aside, why? The guys are lusting after someone else, not you.
Oh. I get it. It's bad for OnlyFans creators.
Re: (Score:2)
At some point, too much of almost anything is bad for you. Also, why is it bad for women? Isn't it the man's body and choice if he wishes to watch porn, especially if no one is harmed in the production? The only women this is bad for are women who work in the porn industry.
Verification? (Score:1)
Wot, no Rule 34 yet? (Score:2)
People were probably having this conversation when medieval woodcutters started making porn for printing on the first printing presses. You can see how totally unsuccessful we have been at achieving consensus in the intervening centuries.