OpenAI Considers Allowing Users To Create AI-Generated Pornography (theguardian.com) 108

OpenAI, the company behind ChatGPT, is exploring whether users should be allowed to create AI-generated pornography and other explicit content with its products. From a report: While the company stressed that its ban on deepfakes would continue to apply to adult material, campaigners suggested the proposal undermined its mission statement to produce "safe and beneficial" AI. OpenAI, which is also the developer of the DALL-E image generator, revealed it was considering letting developers and users "responsibly" create what it termed not-safe-for-work (NSFW) content through its products. OpenAI said this could include "erotica, extreme gore, slurs, and unsolicited profanity."

It said: "We're exploring whether we can responsibly provide the ability to generate NSFW content in age-appropriate contexts ... We look forward to better understanding user and societal expectations of model behaviour in this area." The proposal was published as part of an OpenAI document discussing how it develops its AI tools. Joanne Jang, an employee at the San Francisco-based company who worked on the document, told the US news organisation NPR that OpenAI wanted to start a discussion about whether the generation of erotic text and nude images should always be banned from its products. However, she stressed that deepfakes would not be allowed.

This discussion has been archived. No new comments can be posted.

OpenAI Considers Allowing Users To Create AI-Generated Pornography

Comments Filter:
  • by El Fantasmo ( 1057616 ) on Thursday May 09, 2024 @10:04AM (#64459535)

    There was too much money left on the table. Isn't this also why Sony finally gave in to porn on Blu-Ray?

    • Re: (Score:2, Insightful)

      by AmiMoJo ( 196126 )

      It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.

      • It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.

        Maybe. But if it costs $100 million to train a really good AI, that's not something an open source project is likely to pull off.

        • Re:Because Money (Score:4, Insightful)

          by Luckyo ( 1726890 ) on Thursday May 09, 2024 @10:19AM (#64459583)

          There are many models for this already that work fine. And porn as a field has a hilarious number of IT nerds experimenting with how to make it.

        • by ceoyoyo ( 59147 )

          It doesn't.

          Also, there are already quite a few good open source AI models.

        • by Z00L00K ( 682162 )

          It already exists, and there is a plethora of models out there, some of them on the darker parts of the web because they tread into the realm of punishable felonies. So far it has been static images, but movies are already coming.

          Seems like you have been under a boulder the size of Mount Washington for the last two years.

          It's just a question of time until a single person makes a movie that rivals the great classics like "The Ten Commandments" and "2001" for almost nothing. That would render actors if no

        • by Rei ( 128717 )

          You're confusing foundational models and finetunes. You can make a finetune in a day or so for any specific task. Including porn.

      • It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.

        We didn't need computers to generate porn before. People were creating lifelike pornographic images with paint and canvas, pencil and paper, stone and chisel, and various other media. If this is about keeping children out of it, then the same means of producing it exist as with adults: find someone with the ability and willingness to produce it. There are also cases of adults with child-like proportions posing for cameras to create the illusion of children in situations not socially acceptable for children.

      • It's pointless them banning it anyway, because there are already plenty of other AI tools to generate porn.

        I doubt they were banning it out of a sense of morals; whether they're willing to do it depends on whether they think the extra cash outweighs the negative press/perception.

    • by Luckyo ( 1726890 )

      I guess they learned from betamax.

    • Generating these things locally has almost always been possible. There are also companies that specialize in generating adult images and text. (Likely with their own local models.)

    • Re:Because Money (Score:5, Insightful)

      by smoot123 ( 1027084 ) on Thursday May 09, 2024 @10:22AM (#64459585)

      There was too much money left on the table. Isn't this also why Sony finally gave in to porn on Blu-Ray?

      You say that like it's a bad thing. That there's money to be made means you're satisfying a desire lots of people have (and are willing to pay for).

      I get why companies do what they do. YouTube wants to appeal to a broad market and doesn't (or didn't) have a good way to cordon off racy content. OpenAI might have a similar problem. Personally, I'd prefer the LLM companies release G-, PG-, R-, and X-rated versions of their LLMs and let users select what they want. I'm selecting the G or PG version, but that's just me.

      On a pragmatic level, I have to wonder which is worse: actually hiring young women to have sex on camera or letting AIs generate the video. This is the same argument we've had about AI-generated kiddie porn. Do we cause more harm by normalizing artificial kiddie porn or by outright banning it, knowing some people will just create it the old-fashioned way?

      • By at least banning artificial kiddie porn, you are sending the message that this is a socially unacceptable thing. If you don't ban it, it's essentially giving the OKAY for this behavior. Sure, those are somewhat extreme outlooks, but do you see a middle ground to be had here?

        As for hiring actual young people to have sex on camera versus AI generation, both probably have their place. Hiring adults to have sex on camera is a contract amongst consenting adults. AI-generated content of course doesn't hurt anyone (may

        • Re:Because Money (Score:4, Interesting)

          by thegarbz ( 1787294 ) on Thursday May 09, 2024 @12:52PM (#64460075)

          By at least banning artificial kiddie porn, you are sending the message that this is a socially unacceptable thing. If you don't ban it, it's essentially giving the OKAY for this behavior.

          Okay to what behaviour? Kiddy porn is banned because its creation involves the abuse of children, who by legal definition cannot consent. Who is being abused by generating a text prompt that spits out a kiddy image? WON'T SOMEONE THINK OF THE PIXELS!

          People jack off to all sorts of weird things. I'm not one to judge. If someone wants to beat off to fake AI generated pictures of kids then seriously more power to them. If anything AI competes with ... potentially even reducing ... the actual abusive content out there. If these people's cravings are satisfied in this way it's one less abused kid video being traded online.

          Just for fun, look up child abuse stats in Japan and note how much lower they are despite the very VERY widespread prevalence and legality of underaged hentai. By saying that you think algorithmically generated pictures of underage people should be banned, you're effectively creating a thoughtcrime.

          • I get what you are saying. Completely. You may even be correct that allowing a "fake" gives someone an outlet and sates the urge.

            It could also lead to them wanting the real thing. I could see either scenario taking place.

            Regarding Japan, just look up any criminal stat and it's likely to be lower than in the USA. We're #1 for incarceration, after all.

            • by Anonymous Coward

              It could also lead to them wanting the real thing.

              Other media doesn't work that way, so why would this?

          • By at least banning artificial kiddie porn, you are sending the message that this is a socially unacceptable thing. If you don't ban it, it's essentially giving the OKAY for this behavior.

            Okay to what behaviour?

            You're okay'ing the behavior of creating and consuming pornographic images of children. Most people's reaction to that isn't "oh, well if you're only looking at porn involving artificial children, that's totally within society's norms!".

            I don't have a well-formed opinion on whether such artificial images should be illegal (On one hand, there's the argument you make about it being victimless. On the other hand, "ick".) but it shouldn't be surprising that a business doesn't want their product to appear to

            • You're okay'ing the behavior of creating and consuming pornographic images of children.

              Yes I am. Provided they involve no actual children, it should be 100% okay because... again... it's a thought crime. You're literally punishing someone's imagination. Why is an imagination that has no further impact on others a crime?

              Most people's reaction to that isn't "oh, well if you're only looking at porn involving artificial children

              No. Most religious nutjobs who think their moral opinion should be imposed on others despite having no actual impact on them think that. The same backwards fuckwits who would support obscenity laws that prevent consenting adults from doing what they want to each other in their own home

              • You seem to have misunderstood the point of my post. To my knowledge creating such images isn't generally illegal. I don't believe OpenAI or its users would be breaking any laws by doing so in most jurisdictions.

                Despite that, it seems pretty obvious why OpenAI wouldn't want their model to be known as the go-to place for people looking to get their fill of (artificial) kiddie porn. I see what you're saying about the seeming contradiction between the taboo around pornographic material when ultra-violence

        • So you want to use laws to tell society what is morally acceptable? That's just top-down control over culture. It's been a thing since the first king. I'll pass.
      • Just pointing out that sooner or later most companies will choose money even if it conflicts with whatever ethics, morals, and values they espouse. Like so many companies that do business with monarchies and dictatorships with terrible human rights records, they just can't say no to the money.

      • Whatever, sophomoric nitwit. Demonstrate one iota of original thought, you mindless repetitious meme-bot. You don't get what is going on at all. They can't control it, so they are trying to get us to accept that whatever their crazy-ass uncontrolled Frankenstein does is just hunky-dory. And here you are, off on some idiotic 1990s Chicago school of economics rant. Grow the fuck up.
        • You're throwing around schoolyard insults and I'm the one who needs to grow up?

          Present a cogent argument for your position and maybe someone will listen. Otherwise, go back to wanking off in your mom's basement.

    • by Anonymous Coward

      There was too much money left on the table. Isn't this also why Sony finally gave in to porn on Blu-Ray?

      Because the capability to easily have the AI create images of Hermione decked out in complete fetish slave gear and under the influence of whatever potion Harry managed to slip into her Butterbeer is just too tempting for some people. It might actually be hilarious if you could get Radcliffe out of the frame. /s

    • by gweihir ( 88907 )

      Obviously. I am surprised it took so long. Well, maybe AI finally can make something that is actually reasonably good. Or not.

  • by boggin4fun ( 1422043 ) on Thursday May 09, 2024 @10:15AM (#64459565)
    This strikes me as the company pursuing the best course of action when they determined that it is not financially feasible to put in place enough protections to prevent this kind of activity on their systems. So... instead of protecting against it, they allow it and make money from it. I'm sure there is nuance here, but this smells like PR spin on a business decision to me.
    • by mysidia ( 191772 )

      They'd be better off lowering enforcement. It's a potential disaster to officially allow this because of US politics and the religious right.

      As a matter of fact, the very first issue about the Internet that Congress wanted to regulate was obscenity. Remember the CDA Section 230 [cornell.edu]?

      It's by sheer accident that the CDA protects a lot of speech online and shields providers for what they choose to allow. It was originally written with the goal of strongly incentivizing service providers to ban obscenity and the like

  • ...or maybe you can
  • by WoodstockJeff ( 568111 ) on Thursday May 09, 2024 @10:28AM (#64459601) Homepage

    ... your job is producing pornography?

    Asking for a friend.

    • by Z00L00K ( 682162 )

      Anything that risks causing your head to be bashed in.

      Remember to not disable the safety protocols on the holodeck or you might get your head bashed in with a giant rubber toy.

    • by jeek ( 37349 )

      Worked for a data center catering almost entirely to adult websites. I got an earful when my boss walked in on me looking at a dress-shopping website instead of porn, until I was able to show him that the dress-shopping website was also a customer of ours.

    • Exactly the same as it did before, unless you're exclusively sharing it with colleagues in your field.

  • by joe_frisch ( 1366229 ) on Thursday May 09, 2024 @10:35AM (#64459631)
    I don't see any downside of AI-produced porn other than the very general AI problem of it displacing human workers. In fact, producing more extreme or currently illegal types of porn would destroy the illegal market for using real people to produce that sort of content. In addition, the sooner people realize that videos are not in any way evidence that something real happened, the more protected we will be from political deep-fakes (which at the moment are extremely dangerous).

    It will also greatly reduce the risk of "revenge" porn because a video of what appears to be a real person engaging in sexual activity will be assumed to be AI generated, rather than assumed to be real.
    • It will also greatly reduce the risk of "revenge" porn because a video of what appears to be a real person engaging in sexual activity will be assumed to be AI generated, rather than assumed to be real.

      I don't disagree with you, especially on the first points about displacing real abuse, but this could be a hard statement to accept if you are personally the victim and deepfakes of you are being passed around your social circles. I can imagine this being a real issue in high schools and colleges. Even if people know it's not real, it can still be humiliating; it's a tricky issue.

      • by groobly ( 6155920 ) on Thursday May 09, 2024 @11:08AM (#64459723)

        Deepfakes are a crime separate from the production of pornography, namely the illegal use of a person's likeness without permission, and in particular, in a disparaging way. That's what needs to be attacked, not the issue of whether it is pornographic. Do you want a deep fake of you eating worms? Or chopping off the heads of babies? Those aren't pornography.

        • True, very good point, but that is a legal one. If the tools are freely and easily available, this line does get blurred quite a bit. Revenge porn was not particularly a thing either until cheap and abundant video devices and a means to share the footage enabled it.

          I remember the Pam Anderson sex tape was such a big deal not because it was pornographic but because at that time it wasn't really possible for such a leak to spread so far and wide. Before the internet someone would have to mail you a

    • by CAIMLAS ( 41445 )

      At some point, you've got to look at the systemic effects of these things. This has broad-reaching social and economic implications.

      That said, there's no keeping this cat in the bag at this point... it's going to happen one way or the other.

      • by gweihir ( 88907 )

        The systemic effects are clear: the more porn available in any variant, the less sexual violence. This has been shown to happen time and again, and it is entirely reasonable to expect this to hold for _all_ types of porn.

      • by PPH ( 736903 )

        That said, there's no keeping this cat in the bag

        Please leave the poor cat alone. Here's some nice AI porn for you.

    • by gweihir ( 88907 )

      Obviously. Availability of porn is strongly linked to decreased rape numbers. This makes it very plausible that any type of AI porn will reduce the real thing being made. However, the cave-men who have a deep need to apply violence to any problem first need to be convinced that the problem is this stuff being made with real people, and that the availability itself is actually beneficial. And that is not very likely to happen, because these people are not amenable to rational argument.

  • That's all you need to know: the overlords are deciding what they will allow us to do.

  • It's already become a problem where a juvenile will superimpose a real person's head (typically a female classmate's) onto a naked body. Sometimes it's a completely fake nude body and other times it's someone else's real nude body.

    The point being, this is increasingly making headlines, and laws are slowly catching up to punish people who decide this is a good idea.

    With that said, it's surprising to see OpenAI think this is somehow a good idea. Pretty sure them getting sued by a victim or their parents will be com

    • The cat is out of the bag; privacy is dead. We can either deal with it responsibly as a society (ha!) or put our heads in the sand.

  • So, I'm watching some YouTube videos and among the suggestions was this documentary on cosmetic surgery. I believe it was about 90 minutes long, and from some news outlet I recognized, so I thought this might have been something shown on prime-time TV. It started like a typical broadcast TV news program where it introduced a family as a kind of example for the broader issue. The patient of this cosmetic surgery was a teenage girl, but she wasn't looking for breast implants or a nose job like I first thought

    • by sinij ( 911942 )

      I was exposed to a lot of computer files with people in the nude, some of which were clearly children. But bring this to the public for little cost and somehow this is a problem.

      Even discussing this subject exposes you to smears, like the selective out-of-context quote I used to illustrate this point.

  • Someone needs to generate AI porn of the board and execs of OpenAI.

  • They don't know how to control their AI. Controlling AI should be their number one priority. Brushing off their responsibility to control their creation like this is completely unacceptable.
  • I'm sure they'll try to block it, but creeps out there will find ways around their blocking. AI porn sounds way too dangerous for them to get involved in.

    • Why do you think people who look at porn are creeps? I think the overwhelming majority of males and a large proportion of women have watched it.

      That is a way to dehumanize people: take a perfectly normal, natural part of human nature, in this case attraction to the opposite sex, and call it something bad.

      It really amazes me that a desire to see someone naked is considered "creepy" while watching movies that depict people dying in horrible ways is OK; nobody says people going to watch a horror or an action movie

      • Huh? Read what I said. I am all in favor of REGULAR porn. Go look up what kind of porn is ILLEGAL, that's what I was talking about.

  • Gaming and porn have always been on the bleeding edge of technology, pushing the limits. Be a shame to turn off one of those spigots.
  • I've always maintained that, given history, the first place AI would produce photorealistic images and video would be porn.

    The only ones who lose in this are women.

    • by gweihir ( 88907 )

      The only ones who lose in this are women.

      Actually, no. More porn availability is linked to less sexual violence.

    • by PPH ( 736903 )

      The only ones who lose in this are women.

      Deep fakes aside, why? The guys are lusting after someone else, not you.

      Oh. I get it. It's bad for OnlyFans creators.

  • by Anonymous Coward
    Pics, or it didn't happen.
  • I'm surprised that in the august company of Slashdotters, we got this far in the comments without anyone bringing up "Rule 34 [wikipedia.org]". Which touches (consensually or not) upon the difficult - if not impossible - problem of defining pornography.

    People were probably having this conversation when Mediaeval woodcutters started making porn for printing on the first printing presses. You can see how totally unsuccessful we have been at achieving consensus in the intervening centuries.
