AI Technology

Stable Diffusion Made Copying Artists and Generating Porn Harder (theverge.com) 63

AmiMoJo writes: Users of AI image generator Stable Diffusion are angry about an update to the software that "nerfs" its ability to generate NSFW output and pictures in the style of specific artists. Stability AI, the company that funds and disseminates the software, announced Stable Diffusion Version 2 early this morning European time. The update re-engineers key components of the model and improves certain features like upscaling (the ability to increase the resolution of images) and in-painting (context-aware editing). But the changes also make it harder for Stable Diffusion to generate certain types of images that have attracted both controversy and criticism. These include nude and pornographic output, photorealistic pictures of celebrities, and images that mimic the artwork of specific artists.

"They have nerfed the model," commented one user on a Stable Diffusion sub-reddit. "It's kinda an unpleasant surprise," said another on the software's official Discord server. Users note that asking Version 2 of Stable Diffusion to generate images in the style of Greg Rutkowski -- a digital artist whose name has become a literal shorthand for producing high-quality images -- no longer creates artwork that closely resembles his own. "What did you do to greg," commented one user on Discord.

This discussion has been archived. No new comments can be posted.


  • by ffkom ( 3519199 ) on Friday November 25, 2022 @05:18PM (#63079604)
    ... if they had been offered an almost magical paintbrush that would accelerate their painting by a hundred times, but at the same time disallowed them to paint all those many acts and nudes and celebrities they used to paint?
    • Probably "it's OK, I like doing the naughty bits"

      While this story came out I was just generating some images of a restaurant called Shenanigan's [ko-fi.com]. My poor little 1070 struggles to grunt these out even at 512x768.

      I've only made porn accidentally so far, I asked for some portraits in the style of Masamune Shirow and got more tits than robots.

      • by lsllll ( 830002 )
        Interesting. I have an installation in a VM under Proxmox, and clearly throwing 16 cores at it is painfully slow. So I did some reading, which made me think I could upgrade my wife's GPU and throw her 1070 Ti in the Proxmox server to do SD on, but from what you're saying, that'll still be painfully slow.
        • It's not horrible or anything, even at the max 150 steps it's only 3-4 minutes at that resolution. But it's not exactly fast. At 20 steps you can get nine images (which is my usual sample size) and then decide what kind of words you'd like to use, and if any of the seeds are promising, etc... and that takes about 3 minutes on my pny xlr8 1070 oc. (mild and basically irrelevant pre-overclock.) The Ti should be a little bit speedier.

        • It's all about GPU processing power. Rent time on Vast.AI, use the free (or paid) tier on Colab, or throw a 3090/4090 or upcoming 4090 Ti at it. Anything less than a 3080 and you'll probably have a poor experience. CPU would be a nightmare.

          • It's all about GPU processing power. Rent time on Vast.AI, use the free (or paid) tier on Colab, or throw a 3090/4090 or upcoming 4090 Ti at it. Anything less than a 3080 and you'll probably have a poor experience. CPU would be a nightmare.

            Even a 3060 mobile variant is quite nice. The faster GPUs are nice to have, and essential for training your own model due to VRAM, but for inference a 3060 is adequate. Might change with larger models though.
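For a rough sense of scale, the timing numbers upthread (nine 512x768 images at 20 steps in about 3 minutes on a GTX 1070) can be turned into a back-of-envelope estimator, assuming sampling time scales roughly linearly with step count and pixel count. The calibration constant comes from that one anecdote, so treat the outputs as ballpark only:

```python
# Rough back-of-envelope for Stable Diffusion sampling time, assuming
# time scales roughly linearly with step count and pixel count.
# Calibrated from a single anecdote upthread; an extrapolation, not a benchmark.

def estimate_seconds(steps: int, width: int, height: int,
                     sec_per_step_per_mpix: float) -> float:
    """Estimate wall-clock time for generating one image."""
    megapixels = (width * height) / 1e6
    return steps * megapixels * sec_per_step_per_mpix

# Calibrate: ~9 images at 20 steps in ~3 minutes at 512x768 on a GTX 1070,
# i.e. about 20 seconds per image.
per_image = 180 / 9                  # seconds per image
mpix = (512 * 768) / 1e6             # ~0.39 megapixels
sec_per_step_per_mpix = per_image / (20 * mpix)

# Extrapolate to 150 steps at the same resolution.
print(round(estimate_seconds(150, 512, 768, sec_per_step_per_mpix)))
```

By this crude model, 150 steps at the same resolution lands around two and a half minutes, in the same ballpark as the 3-4 minutes reported above once model-loading and decode overhead is added on top.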

      • Oh they know how to tip. They know.

        Welcome to shenanigans, enjoy your food, welcome to shenanigans Calvin works here!

    • They'd go to the paintbrush seller on the other side of the street who's giving it away just for donations, and the seller that was once the talk of the town will join the others in irrelevance.

      • Yup. And then they'll move on down the way and go to the people giving unencumbered paintbrushes away for free.

    • ... if they had been offered an almost magical paintbrush that would accelerate their painting by a hundred times, but at the same time disallowed them to paint all those many acts and nudes and celebrities they used to paint?

      A paintbrush implies a much greater creative input on the end product.

      A better metaphor would be an incredibly quick painter who took commissions (but who refuses to do nudes?).

      That's the critical bit to understanding this tech, the artistic creativity mostly comes from the artists whose work was used to train the models. The user who provides the prompt can accept or reject the output, but they're very limited in their ability to refine the output.

      • The new approach is to have a sanitised base model pre-trained by StabilityAI that can be fine-tuned by users for their own images: porn, hentai, celebs, or graphics in the style of someone. The cost of creating the base model is absorbed by the company, and the cost of fine-tuning is small enough to be accessible to end users. Maybe soon we'll be able to use reference images without fine-tuning at all. So the base model can be completely fine by itself, and the responsibility falls on users now.
        • the cost of fine-tuning is small enough to be accessible to end users

          Yes, but historically it's been pretty hard to use training to produce a style; it's only easy to train for a subject. I don't know how that's going to work with 2.0 though, I haven't played with it yet. Also, unless you have an expensive GPU with a lot of VRAM, you're going to want to do training on hosted resources.
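Stripped of all the diffusion machinery, the fine-tuning idea discussed in this subthread is just: keep running gradient descent from the pretrained weights on a small user-supplied dataset. A toy single-parameter illustration (all numbers invented, no relation to any real SD training code):

```python
def sgd(weight: float, data: list[float],
        lr: float = 0.1, epochs: int = 100) -> float:
    """Nudge `weight` toward the data by gradient descent on squared error."""
    for _ in range(epochs):
        for x in data:
            grad = 2.0 * (weight - x)   # d/dw of (w - x)^2
            weight -= lr * grad
    return weight

base = sgd(0.0, [1.0, 1.2, 0.8])   # "pretraining" on a broad dataset
tuned = sgd(base, [5.0, 5.1])      # "fine-tuning" on a small niche set
# `tuned` starts from the pretrained weight and ends up near the niche data,
# which is why the fine-tuning pass is so much cheaper than pretraining.
```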

    • Re: (Score:2, Insightful)

      by ilsaloving ( 1534307 )

      They would be saying the same thing todays artists are saying: They are happy a machine specifically designed to devalue and demean their work has been removed from the market.

        • You're blind if you think the only use of image AI is to devalue a few artists' work. This is a platform for a million applications.
        • Apparently you missed the memo. This isn't Twitter or Parler, so it's not necessary to respond like a belligerent asshole.

          You're obviously more interested in being a troll than having an actual conversation, so bye.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      ... Made Generating Porn Harder

      Porn is supposed to make you hard. DUH.

  • Shortsighted? (Score:4, Insightful)

    by rsmith-mac ( 639075 ) on Friday November 25, 2022 @05:24PM (#63079616)

    Eh...

    It's their model and they can do what they want with it. But making it less useful is probably not going to do them any favors in the long run, especially as they're not the only image generation model under development.

    • Users were already still using the old model for some stuff. I found it was easier to generate really nice illustrative art styles with it. AUTOMATIC1111's fork just has a quick pulldown so you can switch between models. I'm looking forward to having another one to play with, even if it doesn't supersede the one I'm using now. Maybe I will use its inpainting with stuff I generate with 1.5.
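The pulldown mentioned above is, at heart, just an enumeration of the checkpoint files sitting in a models directory; switching models means loading a different file. A minimal sketch of that idea (the directory layout and extensions are illustrative assumptions, not AUTOMATIC1111's actual code):

```python
from pathlib import Path

# Extensions commonly used for SD checkpoints; an assumption for
# illustration, not an exhaustive spec.
CHECKPOINT_EXTS = {".ckpt", ".safetensors"}

def list_checkpoints(models_dir: str) -> list[str]:
    """Return the sorted checkpoint filenames found directly under models_dir."""
    return sorted(p.name for p in Path(models_dir).iterdir()
                  if p.is_file() and p.suffix in CHECKPOINT_EXTS)
```

A UI would populate its dropdown from this list and reload the pipeline with whichever file the user picks, which is why keeping the 1.5 checkpoint around alongside 2.0 costs nothing but disk space.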

    • Re: (Score:2, Flamebait)

      by quantaman ( 517394 )

      Eh...

      It's their model and they can do what they want with it. But making it less useful is probably not going to do them any favors in the long run, especially as they're not the only image generation model under development.

      Whether it's shortsighted depends on whether they want to be known for generating interesting artistic images or whether they want to be known for generating porn.

    • Re: Shortsighted? (Score:4, Insightful)

      by Pinky's Brain ( 1158667 ) on Friday November 25, 2022 @07:11PM (#63079816)

      They are angling for a buy out, not getting ready to sell a product.

  • by Opportunist ( 166417 ) on Friday November 25, 2022 @05:37PM (#63079624)

    Furry artists worldwide breathed a sigh of relief.

    • Does that refer to artists who draw furries, or artists who are furries?

      • As an actual furry, I can tell you that the meaning has changed over the years. In the 90's and earlier, "furries" were the characters, not real people. After the turn of the millennium, all of that social/gender identity politics nonsense started to creep into the fandom, and now the term "furry" seems to refer to real people. Being one of the older people in this fandom, I find it very annoying and unnatural to refer to humans as "furries".

        • I was at a furry convention in the early 2000s. I had just finished up a fursuit dance competition, and as I was walking off stage, one of the other contestants came up to me and said "Good job! You're really good at this for a human!" I was so taken aback that I didn't know what to say. It took me a few seconds to process what he meant, and by then he had walked away. It wasn't until later that I realized how odd it was for him to refer to me as a "human". At the time, I thought maybe he just didn't kn
      • The people who draw those pics to make money.

        I didn't believe just how much money is in those drawings 'til I saw a price tag. Holy fuck.

  • by twocows ( 1216842 ) on Friday November 25, 2022 @06:12PM (#63079696)
    Considering the way many other AI services are going and the general hatred for anything related to sexuality in the highly repressed business world, I'm just glad they didn't strip the capability completely. Making allowances for copyright or whatever seems fine to me.

    In the general sense, I think increasing availability of fake smut tailored to people's weird, extreme fetishes gives them an outlet so they don't have to repress (or fail to repress) their weirdness in polite company, so I'm fine with AI-generated porn.
    • by Z00L00K ( 682162 )

      The genie is already out of the bottle, so the upgrades they offer might not really cause any changes now, and those wanting to make erotic material will figure out a way around it.

  • One of the first prompts I used on SD was "two cats playing chess in Picasso style", followed by many more Picasso requests and then a lot of Chagall, since he's actually my favorite painter. Don't know if they nerfed those as well, but it's a shame nonetheless.
  • by thegarbz ( 1787294 ) on Friday November 25, 2022 @06:43PM (#63079748)

    AI models are becoming a dime a dozen. If you choose to nerf yours on purpose, you may very quickly find yourself irrelevant in a marketplace of non-nerfed competitors.

    • What you are saying is: if you don't sell sex accessories in your shop, nobody will come to buy clothes and food. We know this is not true; sex accessories are confined to specific places (physical and online), and this does not prevent regular shops and sites like YouTube from grabbing a large fraction of the video streaming market. Other actors accept porn and thrive as well. There is space for several business models.

      • Except sex - which yes, undoubtedly will be the most common use - is not the only reason you might want an image to include nudity or even sexual activity. And isn't necessarily wrong even if it is. Similarly, the idea that a STYLE of image is now prevented? And you can't have anything that looks too much like a celebrity? That's the concept of IP gone completely amok.

        Why would I put effort into using a tool that's deliberately broken?

        • It is a fine-tuning platform. You take the SD model, clean and sanitised, and fine-tune it on a collection of porn of your liking. Or an artist you want to rip. Or just your own selfies. The platform is safe because it doesn't actually train on porn, celebs and artists.
        • Why would I put effort into using a tool that's deliberately broken?

          They target the corporate audience (advertisements, book illustration). As a company it is easy to know whether you will ever want to represent nudity. It's not broken just because they excluded nudity as a field of application of the produced images.

          the idea that a STYLE of image is now prevented?

          The style cannot be prevented; he probably asked to remove his NAME and the index of HIS ARTWORK from the database, based on existing regulations. For example, the Picasso family might ask to de-index the name and the artworks based on the Right to be Forgotten.

      • What you are saying is: if you don't sell sex accessories in your shop, nobody will come to buy clothes and food.

        No. What I'm saying is if you're a general goods store, just like the general goods store next door, and you decide to purge all items in your shop that anyone may find offensive (not just sex, they announced one of the most popular parts of using these AI models: Art imitation), then don't cry when your customers go next door instead.

  • by Big Hairy Gorilla ( 9839972 ) on Friday November 25, 2022 @07:30PM (#63079854)
    Waste of freaking resources.
    For starters fellas, isn't the internet practically founded on porn?

    You've already got the best, why settle for freak show generated?
    • I, for one, am glad that it won't be used to generate hi-res graphic images of George W. & Trump naked in a sauna, sweat & "other" fluids dripping from their pallid wrinkly naked bodies while thrusting & groaning in a lustful dance choreographed by the devil himself.
  • If it's a problem for you, just download v1. Or fork v1 and backport v2. Or vice versa.

    Or just download one of the many existing forks. Yes, the furries have their own.

  • ...talk about a double party pooper.

  • by SuperKendall ( 25149 ) on Friday November 25, 2022 @11:47PM (#63080286)

    Although it sounds like a joke, there is a group managing an "Unstable Diffusion" AI engine, whose goal is an engine tuned to produce high-quality "sex positive" images (meaning any kind of NSFW content). As a group they also do not want people producing deep fakes (or child porn), but they are otherwise trying to keep all of the aspects Stable Diffusion is ejecting, like producing images from other images.

    They are starting a Kickstarter on December 9th to help fund continued development.

    Will be interesting to see how these AI art engines continue to evolve.

    • "as a group they also do not want to have people producing deep fakes(or child porn)"

      Holy crap. That never occurred to me, but makes perfect sense. Various "interest groups" will customize it to their preferences or perversions as you may interpret.

      There is an outstanding question, not yet answered AFAIK, about violence in video games, FPSes and GTA for instance. Does playing the games MAKE you violent, or does playing them offer a "healthy" way to blow off some of your negative tendencies?

      So then, my question
      • by ffkom ( 3519199 )

        So then, my question would be, if for instance child porn people customise it to their liking, is that a "legitimate use" because it "protects" actual children, or is it horrifying because perverts everywhere have access to whatever that is??

        I think you can answer that question easily by looking into history. At times/locations when/where homosexuality was outlawed and a social taboo, it was always assumed that homosexuals could "infect" others to become homosexuals themselves, and that such individuals would have a tendency to rape others to fulfill their sexual desires. There does not need to be any scientific evidence for such being true if there is enough moral outrage about the subject.

        • I take your point. Emotional arguments, in the real world, seem to always win over cold blooded analysis. So then in the case of violence in video games, are there any broad conclusions to be drawn yet? Like... does GTA make teenage boys want to do cocaine off a hooker's ass? and shoot each other and drive over cops? Does it *cause* incels? or is it just shits&giggles, just blowin' off a little steam in the cyberworld? I saw my kids and nephews playing them and I can't say anyone I know turned out bad,
      • if for instance child porn people customise it to their liking, is that a "legitimate use" because it "protects" actual children, or is it horrifying because perverts everywhere have access to whatever that is??

        Yeah that is a good question. I don't know which way that should go.

        It does seem like if you legalized generated child porn, it would help protect real kids against being used for such. But I can also see the argument that legalizing it could drive greater popularity of child porn.

  • Where will we ever get nude images from now?

    You know... superheroes were basically nudes with colored leotards.

  • by Gibgezr ( 2025238 ) on Saturday November 26, 2022 @02:43AM (#63080450)

    First of all, this isn't just about nudity, but let's address the biggest misconception I see.
    A lot of people commenting think that removing nudity from the training is fine, as it just makes the model safe for work, but it actually hurts the generation of non-nude humans as well. Significantly. I use the f222 model as my main general-purpose model, because it generates better clothed humans than SD1.5. The f222 model is based on extending SD1.5 to have more knowledge of nudity (so the total opposite of the direction SD2.0 went). This actually makes f222 better at making humans IN GENERAL. The f222 model knows a lot more about the shape of humans. It's not perfect; what f222 needs is just even more body types and ugly folks, but it's not completely lacking in the ability to generate those either. It definitely does have a bias towards pretty people, but it is not nearly as overwhelming as some of the other models.
    So even just for generating clothed, normal people, the ranking is f222 > SD1.5 > SD2.0.
    Now for the non-nudity stuff: first, other NSFW things. Censoring weapons, say. This sucks. It makes it difficult to do pictures of "50's noir detective holding a pistol". It makes it hard to get the model to do abstract things that are tangentially related to whatever concept gets censored out. Just like with removing nudity from the model, it impacts more than intended. We need to go the OTHER way, and give the AI more to work with, not less: give the artist more to work with, not less.
    Lastly: censoring artist styles. This just hurts the model even more. Tagging a prompt with artists' styles is very convenient, and while trivial usage gets boring fast ("X painted by Picasso"), combining artistic styles together allows the artist to get the AI to give them images that have huge, USEFUL variety compared to otherwise. We get to direct the AI to make images suitable for purpose, but not limited to "looks like a Monet". It's great, a very powerful tool. People made enormous websites that showcased the art style of every artist tagged in the SD1.4/1.5 CLIPs, and artists use those to help them find styles to mix together that will direct the image generation in a useful fashion. We are trying to get the AI to generate the imagery we imagine, not just the imagery IT imagines... a concept lost on many who don't use these tools. I have spent literal days working with an image in SD, bouncing revisions to paint programs and back, inpainting/outpainting and revising until I get what I want. The process is one of art direction, and not just "type in a short prompt and get something cool back".
    It's a collaborative effort between you and the model, and you want the model to have as much imagination and technical skill as possible, but also be very open to direction. If you are an art director, working with a model that's not very good at taking directions is like working with a human artist that can't take directions.
    A general purpose model needs breadth and depth of training in everything under the sun. The models that do that best are most useful as general purpose collaborators, the rest will be passed by.
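The style-mixing workflow described above ultimately comes down to string composition in the prompt. A trivial helper, shown only to make the convention concrete (the "by <artist>" tag form is one common community idiom, not an official syntax):

```python
def style_prompt(subject: str, artists: list[str]) -> str:
    """Append 'by <artist>' style tags to a subject, comma-separated."""
    tags = ", ".join(f"by {a}" for a in artists)
    return f"{subject}, {tags}" if tags else subject

print(style_prompt("two cats playing chess", ["Picasso", "Chagall"]))
# -> two cats playing chess, by Picasso, by Chagall
```

The point of the comment above is that removing artist names from the training data makes tags like these inert: the string still parses, but the model no longer associates the name with a style.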

  • Asking for a friend.
  • Ooooh, so it's now better at generating harder porn?!

    No, wait... :-/
