
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline (vice.com)

The creator of DeepNude, an app that used a machine learning algorithm to "undress" images of clothed women, announced Thursday that he's killing the software after viral backlash for the way it objectifies women. From a report: On Wednesday, Motherboard reported that an anonymous programmer who goes by the alias "Alberto" created DeepNude, an app that takes an image of a clothed woman, and with one click and a few seconds, turns that image into a nude by algorithmically superimposing realistic-looking breasts and vulva onto her body. The algorithm uses generative adversarial networks (GANs), and is trained on thousands of images of naked women. DeepNude only works on images of women, Alberto said, because it's easy to find thousands of images of nude women online in porn.

Following Motherboard's story, the server for the application, which was available for Linux and Windows, crashed. By Thursday afternoon, the DeepNude Twitter account announced that the app was dead: no other versions would be released, and no one else would be granted access to the app. "We created this project for users' entertainment months ago," he wrote in a statement attached to a tweet. "We thought we were selling a few sales every month in a controlled manner... We never thought it would become viral and we would not be able to control traffic."
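
For readers wondering what "uses generative adversarial networks" amounts to in practice, below is a minimal sketch of a pix2pix-style conditional GAN, the family of image-to-image translation models the summary describes: a generator learns to map a source photo to a paired target photo, while a discriminator learns to tell real pairs from generated ones. Everything here (layer sizes, 64x64 toy images, random tensors standing in for a dataset) is an illustrative assumption, not DeepNude's actual code.

```python
# Minimal pix2pix-style conditional GAN sketch (toy dimensions, random data).
# Illustrates the training loop only; NOT the actual DeepNude implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Tiny encoder-decoder: source image in, translated image out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),             # 64x64 -> 32x32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),           # 32x32 -> 16x16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 32x32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),    # 32x32 -> 64x64
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """PatchGAN-style critic: scores (source, candidate) pairs patch by patch."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # raw per-patch logits
        )

    def forward(self, src, candidate):
        return self.net(torch.cat([src, candidate], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

src = torch.randn(8, 3, 64, 64)  # stand-in for a batch of source photos
tgt = torch.randn(8, 3, 64, 64)  # stand-in for the paired target photos

# Discriminator step: push real pairs toward 1, generated pairs toward 0.
fake = G(src).detach()
pred_real, pred_fake = D(src, tgt), D(src, fake)
d_loss = bce(pred_real, torch.ones_like(pred_real)) + \
         bce(pred_fake, torch.zeros_like(pred_fake))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: fool the critic, plus an L1 term pulling output toward the target.
fake = G(src)
pred = D(src, fake)
g_loss = bce(pred, torch.ones_like(pred)) + 100.0 * F.l1_loss(fake, tgt)
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The "trained on thousands of images" part is what makes a setup like this work: the L1 term needs many aligned training examples, which is consistent with the creator's claim that the model was only feasible for the gender with abundant imagery online.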

  • by SuperKendall ( 25149 ) on Friday June 28, 2019 @09:08AM (#58839936)

    he's killing the software after viral backlash for the way it objectifies women.

    Technically isn't it de-objectifying women, since it removes objects from them?

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      I can see how the app is offensive to some women, but the article is clearly by an extremist: "DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies"

      Really? Can't be used for anything else? I'm not even a fan of deepfakes because really, any abuse of it is just as bad as the nudes being complained about here, for example a video of a public figure saying the opposite of what they believe is probably just as offensive to th

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Of course the person writing the article is a complete fucking idiot. And what's wrong with objectifying anyway? We do it all the time. We also fantasise all the time. There's nothing wrong with it. The only problem here is that of easily being able to make deepfakes, but I think that horse bolted a long time ago when Photoshop was released.

      • Re:Well actually... (Score:5, Interesting)

        by gbjbaanb ( 229885 ) on Friday June 28, 2019 @10:10AM (#58840230)

        I can think of uses for this (beyond the obvious "entertainment" aspect):

        if you can put nude overlays on women, then you can overlay clothes on their images, and if you can do that, you have a multi-billion-dollar app you can sell to every clothes retailer on the internet.

        • In order to do that with this software, you'd need a training corpus of umpty-jillion people wearing the clothing you wanted to put on them. You could get people into generic jeans and tee shirts easily enough, but specific garments would be right out.

          • We're probably not far from the point where you can train a network to take a picture of a person and a separate picture of a garment, and then produce a picture of that person wearing that garment.

            Or take a picture of a pattern, and have it produce a shirt with that pattern on it.
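
A minimal sketch of what that two-input network could look like: one encoder for the person photo, one for the garment photo, and a decoder that renders the composite. The architecture and dimensions below are hypothetical toys; real virtual try-on systems are far more elaborate (pose estimation, garment warping, and so on).

```python
# Hypothetical two-input "virtual try-on" sketch: person + garment -> composite.
# Toy dimensions and random tensors; not any shipping product's architecture.
import torch
import torch.nn as nn

def make_encoder():
    """Small conv encoder: 3x64x64 image -> 64x16x16 feature map."""
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
    )

class TryOnNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.person_enc = make_encoder()
        self.garment_enc = make_encoder()
        self.decoder = nn.Sequential(  # fused 128-channel features -> image
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, person, garment):
        feats = torch.cat([self.person_enc(person), self.garment_enc(garment)], dim=1)
        return self.decoder(feats)

net = TryOnNet()
person = torch.randn(1, 3, 64, 64)   # stand-in for a shopper's photo
garment = torch.randn(1, 3, 64, 64)  # stand-in for a product photo
out = net(person, garment)           # (1, 3, 64, 64) rendered "wearing it" image
```

Training such a model is the hard part, which is exactly the dataset objection raised upthread.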

          • by Anonymous Coward

            Oh, I disagree. I'd think you could use a handful of modeling algorithms to generate an approximate shape and its deviation from a standard shape, and then have clothing parametrically coded; then it's just a matter of moving the clothing over to the model. The key is training the machine to generate a usable body-shape model, not to merely overlay an image of some kind of clothing, and then rendering clothing based on the generated model parameters.

            Thus you only need a couple thousand to ten thousand people naked to gener

          • by rastos1 ( 601318 )
            Welcome to the future [youtu.be]. There are hundreds of avatars which are basically 3D models of the human body, for various heights, weights, ages, proportions, races, ... and software that takes 2D pieces of fabric, puts them on the avatars and "sews" them, creating a 3D look of the garment in almost real time, taking into account not only the color and structure of the fabric but also physical features such as weight, stiffness, etc. There are even body scanners [youtube.com] where a person can step in and get measured to create a perso
            • Combine this with a 3D printer capable of cutting and sewing cloth, and there goes the fashion industry.

              YOU WOULDN'T 3D PRINT A CAR, WOULD YOU?
              Fuckin right I would.
      • for example a video of a public figure saying the opposite of what they believe is probably just as offensive to them

        You're probably right. But in the case of politicians that have been in office for more than 10 years fakes are not needed. You are likely to find actual video of them saying the exact opposite of what they claim to believe currently.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      DeepNude only works on images of women, Alberto said, because it's easy to find thousands of images of nude women online in porn.

      Sure sounds like a lot of women are already objectifying themselves.

    • 198x: ultraprudish blue-haired church ladies, who believe they possess the one and only Truth, try to ban anything and everything sexual.

      201x: ultraprudish blue-haired feminists, who believe they possess the one and only Truth, try to ban anything and everything sexual.

      • by Anonymous Coward

        Just because you don't understand the difference between consensual and non-consensual sexual activity, you don't get to label those who do as "ultraprudish". These are the same people who are more sexually liberated and more open about their sexuality than any generation in the history of civilization. Holy shit, we have kink parties and group sex and talk about it online, how fucking prudish! Just quit being a fucking creep and you too can get laid!

  • by Anonymous Coward

    someone knocked the dicks off all the statues a couple thousand years ago too; society hasn't really progressed lel

    • Re:TITTIES (Score:5, Interesting)

      by gbjbaanb ( 229885 ) on Friday June 28, 2019 @10:12AM (#58840244)

      There is graffiti in the burial chambers of ancient Egypt showing workers giving it to the queen doggy style.

      We've not changed in 6,000 years, which suggests we never will.

  • His mom (Score:3, Funny)

    by Anonymous Coward on Friday June 28, 2019 @09:11AM (#58839954)

    Probably because someone sent him a picture of his mom... after his software worked on it...

  • Vulva? (Score:5, Funny)

    by 110010001000 ( 697113 ) on Friday June 28, 2019 @09:12AM (#58839962) Homepage Journal

    I read slashdot. What is a "vulva"?

  • Huh... (Score:2, Insightful)

    by Anonymous Coward

    "DeepNude only works on images of women, Alberto said, because it's easy to find thousands of images of nude women online in porn."

    All things aside, I'd say DeepNude is a reflection of the Internet. Kinda like politicians are a reflection of the citizens who vote for them. The genie is out of the bottle now, though. This thing will be reverse-engineered and new programs will pop up. Some better, some worse.

    • by vlad30 ( 44644 )
      I wonder if this was better than that one a few years back that took every available image of a celeb and stitched the available skin shots together until they had a near-nude or full nude (if pantyless or nip-slip shots were available) of said celeb, or whether it simply matches the best available nude to the body type pictured. With the previous software, the developer complained of the same problem: far fewer male celebs are photographed with more "skin" showing.
      • by ceoyoyo ( 59147 )

        It just makes things up that look reasonably real. You could do the same thing with Photoshop, or a paintbrush.

        • Except it doesn't work on men, because no one has written the code into Photoshop or a paintbrush. The coding is too hard. Kinda like blockchain and AI and deep learning. That shit only works on women.

          • by ceoyoyo ( 59147 )

            To be fair, in my experience biological neural networks are much better at drawing dicks and balls, so the non-biological ones drawing vulva is maybe just karma.

            Everyone can draw boobs though.

    • You don't need a similar program for men anyways.

      Most of them will just send you naked pictures, often times completely unsolicited.
  • by Anonymous Coward

    yes, there are hundreds of thousands of pictures of naked women out there, but if the process has an error percentage and people create millions of fakes and post them, wouldn't it become a self-fueling cycle until the results become something not a woman?

    • Re: (Score:2, Informative)

      by drinkypoo ( 153816 )

      if the process has an error percentage and people create millions of fakes and post them, wouldn't it become a self-fueling cycle until the results become something not a woman?

      Technically, it's already producing something which is not a woman, at least not a real one. It's producing plausible results, but it's not actually undressing anyone, and the resulting image is not of any actual woman who ever lived.

      • Re: (Score:2, Insightful)

        Re-read your comment and see (you can grimace and stick your tongue out and grunt and stuff) if there are, say, any gender-specific issues with it.

        • Re-read your comment and see (you can grimace and stick your tongue out and grunt and stuff) if there are, say, any gender-specific issues with it.

          There are not. If they had done the same thing with men, which they claim they didn't do only because of the lack of a training corpus, you could swap the genders around and it would work perfectly well. Thanks for your interest, though.

        • by lgw ( 121541 )

          Re-read your comment and see (you can grimace and stick your tongue out and grunt and stuff) if there are, say, any gender-specific issues with it.

          He used no gendered words. Words have gender. People have sex.

    • With enough fakes maybe, but I suspect you're drastically underestimating the number of nudes on the internet.

  • by JoeyRox ( 2711699 ) on Friday June 28, 2019 @09:22AM (#58840006)
    We need a DeepProgrammer app, that uses a machine learning algorithm to "unmask" programmers.
  • by ugen ( 93902 ) on Friday June 28, 2019 @09:26AM (#58840018)

    "DeepNude only works on images of women, Alberto said, because it's easy to find thousands of images of nude women online in porn."
    Right, because finding images of male porn online is exceedingly rare and difficult. So that's why he did what he did. Gotcha.

    • by Anonymous Coward

      Not that it's hard to find male nudes online (no pun intended), but it is true that there are way more shots of whole-body naked women than men, even in porn; lots of "PoV" videos just show an almost bodyless penis.

    • Well, at least it only works on women who appear on porn sites. The jerk is not clever enough to port it over to men, porn stars or not, or to women who are not porn stars.

      That way, if a nude photo of our wife shows up, we'll all go: "Fuck you! She never appeared in a porn movie."

      Kinda makes the app useless.

    • Compared to finding pictures of women, yeah. Here's a simple test for you: type "nudes" into Google, turn off safe search, then click Images. Let us know how many pages you scrolled through before you found a full-body male nude, or before you gave up.

      Training these algorithms requires quite large datasets, and filling one with males would, in the most literal sense, be *several orders of magnitude* more difficult.

      • by mentil ( 1748130 )

        Just for the heck of it, I tried this on Bing (great for finding porn images, or so I hear *cough*). In the first 200 images or so, I found one image of a man, most of the way down. Then I put in 'nude men' and, lo and behold, I had to scroll almost as far down to see the first pic of a woman. Oh wait, it's a shemale.

      • About half a page. Google relies on your previous searches. As a gay male, it knows I am less likely to want to see naked females than hot, sweaty, muscled, naked men.
  • by Anonymous Coward

    ...but it does not work well. It puts tits where armpits should be and creates something that does not pass for human. If you remember the three-titted woman and Quaid from the original "Total Recall", kinda like that but a lot worse.

    • If you remember the three-titted woman and Quaid from the original "Total Recall", kinda like that but a lot worse.

      *sigh* ... *unzips*...

      • If you remember the three-titted woman and Quaid from the original "Total Recall", kinda like that but a lot worse.

        *sigh* ... *unzips*...

        I'll bet you wish you had three hands.

    • Where are the examples of the images it's produced?

    • ...but it does not work well. It puts tits where armpits should be

      That sounds like an improvement to me.

  • by aepervius ( 535155 ) on Friday June 28, 2019 @09:34AM (#58840046)
    It simply pastes some sample body over the image, with controlled resizing/reshaping so that it matches the original woman's proportions. While creepy, it is just a more complicated way of copy/pasting the head of a woman onto a porn model's body; it isn't undressing. In fact, if I read the article correctly, you can use the photo of a man and it will paste on a vulva, not "undress" the man. It is an expensive piece of junk which is over-hyped as an attack on women, when in fact it is probably more offensive to take the photo of a man and make him appear to have breasts and a vulva.
    • Re: (Score:1, Flamebait)

      by AlanObject ( 3603453 )

      when in fact it is probably more offensive to take the photo of a man and make him appear to have breasts and a vulva.

      When a woman has her recognizable face published in media with even a marginally convincing synthesis of a nude body, she has to consider the potential threat of unwanted sexual aggression from any party, known or unknown. There is no end to the possible number of predators that will equate her exposure with "availability" or "asking for it."

      Men do not have this problem.

      Both genders may suffer bad effects more or less equally, such as damage to one's reputation, but no -- doing this to men

      • by sinij ( 911942 ) on Friday June 28, 2019 @10:16AM (#58840276)

        When a woman has her recognizable face published on media with even a marginally convincing synthesis of a nude body she has to consider the potential threat caused by it from unwanted sexual aggression from any party, known or unknown. There is no end to the possible number of predators that will equate her exposure as "availability" or "asking for it."

        This is not a realistic threat scenario, for the following reasons: a) the number of predators who would act is very small, and most of them are already incarcerated; b) you are assuming that said predators are more likely to act as a consequence of an arbitrary action by a third party, an unproven assertion; c) the logical conclusion of your argument is that women should wear burkas, obviously not a desirable or practical solution.

        • ... most of them are already incarcerated ...

          By "most," you mean more than 50%.

          The US Department of Justice isn't aware of that many yet.

          Approximately 30% of sexual assault cases are reported to authorities. [slashdot.org]

          Your post has only a small portion that is correct, and it applies to you:

          ... an unproven assertion ...

          • by mentil ( 1748130 )

            Several sexual assaults can be committed by the same person, so it's quite possible that the 30% of assaults that get reported were perpetrated by more than 50% of assailants... which would mean that LEO knows of more than half of them.
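
A back-of-envelope simulation of that point: when offenses are concentrated among repeat offenders, the fraction of incidents reported and the fraction of offenders exposed by those reports can diverge a lot. All numbers below are invented purely to illustrate the arithmetic.

```python
# Toy model: skewed offenses-per-offender, 30% of incidents reported.
# Shows reported-incident share != known-offender share. Made-up numbers.
import random

random.seed(0)

# 1,000 offenders: most offend once, a minority offend 5-15 times.
offenses = [1 if random.random() < 0.7 else random.randint(5, 15)
            for _ in range(1000)]

# Flatten into individual incidents tagged with their offender.
incidents = [(who, k) for who, n in enumerate(offenses) for k in range(n)]

# Report a random 30% of incidents to authorities.
reported = random.sample(incidents, int(0.30 * len(incidents)))
known = {who for who, _ in reported}  # offenders with >= 1 reported incident

print(f"{len(incidents)} incidents, {len(reported)} (30%) reported")
print(f"offenders tied to a report: {100 * len(known) / len(offenses):.0f}%")
```

With these made-up parameters, roughly half the offenders end up tied to at least one report even though 70% of incidents were never reported; the more concentrated the offending, the larger that gap gets.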

        • a) possible number of predators that would act is very small and most of them are already incarcerated

          [Citation Required]

          As a counter-example, there are a whole lot of rapes already going on for a "very small", mostly-in-jail number of predators to be perpetrating.

          b) you are assuming that said predators are more likely to act as a consequence of arbitrary action by a third-party, an unproven assertion

          There's a whole lot of people that change their behavior once they've decided a woman is a "slut". Nude pseudo-photos can do that.

          You're also demanding levels of proof you are unwilling to provide yourself in your own assertions.

          c) the logical conclusion of your argument is that women should wear burkas

          Why? This software would remove the burka.

          • by sinij ( 911942 )

            a) possible number of predators that would act is very small and most of them are already incarcerated

            [Citation Required]

            Here is a citation for you: BJS, Criminal Victimization [bjs.gov]. For serious violent crime (rape), the UCR rate in 2017 was 0.4 per 1,000 residents. For comparison, the 2017 US motor vehicle death rate was 0.114 per 1,000, i.e. roughly 3.5 times lower.

            b) you are assuming that said predators are more likely to act as a consequence of arbitrary action by a third-party, an unproven assertion

            There's a whole lot of people that change their behavior once they've decided a woman is a "slut". Nude pseudo-photos can do that.

            [Citation Required]

            • Re: (Score:3, Insightful)

              by jeff4747 ( 256583 )

              Here is citation for you

              Your citation doesn't back up your assertion. First, your assertion is that there is only a very small number who would act because of the fake pictures but would not otherwise act. Citing convictions does not address that at all.

              Second, your citation does not include any information about what quantity of those people are actually incarcerated, just that some people are incarcerated for rape.

              Third, you don't address at all the difference between accusations of rape and convictions for rape, which run [washingtonpost.com]

              • by sinij ( 911942 )

                You have yet to offer any citation of your own. So before we move on to further discuss my citation, I have to assume that you don't have anything to show to support: "There's a whole lot of people that change their behavior once they ... [see] nude pseudo-photos."

                Aside: do you think "incel" is widespread? I really don't know, but I am assuming it is a form of mental illness about as common as manic depression, as that is likely what these people are suffering from.

        • a) possible number of predators that would act is very small and most of them are already incarcerated

          To some people, all males are predators. It is just that some of them have not been caught yet.

          Just having a sexual impulse is gross. Babies are supposed to be created through love, not through gross sex...

          Yeah. Mental.

      • Maybe a risk in India or someplace, but women in the West take clothes off to go out in public, and support government bans on defensive weapons, like they don't have a care in the world
      • You should reinforce your argument with a link.

        I have done that for you [youtube.com].

        Full tape with lewd Donald Trump remarks (Access Hollywood)

        yw

    • *sigh* Fine, but please, don't take the fun out of x-ray specs. At least let us have that!
    • It's a bit more involved; it's using a GAN.
    • to nude photos of women. This isn't a matter of objectifying them. School teachers lose jobs all the time because it comes out that they did soft-core porn in college. The people getting those women fired aren't feminists, either (the feminists fall into two camps there: supportive of the woman's decision to pose nude, and sympathetic that she felt the need to do porn). The folks firing those women are usually the religious right / morality police.

      I could see this being a problem where an angry ex or coworker
      • by lgw ( 121541 )

        I have hopes that the newest generation (Digital Natives?) will react to "there are nude photos of X" with "so what, everyone has nudes out there somewhere". This recent neo-puritanism doesn't seem to be sticking to them at all.

        • But there will be a new form of puritanism as the old one dies out. The current wave of social justice moral crusaders would have been yesteryear's nosy church-goers, but as religiousness has waned, those people have had to find other outlets where they can be sanctimonious pricks. The next generation will just find a new club or movement from which they can annoy the rest of us.
    • by AmiMoJo ( 196126 )

      It basically automates photoshopping. It's like a lot of things - possible for a long time if you had the skills and patience, but suddenly available to everyone with a click or two. Like drones or 3D printed weapons.

      This sort of thing will keep happening and I'm not sure there is much we can do to stop it.

  • Alien (Score:5, Funny)

    by Areyoukiddingme ( 1289470 ) on Friday June 28, 2019 @10:39AM (#58840404)

    "We never thought it would become viral and we would not be able to control traffic."

    I take it this guy has never met any actual humans...

  • Nude Ginger or Nude Mary Ann?

  • The complaints are (i.e., from Vice): "The $50 DeepNude app dispenses with the idea that deepfakes were about anything besides claiming ownership over women's bodies." ... If I take a pic of your head and photoshop on some porn actress's body, you can't really say I'm claiming "ownership" over your body. It's not your body. You can complain that I'm misusing your FACE, sure.

  • Weaponized Autism was already working on a crack yesterday.

    Probably on the bay already...

  • Really? What guy in the world really needs that? What HEALTHY HUMAN in the world really needs that? I've been doing it with my own eyes for 40 years.

  • It's only a matter of time before a functional version of the legendary 3D glasses becomes available. I, for one, welcome our new AI overlords.
