AI Technology

A Horrifying New AI App Swaps Women Into Porn Videos With a Click (technologyreview.com) 258

Karen Hao, reporting for MIT Technology Review: The website is eye-catching for its simplicity. Against a white backdrop, a giant blue button invites visitors to upload a picture of a face. Below the button, four AI-generated faces allow you to test the service. Above it, the tag line boldly proclaims the purpose: turn anyone into a porn star by using deepfake technology to swap the person's face into an adult video. All it requires is the picture and the push of a button. MIT Technology Review has chosen not to name the service, which we will call Y, or use any direct quotes and screenshots of its contents, to avoid driving traffic to the site. It was discovered and brought to our attention by deepfake researcher Henry Ajder, who has been tracking the evolution and rise of synthetic media online.

For now, Y exists in relative obscurity, with a small user base actively giving the creator development feedback in online forums. But researchers have feared that an app like this would emerge, breaching an ethical line no other service has crossed before. From the beginning, deepfakes, or AI-generated synthetic media, have primarily been used to create pornographic representations of women, who often find this psychologically devastating. The original Reddit creator who popularized the technology face-swapped female celebrities' faces into porn videos. To this day, the research company Sensity AI estimates, between 90% and 95% of all online deepfake videos are nonconsensual porn, and around 90% of those feature women.

  • Well..... (Score:3, Insightful)

    by Agent Fletcher ( 624427 ) on Monday September 13, 2021 @06:27PM (#61793665)
    What's the website?
  • by grasshoppa ( 657393 ) on Monday September 13, 2021 @07:01PM (#61793777) Homepage

    Pervy, sure. Weird, ok. Sad, absolutely.

    But horrifying? Why?

    • by Tablizer ( 95088 )

      Indeed. I've done it in Photoshop/Gimp on static images and I am sure many others have*. So why is video any different? It's for my personal amusement only.

      Yes, we dudes are horny pervs; "God" made us that way, and we'll stay that way in private. We'll be politically correct in public, but our bedroom toys are ours alone.

      * In fact the first time I ever saw a color PC (Wintel) was when somebody in the dorm was editing different faces onto porn. Everyone was thinking, "gee, I gotta get a color card & moni

      • I think they meant that it's whore-a-fying... My question is whether I can use my male friends' faces as a joke instead....
      • I was 12 when Duke Nukem 3D came out and I talked my parents into buying it for me. Oh the joy of downloading a mod that made all those strippers and posters full nude. The enemies were now just distractions to kill so I could enjoy the ladies in peace.
        This was even though I had already violated the "no porn" rule the very first time I knew my parents were far enough away that I'd have time to clear the screen if they came back. I'd like to say I've changed, but nah. Although I don't like deepfakes; they're just
    • by RJFerret ( 1279530 ) on Monday September 13, 2021 @07:34PM (#61793847)

      A pissed-off student does this to a teacher. She loses her job; her kids go through financial stress as she has to change careers, away from anything to do with children/her degree/background. It's impossible to prove a negative, so you can never prove it isn't you beyond a shadow of a doubt, and many communities don't want adult actresses teaching their kids regardless of their scholarly training and smarts. It's the purity-culture thing.

      Is that pervy? Nope. Weird? Nope. Horrifying? Yup.

      Many people don't want to be involved in others' sex lives; they don't want to be the subject of someone's sexual gratification; they want to be able to choose who they share themselves with; they don't want to be devalued. Women already have a hard enough time with people abusing their pictures, their likenesses, their presence in public; adding a hidden aspect they might be ignorant of is an issue. Actually choosing to do porn, and having an answer ready if challenged, is different from never making that choice and having your agency stripped from you.

      In other cultures (some parts of the Middle East) they wouldn't just lose their jobs, the ramifications might be death.

      Like anything, a tool can be wielded for good (pleasure) or bad (harm). It can't be banned, any more than thoughts can. So how do you defend yourself?

      My answer is you don't; culture has to embrace the fact that no photo, video, or audio recording is evidence or proof of anything anymore, which renders much of a courtroom moot. You can sure bet that if there's video of me doing something, I'll bring deepfaked copies of that same video with the judge doing it, the opposing counsel doing it, etc.

      So yeah, the ramifications are horrifying, especially since so far the people victimized have been predominantly women, a marginalized group in society to begin with.

      • Seems a tad far-fetched, wouldn't you say? Do we have any evidence this might actually happen?

        • by fafalone ( 633739 ) on Tuesday September 14, 2021 @02:09AM (#61794665)
          We do have many examples of teachers being fired because their nudes, even just topless shots, unintentionally leaked, even when they were leaked by students stealing and/or cracking their phones/accounts to get them. You might be able to prove a fake is a fake, but it's hardly an unwarranted concern.
        • by cmseagle ( 1195671 ) on Tuesday September 14, 2021 @02:56AM (#61794729)
          It's a real-world scenario from the article:

          The Revenge Porn Helpline funded by the UK government recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school’s attention, says Sophie Mortimer, who manages the service.

      • by dgatwood ( 11270 )

        A pissed-off student does this to a teacher. She loses her job; her kids go through financial stress as she has to change careers, away from anything to do with children/her degree/background. It's impossible to prove a negative, so you can never prove it isn't you beyond a shadow of a doubt ...

        Depends on whether the video is already out in the wild or not. If it is, you can probably do a reverse image search of one of the frames and find similar images, then find the original video, and then you have pretty solid proof that it isn't real. The nice thing is, the same basic technologies that make deepfakes possible also make it easier to find the original video, assuming that a copy of it exists anywhere on the Internet.

        Similarly, if a couple of college kids have sex on camera and then substitute
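
        A minimal sketch of that frame-matching idea, assuming the third-party Pillow and ImageHash libraries (pip install Pillow ImageHash) and hypothetical file names; a real reverse image search works at far larger scale, but the core trick is similar:

          from PIL import Image
          import imagehash

          # Perceptual hashes survive re-encoding, rescaling, and small local
          # edits (like a swapped face), so a low Hamming distance between a
          # frame from the suspect clip and a frame from a candidate source
          # video suggests both came from the same original footage.
          suspect = imagehash.phash(Image.open("suspect_frame.png"))      # hypothetical file
          candidate = imagehash.phash(Image.open("candidate_frame.png"))  # hypothetical file

          distance = suspect - candidate  # Hamming distance between the 64-bit hashes
          print(f"Hamming distance: {distance}")
          if distance <= 10:  # threshold is a judgment call, not a standard
              print("Frames likely come from the same source footage.")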

        • by Bert64 ( 520050 )

          How do you prove which is the original and which is the fake?

          • by dgatwood ( 11270 )

            Very easily, usually. The version with billions of copies on the Internet dating back a decade is real. Historical Internet archives are your friend.

            Alternatively, if the creator of the original video is known, you can contact that creator, because odds are good that the creator still has a copy of the original employment contracts, both to prove that the performers were adults and to prove that they agreed to appear in the videos.
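
            The archive check is scriptable too: the Internet Archive's public Wayback Machine availability endpoint returns the capture of a URL closest to a given timestamp. A rough sketch; the URL queried here is hypothetical:

              import requests  # pip install requests

              # Ask the Wayback Machine for the capture closest to a date. If the
              # page hosting the video was captured years before the alleged fake
              # surfaced, that's evidence of which version came first.
              resp = requests.get(
                  "https://archive.org/wayback/available",
                  params={"url": "example.com/original-video", "timestamp": "20120101"},
                  timeout=10,
              )
              closest = resp.json().get("archived_snapshots", {}).get("closest")
              if closest:
                  print("Capture:", closest["timestamp"], closest["url"])
              else:
                  print("No archived capture found.")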

            • by Bert64 ( 520050 )

              That's assuming that the source pornography in question was professionally produced, widely distributed, and that the producer or participants are still around. None of these is a given - there is plenty of amateur porn, nonconsensual porn, etc. floating around the internet or in people's personal collections which could be used as source material for a deepfake.

          • How do you prove which is the original and which is the fake?

            No need to.
            Reasonable doubt.
            Case closed.

      • by HiThere ( 15173 )

        For now your comment about courtroom implications is probably wrong. Most deepfakes are reasonably easy for an expert to detect. How much that would help outside of a courtroom, however, is a different story. So the rest of your argument looks good. And who knows what next year will bring.
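
        One classic forensic aid for still images - only one of many signals an expert would weigh, and no silver bullet against good fakes - is error level analysis: re-save the JPEG at a known quality and look for regions that recompress differently. A minimal sketch using Pillow, with hypothetical file names:

          import io
          from PIL import Image, ImageChops  # pip install Pillow

          def error_level_analysis(path, quality=90):
              # Edited regions often respond to recompression differently from
              # untouched regions, so they stand out in the difference image.
              original = Image.open(path).convert("RGB")
              buf = io.BytesIO()
              original.save(buf, "JPEG", quality=quality)
              buf.seek(0)
              return ImageChops.difference(original, Image.open(buf).convert("RGB"))

          error_level_analysis("suspect_image.jpg").save("ela_map.png")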

      • by gweihir ( 88907 )

        A pissed-off student does this to a teacher. She loses her job

        Nope. Does not happen. Pretty much everybody should know by now this is possible.

        • And yet... from TFA:

          The Revenge Porn Helpline funded by the UK government recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school’s attention, says Sophie Mortimer, who manages the service.

      • by AmiMoJo ( 196126 )

        My answer is you don't; culture has to embrace the fact that no photo, video, or audio recording is evidence or proof of anything anymore, which renders much of a courtroom moot.

        This is a very important point, and it's been happening for years already.

        For example, a few years back there was a civil case where a car-park management company tried to get money from someone for using one of their spaces without paying. They submitted a photoshopped image of the space with the parking signs made more visible than they actually were. They were only caught out because they had earlier submitted the same image without the modifications, and the defence was able to compare the two.

        Decades before the
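
        That kind of side-by-side comparison is trivial to automate once both versions of an exhibit are in hand; a minimal sketch, again with Pillow and hypothetical file names (assumes both versions have the same dimensions):

          from PIL import Image, ImageChops  # pip install Pillow

          # Pixel-wise difference between two versions of the same exhibit;
          # any non-zero region is where one image was altered.
          a = Image.open("exhibit_v1.png").convert("RGB")
          b = Image.open("exhibit_v2.png").convert("RGB")

          diff = ImageChops.difference(a, b)
          if diff.getbbox() is None:
              print("Images are pixel-identical.")
          else:
              print("Images differ; altered region bounding box:", diff.getbbox())
              diff.save("exhibit_diff.png")  # bright pixels mark the edits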

    • by gweihir ( 88907 )

      Pervy, sure. Weird, ok. Sad, absolutely.

      But horrifying? Why?

      That is just the usual hysterics pushing their deranged fantasy of how the world works. Anybody sane pretty much expected this but does not care too much.

    • Pervy, sure. Weird, ok. Sad, absolutely.

      But horrifying? Why?

      Really.

      A Horrifying New AI App Swaps Women Into Porn Videos With a Click

      Omg pick beautiful women, you idiots!

  • Put uploaded photos on well-equipped dudes having sex with a lot of women and guys would upload their own photo!
    • by PPH ( 736903 )

      Put uploaded photos on well-equipped dudes having sex with a lot of women

      You don't think the LGBTQ community will run with this and paste their crushes on anything else?

  • I don't see the "horrifying" part - this really isn't any different from some kid decades ago cutting out a picture from one magazine and pasting it onto their copy of Playboy. If anything, widespread faking of videos like this will protect women who have been filmed without consent, because there will be no way to know if it was real. If people are just making these videos privately, who is to know or care? That said, it seems to fall under all of the restrictions of using someone's likeness for c
    • by PPH ( 736903 )

      This feels like something that is covered under existing laws.

      I suspect that this might be covered under some of the new 'revenge porn' laws. If not, it wouldn't be a stretch to amend such laws to include it.

      Paste someone's face on a nude and wank to it all you want in private. Post it in an online forum and expect trouble.

    • by AmiMoJo ( 196126 ) on Tuesday September 14, 2021 @03:47AM (#61794801) Homepage Journal

      this really isn't any different from some kid decades ago cutting out a picture from one magazine and pasting it onto their copy of Playboy.

      It's very different in two key ways:

      1. The fake is of much better quality than an obviously cut-out head pasted on, and it will continue to improve over time.

      2. One kid stashing a magazine under his bed is different to it being posted on the internet where others can view it.

  • Sure you can ban the app. How do you ban people's imagination? How do you criminalize what's in someone's thoughts? Also, how do you prevent people from writing their own code to do this, now that the idea is public?

  • by couchslug ( 175151 ) on Monday September 13, 2021 @07:42PM (#61793881)

    Asking for a friend.

  • by theshowmecanuck ( 703852 ) on Monday September 13, 2021 @08:08PM (#61793945) Journal
    Getting older and not in such great shape anymore. I don't think it would be an ethical issue if they substituted my head onto Ron Jeremy. The one above the shoulders, ffs. I wouldn't want Ron Jeremy to feel bad.
    • I wouldn't want Ron Jeremy to feel bad.

      He's in jail, I doubt he cares. Is somebody going to take enough pity on him to give him an extra serving of dessert, or not?

      • give him an extra serving of dessert

        Is that prison slang?

      • I didn't hear about this. I just read about it now. His manager's statement was pretty humorous.

        "When Rolling Stone charged Ron with being in the ME TOO movement, he showed us proof against the allegations," Rusciolelli said in the statement.

  • by Lost Penguin ( 636359 ) on Monday September 13, 2021 @10:07PM (#61794195)
    Betty White on this app.
    • by Mal-2 ( 675116 )

      If it's WW2-era Betty White (yes, she's that old) then that would probably be just fine. She was pretty damn hot in her day.

  • If deepfakes get good enough that anyone can make porn of anyone ... then people will be aware of that, and won't attribute "porn of someone I know" to that person - it will be implicitly assumed to be a fake. Yes, it is a bad thing to portray someone that way, and it still will be - but the impact in a couple of decades of seeing someone's head on someone else's body will probably be "meh" for most people. Shocking now that it is new and novel, not so when it is mainstream.
