Pornhub Hasn't Been Actively Enforcing Its Deepfake Ban (engadget.com) 97

Pornhub said in February that it was banning AI-generated deepfake videos, but BuzzFeed News found that it's not doing a very good job of enforcing that policy. The media company found that more than 70 deepfake videos -- depicting graphic fake sex scenes with Emma Watson, Scarlett Johansson, and other celebrities -- were easily searchable from the site's homepage using the search term "deepfake." From the report: Shortly after the ban in February, Mashable reported that there were dozens of deepfake videos still on the site. Pornhub removed those videos after the report, but a few months later, BuzzFeed News easily found more than 70 deepfake videos using the search term "deepfake" on the site's homepage. Nearly all the videos -- which included graphic and fake depictions of celebrities like Katy Perry, Scarlett Johansson, Daisy Ridley, and Jennifer Lawrence -- had the word "deepfake" prominently mentioned in the title of the video, and many of the names of the videos' uploaders contained the word "deepfake." Similarly, a search for "fake deep" returned over 30 of the nonconsensual celebrity videos. Most of the videos surfaced by BuzzFeed News had view counts in the hundreds of thousands -- one video featuring the face of actor Emma Watson garnered over 1 million views. Some accounts posting deepfake videos appeared to have been active for as long as two months and have racked up over 3 million video views. "Content that is flagged on Pornhub that directly violates our Terms of Service is removed as soon as we are made aware of it; this includes non-consensual content," Pornhub said in a statement. "To further ensure the safety of all our fans, we officially took a hard stance against revenge porn, which we believe is a form of sexual assault, and introduced a submission form for the easy removal of non-consensual content." The company also provided a link where users can report any "material that is distributed without the consent of the individuals involved."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward on Sunday April 22, 2018 @04:14PM (#56485237)

    (rushes to PornHub)

    • by Anonymous Coward

      Congratulations, you are now gay.

  • by Anonymous Coward

    Something involving hot grits, maybe?

  • I searched Pornhub for deepfake. "We're sorry, but the requested search cannot be found. Broaden your search." Slashdotted? I only want to see a technical demo. Any links?
  • by Anonymous Coward on Sunday April 22, 2018 @04:28PM (#56485323)

    Over at buzzfeed it must be a slow news day, then a fast news day, then a slow news day, then a messy news day...

    All puns completely intentional.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      Since when do we care about BuzzFeed? They're not reputable; that site is a joke. IMHO it's like a milder "The Onion."

    • by mjwx ( 966435 )
      At a Buzzfeed office.

      Boss: HARRISON... I bloody well told you to stop watching Porn at the office.
      /furious clicking
      Serf: Erm... I wasn't watching porn.
      Boss: Then what the Sam hell were you doing?
      Serf: Uhhh.... Researching... Yes, researching an article (mumbles) that's the ticket.
      Boss: What about?
      Serf: Errr... ah! you know that fake deep stuff, the celebrity heads on pornstars bodies?
      Boss: Go on.
      Serf: Well apparently they're still doing it.
      Boss: Hmmm.... I smell a Buzzfeed worthy story here, mak
  • by Anonymous Coward

    https://www.pornhub.com/video/search?search=deepfakeporn

  • by Anonymous Coward

    I'm really of two minds on this subject.

    On one hand, I can certainly understand how, as a public figure - or heck, even as a complete unknown - you wouldn't want to see these types of fake videos of yourself being made and published for all to see.

    On the other - let's not forget The Fappening. If these so-called deepfake videos become indistinguishable from the real thing, then you have plausible deniability when your actual videos *do* leak. If anything, the more there are, the more likely they'll ge

    • you have plausible deniability when your actual videos *do* leak.

      I am not sure denying it is the best strategy. Paris Hilton was a near-nobody before her "leaked" sex video thrust her into a life of reality TV stardom. She inherited a few million, but has earned more than $100M on her own with TV shows, fragrances, fashion lines, etc.

      Vanessa Hudgens was a fading child star when her "leaked" nude photos led to some mature movie and TV roles.

  • then it sounds like they're doing a damn good job considering they had over 4 million videos uploaded just last year.

    • by AmiMoJo ( 196126 )

      Their business model is 90% copyright violation. Policing their site too well would just increase their liability for the infringement, or get 90% of the content removed.

      Their policy is very much to ignore everything until it is reported, and there is little incentive for users to report this stuff.

  • by ffkom ( 3519199 ) on Sunday April 22, 2018 @04:55PM (#56485409)
    Why would I want the face of some actor in some porn - I mean - who looks at faces in pornography, anyway?

    I would much prefer it if people spent all that CPU time on replacing ugly-looking bodies with ones that are beautifully shaped and come without weird piercings and tattoos...
    • There's an entire category of pornography dedicated to just faces of women experiencing pleasure.

      Or so a friend told me.

  • This is probably upsetting several dozen people in the world.

  • Awful behaviour, despicable, I'm not going to bookmark that site at all, not now. How do we bookmark sites again?

  • Why bother? (Score:4, Insightful)

    by Holammer ( 1217422 ) on Sunday April 22, 2018 @06:05PM (#56485697)

    In the foreseeable future people will have the CPU crunch and software to render deepfakes in real time. ... on a phone!

  • well, it resembled the OP title enough
  • by MobyDisk ( 75490 ) on Sunday April 22, 2018 @08:12PM (#56486165) Homepage

    So I actually want to try out deep fakes - I can think of a billion hilarious uses for this. Unfortunately, every search result for it is centered on pron. There was even a reddit where people were posting instructions and helpful information - but I think between 99% and 101% of it was pron, and reddit shut it down. Anyone have any legit links to information on how to get started on it that DON'T involve pron?

  • Some links would be helpful. Just for research purposes, you see. I think it would be useful to verify these claims, um, first hand.
  • So essentially making a cartoon of a public figure is akin to rape now?
    • Everything is rape now. The word rape has lost all of its meaning.

      • You used the word rape in two successive sentences. Do you have any idea how many people now feel raped by that? Oh shit! Did I just....
    • I think the problem is that the images can be presented as real. You may not care if someone draws an insulting cartoon of you, but you might care if they make a video that appears to show you doing something horribly embarrassing (like buying a Windows phone if you are an Apple fanboy) and then distribute it in a way that will make people believe it actually happened.

      If it's sexual, then for many people that is worse. Would you really be comfortable with your friends seeing a video that looks like you are engaging

  • People want to see it, Pornhub wants to show what people want to see, what's their motivation to ban it without being forced?

    • Exactly, and to be perfectly fair, why bother trying to force this away at all? It's not going away, and if anything it's only going to get worse (or "better," depending on your perspective). This isn't something people can just wish into non-existence, and there's no realistic way to undo it. It was inevitable that we'd all get machines powerful enough to do this, and that someone would work out how to do it and then make it easy enough for anyone to download a program and get started. Everything in the
      • Not to mention that from now on, every celebrity where a sex tape gets leaked can credibly claim that "That isn't me!"

  • the article invalidates itself in the first two paragraphs:

    "the site, which averages over 100 billion video views a year."

    "While banned material frequently slips through the cracks on large sites that allow users to upload content"

    "BuzzFeed News easily found more than 70 deepfake videos using the search term "deepfake" on the site's homepage"

    so how many videos are on Pornhub? 2,017,953. Go to Pornhub, then go to the porn videos page and show the newest videos (this should bring up the maximum number of

  • I see a real risk in this technology: imagine the effect of doing a good fake of a political candidate sexually abusing a child. There might be lots of denials, but it would be difficult to completely reject. With our current level of concern (rational or not) about abuse of children, it would gather a lot of attention, and even risk prison for the victim.

    The alternative of making fake child porn (e.g., no real children involved) is illegal, but I don't think it represents as much of a risk

  • fap fap fap .. fap fap .. fap .. uuurgh ref [quickmeme.com] !
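
A couple of comments above ask how this "deepfake" face swapping actually works and how one might try it without wading through porn. For what it's worth, the core trick in the original approach is a single encoder shared between two identities, trained alongside one decoder per identity; the "swap" is just decoding person A's latent code with person B's decoder. The PyTorch sketch below is only a toy illustration of that idea -- the layer sizes, the tiny training loop, and the random stand-in tensors are all placeholders for this writeup, not any real tool's code or data.

```python
# Minimal sketch of the shared-encoder / per-identity-decoder autoencoder idea
# behind deepfake-style face swapping. All shapes, hyperparameters, and the
# random stand-in "face" tensors are placeholders for illustration only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compress a 64x64 RGB face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstruct a face crop from the latent vector (one decoder per identity)."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 8, 8)
        return self.net(h)

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Random tensors stand in for batches of aligned face crops of person A and person B.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(3):  # a real run needs many thousands of steps and real data
    recon_a = decoder_a(encoder(faces_a))   # decoder A learns to rebuild A's faces
    recon_b = decoder_b(encoder(faces_b))   # decoder B learns to rebuild B's faces
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person A's face, then decode it with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))   # B's appearance with A's pose/expression
```

Published tools wrap this core in face detection and alignment, far longer training on thousands of real face crops, and blending of the decoded face back into the original frame.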
