Software Technology

Algorithm Automatically Spots 'Face Swaps' In Videos (technologyreview.com)

yagoda shares a report from MIT Technology Review: Andreas Rossler at the Technical University of Munich in Germany and colleagues have developed a deep-learning system that can automatically spot face-swap videos. The new technique could help identify forged videos as they are posted to the web. But the work also has a sting in the tail. The same deep-learning technique that can spot face-swap videos can also be used to improve the quality of face swaps in the first place -- and that could make them harder to detect. The new technique relies on a deep-learning algorithm that Rossler and co have trained to spot face swaps. These algorithms can only learn from huge annotated data sets of good examples, which simply have not existed until now. In semi-related news, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) says it's "fighting back" against the dangers posed by new face-swapping technologies that have been used to digitally superimpose the faces of its members onto the bodies of porn stars.

"SAG-AFTRA has undertaken an exhaustive review of our collective bargaining options and legislative options to combat any and all uses of digital re-creations, not limited to deepfakes, that defame our members and inhibit their ability to protect their images, voices and performances from misappropriation. We are talking with our members' representatives, union allies, and with state and federal legislators about this issue right now and have legislation pending in New York and Louisiana that would address this directly in certain circumstances. We also are analyzing state laws in other jurisdictions, including California, to make sure protections are in place. To the degree that there are not sufficient protections in place, we will work to fix that..."
  • So, can the algorithm create a face swap that even it couldn't detect? I think this may be something of a Zen koan.

    • Re:Superman vs God (Score:5, Insightful)

      by goombah99 ( 560566 ) on Monday April 23, 2018 @10:15PM (#56492287)

      Yes! that's what the whole principle of adversarial learning is based on.

      • Yes! that's what the whole principle of adversarial learning is based on.

        But then wouldn't the face swap detection algorithm also be getting better via adversarial learning at spotting face swaps?

My question is, who wins: the face-swap detecting algorithm, or the face-swap creating algorithm trying to create the perfect face swap?

        • Yes! that's what the whole principle of adversarial learning is based on.

          But then wouldn't the face swap detection algorithm also be getting better via adversarial learning at spotting face swaps?

My question is, who wins: the face-swap detecting algorithm, or the face-swap creating algorithm trying to create the perfect face swap?

          Either the egg or the chicken wins. It's called the optional stopping problem in statistics.
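The "who wins" question the thread is circling is exactly the GAN training loop: a detector (discriminator) and a forger (generator) take alternating gradient steps against each other until the detector can no longer tell real from fake. A one-dimensional toy version (not anything from the article; all numbers here are invented for illustration) looks like this, with "real" frames drawn from N(0,1) and the forger learning the mean of its own distribution:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Detector: P(real) = sigmoid(w*x + b). Forger: samples from N(mu, 1).
w, b = 0.0, 0.0
mu = 4.0                  # forger starts far from the real distribution N(0,1)
lr_d, lr_g = 0.05, 0.05
n = 128

for step in range(2000):
    real = rng.normal(0.0, 1.0, n)
    fake = rng.normal(mu, 1.0, n)

    # Detector step: logistic regression, label real=1, fake=0.
    x = np.concatenate([real, fake])
    y = np.concatenate([np.ones(n), np.zeros(n)])
    p = sigmoid(w * x + b)
    w -= lr_d * np.mean((p - y) * x)
    b -= lr_d * np.mean(p - y)

    # Forger step: non-saturating GAN objective, maximise log D(fake),
    # i.e. nudge mu so the detector scores fakes as more "real".
    d_fake = sigmoid(w * fake + b)
    mu += lr_g * np.mean((1.0 - d_fake) * w)

# At equilibrium the forger's mean has drifted toward the real mean (0)
# and the detector is reduced to guessing -- the "egg or chicken" outcome.
final_gap = abs(mu)
```

So in the idealised game neither side "wins" outright: the equilibrium is a forger whose output matches the real distribution and a detector stuck at chance, which is why a better detector can indirectly make fakes harder to detect.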

That doesn't matter: if it reports greater than 50% confidence, they send out a demand letter for money, or knock you offline anyway and see if you try to fight back. Think back to how often DMCA notices were abused once they were handed over to automated platforms. That's all this is; the Screen Actors' Guild is not here to protect society (quite the opposite, some might argue). They don't care about you, or about spotting fake news or whatever; they're just trying to make sure the actors

  • "fighting back" against the dangers posed by new face-swapping technologies that have been used to digitally superimpose the faces of its members onto the bodies of minimum scale movie extras.

    FTFY.

  • dangers? be happy! (Score:5, Interesting)

    by ooloorie ( 4394035 ) on Monday April 23, 2018 @08:52PM (#56492067)

    "fighting back" against the dangers posed by new face-swapping technologies that have been used to digitally superimpose the faces of its members onto the bodies of porn stars

    Where is the "danger" in that? Porn stars usually have great bodies, so you'll end up looking great. And you don't have to worry about leaked sex tapes anymore because you always have plausible deniability.

    I think the real reason the screen actor's guild is so up in arms about this is because it makes it much easier for movie producers to mix and match acting ability with looks: they can go for an unknown actor and paste exactly the kind of face on him they want. And licensing your face to be pasted on an unknown actor isn't as lucrative as acting yourself.

    • Comment removed based on user account deletion
    • by AmiMoJo ( 196126 )

The Screen Actors' Guild cares about this for the same reason they care about impressionists and rubber masks: they want to protect their members' likenesses. A big star is an attraction by themselves, even if they are only voicing a character in an animated movie. Studios certainly don't pay the stars 1000x the going rate because they are 1000x better actors.

In the case of deepfakes, an actress showing some skin is a big draw for a movie. Of course, there is also the fact that involuntary pornography i

  • by Fly Swatter ( 30498 ) on Monday April 23, 2018 @09:10PM (#56492095) Homepage
    They correctly used the term algorithm instead of AI.
    • Even "algorithm" is incorrect in this case. The current state of the art depends a lot on data quality, model architecture, and hyperparameters. As an algorithmist, I shudder to call such things algorithm.

  • by Anonymous Coward

Actually, I am waiting for the first defense attorney to get an expert to say the criminal caught on camera wasn't the accused. It was a face swap.

  • by Anonymous Coward

    Until Emma Watson's photo got superimposed onto Kiera Knightly's body with a photoshopped cock in its mouth.

  • by Anonymous Coward

    I hear the porn stars are pissed that their bodies are "swapped" onto ordinary actors' faces.

  • If it can spot face swaps and flag them, it can be used to train the face swap algorithms to make them better.

    • by AmiMoJo ( 196126 )

      The main limiting factor seems to be the original video onto which the new face is being mapped. Odd angles, things obscuring parts of the face, poor lighting and the like all make it struggle and look less realistic. Of course, it also helps if the actor the face is being applied to looks somewhat like the target.

      As such, I expect there will soon be a market for porn stars who look a bit like a famous actress and who produce videos specifically designed to make the algorithm work better. If they could get

  • by Anonymous Coward

    Is anyone else imagining this technology being used the other way to put their favourite porn star faces on non-porn movies?

  • Am I the only one seeing a Butlerian Jihad on the way?

  • by 93 Escort Wagon ( 326346 ) on Tuesday April 24, 2018 @12:12AM (#56492563)

    Some grad student figured out a way to get a PhD for watching pornography all the time!

  • by Anonymous Coward

    Why are they trying to break one of the laws of the internet? Arrest those fucks.

  • by trg83 ( 555416 ) on Tuesday April 24, 2018 @08:51AM (#56493871)
It's clear that the SAG is on this because there is a massive amount of money at stake. The real story here is that, should the technology evolve to the point where no AI or person can detect it, there is a real hazard to liberty. Political enemies could be placed (virtually) in embarrassing or illegal positions, surveillance footage of crimes could be faked, and criminals could go free by claiming their surveillance footage was faked. This really shakes the core of things we've accepted as "truth." We've known for a long time that photos could be faked, with varying degrees of believability, but the idea of producing videos that can't be detected as fakes is crazy.
    • by epine ( 68316 )

      The real story here is that, should we evolve the technology to the point where no AI or person can detect it, there is a real hazard to liberty.

I've seen thousands of liberty narratives, in all manner of formal dress and dishabille.

Have I ever seen a narrative about liberty which frames liberty as something we're lucky to have only because we were gifted a milieu of sufficient objective agreement (and this only through the magic social pixie-dust of unfakeable images)? No, I have not.

      Libertarian defendant: Bu
