AI Social Networks Software The Internet Technology

Deepfakes Can Now Be Made From a Single Photo (cnet.com) 64

Samsung has developed a new artificial intelligence system for creating deepfakes -- fabricated clips that make people appear to do or say things they never did -- that needs as little as one photo. CNET reports: The technology, of course, can be used for fun, like bringing a classic portrait to life. The Mona Lisa, whose enigmatic smile is animated in three different videos to demonstrate the new technology, exists solely as a single still image. A Samsung artificial intelligence lab in Russia developed the technology, which was detailed in a paper earlier this week. Here's the downside: These kinds of techniques and their rapid development also create risks of misinformation, election tampering and fraud, according to Hany Farid, a Dartmouth researcher who specializes in media forensics to root out deepfakes.

The system starts with a lengthy "meta-learning stage" in which it watches lots of videos to learn how human faces move. It then applies what it's learned to a single still or a small handful of pics to produce a reasonably realistic video clip. Unlike a true deepfake video, the results from a single or small number of images fudge when reproducing fine details. For example, a fake of Marilyn Monroe in the Samsung lab's demo video missed the icon's famous mole, according to Siwei Lyu, a computer science professor at the University at Albany in New York who specializes in media forensics and machine learning. It also means the synthesized videos tend to retain some semblance of whoever played the role of the digital puppet. That's why each of the moving Mona Lisa faces looks like a slightly different person. [...] The glitches in the fake videos made with Samsung's new approach may be clear and obvious. But they'll be cold comfort to anybody who ends up in a deepfake generated from that one smiling photo posted to Facebook.
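
To make the two-stage recipe above concrete, here is a minimal PyTorch-style sketch of the idea: a long, person-generic meta-learning pass over many videos, followed by a short few-shot fine-tune on a single photo. This is not the Samsung lab's code; the tiny networks, the landmark inputs and the plain L1 loss are illustrative stand-ins (the published system uses much larger models and additional adversarial and perceptual losses).

# Hedged sketch of "meta-learn on many videos, then adapt to one photo".
# Shapes, module names and the loss are assumptions, not the published model.
import torch
import torch.nn as nn

class Embedder(nn.Module):
    """Maps a face frame (plus its landmark image) to an identity embedding."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, dim))

    def forward(self, img, landmarks):
        return self.net(torch.cat([img, landmarks], dim=1))

class Generator(nn.Module):
    """Renders a face for target landmarks, conditioned on the embedding."""
    def __init__(self, dim=128):
        super().__init__()
        self.cond = nn.Linear(dim, 3 * 64 * 64)
        self.refine = nn.Conv2d(6, 3, 3, padding=1)

    def forward(self, landmarks, embedding):
        coarse = self.cond(embedding).view(-1, 3, 64, 64)
        return torch.tanh(self.refine(torch.cat([coarse, landmarks], dim=1)))

embedder, generator = Embedder(), Generator()
opt = torch.optim.Adam(list(embedder.parameters()) + list(generator.parameters()), lr=1e-4)
l1 = nn.L1Loss()

def meta_learning_step(src_img, src_lmk, tgt_img, tgt_lmk):
    """One step of the lengthy meta-learning stage: learn person-generic motion."""
    e = embedder(src_img, src_lmk)        # identity from a source frame
    fake = generator(tgt_lmk, e)          # animate it with the target pose
    loss = l1(fake, tgt_img)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

def few_shot_finetune(photo, photo_lmk, steps=50):
    """Adapt the pretrained generator to a single still (the one-photo case)."""
    e = embedder(photo, photo_lmk).detach()
    ft_opt = torch.optim.Adam(generator.parameters(), lr=1e-5)
    for _ in range(steps):
        loss = l1(generator(photo_lmk, e), photo)
        ft_opt.zero_grad(); loss.backward(); ft_opt.step()
    return e  # reuse this embedding to drive the face with new landmark sequences

# Dummy 64x64 tensors just to show the call pattern.
img, lmk = torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64)
meta_learning_step(img, lmk, img, lmk)
few_shot_finetune(img, lmk, steps=2)

The few-shot step is also why the results "retain some semblance" of the training identities: most of the heavy lifting is done by the person-generic weights, and a handful of fine-tuning steps on one photo can only nudge them so far.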

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward

    Mona Lisa porn.

  • I remain skeptical (Score:5, Interesting)

    by Shane_Optima ( 4414539 ) on Thursday May 23, 2019 @08:37PM (#58645304) Journal
    I mean, this is obviously something to keep a close eye on, and all hell is going to break loose when nation-state actors begin fabricating video evidence to influence elections or world events. But if this technology is so great then why did it look so bad in Rogue One, a movie that presumably dropped at least a couple million bucks on those scenes?

    (On a related note: A few people I've talked to swear they can't notice any difference in that movie between the human actors and the CG ones. I'm curious what percentage of the population that is, and whether they're suffering from undiagnosed face blindness or something.)
    • if this technology is so great then why did it look so bad in Rogue One

      The CG Rachael in Blade Runner 2049, on the other hand, is incredible. The amount of work required was equally incredible, though - IIRC it took about a year to complete that two-minute scene. However, this stuff is only going to get easier...

    • by AmiMoJo ( 196126 )

      Rogue One was in production long before the tech existed and was done manually, not by AI. You also watch it in high definition, with well-framed shots made with expensive cameras.

      The threshold for fake videos of politicians saying stuff at news conferences, in a Facebook video that's been transcoded to animated GIF and back, is already much lower.

      • I wasn't thinking that AI had trumped millions of dollars' worth of manual fiddling yet, but I haven't dug around on YouTube for higher-quality vids. Certainly AI has the potential to do that, but it would need a lot of human training. And maybe they've done that. I wonder if people are being asked "which face do you think is the virtual one?" on Amazon Mechanical Turk as we speak.
    • One of the things that I find interesting is that people think the impact of such technology is a *new* thing.

      I come from the developing world. As a child, we didn't even really have a washroom. Had to go to what we'd call an outhouse.

      The lack of clear information is just something we took as normal. You never really knew who was telling the truth or not. Even today as I touch base with people, it's the same thing. Rumors start and away it goes with nothing to really provide a check.

      I really don't view thes

    • by idji ( 984038 )
      because it was handcrafted.
      • Well, I do believe that someday most of these problems will be automagically solved by AI, but without a lot of horsepower and extensive training with human feedback (telling it what looks fake and what doesn't), I wasn't thinking this was a magic bullet. Maybe it is. Maybe it really has reached the point where it can beat out millions of dollars of by-hand fiddling. The video in TFS was too small to get a really good look; I was too lazy to dig for more.
    • by KevMar ( 471257 )

      There is also the issue that once you see it, you can't unsee it.

      How many of us were fine with standard-definition TV until we saw HDTV? Or were fine with 96 kbps MP3 files until someone pointed out where they sound hollow? Or CRT displays running at the 60 Hz default because they had the wrong driver, until you could see the flicker? Or movies you first saw years ago that look awful when you rewatch them today?

      I think once you learn how to spot CGI effects or get used to better CGI effects, then you are more susceptible to seeing

    • by PJ6 ( 1151747 )

      I mean, this is obviously something to keep a close eye on, and all hell is going to break loose when nation-state actors begin fabricating video evidence to influence elections or world events. But if this technology is so great then why did it look so bad in Rogue One, a movie that presumably dropped at least a couple million bucks on those scenes?

      We're very close.

      https://www.youtube.com/watch?v=bPhUhypV27w [youtube.com]

      Regarding cost - sometimes projects with huge budgets exist mostly to move money around, and nobody involved (in control) really cares too much about quality. Happens to movies, but see especially: most software projects where the initial budget exceeds $2M.

      • That is very interesting, and obviously I do need to spend more time digging around for the good examples but:

        1) They don't keep the morph in place long enough for me to conclusively determine whether it's really lifelike. It's constantly fading in and out. There are some instants where I think I'm seeing that same plastic-y vibe I always notice, but then it morphs back to his real face again. (Subtly. I grant you that the subtlety of the transformation is very impressive.)

        2) He never actually looks comp
  • ...like in blade runner.

    We are definitely living in some interesting times. We can't even trust our own senses anymore.

  • by AxisOfPleasure ( 5902864 ) on Thursday May 23, 2019 @11:56PM (#58645838)

    The gullible will fall for it and the sceptical will trust even fewer people. It'll offer the marketing-droids a perfect way to sell any product using any celebrity dead or alive.

    " Hi I'm Luis Garavito I murdered over 200 children but don't you find that getting those blood stains out of your clothes a chore when your hiding from the Police? Me too so I tried new "WHITE-O" brand detergeant. It's amazing! I can now reuse those blood stained sheets again and again after disposing of my victims. That's good for the environment and with money off offer, it's good for your pocket too!"

    • by DarkOx ( 621550 )

      I am actually pretty concerned about this. If we cross the Rubicon and enter a world where a lot of people have the ability to make fake media but few people have the technical ability to determine if it is a fake, it's really going to cause a pretty massive breakdown of trust across society.

      The obvious case is "did $Pol really say that?", but even in the courtroom: is the surveillance video real? I would find it hard to say "yup, that guy's the perp" beyond a reasonable doubt when the primary evidence is some slow scan

    • The gullible will fall for it if it reinforces their existing beliefs

      FTFY.

  • It's like burning witches because you're paranoid they might use their power to do something bad.
  • This has grossly huge implications for court proceedings and evidence rules. Laws and legal processes always lag behind the technology. Yes, the nation-state uses frighten me, but the bitter divorce with an angry custody battle frightens me more. I've seen the things that some people will do to try to "win" in a divorce. The highlights include feeding low-dose arsenic to a child, drugging a spouse with X, posting private photos/videos online, selling off all of their possessions at 1/100th of their value,
  • by Anonymous Coward

    We knew the tech is only early stage, we know it's evolving, and once it becomes commonplace it's gonna be a shitshow for people to assess what is real and fake. Like we don't have enough problems already with people choosing what is real or fake based on their feelings.

    At least politicians can breathe a bit easier now. Every time they are caught on tape doing bad stuff, it's a deepfake.

  • One should probably mention not only the industry sponsor but also the academic institution, Skoltech, which made this possible. There is a nice video by the team which is more informative than the linked CNET article: https://www.youtube.com/watch?... [youtube.com]
  • Bad: obviously this tech will be a tool in generating fake news. You can bet that Russia, for example, is running similar kinds of software. Perhaps we need a way to cryptographically sign video content from certified sources? Good: creative possibilities. Just an example: imagine a civil war documentary where the generals could be "interviewed", and they would answer with something they wrote in their memoirs. Or usage in a museum. Etc.
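
    The "cryptographically sign video content" idea in the comment above is easy to sketch, though nothing here is an existing broadcast standard: the key handling, the placeholder file name and the choice of Ed25519 (via the Python cryptography package) are all illustrative assumptions.

    # Hedged sketch: a certified source signs a hash of the clip; viewers verify it.
    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    def sign_video(path, private_key):
        """Publisher side: hash the video file and sign the digest."""
        digest = hashlib.sha256(open(path, "rb").read()).digest()
        return private_key.sign(digest)

    def verify_video(path, signature, public_key):
        """Viewer side: check the clip is unmodified using the source's public key."""
        digest = hashlib.sha256(open(path, "rb").read()).digest()
        try:
            public_key.verify(signature, digest)
            return True
        except InvalidSignature:
            return False

    key = Ed25519PrivateKey.generate()
    # Hypothetical usage; "press_conference.mp4" is a placeholder file name.
    # sig = sign_video("press_conference.mp4", key)
    # assert verify_video("press_conference.mp4", sig, key.public_key())

    The hard part isn't the signature itself; it's distributing and trusting the public keys, and the fact that a signature only proves who published a clip, not that the footage is genuine.
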
  • a porn tape of women pissing on each other and paste Trump's face onto the male porn star in the video and release it. Trump will then scream "fake news!". An investigation will find the original porn tape and Trump will say "see, the media and dems are out to get me!". His idiot followers, easily manipulated as they are, will believe it.

    Trump claimed the pee pee tape doesn't exist because he knows that there are cameras everywhere in the kinds of hotels he stays in, especially in Moscow, so he tells his
