Deepfakes Can Now Be Made From a Single Photo (cnet.com) 64
Samsung has developed a new artificial intelligence system for creating deepfakes -- fabricated clips that make people appear to do or say things they never did -- that needs as little as one photo. CNET reports: The technology, of course, can be used for fun, like bringing a classic portrait to life. The Mona Lisa, whose enigmatic smile is animated in three different videos to demonstrate the new technology, exists solely as a single still image. A Samsung artificial intelligence lab in Russia developed the technology, which was detailed in a paper earlier this week. Here's the downside: These kinds of techniques and their rapid development also create risks of misinformation, election tampering and fraud, according to Hany Farid, a Dartmouth researcher who specializes in media forensics to root out deepfakes.
The system starts with a lengthy "meta-learning stage" in which it watches lots of videos to learn how human faces move. It then applies what it's learned to a single still or a small handful of pics to produce a reasonably realistic video clip. Unlike a true deepfake video, the results from a single or small number of images fudge when reproducing fine details. For example, a fake of Marilyn Monroe in the Samsung lab's demo video missed the icon's famous mole, according to Siwei Lyu, a computer science professor at the University at Albany in New York who specializes in media forensics and machine learning. It also means the synthesized videos tend to retain some semblance of whoever played the role of the digital puppet. That's why each of the moving Mona Lisa faces looks like a slightly different person. [...] The glitches in the fake videos made with Samsung's new approach may be clear and obvious. But they'll be cold comfort to anybody who ends up in a deepfake generated from that one smiling photo posted to Facebook.
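The two-stage recipe the summary describes -- first learn generic face motion from many videos, then adapt that knowledge to a single photo -- can be illustrated with a toy sketch. Everything below is a made-up stand-in (plain lists of numbers as "frames", a mean frame-to-frame delta as the "motion prior"), not the actual Samsung model, which uses an embedder network and an adversarially trained generator.

```python
def meta_learn(videos):
    """'Meta-learning' stand-in: estimate a shared motion prior as the
    mean frame-to-frame delta across every training video."""
    deltas = []
    for frames in videos:
        for a, b in zip(frames, frames[1:]):
            deltas.append([y - x for x, y in zip(a, b)])
    n = len(deltas)
    return [sum(d[i] for d in deltas) / n for i in range(len(deltas[0]))]

def animate(still, motion_prior, steps=3):
    """'Few-shot' stand-in: apply the learned motion prior to one still
    image to produce a short synthetic clip."""
    clip = [still]
    for _ in range(steps):
        clip.append([p + m for p, m in zip(clip[-1], motion_prior)])
    return clip
```

The sketch also hints at why fine details get fudged: the identity comes from one image, while all the motion comes from statistics learned on other people's faces.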
Oh wonderful. (Score:1)
Mona Lisa porn.
My new service (Score:2)
I'm taking regular movies and putting pornstar faces on the actors.
Deep fakes could always be done with a single photo. What's the news?
I remain skeptical (Score:5, Interesting)
(On a related note: A few people I've talked to swear they can't notice any difference in that movie between the human actors and the CG ones. Am curious about what % of the population that is and whether they aren't suffering from undiagnosed face blindness or something.)
Re: (Score:2)
if this technology is so great then why did it look so bad in Rogue One
The CG Rachael in Blade Runner 2049, on the other hand, is incredible. The amount of work required was equally incredible though - IIRC it took about a year to complete that two minute scene. However, this stuff is only going to get easier...
Re: (Score:2)
So you're saying it was a virtual Replicant?! Inconceivable!
Re: (Score:2)
She has the same "dead salmon" eyes that all these fake human images have.
Re: (Score:2)
Rogue One was in production long before the tech existed and was done manually, not by AI. You watch it in high definition with a well framed shot made with expensive cameras.
The bar is much lower for fake videos of politicians saying stuff at news conferences, viewed as a Facebook video that's been transcoded to animated GIF and back.
Re: (Score:2)
Re: (Score:2)
One of the things that i find interesting is that people think the impact of such technology is a *new* thing.
I come from the developing world. As a child, we didn't even really have a washroom. Had to go to what we'd call an outhouse.
The lack of clear information is just something we took as normal. You never really knew who was telling the truth or not. Even today as I touch base with people, it's the same thing. Rumors start and away it goes with nothing to really provide a check.
I really don't view thes
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
There is also the issue of once you see it, you can't unsee it.
How many of us were fine with normal TV until we saw HDTV. Or were fine with 96 kbps MP3 files until someone pointed out where they sound hollow. CRT displays running at the 60 Hz default because they had the wrong driver, until you could see the flicker. Or movies you first saw years ago but look awful rewatching them today.
I think once you learn how to spot CGI effects or get used to better CGI effects, then you are more susceptible to seeing
Re: (Score:2)
I mean this is obviously something to keep a close eye on, obviously all hell is going to break loose when nation state actors begin fabricating video evidence to influence elections or world events, but if this technology is so great then why did it look so bad in Rogue One, a movie that presumably dropped at least a couple million bucks on those scenes?
We're very close.
https://www.youtube.com/watch?v=bPhUhypV27w [youtube.com]
Regarding cost - sometimes projects with huge budgets exist mostly to move money around, and nobody involved (in control) really cares too much about quality. Happens to movies, but see especially: most software projects where the initial budget exceeds $2M.
Re: (Score:2)
1) They don't keep the morph in place long enough for me to conclusively determine whether it's really lifelike. It's constantly fading in and out. There are some instants where I think I'm seeing that same plastic-y vibe I always notice but then it morphs back to his real face again. (Subtly. I grant you that the subtlety of the transformation is very impressive.)
2) He never actually looks comp
Re: (Score:2)
You don't even need audio. Just post a silent video of a politician doing anything with a little girl. Even hugging or kissing would be enough for a vexecution.
Prove you are human... (Score:1)
...like in blade runner.
We are definitely living in some interesting times. We can't even trust our own senses anymore.
Hello advertising hell! (Score:4, Insightful)
The gullible will fall for it and the sceptical will trust even fewer people. It'll offer the marketing-droids a perfect way to sell any product using any celebrity dead or alive.
"Hi, I'm Luis Garavito. I murdered over 200 children, but don't you find that getting those blood stains out of your clothes is a chore when you're hiding from the Police? Me too, so I tried new "WHITE-O" brand detergent. It's amazing! I can now reuse those blood stained sheets again and again after disposing of my victims. That's good for the environment, and with a money-off offer, it's good for your pocket too!"
Re: (Score:2)
I am actually pretty concerned about this. If we cross the Rubicon and enter a world where a lot of people have the ability to make fake media but few people have the technical ability to determine if it is a fake, it's really going to cause a pretty massive breakdown of trust across society.
The obvious one is "did $Pol really say that"; but even in the courtroom, is the surveillance video real? I would find it hard to say yup, that guy's the perp beyond a reasonable doubt when the primary evidence is some slow scan
Re: (Score:2)
That does not really solve a lot of problems. For example:
did $Pol really say that?
I can verify the video has not been altered in that it carries $reporter's or $network's signature; but I have no way to verify they did not create the fake video and then sign it.
did the property manager really check the renters left and was everything really cleaned and locked up
No idea, same problem. I know I have the video they intended to send me, but I don't know it's the least bit 'authentic'.
My ability to trust both of these currently relies on the fact that it would be
1) More trouble than it's worth to make a fake
2) Too damaging to the purveyor's own r
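The integrity-vs-authenticity gap the parent describes can be shown in a few lines. This is a hypothetical sketch, using Python's stdlib HMAC as a stand-in for a real public-key signature: verification proves the bytes are unchanged since signing, but says nothing about whether the signer fabricated the clip in the first place.

```python
import hashlib
import hmac

def sign_video(video_bytes, key):
    """Signer (e.g. $network) attaches a tag over the file bytes.
    (HMAC stands in for a real public-key signature here.)"""
    return hmac.new(key, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes, tag, key):
    """Proves only that the bytes are unchanged since signing.
    A signer who fabricated the clip can sign it just as happily."""
    expected = hmac.new(key, video_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)
```

Note that a deepfake signed by its creator passes verification exactly like genuine footage; the signature only shifts the question from "is this file altered?" to "do I trust the signer?".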
Re: (Score:2)
The gullible will fall for it if it reinforces their existing beliefs
FTFY.
Oh no, someone might have power! (Score:1)
Huge Implications for Court Proceedings/Evidence (Score:1)
Its gonna get worse (Score:1)
We know the tech is only early stage, we know it's evolving, and once it becomes commonplace it's gonna be a shitshow for people to assess what is real and fake. Like we don't have enough problems with people choosing what is real or fake based on their feelings already.
At least politicians can breathe a bit easier now. Every time they're caught on tape doing bad stuff, it's a deepfake.
skoltech (Score:2)
Both good and bad (Score:2)
Trump's handlers are going to take (Score:1)
a porn tape of women pissing on each other and paste Trump's face onto the male porn star in the video and release it. Trump will then scream "fake news!". An investigation will find the original porn tape and Trump will say "see, the media and dems are out to get me!". His idiot followers, easily manipulated as they are, will believe it.
Trump claimed the pee pee tape doesn't exist because he knows that there are cameras everywhere in the kinds of hotels he stays in, especially in Moscow, so he tells his