AI Software Technology

New Deepfake Algorithm Allows You To Text-Edit the Words of a Speaker In a Video (newatlas.com) 97

It is now possible to take a talking-head style video, and add, delete or edit the speaker's words as simply as you'd edit text in a word processor. A new deepfake algorithm can process the audio and video into a new file in which the speaker says more or less whatever you want them to. New Atlas reports: It's the work of a collaborative team from Stanford University, Max Planck Institute for Informatics, Princeton University and Adobe Research, who say that in a perfect world the technology would be used to cut down on expensive re-shoots when an actor gets something wrong, or a script needs to be changed. In order to learn the face movements of a speaker, the algorithm requires about 40 minutes of training video, and a transcript of what's being said, so it's not something that can be thrown onto a short video snippet and run if you want good results. That 40 minutes of video gives the algorithm the chance to work out exactly what face shapes the subject is making for each phonetic syllable in the original script.

From there, once you edit the script, the algorithm can then create a 3D model of the face making the new shapes required. And from there, a machine learning technique called Neural Rendering can paint the 3D model over with photo-realistic textures to make it look basically indistinguishable from the real thing. Other software such as VoCo can be used if you wish to generate the speaker's audio as well as video, and it takes the same approach, by breaking down a heap of training audio into phonemes and then using that dataset to generate new words in a familiar voice.
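The pipeline described above (map phonemes in the transcript to learned face shapes, then render the result) can be caricatured in a few lines of Python. Everything here is an illustrative stand-in: the names `PHONEME_TO_VISEME` and `visemes_for_script` are hypothetical, and the researchers' actual code has not been released.

```python
# Toy sketch of the phoneme-to-face-shape lookup the summary describes.
# A real system learns these mappings from ~40 minutes of training video;
# this table and its labels are invented for illustration only.
PHONEME_TO_VISEME = {
    "AA": "open_jaw",      # as in "father"
    "M":  "closed_lips",   # as in "mom"
    "F":  "lip_to_teeth",  # as in "fake"
    "IY": "wide_lips",     # as in "see"
    "UW": "rounded_lips",  # as in "new"
}

def visemes_for_script(phonemes):
    """Map each phoneme of the (possibly edited) script to a stored face shape.

    Editing the transcript only changes this phoneme sequence; the 3D model
    then takes the new shapes, and neural rendering paints photo-real
    texture over them.
    """
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

print(visemes_for_script(["M", "AA", "UW"]))
# → ['closed_lips', 'open_jaw', 'rounded_lips']
```

The hard parts the sketch hides are exactly where the 40 minutes of training data go: learning the speaker-specific shapes and making the rendered transitions between them look natural.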

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by ihaveamo ( 989662 ) on Monday June 17, 2019 @08:40PM (#58779550)
    This will be used for evil, and people believe what they want to hear. Politicians will change what they say, and people will change what politicians say. Guaranteed.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Writing used to be the gold standard. Then at some point, there were wax letter seals. Audio and video can be faked? It's just another in a long line of examples of how you can't necessarily trust what is in front of you. Like you say, people believe what they want to hear--look at Trump supporters. I worry more about people being framed for crimes, but even that has long been an issue--just gather someone's skin and you may have plenty of DNA to use. Regardless, we're always left to the limits of

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        This is why we need cryptographically signed files, with public keys for verification.
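The core of what this poster suggests fits in a short sketch: hash the video bytes, then sign the digest. Public-key signing itself needs a third-party library (e.g. Ed25519 via `cryptography`), so this standard-library sketch shows only the digest step a signature would cover; the variable names and byte strings are invented for illustration.

```python
import hashlib

def digest(video_bytes: bytes) -> str:
    # A full scheme would sign this digest with the publisher's (or the
    # camera's) private key; anyone could then verify it with the matching
    # public key. Even the bare hash already exposes tampering.
    return hashlib.sha256(video_bytes).hexdigest()

original = b"...raw video bytes..."
edited   = b"...raw video bytes, one word swapped..."

# Any edit to the file, however small, changes the digest.
assert digest(original) != digest(edited)
```

The unsolved part is key distribution and trust: a signature only proves the file is unchanged since signing, not that the signer recorded reality.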

    • by jmatson ( 150605 )

      Someone needs to make an AI that can identify Deepfakes, and fast!

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        That will certainly help train the deepfake makers.

    • by Tablizer ( 95088 )

      Politicians will change what they say, and people will change what politicians say.

      No need. Politicians flip-flop & pander so often that they've probably already stated all positions on a topic.

      • No need. Politicians flip-flop & pander so often that they've probably already stated all positions on a topic.

        The new system will allow them to say something once and the software can state the opposite, thus making politicians more efficient. They can use the time saved to serve their country, the world and humanity. We should rejoice.

        • by tsa ( 15680 )

          We will be living in Paradise soon!

        • by Tablizer ( 95088 )

          "I introduce to you the Lie-O-Matic 9000! The most efficient political spinning machine since Facebook! Step right up, folks!..."

    • by Anonymous Coward
      Those of us who spent years finding lizard men in mpeg artifacts will help you decide what is real and what is fake.
    • by Kokuyo ( 549451 ) on Tuesday June 18, 2019 @01:44AM (#58780336) Journal

      It means you cannot trust anything anymore. Not your own memory (because that's proven to be suspect at the best of times), certainly not anyone else's memory and now not even digital records.

      We'll have to start digitally signing video and audio somehow...

      • by rtb61 ( 674572 )

        You can trust the source; you can always trust the source when it is open, ;D. What you need to do is create a credible source, a matter of legislated public record: a publicly funded data source where you can publish as provably you, and have it recorded, locked down, and available to the public. Not on that source? Not real. But put it on that source and it will never ever go away, not ever, a matter of legislated public record.

        • A publicly funded data source,

          Because if the government has total control over its funding, that'll make sure it's trustworthy!

          • by tsa ( 15680 )

            So, it should be independent of the government. Like the Court of Justice. Oh wait...

      • by Anonymous Coward

        Steel you can trust.

      • >We'll have to start digitally signing video and audio somehow...

        At least until they develop a means for doing this shit 'in the camera' on the fly, signing the output.

        Y'know - replace every "cut taxes" with "raise taxes"; "lower mortality rates" with "murder infants in their cots" or some such shit.

      • We'll have to start digitally signing video and audio somehow...

        I have a better question: Do people care enough? Yeah sure for criminal investigations, but in the general case do people care? We live in a post truth world now. There are literally videos of the President of the USA saying one thing, denying it, and then people defending his denials. Lies are truthified magically with the words "Fake News" in the eyes of the masses.

        Does it really matter that we can actually fake it?

    • What do you mean... "will be". I think you've got your tenses mixed up.
    • by AmiMoJo ( 196126 )

      Do people even care about what is real or not any more? I mean, there wasn't really any question over whether Trump said all the stuff he was condemned for saying, but people elected him anyway, perhaps because he said that stuff.

      The problem we have now is that people think politicians are all the same, that everything they say is a lie, and that honesty is a useless metric for selecting a leader. To an extent they may be right, but by rejecting the truth as something to be valued, a feedback loop is created that keeps making

      • Do people even care about what is real or not any more? I mean there wasn't really any question over whether Trump said all the stuff he was condemned for saying, but people elected him anyway

        I believe there were a lot of voters who voted for Trump as the least bad choice of the two candidates.

        I guess that shows how bad a candidate people thought Hillary was...

    • I can't imagine any made-up dialogue could be much worse than the genuine crap they're spewing already.

    • It means that we'll need to be able to trust the people who report on what politicians say to certify that what we're seeing is real. In the past, reading reports about what politicians were saying and doing was the only way to find out what they were saying and doing. In that respect, this is nothing new.

      Of course, trust in journalism is at an all-time low, for some reason, so... We are in lots of trouble. You're right.
    • Sad but true, but this technology can have a lot of good uses too: from making video game NPCs more interactive, to speech therapy, to testing different types of speeches to make sure you get the right tone and points across.

      But the problem today is that we don't have a trustworthy system to separate truth from fiction/simulation. Because if a politician did say something bad, and it was reported as such, they can claim it was a doctored fake, or that they didn't say such a thing; they don't have much recourse to

  • by Gravis Zero ( 934156 ) on Monday June 17, 2019 @08:51PM (#58779592)

    They were so worried about proving if it could be done that they failed to ask themselves if it should be done.

    • If it can be done, it will be done. There is no "should"

      • by Viol8 ( 599362 )

        Sorry, not true, people are not robots. It would for example be perfectly possible to create a super virus that could wipe out humanity but it hasn't - as far as we know - been done because the people smart enough to do it also know that it would ultimately mean their demise too. Similarly a lot of AI tech will be used for abuse far more than for good and the people creating it need to take a step back and examine their moral compass occasionally.

        • Okay, let's qualify that a little bit: "If it's easy to do, it will be done." This is easy to do; just download the software and type in the text you want your target to say on video.

        • by mccrew ( 62494 )

          I might believe it if you could point to one example where this has worked in real life.

          Sadly, the common case is the mindset that "we" have to develop it first before "they" do.

    • by Anonymous Coward

      Speaking as a researcher, if you can do it, you do it. Otherwise you risk losing your (very tenuous) funding and hence your job.

      Want researchers to filter their research through their principles? Give them job security.

      • by Viol8 ( 599362 )

        "Want researchers to filter their research through their principles?"

        Yes. Normal people have a moral compass that should override financial concerns which is why most of us don't rob banks or mug people. The fact that you think your job is more important than any moral standpoint tells us all we need to know about your sociopathic tendencies.

    • by necro81 ( 917438 )

      They were so worried about proving if it could be done that they failed to ask themselves if it should be done.

      That sounds much more urgent if it is breathlessly delivered by Jeff Goldblum [youtube.com].

  • by HangingChad ( 677530 ) on Monday June 17, 2019 @09:35PM (#58779686) Homepage

    Would be a tool to show when video has been edited and the speaker's words changed.

  • Now we can expect MLK to be advertising for Burger King...

    "I have a dream of... a flame-broiled Whopper with cheese for $3.99."

  • Tell me at least it has vim key bindings. Or maybe the world would be better off if this was harder to use... yeah, finally the use case for emacs bindings.
  • A new deepfake algorithm can process the audio and video into a new file in which the speaker says more or less whatever you want them to.

    What we need is DeepFake as a Service. Maybe Amazon could offer it as part of AWS. Even better if there was an app...
    Choose (or upload) a face and the words, get your clip back by return.

  • Finally, I can have a celebrity "read" my novel for free!

  • The lead-in to this post is misleading in that it implies that this software is available for immediate use by readers.

    Not "you", but "them" -- as in the people who actually have the software. Those people have not released their code.
    "You" can't do jack. ...Yet.
  • ... in a perfect world the technology would be used to cut down on expensive re-shoots when an actor gets something wrong, or a script needs to be changed. ...

    Except that actors will immediately sue to prohibit such re-shoots, and/or negotiate in absentia royalties for those scenes in which a computer algorithm is basically stealing their job.

I do not fear computers. I fear the lack of them. -- Isaac Asimov
