AI Technology

'Deepfakes' of Celebrities Have Begun Appearing in Ads, With or Without Their Permission (wsj.com) 57

Digital simulations of Elon Musk, Tom Cruise, Leo DiCaprio and others have shown up in ads, as the image-melding technology grows more popular and presents the marketing industry with new legal and ethical questions. From a report: Celebrity deepfakes are coming to advertising. Among the recent entries: Last year, Russian telecommunications company MegaFon released a commercial in which a simulacrum of Hollywood legend Bruce Willis helps defuse a bomb. Just last week, Elon Musk seemed to star in a marketing video from real-estate investment startup reAlpha Tech. And last month a promotional video for machine-learning firm Paperspace showed talking semblances of the actors Tom Cruise and Leonardo DiCaprio. None of these celebrities ever spent a moment filming these campaigns. In the cases of Messrs. Musk, Cruise and DiCaprio, they never even agreed to endorse the companies in question. All the videos of digital simulations were created with so-called deepfake technology, which uses computer-generated renditions to make the Hollywood and business notables say and do things they never actually said or did.

Some of the ads are broad parodies, and the meshing of the digital to the analog in the best of cases might not fool an alert viewer. Even so, the growing adoption of deepfake software could eventually shape the industry in profound ways while creating new legal and ethical questions, experts said. Authorized deepfakes could allow marketers to feature huge stars in ads without requiring them to actually appear on-set or before cameras, bringing down costs and opening new creative possibilities. But unauthorized, they create a legal gray area: Celebrities could struggle to contain a proliferation of unauthorized digital reproductions of themselves and the manipulation of their brand and reputation, experts said.

This discussion has been archived. No new comments can be posted.

  • by luvirini ( 753157 ) on Tuesday October 25, 2022 @03:08PM (#62997447)

...especially to the celebrity who is faked.

    But the real problem will soon be in things like politics where people will be faked to say and do things that they never did.

    And even if the video is then analyzed and debunked as deepfake, some people will still believe in it.

"...especially to the celebrity who is faked."

Yes, all those celebrity sex movies and photos are fake. :-)

    • by Tablizer ( 95088 ) on Tuesday October 25, 2022 @03:27PM (#62997509) Journal

      Oh come on, you don't wanna see Elvis sell blue-suede shoes?

      • by Xenx ( 2211586 )
        Honestly, that wouldn't bother me personally nearly as much. That is, a deceased celeb being used. You should be able to tell that a deceased celebrity is likely not actually endorsing the product. To be clear, I understand and agree with why it would bother people. I'm just saying I wouldn't find it as offensive since it's clear what is going on.
...especially to the celebrity who is faked.

      But the real problem will soon be in things like politics where people will be faked to say and do things that they never did.

      And even if the video is then analyzed and debunked as deepfake, some people will still believe in it.

      I'm not as convinced political deepfakes will be as big an issue as people believe.

      Right now video == proof because video is hard to fake, but voice impersonators are pretty good and you don't see many fake voice recordings.

      Moreover there's already lots of politicians who get caught in scandals without any video, just reports from people who were there to witness it.

I think the public will figure out pretty quickly that video can be faked and shouldn't be believed unless a reputable media outlet has verified it.

      • I'm not as convinced political deepfakes will be as big an issue as people believe

        Where have you been the past decade?

        People believe some politician said or did something they never even said or did, without any shred of evidence, just because some bozo claims they did and it fits their narrative. You think they won't if there is a picture to "prove" it?

If they will believe it even without a picture, then that's not a deepfake problem, it's a plain old lies problem. The "deepfake problem" covers only the cases where someone would not believe a lie on its own, but will believe it because of the fake instead of doing the further checking that would have revealed the truth.
        • by quantaman ( 517394 ) on Tuesday October 25, 2022 @04:29PM (#62997667)

          I'm not as convinced political deepfakes will be as big an issue as people believe

          Where have you been the past decade?

          People believe some politician said or did something they never even said or did, without any shred of evidence, just because some bozo claims they did and it fits their narrative. You think they won't if there is a picture to "prove" it?

          Do they truly believe it or do they just say they believe it to fit in?

          A lot of folks "believed" that a pizza joint was in the middle of a child trafficking ring, yet only one bozo showed up at the pizza joint with a gun [wikipedia.org].

          If I knew about a bunch of kids being trafficked at a nearby business I wouldn't just sit around on my ass chatting about it on the Internet. So why were all the other thousands (at least) of Pizzagate "believers" sitting behind their keyboards instead of doing something to stop it?

          Because they ultimately knew it was BS.

          Maybe they believed parts of the conspiracy were true in some sense, but when it came to the principal claims they knew better than to go testing their hypothesis.

          • by Opportunist ( 166417 ) on Wednesday October 26, 2022 @01:56AM (#62998727)

            I'm more inclined to believe that yes, they did really believe that, but that a bunch of kids being abused isn't enough for them to get off their ass. I mean, they ain't kids, so why bother doing anything about it, it doesn't really affect them, does it?

You'll find that the Venn diagram of conspiracy nuts and egotistical, selfish bastards who don't give a crap about anyone but themselves is nearly a perfect circle.

          • by AmiMoJo ( 196126 )

            A lot of folks "believed" that a pizza joint was in the middle of a child trafficking ring, yet only one bozo showed up at the pizza joint with a gun.

That doesn't mean that more of them wouldn't have gone there with guns if he hadn't. He was just the first, and his immediate arrest discouraged any others who were thinking of trying.

            A better example would be the January 6th insurrection. Enough people believed that the election was stolen, that members of the government were criminals, and that they would succeed in their revolution for a large mob of them to descend on the Capitol Building.

            • A lot of folks "believed" that a pizza joint was in the middle of a child trafficking ring, yet only one bozo showed up at the pizza joint with a gun.

That doesn't mean that more of them wouldn't have gone there with guns if he hadn't. He was just the first, and his immediate arrest discouraged any others who were thinking of trying.

              A better example would be the January 6th insurrection. Enough people believed that the election was stolen, that members of the government were criminals, and that they would succeed in their revolution for a large mob of them to descend on the Capitol Building.

              True, though a big difference with the 2020 election is the number of people in positions of trust (prominent politicians and media personalities) who were either telling them the election was stolen or saying they had "deep concerns".

              Most conspiracy theories don't get that kind of direct endorsement from supposedly trustworthy parties.

      • Right now video == proof because video is hard to fake, ...

        Unless you're Trump denying what's in the video, even when it's literally him speaking, and his supporters (enablers) believing him.

      • by rgmoore ( 133276 )

        I'm not as convinced political deepfakes will be as big an issue as people believe.

        I agree with you, but from the other side of the issue. The kind of people who will be fooled by deepfakes even after they've been debunked are already being fooled by much less sophisticated technology. They're easy to fool because they want to believe.

    • by tlhIngan ( 30335 )

...especially to the celebrity who is faked.

I'm sure it's nothing a lawyer can't fix. After all, the ad would basically identify who paid for it - the ad had to run on some medium (billboard, radio, TV, internet) and thus there's a money trail leading to whoever made the ad and all that stuff.

    • Followed immediately by pols denying things they did say.

    • And even if the video is then analyzed and debunked as deepfake, some people will still believe in it.

      We have that now. Even when a video is real, some people won't believe it -- and/or some will deny it, even when they're literally the one speaking in the video. [*cough* Trump *cough*]

    • by jonwil ( 467024 )

      The reverse will also happen where politicians will come out and say "that wasn't me in that video where I promised to do xyz"

    • But the real problem will soon be in things like politics where people will be faked to say and do things that they never did.

      They call it "correction" (season 2 on topic) https://www.imdb.com/title/tt8... [imdb.com]

    • by 0xG ( 712423 )
      I can't wait to see the deepfake of Donald going down on Vladimir :-)
  • ...they've always been deepfakes.

  • by Joe_Dragon ( 2206452 ) on Tuesday October 25, 2022 @03:22PM (#62997485)

    use the DMCA to stop it!

  • They're impersonating someone and/or using someone's image without approval.

    There's no question there. That's illegal. How you made that image is irrelevant.

    • Comment removed based on user account deletion
      • by ceoyoyo ( 59147 )

        In Canada it's punishable by up to 10 years in prison. Looks like it varies from state to state in the US, but in many it's a felony.

        Possibly more important would be that someone using the likeness of a celebrity could be sued up the wazoo. Even if they're not actually endorsing something. For example: https://en.wikipedia.org/wiki/... [wikipedia.org].

      • Re:Legal questions? (Score:4, Interesting)

        by rgmoore ( 133276 ) <glandauer@charter.net> on Tuesday October 25, 2022 @06:16PM (#62997935) Homepage

        In the US, it depends on how the impersonation is used. The line for federal law seems to be falsely claiming an association or endorsement, so using someone's image to sell your product without their permission is definitely illegal. That would be a civil tort, though, not a criminal matter. Some states have stronger laws that allow a person complete right over their likeness when used for commercial benefit. In practice, though, nobody is likely to go after every in-person impersonator, so the celebrity impersonators on Hollywood Boulevard can keep their jobs.

        • For actors who are members of Actors Equity, the standard response to this situation is:

I am Equity. You may not use my likeness, or the sound of my voice, without my written permission, which will be given only upon payment of a fee.

          • by rgmoore ( 133276 )

Sure, but what people demand and what they're legally entitled to aren't necessarily the same. It is always safe to ask permission and pay a fee before using someone's likeness, but that doesn't mean it's strictly necessary. All the normal exceptions (news reporting, commentary, criticism, parody, etc.) still apply.

Sure, but what people demand and what they're legally entitled to aren't necessarily the same. It is always safe to ask permission and pay a fee before using someone's likeness, but that doesn't mean it's strictly necessary. All the normal exceptions (news reporting, commentary, criticism, parody, etc.) still apply.

              In most cases, you can get permission to use someone's likeness by just asking them to sign a model release. Using a recognizable picture of someone in a commercial situation without a model release can easily get you into legal trouble. Reciting the "I am Equity" mantra is saying that you are not willing to sign a model release without compensation.

              The best time to say "I am Equity..." is when the camera and microphone are pointed at you.

Sure. But is the fine large enough to discourage doing it? Because if the advertising generates more profit than the fine costs, the fine is just part of the cost of doing business.

Not illegal. It's no different than if someone made a cartoon figure animated to speak.
    • That's illegal.

      Citation needed.

  • by DrXym ( 126579 ) on Tuesday October 25, 2022 @03:55PM (#62997597)
    I've lost count of the number of times I've seen interstitial ads / videos featuring some celeb purportedly endorsing some obvious scam. YouTube is rife with these ads. I saw a recent one where somebody dubbed over Elon Musk and pretended he was endorsing some shit called Quantum Code which is a binary trading scam.

You would think that YouTube or Facebook could transcribe these videos and nullify them within seconds, but apparently not. You would think they would make it easy to report ads for impersonation or false endorsement, but nope. You would think they would require advertisers to put up a sizable deposit that is forfeited if they violate the rules, but nope to that too. Clearly they don't care enough to implement proper checks and balances. Maybe it's time the EU or some other power bloc starts dumping huge fines on social media until they get their house in order.
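A toy sketch of the kind of transcript check the parent comment describes: once an ad's audio has been transcribed, flag it when a known celebrity name appears alongside known scam language. Everything here (the names, the phrases, the exact matching rule) is made up for illustration; a real system would need fuzzy matching and a maintained blocklist, not hard-coded strings.

```python
# Illustrative sketch only: flag an ad transcript that pairs a celebrity
# name with scam phrases. The lists below are hypothetical examples,
# not a real blocklist or a real platform API.

def flag_ad(transcript: str, celebrity_names, scam_phrases) -> bool:
    """Return True if the transcript mentions a listed celebrity
    together with at least one listed scam phrase."""
    text = transcript.lower()
    mentions_celebrity = any(name.lower() in text for name in celebrity_names)
    uses_scam_language = any(phrase.lower() in text for phrase in scam_phrases)
    return mentions_celebrity and uses_scam_language

# Example modeled on the fake Elon Musk "Quantum Code" ad mentioned above.
transcript = "Elon Musk here. Quantum Code guarantees you huge returns."
print(flag_ad(transcript, ["Elon Musk"], ["quantum code", "guaranteed returns"]))  # True
```

Simple substring matching like this would of course be trivial to evade; the point is only that a first-pass automated screen is cheap, which makes the platforms' inaction harder to excuse.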

    • Don't get me started on YouTube's partisan ways when it comes to policing content!

      It seems that Google's uber-powerful AI tech can successfully spot even a single moment of copyrighted music or video content in a video and immediately issue a strike against a channel -- perhaps even removing that channel completely for what may have been an innocent bit of background music picked up in a public place while filming.

      Apparently the AI can also spot violence, pluck out keywords that indicate disinformation (via

  • by p51d007 ( 656414 ) on Tuesday October 25, 2022 @04:34PM (#62997677)
    Eventually, they can use AI to generate a "person" to play a part in a movie/tv show and these high priced pampered actors won't be necessary.
    • Eventually, they can use AI to generate a "person" to play a part in a movie/tv show and these high priced pampered actors won't be necessary.

      They don't need AI. Hollywood has been doing this for years. Mickey Mouse doesn't collect residuals. The actors who play R2-D2, C-3PO, Darth Vader and Chewbacca can be replaced in sequels as needed, and the audience won't even notice.

  • by Petersko ( 564140 ) on Tuesday October 25, 2022 @04:48PM (#62997701)

Digital video and audio, as representative, dependable records of real events and real acts and statements, are completely fucked, never to return.

    Plausible deniability won't be a goal needing effort - it will simply be the default state of being.

    Good luck, everybody.

Deepfakes are not the issue. Blindly trusting "celebrities" is the issue. Image manipulation has always existed; fix the people, not the computers.
  • Cafe 80's (Score:4, Informative)

    by TwistedGreen ( 80055 ) on Tuesday October 25, 2022 @09:57PM (#62998423)
    I can't wait till I can have Ronald Reagan or Michael Jackson try to sell me the special of the day. I'll just have a Pepsi.
  • Comment removed based on user account deletion
Accept that your puny 5 senses will be perfectly simulated ... soon. Then reality goes bye-bye. Tech evolution keeps moving forward, and humans are mostly standing in one place. Many years ago an audio recording was "proof", and then it was faked; then a still image was "proof", and Photoshop let you make any image you want; now artificial video is created well enough that it's no longer "proof" either. We just need simulated smell, taste and touch and we are done :)
  • It's about time someone worried about their future.
