Creator of Kamala Harris Parody Video Sues California Over Election 'Deepfake' Ban (politico.com)

Longtime Slashdot reader SonicSpike shares a report from Politico: The creator of a video that used artificial intelligence to imitate Kamala Harris is suing the state of California after Gov. Gavin Newsom signed laws restricting the use of digitally altered political "deepfakes," alleging First and 14th Amendment violations. Christopher Kohls, who goes by the name "Mr Reagan" on X, has been at the center of a debate over the use of AI-generated material in elections since he posted the video in July, calling it a parody of a Harris campaign ad. It features AI-generated clips mimicking Harris' voice and saying she's the "ultimate diversity hire." The video was shared by X owner Elon Musk without calling it parody and attracted the ire of Newsom, who vowed to ban such content.

The suit (PDF), filed Tuesday in federal court, seeks permanent injunctions against the laws. One of the laws in question, the Defending Democracy from Deepfake Deception Act, specifies that it does not apply to satire or parody content. It requires large online platforms to remove or label deceptive, digitally altered media during certain periods before or after an election. Newsom spokesperson Izzy Gardon said in a statement that Kohls had already labeled the post as a parody on X. "Requiring them to use the word 'parody' on the actual video avoids further misleading the public as the video is shared across the platform," Gardon said. "It's unclear why this conservative activist is suing California. This new disclosure law for election misinformation isn't any more onerous than laws already passed in other states, including Alabama."

Comments Filter:
  • Good (Score:2, Insightful)

    That's constitutionally-protected speech. It's irrelevant whatever Mr. Newsom thinks of it.
    • Re:Good (Score:5, Insightful)

      by vux984 ( 928602 ) on Thursday September 19, 2024 @08:51PM (#64801479)

      That's constitutionally-protected speech.

      That's for the courts.

      There are few limits on what you say in terms of political speech, but creating faked content showing others saying things they didn't say might be crossing the line.

      Nobody is confused that Alec Baldwin's "Trump" on SNL is the real deal, so they can say anything. But if they AI-deepfaked it so you couldn't hear or see that it wasn't Trump, and then showed it out of context -- yeah, I don't think that's going to be OK. And I don't think "it was parody" is much of a defense either.

      • Re: (Score:2, Interesting)

        by RedK ( 112790 )

        "It was parody" is the ultimate defense actually.

        To this day, people still think Sarah Palin said she could see Russia from her house. It was actually Tina Fey who said that.

        I'm sure you don't actually think Kamala Harris would call herself a "Deepstate Puppet". You're not one of the crazies that thinks Sarah Palin said she could see Russia from her house, right?

        • But that was not a computer-generated deepfake; it was indeed a parody.

        • Re:Good (Score:4, Insightful)

          by thegarbz ( 1787294 ) on Friday September 20, 2024 @03:05AM (#64802007)

          "It was parody" is the ultimate defense actually.

          Actually, it's not the ultimate defense. You can't declare something a parody unilaterally; it needs to be clear that something is a parody. The issue with deepfakes versus what the parent talked about, a dressed-up Alec Baldwin, is that the latter is obviously a parody while the former may be misunderstood as real. It doesn't matter what you, the creator, intend; it matters how other people perceive it.

          "It was parody" is up to the judge to decide.

          To this day, people still think Sarah Palin said she could see Russia from her house. It was actually Tina Fey who said that.

          A reasonable person test applies. The fact that some people can't tell Tina Fey apart from Sarah Palin isn't reasonable. However, if you deepfake Sarah Palin and it's well done, a reasonable person could be fooled, and that would no longer enjoy the protection of "parody". Remember, a parody is not wholesale copying; it's an imitation of style with exaggeration for comic effect.

          • Remember, a parody is not wholesale copying; it's an imitation of style with exaggeration for comic effect.

            ...which kind of parody is an easy sell when you’re not a joke of a leader.

            The parody is obvious. The actual joke is the society the parody is parroting a bit too accurately. Feelings got hurt, and we know what happens after that: “racist” and “sexist” labels start flying. Just like the video stated.

            The real problem is reality being the parody.

        • Re:Good (Score:4, Insightful)

          by F.Ultra ( 1673484 ) on Friday September 20, 2024 @03:10AM (#64802013)
          I think that one in particular worked because what she actually said was quite close: "You can actually see Russia from land here in Alaska." More importantly, the whole idea that this somehow made her qualified is the identical context in both the parody and in what she tried to do in the real world with that quote.
        • Re: (Score:2, Informative)

          by jamienk ( 62492 )

          "They're our next-door neighbors, and you can actually see Russia from land here in Alaska"

          https://www.youtube.com/watch?... [youtube.com]

        • Re:Good (Score:5, Insightful)

          by ToasterMonkey ( 467067 ) on Friday September 20, 2024 @12:18PM (#64803159) Homepage

          "It was parody" is the ultimate defense actually.

          To this day, people still think Sarah Palin said she could see Russia from her house. It was actually Tina Fey who said that.

          I'm sure you don't actually think Kamala Harris would call herself a "Deepstate Puppet". You're not one of the crazies that thinks Sarah Palin said she could see Russia from her house, right?

          I want to know what planet you've been living on the last twenty years where thinking Sarah Palin said she could see Russia from her porch or that Al Gore said he invented the internet rises to the level of crazy.

          The President of the United States of America said we should check up on injecting bleach; that actually happened. Now we're spreading stories about migrants eating pets.

          Parody doesn't exist anymore; we're living in The Onion now. So no, parody is not the ultimate defense. Anyone on either side of the aisle will tell you they don't know what's real anymore; they've stopped listening to or reading the news, and it's all rumors on Facebook and YouTube now. People are gullible and afraid.

      • by lsllll ( 830002 )

        There are few limits on what you say in terms of political speech, but creating faked content showing others saying things they didn't say might be crossing the line.

        Cough, Campari Interview [boingboing.net], cough. The Supreme Court would like to have a word with you [wikipedia.org] there.

        • Re:Good (Score:5, Informative)

          by machineghost ( 622031 ) on Friday September 20, 2024 @12:22AM (#64801797)

          Your second link notes the Hustler parody ad in the case was:

          marked as a parody that was "not to be taken seriously".

          Are all deep fakes as clearly marked? If not, that case hardly seems relevant.

          • Re:Good (Score:4, Insightful)

            by lsllll ( 830002 ) on Friday September 20, 2024 @01:11AM (#64801863)

            Although the parody marking helped in the libel case against Hustler, SCOTUS didn't write that the decision was because of that. The decision clearly protected parody as a form of speech, and its test was that if a reasonable person didn't believe the statements to be true, then it was protected speech. They didn't say "as long as the piece clarifies that it is a parody."

            I watched the video. Would some people believe it? I guess so. Would most? I don't think so. She's so egregious in what she says that no reasonable person would believe it was real, especially in this day and age, when we're all becoming more aware of the capabilities of AI, not to mention technologies that have been used in the film industry for decades.

          • Are all deep fakes as clearly marked? If not, that case hardly seems relevant.

            Interestingly they are now. The video title itself now has PARODY in it in all caps. It didn't on release. It seems like someone is trying to retrospectively undo something they may themselves realise they did wrong.

            That said, the video ended with the typical approval message that political ads have, in this case from "Professor Suggon Deeznuts". That would IMO (and I'm not the judge) make it an obvious parody. However... something could be said about that being at the very end of the video.

      • And I don't think "it was parody" is much of a defense either.

        Actually, parody is a strong defense if a "reasonable person" would recognize it as parody.

        • >"Actually, parody is a strong defense if a "reasonable person" would recognize it as parody."

          Agreed. My only issue is that the video probably should have been clearly labeled as a parody inside the video itself, or labeled as containing AI-generated content (if it did). In this particular case, you would have to be pretty stupid not to recognize that video as parody. But why tempt fate?

        • And I don't think "it was parody" is much of a defense either.

          Actually, parody is a strong defense if a "reasonable person" would recognize it as parody.

          Hang on. My popcorn is almost done. Then we can talk about the definition of “reasonable” in an era when “woman” goes undefined as if it were an out-of-bounds math problem. Trying to find a defense should be hilarious.

        • Re:Good (Score:4, Insightful)

          by vux984 ( 928602 ) on Friday September 20, 2024 @11:56AM (#64803107)

          That's the crux of it right there. If you are using "AI deepfakery" to make it look and sound like it's actually the person, then it's correspondingly less recognizable as parody.

          You don't need deepfakes to make parodies, and if anything parody is served better without them.

    • It's fairly typical of Gavin "do as I say not as I do"som.

      https://www.newsweek.com/gavin... [newsweek.com]
      https://people.com/politics/ga... [people.com]
      https://calmatters.org/comment... [calmatters.org]

    • Re:Good (Score:4, Insightful)

      by timeOday ( 582209 ) on Thursday September 19, 2024 @09:40PM (#64801595)

      "Requiring them to use the word 'parody' on the actual video avoids further misleading the public as the video is shared across the platform," Gardon said.

      What is the argument against that?

      • Re:Good (Score:5, Funny)

        by ShanghaiBill ( 739463 ) on Thursday September 19, 2024 @11:04PM (#64801703)

        "Requiring them to use the word 'parody' on the actual video

        What is the argument against that?

        A simpler solution is to require the politicians to put the words "Not Parody" on all their ads.

        • It's not up to the original to tell people they aren't a fake. I hate politicians as much as the next guy, but that is an arse backwards idea even when you do consider political ads to be weird.

        • "Requiring them to use the word 'parody' on the actual video

          What is the argument against that?

          A simpler solution is to require the politicians to put the words "Not Parody" on all their ads.

          Now that I could get behind!

    • Re: (Score:3, Insightful)

      by Freischutz ( 4776131 )

      That's constitutionally-protected speech. It's irrelevant whatever Mr. Newsom thinks of it.

      Once again, you are either trolling, trying to be sarcastic, or you really are dumb enough to believe that. Assuming the third option, because there are a lot of people these days who fit that description: it's been my experience that people who argue your way in cases like this and approve of this sort of activity do so because (A) they see an immediate short-term advantage in it for their own faction and (B) they can't even think one step ahead. You do realize that if that interpretation prevails it means that

    • That's constitutionally-protected speech. It's irrelevant whatever Mr. Newsom thinks of it.

      As long as the content identifies itself as parody, I -- a dyed-in-the-wool leftist -- fully agree. Gavin Newsom also agrees with you. As do the laws in question (as referenced in the blurblet above). However, as cited here, the video was able to be posted by Elon Musk "without calling it parody." If the inference one draws from that -- namely, that there was no disclaimer on the video identifying it as a deepfake -- is correct, then that implies responsibility for both Musk and the content creator. No disclaimer/

    • Re:Good (Score:5, Insightful)

      by tragedy ( 27079 ) on Friday September 20, 2024 @07:50AM (#64802427)

      That's constitutionally-protected speech. It's irrelevant whatever Mr. Newsom thinks of it.

      That's where it gets a bit tricky. Speech is, indeed, constitutionally protected. However, what is fraud, if not speech? What is conspiracy, if not speech? For example, saying "I will pay $54.20 per share for your company's stock to buy out the entire company and take it private" is free speech. However, if you say it in the right circumstances (as in, actual negotiations on buying the company), it turns out that turning around and saying "just kidding" is fraud. The courts can force you to either follow through or face civil and potentially criminal charges. That is because speech, even free speech, carries meaning, and that meaning is considered very important in certain contexts. The same goes for "we should do a drive-by and kill Bob!" That's free speech. It could just be a joke too. People say things like that all the time without being serious, but sometimes they are perfectly serious. Circumstances usually help illuminate that. For example, if you say that and then you or one of the people you said it to takes steps towards killing Bob in a drive-by, or actually does it, you've taken part in an illegal conspiracy just through your free speech.
      The same goes with creating a fake, AI or not, of someone else saying something they never said. In and of itself it is free speech. If you release it with the intention of making it look like the person actually said or did what is in the video though, then you run into some problems. Those can range from slander/libel all the way to outright fraud. You can't, for an extreme (but real) example, create a deepfake of a company CEO demanding a massive funds transfer to you and use it to try to trick a company into making that funds transfer. That's serious criminal fraud.
      So, it's a matter of circumstance, the circumstance in this case being an election. Out of necessity, all sorts of special rules already apply to political speech, and for pretty much exactly the same reasons. That is to say, certain kinds of speech in a political context qualify as libel and fraud, especially in the context of political fundraising. For example, if you post an AI-generated video of a political candidate butchering and eating kittens, implying that if they win, all kittens everywhere will be butchered and eaten, and you are also soliciting donations to beat them, that can certainly qualify as a form of fraud. You can make the "no one would ever believe it" argument, and that holds sometimes, but reality demonstrates that there's a pretty large portion of the population on whom these things often do work.
      So, in the end, there are limitations on free speech. Those limitations may sometimes be unreasonable, but they often are actually reasonable within defined contexts.

  • Satire (Score:5, Informative)

    by colonslash ( 544210 ) on Thursday September 19, 2024 @08:40PM (#64801455)
    It was obviously satire; here's the video [x.com]—judge for yourself.
    • It was obviously satire; here's the video [x.com]—judge for yourself.

      Well yeah, it's nearly 2 friggin' minutes long for one thing. These days, real ads try to get to their point before you can smash that "skip ad" link.

    • Re: (Score:2, Flamebait)

      by RitchCraft ( 6454710 )

      Hilarious! But what was fake about it? I don't get it?

    • Yes, but there's a spectrum to how obvious something is. This is abused by ad companies to trick some of the dumber customers, and then in court they argue that the misleading ad is legal because a reasonable person would know they're despicable liars. I expect a similar thing will happen with AI-generated videos. I don't think our world is ready for idiots being fully convinced by a fake video that matches their prejudices, while declaring any video that they dislike an obvious fake.

      On a related note, some

      • by lsllll ( 830002 )

        I don't think our world is ready for idiots being fully convinced by a fake video that matches their prejudices, while declaring any video that they dislike an obvious fake.

        If we had to reduce ourselves to the least common denominator, society as a whole would be doomed. The right way is going up, not bringing things down.

        • Well, the usual way to deal with the least common denominator is to either get them the help they need or put them in a box where they can't cause too much damage, and possibly take some sort of precautions against them. For example, a few of the lowest-common-denominator types (either mentally unstable or political) killed off enough presidents that we decided to give them bodyguards, which seems like a good idea considering the few dozen attempts to kill Obama.

    • Re:Satire (Score:4, Interesting)

      by Jayhawk0123 ( 8440955 ) on Thursday September 19, 2024 @11:04PM (#64801705)

      You say satire... I say risky... Considering the critical-thinking levels of most Americans... you can't trust that most will know it's satire, and there is likely a good proportion of the populace that will argue with you that it's real, and that your saying it's satire is part of the deep state agenda to keep the truth hidden.

      Just look at history and how many articles from The Onion were picked up as legit news by ACTUAL news reporting agencies, let alone the number of random people that took the parody and satire as factual. These are the people that ate Tide Pods, made cheque fraud into a trend... and brought us the popped polo collars, plus flat earth, lizard people, etc...

      Unless there is a massive banner in the video saying PARODY... it will be abused. Can you afford to have even more misinformation in politics?

      I know a good chunk of Americans are smart enough and don't fall under this, but the fact that there are enough who do, and who can sway an election, is enough of a concern.

      • by AmiMoJo ( 196126 )

        The potential for abuse is the strongest argument here. It can be edited down to just her saying that, and any warning labels removed.

        One way to handle this is to set the bar at whether a "reasonable person" might be misled, similar to things like decency laws, where what is obscene comes down to what a supposedly representative jury decides.

        Unfortunately it doesn't help with the dumbest in society, but it's legally sound and well tested against the constitution.

    • Thanks for the link; I was looking for it. My judgement is that I would ban it too. But I am a European, and we judge ads differently here compared with the US, so there might be some bias.

  • by iAmWaySmarterThanYou ( 10095012 ) on Thursday September 19, 2024 @08:42PM (#64801459)

    If you can't tell the difference between that sort of blatant parody video and real video without a #parody tag on it then you really shouldn't be voting.

    This is just the first step to further erosion of the 1a. They went straight for "disinformation" as determined by politicians and failed badly so they pulled back and will just get there slower.

    In the US, we are citizens, not subjects, and those politicians work for us. If a citizen wants to brutally mock a politician (even using "AI"... gasp!), they have the right to do so. The 1a does not say "you can say whatever you want, but you have to label it the way the government says".

    The other side of this coin is, "let's file criminal charges against people who say Trump is Hitler because they're inciting violence". I'm cool with people saying Trump is Hitler. It's dumb but sure whatever, go for it. I'm also ok with some random mocking Kamala word salad with a parody video.

    The 1a requires people to have a brain and thick skin. It's not for thin-skinned idiots. We should not reduce our rights to serve the dumbest and thinnest-skinned people in our country.

    • by ArmoredDragon ( 3450605 ) on Thursday September 19, 2024 @09:05PM (#64801517)

      Progressives are concerned that people might think Harris really said she is the ultimate diversity hire and deep state puppet if stuff like this is legal. When you look at the shit Animoji and rsilvergun say, I think this might be a credible statement.

      • by AmiMoJo ( 196126 )

        Says the guy so far down the rabbit hole he will never see the light of day again...

        It's interesting how all this fake news is coming from the Republican side. Everything from eating pets to Kamala saying she is a DEI hire. People have already edited videos misleadingly to try to support the pet-eating stuff, and that recent movie "Am I Racist?" is just a series of misleadingly edited interviews with people who never agreed to be in it.

        What is the equivalent from the Democrat side? I'm no fan of Kamala and t

    • Voting is not an intellectual challenge and people who think it is shouldn't be voting.

      Free speech is a human right recognized and protected by the First Amendment. I don't think the right of free speech includes the right to deceive people, whether deliberately or not. I don't think requiring someone to label their parody as parody threatens our liberty. Whether it violates the First Amendment is entirely up to the corporate lawyers on the Supreme Court.

      The real threat to liberty is that a small number of

    • by quantaman ( 517394 ) on Thursday September 19, 2024 @10:46PM (#64801679)

      If you can't tell the difference between that sort of blatant parody video and real video without a #parody tag on it then you really shouldn't be voting.

      You missed the point.

      The problem is that people are pushing disinformation under the guise of parody. Take that ad with the fake Kamala quotes, for instance: clearly it's anti-Harris. But it's done in the style of an ad that plays someone's actual statements against them, and it actually does have a bunch of (what I assume are real) clips of her speaking later. People sometimes say dumb things; I could forgive a lot of folks for thinking those quotes were real.

      The other side of this coin is, "let's file criminal charges against people who say Trump is Hitler because they're inciting violence". I'm cool with people saying Trump is Hitler. It's dumb but sure whatever, go for it.

      No it isn't, those are completely different coins.

      The Harris deepfakes are explicit attempts at deception. The "people who say Trump is Hitler" is straight up honest political speech.

      I'm also ok with some random mocking Kamala word salad with a parody video.

      As am I, heck, I don't even complain when Trump spends entire rallies serving out bowls of word salad.

      But there are exceptions to the US 1A and deepfakes designed to mislead voters can certainly qualify.

      • by lsllll ( 830002 )

        The Harris deepfakes are explicit attempts at deception.

        For the record, I'll be voting for Kamala. But what you call "deception", I call "funny".

        But there are exceptions to the US 1A and deepfakes designed to mislead voters can certainly qualify.

        Not if they are clearly parody, a category this would fall under. Most people would not believe she'd call Joe senile. Most people would not believe that she would say we have an alliance with the Republic of North Korea. What's next? People watch the movie The Interview and think it was a documentary?

        • It was pretty fucking funny, and I agree it's pretty fucking obviously fake.

          However, it was enough to make me open my eyes and imagine a world where more insidious and less-obviously fake ads proliferate out of control, and it looks pretty fucking bad to me.

          That being said, I have no fucking idea what one does about it. There are obvious 1st amendment issues, but at the same time, it's obvious these can actually get closer to libel. Feels like it's a cat's-out-of-the-barn thing, and I think society will
      • Re: (Score:3, Funny)

        The "people who say Trump is Hitler" is straight up honest political speech.

        plus if they banned those, Trump would have to find a new running mate!

    • If you can't tell the difference between that sort of blatant parody video and real video without a #parody tag on it then you really shouldn't be voting.

      Then some 90% of your population would not be allowed to vote.

    • The other side of this coin is, "let's file criminal charges against people who say Trump is Hitler because they're inciting violence". I'm cool with people saying Trump is Hitler. It's dumb but sure whatever, go for it.

      It's totally legit to compare any wannabe dictator to the most famous dictator of our times.

    • If you can't tell the difference between that sort of blatant parody video and real video without a #parody tag on it then you really shouldn't be voting.

      The problem isn’t the parody.

      The problem, and why Democrats are quite pissed, is that it’s a bit too accurate, literally referencing actual “defense” tactics abused by liberals, like calling someone “sexist” or “racist” for questioning the competency of the lowest-rated VP in history.

      Again, the problem isn’t parody or comedy. It’s the joke of a person we’re calling “qualified”.

  • Forgery != Parody (Score:5, Insightful)

    by Outland Traveller ( 12138 ) on Thursday September 19, 2024 @08:45PM (#64801465)

    Subject says it all. These deepfakes are closer to impersonating a police officer, or forging documents/money than they are to parody and satire.

    • Subject says it all. These deepfakes are closer to impersonating a police officer, or forging documents/money than they are to parody and satire.

      The fact that Kamala is so bad in real life IS the entire reason people don’t see this as painfully obvious parody.

      “I may not know the first thing about running the country, but remember..that’s a good thing.” Are you fucking kidding me? People actually believe she said that? How bad do you have to be in real life to believe that shit?

      And the mockery in the video isn’t even fake. It’s literally describing actual defense tactics being abused today. Leftist society i

  • Just listen to the two of them talk at length about anything on camera. Same mannerisms, same verbal tics, same exaggerations. Apparently the two of them actually hit it off back in '18 when Trump went out there to tour the wildfire damage, and they swore each other to secrecy about it.

    • The difference is that Trump is a barely literate moron who has not even the tiniest commitment to maintaining a democratic government. I agree with you about similarities, though. Both men are monumentally corrupt, and both have blood on their hands due to their penchant for prioritizing donors over voters.

  • by Baron_Yam ( 643147 ) on Thursday September 19, 2024 @10:08PM (#64801641)

    You want to do a parody? Use real actors or make sure the deepfake isn't all that deep and not a good fake.

    It's not complicated, and if you have an issue with that it isn't because you're a hero fighting for free speech, it's because you're fighting for the opportunity to deceive the easily led.

    • Use real actors or make sure the deepfake isn't all that deep and not a good fake.

      Having watched it, I think they absolutely nailed the "not good" part. If somebody watches it and truly believes it's a legitimate campaign ad, that says more about the sorry state of our education system than anything else.

    • by RedK ( 112790 )
      Dude, if you got tricked into thinking this is real, that's on you. Obvious parody is obvious.
    • You want to do a parody? Use real actors or make sure the deepfake isn't all that deep and not a good fake.

      It's not complicated, and if you have an issue with that it isn't because you're a hero fighting for free speech, it's because you're fighting for the opportunity to deceive the easily led.

      You want to do parody and not have The People fall for it so easily? Stop electing a joke of a leader. It’s not complicated.

      Kamala is pissed because she’s so bad in real life that the parody is that believable. That says a lot more about the person being mocked than about the mocking technology. If she were competent, parody would be painfully obvious and accepted. Like comedy.

    • Someone please tell me why this is modded as "Insightful"; it's just bog-standard comment griping: "When you did the thing, you didn't do it the way I would have done it, so you're wrong."

      Where is it written that parody has to use "real actors", not be "too deep", and not be a "good fake"? None of those things are required. All that is required of parody is that it pokes fun at its subject. Whether you use AI or not is irrelevant.

    • You want to do a parody? Use real actors or make sure the deepfake isn't all that deep and not a good fake.

      After all, that still works - a bunch of fools still believe the claim that Sarah Palin said she could see Russia from her house.

  • by tiqui ( 1024021 ) on Thursday September 19, 2024 @11:49PM (#64801751)

    This case was effectively heard in 1988. The person being parodied was a public figure, the parody was intentionally hurtful and dishonest, and basically the only difference is superficial: there was no AI back then. The case involved Larry Flynt and his Hustler magazine doing a parody of Jerry Falwell, TV preacher and leader of the organization called the "Moral Majority". Falwell originally won his case, overcoming the argument that he was a public figure by showing actual intentional infliction of distress (Larry Flynt made no secret of his desire to harm Falwell). The case was eventually overturned by the United States Supreme Court in a unanimous decision that handed a victory to Flynt, protecting parody of public figures even when actual malice was shown.

    When somebody runs afoul of Newsom's newest folly, and after they go broke fighting the case in the courts, Newsom and the idiots of the California legislature will lose this one: the Supreme Court won't be fooled into thinking this is totally new and different just because somebody used AI, instead of their own manual skills, to create the parody. In the end, it's still just a parody of a public figure that the reader/viewer is left to assess, and the public is assumed to be able to figure it out.

    • From a legal perspective, I agree with your assessment. If the argument is "sure, you can make a parody -- provided you don't make it this way", then it's hard to imagine that not being tossed out, maybe even on its face.

  • Babylon Bee (Score:2, Interesting)

    by jroysdon ( 201893 )

    The Babylon Bee is giving Newsom the finger as well.

    https://www.youtube.com/watch?... [youtube.com]
