Senate Passes a Bill That Would Let Nonconsensual Deepfake Victims Sue (theverge.com)
The U.S. Senate unanimously passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act), giving victims of sexually explicit AI deepfakes the right to sue the individuals who created them. The Verge reports: The bill passed with unanimous consent -- meaning there was no roll-call vote, and no Senator objected to its passage on the floor Tuesday. It's meant to build on the work of the Take It Down Act, a law that criminalizes the distribution of nonconsensual intimate images (NCII) and requires social media platforms to promptly remove them. [...] Now the ball is again in the House leadership's court; if they decide to bring the bill to the floor, it will have to pass in order to reach the president's desk.
It's a start (Score:2)
Re:It's a start (Score:4, Interesting)
Re: It's a start (Score:2)
Re: (Score:2)
Re: (Score:2)
If you can tell that a watermark was removed, the watermark wasn't (fully) removed. The point of removing it is that you cannot tell afterward.
Re: (Score:3)
There are already many other ways to identify fake/AI generated images besides looking for a watermark. The watermarks are not strictly necessary for the task.
The reason for putting in a watermark would be to show voluntary cooperation with law enforcement. When the watermark has been removed, someone will have broken the law. Then it's just a matter of time
Re: (Score:2)
>"Here's a thought: Mandate that all AI tools include watermarks in their output."
Seems distasteful, but I am in full agreement at this point. Even if they can be removed, it will help and evidence of removal might be detectable as well.
Re: (Score:2)
Here's a hypothetical video of Locke2005 torturing a kitten. The lack of a watermark must mean it's real. Quick everyo
Re: (Score:2)
Re: (Score:2)
If you allow people to sue, you do not have a problem with open source, because you tackle the right point: the creators (users) and distributors of the images (X), not the technology. There are a lot of tools to create fakes, and one wonders whether this act should really be restricted to AI (why not cover Photoshop-created fakes?), but those tools are not only "dual use" but also "primarily legal use," and it is only a minority that abuses them.
Re:This is ridiculous, what is even the harm here? (Score:5, Insightful)
It sounds like you've been asleep for a few years. They are getting quite realistic and will continue to become even more so. If you were the victim of one of these, you wouldn't be claiming they are harmless.
Re: (Score:2)
It sounds like you've been asleep for a few years. They are getting quite realistic and will continue to become even more so. If you were the victim of one of these, you wouldn't be claiming they are harmless.
If they are already harmful, for example damaging reputations would they not already be something you can sue for like slander?
If you can’t make a case already that they are false, damaging, and intended to damage should you be able to sue?
Does this expand the usable cases from “intent to harm” to “happened to harm”? Or even remove the burden to show any harm? Extending it directly to “it involves a picture of me, was made using a computer, and I don't like it!”
Re: (Score:2)
There is harm when deepfakes are created for revenge porn.
The viewers aren't aware the pictures aren't real. Even if they are aware, it works for shaming just as well as real images.
You'll change your tune when it's your daughter (Score:5, Insightful)
People should have control over all private images, especially revenge porn. Some chick's BF takes a picture of her asleep and naked and posts it to reddit?...yeah, reddit needs to take it down. That's illegal. Generating images that do the same?...pretty similar.
Remember, just because it's not convincing today doesn't mean it won't be in the near future...and even if everyone knew it was fake, it still is very hostile. It is quite traumatic for most women to go through that, especially those from conservative cultures. In some parts of the world, women are murdered by their families for having sex outside of marriage. Having deepfakes of her committing a "murderable" offense seems pretty stressful to me.
Even if you don't believe me, non-consensual porn is just gross and something that no one will miss. We don't need it. There is no value to it. If you need porn, there are many beautiful women and men who will happily do whatever you're into on camera for a reasonable price and often free. Support your local pornstars!
Re: (Score:2)
I disagree..it's a form of sexual harassment.
Allowing people to "sue" is not enough. Criminal charges are warranted for this behavior.
Re: (Score:2)
If the porn star resembles your daughter, that's unfortunate, but if that porn star then uses tech and/or makeup to disguise herself as your daughter, specifically, then makes and distributes porn claiming to be your daughter, that's different. It's a new take on slander or libel.
The laws have to change to address deep fakes. Trying to shoehorn this new space into existing frameworks is clumsy. It's good that it is happening.
Re: (Score:2)
It sounds like you're claiming in a broader sense that defamation does not cause harm.
She's 10...so...yeah :) (Score:2)
The deepfake isn't my daughter, it just resembles her. Do you get angry when a porn actress resembles your daughter? Like I said, it's annoying but there's no real harm.
Well, if she resembles my 10yo, that's pretty criminal. However, your point was if a porn star resembles a loved one. That's just coincidence. They didn't take a photo of my loved one and superimpose it. Using technology to deep fake a loved one means you're taking active steps to deceive.
Rules are arbitrary. We set boundaries and people often disagree. However, making fake sex photos of someone is illegal. That's the boundary. If you disagree, you are welcome to contact your local representative.
Re: (Score:2)
However, most are not monsters like you and me.
I agree with most everything else you said. But this? Come on, man. We're all monsters to some degree. Despite our airs, we have yet to kill the beast inside us all. We've just gotten really good at pretending it's not there when in mixed company.
Re: (Score:2)
People should have some control over images published of themselves.
Re: (Score:1)
Re: (Score:2)
I doubt it will be enforced like that. Without the publishing step, how would a victim discover the creation? There of course would be issues with "art" produced for private consumption that was subsequently leaked to the public. Due to theft or whatever. That creates the deniability of intent to publish. But that's just a case of strict liability. If you own something dangerous (like a gun) you are liable for its falling into the wrong hands and doing harm.
So if you want to see some woman (or man) undress
Non-consensual? (Score:2)
Re: (Score:2)
He's also non-consensual. I wonder if this gives him ideas to sue SNL and any other impersonators he doesn't like (who am I kidding, of course it will).
Re: (Score:2)
So "Sassy Justice" will become illegal? (Score:2)
Re: So "Sassy Justice" will become illegal? (Score:3)
Re: (Score:2)
The point of using a deepfake is to imitate the target person in a way that can convince others that it's real, or at least plausible. That is way beyond "parody."
You do not need a convincing doppelganger of an individual to parody them. If anything, such an exact recreation is antithetical to what parody is and how it works.
=Smidge=
Re: (Score:2)
Re: (Score:2)
There are people who read satire blogs/"news" sites and are convinced that the stories are real, because even though the site plainly labels itself as satire, these people are often only presented with a direct link to the article and do not take the time to check the source of the material.
Yes, you are 100% giving people too much credit.
=Smidge=
Re: (Score:3)
I have not seen Sassy Justice, but the law does look to apply specifically to "intimate images", which I assume Sassy Justice would not fall under. Also, I assume parody could be used as defense in the case as well.
Re: (Score:2)
I haven't seen Sassy Justice, but given what I know about the South Park universe, I imagine it would not meet the criteria of: is indistinguishable from an authentic visual depiction of the identifiable individual when viewed as a whole by a reasonable person.
What if you paint really well? (Score:3, Insightful)
Should painting [mymodernmet.com] also be criminalized?
Re: (Score:2)
There's about one person in the world who can paint that well; there are about 7 billion who can tell an AI to make a deepfake. Do you think it's worth wasting legislative bandwidth on paintings?
This is why libel and even criminal libel laws exist. It's really easy to write shit about people and literally any one can do it. That's why despite muh freeze peaches and muh first amendment libel laws still exist on the books and have not been struck down as unconstitutional for example.
Re: (Score:3)
Someone who can paint well enough to create a plausible fake is already subject to existing laws: counterfeiting and fraud.
Re: (Score:2)
Bah (Score:1)
Hand drawn included? (Score:2)
There are real people that can hand paint a nude painting of people. Many of them accept commissions.
Will they be sue-able too?
If not, will you have to prove it was hand drawn?
Art is subjective, but we all know what we like.
Re: (Score:2)
I'm not a lawyer, but my interpretation of the law is that they would be OK if they used traditional paints. If they hand drew the image using something like Photoshop or Painter, they might still be liable.
is created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means.
Maybe an ambitious lawyer would pull up some renaissance manuscript describing oil paint as high technology, but I think most judges are going to interpret that to mea
Re: (Score:2)
Will they be sue-able too?
You might well be able to sue them under some kind of defamation law if the picture ever got out (and how would you even know to sue them if it didn't?).
Turns out makin' shit up about people is a problem that isn't new.
Zero chance this will (Score:4, Interesting)
Remember who is in the Oval Office. Remember who controls Grok. The law will be Swiss cheese, or more likely just die in committee.
Re: (Score:3)
Passing unanimously means it's likely veto-proof, and anyway it's hard for me to imagine why you think Trump would want to veto this one.
Re: (Score:2)
Easy: Musk et al. pay Trump money to avoid "hindering AI growth opportunities."
And Trump has vetoed veto-proof bills -- see the two he vetoed this year. They had bipartisan support, but Trump just rattled his saber and many (R)s chickened out.
So unanimous approval now, but (R)s are of the belief that if they lose the support of Trump, someone else will get in. So they chicken out.
Re: (Score:2)
Re: (Score:2)
Why couldn't you sue before? (Score:1)
Re: (Score:2)
Social platforms enjoy *some* immunity from being sued over user posts. This could be an attempt to erode that. It is also likely tailored to celebrities or politicians (Trump may be able to sue Youtube / Youtubers who impersonate him thereby chilling dissent) while being ineffectual for actual victims of deep-fakes (doesn't that stuff mostly come from Russia?).
The fact that it would have been political suicide to vote against this bill (just imagine the attack ads) makes me legit nervous about where we're headed.
Re: (Score:3)
The fact that it would have been political suicide to vote against this bill (just imagine the attack ads) makes me legit nervous about where we're headed.
There are masked agents kidnapping and sometimes murdering citizens. But what worries you is that most people would like laws that make it easier to deal with a new kind of automated defamation machine which didn't exist before.
Publishing falsehoods, including faked pictures, about people has never been legal, but as of today wit
Right to your own image? (Score:2)
If people have a right to their own image, the US should consider making nonconsensual photography illegal, even in public. N.b. this is the law in Europe.
I think most of the US requires consent for audio recording. Why should this be different for video?
Re: (Score:3)
>"If people have a right to their own image, the US should consider making nonconsensual photography illegal, even in public."
There is a huge difference between being casually photographed or videoed in public and kept original, and taking those and modifying them with AI (or other) tools to change what was seen/heard.
Re: (Score:2)
If people have a right to their own image
In the US people have basically zero rights to the use of their image if the image was of them in a public place and taken with a “normal” lens (“normal” was not given a legal definition). I believe it is also legal if the person in the image was in a non-public place but the photographer was in a public place. I expect “normal” was intended to mean “you can’t stand in a public street and take a picture with an extreme telephoto through a bedroom window.”
easier way to win legal case (Score:2)
This is the correct way (Score:2)
why just the individual? (Score:2)
Re: (Score:2)
I miss the days when people were held responsible for their own actions.