

AI Industry Horrified To Face Largest Copyright Class Action Ever Certified (arstechnica.com) 187
An anonymous reader quotes a report from Ars Technica: AI industry groups are urging an appeals court to block what they say is the largest copyright class action ever certified. They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement. Last week, Anthropic petitioned (PDF) to appeal the class certification, urging the court to weigh questions that the district court judge, William Alsup, seemingly did not. Alsup allegedly failed to conduct a "rigorous analysis" of the potential class and instead based his judgment on his "50 years" of experience, Anthropic said.
If the appeals court denies the petition, Anthropic argued, the emerging company may be doomed. As Anthropic argued, it now "faces hundreds of billions of dollars in potential damages liability at trial in four months" based on a class certification rushed at "warp speed" that involves "up to seven million potential claimants, whose works span a century of publishing history," each possibly triggering a $150,000 fine. Confronted with such extreme potential damages, Anthropic may lose its rights to raise valid defenses of its AI training, deciding it would be more prudent to settle, the company argued. And that could set an alarming precedent, considering all the other lawsuits generative AI (GenAI) companies face over training on copyrighted materials, Anthropic argued. "One district court's errors should not be allowed to decide the fate of a transformational GenAI company like Anthropic or so heavily influence the future of the GenAI industry generally," Anthropic wrote. "This Court can and should intervene now."
In a court filing Thursday, the Consumer Technology Association and the Computer and Communications Industry Association backed Anthropic, warning the appeals court that "the district court's erroneous class certification" would threaten "immense harm not only to a single AI company, but to the entire fledgling AI industry and to America's global technological competitiveness." According to the groups, allowing copyright class actions in AI training cases will result in a future where copyright questions remain unresolved and the risk of "emboldened" claimants forcing enormous settlements will chill investments in AI. "Such potential liability in this case exerts incredibly coercive settlement pressure for Anthropic," industry groups argued, concluding that "as generative AI begins to shape the trajectory of the global economy, the technology industry cannot withstand such devastating litigation. The United States currently may be the global leader in AI development, but that could change if litigation stymies investment by imposing excessive damages on AI companies."
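For a sense of scale, here is a back-of-the-envelope check of the statutory ceiling implied by the figures above, assuming the full $150,000 statutory maximum for every one of the up to seven million claimed works (an upper bound, not a predicted award; the variable names are purely illustrative):

    # Rough upper bound only: statutory maximum per work times potential claimants.
    # Real awards, if any, could be far lower.
    per_work=150000       # statutory maximum damages per infringed work, in dollars
    claimants=7000000     # upper estimate of class members cited in the petition
    echo $(( per_work * claimants ))   # prints 1050000000000, i.e. roughly $1.05 trillion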
Maybe? (Score:5, Insightful)
They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement
Maybe don't steal stuff from so many people if you can't handle the consequences?
Re: (Score:2)
keyword "if"
Re: (Score:2)
Many people rely on OpenAI and Anthropic. For example, I use their products to write a lot of bash scripts. If those companies go offline, where will people turn? Will everyone flock to Chinese AI offerings? It is going to be a lot harder to shut those down for IP reasons.
Re: (Score:2, Funny)
> For example, I use their products to write a lot of bash scripts. If those companies go offline, where will people turn?
How about pick up a fucking book and learn how to use your fucking computer like you weren't a decerebrate child? If you can't write bash scripts, then go back to your fucking Windows box.
Re: (Score:2)
Maybe write it yourself?
Or, if it's a large thing, see if your friends (who happen to know bash) can help.
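For what it's worth, the kind of script being talked about here is usually small enough to write by hand. A minimal sketch of a hypothetical example (the script name, paths, and arguments are made up for illustration):

    #!/usr/bin/env bash
    # Hypothetical example: archive a directory into a timestamped tarball.
    set -euo pipefail

    src="${1:?usage: backup.sh <directory> [dest]}"   # directory to archive
    dest="${2:-$HOME/backups}"                        # where the archive goes
    mkdir -p "$dest"
    stamp="$(date +%Y%m%d-%H%M%S)"
    name="$(basename "$src")"
    tar -czf "$dest/$name-$stamp.tar.gz" -C "$(dirname "$src")" "$name"
    echo "Wrote $dest/$name-$stamp.tar.gz"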
Re: Maybe? (Score:2)
Re:Maybe? (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
Re: Maybe? (Score:5, Insightful)
That is beside the point. GenAI businesses are the biggest offenders in the history of copyright violation, by far. Why should they not get the biggest punishments in the history of copyright violation, by far?
Re: (Score:2)
Re: (Score:3)
Re: Maybe? (Score:2)
Re: (Score:2)
Re:Maybe? (Score:5, Insightful)
It's crazy how Slashdot's commentariat has pivoted to spouting and defending the bullshit the RIAA and MPAA have always been spewing and which old Slashdot would have shat on liberally.
Re: (Score:2)
Well, everyone was a poor high school student 25 years ago and couldn't afford crap and we were all losers that nobody wanted to hang around anyways.
Re: Maybe? (Score:2)
Just how hard did you sell out, my good man?
Re:Maybe? (Score:5, Insightful)
Leave it to the tech sector to end up creating something we can all hate worse than the RIAA and MPAA.
Impressive when you think of it that way.
Re: (Score:2)
Which raises the question: where the hell ARE the RIAA and MPAA in all of this? Why aren't they getting in on this?
Re:Maybe? (Score:4, Interesting)
Which raises the question: where the hell ARE the RIAA and MPAA in all of this? Why aren't they getting in on this?
They're there - it's just a different strategy:
Before, they needed to publicly make an example of people to stem piracy of "their" content - many fronts (individuals) to deal with to make that happen.
With AI companies, they can just use their size & leverage once others have successfully argued that content is being scraped "illegitimately", and force some perpetual periodic/usage licensing or percentage fee, like they've done for small businesses that play radio or songs in their establishments.
Re: (Score:3)
They don't want the AI industry squashed by a huge judgement (or for AI to not be able to train on previous creative works).
They still have a dream of a future where their corporate leadership can sit in a chair and make new product simply by describing the music or movie to a computer with voice recognition, never having to deal with (let alone pay) any other human being in the process to make it.
Re: (Score:2)
I don't think Slashdot was ever in favor of big business stealing other people's movies, books or games and selling them as their own.
Re: Maybe? (Score:2)
Why can't AI jailbreakers break the subscription payment part of AI?
Re: Maybe? (Score:4, Informative)
Slashdot used to be very clear that taking an unlicensed copy of some infinitely reproducible piece of information was at worst a very different crime than stealing.
Re:Maybe? (Score:5, Insightful)
...how Slashdot's commentariat...
The two are completely different. People downloading movies and songs for themselves and friends, while morally gray, is vastly different from AI companies taking people's works and reselling them. And don't even try to sell me the nonsense that LLMs are "learning" like a human does. That is complete and utter bullshit. LLMs and other so-called AI programs are retrieval, processing, and storage engines. The term "AI" is just a marketing term for the gullible and the ignorant.
Re: Maybe? (Score:2)
How smart was Buddy Bolden, who refused to record because others would just steal his licks?
Re: (Score:2)
Re: (Score:3)
And this is why you're stupid. You're not getting the fact that the Slashdot Commentariat want some consistency in the way this stupid world is inconsistently running.
Re:Maybe? (Score:5, Insightful)
It's crazy how Slashdot's commentariat has pivoted to spouting and defending the bullshit the RIAA and MPAA have always been spewing and which old Slashdot would have shat on liberally.
I don't think that's what's happening here. What you're seeing is a lot of bitterness and sarcasm. Copyright laws are bullshit, but the corporate world damned well ought to live by the draconian laws they purchased.
One thing worse than shitty, draconian laws is shitty, draconian laws that are enforced to the fullest extent against some groups while others are given a free pass.
Re: (Score:3)
Right. Those things are wrong too. People like to do the mental gymnastics required to convince themselves otherwise, but it's horseshit.
If it's okay to take those things without consequence it should be equally fine to ignore the GPL and do whatever the hell you like with the software.
Re: Maybe? (Score:2)
Anyone else remember the Bugroff license?
https://www.lonsteins.com/post... [lonsteins.com]
"The âoeNo problem Bugroffâ license is as followsâ¦
The answer to any and every question relating to the copyright, patents, legal issues of Bugroff licensed software isâ¦.
Sure, No problem. Donâ(TM)t worry, be happy. Now bugger off."
Personal use versus commercialized service (Score:5, Insightful)
It's one thing to download a movie without paying for it. It's not good if you share it with others. It's absolutely wrong on all levels when you turn it into a commercial service. It's a few hundred orders of magnitude worse when you're doing this at the scale of the generative AI companies.
Re: (Score:2)
It's one thing to download a movie without paying for it. It's not good if you share it with others.
To download something requires someone else sharing it with others. So why is the first one "one thing" and the second one somehow any different?
It's absolutely wrong on all levels when you turn it into a commercial service.
Why is taking something that isn't yours any better than taking something that isn't yours and selling it to someone else?
Re: (Score:2)
So, it's ok to rape a woman, as long as you don't come and risk making her pregnant, which is where the real damage is. Got it.
Re: (Score:2)
Re: Personal use versus commercialized service (Score:2)
You are hereby notified that this comment is my private intellectual property, and you are strictly forbidden from learning from it, or using it in any way to update your beliefs about the world.
But I guess you weren't going to, anyway.
Re: (Score:2)
Maybe don't steal stuff from so many people if you can't handle the consequences?
Like people who steal movies, software, videos, and games, right?
Those people that you mean, did they sell the stuff they downloaded to other people for lots of money, like today's "AI"-industry?
Re: (Score:2)
If you sell copies or make money off of the copyrighted works then you can expect to be hammered flat.
Are the AI companies making money from inappropriate use of copyrighted works? Then hammer them.
If the AI is fed only material from expired copyrights and non-copyrightable facts and government publications then they are in the clear though their prose might be in the style of Victorian times.
Comparing apples to oranges (Score:2)
Like people who steal movies, software, videos, and games, right?
I've already explained [slashdot.org] how the above argument is absolutely invalid/irrelevant:
Re: Maybe? (Score:2)
But technology is about the free exchange of other people's ideas!
I wanna make out with my MonroeBot ;)
Re: Maybe? (Score:2)
no, not like that. there's a breach involved there by the MAFIAA and its international partners, who actually privatized the public domain that their riches are based on.
this invalidates a lot of the credibility they claim against the people who gave them that monopoly in the first place.
The "AI industry" has always been a leech, it has no standing in that argument.
Re: (Score:2)
But they aren't stealing. They are copying.
It is to the benefit of copyright holders to conflate the two. But the two never meant the same thing until they made up "theft of service", and tried to make folks believe that watching a movie is a "service".
Re: Maybe? (Score:2)
If you aren't a techbro, why are you here?
LMFAO! (Score:4, Insightful)
Re: (Score:3)
If it's a hallucinogenic program then what's the copyright infringement?
Re:LMFAO! (Score:4, Insightful)
If it's a hallucinogenic program then what's the copyright infringement?
The training set itself is presumably infringing, whether or not the output is.
I personally don't think the output is infringing, but there is a plausible legal argument to be made that it is if the output is similar enough to copyrighted input.
Re: (Score:2)
It's worth noting that the purpose of copyright was supposed to be protecting livelihoods. If it's not doing that, the law has no legitimacy.
But we're rich (Score:5, Insightful)
Once upon a time, things didn't work out for Napster. Personally I felt that copyright rules had been skewed too far against the general public, but in that time period the general pattern was that if you couldn't run a business without breaking the law, then you'd just go out of business.
I guess these days I still feel like the rules are still skewed too far against the general public. The big difference now is this expectation that not only do the extremely wealthy rewrite the rules in their favor, they also take it as a given that if, somehow, they encounter a rule that doesn't let them do whatever they want, the rule must not have been intended to apply to them in the first place, so why should they even have to go to the trouble of getting it rewritten before they ignore it?
This line says it all (Score:5, Insightful)
"One district court's errors should not be allowed to decide the fate of a transformational GenAI company like Anthropic or so heavily influence the future of the GenAI industry generally,"
Translation: "We're so big and important that we're above the law."
No love (Score:3)
Sorry, there is no love lost here. These companies have been pushing snake oil for long enough. Time to rid the yard of snakes.
Re: (Score:2)
Fun fact, in Japan they call it Toad Oil, "gama no abura"
It's exactly the same thing, except toads instead of snakes
Not as horrified as the artists. (Score:3)
The artists are far more horrified by how AI is destroying their business.
Voiceovers, commercial artwork, etc. have all become 'cheap' because buyers no longer need to pay artists.
The artists find the total elimination of their work to be devastating.
Re: (Score:2)
That's a different problem that won't be solved with lawsuits. Once the technology exists, they will be replaced. The only question is when.
Consider how many legal jurisdictions still force people to hire horse carriages if they don't want to walk or bike. There's Mackinac Island and a few other tourist destinations with a few hundred residents each.
How many ban the internet? Just North Korea, but they still have their own internal version.
How many ban knitting machines? None that I can find.
Future artists
Then share in the spoils (Score:5, Insightful)
If they can't pay upfront, they need to make good with some other method: shares (public or private), or something of value like perpetual revenue from MAX accounts, etc. It is beyond crazy how they argue it's fair when it isn't. If I trained my AI using their system, they would slap me with a lawsuit so fast. But hey, might makes right.
They'll come up with a compromise (Score:3)
The US gov isn't going to let AI die; they'll hash out a compromise with the copyright holders. The stupid thing is that an individual can't break copyright without losing their whole net worth over the infringement, while a corporation can commit copyright infringement and get away with it (or maybe not). If they go after Anthropic, OpenAI and Meta are next.
Re: (Score:2)
> The US gov isn't going to let AI die
Clearly the US gov needs to die right along with AI.
Consequences (Score:2)
Re: (Score:2)
It's copyright infringement, it's not theft. That's the motto around here, right?
Good (Score:3)
About time the AI industry is held accountable for its theft.
And no, I don't condone stealing software, movies or videogames either. But these AI companies are stealing on a far more massive scale than someone who copies the odd movie or game.
Re: (Score:3)
I think just copying something for your own personal use is nothing like what these AI companies are doing. Now, if I were distributing the things I copied and charging money for them, that would be akin to what these AI companies are doing.
Losing the Battle and the War (Score:3)
Re: (Score:2)
Its lead in what, exactly? Unemployment? AI slop generation? Homeless people?
If these tech companies get what they want we'll all be wage slave prisoners shuffling around in the dirt of their techno-utopia city-states while they invent creative new ways to entertain the idle rich and casually live out their lives in extreme wealth and opulence on the backs of the rest of us.
Fuck em..
Re: (Score:2)
Re:Losing the Battle and the War (Score:5, Interesting)
If they can't figure out a way to pay people for their work, why should they be allowed to have it? They're just impoverishing people that actually spent time and energy and their own money to make things, and they're saying they should get it 'because'.
Is your problem with people doing it on an individual basis that the crime is too small, or?
They didn't even pay for the value of a single book before slurping it up. They scrape videos and obviously never watch any ads, nor would they need any of the advertised products even if they did. They're undermining everything to sell your own creativeness back to you. It's gross.
This isn't the same as the early days of the internet, when people downloading music were shown to, on balance, buy MORE music than average. There's no argument that it's actually good advertising or disseminating work to the public so it can be discovered. They're just taking it, and they reap all the profits. (Or, more likely, they raise billions in venture capital, never make a profit, and stay alive because billionaires have nothing else to spend their money on while the rest of us eat dirt.)
Re: (Score:2)
Re: (Score:2)
Technology.
Re: (Score:2)
^Exactly!^
Ever see the Animatrix episode "The Second Renaissance"? What you described is exactly that.
Once "AI" takes over your job flipping burgers at McDonald's, you'd better have something put away money-wise.
Maybe they should get busy with working out the UBI bugs, 'cause if they keep making "AI" better and more capable, fewer and fewer people are going to be able to find work... maybe, for every person that "AI" makes unemployed, the UBI should come from the company that made that "AI".
Re: (Score:2)
Re: (Score:2)
That's a lost argument. All innovations till now replaced a part of what people did (either mechanically or mentally), so only a section of the population was impacted, and the rollout was gradual over years, giving time for the impacted ones to transition to something else. But AI will eliminate most of the mental work that the bulk of the population currently does for employment. The only jobs left will be those that are physical and not suitable for robots, like gardening or nursing. Not everyone ca
Re: (Score:2)
So the sacred copyright is secondary to national security huh?
I guess it wasn't so sacred after all...
Re: (Score:2)
Why should native Americans receive any better treatment than any of the various other conquered peoples throughout some 5000 years of human history?
Being Big and Rich Doesn't Make You Right (Score:2)
Having lots of money and claiming you are important for the future of the universe doesn't give you the right to steal property from the people who own it.
Re: (Score:2)
It's copyright infringement. It's not theft. Different crimes here.
Re: (Score:2)
Being displaced by technology is not wage theft. Wage theft is when you perform a job for someone and they don't pay you. This is definitely not wage theft.
It will be interesting to see what comes of this case. If the judge sides with the artist, you'll see the stock market crash and a recession, if not a depression, will almost certainly happen. On top of that, other countries such as China will surge ahead with AI because they don't care about copyright laws. So really, do we support copyright laws that m
Re: (Score:2)
See my comment above... https://yro.slashdot.org/comme... [slashdot.org]
If "AI" lives up to the hype, we'll all be on the streets sleeping under bridges while the C-suite people make billions a day with their "AI"... the C-suite will live in the utopia, and that'll be it.
Our courts side with whoever has the most money (Score:2)
At best this will hit the Supreme Court, where they will declare that it's fair use. The decades of precedent around storing the contents of copyrighted material won't matter.
There is no way something as valuable as AI is going to get derailed by a little thing like the rights of the copyright holders. This is happening and we all need to get comfortable wit
copyright did it to single moms (Score:5, Insightful)
Single moms that shared a few songs paid dearly. Watch tech bros that stole everything walk.
If copyrights were reasonable we would be having a different conversation.
Re: (Score:2)
For one thing, single moms did not profit from sharing songs. Tech Bros are aiming for Billions if not Trillions in profits, based on the collective works of society, without which they have no business. Without being vindictive but just considering the profit motive alone, Tech Bros are liable for the cost of what went into their product. I think this is the same company that also bought a ton of used books and scanned them in for training - that is the way to do it, they paid for the access and didn't jus
Good! (Score:2)
Those companies SHOULD be bankrupted, and the CEOs put in prison for a few years. They are thieves of the highest order, the likes of which the world has never seen before.
Really? The whole sector will die? (Score:2)
Pay up or shut up (Score:2)
These AI companies are throwing fortunes at employees working on their products. Start throwing that money at the people creating the body of knowledge.
I hope it does ruin them (Score:2)
I'd be a very happy person if all AI companies were ruined to the point of bankruptcy and their CEOs sent to prison for life.
Entire AI industry facing financial ruin? (Score:3)
Re: (Score:2)
I hope they lose (Score:2)
I hope the AI companies lose in court and we see an end to this generative AI crap.
Copyright infringement, no consequences? (Score:2)
So Anthropic wants no consequences for committing blatant copyright infringement, while everybody else faces prison time over this?
No. They trained their AI by feeding it works they didn't obtain the rights to. Now there should be consequences. If this means Anthropic goes belly-up, so be it.
If this succeeds, we can basically say goodbye to the AI industry. Whether this is a good thing or a bad thing remains to be seen.
Re: (Score:2)
If this succeeds, we can basically say goodbye to the AI industry. Whether this is a good thing or a bad thing remains to be seen.
Well, we can say goodbye to this version of the AI industry.
And that would actually be a good thing. The GenAI industry is basically a dead-end as far as artificial intelligence is concerned, because it's leading research time and money away from achieving General Artificial Intelligence. How can we know this? Because in order to write this sentence, I didn't need to ingest the entirety of humanity's written word. I've read a couple of thousand books so far in my life (if that); I've had a high school educat
Oh no! (Score:2)
If you can't do the time don't do the crime. (Score:2)
Re: (Score:2)
Is Slashdot letting Donald Trump write the headlines again?
That is the literal headline from Ars Technica.
Re:Has a point (Score:5, Insightful)
cry me a river. where was all this hand-wringing nonsense when piratebay was shut down repeatedly? how about the DMCA, which US industry was happy to back? oh, and the mickey mouse perpetual licensing joke. now it's come back to bite the tech industry in the ass. fuck em. let them ALL go out of business. let the investors cry. then maybe we will get some sanity back in copyright law.
Re: (Score:2)
Stealing, e.g., book authors' work, mashing it up using an "LLM" algorithm, and selling it in a deliberate attempt to undercut the original, content-creating authors would seem like a 'kill switch' for human creativity and livelihoods, no? Ursula K. LeGuin was 101% right when she fought Google, and what was supposed to be her own advocate, the SFWA, on this 20 years ago, and everything she predicted about the destruction of au
Re: (Score:2)
They pirated everything under the sun to create the training set.
After they got some money they started book scanning to at least have some chance at a fair use defence, but even now their training set is likely mostly pirate booty. Any transformation after the fact is beside the point.
Re: (Score:2)
There are only two real judgments on fair use, and one is weak as shit.
Anthropic lost on the piracy part of their case. Which can kill them, regardless of whether using scanned books for training is deemed fair use at the Supreme Court.
Re: (Score:2)
So... if I have a text file of The Stand, and change all the names and places (maybe make Randall Flagg a fluffy unicorn) and give it a happy ending, I didn't steal it, right?
If it's 'only' doing the transformative thing, even if it's trained using Romeo & Juliet, it should not be quoting word-for-word the original play.
If it's fair use' to train an "AI" (really just a LLM predictive text thing like your phone uses), then it's fair use for it to scrape your computer and Facebook and medical data and dri
Re: (Score:2)
(mangled that one)
should read:
"If it's fair use' to train an "AI" (really just a LLM predictive text thing like your phone uses) using copyrighted works, then it's fair use for it to scrape your computer and Facebook and medical data and driving record and ID card for "training data".
Re: (Score:2)
Oh, bullshit. It's been shown that AI models can and will regurgitate obviously infringing output [ieee.org]. Show us that compression which will reduce one of the examples to "just a few bytes."
Re: (Score:2)
"Look, $150K per work × 7 million potential claimants = over $1 trillion in damages. That's not justice, that's basically a kill switch for the entire industry."
Maybe an industry based on stealing everyone else's stuff and pretending it's theirs because 'muh training' should be killed?
Particularly given the reports of AI companies ignoring robots.txt to steal web content after being denied permission to access it.
Re: (Score:2)
How about we allow the "AI" industry to become the next dodo, and humans go back to doing the work... that sounds like a plan!
Drug discovery and protein folding... how about distributed computing?
And, boring coding tasks? Even with an "AI" doing the coding, a human still has to go through and validate and proofread and correct any errors that the "AI" made... would that human still have to do all that if they did the coding themselves?
What do we really need "AI" for? Starting Cyberdyne? Building the Mat
Re: (Score:2)
Weird Al creates a derivative work. AI doesn't even do that. You misunderstand how the models work when you talk about "changing the storage method". There is no storage of the originals in the model.