
Anthropic Agrees To Pay Record $1.5 Billion To Settle Authors' AI Lawsuit (deadline.com)

An anonymous reader quotes a report from Deadline: Anthropic has agreed to pay at least $1.5 billion into a class action fund as part of a settlement of litigation brought by a group of book authors. The sum, disclosed in a court filing on Friday, "will be the largest publicly reported copyright recovery in history, larger than any other copyright class action settlement or any individual copyright case litigated to final judgment," the attorneys for the authors wrote.

The settlement also includes a provision that releases Anthropic only for its conduct up to August 25, meaning that new claims could be filed over future conduct, according to the filing. Anthropic has also agreed to destroy the datasets used in its models. The settlement figure amounts to about $3,000 per class work, according to the filing.
You can read the terms of Anthropic's copyright settlement here (PDF). A hearing in the case is scheduled for Sept. 8.
  • US model has to pay $1.5bn and also can no longer use the books. Chinese models have to pay $0 and can use the knowledge from all the books. This seems lopsided.

    • We are halfway to dictatorship. We could do the whole mix of corporation and government. I think there's a name for that. Starts with the letter f I think....

      Anyway let the Chinese have their AI. The only thing it seems good for is taking low level jobs and all that's going to do is create a fuck ton of social unrest we don't need. They can deal with the tens of millions of completely unemployable people in a "communist" Nation if they want.
    • So what you're saying is that criminals should be allowed to crim away, because if you stop the criminals from crimming, then someone else might profit?
  • Write crap books. "publish" them and then collect big time on the copyright infringement.

    Regardless of the ethics of AI, this is very difficult news for the rest of us when the collection for copyright violation far outstrips the commercial value of the materials.

    • by darkain ( 749283 )

      $3000/book is hardly "big time" - that's only 1 month (maybe 2-3 if you're super scrappy) of rent/bills in most large cities in this country.

      • It's $3000 more than they had before this settlement, so based on a ten percent royalty (apparently that's average for a non-self-published book) that's $30,000 worth of books sold. At $20 per copy, that's 1500 individual book sales.
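        A quick sanity check of that arithmetic (the 10% royalty rate and $20 cover price are the commenter's assumptions, not figures from the settlement):

        ```python
        # Assumed figures from the comment above: $3,000 settlement share,
        # 10% royalty rate, $20 cover price.
        settlement_per_work = 3000
        royalty_rate = 0.10
        cover_price = 20

        # Retail sales that would generate the same royalty income
        retail_sales_equivalent = settlement_per_work / royalty_rate
        copies_sold = retail_sales_equivalent / cover_price

        print(retail_sales_equivalent)  # 30000.0
        print(copies_sold)              # 1500.0
        ```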

      • Yeah but if you only used $50 in electricity to power the LLM for writing the book to get infringed, that's still $2950 profit.
    • Big valuation AI companies aren't going to be pirating books like that. Anthropic did this when they were a little fish.
      • Yea, no way a big company would ever do such a thing. No way at all. They have smarty pants lawyers to tell them, "Don't train on stolen content like books and other media."

        So yea, big companies are _totally immune_ from doing such things.

        In other news... https://www.techspot.com/news/... [techspot.com]

    • by gweihir ( 88907 )

      You are such an idiot.

    • Yay, it's fun to blame victims and cry for mega corporations!

      The number is high for several reasons:

      1) They were guilty and had no clear path to victory
      2) AI Companies have more money than they can reasonably manage right now
      3) Letting the case proceed and getting a precedent on record would be a nightmare for AI companies even if the amount in this case was far lower.

      This was about managing risk and throwing enough money at it to make it go away.

    • I've spent more than 30 years of my life writing science fiction, all of which has apparently been hoovered up for AI training. I won't see a cent of any settlements or class actions as I don't live in the US, and to be honest I don't care about any of that.

      I haven't written a word of fiction for five years, and I don't know if I'll ever bother again. I meet people every day who are excited about publishing 'their' novel which they 'wrote' using AI, with an AI-generated cover. Then they ask my advice on h
    • Write crap books. "publish" them and then collect big time on the copyright infringement.

      Cheaper and simpler to just churn out AI slop books and flog them on Amazon

  • by MrBrklyn ( 4775 ) on Friday September 05, 2025 @05:42PM (#65641952)
    • Re: (Score:2, Troll)

      by MrBrklyn ( 4775 )

      https://tnq.ca/story/welcome-t... [tnq.ca]

      Welcome to this Issue
      By Pamela Mulloy

      In the midst of copy editing the material for this issue I read “These 183,000 Books are Fueling the Biggest Fight in Publishing and Tech” by Alex Reisner in The Atlantic. The article outlines how big tech companies are using the published work of writers to train generative AI systems, and the fact that this work is being scooped up without permission has the writing community up in arms. The list of writers whose work has be

  • If I follow the PDF link, is that another $3,000 ?
    • by linuxguy ( 98493 )

      > If I follow the PDF link, is that another $3,000 ?

      It may be possible to read the copyright message at the beginning of the PDF and then decide not to train on that document.

      I am sympathetic to the copyright holder's argument. But at the same time, I do feel that if the LLMs got a little dumber after having lost access to good books, we'd all be worse off.

      • by allo ( 1728082 )

        I only see the copyright issue in cases where something is actually reproduced (which shouldn't happen, and doesn't for the majority of works, but does for a small fraction due to overfitting). Everything beyond that is mostly a moral argument, "if it would not exist without the books, it should not exist," not backed by any measurable monetary loss or copyright issue, and motivated more by wanting the model not to exist than by personal loss.

  • by EllisDees ( 268037 ) on Friday September 05, 2025 @05:49PM (#65641974)

    Anthropic wouldn't have gotten into any trouble at all with this if they hadn't torrented a metric shit ton of books. If they had simply purchased a copy of each one second hand, they would have been in the clear training their models on those.

    • In every printed book there's a legal notice that it's illegal to store the book in any electronic retrieval system (i.e., scan it). That's been around since computers were just a theory.
      • by allo ( 1728082 )

        You can put a lot of notes in your books, that doesn't necessarily mean they have legal relevance.

        The whole point with AI training (or transformative use in general) is that if it is ruled to be transformative use, the company does not need to obtain a license. If you do not need a license, it doesn't matter what terms someone would like to put into the license, as you didn't agree to them but used the work under the copyright exemption instead. Licenses and terms of service are relevant in cases where yo

    • Anthropic wouldn't have gotten into any trouble at all with this if they hadn't torrented a metric shit ton of books. If they had simply purchased a copy of each one second hand, they would have been clear training their models on those.

      True, but it's a hell of a lot easier to download a torrent of scanned books than to purchase and scan them yourself.

  • Why would they destroy their datasets? The settlement should have covered the purchase price for those materials.

    • by darkain ( 749283 )

      "That Anthropic later bought a copy of a book it earlier stole off the internet will not absolve it of liability for the theft but it may affect the extent of statutory damages" - The Judge in the case

      • by MrBrklyn ( 4775 )

        seems obvious the Judge is getting paid off. He probably gets 10% of the bounty.

      • by cowdung ( 702933 )

        Ok sure. It doesn't need to absolve them. But you might as well settle to pay for the license to the materials and keep your datasets.

    • by MrBrklyn ( 4775 )

      That would be an astonishingly small amount of money, would make a mockery of the law, and would embarrass the plaintiffs... so don't bet on that happening.
      top posting is fun...

      Why would they destroy their datasets? The settlement should have covered the purchase price for those materials.

    • I'm not sure it matters. The destruction looks like it's after the appeal. So, while they're appealing, other employees can use the existing dataset to produce a work that can be loaded to generate replacement datasets after the originals are destroyed.
    • by allo ( 1728082 )

      The decision says:
      First, you are not allowed to have the books without buying them (so you need to delete them)
      Second, you get a penalty (that's the $1.5 billion)

      They can probably buy the books again and then continue training. If they now paid $3,000 per book, the additional $50 to buy each one again shouldn't matter too much, should it?

  • by Mirnotoriety ( 10462951 ) on Friday September 05, 2025 @07:15PM (#65642134)
    > Nobel laureate Geoffrey Hinton has warned that AI will concentrate wealth among a small elite while impoverishing most workers.

    Not only that, as our access to information becomes increasingly funneled through AI systems owned and operated by these elites, the biases inherent to their control may result in significant informational inequality.
  • they are doing it out of the goodness of their hearts
  • each?

    Presumably they're still gonna keep assfucking you though?
