Nick Clegg Says Asking Artists For Use Permission Would 'Kill' the AI Industry 218

As policymakers in the UK weigh how to regulate the AI industry, Nick Clegg, former UK deputy prime minister and former Meta executive, claimed that a push for artist consent would "basically kill" the AI industry. From a report: Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it wasn't feasible to ask for consent before ingesting their work.

"I think the creative community wants to go a step further," Clegg said, according to The Times. "Quite a lot of voices say, 'You can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."

"I just don't know how you go around, asking everyone first. I just don't see how that would work," Clegg said. "And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight."

Comments Filter:
  • Nutshell (Score:5, Informative)

    by aaarrrgggh ( 9205 ) on Monday May 26, 2025 @04:32PM (#65405503)

    Obviously AI in its current incarnation is incapable of existing with consideration for artists/authors rights.

    • Re:Nutshell (Score:5, Insightful)

      by Kisai ( 213879 ) on Monday May 26, 2025 @05:18PM (#65405625)

      Of course. AI is not creative. It has to learn off existing material, be that text, voice, music, people's faces, paintings, drawings, television shows, anime, etc.

      Like here's the thing: I would be OK with AI scraping existing publicly reachable information if it only scraped it once and retained the credit. The problem is that it does neither. It does not respect the bandwidth websites pay for (a problem web crawlers/spiders have always had) and it does not respect the copyright and ownership of the material on those websites. Like that "Studio Ghibli" art style one of the AIs came out with... that is absolutely wrong. You should not be able to tell the AI "Studio Ghibli", or name any of the films they were responsible for, and get an art style like it. This is literally telling the AI not to be creative, but to clone the style.

      And that's what AI is good at: transforming thing A into thing B. It's not making anything from whole cloth. Music is the worst, though, because at present AI does not sing: either it can "choir" or it requires someone else to sing (e.g. the original artist) and squeezes the result through an AI autotune into another singer's voice style, but it's still very clearly the original singer.

        Music is the worst, though, because at present AI does not sing: either it can "choir" or it requires someone else to sing (e.g. the original artist) and squeezes the result through an AI autotune into another singer's voice style, but it's still very clearly the original singer.

        Man, you're behind the times a bit here. I've been playing around with Suno quite a bit and their latest model is so good it's creepy.

        Hear it for yourself [youtu.be], that's one of the songs I made with the free trial of their paid model.

        Now here's the thing: do I fancy myself an actual artist because I collaborated with ChatGPT to turn my ideas into some lyrics, then had Suno make them into a song? Not really, because being a "real" artist is about having the correct industry connections. Then you can get away wit

        • Damn, that's awful.
    • by dbialac ( 320955 )
      Not that AI shouldn't be asking for permission, but human thought requires the same. That said, buying the book or a similar act by a human covers the license. Perhaps AI needs to do something equivalent.
    • The trouble is, this is possible but not obvious, since artists'/authors' rights were defined at a time when LLM ingestion wasn't really a thing, so they weren't made clear with respect to it. Recording all the world's available media and spitting bits of it out whenever you want to is clearly against the rules. Reading/watching it all with a human brain and then having thoughts influenced by it is clearly in accordance with the rules. And LLM training is about halfway between these two things, not really sim
    • That does not follow. Note that actual human artists learn by working with copyrighted art all the time. Look at how many kids learn to draw by doing things like first tracing Pokemon. Similarly, no one learns how to write without reading a heck of a lot.
    • He didn't really say that. The summary started out misrepresenting him. Then you get down to the last paragraph and find out what he actually said:

      "And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight."

      That's probably true. If only one country had the restriction and no other, all the AI work would move to other countries. That's why we need international agreement that ignoring the rights of authors is not ok.

  • Is this bad? (Score:5, Insightful)

    by XXongo ( 3986865 ) on Monday May 26, 2025 @04:32PM (#65405505) Homepage
    "you would basically kill the AI industry in this country overnight."

    He says that as if it were a bad thing.

    • Re: Is this bad? (Score:4, Insightful)

      by jrnvk ( 4197967 ) on Monday May 26, 2025 @04:39PM (#65405525)

      Right? I mean, no other industry gets a pass for this behavior

    • Killing the AI industry in its current form _worldwide_ would be a good thing. A big reset and rethink, kind of like reining in the nuclear arms race. Killing the AI industry in the UK only, while it remains a free-for-all elsewhere, is economic hara-kiri. If the UK bans it but some other country doesn't, then companies will simply set up shop in that other country, do all the creative-output mining where it's legal, and then sell whatever they can to whomever they can. They make money, the UK doesn't.

      • Re:Is this bad? (Score:5, Insightful)

        by postbigbang ( 761081 ) on Monday May 26, 2025 @06:06PM (#65405757)

        I'd be OK if every AI crawler that dropped by a site put a nickel, several pence, or even a bit into that site's coffer. Here's your dosh, here's my scrape.

        They don't do that. It's kleptocracy, purely and simply.

        With millions of BS sites on the web, AI is made up of millions of garbage pages masquerading as "intelligence". The quality of the web is hideous, and people scratch their heads when AI hallucinates.

        The real and human content, no matter its quality, is as unrewarded as the BS goo found at every third IP address.

        I hereby invent AutoDosh: the tip jar a crawler pays into to get into a site. It has a unique code. If a training model doesn't like the content, or it's redundant/useless drivel, remember not to go by that site and drop the nickel. Otherwise, gimme my nickels.

          • I would like to see the Google antitrust trial solve this. A good solution is to have a single crawler for the web (it could be Google's, via an antitrust settlement), and then everyone pays into a pool to get access to the feeds from that single crawler. Payments into that pool can then be used to make the equivalent of statutory royalty payments to the sites crawled. If you don't want to be crawled, put your stuff behind a login. Of course, you are going to be sorely disappointed in the amount you get from those

            • Your model is good, save for the pool idea, as everyone will try to drain it inequitably. Direct payment. Kafka could handle it, or a pub/sub funds-disbursement model.

            For those that have scraped prior to this, may they rot in hell, broke, with GPU payments to make.

              • The non-profit controlling the pool can negotiate the user fees based on usage. As for people who bypass the pooled crawler: every web site should let them make requests and then never respond to those requests, effectively keeping them in infinite timeouts. Public embarrassment of the bypassing entities will also help control this.

              • I don't know if you've looked recently, but there is no such thing as embarrassment in all vectors of AI. But I get your idea.

                I suggest also adding tokens by PAYING IN ADVANCE in some negotiated HARD CURRENCY to fuel the pool. Admittance is then a matter of a valid public key generated against the pool and site.

                • Now get the judge in the Google antitrust trial to order Google to create this and you're done. I don't think the proposal is unreasonable; someone just needs to get it in front of the right people. Google would certainly prefer doing this to selling off Chrome.

            • Another option would be to let each page set a micropayment amount in its headers. Then the crawler could crawl until it runs out of money. This works as a double-edged sword: if you set your micropayment amount too high, you are not going to get crawled, and then you'll drop out of every search index. So it's your choice. The single crawler would crawl free pages first and then crawl from cheapest to most expensive until it runs out of money. Obviously if you set your micropayment at $100 you're never

              • There is a good solution to prevent gaming this. The crawler can use AI to assess whether it wants to pay the price the page is asking. It can always decide the price is too high and not add the page to the index; in that case it doesn't pay. The payment is not for crawling, it is for permission to be added to the global index.

              • You're both setting up curation. This Darwinian approach might also eliminate AI sucker-bait. I like the idea.
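The budget rule described in this subthread (crawl free pages first, then cheapest to most expensive, stop when the money runs out) is simple enough to sketch. A minimal, hypothetical illustration, assuming the per-page asking prices have already been read from some price header (e.g. an invented `X-Crawl-Price` field, not a real standard) into a dict:

```python
def crawl_order(pages, budget):
    """Decide which pages to pay for, given {url: asking_price} and a budget.

    Free pages come first, then cheapest to most expensive; once a page
    costs more than the remaining budget, everything after it costs even
    more, so we stop.
    """
    crawled = []
    for price, url in sorted((p, u) for u, p in pages.items()):
        if price > budget:
            break  # remaining pages are at least this expensive
        budget -= price      # pay the page's asking price...
        crawled.append(url)  # ...and add it to the index
    return crawled

# With a budget of 6, the free page and the 2-unit page fit; the 5-unit
# page no longer does:
print(crawl_order({"a": 0.0, "b": 5.0, "c": 2.0, "d": 100.0}, budget=6.0))
# → ['a', 'c']
```

Under this scheme a site that prices itself too high simply drops out of the index, which is the double-edged sword the comment describes.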

    • "We're the new hotness that everyone wants to throw huge stacks of cash at and use as an excuse for / against anything at all, but we don't want to spend any of those billions of dollars fairly compensating the creator of the works we want to create derivative works of for their copyright, which is explicitly under the control of the copyright holder."

      Or, the TL;DR version: "waaaahh we don't want to pay licensing for their work, but expect everyone to pay us to license the derived product of their work"

      And

    • Re:Is this bad? (Score:5, Insightful)

      by Cyberpunk Reality ( 4231325 ) on Monday May 26, 2025 @05:34PM (#65405675)

      Year by year "right to a profitable business model" (as long as you have a big enough pile of wealth to begin with) marches onward.

      When your business's "one neat trick" is *ignore the law* (because money), your business should not be permitted to exist.

    • Thank you very much
    • "you would basically kill the AI industry in this country overnight."

      He says that as if it were a bad thing.

      The genie ain't going back in the bottle.

    • by GrahamJ ( 241784 )

      farnsworth_good_news.gif

  • by Gravis Zero ( 934156 ) on Monday May 26, 2025 @04:33PM (#65405511)

    If your industry is unable to survive by following the law then isn't that the same as admitting the basis of your industry is violating the law?

    • Not once you understand the Golden Rule: he who has the gold, rules.

    • Yep. What he's saying is the very definition of corporate theft.

    • It's a terrible example because I hate those damn things, but the dockless scooter rental industry followed a similar playbook.

      The only good to come of it was that, in most places, the revised laws that resulted made it legal for privately-owned scooters to be ridden as a means of transport. Whereas previously, they existed in sort of a legal gray area as something you were technically only supposed to ride on private property.

      IMHO, copyright lasts too long and it is badly in need of reform anyway. If the

  • Familiar argument (Score:5, Insightful)

    by ThumpBzztZoom ( 6976422 ) on Monday May 26, 2025 @04:40PM (#65405527)

    That's the same argument Napster used.

    If we can't get all our inputs for free, it would kill our ability to charge for similar stuff based on those inputs.

    If we can't take everything we want without permission whenever we want it, it would kill our plan to replace human creativity with cheap imitations.

    • Shutting down Napster didn't kill music streaming, either. The industry went back to the drawing board, came up with a better business model, and now rules the entertainment world with it.

      • by evanh ( 627108 )

        And that's exactly what is expected of AI proponents too. Cut them off from the free lunch. The alternative is the scrapping of copyright.

        Hehe, my Slashdot CAPTCHA word is "plunders". So apt.

    • There's your answer, then. Start feeding AI copyrighted music, which will piss off the RIAA. That should bring the issue to *some* sort of resolution really quickly.
    • by dgatwood ( 11270 )

      That's the same argument Napster used.

      If we can't get all our inputs for free, it would kill our ability to charge for similar stuff based on those inputs.

      Napster A. didn't charge, and B. didn't provide "similar stuff"; it provided identical stuff (ignoring compression artifacts). So no, that's not the argument Napster used.

    • Napster's argument was that they weren't breaking the law because they didn't host any of the content that was being illegally distributed. Technically that was true, but then contributory infringement became a thing.

      AI isn't helping you get a bootleg copy of Taylor Swift's latest album. Well, it might explain the process with the right sort of prompt, but you'll still have to do the actual legwork yourself.

    • The difference is this will make the rich richer. By trillions. So it doesn't matter if it's legal, they'll make it legal.
  • ...to see the problem.
  • Asking Artists For Use Permission Would 'Kill' the AI Industry

    First, a lot of hyperbole there (as if AI couldn't possibly do anything else?!?). But when you think about it, what's the alternative? Not asking artists, and killing off that industry (which, by the way, if it didn't exist, these AI companies wouldn't have anything for their models to train on).

  • by karmawarrior ( 311177 ) on Monday May 26, 2025 @05:17PM (#65405623) Journal

    Make it mandatory, but force the AI industry to pay for every work they use. Photos and pictures could cost a million quid per photo/picture. And written material a thousand pounds per word.

    Oh, and add an extra 10% on top of that to be redistributed as a UBI to the entire country.

    If you're going to replace good quality work with slop, you need to compensate the people who'd have otherwise been able to get quality content. And if you're going to automate people's jobs out of existence and demand their labor be used to automate it, you should set them up for life.

    Oh, your paymasters at Facebook don't like that Clegg you big fucking corporate shit? Maybe you should rethink your world view then.

  • Seeing a computer on a street corner with a sign that says "starving AI"

  • by Alworx ( 885008 ) on Monday May 26, 2025 @05:28PM (#65405665) Homepage

    The business plan therefore is to grab content for free, digest it, then provide the resulting service for a fee?

    Or pay for the input content and charge more/a lot for the service?

    In other words, the business plan doesn't hold water so the only option is to lobby government and public opinion.

    Sad.

  • ...claimed a push for artist consent would "basically kill" the AI industry.

    That strikes me as a feature, not a bug.

    Furthermore, who do you think you are, that you imagine the Silly Valley cunts you just finished hanging with have some kind of moral right to do whatever the hell they want with artists' works? Un-fucking-believable.

    Can we break out the torches and pitchforks yet?

    • "Artists" I'd empathize with. But we're talking about media conglomerates who just hoard IP like the proverbial dragon sitting on its pile of golden treasure.

      I'm totally fine with slaying the dragon.

  • Obviously the dead baby industry is going to suffer if I am not allowed to go around murdering babies without permission. You'll kill the dead baby industry if you continue down this path of enforcing laws against murdering babies.

  • But first, go ask disney to use all their images for free to train your AI.
    Bring all your important personnel as well, and go unarmed.

  • ... I guess AI will have to stick to searching for a cure for cancer.

  • by Rosco P. Coltrane ( 209368 ) on Monday May 26, 2025 @06:15PM (#65405773)

    Before biologists figured out how to "revive" pluripotency in adult stem cells - sometime in the 90's if memory serves - only embryonic stem cells could be used for research on stem cell therapy. And of course, if any therapy was to be discovered, only embryonic stem cells could be used for that as well.

    I remember back then, researchers were crying out that the potential of stem cell therapies was so great that it was worth harvesting embryos just to get the cell. That was obviously something of an ethical issue.

    The pressure was huge to sidestep ethics and carry on with this wonderful new groundbreaking technology. Humanity couldn't pass up the opportunity to take advantage of this because of silly outdated ethical principles!

    Well, the law stood firm, and it forced the researchers to find another way to get pluripotent stem cells from adult cells - something which is now commonplace: nobody harvests embryos anymore.

    Same thing for AI: if it can't exist without stealing from everybody, then it shouldn't exist. Nobody should cave to the AI industry's demands, however much pressure they put on the legislator. It will force AI professionals to solve the theft problem and reinvent their industry to be compatible with common decency. Just like the stem cell industry did.

    Caving to the AI industry's demands is the lazy way out of the problem; it wouldn't foster innovation and would simply entrench theft.

  • So we must steal too?

    Though if I recall correctly, ChatGPT was first, and it is US-based.

  • The trained AI will be free, right?

  • I guess AI isn't compatible with IP and capitalism. Time to shut it down.

  • Copyright Act [wikipedia.org] of 1790 – established U.S. copyright with term of 14 years with 14-year renewal
    Copyright Act of 1831 – extended the term to 28 years with 14-year renewal
    Copyright Act of 1909 – extended term to 28 years with 28-year renewal
    Copyright Act of 1976 – extended term to either 75 years or the life of the author plus 50 years
    Copyright Term Extension Act of 1998 – extended terms to 95/120 years or life plus 70 years

    Now here we are and the business world is clamoring for AI. And a way to solve the bulk of the potential copyright claims is to reduce the term of copyright...

  • The life-plus-70-years people are battling the Transhumanist tech bros.

    It's a shame the small current creators will be swept up in this but two groups who are parasitic on society are going to war, so all I need to do is find some low-carb popcorn.

  • to get any work done - having to opt out all the time. Why - apart from the fact that his AI tech bros pay - should they be treated any differently than all the other people they claimed were thieves?
  • Literally, it's something this "industry" should have started with.
  • That would mean that I can use Veo without asking also.
  • Great! Now we know how to kill the AI industry. So let's do it!

  • Will kill the Criminal Defense Lawyer industry....

  • by Mysund ( 60792 ) on Monday May 26, 2025 @09:24PM (#65406223)

    We see the same in other industries... making murder illegal has almost killed the assassination industry.
