AI Movies

Hollywood Already Uses Generative AI (And Is Hiding It) (vulture.com) 56

Major Hollywood studios are extensively using AI tools while avoiding public disclosure, according to industry sources interviewed by New York Magazine. Nearly 100 AI studios now operate in Hollywood, and every major studio is reportedly experimenting with generative AI despite legal uncertainties surrounding the copyright status of training data, the report said.

Lionsgate has partnered with AI company Runway to create a customized model trained on the studio's film archive, with executives planning to generate entire movie trailers from scripts before shooting begins. The collaboration allows the studio to potentially reduce production costs from $100 million to $50 million for certain projects.

Much of this usage is happening through unofficial channels: workers report pressure to use AI tools without formal studio approval, then "launder" the AI-generated content through human artists to obscure its origins.


Comments Filter:
  • by ebunga ( 95613 ) on Wednesday June 04, 2025 @11:34AM (#65427243)

    What is the deal with AI and dishonesty? Roughly 1/3 of all articles are shills hyping the latest trend (we're on to post-agentic AI now, yay), while 2/3 are about the harm of AI, and a good portion of that harm is dishonesty: the mass theft of data and intellectual property to train the beast, the lies about green energy and all those diesel generators out back, literal cheating in education, lawyers and policymakers getting slapped for not checking and removing all those fake sources in "their work", companies failing to pay their bills and folding up shop overnight, and then there's this.

    • by Zocalo ( 252965 ) on Wednesday June 04, 2025 @12:01PM (#65427319) Homepage

      ...and then there's this.

      "This" is a cast iron example of why everyone involved in AI - the content producers, AI companies, VCs backing them, policitians, and users - need to deal with the elephant in the room; copyright law was not designed for the digital age, and certainly wasn't designed for the wholesale ingestation and regurgitation of AI engines. That the media companies, usually the first to cry "foul" and demand outrageous amounts of damages because copyright, are themselves playing fast and loose with other's content while complaining about their own being used as training data more than proves the point it's way past its sell by date.

      While amended since, the Berne Convention dates from 1886. AI isn't a crisis for copyright; it's an opportunity to give it a thorough overhaul, make it fairer for all given how easy it now is to shift and share content, and make it generally fit for purpose for the 21st century and beyond. Fail to do so, and it's just a matter of time before the legal fallout (and damages) under the current system gives the lawyers on the winning sides of the inevitable disputes a whole new fleet of superyachts.

    • That's capitalism, late-stage capitalism. It's OK to lie, cheat, and steal as long as you are incorporated. It's also OK to be an illegal immigrant and take a heroic amount of drugs every day.
    • Re: (Score:1, Insightful)

      by Anonymous Coward

      2/3 are clickbait from 404media fearmongering that AI will kill us all, that we will all become unemployed, and that if AI can't kill us it will at least make us addicted to it, make us dumb, and encourage us to commit suicide. Slashdot should really stop using 404media as if it were a reputable source. The other 1/3 are articles from different sources, some more insightful than others.

    • by shanen ( 462549 )

      Should have been the FP rather than the AC brain fart that took FP.

      However, I can't tell from your post if you are mostly for or mostly against.

      I think my main take that seems to be related to your unclear take is that AI is not helping us become better people. I still dabble with the toys from time to time, but it used to feel like playing with a loaded gun and now it feels more like juggling hand grenades with loose pins.

      And as regards ye olde Turing Test, the AIs definitely lie bigger and better than many actual human beings. The main difference is that the humans usually have motivations and the AIs still don't. At least not that I've been able to detect yet.

      • by ebunga ( 95613 )

        And as regards ye olde Turing Test, the AIs definitely lie bigger and better than many actual human beings. The main difference is that the humans usually have motivations and the AIs still don't. At least not that I've been able to detect yet.

        The AIs aren't telling lies when they "hallucinate" as they're doing exactly what they're programmed to do. It's the humans that go around pretending that the gibberish generators are magical thinking machines that are telling the lies. And why wouldn't they? There are trillions of dollars on the line.

      • You have to be conscious to lie. A sign that points the wrong way isn't lying; it's providing false/faulty information.
        We may unintentionally anthropomorphize LLMs because we use natural language to control them. We are not communicating with a machine, we're controlling it with queries/prompts. It's not a conversation, because by definition a conversation is an exchange of thoughts, feelings, and concepts using a common language. It can't embody concepts due to a lack of consciousness. It's no more a conversa

  • by Anonymous Coward

    There's a built-in assumption here - that anyone *has* to disclose which tools they use. Nobody has to disclose anything.
    But the culture among grunt-level artists is that AI companies are evil and anyone who uses AI is morally questionable, and so anyone who doesn't come right out and say how they use AI is "hiding" it from the public.
    "People are angry on social media! It's the socially responsible thing to do!"
    Go have your Butlerian Jihad, you zealots, and get it over with already.

    • by DarkOx ( 621550 )

      Right, the proposition here is that there should be 'good money in commercial art.'

      What is technology, really, but the application of knowledge to replace manual labor with capital assets?

      Being anti-AI at this point is really just being anti-tech. Sorry, that is what it boils down to. You can justify it any way you want, but the objection is really no different from a farrier being upset about the development of rubber tires and the automobile.

      Really this boils down to very fundamental questions about 'wha

  • Wow ... (Score:2, Interesting)

    by Anonymous Coward

    Graphics and video tools have had AI functions for like 15 years, and depending on what you count as AI, for even longer. Did you ever think about how "smart selection" works? What is generative fill, if not generative AI? Just because a few really good models are hyped now, that doesn't mean that creative use of AI is new.

  • Really I think the biggest use for AI would be to create storyboards or early 'test screenings' before a lot of the expensive production is started. Nobody will publicly see any of this, so I fail to see how it would run afoul of copyright.
    • by Hadlock ( 143607 )

      The movie studios (particularly 900 lb gorilla behemoths like Disney, and whatever the name of the company that owns HBO is this week) already own the copyright on enough content that they don't need to train outside of their own copyright library. LLM summary of script -> script of trailer -> shot-by-shot description of trailer -> 2-5 second "video clips" -> human or machine stitches them together and throws on a backing track is definitely something they can do/train on using their existing library. Si
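      A minimal sketch of that staged pipeline, assuming stand-in stub functions for every step (no real studio tooling or vendor API is implied; all names here are hypothetical):

      from dataclasses import dataclass

      @dataclass
      class Shot:
          description: str   # e.g. "close-up, hero turns toward camera"
          seconds: float     # target clip length (2-5 s)

      def summarize(script: str) -> str:
          # stand-in for an LLM pass that condenses the full screenplay
          return script[:200]

      def trailer_beats(summary: str) -> str:
          # stand-in for an LLM pass that writes the trailer script
          return "BEATS: " + summary

      def shot_list(beats: str) -> list[Shot]:
          # stand-in for breaking the trailer script into a shot-by-shot plan
          return [Shot(f"shot {i}: {beats[:40]}", 3.0) for i in range(1, 6)]

      def generate_clip(shot: Shot) -> str:
          # stand-in for a text-to-video model; returns a fake clip filename
          return f"clip_{abs(hash(shot.description)) % 10000:04d}.mp4"

      def stitch(clips: list[str], track: str) -> str:
          # stand-in for a human or automated edit that assembles the trailer
          return "trailer.mp4 <- " + ", ".join(clips) + " + " + track

      def trailer_from_script(script: str) -> str:
          clips = [generate_clip(s) for s in shot_list(trailer_beats(summarize(script)))]
          return stitch(clips, "temp_score.wav")

      print(trailer_from_script("INT. WAREHOUSE - NIGHT. Our hero ..."))

      Swap each stub for a real model call and the overall structure stays the same: each stage only consumes the previous stage's output, which is why the whole thing can be trained and run against a single studio's back catalogue.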

  • by TurboStar ( 712836 ) on Wednesday June 04, 2025 @12:44PM (#65427437)

    I feel for the artists that are going to get hit by this, but no reasonable person would expect movies to come with a list of ingredients like a box of cereal. We don't expect disclosure of whether special effects are practical or CGI but now AI needs disclosure because why? It's just a new type of CGI.

    • by Aristos Mazer ( 181252 ) on Wednesday June 04, 2025 @12:56PM (#65427471)

      The argument is that how the training is done makes this new and different from CGI. LLM output isn't some clever programming someone wrote based on math they thought up. It is deep analysis of every available archive to reproduce mechanically what was done by hand. The argument is that LLMs learning from the available works is a violation of copyright when that LLM is then used to generate new work that displaces the artists whose work was used in the input. It's definitely new. It's definitely not a legal violation as law is currently written. It's definitely hurting existing artists. It's definitely opening up new ways of creating for non-artists. It's definitely got unanswered questions all over it. And we definitely need to have a discussion as a society and as a legal system about what we think is right for these tools. None of that was true of CGI. :-)

      • TFS says the studio is using its own back catalogue, unless they can somehow infringe on their own copyrights?

        • Part of the SAG-AFTRA strike was the argument that artists never signed away the rights to reproduce their voices and likenesses for eternity, even when they signed for a particular project (e.g. motion capture for a specific movie). That applies to FX artists as well. So even when these films were made "work for hire", was it really intended to be used like this? Not from the artists' POV, it definitely was not. None of them would have signed a contract that permitted this. And the argument is that whatever copy

    • You actually have this very backwards. Artists aren't negatively impacted by AI; they are embracing it. VFX is forever the lowest-margin bottom-feeder of the movie industry. They have shitty budgets, shitty timetables, and there are massive expectations on them. The industry has set them up for failure to the point where you can actively see in many movies precisely where the money ran out, with world-class VFX cast alongside horrendous rubbish, all caused by being out of time and out of money.

      Generative AI isn'

  • by ZipNada ( 10152669 ) on Wednesday June 04, 2025 @12:57PM (#65427473)

    The time of "major Hollywood studios" is visibly coming to an end. No need for massive resources to make a movie. Each of the production phases can be accomplished by one person working with an AI. A couple of writers working with an AI will come up with a plot and a screenplay. Someone else will use that to quickly generate one or more prototype storyboards the team can evaluate.

    No sound stage or on-location shooting required. The AI will motion-scan your actors in front of a green screen to get a library of relevant style and emotional expressions. The human talent will show up for a few days of in-person filming for critical scenes. Then the AI will generate your movie segment by segment over the course of a week or so. Viewings by test audiences will let you tweak the results for maximum satisfaction, and then it's off to the Oscars.

    • LOL, we are insanely far away from that reality. There's a big difference between someone cranking out a couple-minute short that is teetering on the edge of the uncanny valley and producing a major studio film. AI has a huge consistency problem. AI has an art-style problem - you can usually tell something is AI because... it actually all looks the same once you know what to look for. There's no variance, no creativity, and no real intelligence helping it along.

      • >> we are insanely far away from that reality

        Are you sure? There seems to have been a huge amount of progress just over the past few months.
        https://www.msn.com/en-us/news... [msn.com]
        "We've never seen anything like Veo 3 before. It's impressive. It's scary."

        "Veo 3 also produces audio and dialogue. It doesn't just offer photorealism, but fully realized soundscapes and conversations to go along with videos. It can also maintain consistent characters in different video clips, and users can fine-tune camera angles,

  • There'll have to be a law designating how AI can use copyrighted data. I feel like the time is right for AI to use all our output; the function of IP law (according to the Constitution) is so that the "useful arts will be advanced" (see reference below), since human creativity seems to have maxed out (I am being sarcastic). But no, seriously... why the fuck is every movie a sequel? Even politicians are the same recycled idiots.

    From the Constitution:
    "To promote the Progress of Science and useful Arts, by secu

    • Film making is about making money. Most movies flop (AKA go straight to DVD). However, sequels already have an audience who enjoyed the first installment, so people are far more likely to be willing to splash out on a second. Unfortunately this works - so we get sequels and remakes and prequels etc. etc.

    • The bad thing is that Congress, in its bought-and-paid-for workings, made the Constitution's intentions moot.

      "The Sonny Bono Copyright Term Extension Act â" also known as the Copyright Term Extension Act, Sonny Bono Act, or (derisively) the Mickey Mouse Protection Act â" extended copyright terms in the United States in 1998."

      Ref:

      https://en.wikipedia.org/wiki/... [wikipedia.org]
  • oh absolutely (Score:3, Interesting)

    by psycandy ( 5470982 ) on Wednesday June 04, 2025 @02:04PM (#65427629) Homepage
    I work in the art department in film, and for the most part gen AI has made life a lot easier for art directors, set dressers and so on... Where the old brief was maybe a sketch or storyboard, it's now something the director liked while he was mucking about and wants something 'just like that'. In a way it has created more work.
  • by Mononymous ( 6156676 ) on Wednesday June 04, 2025 @02:49PM (#65427753)

    The suits would be very upset about buying, shooting, and releasing screenplays that can't be copyrighted.

  • If the AIs can write films where the men are men, the women are women and not "strong women", the characters' sexualities aren't mentioned unless they're relevant, and there are no other unsubtle messages that the filmmakers obviously think are more important than telling the fucking story... then LET THE AIs RIP!

    Disney must die.
    • In particular, I don't mind "strong women" (as well as "strong men") as long as it makes sense in the story (some classic examples: Sarah Connor, Ellen Ripley).
      There must be a reason or trajectory for a character becoming strong/powerful/skillful/etc.; it's stupid to make them inexplicably strong just because. That's valid no matter the gender.
      There are a lot of "strong men" movies that suck as well, but "strong women" movies suck more because they're forcing that down our throats because of an agenda, it's not

  • Well, if it's good enough for law enforcement and the three-letter agencies to use 'parallel construction'...
  • The remnants can be mostly ignored. They are fiddling while Rome burns. People are leaving en masse; it is not a big secret. The L.A. fire was only the latest reason why.
  • by Mal-2 ( 675116 ) on Wednesday June 04, 2025 @08:11PM (#65428378) Homepage Journal

    Obviously special effects were bound to go in this direction, and that would almost certainly be legal with or without actor permission. Replacing a computer mildly attended by a human with another computer mildly attended by a (much lower-paid) human is so common as to attract practically no notice. Programming explosions was always a job with a shelf life. Either the production can afford real (if scaled-down) pyrotechnics and practical effects, or it's a no-budget indie production that would otherwise go with some stock library for the purpose. So that gets AI into machines and onto desktops very quietly and legitimately.

    Also, it's still acceptable to use AI to produce storyboard images and placeholder music and the like that are never going to see the light of day, right? I imagine the writers throw their scene into an AI and let it churn a few iterations. If none of them are even close to what they want, send it off to a sketch artist like always. Otherwise it may be faster and involve a lot less message-passing to just fake it themselves and explain/caption how it's wrong. They already do this when a scene changes after sketches have been made. Again this gets it into machines and onto desktops. It allows for a plausible-sounding excuse of "there aren't any clean systems, every editing rig uses AI for in-house purposes". Render rigs make half-decent AI rigs too, even if they're not designed for that purpose. The builds are very, very similar -- GPUs, RAM, storage, and to a lesser extent the CPU itself are all pushed to 100% at some point in both workflows. A pair of 48 GB RTX 4090s is great and all, but you need the bandwidth on the system side to feed them and to display/store the results.

    The questions start when the material designed for in-house use gets disseminated to the world, as it might be for a trailer of a movie still in early production. But if they haven't even hired a cast yet, they're not contractually obligated not to use something resembling a known actor -- although they may burn bridges if it ends up they want that person for the real deal. I suppose if they said "do it in Ghibli style" then nobody could claim to be fooled that it actually is Famous Actor.

  • Using these tools to make playbooks, spitball ideas, and transfer them from static images to demo reels would indeed be a cool way to quickly get something out there and see if there is appetite for a concept. If it helps them realize more quickly which ideas can spark and which cannot, I'm all for it. But alas, their goal will ultimately be cost cutting with zero revenue sharing.
