What a major movie studio’s AI deal could mean for the future of Hollywood

Generative AI might save studios 'millions and millions of dollars,' but at what cost?
Lionsgate will let Runway AI train its model on John Wick and thousands of other titles. Credit: Lionsgate


When Hollywood’s actors took to the streets last year for a 118-day strike, many wielded signs reading “no digital clones,” “AI is soulless,” and “AI is not art.” These ticked-off thespians were expressing a sentiment shared by a growing number of writers, video game voice actors, and other creatives: generative AI tools, trained on their work, may threaten their jobs and shrink the entertainment industry. When the strike ended, actors were awarded new, hard-won protections against AI-generated clones. Since then, California has passed several landmark laws limiting certain generative AI use cases in Hollywood. But none of those efforts will outright stop major studios from using generative AI to try to cut costs on new movies and shows.

Lionsgate, the studio behind popular film series like The Hunger Games and John Wick, recently announced it’s letting a startup mine its catalog with the goal of building an AI model capable of generating storyboards and handling other pre- and post-production work. The deal is the first of its kind between a major studio and an AI maker, but it’s unlikely to be the last.

AI could save studios ‘millions and millions of dollars’ 

As part of its new deal, Lionsgate will let New York-based AI research firm Runway AI create a generative model trained on its corpus of 20,000 titles spanning 27 years. The custom generative AI model will then create “cinematic video,” which the companies say will initially be used as inspiration during storyboarding sessions or pre-production meetings. Lionsgate believes these tools will help filmmakers and other creatives “augment their work” to deliver “capital-efficient content creation opportunities.” AI, in other words, could help cobble together more movies and shows with lower investment costs. Eventually, according to statements Lionsgate Vice Chairman Michael Burns made in an interview with The Wall Street Journal, those same tools could be used to generate explosions or other background effects in films.

“We do a lot of action movies, so we blow a lot of things up, and that is one of the things Runway does,” Burns told the Journal. He went on to say he hoped the tool could save the studio “millions and millions of dollars.”

The announcement comes weeks after Lionsgate distanced itself from a trailer for its film Megalopolis, which reportedly featured inaccurate, AI-generated movie reviews. According to a Deadline report, the marketing consultant in charge of the trailer’s materials has since been removed from the marketing team following public backlash.

For now at least, it looks like this particular model will steer clear of generating AI “clones” or replicas of actors. A person with knowledge of the agreement told Popular Science the Runway model won’t be used to generate new AI characters or replicate existing actors. Instead, it will primarily be used as a tool to enhance and augment existing projects, the person said.

But even if Lionsgate avoids using generative AI to create digital actors, there’s no guarantee other studios will take a similar approach. Critics of AI’s impact on the film industry, including many background actors, worry studios could simply replace them with AI doppelgangers, making their jobs redundant.

SAG-AFTRA, the union representing Hollywood actors, recently added language to its bargaining agreement requiring producers to explain to actors how a digital replica of them would be used, and to obtain their consent, before including the replica in a show or film. The agreement was widely viewed as a win, but not all actors walked away satisfied. Some balked at their work ever being used to train an AI model. The Writers Guild of America (WGA), a separate union representing many Hollywood screenwriters, reached its own agreement clarifying that writing generated by AI cannot be considered “literary material.”

“We view AI as a great tool for augmenting, enhancing and supplementing our current operations,” Burns said in a written statement.  

Runway, meanwhile, is currently fighting a class-action copyright lawsuit brought by multiple visual artists who allege the company trained its models on their work without permission. Runway has filed motions to dismiss that case.

Hollywood deepfake laws won’t stop studios from partnering with AI firms

The Lionsgate agreement comes less than one day after California Governor Gavin Newsom signed two bills into law setting guardrails in place over how Hollywood can use certain generative AI features. The first bill, AB2602, prevents employers from using a digital replica of a performer in a project instead of the real person unless that person consents and knows how the replica will be used. AB1836, meanwhile, makes clear that studios and other entertainment employers need consent from a deceased performer’s estate before they can use an AI replica of that performer. SAG-AFTRA sponsored both bills.

None of the provisions in the new Hollywood AI laws, however, appear to prevent Lionsgate or other studios from letting outside companies use their content to train models. So long as companies steer clear of using the models to generate AI versions of actors, existing laws don’t prevent studios from using AI-generated videos to shape ideas or even to serve as effects or backgrounds in a production.

Generative AI in movie production could become much more common 

The agreement between Lionsgate and Runway could serve as a template for others to follow. Disney has reportedly set up a task force to study how generative AI can be used across its entertainment offerings. Paramount, according to the Journal, has also been in discussions with generative AI companies. By training new models exclusively on a studio’s catalog, AI companies like Runway can potentially avoid opening themselves up to more copyright lawsuits from creators who say their works were scraped against their will. Other, less tailored generative AI models that scrape large swaths of the open internet, like those offered by OpenAI and Stability AI, are currently operating in legal gray areas as related copyright suits wind their way through the courts. Studios and AI makers can mostly sidestep those headaches by simply limiting an AI’s training set to material the studio already owns.

And while protections are now in place regarding AI replicas, little is stopping future moviemakers from creating entirely new-looking “synthetic performers” pieced together from training data. In that case, an AI could compile data from a studio’s catalog (including actors’ performances) and use it as the foundation to generate an AI character for a film. Some, like prominent actor Joseph Gordon-Levitt, argue they should be compensated when their work is included in the training data used to build a synthetic performer.

“AI can’t do our jobs yet, but it might be able to soon,” Gordon-Levitt wrote in an op-ed for The Washington Post. “And people whose jobs are threatened by AI will be the same people who produced the data used to train it.”
