ChatGPT Is Coming for Hollywood

The prospect of A.I. taking human jobs has escalated rapidly, so it’s no surprise that it’s caught the attention of Hollywood guilds, which are about to negotiate new overall deals with studios. Photo: Pool Benainous/Reglain/Gamma-Rapho via Getty Images
Eriq Gardner
February 6, 2023

Has David Zaslav used ChatGPT? I ask because the Warner Bros. Discovery boss, who everyone knows is out to save a few bucks, has come up a few times in my recent conversations with Hollywood dealmakers. They don’t think Zaz would hire a bot to replace salaried writers just yet, but as for a punch-up job, well, all bets are off.

After all, the prospect of A.I. taking human jobs has escalated rapidly from “Ha, ha, let me just get comfortable in my Matrix coffin” to “OK, so how much time do I have?” Suddenly, everyone I know is nervously assessing its capabilities. Can Stable Diffusion replace animators? What about the dozens of A.I. startups that can replicate the voices of famous actors? Can ChatGPT write a court brief? How about a legal newsletter?

We’ve already seen these technologies revive Val Kilmer’s voice in Top Gun: Maverick, simulate Bruce Willis in a Russian telecom advertisement, and write and direct short films, so it’s no surprise this has all caught the attention of the Hollywood guilds, which are about to negotiate new overall deals with the studios. From my conversations with insiders, it sounds like A.I. will be an important and contentious topic during these talks. 

SAG-AFTRA, the biggest labor union in Hollywood, has just taken the first shot, I’ve learned. On Jan. 28, the union’s national board unanimously agreed that “Global Rule One”—no member should work for an employer who hasn’t accepted the union’s basic minimum agreement—will now extend to the recreation of an actor’s voice or image. The board explicitly articulated its position that the right to simulate an actor’s voice, likeness, or performance using digital technology is a mandatory subject of bargaining.

On Friday, talent lawyers got a memo from SAG-AFTRA that spelled out what this means: Don’t assign likeness rights to anyone working outside the union purview. As for any language in a contract that purports to control the right to simulate an actor’s performance, that’s “void and unenforceable until the terms have been negotiated with the union.” This stance not only represents the first salvo of what’s sure to be a fascinating round of labor talks; it also foreshadows some epic court battles ahead. 


A.I. Zendaya?

For a taste of what’s coming, check out singer Rick Astley’s new lawsuit against rapper Yung Gravy, among others, for impersonating his voice on Never Gonna Give You Up for the creation of a new song, Betty (Get Money). The complaint doesn’t mention artificial intelligence, of course, but with talk of vocal recreation, it does tread on legal ground we are likely to see resurface as generative A.I. becomes prevalent. In short, the defendants licensed Astley’s composition but not the ability to sample the song’s 1987 recording. So they had some rights to Astley’s song—namely, to make a cover version. The question for the court is whether the musicians violated California law anyway by imitating Astley, instead of offering an original take on his music. Or does federal copyright law, which permits soundalikes, outweigh state law protecting one’s likeness? 

These are no longer just academic questions: Today’s A.I.s are frequently trained on copyrighted material, raising thorny legal questions about whether that’s fair use, whether their outputs are too similar to the source material, and even whether the results are copyrightable. The dealmakers I talk to are particularly worried about vague old contracts that might allow the studios to take the position that they already have rights to make A.I. recreations. No wonder SAG-AFTRA is using the phrase “truly informed consent.” Without it, authors and performers might find themselves in court, like Astley, arguing that it’s unfair to appropriate their identity or likeness in a way that falsely suggests their involvement, even if a producer has broad rights to existing material.

Of course, this isn’t the first time that new technology has created a legal mess in Hollywood. Early filmmakers saw their work re-broadcast on TV, and broadcast TV producers reckoned with the legality of cable re-runs. Later there were fights over video cassettes and DVDs, and, more recently, streaming. Each of these shifts sparked court battles and arbitrations over contracts that were written in less technologically advanced times. 

That’s precisely why many Hollywood contracts today claim the authority to exploit a work or performance “throughout the universe in perpetuity by means or methods now or hereafter known.” Does that authority include the ability to, say, train an A.I. on old movie and TV scripts? What about swapping out a cameo from an A-list actor with an A.I.-generated facsimile that can reprise their role instead? 

There are opportunities here for actors, too. If you’re Morgan Freeman, for instance, why not license your voice to every documentary filmmaker willing to pay your fee? Or if you’re Zendaya, considering a brief appearance in a new Marvel movie but stuck working on another film, suddenly your earning potential is unlimited. For a real-world example, check out musician Holly Herndon, who has released a vocal program called “Holly+” that allows others to easily “deepfake” her singing voice. These scenarios won’t be hypothetical for long.

For now, it’s all a legal gray area. Entertainment lawyers tell me they’ll first refer to old labor agreements to determine whether A.I. infringes their clients’ rights. Unfortunately, they add, when it comes to something like reshoots, it’s ambiguous whether a producer can simulate a performance and avoid having to pay an actor for extra working days. So will they explicitly address these possibilities in contract negotiations, like SAG-AFTRA is doing? “There’s precedent,” one lawyer told me. “We already do things like requiring approval over lookalikes in nude scenes.”

Then again, asking a studio for new rights always runs the risk that they refuse, and you may lose the upper hand in court as everyone figures out how the old contracts should be interpreted. As one savvy dealmaker told me, “I’m more than happy to let the guilds take the lead on this.”