It seems like everyone in media and entertainment is unhappy these days with A.I. programs feasting on their work, but few people are actually taking their outrage to court. While there’s a slew of putative class actions in progress (artists, coders, Sarah Silverman, etcetera), these can all be traced to a single San Francisco attorney, Joseph Saveri. Why aren’t movie studios suing? Or record labels? Or book publishers? In fact, when it comes to industry-driven legal actions led by prestigious law firms, there’s really only one that’s notable—Getty Images’ suit against Stability AI, filed in February, which could lay the groundwork for cases to come.
I’ve been investigating the perplexing lack of action, talking with insiders about their hesitancy and how tomorrow’s A.I. threat compares to, say, how Napster and YouTube remade the industry 20 years ago. Part of Hollywood’s reluctance to take on Silicon Valley rests on the hope that A.I. can be harnessed for positive creative purposes (and, yes, profitability). But it’s more than that. For valid reasons, insiders are apprehensive that I.P. lawsuits might not stand up in court.
While many seem to believe that ChatGPT, Bard, and the like are clearly violating copyright law by training on proprietary content (scripts, musical scores, still images, etcetera), very few experts I know share that certainty. On the contrary, academic literature and case law suggest that such training might count as fair use. Sure, it gets complicated when an A.I.’s output is overly similar to its inspiration, and thus presents direct market competition. (Consider what scholar Mark Lemley has to say about training a system to make a song in the style of Ariana Grande. Also, keep in mind the Supreme Court’s recent Warhol decision.) But the legal avenues look bumpy enough that even copyright hawks are proceeding cautiously here. As one seasoned industry figure told me, “I’m closely following these cases. I’ve got time, and when I do something, I don’t like to lose.”