As of this writing, the Writers Guild of America has been on strike for more than four months. The Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) joined the writers in mid-July. A key concern for members of both unions is the rapid development of artificial intelligence and the implications of vague clauses appearing in their contracts. Left unchecked, AI-generated content could drastically reduce job opportunities and lead to untold lost income for all involved.
AI writing scripts
The more immediate threat is the use of large language models to write part or all of scripts, books and other print material.
Fraudsters are already using ChatGPT, a free service, to write content such as travel guidebooks, which they self-publish and dump onto Amazon. After they game several dozen fake reviews to boost a book's search results on the site, these titles can generate a nice payday. The accuracy of these guidebooks ranges from questionable to complete fiction. Still, buyers may not realize this until they're standing on a street in Barcelona, looking for a non-existent museum.
Large language models’ writing quality and accuracy degrade significantly with long-form content, such as scripts or even long blog posts. Yet everyone seems to agree that it’s only a matter of time before developers work out the kinks and AI is writing passable, though likely not award-winning, TV shows, films and novels.
The threat of “deepfakes”
You may have seen remarkably realistic videos of politicians making incendiary comments or celebrities’ faces superimposed onto different bodies. These so-called “deepfakes” have become alarmingly sophisticated in recent years.
This isn’t cutting-edge technology available only to high-end production companies. For only $20 a month, the app VoxBox, “the ultimate AI celebrity voice generator,” allows you to attach one of 3,200+ celebrity voices to any text you write. You can have President Biden recite the lyrics of a raunchy Cardi B song or Optimus Prime read Shakespeare.
It isn’t yet possible to produce a feature film from a scan of an actor’s face and body, but most observers believe we’ll get there soon enough. When that day arrives, a studio could theoretically hire an actor for a one-day job and then use that person’s voice and likeness to create endless new content, or even marketing campaigns the actor would never agree to do. This could go on for years, and the actor would still have only a single day’s pay to show for it.
AI law is still in its infancy
For the moment, plaintiffs usually settle AI-related lawsuits out of court, meaning there’s scant legal precedent for attorneys to work with. Historically, this kind of legal uncertainty has led to industry-wide catastrophes, such as the music-sharing firestorm of the early 2000s. Unsurprisingly, people in the entertainment industry would like to avoid a similar crisis.
Studios and unions will need to define issues like residual rights and compensation for the future use of someone’s likeness. Job security for set crews and post-production staff is another matter entirely. It will be a challenging negotiation, but that’s why we have (human) lawyers.