California Enacts Laws to Protect Actors from Unauthorized AI Replicas

September 20, 2024

California Governor Gavin Newsom has enacted two significant bills designed to protect actors and performers from unauthorized AI replicas of their likeness or voice. The first bill, AB 2602, renders unenforceable contract provisions that allow a digital replica to stand in for a performer's work unless the intended use is specifically described in the contract and the performer was represented by legal counsel or a labor union during negotiations. The second bill, AB 1836, requires entertainment employers to obtain consent from a deceased performer’s estate before using their digital likeness, closing loopholes in existing postmortem right of publicity law.

These legislative measures come amid rising concerns over the ethical use of AI in the entertainment industry. The controversy around AI digital replicas intensified last year during a four-month strike by SAG-AFTRA, the actors’ union, which underscored apprehensions about AI clones being used without consent. Supported by SAG-AFTRA and the California Labor Federation, these new laws reflect a growing consensus on the need for stronger protections for entertainment workers in the digital age. The evolving landscape of digital media has necessitated these steps to safeguard not just the artistic integrity of performances but also the personal rights of those behind them.

Industry Response to AI Protections

Governor Newsom highlighted the importance of these laws in ensuring the industry’s continued success while protecting workers’ rights and likenesses. Prominent voices in the industry describe this legislation as a crucial step toward balancing innovation with ethical standards. These actions align with broader efforts in California to regulate AI-generated content, such as new laws targeting the creation of political deepfakes ahead of the 2024 elections. The move emphasizes the state’s commitment to remaining at the forefront of both technological advancements and regulatory measures, ensuring that the rapid evolution of AI occurs within a framework that respects individual rights.

SAG-AFTRA and other labor organizations have applauded the bills, viewing them as a victory for performers who have long feared the implications of their likenesses being used without their knowledge or authorization. These laws are seen as essential tools in fighting unauthorized digital reproductions and ensuring that creators are fairly compensated for their contributions. Actors and performers now feel a sense of security in knowing that they have the backing of state law to protect their images and legacies in an increasingly digital world.

