For performers, voice and likeness are the foundations of a finely cultivated personal brand, developed through years of dedication to craft. They are the two most essential tools SAG-AFTRA members use to make a living, which is why protecting members against the misuse of A.I. technology and providing enforceable guardrails on the use of digital voice and likeness replicas are among the union’s top legislative priorities.

Recently, SAG-AFTRA participated in the Senate Judiciary Committee’s Subcommittee on Intellectual Property hearing about the NO FAKES Act, supported Senate Majority Leader Chuck Schumer’s Senate A.I. Working Group Roadmap, and joined legislators in California and New York to advocate for a package of A.I. bills. All efforts are part of what SAG-AFTRA National Executive Director Duncan Crabtree-Ireland calls “a mosaic of protections” that will ensure individuals maintain their fundamental rights in this new era.

The Battle Continues
The need for such voice and likeness protections became especially clear in April, when Scarlett Johansson raised concerns that OpenAI’s personal assistant sounded “eerily similar” to her, even after she declined initial requests to license the use of her voice for the app. It was an unsettling situation that highlighted many apprehensions around the technology, its capabilities, and what boundaries tech companies are apparently willing to cross to attain their objectives. A recent victory for creatives came in March, when Tennessee Gov. Bill Lee signed the SAG-AFTRA-supported Ensuring Likeness Voice and Image Security Act into law. Known as the ELVIS Act, it defines and adds voice as a protected personal right. It’s the first legislation of its kind to focus on digital replica protections for recording artists and performers in the age of A.I.

On the federal level, the Nurture Originals, Foster Art and Keep Entertainment Safe Act — or NO FAKES Act — would prohibit the nonconsensual use of a digital replica of a voice or likeness in sound recordings and audiovisual works, and would create the first-ever intellectual property right in voice and likeness, allowing individuals to notify online platforms about replicas and demand takedowns. After almost a year of working with stakeholders, including substantive input from SAG-AFTRA, the official draft of NO FAKES was introduced in late July.

In his testimony delivered during the Senate Judiciary Committee’s Subcommittee on Intellectual Property about the NO FAKES Act, Crabtree-Ireland explained the urgent need for voice and image protections.

“Enshrining this protection as a federal intellectual property right will ensure our members, creative artists and, frankly, all of us are protected ... These rights should be transferable and descendible, just like any other intellectual property or any kind of property someone owns, with durational limitations on transfers during one’s lifetime to ensure that we don’t enter into an era of digital indentured servitude.”

If signed into law, the NO FAKES Act would establish a digital replica “right of publicity” at the federal level that applies to everyone. 

Take It Down Act 
In June, SAG-AFTRA celebrated the introduction of the bipartisan Take It Down Act, which would require internet sites to take down nonconsensual intimate images, including deepfake images, within 48 hours. The bill is co-sponsored by 11 other senators in addition to Sens. Ted Cruz and Amy Klobuchar, who introduced it.

Other bills making their way through the state and federal legislative systems can be found at sagaftra.org/gapp.

The COPIED Act
In July, Sens. Maria Cantwell, Marsha Blackburn and Martin Heinrich introduced SAG-AFTRA-supported legislation to combat A.I. deepfakes and put journalists, artists and songwriters back in control of their content. The Content Origin Protection and Integrity from Edited and Deepfaked Media Act, or COPIED Act, would set new federal transparency guidelines for marking, authenticating and detecting A.I.-generated content; protect journalists, actors and artists against A.I.-driven theft; and hold violators accountable for abuses.

“For SAG-AFTRA, protecting the ability of our members to control their images, likenesses and voices is paramount,” said Crabtree-Ireland. “Sen. Cantwell’s legislation would ensure the tools necessary to make the use of A.I. technology transparent and traceable to the point of origin and will make it possible for victims of the misuse of the technology to identify malicious parties and go after them.”

Other federal A.I. bills that SAG-AFTRA is supporting include the Preventing Deepfakes of Intimate Images Act, the A.I. Labeling Act and the Generative A.I. Copyright Act.

Closing the Loopholes
In a recent SAG-AFTRA podcast episode that addressed the ongoing work to protect the creative community from A.I., Crabtree-Ireland explained the current legislative reality.

“Broadly speaking, [no one] has a federal right to control the use of their image or likeness,” he said. “While there are sometimes state laws that give those protections, not every state has them — and even where they do exist, a lot of times there are significant limitations ... We really do need some sort of consistent federal protection for individuals’ rights.”

SAG-AFTRA’s Government Affairs & Public Policy team plays a vital role in developing and supporting public policy at city, state, federal and international levels. On the A.I. front, they advocate for legislation that requires consent and compensation for use of a person’s digital voice or likeness, protecting individuals — especially creatives — from having to compete against their own digital replica in the entertainment and media marketplace.

Another area of concern is the prospect of A.I. technology being used to manipulate digital replicas of performers after they die. If passed, California’s AB 1836 would prohibit the digital replication of deceased performers in audiovisual works and sound recordings without the consent of their estate.

Speaking at a July 2 California Senate Judiciary Committee hearing, SAG-AFTRA Vice President, Los Angeles Jodi Long said, “Protections need to exist for the deceased and their families. After all, if we’re indeed a free country, then we should be free to live and die without the fear of becoming someone else’s unpaid digital puppet.”

A.I. Roadmap
In May, SAG-AFTRA released a statement of support for Senate Majority Leader Schumer and the bipartisan A.I. working group that drafted the Senate roadmap for A.I., which lays out a number of policy priorities the group of senators believes merit consideration.

Areas the roadmap focuses on include “Ensuring enforcement of existing laws for A.I., including ways to address any gaps or unintended harmful bias; prioritizing the development of standards for testing to understand potential A.I. harms; and developing case-specific requirements for A.I. transparency and explainability.” The roadmap also focuses on “encouraging a conscientious consideration of the impact A.I. will have on our workforce, including the potential for job displacement and the need to upskill and retrain workers.”

Crabtree-Ireland lauded the roadmap while speaking on the May 29 episode of the SAG-AFTRA podcast.

“One of the things that really has been articulated in [the roadmap], and is so important, is that the goal should be for A.I. to supplement and not to supplant the workforce. And that can be done through a variety of tools, including using unions for help with training workers, helping make sure that worker-centric design principles are included in all A.I. development and design, and just generally including unions so that feedback is incorporated at early stages in the process.” He added, “I think there’s also very clear indication in this roadmap about privacy and liability, and that A.I. systems can’t just continue to be a black box where no one understands what the inputs were or how decisions are made by these systems, and making sure that there are real systems in place to protect against the kinds of risks that can come from A.I.”

What’s apparent is that no one piece of legislation will cover all the protections individuals need to ensure a safe and sustainable future, so it’s going to be an ongoing process to anticipate the ways the technology is evolving and how it will impact humanity and society.

SAG-AFTRA Applauds Department of Justice Warning
In late May, when the Department of Justice sent a warning to tech companies that they could face action from regulators if they don’t fairly compensate artists, entertainers and other creators for the use of their work, SAG-AFTRA commended the action, saying, “We encourage this kind of regulatory and legislative leadership as we aim to protect workers from the exploitative uses of these emerging technologies.”

Wins in New York
On June 7, thanks to SAG-AFTRA members’ calls and emails to New York legislators, a digital replica licensing bill drafted by SAG-AFTRA passed unanimously in the New York State Assembly. The bill, S.7676-B/A.8138-B, will protect SAG-AFTRA members by requiring employers to provide an opportunity for both informed consent and proper union or legal representation before the rights to the digital replication of voice or likeness in place of physical work can be licensed. Licenses for voice and likeness rights are sometimes buried within the fine print of contracts or terms of service, and performers may inadvertently grant these rights without knowing their significance. This legislation creates an important additional layer of protection.

Other Efforts
A.I. and Video Games
SAG-AFTRA’s panel, Copyright & A.I. Work, at the 2024 Video Game Bar Association Summit, held June 3 at Loyola Law School, gave Crabtree-Ireland a chance to advocate for video game performers while addressing the effect A.I. is having on video game developers, industry professionals and performers.

Protective A.I. guardrails for actors who work in video games remain a point of contention in the Interactive Media Agreement negotiations, which began in October 2022 and continued until last month’s strike. Other A.I.-related panels Crabtree-Ireland participated in included a U.S. Department of Justice and Stanford University co-hosted event about promoting competition in A.I., as well as a Vanderbilt University summit on music law and generative A.I. SAG-AFTRA Executive Vice President Linda Powell discussed the interactive negotiations and A.I.’s many implications for creatives during her keynote speech at an Art in the Age of A.I. symposium put on by Villa Albertine at the French Embassy.

She said A.I. represents “a turning point in our culture,” adding, “I think it’s important that we be participants in it and not passengers in it ... We need to make our voices known to the handful of people who are building and profiting off of this brave new world.”

Union Educates at Virtual A.I. Summit 
On July 22, Crabtree-Ireland spoke at the Summit on Artificial Intelligence. The panel he participated in, A.I. and the Crisis of Creative Rights and Disinformation: Deep Fakes, Ethics and the Law, addressed the ongoing fear that artificial intelligence poses an existential threat to creative professions.

Crabtree-Ireland joined Intel Labs Senior Staff Research Scientist Ilke Demir, Federal Election Commission Commissioner Ellen Weintraub and attorneys Rob Rosenberg and Lisa Oratz to discuss competing concerns and whether existing legal frameworks — such as right of publicity, copyright and regulations — are sufficient to address this powerful technology. At the second session, hosted by SAG-AFTRA, National Director, Entertainment Contracts Jessica Johnson and National Director, Contract Strategic Initiatives & Podcasts Sue-Anne Morrow explained the A.I. protections the union has achieved during contract negotiations and which state and federal laws are on the horizon.

Dynamic A.I. Audio Commercials Waiver
On the contractual front, SAG-AFTRA’s new Dynamic A.I. Audio Commercials Waiver, part of the Audio Commercials Contract, provides A.I.-specific contractual protections for members, requiring a performer’s informed consent for the creation of their digital voice replica and additional consent for the use of their digital voice replica in any ad. Performers will always have the ability to opt out of the use of their digital voice replica in an ad, and each time a new commercial is created using their digital voice replica, a performer must be compensated per the terms of the waiver. The waiver also requires producers to take reasonable steps to ensure the security of voice material and to prevent unauthorized use of actors’ voices by any third party, and it mandates the deletion of all copies of an actor’s voice at the end of the employment relationship.

This agreement furthers the union’s goal of ensuring that whenever A.I. is used in ways that impact SAG-AFTRA members, appropriate protections, informed consent and proper compensation are always required.

This item originally appeared in the summer 2024 issue of 'SAG-AFTRA' magazine.