SAG-AFTRA National Executive Director & Chief Negotiator Duncan Crabtree-Ireland testifies in support of the NO FAKES Act.
On April 30, the U.S. Senate Judiciary Subcommittee on Intellectual Property convened a hearing on the Nurture Originals, Foster Art and Keep Entertainment Safe, or NO FAKES, Act in Washington, D.C. If passed, the NO FAKES Act would prohibit the use of digital replicas of an individual's image, likeness or voice without their informed consent. Additionally, it would offer historic intellectual property protection at the federal level against the misappropriation of voice and likeness performances.
SAG-AFTRA National Executive Director & Chief Negotiator Duncan Crabtree-Ireland was among those to testify before the subcommittee in support of the proposal. In his statement, he noted an urgent need to safeguard artists’ images, likenesses and voices from the threat of generative artificial intelligence, and cited the proposal’s passage as reaffirming all citizens’ First Amendment rights to freedom of association and speech.
“For an artist, their image and likeness are the foundations of their performance, brand and identity developed over time through investment and hard work. SAG-AFTRA has long fought for right-of-publicity laws and voice and image protections. The exponential proliferation of artificial intelligence technologies — technologies which allow for rapid and realistic fakes of voices and likenesses and audiovisual works and sound recordings — makes this fight urgent for our members.
“Enshrining this protection as a federal intellectual property right will ensure our members, creative artists and frankly, all of us, are protected and that service providers [offer] the same protections to individuals’ images, likenesses and voices that they provide now for other intellectual property rights. These rights should be transferable and descendible, just like any other intellectual property or any kind of property someone owns,” said Crabtree-Ireland.
Other testimonials were provided by SAG-AFTRA member and singer-songwriter Tahliah “FKA twigs” Debrett Barnett; Warner Music Group Chief Executive Officer Robert Kyncl; University of San Diego School of Law professor Lisa P. Ramsey; Digital Media Association President & CEO Graham Davies; and Motion Picture Association Senior Vice President and Associate General Counsel, Law and Policy Ben Sheffner.
Following the group’s testimonies was a Q&A session. The NO FAKES Act is sponsored by Sens. Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN) and Thom Tillis (R-NC).
SAG-AFTRA’s support of the NO FAKES Act is the latest in its ongoing work to push for legislative protections against the misuse of A.I. at both the state and federal levels. Recently, the union pushed for the passage of Tennessee’s Ensuring Likeness Voice and Image Security, or “ELVIS,” Act, the first enacted legislation specifically designed to protect performers from the unauthorized use of their voices and likenesses in audiovisual works and sound recordings.
The union has also previously supported numerous state laws in California and New York that offer protections against deepfakes, including CA AB 602 and NY S 5959-D in 2020. In addition, the SAG-AFTRA TV/Theatrical, TV Animation and National Code of Fair Practice for Sound Recordings — or “Sound Recordings Code” — contracts include groundbreaking protections and provisions for members against nonconsensual uses of member voices and likenesses.
Below is the full transcript of Crabtree-Ireland’s testimony:
Thank you very much, Chairman Coons, Ranking Member Tillis, and the members of the Subcommittee on Intellectual Property. My name is Duncan Crabtree-Ireland. I'm the National Executive Director of SAG-AFTRA, the country's largest labor union for entertainment and media artists. And I'm here today to testify in support of the NO FAKES Act. Our members believe that AI technology, left unregulated, poses an existential threat to their ability to: one, require consent for the creative use of their digital representation; two, receive fair payment for use of their voice and likeness; and three, protect against having to compete against themselves, their own digital selves, in the marketplace.
I'm the chief negotiator for the union's contracts, including last year's historic agreement with the major entertainment studios, which was only finalized after the longest entertainment industry strike in over 40 years, a strike that lasted nearly four months. The strikes and the public's response to them highlighted that the entertainment industry and the broader public understand that AI poses real threats to them, and they fully support protections against those threats.
For an artist, their image and likeness are the foundations of their performance, brand and identity developed over time through investment and hard work. SAG-AFTRA has long fought for right-of-publicity laws and voice and image protections. The exponential proliferation of artificial intelligence technologies — technologies which allow for rapid and realistic fakes of voices and likenesses and audiovisual works and sound recordings — makes this fight urgent for our members.
Enshrining this protection as a federal intellectual property right will ensure our members, creative artists and, frankly, all of us are protected, and that service providers provide the same protections to individuals' images, likenesses and voices that they provide now for other intellectual property rights. These rights should be transferable and descendible, just like any other intellectual property or any kind of property someone owns, with durational limitations on transfers during one's lifetime to ensure that we don't enter into an era of digital indentured servitude, just as actress and SAG-AFTRA member Olivia de Havilland fought to establish the seven-year rule to end long-term abusive contracts in the old studio system.

Some will argue that there should be broad, categorical, First Amendment-based exemptions to any legislation protecting these important rights.
There are no stronger advocates for the First Amendment than our members. They rely on their First Amendment rights to tell the stories that artists in other countries are often too endangered to tell. However, the Supreme Court made clear more than half a century ago that the First Amendment does not require that the speech of the press, or any other media for that matter, be privileged over protection of the individual being depicted.
To the contrary, the courts have applied balancing tests to determine which right will prevail. These balancing tests are critical, and they are incorporated into the discussion draft of the NO FAKES Act. They ensure that the depicted individual is protected and rewarded for the time and effort put into cultivating their persona, while not unduly burdening the right of the press to report on matters of public interest or of the entertainment media to tell stories.
At the same time, these tests help ensure the depicted individual is not compelled to speak for the benefit of third parties who would misappropriate the value associated with the persona they have carefully crafted.
With new A.I. technologies that can now realistically depict an individual's voice or likeness with just a few seconds of audio or even a single photograph, and with the constantly evolving capabilities of these technologies, it is even more important that broad categorical exemptions be avoided and that the courts be empowered to balance the competing interests. It's also essential that action be taken to address these harms now. Our members, the public and our society are being impacted right now by the abuse of deepfake technology, and we must take timely action.
As just one of many examples of the abuse of deepfake technology: during the ratification campaign for our contract after the strike last year, an unknown party on the internet created an unauthorized deepfake video of me saying false things about our contract and urging members to vote against it, anathema to me as someone who had devoted more than a year of my life to a contract I deeply believe in.
There were no federal rights protecting me. No takedown rights. And tens of thousands of people were misled about something that really mattered to so many of us. It's neither necessary nor appropriate to wait for broader artificial intelligence regulation to be adopted. This narrow and technology-neutral approach can and should move forward expeditiously. The companies behind many of these technologies are asking for rules so they better understand the appropriate boundaries on their conduct.
The NO FAKES Act will provide them with important guidance, while helping to ensure individuals are protected from exploitation that puts their livelihoods and reputations at risk. Thank you again for this opportunity to speak today and I look forward to answering your questions.
Photo: SAG-AFTRA National Executive Director Duncan Crabtree-Ireland testifies at the U.S. Senate Judiciary Subcommittee on Intellectual Property hearing on the Nurture Originals, Foster Art and Keep Entertainment Safe, or NO FAKES, Act in Washington, D.C., on April 30.