There is no question that generative artificial intelligence (GAI) is changing the way businesses develop code, visual art, music, and other creative works. But, as we have noted here, growing pains persist as intellectual property, security, and privacy concerns come under close scrutiny. One such growing pain is the ability of bad actors to use GAI to imitate an individual’s identity for their own ends.
Of course, there is recourse: the right of publicity. Most states have codified the right of publicity to protect individuals against the commercial use and replication of their name, image, likeness, and other elements of their identity, except where the replication is protected by the First Amendment. In other words, works replicating an individual’s identity that are informative, satirical, newsworthy, or otherwise artistic are often protected under the First Amendment regardless of what a state’s law says. This is why your news channel of choice can use public images and voice clips of your favorite musician in its reporting or public commentary on controversial matters. While many states extend the right of publicity to all individuals, some recognize it only for public figures and/or celebrities. Even in states that do recognize a right of publicity for all individuals, the burden is rather high when the person asserting the right is not a public figure.
As it stands, a federal right of publicity does not exist in the United States, but GAI’s ability to make imitating someone’s voice and visual likeness easier might have been the push Congress needed to act to create this right. The Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act is a bill authored by Senators Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC) that would provide all individuals with the right to authorize (or refuse) digital replications of their image, voice, and visual likeness, and would hold individuals, companies, and platforms liable for their part in enabling unauthorized digital replications in an audiovisual work or sound recording.
While this draft bill might “walk and talk” like a right of publicity proposal, it is not quite that. State right of publicity laws are rooted in privacy concerns invoked when an individual’s identity is misappropriated for commercial purposes. The NO FAKES Act instead frames the right to use an individual’s identity as a type of property right, giving the individual certain economic control over their identity. An important aspect of the right of publicity is its First Amendment exclusions, which preserve important forms of free expression and are particularly significant for businesses that rely on creative works. For this reason, state right of publicity laws include detailed exclusion provisions shaped by long-standing experience and First Amendment case law. While the NO FAKES Act provides general First Amendment exclusions for certain “digital replicas” in the bill’s draft language, it lacks the specific protections for expressive works found in parallel state laws, such as New York’s, outlined here.
In the age of new AI tools, ACT | The App Association encourages the U.S. government to provide stronger federal guidance on existing laws and to avoid premature and incomplete legislation. The NO FAKES Act is a significant draft bill that would create the first federal right of publicity in an attempt to protect individuals from the dark side of AI. It is our hope that Congress continues to develop the bill to ensure that it does not allow individuals to improperly enforce a right of publicity against lawful creative and inventive digital works that benefit the public interest. While the NO FAKES Act is far from perfect, we are encouraged by its potential to harmonize fragmented state right of publicity laws.