NO FAKES Act Introduced in Senate

By Jennifer E. Rothman
September 9, 2024

On July 31st, right before the summer recess, a bipartisan group of Senators introduced the NO FAKES Act, which has been substantially revised and improved from the previously circulated discussion draft.

The bill seeks to address concerns over the use of unauthorized digital replicas of real people in the wake of improved and widely available AI technology. The new version has gained support across the entertainment industry, bringing together SAG-AFTRA, the Recording Industry Association of America, and the Motion Picture Association. The Senate released both a one-pager and a section-by-section summary along with the draft bill.

Overview

The bill targets digital replicas rather than creating a broader federal right of publicity. This more modest approach has advantages, but it also has the downside of adding yet another layer of potentially conflicting rights over a person’s voice and likeness to the current “identity thicket.” Many state and federal laws already extend protection against unauthorized uses of a person’s voice or likeness, including when used in computer-generated digital replicas. Still, having a federal law targeting digital replicas may make suing easier for (some) plaintiffs.

The bill importantly bars the assignment of digital replica rights while a person is living and limits the scope of licenses. Concerningly, though, it still risks a person’s loss of control over their own voice and likeness because the licensing it permits is overly broad. The bill also may exacerbate the use of digital replicas that deceive the public. Concerns about the impact on free speech are mitigated by broad exclusions from liability and the requirement of actual knowledge. The bill provides postmortem rights and establishes both a registration system and a notice and takedown process.

The bill is lengthy, running 28 pages. Below I provide some highlights, as well as some concerns and suggestions for future revisions.

New Federal Digital Replica Right

The bill would create a new federal “intellectual property” right over a person’s digital replica that can be enforced through a civil action.

A “digital replica” is defined as a “newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual.” This “representation” must be “embodied in a sound recording, image, or audiovisual work.” This does not include a representation in which the “actual individual” did “perform or appear,” unless the “fundamental character of the performance or appearance has been materially altered.” Importantly, especially for the film and music industries, “sampl[ing], remixing, mastering, or digital remastering of a sound recording or audiovisual work” when “authorized by the copyright holder” are expressly excluded from the definition of a digital replica. The “material alter[ation]” language likely limits reuses of performances in new derivative works beyond these explicit exclusions, but the scope of the limit is uncertain.

An “individual” is limited to a natural person but includes the “dead.”

The bill provides each individual or a different “rights holder” the “right to authorize the use of the voice or visual likeness of the individual in a digital replica.” The right is violated by the “display, copy made, [or] transmission” of the digital replica, or its being made available. Each instance of those acts produces a violation unless an online service has taken reasonable measures to remove or disable access to the digital replica.

The bill provides $5,000 in statutory damages per violation for individual defendants and online service providers but raises the amount to $25,000 for entities (that are not online services). Actual damages and profits attributable to the unauthorized uses are recoverable. Injunctive relief is available, as well as punitive damages for willful acts. Prevailing plaintiffs can recover attorney’s fees; prevailing defendants can as well, but only if the action was brought in bad faith.

Scope of Liability

The bill extends liability for producing, publishing, reproducing, displaying, distributing, transmitting, or “otherwise making available to the public” a digital replica. However, this liability is limited by the requirement that the defendant have “actual knowledge” that the use is of an unauthorized digital replica. Willfully avoiding actual knowledge constitutes actual knowledge. The actual knowledge standard is an important limit on an otherwise very broad liability provision.

Future versions of the bill should impose some affirmative obligation on products and services that enable the creation or dissemination of digital replicas to regulate the use of unauthorized digital replicas. Without such pressure to find technological mechanisms to stop the creation and spread of unauthorized digital replicas, the law may have a limited effect even with the provided notice and takedown process.

Claimholders

The digital replica right can be enforced by a variety of people and entities. The bill defines a “right holder” as “the individual whose voice or likeness is at issue and any other person that has acquired, through a license, inheritance, or otherwise, the right to authorize the use of such voice or visual likeness in a digital replica.” (emphasis added).

The bill provides that actions may be brought by a right holder, a parent or guardian of a minor, anyone who controls the voice or likeness of a “right holder” or deceased individual, and anyone holding a “personal services contract” with a sound recording artist.

This means that there may be multiple potential claimants when a digital replica is created or used by a potential defendant. This will add to the identity thicket that I have described, layering additional overlapping rights onto a person’s name, likeness, and voice. See Jennifer E. Rothman, Navigating the Identity Thicket, 135 Harvard L. Rev. 1271 (2022). The bill doesn’t clarify whether a rights holder could sue the identity-holder for using the individual’s own digital replica, nor what to do if there is a conflict between publicity rights, which encompass the rights to a person’s voice or likeness, and the digital replica rights’ holder(s).

Because the bill allows someone other than the identity-holder to authorize use of the person’s likeness and voice in a digital replica, it permits uses of a person’s identity in ways that could deceive the public into thinking computer-generated performances are authentic.

Authorizes Use of a Person’s Digital Replica without Explicit Approval or Disclosure that the Performance is Fake

Although the bill thankfully does not allow assignments of the digital replica right “during the life of the individual,” the expansive allowance of licensing of the right works at cross-purposes with the legislation’s objectives. While a big improvement over the House’s introduced version of the No AI FRAUD bill and the previous discussion draft of this bill, the introduced version of the NO FAKES Act still would allow broad licensing that could exacerbate the deception of the public and dangerously limit a person’s control over their own voice and likeness.

The bill limits licenses for digital replica rights to no more than “10 years” and requires that they include a “reasonably specific description of the intended uses of the applicable digital replica.” These agreements must be in writing and signed. These are big steps forward from the initial licensing provision, which had no limits on scope or duration whatsoever. But the bill still allows licenses to be far too broad, and these limiting provisions are far less protective than they first seem.

The bill allows an “authorized representative” to sign licensing agreements instead of the person themselves. This is a very concerning provision. Young, aspiring performers and student-athletes may “authorize” someone such as an agent, manager, or relative to sign these agreements for them, and to the extent that state publicity laws are interpreted (albeit, I think, incorrectly) to be transferable, these agents, managers, or others could authorize these digital replicas without the person even seeing the agreement. See Jennifer E. Rothman, The Inalienable Right of Publicity, 100 Georgetown L.J. 185 (2012); see also Jennifer E. Rothman, The Right of Publicity: Privacy Reimagined for a Public World (Harvard Univ. Press 2018).

Federal legislation has the potential to fix this problem; because this bill does not do so, it is particularly worrisome. It allows someone other than the individual to sign away the rights to the person’s own voice and likeness (at least in the context of digital replicas) with no ongoing oversight over what is done with them for the ten-year period. Furthermore, there is no statutory limit on the reuse of works created during this time period, meaning that the uses could extend far beyond ten years.

The bill also eliminates the need to have an individual sign any licensing agreement, or to have uses described with any specificity, if the person is covered by a collective bargaining agreement that covers digital replica use. No matter how well-meaning a union is, a person’s digital replica should not be able to be licensed for 10 years without the individual having specific, individual control over to whom it is licensed and how the replica is being used.

Even in the best of circumstances, ten years is a long time to lose control over one’s digital replica. The potential scope of the restrictions on the individual identity-holder during this period is unclear because it is uncertain how broadly this digital replica right will be interpreted by courts and how it will intersect with state right of publicity and privacy laws that give individuals control over their likeness and voice. For example, will a performer have the freedom to take movie roles or make a sound recording, or will this infringe a digital replica right that might be held (for 10 years) by someone else? Even if an actor could do such an actual performance without infringing the right, would it infringe the right if the filmmakers use visual effects that involve the creation or use of a digital replica of the performer?

The harm that flows in the wake of a possible ten-year loss of control and oversight over a person’s digital replica will turn in part on what it means for the licensing agreement to make “reasonably specific” the uses it covers. Is a license to use a person’s digital replica in a series of short films produced by Company A sufficiently specific? Such a general license would allow a person’s digital replica to be shown in pornographic contexts, saying and doing things the person never agreed to do. Similarly, is it “reasonably specific” to authorize an AI-interactive digital replica in the form of a chatbot that could say and/or be shown doing anything? Even limiting a license to uses of digital replicas in advertisements for soda or sneakers could be too broad. A Penn student from Ukraine was recently turned into a digital replica without her consent, and this “clone” was then used to advertise candy and also to voice support for Putin and Russia. If this student had signed an agreement to allow her digital replica to appear in advertisements for candy, there would still need to be some ongoing oversight and agreement as to what her digital replica will say and be shown doing in such advertisements. It is also unclear whether the “writing and signed” requirement could be met by various click-through agreements for online providers, including social media companies.

Given the gravity of the harms that flow from such broad licenses of digital replicas of real people, greater clarity on what is required to be “reasonably specific” should be added, and “authorized representatives” should not be able to sign away a person’s digital replica rights for 10 years. Only the individual themselves should be able to license their identity, and licenses should be for specific, known, and identified uses rather than broad categories of uses. There should also be public disclosure requirements if a performance appears authentic but is not. If you are talking to a Charlie Puth chatbot, you probably know that the “speaker” is not really Charlie Puth. But if you see a digital replica that looks like Puth endorsing a political candidate whom Puth in reality finds hateful, you might well believe the endorsement is authentic. If Puth signed a terrible initial deal with his first manager (an “authorized representative”), and the manager licensed his digital replica rights for use in advertising and then authorized the use, the use might not violate the proposed law and could instead be protected by this federal law, even though it would dupe the public and violate Puth’s own autonomy, dignity, and potentially his future market value.

The bill does a somewhat better job of protecting minors by limiting licenses to 5 years, having any licenses terminate when the minor turns 18, and requiring that a court approve any agreement for uses of a minor's digital replica. These requirements, however, are waived if the minor is covered by a collective bargaining agreement that addresses digital replicas. And most importantly, minors should not be digitally animated saying and doing things they didn’t do and don’t agree with even if their parents or managers authorize it. Ideally, any future revisions will require a child, who is old enough to do so, to agree to the specific uses at issue.

Postmortem Digital Replica Right

The bill provides that the digital replica right exists even for dead people. These postmortem rights have an initial term of 10 years after death, but if commercially exploited, they can be renewed for five-year terms up to 70 years after death. This potential term appears tied to the copyright term, even though the basis for such rights is quite different. It is not clear why the longer term of protection should be provided to those who wish to commercialize these rights, while providing less protection to those who wish to limit the commercialization of their deceased loved ones. The postmortem rights are capable of being conveyed and held by anyone or any entity, including those with no connection to the deceased person.

This approach to extending postmortem rights is “off-kilter.” As Anita Allen and I have written in our recent article, Postmortem Privacy (forthcoming in the Michigan Law Review (2024)), there is no justification for strangers to commercially profit from the deceased, nor is there a public interest in incentivizing the commercialization of the dead. Instead, postmortem provisions should focus on the preferences of the deceased and the well-being of their relatives, and should not vest in unrelated third parties.

The bill also does not address the tax problem of extending such digital replica rights into the afterlife. This means that heirs could face a giant tax bill as a result of this legislation and surviving relatives and other heirs could be forced to commercialize the deceased person’s identity to pay it off even if that is not what they (or the deceased) would want.

The postmortem provision sets up a registration system for postmortem digital replica rights (required once renewed). This will be administered through the U.S. Copyright Office. This could help those who wish to use a deceased person’s digital replica to track down the rights holders who could grant permission to do so.

Preemption

The bill would preempt state claims that protect “an individual’s voice and visual likeness rights in connection with a digital replica in an expressive work.” This is a disappointingly limited preemption provision, made even more limited because it does not apply to state laws passed prior to 2025, to laws related to elections, or to laws that restrict products or services that enable the creation of digital replicas. A broader preemption provision would be helpful to address the conflicting state digital replica laws that are already being passed. See, e.g., The ELVIS Act (signed into law March 21, 2024).

Exclusions

The scope of liability is limited by significant exclusions. There is no liability if the use of a digital replica is in a “bona fide news, public affairs, or sports broadcast or account” as long as the use is “materially relevant.” Hopefully, the material relevance provision would not be satisfied by deceptive uses in such broadcasts or accounts. Notably, the exclusion is not required by the First Amendment. One of the few certain things we know about the interaction of right of publicity laws and the First Amendment is that broadcasting a person’s entire performance is not immunized simply because it appears on the news. See Zacchini v. Scripps-Howard Broadcasting, 433 U.S. 562 (1977); see also Robert C. Post & Jennifer E. Rothman, The First Amendment and the Right(s) of Publicity, 130 Yale L.J. 86 (2020).

There is no liability for uses of digital replicas in a “documentary or in a historical or biographical manner, including some degree of fictionalization.” This exclusion is limited if the use creates “the false impression that the work is an authentic” one in “which the individual participated,” or if “the digital replica is embodied in a musical sound recording” synched to an audiovisual work. The exclusion notes that this exception to its safe harbor doesn’t apply if the First Amendment prohibits it, which seems like a superfluous point to explicitly include in a bill. Additionally, the requirement that the use not give a “false impression” of authenticity should apply to all of the exclusions, particularly uses in news.

The bill also excludes from liability “bona fide commentary, criticism, scholarship, satire, or parody,” “fleeting or negligible” uses, and uses in advertisements for any of the excluded uses. None of these exclusions apply if the use “depict[s] sexually explicit conduct.” The use of disclaimers is also expressly not a defense to violations of the digital replica right.

Safe Harbors and Notice and Takedown Procedure

The bill provides immunity from secondary liability for using a digital replica for “manufacturing, importing, offering to the public, providing, or otherwise distributing a product or service” unless the product or service is “primarily designed to produce” “unauthorized digital replicas.” The immunity also does not apply if there is limited commercial value to the product/service other than producing unauthorized digital replicas, or if the product/service is promoted for creating unauthorized digital replicas.

The bill exempts online services from liability for referring or linking to locations with unauthorized digital replicas—but requires that material be taken down once a service is on notice. The bill also exempts online services from liability for hosting user-uploaded material so long as they take the material down once on notice. The bill specifies some additional provisions related to agents for such notices and requirements of the notification, as well as penalties for sending false or deceptive takedown notices. There is also a put-back process that involves filing a lawsuit within 14 days after receiving a notice, which is a short time frame for ordinary people.

* * *

Overall, the bill is a major improvement over the discussion draft and a similar House bill introduced earlier in the year. But it still needs revisions to avoid worsening rather than improving matters. Hopefully, the bill will continue to be improved as Congress further considers the proposed legislation.