Revised No FAKES Act Still Poses Danger of Our Losing Control of Our Digital Selves

By Jennifer E. Rothman
June 4, 2025

On May 21st, the Senate Judiciary Committee held a hearing to consider “The Good, the Bad, and the Ugly” of “AI-Generated Deepfakes.” In addition to a broad-based conversation about the dangers and opportunities presented by generative AI technology that can deceptively replicate a person’s voice, likeness, and performances, the hearing considered a revised version of the No FAKES Act that was introduced in April in both the House and Senate (S. 1367; H.R. 2794).

The revised bill has ballooned even further in size—now running 39 pages. The bill has grown in its effort to satisfy the interests of Google/YouTube and other big tech companies, but it still leaves largely unaddressed the danger the bill poses for individuals who may lose control over their digital replicas if it were to pass in its current form. The expanded preemption provision also raises a host of uncertainties. Hopefully, as this bill continues to be revised, these concerns will be addressed.

The bill ostensibly seeks to protect performers’ livelihoods and the public from being deceived by unauthorized digital replicas by creating a federal right in such replicas. Unfortunately, as drafted, the bill still works at odds with these objectives and could make things worse than the status quo by erecting a federal law that would legitimize deceptive uses of digital replicas rather than appropriately regulate them. The current iteration of the bill protects record labels, large tech companies, the movie industry, and those who seek to profit from and control dead celebrities, but it gives insufficient protection to the real human beings at risk of losing control over our digital replicas and the public who may be deceived by such uses.

The proposed legislation avoids adopting a federal right of publicity or preempting state laws, and therefore, if passed, it would add yet another layer to what I have called the “identity thicket,” in which multiple laws and entitlements conflict over who has control over a person’s name, voice, and likeness, including over digital replicas.

The main additions to the new draft—more than ten pages—focus on setting up a more elaborate immunity and notice-and-takedown process that will largely insulate the likes of Google/YouTube from liability and encourage them to take down material, and that will also protect AI software designers. My main focus is not on these provisions or the speech-related exclusions because I know that there are already many parties well representing these issues before Congress. Instead, I want to focus on concerns that are not as likely to be highlighted by the big players involved in drafting and pushing this legislation.

I will therefore highlight a few of my key concerns, many of which remain from prior versions, in the hopes that they can inform some much-needed revisions if this continues to move forward.

Continued Danger of Losing Control Over One’s Digital Replica

Although the bill thankfully prohibits the assignment of a person’s digital replica while the person is alive, the bill still allows for long-term and broad licensing of a person’s digital replica, which undermines protections for those depicted and the public from deception.

Although the licensing of a person’s replica is restricted to ten-year terms, the bill allows for the continued use after this time of replicas and derivatives created during the ten-year licensing period, which makes this durational limitation less meaningful than it initially appears. It also raises the question of whether future digital replicas could infringe copyrights in earlier replicas or displays of such replicas. There remain numerous open questions about how to treat digital replicas within the copyright system. My 2024 Donald C. Brace Lecture, Copyrighting People, addresses some of these questions in more depth.

The bill requires that licenses set forth a “reasonably specific description” of the uses to be made, but it is not clear how courts would interpret this requirement if the bill passes, and future revisions should provide greater clarity on this point. Consider, for example, whether describing the use of a person’s digital replica as being “in audio-visual works” is “reasonably specific.” Similarly, would uses in promotions for a particular brand of soda or apparel be reasonably specific? If these two examples count as reasonably specific—which they should not—the limitation is essentially useless. A person’s replica could appear under such licenses in pornographic contexts, saying and doing things that the person had no awareness they were authorizing except in the most general terms. Such an outcome would work at cross-purposes with the stated objectives of protecting individuals from being exploited by AI technology and would worsen rather than ameliorate the deception of the public.

These concerns are exacerbated by the allowance of digital replica licenses agreed to not by the person themselves but by “authorized representatives.” The bill allows “authorized representatives” to license a person’s digital replica without the knowledge of the individual. This is a chilling possibility, especially given the reality that many individuals, particularly younger student-athletes, singers, models, and actors, have signed overreaching agreements related to the management of their name, image, voice, and likeness rights. Revisions to the bill should limit approvals to those given by the person themselves and require greater detail about what sorts of uses will be made under the license.

Requiring legal representation to enter digital replica licensing agreements would provide even greater protection. Without a requirement of legal representation, there is a risk that ordinary people will unwittingly agree online or through other agreements to broad licenses for uses of their digital replicas or voice clones. More should be done in future revisions to protect against these dangers.

The bill does potentially allow for greater protection of union members if a collective bargaining agreement applies. This provision is primarily focused on actors who are part of SAG-AFTRA, but it will not provide sufficient protection for nonunion members, or for SAG-AFTRA members who may be stuck in agreements they signed before becoming union members or who work with nonsignatories. In addition, the bargaining landscape in the entertainment industry is likely to be much more challenging in the years to come, and the current contract is already due to be renegotiated next year.

The bill does provide better protection to minors by requiring court approval of licenses involving minors. However, given the broad provision about representatives being able to authorize digital replicas, it leaves open the possibility that minors will be bound after 18 by broader contracts entered into by their parents or guardians when they were minors. Minors will also have the same issue that those over 18 would have with regard to the reuse of digital replicas created during the licensing period.

As drafted, the No FAKES Act could even undo some of the protections provided in the Take it Down Act, signed into law in May by President Trump. The Take it Down Act importantly facilitates the removal from online platforms of unauthorized intimate visual depictions, including AI-generated ones. But the No FAKES Act would permit broad licenses and authorized representatives to authorize the dissemination of such intimate images without the specific knowledge and approval of the underlying person—such uses would be deemed to have been done with authorization. Future revisions to the No FAKES Act should clarify that specific consent for the dissemination of digital replicas (particularly in the intimate image context) must stem from the individual, not from a representative or from a broad license that has not specified such uses with particularity.

Preemption Provision Raises a Host of Questions

The somewhat revised preemption provision creates significant confusion and may well be unconstitutional as drafted. The provision excludes from preemption any state statutes or common law in existence as of January 2, 2025 “regarding a digital replica” or the production or offering of a service related to the production or dissemination of digital replicas. What is meant by the language “regarding a digital replica” is unclear, which creates a significant challenge. Lots of state laws cover digital replicas even if they don’t use the magic phrase and even if they were passed long before we were talking about generative AI. For example, many state intimate image laws and longstanding publicity and privacy common law and statutes cover (or will be held to cover) digital replicas but were not passed or recognized specifically to do so. Do these count as “regarding a digital replica” for purposes of the statute’s preemption provision? It’s not clear.

It is also not clear what happens if a longstanding statute is amended to more specifically address digital replicas but after the January 2nd cut-off. Similarly, it’s not clear what happens if a jurisdiction recognizes a common law right, for example of privacy or publicity, after January 2nd. In such instances, the analysis is usually based on a conclusion that the right always existed but hadn’t been recognized yet or, alternatively, that the claim is an appropriate evolution of preexisting common law. Would either, both, or neither of these common law determinations be preempted by the proposed federal law? Again, it’s unclear.

The preemption provision allows state laws related to “sexually explicit” digital replicas to remain in effect, but again questions loom. Do these laws have to specifically target digital replicas, or can they more broadly protect against the dissemination of sexually explicit images? The preemption provision also excludes state laws regulating election-related digital replicas. These selective choices about what to exempt from preemption may expose the preemption provision to constitutional challenges on the basis of underinclusiveness. Another constitutional problem is the inequitable treatment across states. As drafted, the preemption provision gives more authority to states like California and Tennessee, which rushed into the fray to pass specific digital replica laws, than to other states that have taken more time to deliberate about how best to address technological changes affecting their states.

Postmortem Provisions

The bill would create a federal postmortem digital replica right. Many states already provide postmortem rights of publicity, which would also encompass this right, but this is an area where state law is at its most variable: states differ on whether postmortem rights exist at all and, where they do, on their duration and on who qualifies to hold them. A federal law could help harmonize postmortem rights, but this bill does so only for digital replica rights. The creation and design of the proposed federal postmortem digital replica right may create significant wealth tied to dead people. As currently drafted, the proposed postmortem right would incentivize and, in some instances, force the commercialization of the dead even if this goes against the wishes of both the deceased and their families. This is in part because the estate tax system will treat these postmortem digital replica rights as part of the taxable estate of the deceased. This could be addressed by excluding such rights from the estate tax and by clarifying that the rights only exist after death.

Future revisions to the bill should also strongly consider shifting its focus from generating wealth to protecting against, rather than incentivizing, the commercialization of the dead. In a recent article, Postmortem Privacy, my co-author Anita Allen and I conclude that it is appropriate to extend some postmortem publicity rights, including over digital replicas, but that such rights should focus on the preferences of the deceased in coordination with their relatives, rather than on creating an industry composed of the dead. The bill should be revised so that all dead people are given equitable treatment and so that it tracks the legitimate objectives of extending such a right. The current draft gives more than 70 years of protection to those who commercialize the dead but only ten years to those who do not. There is no justification for this preference for those who commercialize the dead over those who seek to limit the commodification and exploitation of their loved ones.

The postmortem provisions also create a registration process (and requirement) at the Copyright Office. Given the current tumult at the Copyright Office and significant efforts to reduce the federal workforce, it’s hard to imagine the Office taking on an entirely new burden of managing registrations for digital replicas. Reducing the duration of postmortem rights and having them equitably apply to everyone could eliminate the need for such a registration process while also providing a better balance between the rights of the living and the dead.

* * *

This bill is gaining supporters, particularly as it grows longer to address each powerful constituency’s issues. But as the bill picks up traction, the interests of the underlying individuals and the public should not be lost. Hopefully, future revisions will address these concerns while still adequately protecting record labels, the film industry, and tech companies.