'No Fakes Act' Wants To Protect Actors and Singers From Unauthorized AI Replicas
Emilia David reports via The Verge: A bipartisan bill seeks to create a federal law protecting actors, musicians, and other performers from unauthorized digital replicas of their faces or voices. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023 -- or the No Fakes Act -- standardizes rules around using a person's face, name, and voice. Sens. Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC) sponsored the bill. It prohibits the "production of a digital replica without consent of the applicable individual or rights holder" unless the replica is part of a news, public affairs, or sports broadcast, a documentary, or a biographical work. The rights would apply throughout a person's lifetime and extend to their estate for 70 years after their death. The bill includes exceptions for digital replicas used in parody, satire, and criticism. Commercial uses such as advertisements are also exempt, as long as the ad is for a news program, documentary, or parody. Individuals, as well as entities like a deceased person's estate or a record label, can bring a civil action under the proposed rules. The bill also explicitly states that a disclaimer noting the digital replica was unauthorized won't be considered an effective defense.