With YouTube’s Backing, US Senators Reintroduce Bill To Protect Voice, Likeness From Deepfakes

A group of United States (US) Senators has re-introduced an anti-deepfake bill in the US Congress, and YouTube has backed the legislation. The bill, called the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act” or the NO FAKES Act, seeks to protect people’s voice and likeness from replication through artificial intelligence (AI) models without consent.

In a blog post expressing support for the bill, YouTube says that it is working closely with its partners, such as the Recording Industry Association of America (RIAA) and the Motion Picture Association (MPA), to push for a shared consensus on the legislation.
Besides these three, Sony Music, Universal Music Group, and Walt Disney have also expressed support for the NO FAKES Act.

“The NO FAKES Act provides a smart path forward because it focuses on the best way to balance protection with innovation: putting power directly in the hands of individuals to notify platforms of AI-generated likenesses they believe should come down,” YouTube says, explaining its rationale for supporting the legislation. It adds that this notification is important because it allows platforms to differentiate between authorised content and harmful fakes.
What is the NO FAKES Act?

Initially introduced in 2024, the bill gives people the right to authorise the use of their voice or visual likeness in a digital replica. A digital replica, under the bill, means a “highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual”. A license to use someone’s likeness will be valid only if:

- The license agreement is in writing and signed by the individual or an authorized representative of the individual.
- The agreement includes a reasonably specific description of the intended uses of the applicable digital replica.

The license for an adult’s likeness cannot exceed 10 years, and for someone under the age of 18 it cannot exceed 5 years. If someone produces a digital replica of an individual without their consent, they will be liable to civil action.
However, to be held liable under the NO FAKES Act, the person publishing/distributing the non-consensual digital replica must have “actual knowledge” that the content they posted is indeed a digital replica and that the applicable right holder did not authorise it. This actual knowledge can be in the form of a notification.

Exclusions from the NO FAKES Act:

The NO FAKES Act will not be applicable in the following cases:

- The applicable digital replica is produced or used in bona fide news, public affairs, or sports broadcasts, provided that the digital replica is materially relevant to the subject of the broadcast.
- The digital replica is a representation of the individual in a documentary or in a historical or biographical manner, including some degree of fictionalisation. Such content is allowed to include digital replicas unless:
  - The digital replica creates a false impression that the individual participated in the work.
  - The replica is included in a musical sound recording that is synchronised to accompany a motion picture or other audiovisual work.
- A bona fide commentary, criticism, parody, or piece of satire uses the digital replica in the public interest.
- The audio/visual content uses the digital replica in a fleeting/negligible way.
- An advertisement or commercial announcement uses the digital replica for any of the above-mentioned purposes and is relevant to the subject of the work so advertised.
Rights of dead individuals:

Under the Act, the rights to voice and likeness do not end when an individual dies. Once the individual dies, their executors/heirs/licensees have the right to license/transfer this right. The Act allows the rights to be transferred through any legal means, or through inheritance as personal property under the applicable laws of intestate succession (succession when the deceased has not left behind a will).
In the case of licensing agreements made before the death of an individual, a right holder has the rights to the dead individual’s voice and likeness:

- for a period of 10 years after the individual dies.
- for an extended period of five years, if during the last 2 years of the 10-year license the right holder demonstrates active and authorised public use of the voice or visual likeness of the individual. This five-year extension is renewable, provided that the right holder can continue to show active and authorised public use.
In any case, the rights will terminate after the 10-year period, after a five-year extension, or 70 years after the individual’s death.

Are platforms liable for digital replicas?

The NO FAKES Act says that websites or platforms hosting digital replicas that a user uploaded will not be automatically liable for violating the Act as long as they:

- remove, or disable access to, all instances of the digital material after someone notifies them about the digital replica.
- after removing/disabling access, take reasonable steps to notify the third party that posted the material that it has been removed.
Why it matters:

Securing personality rights has become a significant issue since the advent of deepfakes. This bill could set a precedent for protecting personality rights in other parts of the world, including India. In India, courts have protected the personality rights of public figures such as journalist Rajat Sharma and singer Arijit Singh from AI misuse.
Many other celebrities, such as Jackie Shroff, Anil Kapoor, and Amitabh Bachchan, have knocked on the judiciary’s door to protect their personality rights from misuse. In Kapoor’s and Shroff’s cases, their lawyers specifically argued misuse through AI models. While such eminent individuals have been successful in protecting their personality rights in India, there is no clear regulation under which the average person can argue for the same.
Also read:

- No AI FRAUD Act: A new bill in the US that provides individuals rights over their voice and other features
- There Could Be a Separate Act to Regulate Deepfakes, If Needed: MeitY Secretary Says
- Bombay HC Orders AI Platforms To Remove Unauthorized Use of Arijit Singh’s Voice