YouTube revealed on Tuesday that its similarity detection technology has officially rolled out to eligible creators in the YouTube Partner Program, following a pilot phase. The technology allows creators to request the removal of AI-generated content that uses their likeness.
This is the first wave of the rollout, a YouTube spokesperson told TechCrunch, adding that eligible creators received emails this morning.
YouTube’s detection technology finds and manages AI-generated content that depicts a creator’s likeness, such as their face or voice.
The technology is designed to prevent a creator’s likeness from being misused, whether to endorse products and services they have not agreed to endorse or to spread misinformation. There have been many examples of misused AI likenesses in recent years, such as the company Elecrow using an AI clone of YouTuber Jeff Geerling’s voice to promote its products.
On its Creator Insider channel, the company provided guidance on how creators can use the technology. To start the onboarding process, creators must go to the Likeness tab, consent to data processing, and use their smartphone to scan an on-screen QR code that will direct them to a web page for identity verification. This process requires a photo ID and a short selfie video.
Once YouTube grants access to the tool, creators can view all detected videos and submit a takedown request under YouTube’s privacy guidelines, or they can file a copyright claim. There is also an option to archive a video.


Creators can opt out of using the technology at any time, and YouTube will stop scanning for videos 24 hours after they do.
The similarity detection technology has been in pilot mode since the beginning of the year. YouTube first announced last year that it had partnered with Creative Artists Agency (CAA) to help celebrities, athletes, and creators spot content on the platform that uses their AI-generated likenesses.
In April, YouTube voiced its support for legislation known as the NO FAKES Act, which seeks to address the problem of AI-generated replicas that imitate a person’s image or voice to deceive others or create harmful content.
