YouTube announced an expansion of its likeness detection tools, allowing users at risk of impersonation to upload images of their faces so the platform can cross-check them against other uploads for potential imposters and deepfakes.
This development significantly broadens the platform’s safety measures, now open to all actors, athletes, creators, and musicians, regardless of whether they have a YouTube channel, according to The Hollywood Reporter.
YouTube started developing its likeness detection tools in September 2024. The technology utilizes face scans and government IDs to verify uploaded content across the platform.
The platform can alert users if their images appear in others’ uploads, helping them identify misuse and take action if necessary. Previously, this feature was restricted to select creators, government officials, journalists, and political candidates.
Now, the tool is available for individuals most at risk of having their livelihoods affected by deepfakes, a concern that is growing as artificial intelligence technology advances.
Various deepfake trends have surfaced, including popular depictions like the “Pope in a Puffer Jacket” and fan-generated scenes meant for entertainment. YouTube will still allow certain benign depictions, provided they do not infringe on user rights.
However, harmful deepfakes remain a key issue. YouTube’s expanded detection process allows more individuals to remain informed and request the removal of misleading content that could harm their interests.
The platform acknowledged that deepfakes are likely to increase in prevalence. YouTube stated that as AI tools develop, new forms of misrepresentation will become more complex, complicating efforts to manage misuse.