YouTube is expanding its AI deepfake monitoring feature to Hollywood — meaning some celebrity AI videos could soon disappear.

The platform’s likeness detection feature searches YouTube for AI deepfake content and flags it for public figures enrolled in the program. Enrolled figures can use it to keep track of AI content of themselves on YouTube or to request removal (takedowns are evaluated against YouTube’s privacy policy, and not every request will be approved). YouTube began testing the feature with content creators last fall; in March, the company expanded the program to politicians and journalists. YouTube says the tool will cover celebrities regardless of whether they have a YouTube account.

The system requires participants to submit an ID and a selfie video of themselves. (Likeness detection is focused on faces specifically, as opposed to a voice or other identifying characteristics.) Removal of deepfakes isn’t guaranteed, and there are protected use cases like parody or satire. YouTube has previously said that when content creators used the feature, they requested only a “very small” number of videos of themselves be removed.

YouTube has compared likeness detection to Content ID, its system for finding (and removing) copyrighted material on the platform. The difference is that with Content ID, rights holders can opt to monetize other users’ videos that use their material and split the revenue. That’s not yet possible with likeness detection, but it seems to be the direction the industry is moving.

Earlier this month, YouTube announced a feature allowing creators to digitally clone their likeness using AI, which could then be inserted into videos. Talent agency CAA (which YouTube says supported the likeness detection expansion) maintains a database of clients’ biometric data that entertainers can retain or deploy for commercial opportunities. TikTok star Khaby Lame effectively sold off the rights to his likeness, to be used to sell products online. (The deal has hit several snags, and it’s not clear whether it has closed, according to Business Insider.)

In interviews with The Hollywood Reporter, some talent managers framed the explosion of AI deepfakes as a way for the entertainment industry to engage with fans. Some celebrities might want AI content of themselves pulled when eligible; others might let fan-made AI content proliferate. And in the future, entertainers might welcome AI deepfakes of themselves, as long as they get paid.
