YouTube Adds New Tool to Detect AI-Generated Singing Content

The company is also developing technology that will detect and manage AI-generated content using real people’s faces

The YouTube play logo displayed on a smartphone in a photo illustration in Brussels, Belgium, on July 24, 2024. (Photo Illustration by Jonathan Raa/NurPhoto via Getty Images)

YouTube is cracking down on the rise of questionable AI-generated content. Content ID, YouTube’s automated content identification system, now includes a synthetic-singing identification tool, which will allow platform partners to automatically detect and manage AI-generated content on YouTube that simulates their singing voices.

This move comes over a year after an AI-generated song imitating Drake and The Weeknd went viral and rattled the music world. Earlier this year, over 200 big-name musicians, including Billie Eilish, J Balvin, Nicki Minaj, Stevie Wonder and R.E.M., signed an open letter calling for protections against the predatory use of artificial intelligence that mimics an artist’s likeness, voice and sound.

That’s not the only protection against AI that YouTube is planning to launch. The company is also actively developing technology to detect and manage AI-generated content that uses creators’ and celebrities’ faces on the platform. This, along with the company’s privacy updates, continues YouTube’s commitment to guard against exploitative AI usage.

“We believe AI should enhance human creativity, not replace it,” Amjad Hanif, vice president of creator products at YouTube, wrote in a blog post announcing these new technologies. “We’re committed to working with our partners to ensure future advancements amplify their voices, and we’ll continue to develop guardrails to address concerns and achieve our common goals.”

In the same post, YouTube clarified its own usage of AI. The company currently uses content uploaded to YouTube to train machine learning and AI applications in order to improve the product experience across both YouTube and Google. This is outlined in YouTube’s terms of service and has been used to improve recommendation systems and develop new generative AI tools for the platform, such as auto-dubbing on videos.

Outside parties that scrape YouTube’s content, however, violate the company’s terms of service. It is these third parties that YouTube says it is protecting its partners against.

“That said, as the generative AI landscape continues to evolve, we recognize creators may want more control over how they collaborate with third-party companies to develop AI tools,” Hanif noted. “That’s why we’re developing new ways to give YouTube creators choice over how third parties might use their content on our platform. We’ll have more to share later this year.”
