YouTube and Universal Music leap into the AI copyright void
Aug 30, 2023

YouTube and Universal's incubator could produce interesting technology, says Nilay Patel of The Verge, but it’s largely aimed at monetizing unauthorized use of music. “YouTube is inventing more copyright law without a legislature, without a judge,” he adds.

YouTube recently announced a partnership with Universal Music Group to launch a music AI incubator. Their goal is to come up with new artificial intelligence projects and protect artists.

The venture comes after songs featuring AI versions of singers like Drake, Kanye West and Frank Sinatra got viral attention, raising questions around how copyright law applies to AI-derived music and who should be paid.

Marketplace’s Lily Jamali spoke with Nilay Patel, editor-in-chief of The Verge and host of the Decoder podcast, about how the deal could breed innovation but also create serious problems.

The following is an edited transcript of their conversation.

Nilay Patel: So what we’ve arrived at is YouTube and Universal Music coming to some sort of agreement to, on one side, allow Universal Music artists into some sort of incubator where I’m confident YouTube engineers and artists will come up with some fun new AI tools. And on the other side, for Universal’s lawyers and YouTube’s engineers, to expand a system called Content ID to cover what YouTube has labeled generated content, which can really only mean one thing: they’re going to find a way to detect AI-generated voices of famous artists and make sure Universal gets paid.

Lily Jamali: So give me a sense of just how serious AI-related copyright strikes are on YouTube.

Patel: Well, there aren’t any. So this is actually a huge, complicated, wonky problem. You cannot copyright a voice. Now there are some state-level rights of publicity claims, maybe there’s a trademark claim that you could fashion out, but all of this is massively untested. No one knows if any of this works. There’s no federal right to your likeness. There’s no federal right of publicity. So you’re just in this pickle. This doesn’t exist for YouTube, there’s no law for YouTube to fall back on. So YouTube is creating its own private right of likeness, presumably just for the music labels, to say, OK, if a fake Drake song shows up, we’ll detect it somehow — unclear how they will do that — and somehow pay you. It is not clear how they will be able to detect it. It’s not clear what the error rate will be. It’s not clear if they’re going to detect not-AI fake Drake. What if there’s just a kid who’s trying their hardest to sound like Drake? How will this new right of likeness work in a world where we are pretty used to being able to try to do impressions of other people? I don’t know the answers to those questions. YouTube has not laid out the answers to those questions. And because there’s no law to fall back on, if there’s a conflict or there’s a disagreement, you can’t file a lawsuit and go have a court figure it out. YouTube just gets to decide, and we know what YouTube is gonna pick. They’re gonna pick their big partners over and over again.
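
To make the detection problem Patel describes concrete, here is a purely hypothetical sketch of what a threshold-based “does this vocal sound like Artist X?” check could look like. YouTube has not said how its system will work; the embeddings, the flag_as_artist_voice function and the 0.85 threshold below are invented for illustration. What the sketch shows is the error-rate trade-off Patel raises: a threshold loose enough to catch AI clones can also flag a talented human impressionist.

```python
# Hypothetical illustration only; this is not YouTube's system. It shows why
# any threshold-based "sounds like this artist" detector has an error-rate problem.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voice embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_as_artist_voice(track_emb: np.ndarray,
                         artist_emb: np.ndarray,
                         threshold: float = 0.85) -> bool:
    """Flag a track whose voice embedding is 'close enough' to the artist's.

    The threshold is the hard part: set it high and AI clones slip through,
    set it low and a kid doing their best Drake impression gets flagged too.
    """
    return cosine_similarity(track_emb, artist_emb) >= threshold

# Toy 4-dimensional embeddings, made up for the example.
artist = np.array([0.9, 0.1, 0.3, 0.2])
ai_clone = np.array([0.88, 0.12, 0.31, 0.19])     # near-copy of the artist's voice
impressionist = np.array([0.8, 0.2, 0.35, 0.25])  # a human who just sounds similar

print(flag_as_artist_voice(ai_clone, artist))       # True
print(flag_as_artist_voice(impressionist, artist))  # also True: a false positive
```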

Jamali: So I’m looking at this YouTube video that’s embedded in your piece, a recent piece that you wrote. It’s an AI cover of Frank Sinatra singing Lil Jon’s “Get Low.” It has a million views so far. Does the sheer virality of this type of content enable YouTube to track some of this stuff down?

Patel: I think that helps, but Content ID works across YouTube at scale. So if you upload a video to YouTube, and it has music in the background of it and it only gets five views, Content ID will still detect that copyrighted music in the background of it. It will still allow the record labels to either pull that video or, more often, monetize that video. So if ads are running in the video, the label is going to get a cut of that advertising revenue because their song was in your video. This is a truce. It took years of litigation between YouTube and the labels, and in particular, the labels and individual people, to arrive at this result. The labels famously sued a parent who posted a video of their child dancing to a Prince song playing in the background. That is a ridiculous outcome. Like, we do not want to go back to a world where major labels are suing parents because they’re taking videos of their kids dancing to Prince songs. But that’s what was happening. So Content ID represents this very important truce: OK, we know people are going to use copyrighted music. We’re going to detect it no matter where it is on the platform. And we will send some money the labels’ way instead of the labels suing parents.
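
For readers curious what “detect that copyrighted music in the background” could mean mechanically, here is a minimal sketch of spectral-peak fingerprinting, the general family of techniques that systems like Content ID are widely understood to draw on. Content ID itself is proprietary and far more sophisticated; the function names and the toy matching score below are assumptions for illustration only.

```python
# Minimal sketch of fingerprint-style audio matching, assuming a simple
# "dominant spectral peak per frame" scheme. It only illustrates why a
# low-view upload with a song in the background can still be matched
# and monetized.
import numpy as np

def fingerprint(samples: np.ndarray, frame: int = 4096) -> set:
    """Hash each frame as (frame_index, dominant_frequency_bin)."""
    hashes = set()
    for i in range(0, len(samples) - frame, frame):
        spectrum = np.abs(np.fft.rfft(samples[i:i + frame]))
        hashes.add((i // frame, int(np.argmax(spectrum))))
    return hashes

def match_score(upload_fp: set, reference_fp: set) -> float:
    """Fraction of the reference fingerprint that also appears in the upload."""
    return len(upload_fp & reference_fp) / max(len(reference_fp), 1)

# Toy example: the "reference track" is a 440 Hz tone; the "upload" is the
# same tone buried under noise, the way background music sits under a home video.
rate, seconds = 22050, 3
t = np.arange(rate * seconds) / rate
reference_track = np.sin(2 * np.pi * 440 * t)
upload_audio = reference_track + 0.3 * np.random.default_rng(0).normal(size=t.size)

score = match_score(fingerprint(upload_audio), fingerprint(reference_track))
print(f"match score: {score:.2f}")  # a high score would route ad revenue to the rightsholder
```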

Jamali: In just talking to you, it’s clear how much this news concerns you. What is it that worries you the most about this development?

Patel: On the YouTube side of it, I think extending copyright law in weird, private ways just for music companies is intensely problematic. Copyright law is the most effective speech regulation on the internet. If you want something taken down off the internet, almost entirely, you are going to copyright law. So this is a problem. YouTube is inventing more copyright law without a legislature, without a judge, without a jury, just because its important music partners want to be able to take more stuff down. That’s weird. I just fundamentally think that’s weird.

Jamali: Does this seem like it could be a safe alternative way for people to create AI-remixed music or content?

Patel: If the end of this road for YouTube is that Universal and YouTube have created a tool where you get to use the voices of Universal artists in a copyright-safe way, and Universal gets paid when you do it, maybe that tool inspires a lot more people to make art. There’s no settled argument, fair use or not, for these voices yet. It just hasn’t gone to court. People want an answer to that question: is it fair use to use an AI clone of Frank Sinatra’s voice in a song? There is a vacuum where that answer should be. It has not been litigated. No one knows the answer to that question. For YouTube to skip ahead and say we know the answer to the question because we know what the music labels want, I think that is troublesome.

More on this

We reached out to YouTube for comment but did not hear back before our deadline. You can read through its announcement of the partnership for more details.

Earlier this year, my colleague Meghan McCarty Carino had a conversation with Dan Runcie, founder of the media research firm Trapital, about the rise of AI-generated songs.

He pointed out that embracing AI-generated music instead of just taking it down could lower the barrier to entry for people interested in producing music independently.

The team

Daisy Palacios Senior Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer