The Battle Over AI Song Artists: Copyright and Trademark Issues Taking Center Stage

The rise of AI-generated music has sparked one of the most intense legal debates the entertainment and technology industries have faced in decades. What started as quirky “deepfake tracks” created for fun has quickly escalated into a multi-billion-dollar legal battlefield involving record labels, tech companies, individual artists, and regulators. The core question is deceptively simple: What rights exist when the performer isn’t human?

AI tools can now imitate the voices, writing styles, and production signatures of major artists with stunning accuracy. Songs made by anonymous creators using AI versions of Drake, Taylor Swift, Ariana Grande, Frank Ocean, and others have gone viral on TikTok, Spotify, and YouTube, often receiving millions of streams before platforms take them down. As the technology continues to evolve faster than legislation, copyright and trademark law are being tested in ways never imagined.

Below, we break down the copyright issues, the trademark problems, why this matters for creators and companies, and where the law may be heading.

I. Copyright Issues: When the Voice Is a “Work” and the Artist Isn’t Involved

1. Who owns an AI-generated song?

Copyright law protects original works of authorship created by humans. This poses an immediate challenge:
AI-generated songs, no matter how impressive, are generally not copyrightable unless there is meaningful human input. The U.S. Copyright Office has repeatedly affirmed in its guidance on AI-generated works that works created solely by AI cannot receive copyright protection; only human-authored content is eligible.

For AI-generated music, this creates three major problems:

  • The creator of the AI output doesn’t hold copyright, because they did not contribute “authorship” in the legal sense.
  • The AI company doesn’t hold copyright, because the work wasn’t created by a human employee.
  • The original artist doesn’t automatically hold copyright, unless their actual lyrics, melodies, or recordings were used.

This means an AI-generated knockoff song might not be protectable at all, even by the person who made it, yet it could still infringe on the rights of human artists.

2. Does mimicking an artist’s voice infringe copyright?

This is one of the hottest topics in entertainment law.

A music artist’s voice is not a copyrightable work. Copyright law protects specific recordings and compositions, not vocal style, tone, or expression itself. So, if AI generates a new song in “the voice of Ariana Grande” without copying existing lyrics or melodies, the case for copyright infringement becomes difficult.

This is why the viral fake Drake/The Weeknd song “Heart on My Sleeve” raised so many legal dilemmas but did not fit squarely into traditional infringement categories.

3. What if the AI was trained on copyrighted music?

This is a major concern for labels.

Training AI systems on copyrighted songs, without permission, may itself be infringing, depending on how courts define the boundaries of “fair use.” Courts have not yet ruled on this issue in the context of generative AI and music. However, lawsuits involving AI companies like OpenAI and Anthropic (in the context of text) are likely to shape future standards.

Key questions include:

  • Is ingesting copyrighted recordings a form of unauthorized copying?
  • Is training data transformative?
  • Should artists and labels be compensated for training use?

The legal uncertainty is enormous and financially significant.


II. Trademark Issues: Names, Images, and “Voice as Brand”

While copyright governs creative works, trademark law protects identity, names, slogans, branding, and indicators of source. The explosion of AI song artists has pushed trademark law into the spotlight.

1. Using an artist’s name without permission

Releasing music under “AI Drake,” “AI Taylor Swift,” or similar labels can trigger trademark claims. Artist names are valuable commercial assets used to identify the source of entertainment services.

If an AI creator labels a song “featuring AI Beyoncé,” it may:

  • Confuse listeners about whether Beyoncé endorsed the work
  • Dilute the value of the artist’s brand
  • Create false advertising or false association claims

This gives artists a clearer legal path than copyright.

2. Voice as part of a trademarked persona

Some courts have accepted the idea that a well-known performer’s voice is protected by “right of publicity” laws. This varies by state, but many recognize voice as part of a celebrity’s identity.

For example:

  • In Midler v. Ford (9th Cir. 1988), singer Bette Midler successfully sued after Ford hired a sound-alike to replicate her distinctive voice in a commercial.
  • In Waits v. Frito-Lay (9th Cir. 1992), Tom Waits won a similar voice-misappropriation case.

These cases didn’t involve AI, but the logic fits perfectly. If a voice is part of an artist’s persona, imitating it for commercial purposes can be a violation of their right of publicity.

3. The rise of synthetic artists and brand confusion

Some creators are generating entire AI personas:

  • AI pop stars
  • AI rappers modeled after real artists
  • Entire digital bands
  • Deepfake collaborations

Trademark law may apply when branding for these AI artists resembles real human artists, creating a likelihood of confusion.


III. Why This Matters: The Music Industry Is at a Turning Point

1. Record labels see AI as both a threat and a business opportunity

Universal Music Group, Sony, and Warner are aggressively challenging unauthorized AI songs. These concerns are outlined in Universal Music Group’s official statement on AI deepfakes, which warns against the misuse of artists’ voices and likenesses. At the same time, they are exploring licensing deals with AI companies to allow:

  • Officially sanctioned AI vocals
  • Digital clones of deceased artists
  • AI-powered remixing and sampling

The industry wants to protect its assets while monetizing new technology.

2. Artists are deeply divided

Some artists are excited about AI collaboration. Others believe it threatens artistic integrity, economic stability, and even personal identity.

Concerns include:

  • Loss of control over voice
  • Replacement or dilution of original work
  • Unauthorized exploitation
  • Fans consuming AI versions instead of real music

The fear is not hypothetical: some AI songs have already outperformed real releases.

3. Consumers often can’t tell the difference

As AI continues to improve, distinguishing human from machine is becoming nearly impossible. Regulators, including the FTC, have issued guidance regarding AI, deepfakes, and consumer deception to warn businesses and platforms about potential risks. This creates major concerns about:

  • Fraud
  • Misinformation
  • Fake collaborations
  • AI tracks going viral without context

Regulators are watching closely.


IV. Where the Law Is Headed: Possible Solutions

1. New rights for voice and likeness

Congress is already considering legislation, such as the proposed NO FAKES Act, that would create a federal “right of publicity,” including protections for:

  • Voice
  • Likeness
  • Style of performance
  • Digital replicas

This would give artists a powerful tool to stop unauthorized AI clones.

2. Licensing frameworks for training data

AI companies may eventually be required to:

  • Obtain licenses for training on copyrighted recordings
  • Disclose datasets
  • Compensate rights holders

This mirrors how streaming services evolved in the early 2000s after Napster.

3. AI labeling requirements

Some proposals would require AI-generated content to be clearly labeled, helping consumers understand what they’re hearing.

4. Case law will shape the industry

As lawsuits between record labels and AI platforms advance, courts will decide critical issues, likely setting new legal precedents for decades.

Conclusion

The battle over AI song artists isn’t just a technological debate; it’s a legal and cultural turning point. Copyright law wasn’t built for a world where a machine can imitate Beyoncé’s voice in seconds, and trademark law is being stretched to protect identities in a digital world that blurs the boundaries of authenticity. As lawmakers, courts, and the music industry scramble to keep up, one thing is clear: AI-generated music is here to stay, and the legal landscape around it will define the future of creativity.
