Folk Musician Finds AI-Cloned Songs Uploaded Under Her Name on Spotify

Senior AI Reporter
April 6, 2026
A folk musician discovered AI-generated versions of her own songs uploaded under her name on Spotify, highlighting gaps in how streaming platforms handle identity and copyright as voice cloning tools become more accessible.

Murphy Campbell said she found the unauthorized tracks in January, when songs she had not released appeared on her Spotify artist profile. The recordings were sourced from her YouTube videos of traditional folk performances, with the vocals altered through AI tools before being uploaded under her name.

“I was kind of under the impression that we had a little bit more time,” Campbell said, describing her reaction after realizing independent artists were already being affected.

Testing cited by The Verge found that the tracks were likely AI-generated, underscoring how easily available tools can replicate an artist’s voice from publicly accessible audio. The case points to a growing problem: musicians’ existing online content can be repurposed without their consent.

The problem extended beyond the initial uploads. As Campbell attempted to remove the tracks, she encountered false copyright claims filed by third parties, which complicated takedown efforts. Systems intended to protect creators were instead exploited in ways that delayed resolution.

The incident also exposed limited safeguards on streaming platforms. Services such as Spotify do not verify whether uploaded tracks originate from the credited artist, leaving room for impersonation until complaints are filed.

For independent musicians, the burden of addressing these issues falls largely on the individual. Without legal teams or direct platform access, they must navigate takedown processes while unauthorized content remains available to listeners.

The technology enabling the misuse is widely accessible. Voice cloning tools can generate convincing replicas from short audio samples, and publicly shared performances provide ample material for training.

Campbell’s experience reflects a broader shift in how AI-related risks are spreading. Earlier concerns focused on high-profile artists, but the same tools are now affecting smaller creators with fewer resources to respond.

The case raises unresolved questions about platform responsibility and copyright enforcement as AI-generated content becomes harder to distinguish from authentic recordings.

This analysis is based on reporting from techbuzz.

Image courtesy of Murphy Campbell/YouTube.

This article was generated with AI assistance and reviewed for accuracy and quality.

Last updated: April 6, 2026
