When a fan congratulated Emily Portman on a new album she hadn’t released, the British folk musician assumed there had been a mistake. Instead, she found an AI-generated record uploaded under her own name circulating across major streaming platforms, a discovery that has unsettled artists and exposed vulnerabilities at the heart of the music industry.
A Familiar Name, an Unfamiliar Album
In July, Emily Portman opened her streaming profiles to find an album titled Orca listed alongside her earlier work. She had not recorded it, approved it, or even heard of it. Yet the songs bore titles and stylistic flourishes close enough to her folk-inspired catalogue to pass, at a glance, as authentic.
Portman quickly recognized what listeners could not. The vocals were “pristine,” she said, the lyrics hollow, the overall effect unsettlingly precise. She believes the artificial intelligence behind Orca had been trained on her previous albums, mimicking her instrumentation and phrasing with increasing fidelity.
What disturbed her most was not simply imitation, but misattribution. Listeners arriving at her profile could reasonably assume the work was hers. “I just felt really uncomfortable,” she said, imagining fans encountering the music and wondering what had happened to her sound.
Music That Is ‘Virtually Undetectable’
Portman’s experience is no longer rare. AI-powered music generators such as Suno and Udio have advanced to the point where most listeners cannot reliably distinguish synthetic tracks from human-made recordings. An Ipsos study conducted for the French streaming service Deezer in November found that almost all respondents struggled to tell the difference.
This technological leap has produced unexpected successes. Entirely AI-created bands, including one called The Velvet Sundown, have attracted hundreds of thousands of listeners and more than a million followers on Spotify. But it has also fueled a parallel economy of deception.
Industry representatives say that fraudulent uploads often aim to exploit the economics of streaming, where individual plays generate small sums that can accumulate quickly when automated bots inflate listening figures. Uploading under a known artist's name, said Dougie Brown of UK Music, is a way to ensure those royalties flow somewhere, just not to the artist.
An Industry Built on Trust, Not Verification
At the center of the problem lies the structure of music distribution itself. Scammers claiming to represent artists can approach distribution companies, which then upload music to platforms with minimal identity verification. According to artists, there are few meaningful checks to confirm authorship before a track appears publicly.
Australian musician Paul Bender discovered earlier this year that four “bizarrely bad” AI-generated songs had been added to the profile of his band, The Sweet Enoughs.
“You just say: ‘Yes, that’s me,’” he said. “It’s the easiest scam in the world.”
Bender, who also plays bass for the Grammy-nominated band Hiatus Kaiyote, later compiled a list of suspicious releases, many appearing in the catalogs of deceased artists, including the experimental Scottish musician Sophie, who died in 2021. A petition he launched calling for stronger security measures drew roughly 24,000 signatures, including from Anderson .Paak and Willow Smith.
Platforms, Laws, and the Limits of Protection
Streaming services acknowledge the growing challenge. Spotify, which has faced criticism over transparency, says it is working with distributors to improve fraud detection, echoing similar efforts by Apple Music.
“Across the music industry, AI is accelerating existing problems like spam, fraud, and deceptive content,” the company said in a statement.
For artists seeking removals, the process can be uneven. Portman and Bender, neither of whom pursued legal action, asked platforms to take down the offending tracks. Some disappeared within a day; others remained for weeks.
Legal protections vary widely. Certain jurisdictions, including California, have enacted laws addressing imitation and likeness rights. In others, such as the United Kingdom, limited copyright frameworks leave artists exposed, according to Philip Morris of the Musicians’ Union. Proposed legislation, musicians warn, could further weaken safeguards by expanding permissible uses of copyrighted material for AI training.
Despite the experience, Portman continues to work on a new album: an expensive, collaborative project grounded in human relationships. For her, the distinction still matters. "It's all about those human connections," she said, even as the boundary between human creativity and machine replication grows harder to see.