International Business Times UK
Technology
Vinay Patel

Billie Eilish, Nicki Minaj Lead Fight Against 'Predatory' Music AI

Billie Eilish and 200+ musicians slam AI music as threat to creativity. (Credit: Wikimedia Commons)

Over 200 high-profile musicians, including Billie Eilish and Nicki Minaj, have signed an open letter calling for a halt to the predatory use of artificial intelligence (AI) that mimics human artistic expressions and sounds.

The signatories, spanning musical genres and eras, include A-list stars and Rock & Roll Hall of Famers such as Billie Eilish, J Balvin, Nicki Minaj, Stevie Wonder, and R.E.M. The estates of Frank Sinatra and Bob Marley have also signed.

Tech giants are already active in this space: last year, YouTube gave users a glimpse of its AI music experiments, developed in collaboration with Google DeepMind, via a blog post. Led by the Artist Rights Alliance, the letter urges such companies to abandon efforts to develop AI tools that could replace human songwriters and artists.

"This assault on human creativity must be stopped. We must protect against the predatory use of AI to steal professional artists' voices and likenesses, violate creators' rights, and destroy the music ecosystem," the letter states.

The letter attempts to strike a balance, calling out exploitative AI tools while recognising the potential of ethical AI applications in music production. It is also worth noting that music producers have embraced AI tools of late.

Last year, AI isolated John Lennon's vocals from an old demo, allowing for the creation of a "new" Beatles song. The Artist Rights Alliance letter reflects an industry-wide pushback against generative AI, citing concerns over copyright infringement, labour rights, and other ethical and legal issues.

Sam Altman's OpenAI has acknowledged using copyrighted material to train AI tools like ChatGPT, fuelling these concerns. Microsoft, which partners with OpenAI, meanwhile claims no responsibility if users of its Copilot AI tool infringe copyright.

Is creativity at risk? Artists fear AI could stifle originality

As AI in music evolves, artist unions are pushing for regulation while studios explore its potential to cut production costs. The use of AI for creative work such as songwriting, scriptwriting, and generating visuals of actors sparked several contract disputes and union strikes across the entertainment industry throughout 2023.

The spread of pornographic AI-made images of Taylor Swift highlighted the dangers of deepfakes. This malicious use of AI prompted lawmakers to introduce a bill earlier this year criminalising non-consensual, AI-generated sexualised imagery.

Just last week, concerns about responsible use led ChatGPT-maker OpenAI to delay the release of a program that can mimic voices. In a first for the US, Tennessee enacted legislation in March to protect musicians from the unauthorised commercial use of AI-generated vocal likenesses.

The Ensuring Likeness, Voice, and Image Security Act, or "Elvis Act," goes into effect on July 1st. This legislation makes it illegal to replicate an artist's voice without their consent. Regrettably, it does not address the issue of using artists' work to train AI models.

"Some of the biggest and most powerful companies are, without permission, using our work to train AI models," the letter states. "These efforts are directly aimed at replacing the work of human artists with massive quantities of AI-created 'sounds' and 'images' that substantially dilute the royalty pools paid out to artists."

Tom Kiehl, interim head of the UK Music Industry Association, said: "This amounts to music laundering, and any companies engaged in these practices must stop and take a more responsible approach to our music industry."
