
Leading chip-making company NVIDIA has been accused of training its AI on data sourced from the controversial pirate library Anna’s Archive, despite recently signing a deal with Universal Music Group (UMG) to become its “responsible AI” partner.
Backstory:
As per Digital Music News (DMN), the accusations have surfaced in an amended class action lawsuit against NVIDIA, which was filed by several authors in 2024.
The initial suit claimed the company’s AI models had been illegally trained on the authors’ works; it has now been expanded to cover additional books, authors, and allegedly infringing AI models, as well as the new claims involving Anna’s Archive.
Alleged collaboration:
The authors cite several internal NVIDIA emails and documents suggesting the company consciously downloaded copyrighted works to train its AI models, and that it collaborated with Anna’s Archive to acquire those works.
This allegedly occurred despite NVIDIA being warned by Anna’s Archive that the library’s contents were “illegally acquired,” as per DMN.
Why it matters:
The news comes weeks after UMG and NVIDIA announced a partnership to “pioneer responsible AI for music discovery, creation, and engagement.”
Anna’s Archive is being sued by Spotify and the major labels, including UMG.
As DMN points out, while UMG may not have known NVIDIA allegedly sourced its training data unethically, the label now faces a dilemma if it wants to ensure “its offerings aren’t trained on or derived from pirated works.”
NVIDIA
Anna’s Archive
Spotify
Universal Music Group (UMG)
AI Training Controversies
AI Copyright Battles
Legal Battles Over AI Content
Ethical AI Music Sourcing
AI Training Data Provenance
Rising Tide of Music Litigation
AI and Copyright
Differentiating Ethical AI Use
Responsible AI Partnership Scrutiny
Class Action Lawsuit
Litigation
Copyright Infringement
AI Model Training
Data Scraping
Pirate Site Training Data
AI Licensing Deals
United States
👋 Disclosures & Transparency Block
This story was written with information from Digital Music News.
We covered it because it’s news regarding AI training on copyrighted material.