AI

Next-Level Criminality: Is Your Music Yours Anymore?

An Editorial by Dawoud Kringle

Just when you thought it couldn’t get worse, the corporate overlords took things to the next level. Music distributors have been sneaking clauses into their user agreements that allow them to use our music to train AI models.

This is how it works. Many distributors (e.g., DistroKid, TuneCore, etc.) include broad licensing terms in their contracts, granting themselves the right to use uploaded music to train AI models. They describe this with terms such as “machine learning,” “data analysis,” or “service improvement.” Artists often consent unknowingly by agreeing to the terms, and the agreements frequently offer no way to refuse the use of their music for AI training. These clauses usually allow the unrestricted, perpetual use of music for AI without royalties or compensation to artists or rights holders. Some distributors sell or provide access to their music libraries to AI companies like OpenAI or Google DeepMind to train generative models, including vocal clones, melody generators, and the like. The raw audio is used even if the artist strips the metadata from their uploaded files, and the resulting models can easily replicate an artist’s unique style without attribution.

Some in-house AI tools for auto-mastering, stem separation, and the like are not exempt from this. Platforms such as LANDR and BandLab use uploaded music to train their own AI tools, then sell or lease those same tools back to artists through subscriptions. Even user interactions with these tools, like skipping or rewinding, can train algorithms and indirectly shape AI-generated music.

It’s no mystery why music production, distribution, and streaming services do this. They want to use Artificial Intelligence to produce unlimited content using nothing more than AI models trained to mimic human music and all of its styles and genres. Thus, the time, resources, and cost of relying on human musicians to create music are eliminated. Operating costs are reduced, and profits are increased.

And musicians are kicked to the curb.

These practices occupy legal gray areas, allowing unscrupulous companies to set dangerous precedents. Distributors argue that their contracts override individual copyright claims. Publishing rights are also ignored: even if an artist owns the recordings, publishers and songwriters are rarely compensated for AI training use. None of these arguments has yet been tested in court in an AI-specific case.

The result is that we lose control of our work. Our music could be used to train AI that competes with us, ultimately devaluing our work. There are no royalties from this. To add insult to injury, most artists are unaware that their music is being used this way.

The following is a sample of independent music distribution services whose user agreements include clauses authorizing them to use uploaded music to train AI models without additional compensation or explicit permission from rights holders:

  1. DistroKid’s terms grant them broad rights to use content for AI training and other purposes. Their clause states: “You grant us a worldwide, non-exclusive, royalty-free license to use your content… for machine learning, training models, or improving our services.” This makes the exploitation undeniable.

  2. TuneCore – Their agreement allows music use in AI and machine learning applications.

  3. CD Baby – Their terms include permissions to utilize music for AI model training.

  4. Amuse – Their user agreement permits the use of uploaded tracks for AI and data analysis.

  5. SoundOn (by TikTok) – Their terms include provisions allowing AI training on distributed music.

  6. RouteNote’s terms include broad licensing rights for “research, development, and machine learning.”

  7. Symphonic Distribution – Their agreement grants them the right to use music for “AI, data analysis, and other technologies.”

  8. Ditto Music’s terms permit the use of uploaded tracks for “machine learning and AI model training.”

  9. LANDR – Their distribution service’s terms include clauses allowing AI training on user-submitted music.

These platforms justify these clauses as necessary for “improving services,” while allowing AI companies to exploit artists’ work without consent or payment.

Spotify’s AI DJ, with its voice persona “X,” is a feature available to Spotify Premium subscribers in select markets that uses AI to create a personalized listening experience. It employs algorithms and machine learning to build a customized music stream based on your listening habits and preferences. It considers your past listening history and favorite artists and genres, curates playlists accordingly, offers a mix of familiar and new music, and even includes commentary and voice updates from an AI voice. The feature is trained on licensed music. Spotify claims it does not directly impact artist compensation beyond the existing streaming royalty system, but the company refuses to be transparent about the details of that compensation.

While this presently occupies a legal gray zone, moves are being made to make it fully legal for corporations to profit from stripping our creations, intellectual property, rights, and money from us.

Major AI firms like OpenAI (which Elon Musk co-founded) face ongoing lawsuits over alleged unauthorized use of copyrighted materials in their model training. To clear the way for the next phase, the Trump Administration and Musk have used their political influence accordingly. On Thursday, May 8, 2025, Trump fired Carla Hayden, the Librarian of Congress. Two days later, on Saturday, May 10, 2025, he purged Shira Perlmutter, the Register of Copyrights and Director of the Copyright Office. Trump did this only days after Perlmutter published a report on how the development of AI technology could conflict with fair use law and how some AI companies breach copyright law. Musk’s xAI (maker of Grok) and OpenAI both rely on copyrighted data. Trump’s move toward AI deregulation could easily put music distribution, copyright protection, and US financial markets at risk.

Trump’s reason for doing this is apparent. He is attempting to remove any legal obstacle Musk and his cronies could face in profiting from our work without compensation. His removal of Perlmutter clears the path for unchecked AI theft. Rep. Joe Morelle (D-N.Y.) denounced President Trump’s removal of Perlmutter as “an unprecedented power grab with no legal basis.” Trump’s purge of the Copyright Office wasn’t his usual bureaucratic unpredictability and chaos. It was a license for AI firms like Musk’s xAI to easily argue that stealing your music is “fair use.”

The question of fair use has many facets. In 2023, Universal Music Group (UMG), Concord Music Group, and ABKCO sued AI company Anthropic, alleging copyright infringement for using their copyrighted lyrics to train Anthropic’s AI chatbot, Claude. The publishers claimed Claude copied and disseminated their lyrics without permission. Anthropic argued that their use of the lyrics was “fair use” and that the publishers had not proven they suffered irreparable harm.

The US is not the only country attempting to tackle this problem. The European Union’s AI Act of 2024 mandates that providers of General Purpose AI (GPAI) models create and make publicly available a “sufficiently detailed summary” of the content used to train their models. While it does not explicitly require full disclosure of the training data used to develop AI models, it introduces transparency obligations for GPAI models that indirectly relate to training data.

It should be noted that not all music distributors are involved in the blatant data mining of our music. Below are some services whose policies are favorable to musicians.

UnitedMasters’ terms focus on traditional distribution and licensing, with no broad AI training rights granted. They emphasize artist ownership and direct deals (e.g., brand partnerships, sync licensing).

AWAL (owned by Sony Music) – Their public terms do not include blanket AI training permissions. They emphasize negotiated deals for sync and licensing, though major-label affiliation could affect future policies.

ONErpm’s 2024 agreement lacks explicit AI training clauses; it focuses on distribution and monetization without broad rights grabs for machine learning.

Record Union. This EU-based distributor has historically avoided AI-centric terms, focusing on transparent royalty splits and copyright control. Their terms do not grant AI exploitation rights.

Horus Music’s user agreement emphasizes artist rights and does not include provisions for AI training without consent. They’ve publicly positioned themselves as artist-friendly on ethical issues.

The Community Manager of Mixcloud said, “We believe creators should be respected for their efforts in making music, DJ mixes, radio shows, and podcasts. Because of this strongly held belief, we do not use your music to train generative AI models.” Mixcloud’s Terms of Use and Licensing Agreement do not include clauses permitting the use of uploaded music for AI/ML training without consent. The platform focuses on live streaming, DJ mixes, and radio-style shows, not redistributing music for tech development. Mixcloud has direct licensing deals with rights societies (ASCAP, BMI, PRS, PPL, etc.) and pays royalties to artists and labels. This structure means it can’t unilaterally license music for AI, though the option to renegotiate with rights holders does exist.

Bandcamp, while not a distributor, prohibits AI training without consent under its current policies. Bandcamp’s Terms of Service and Acceptable Use Policy explicitly ban the use of content from the platform for training AI models: “You also agree: … Not to train any machine learning or AI model using content on our site or otherwise ingest any data or content from Bandcamp’s platform into a machine learning or AI model.” Their Acceptable Use Policy also prohibits the creation, development, use, or offering of “Unauthorized AI Tools,” defined as AI that uses copyrighted works without authorization. Additionally, users are prohibited from “scraping” or “data mining” content from the site using automated tools, further preventing the collection of data for AI training purposes.

Like any platform, Bandcamp could be vulnerable to external third-party AI scraping (e.g., bots downloading tracks). However, it actively combats piracy, including unauthorized AI data harvesting. Terms could evolve under Songtradr, which acquired Bandcamp from Epic Games in 2023, if the parent company changes direction on AI. For now, it’s one of the least risky major platforms.

On May 12, 2025, SoundCloud issued a statement attempting to clarify its stance on AI training after controversy over wording in its Terms & Conditions. The policy update (added in February 2024) states: “You explicitly agree that your Content may be used to inform, train, develop, or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.” This was exposed by Ed Newton-Rex, founder of the nonprofit Fairly Trained. In an X post on May 9, 2025, Newton-Rex said, “SoundCloud seems to claim the right to train on people’s uploaded music on their terms. They have major questions to answer over this. My question to SoundCloud is: does this include generative AI models? If so, I’ll remove my music. I would encourage others to do the same.” SoundCloud has since issued statements to media outlets asserting that “to date,” no user content has been used to train AI models, and that the rule change is meant to allow the use of user content for AI-driven platform features such as music recommendations and playlist generation. Marni Greenberg, SVP and Head of Communications at SoundCloud, said: “SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes. We implemented technical safeguards, including a ‘no AI’ tag on our site to prohibit unauthorized use explicitly.”

Surf (Surf.audio) is an AI-powered music mastering and distribution service with a transparent and artist-friendly policy on AI training. Surf’s Terms of Service and Privacy Policy do not include blanket rights to use uploaded music for AI model training, and they do not force users to grant indefinite, royalty-free AI exploitation rights. Their core service uses AI only for audio mastering, not for training generative models. Users must opt in to submit tracks for mastering, and the AI processes files without retaining them for broader training. Their policy states that user uploads are deleted after processing unless stored voluntarily (e.g., for distribution), and they claim no ownership of or repurposing rights over your music. However, if you use Surf to distribute to platforms like Spotify or Apple Music, be aware that those stores’ terms will apply even though Surf itself doesn’t exploit your music for AI. Also, note that generative AI music services like Udio and Suno do use artists’ content to train their models.

Additionally, SURF offers an AI-powered deepfake detection tool, Deepwater, integrated into its Enterprise Zero-Trust Browser. The tool aims to protect enterprises, media organizations, law enforcement agencies, and military entities worldwide from the growing threat posed by AI deepfakes. It is designed to detect whether a person is a real human or an AI imitation with up to 98% accuracy and to issue swift alerts about possible deepfake threats.

While the situation looks grim, we are not powerless. Here are some things we can do.

Use opt-out platforms. If a distributor offers an AI opt-out (a rarity, but not impossible), check the account settings for “Data Sharing” or “AI Training Permissions.” Demand written confirmation that your music won’t be used for AI purposes. We could also negotiate contracts and request an addendum prohibiting AI use.

Review the latest terms before uploading music, as policies can change. Search distributor agreements for “machine learning,” “AI,” or “license.” I strongly recommend keeping your knowledge of any company’s AI training policy up to date; by the time you read this, some of the information here may have already changed.

Here are a few red-flag phrases and AI exploitation loopholes to look out for (a short script for scanning an agreement for them follows the list):

  1. “Machine Learning” or “AI Training.” Example: “We may use your content to train machine learning models.” This explicitly allows AI to mimic your voice, melodies, or style.

  2. “Data Analysis” or “Service Improvement.” Example: “Your music may be used for data analysis to improve our platform.” This vague phrasing hides AI training behind “recommendation algorithms.”

  3. “Perpetual, Royalty-Free License.” Example: “You grant us a worldwide, royalty-free license to use your content…” This allows them to sell or use your music forever, including for AI.

  4. “Sublicense to Third Parties.” Example: “We may sublicense your content to partners for research purposes.” Those “partners” are often AI companies like OpenAI or Udio.

  5. “Derivative Works” or “Transformative Use.” Example: “We may create derivative works from your uploads.” AI clones and remixes could be defined as “derivatives.”
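For those who want to automate this check, below is a minimal sketch in Python that scans a saved copy of a distributor’s agreement for the phrases above. It is illustrative only: the file name terms.txt, the starter phrase list, and the amount of surrounding context shown are all assumptions, not tied to any particular vendor’s documents.

```python
# Sketch: scan a saved terms-of-service text file for red-flag phrases.
# Copy the agreement from the distributor's site into terms.txt first.
import re
import sys

# Hypothetical starter list; extend it with anything suspicious you find.
RED_FLAGS = [
    "machine learning",
    "AI training",
    "artificial intelligence",
    "data analysis",
    "service improvement",
    "perpetual",
    "royalty-free",
    "sublicense",
    "derivative works",
    "transformative use",
]

def scan_terms(path: str) -> None:
    """Print every red-flag phrase found in the file, with nearby context."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    for phrase in RED_FLAGS:
        for match in re.finditer(re.escape(phrase), text, re.IGNORECASE):
            start = max(0, match.start() - 60)
            snippet = " ".join(text[start:match.end() + 60].split())
            print(f"[{phrase}] ...{snippet}...")

if __name__ == "__main__":
    # Usage: python scan_terms.py terms.txt
    scan_terms(sys.argv[1] if len(sys.argv) > 1 else "terms.txt")
```

No script replaces reading the contract (or having a lawyer read it), but it surfaces the sections worth reading first.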

Technical resistance is a viable option. Tools for anti-AI audio watermarking are emerging. This involves embedding unique, inaudible identifiers into audio files to identify them or track their origin. The technique helps combat deepfakes, copyright infringement, and unauthorized AI-generated audio content. However, the technology isn’t foolproof: alterations such as compressing a file can remove the watermark. Developers have tried to make watermarks more resilient by embedding them in every section of an audio track, so they remain detectable even if the file is cropped or edited. Despite these efforts, even the most advanced watermarking techniques cannot prevent skilled and motivated attackers from removing them. Nonetheless, watermarking our tracks, files, and stems with hidden audio tags may help prove theft if a track is cloned. You may wish to look into https://audiotag.info/
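To make the idea concrete, here is a toy sketch of the oldest watermarking approach, a spread-spectrum mark: a pseudorandom sequence keyed by a secret seed is mixed into the audio at a very low level and later detected by correlation. This illustrates the principle only; production tools use far more robust, psychoacoustically shaped schemes, and as noted above, even those can be defeated.

```python
# Toy spread-spectrum audio watermark (illustration of the principle only).
import numpy as np

def key_sequence(seed: int, length: int) -> np.ndarray:
    """Pseudorandom +/-1 sequence derived from a secret seed (the 'key')."""
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=length)

def embed(audio: np.ndarray, seed: int, strength: float = 0.002) -> np.ndarray:
    """Mix the keyed sequence into the audio at a very low amplitude."""
    return audio + strength * key_sequence(seed, len(audio))

def detect(audio: np.ndarray, seed: int) -> float:
    """Correlate with the keyed sequence: near 0 means no mark,
    near `strength` means the mark is present."""
    key = key_sequence(seed, len(audio))
    return float(np.dot(audio, key) / len(audio))

if __name__ == "__main__":
    sr = 44100
    t = np.linspace(0.0, 1.0, sr, endpoint=False)
    track = 0.5 * np.sin(2 * np.pi * 440.0 * t)  # stand-in for a real recording
    marked = embed(track, seed=2025)
    print(f"unmarked: {detect(track, seed=2025):+.5f}")  # ~ 0
    print(f"marked:   {detect(marked, seed=2025):+.5f}")  # ~ +0.00200
```

Note how fragile this is: lossy compression, resampling, or pitch shifting would decorrelate the key and erase the mark, which is exactly the weakness described above.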

We can also lobby for change. Some organizations are pushing for bans on unauthorized AI training. One of these is the nonprofit, artist-led education and advocacy organization Artist Rights Alliance (ARA). They have addressed AI developers, technology companies, platforms, and digital music services in an open letter (supported by over 200 artists) urging them to stop using AI to “infringe upon and devalue the rights of human artists.” This action follows the global discussion on the responsible use of AI in music and the Ensuring Likeness Voice and Image Security (ELVIS) Act, signed into law in Tennessee on March 21, 2024. The ELVIS Act protects individual artists’ unique voice and likeness against unauthorized AI deepfakes and voice clones. Similar legislation is being discussed in the US Congress and several other states.

We should boycott companies like CD Baby, DistroKid, and TuneCore and urge fellow musicians to stop giving them our money and business. Similarly, we should use every means to expose political actors like the Trump Administration, which perpetuates corruption and ruins honest and ethical business practices.

Some of us already have music on services such as CD Baby and DistroKid. Perhaps they would be motivated to adjust their policies if their clients pressured them, for instance, in the form of letters. Here is a suggested template for a letter demanding an opt-out:

Subject: Request for Immediate Opt-Out of AI Training on Your Terms

Dear [Distributor’s Name / Support Team],

I am writing as an artist who pays to use your platform to distribute my original music. I recently reviewed your Terms of Service and noticed clauses that grant [Distributor Name] broad rights to use my music for AI training, machine learning, or “service improvement” (e.g., Section [X] of your agreement).

Take notice that I DO NOT consent to my music being used to train AI models, whether in-house or by third parties. This includes but is not limited to:

  • Generative AI tools (e.g., vocal clones, melody generation, etc.)

  • “Derivative works” or “transformative use” by AI systems

  • Data licensing partnerships with AI companies

I demand written confirmation that:

  1. My music (past and future uploads) will never be used for AI training.

  2. My account will be exempt from any such licensing, even if your terms are updated.

If you cannot provide this assurance, I will have no choice but to remove my catalog from your platform and publicly disclose your refusal to protect artists’ rights. I will also pursue legal options, as unauthorized use of copyrighted work for AI training may violate copyright law.

Please respond within [7-10 business days]. Silence will be interpreted as refusal.

Sincerely,
[Your Full Name]
[Artist Name]
[Your Email]
[Phone Number]
[Catalog/Release Example: “Album XYZ” (2025)]

CC: [Consider cc’ing legal@ or copyright@ if no response]

Sharing the letter and the response on social media or other public forums could be effective.

Throughout the history of the music business, every technological advancement has been accompanied by a need to define the ethical use of that technology. There always have been, and always will be, those who would use unethical means to exploit the rightful owners and creators of music. AI, in all its aspects, is no exception. In past articles I’ve written for DBDBD, I addressed the problems inherent in artificial intelligence. AI is (or at least can be) a handy tool for musicians in both the creative and the business/financial aspects of our work. In time, it will be as integral to our work as recordings, radio broadcasts, electric amplification, and computers. We only need to understand it and be vigilant against those who would use it against us.

If history has taught us anything, it is that arming ourselves with knowledge and solidarity, and holding our ground, are our most viable courses of action. This isn’t just about money; it’s about our survival. If they steal our music today, we will be erased tomorrow.