An Editorial by Dawoud Kringle
Just when you thought it couldn’t get worse, the corporate overlords took things to the next level. Music distributors have been sneaking clauses into their user agreements that allow them to use our music to train AI models.
This is how it works. Many distributors (DistroKid and TuneCore, for example) include broad licensing terms in their contracts, granting themselves the right to use uploaded music to train AI models. They describe this with vague language such as “machine learning,” “data analysis,” or “service improvement.” Artists often consent unknowingly simply by agreeing to the terms, and the user agreements frequently offer no way to opt out of AI training. These clauses usually permit unrestricted, perpetual use of the music for AI without paying royalties or any other compensation to artists or rights holders. Some distributors sell or otherwise grant AI companies such as OpenAI or Google DeepMind access to their music libraries to train generative models, including vocal clones, melody generators, and the like. Stripping the metadata from uploaded files changes nothing: the raw audio is still used. These models can then replicate an artist’s unique style without attribution.