AI Music

How the Music Industry is Building the Tech to Hunt Down AI-Generated Songs (theverge.com) 75

The goal isn't to stop generative music, but to make it traceable, reports the Verge — "to identify it early, tag it with metadata, and govern how it moves through the system...."

"Detection systems are being embedded across the entire music pipeline: in the tools used to train models, the platforms where songs are uploaded, the databases that license rights, and the algorithms that shape discovery." Platforms like YouTube and [French music streaming service] Deezer have developed internal systems to flag synthetic audio as it's uploaded and shape how it surfaces in search and recommendations. Other music companies — including Audible Magic, Pex, Rightsify, and SoundCloud — are expanding detection, moderation, and attribution features across everything from training datasets to distribution... Vermillio and Musical AI are developing systems to scan finished tracks for synthetic elements and automatically tag them in the metadata. Vermillio's TraceID framework goes deeper by breaking songs into stems — like vocal tone, melodic phrasing, and lyrical patterns — and flagging the specific AI-generated segments, allowing rights holders to detect mimicry at the stem level, even if a new track only borrows parts of an original. The company says its focus isn't takedowns, but proactive licensing and authenticated release... A rights holder or platform can run a finished track through [Vermillio's] TraceID to see if it contains protected elements — and if it does, have the system flag it for licensing before release.
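The article doesn't describe TraceID's internals, but stem-level matching of this kind is commonly built on embedding similarity. A minimal sketch of the idea, with entirely invented feature vectors, names, and threshold (none of this is Vermillio's actual implementation): each stem of a new track is embedded as a vector and compared against a catalog of protected stems, and anything above a similarity threshold gets flagged for licensing review.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_stems(track_stems, catalog, threshold=0.9):
    """Return (stem_name, catalog_id) pairs whose similarity exceeds the threshold."""
    flags = []
    for stem_name, vec in track_stems.items():
        for cat_id, cat_vec in catalog.items():
            if cosine(vec, cat_vec) >= threshold:
                flags.append((stem_name, cat_id))
    return flags

# Hypothetical 3-dimensional embeddings (real systems use far larger ones).
catalog = {"artist_A_vocal": np.array([1.0, 0.0, 0.0]),
           "artist_B_melody": np.array([0.0, 1.0, 0.0])}
track = {"vocals": np.array([0.98, 0.10, 0.05]),   # close to artist A's vocal signature
         "melody": np.array([0.10, 0.20, 0.95])}   # not close to anything in the catalog

flags = flag_stems(track, catalog)  # only the vocal stem is flagged
```

The interesting policy question is the threshold: set it low enough and, as commenters below point out, almost any short segment will match something in a large catalog.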

Some companies are going even further upstream to the training data itself. By analyzing what goes into a model, their aim is to estimate how much a generated track borrows from specific artists or songs. That kind of attribution could enable more precise licensing, with royalties based on creative influence instead of post-release disputes...

Deezer has developed internal tools to flag fully AI-generated tracks at upload and reduce their visibility in both algorithmic and editorial recommendations, especially when the content appears spammy. Chief Innovation Officer Aurélien Hérault says that, as of April, those tools were detecting roughly 20 percent of new uploads each day as fully AI-generated — more than double what they saw in January. Tracks identified by the system remain accessible on the platform but are not promoted... Spawning AI's DNTP (Do Not Train Protocol) is pushing detection even earlier — at the dataset level. The opt-out protocol lets artists and rights holders label their work as off-limits for model training.

Thanks to long-time Slashdot reader SonicSpike for sharing the article.


Comments Filter:
  • Witch Hunting (Score:5, Insightful)

    by allo ( 1728082 ) on Sunday June 22, 2025 @01:57PM (#65467917)

    It won't take long for the first false positive to occur, just look at the mistakes made by YouTube's Content ID system.

    This could also lead to a dangerous system where companies (let's be honest, it's not about the artists but about the music industry) trademark certain musical styles and sue anyone who uses them, regardless of whether they knew someone else had used that style before. That's bad news for independent artists, regardless of whether they use AI or not.

The problem is you can't trademark a style. This is just the music types doing what they always do: trying to abuse copyright because they can't keep up with the times. They will fail at it, as they always do.
      • Re:Witch Hunting (Score:4, Insightful)

        by allo ( 1728082 ) on Sunday June 22, 2025 @03:34PM (#65468153)

        They may or may not succeed in making this a law, but the article suggests they certainly want to deplatform tracks that have stems similar to those in their portfolio. If YouTube and Spotify cooperate, independent artists will have to pay their tax for having music similar to anything in the vast portfolio of large recording companies. Music isn't that unique. If one has a large portfolio and looks at short segments, one always finds something similar and then can demand to be compensated for that similarity.

The problem is there are only so many notes you can string together to make sound. There's a reason those can't be trademarked.
          • by allo ( 1728082 )

            It is also not reasonable. But suing people for music playing in the background of their home videos is not reasonable either, nor is wanting kindergartens to license the music they sing.

If someone pays full price for the latest Taylor Swift CD, there should be a label on the product stating that it is 95% actual Taylor Swift with actual musicians/composers, and not her singing a few partial song verses on top of a fully AI-done sound sample, AI-generated backing tracks, AI-generated musical score, etc.

What's at stake here is a century-plus revenue stream of a near-guild-based product being threatened by AI-generated music.

              With AI generated music, anyone could start a streaming business and sell the 'b

That unfortunately doesn't stop trial lawyers, judges, and juries from ignoring legal precedent, and the advice of musicologists who study this stuff in excruciating detail (music is *very* mathematical when you break it down; humans enjoy it and tend to write it intuitively, but under the hood our brains are just giant pattern-matching machines, and all the symmetries and patterns in music give them a nice big tickle, which, combined with the cultural stuff, "I like this genre" etc., makes music), and can tell you t

Pair that with clueless judges who don't know anything about copyright, and we have modern copyright law.
      • by Zak3056 ( 69287 )

The problem is you can't trademark a style.

        Trademark? No. But copyright infringement? Yes. See "Blurred Lines" for more detail.

        • Exactly the example I was thinking of.

This argument is dumb. What's the point of a style? For others to copy the source. If you don't achieve mimicry, you're not stylish! Therefore you can't expect a monopoly on the style you claim to own.
          • by Zak3056 ( 69287 )

This argument is dumb. What's the point of a style? For others to copy the source. If you don't achieve mimicry, you're not stylish! Therefore you can't expect a monopoly on the style you claim to own.

            I don't disagree it's stupid, but it's the current state of the law. I don't know what to tell you, other than "I reject your reality and substitute my own" sounds great, but tends to not actually do anything.

    • That's bad news for independent artists

      That's exactly what an AI would say. *suspicious eyes*

      • by allo ( 1728082 )

Is that what an independent AI or a corporate AI would say?

        You can bet the music industry will employ their own "licensed AI." Any company with vast amounts of content will attempt to develop "ethical AI" and then argue that all other AIs are unethical and should be banned.

        That's likely why they prefer establishing a "pay for each second of similar music" system rather than banning AI music entirely. This allows them to sell AI-generated music, ensuring that every second of audio similar to content they already own

    • They will find a way to use this on humans. If you do anything fitting their pattern, then you are stealing their influence on you and must pay for the impression left on your brain by them. If you can do the same with a machine why can't you do it with the real thing?

    • Re:Witch Hunting (Score:4, Interesting)

      by AmiMoJo ( 196126 ) on Monday June 23, 2025 @06:01AM (#65469283) Homepage Journal

      Not just mistakes with YouTube, outright fraud too. A few days ago a video I uploaded had a copyright claim for a sine wave. A test tone. They allowed some asshat to claim ownership of that.

      • by allo ( 1728082 )

If you just upload white noise, you can be sure you will someday violate copyright. If they really match on sub-second stems, that becomes far more likely than waiting for a work of Shakespeare to appear in the output.

    • This could also lead to a dangerous system where companies (let's be honest, it's not about the artists but about the music industry) trademark certain musical styles and sue anyone who uses them, regardless of whether they knew someone else had used that style before.

      Good. You can see what has been happening. You just don't REALLY believe it can happen that way since it is CLEARLY unfair and rigged for the owners... and yet, it absolutely WILL happen. All pretense at Democracy is thrown out the window with the most recent "election". Now, let's see how far and how fast we fall... and we are falling. It is no longer theoretical.

  • by MpVpRb ( 1423381 ) on Sunday June 22, 2025 @02:26PM (#65467995)

    AI generated stuff may be fun to make and listen to, and it may be a fad for a short time, but it should be honestly labelled.
    The article mostly references AI generated frauds, designed to extract money from streaming services.
    For those with no musical skill, tech can allow them to make stuff that almost sounds like music if you aren't too critical. It can be fun to do, but don't expect to get famous or make a lot of money.
    I tried for years to use sequencers and virtual instruments to produce my compositions. At best, they sounded OK, but were missing the magic that happens when real musicians play live together.
    It's also interesting to observe that when the legendary musical genius Frank Zappa tried using sequencers and virtual instruments, his results also sounded robotic and unmusical. Even when they made small mistakes, his bands always sounded better.

  • What if a competing AI managed to convince this AI to shut itself down? Or better yet, what if it convinced this AI that these aren't the songs you're looking for and it can go about its business. Move along. Move along.

    • by allo ( 1728082 )

      That doesn't work as entertainingly as you'd think, but the relevant concept is adversarial noise. You add crafted noise to the song (ideally inaudible to humans) that steers the classifier away from labeling the song as AI. Of course, that's a cat-and-mouse game, and rarely as robust as you'd wish; look at Glaze, for example.
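The adversarial-noise idea the comment describes can be shown on a toy example. Everything here is invented for illustration: a "detector" that is just a logistic classifier over a made-up 3-feature vector, attacked FGSM-style (perturbing the input against the gradient of the score). Real audio classifiers and real attacks like Glaze's are far more involved.

```python
import numpy as np

# Toy "AI-music" detector: logistic classifier over a feature vector.
# Weights and features are arbitrary, chosen only for the demo.
w = np.array([2.0, -1.0, 0.5])
b = -0.2

def detector(x):
    """Returns the toy probability that a track is AI-generated."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

x = np.array([1.0, 0.5, 0.8])   # features of a track the detector flags
score = detector(x)             # well above 0.5: flagged as AI

# FGSM-style adversarial noise: step against the input gradient of the score.
# For a linear model the input gradient is just w, so the perturbation
# is -eps * sign(w): small per-feature nudges that flip the decision.
eps = 0.8
x_adv = x - eps * np.sign(w)
score_adv = detector(x_adv)     # below 0.5: evades the toy detector
```

The cat-and-mouse aspect follows directly: retraining the detector on such perturbed examples (adversarial training) shifts the gradient, and the attack has to adapt again.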

  • Why (Score:5, Insightful)

    by Valgrus Thunderaxe ( 8769977 ) on Sunday June 22, 2025 @02:54PM (#65468057)
    Does the "music industry" need to hunt down AI generated songs? It's not their songs and it's none of their business where they came from.
But we can't have AI generating new songs; how else will we take 95% of their money?
    • Re:Why (Score:4, Informative)

      by 0123456 ( 636235 ) on Sunday June 22, 2025 @03:32PM (#65468147)

Because The Music Industry must be able to continue rent-seeking and collecting most of the revenue that actual musicians create.

      I know a guy who was a moderately successful musician in the 90s with a few songs near the top of the charts back then. He loves AI because he no longer needs a band and can create entire songs by himself.

      That terrifies The Music Industry. Because its income is based on gatekeeping and rent-seeking.

      • by allo ( 1728082 )

        They likely want to limit it early on. For two decades, people have been able to self-publish music on the internet and no longer need the music industry. For some reason, this never posed a significant threat to their income or control, as the ways consumers discover music remained largely the same. The challenge was more about how self-publishing musicians could find an audience.

        Now a new frontier is emerging, a whole new category of music. This could include not only AI-generated music created by humans

        • by 0123456 ( 636235 )

          I think it will be difficult for the music industry to push "this song was created by an AI in which 0.1% of the training data was a song we own", because that would pretty much destroy the AI industry, which is based on being able to use other people's content without paying for it, and the tech billionaires are richer than the music billionaires. You're right, though, that tech companies like YouTube might agree to block AI music "because of unresolved legal questions" or some such nonsense.

          • by allo ( 1728082 )

            I don't think they will block. They will agree to monetize it for the people holding the copyright for the stems that are allegedly similar. The problem is that if you want to make music for more than just yourself, you probably want it on YouTube, because that's where the people are.

          Maybe I'm wrong to equate this, but what is the difference between me creating a song on my own versus me using AI to create a song? Neither of those interactions actually requires RIAA permission. I can create and record my own music and give it away all I want. So why can't I also use AI to make and give away music? See what I'm saying?

          I could also make and record my own music and charge people for it. That doesn't require me signing with the RIAA. Sure, discovery may be really hard, but then again, maybe

          • Actually, now that I think about it, I just described Myspace, right?

          • Creating it on your own requires thought and takes skill. Creating it with AI does not. What's the point of paying to listen to a song you could have made yourself?
            • by allo ( 1728082 )

              "Creating it on your own requires thought and takes skill. Creating it with AI does not."

              The question is whether you are hearing the song for the labor that went into it or for the song itself. I wouldn't say it needs no skill, because a good AI song still requires thought; it just doesn't require finding five people trained on different instruments to cooperate on it, but instead gives you the tools to do it alone. The really good songs will also require you to know about music, audio, processing ... that's not what yo

                You are missing the point. Let me put it another way... A website sets up and charges $10 a month to hear all the AI music you want. Since the barrier to entry is so low, another site will pop up at $9 a month, then one at $7, and so on. There will be no discernible quality difference between them, and they may even use the same AI. This is in direct contrast to a record label, which has to actually put a great deal of effort into finding the next big thing.
                • by allo ( 1728082 )

                  I'm not sure if this is a direct reply to this post. I was answering the question "Why should I listen to something when I can create something on my own?"

                  The AI-powered radio is definitely something that will happen, but you will always get better results with at least some cherry-picking, and most people who publish their work will probably put a lot more effort into it.

                  You can see it from the other side, why should I publish something I am not proud of? Just clicking "Get music" and then clicking "Upload" is som

                    But what is there to do? No matter how many prompts, it's going to be less than learning to play an instrument and playing an actual song. Also, I don't really see how you can describe a sound so that it comes out to your exact specification. You may say something like "play a crunchy sound", "no, crunchier", "no, lower" and then be done, as opposed to adjusting 50 knobs to make that sound exactly like you want.
                    • by allo ( 1728082 )

                      That's a really good question for future systems, as I also see that current ones lack good control over the results.

                      In the end, you have a similar problem like you had with images: An image can convey more meaning than a thousand words. What did people do? They developed Image-to-Image, Controlnets (for things like poses, normal maps, depth, segmentation, etc.), LoRA for styles and characters, and now you can control all the details that won't work with just a text prompt.

                      I'm not sure about the controls. I

    • Does the "music industry" need to hunt down AI generated songs? It's not their songs and it's none of their business where they came from.

      Because of music fraud. There's a very real problem right now of fake music being uploaded under real artists' names for the purpose of click farming. It's the same as AI-generated books that are nothing but incomprehensible trash pretending to be from real authors to capitalise on their names.

      The problem is that the bar for checking is currently "is this music, yes/no", and beyond that, uploads to streaming sites are generally allowed. It then relies on a manual process to call it out and get it taken down

      • It would seem the issue there is "uploaded under real artists names", not whether it was AI generated.

        It would also seem relatively easy to prohibit uploading a track labelled "Taylor Swift" unless it's from the official account, matches an official audio signature, or is clearly tagged as a cover?

        • So protection only for the most famous people and force everyone else to jump through extra hoops? Tell us you don't realise you're contributing more to centralised control of an industry without telling us. Also remixes and covers are a thing that are allowed. Are you now going to get every artist to jump through hoops asking for permission from every single person on every single platform independently?

          Also also there's the topic of algorithm farming - the process of creating songs similar in style and su

  • by bradley13 ( 1118935 ) on Sunday June 22, 2025 @04:10PM (#65468233) Homepage
    Way too much music could easily be replaced by AI. The music industry is worried...
    • The future of human music may be musicians as live performers rather than as recording stars.
    • by mjwx ( 966435 )

      Way too much music could easily be replaced by AI. The music industry is worried...

      Whoa, are you honestly suggesting that 40 years of simplifying music, cookie cutter performers, formulaic writing and autotune have made a market where people can't tell the difference between human and computer produced music?

      I guess they dug their own graves here.

      I think the music industry's problem isn't that AI is producing music; getting rid of the human element is a wet dream of music execs. Computers don't get drug habits, they don't have delusions of grandeur and think they don't need a label,

  • The countdown begins to record companies using AI to create "collaborations" with artists, with no compensation.
  • by FudRucker ( 866063 ) on Sunday June 22, 2025 @05:15PM (#65468373)
    That their cozy money-making monopoly/racket is under threat by people who don't want to be a cog in their machine. I bet the buggy-whip companies felt the same way about the invention of the automobile. Hey RIAA, it is none of your business what other people do with their music, and your business model is obsolete anyway. I can compose my own music and do with it what I want, and you have no say in the matter.
  • by RossCWilliams ( 5513152 ) on Sunday June 22, 2025 @07:13PM (#65468599)

    with royalties based on creative influence

    If you are a musician you better be careful in interviews about who your influences were. The corporations who own their music are going to come after you.

  • ...from the music industry. They will never tell you anything that is not a bloody lie.

    --Lemmy

  • Music has value because it requires a skill to create it, so the people who don't have that skill pay for it.

    Why would anyone pay for AI music they could make themselves or could even be made by a child? There is nothing that went into it.
    • People do not pay for music because it takes skill they lack to create it.

      People pay for music because they feel it's worth money to get to listen to it.

      People will pay for a service which AI-generates music for them, especially if they can give some kind of input (good old thumbs up and down would suffice) and the system learns what kind of music they would like to listen to over time and gives them more of that.

      Yes, I do appreciate music for skill that went into it, but I also appreciate music

    • Convenience and discovery are two reasons.

      Also, not everyone wants to know how the sausage is made; They just want to enjoy it in blissful ignorance.

      • And that type of music has always existed. We called it 'elevator music'. Now it will be called AI music and it will be everywhere. It's not going to be the kind of music that people will pay $20 for a disc of because there will be so much of it and there will be such a low bar to entry making it that it will be free.
  • Imagine tens of thousands of old jazz records that incorporate direct references to well-known melodies and hooks during a solo as a tip of the hat before quickly switching to something else, retroactively being sued for appropriation by other music publishers whose song they "infringed" because one of the players evoked the melody during a solo.

    This scheme of splitting up songs into their stems and attempting to claim copyright ownership of specific elements truly feels Orwellian, and could lead to so many frivolous lawsuits as well as false positives that one could question whether it'll really lead to better outcomes for musicians and songwriters.

    One thing is for sure: if such a monstrosity comes to pass, it may temporarily lead to better outcomes for the lawyers and corporate bean counters who control it, which is probably why these tech companies are trying to sell them the service.

    But in reality it feels like yet another arguably futile attempt by middlemen and gatekeepers to somehow figure out how to monetize music in ways that increasingly reek of desperation. Because at the rate of new music being produced (over 4 million new tracks per month) the amounts that can be collected are going to mathematically keep getting smaller and smaller, while the cost of administering all of this and dealing with false positives and litigation is only going to increase. Arguably to the point where it won't make economic sense.

    Besides which, real-time generative AI is making giant strides to be able to create music that's passable, soon to be available on-demand and on-device and will probably satisfy the type of casual listeners who play hour-long mixes of lo-fi chill music as background during their work, at the health club or during a commute.

    Regardless, at the very least the information in this article feels like it's yet more confirmation that copyright in its present form appears to have outlasted its useful life cycle and has fallen victim to another type of enshittification, something which certain pundits might argue was inevitable and a long time coming.
  • Add a human element to your AI-song to defeat the "AI-generated" label.

  • New artists don't need the big labels to get the reach that they could provide in generations past. The Internet and the ability to record/mix-down in small studios has revolutionized that. And booking shows, particularly the big festivals, has gone rogue too. The only hope of survival for these big labels is to squeeze every last bit of juice out of the older content that they own. They're doomed to fail given the advent of AI. Music can be generated faster than the labels can sue.

    I'm not even a littl

  • Any detection will likely look for certain "fingerprints" placed into the audio feed. It doesn't seem like it would be hard to develop a tool that takes the AI-generated stuff and "dirties" it up enough to eradicate any of those fingerprints, yet remains indistinguishable to normal people. Basically like how a JPEG gives a super close depiction of an original image but has lost some aspects of the original.
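A toy example shows why naive fingerprints are fragile in the way this comment suggests. The "fingerprint" below (finely quantized spectrum bins) and the "dirtying" step (a 0.1% gain change, roughly what a re-encode might introduce) are invented for illustration; production systems like Audible Magic's are specifically engineered to survive such transforms, which is exactly the cat-and-mouse game.

```python
import numpy as np

def fingerprint(signal, bins=8):
    """Toy fingerprint: the first few magnitude-spectrum bins, quantized.
    Deliberately fragile; real perceptual hashes are far more robust."""
    spectrum = np.abs(np.fft.rfft(signal))[:bins]
    return tuple(np.round(spectrum, 1))

t = np.linspace(0, 1, 256, endpoint=False)
original = np.sin(2 * np.pi * 5 * t)   # stand-in for a "watermarked" track
dirtied = original * 0.999             # inaudible 0.1% gain change

broken = fingerprint(original) != fingerprint(dirtied)       # exact match broken
tiny = np.max(np.abs(original - dirtied)) < 0.002            # waveform barely changed
```

The defender's answer is to quantize more coarsely and match on perceptual features, which in turn raises the false-positive rate the thread's first comment worries about.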
