• AI music is fine until it starts pretending to be real people

    From TechnologyDaily@1337:1/100 to All on Tuesday, August 26, 2025 04:15:08

    Date:
    Tue, 26 Aug 2025 03:00:00 +0000

    Description:
    AI-generated music getting released under real artists names is a growing threat to creativity and the integrity of music streaming platforms.

    FULL STORY ======================================================================

    AI-generated music is becoming more widespread, but not necessarily more popular. And that's just the publicly acknowledged AI music. Now, artists are seeing their names and voices attached to music they never performed or approved, even if they passed away decades ago.

    The most recent high-profile incident occurred when English folk singer Emily Portman heard from a fan who liked her new release. Except the album, Orca, though released under her name, was entirely fake. The whole thing had been pushed live on Spotify, iTunes, YouTube, and other major platforms without her knowledge or consent.

    Portman took to social media to warn her fans about what was happening. The fact that the AI could mimic her artistic style well enough to trick some fans only added to the creep factor. It took weeks for Spotify to address the problem, and the album's listing is still visible on Spotify even though the music is gone.

    Portman joins a litany of acts, from pop artist Josh Kaufman to country artists Blaze Foley, who passed away in 1989, and Guy Clark, who died in 2016, who have had their work mimicked by AI without approval.

    It seems we've moved past the novelty of AI remixes and deepfake duets into digital identity theft with a beat. The thieves tend to release quietly, collecting whatever royalties might trickle in.

    Further, even getting the music taken down might not be enough. A few days after the initial incident, Portman found another album had popped up on her streaming page. Except this time, it was just nonsense instrumentals, with no effort to sound like the musician.

    Having scammers use AI to steal from actual artists is obviously a travesty. There are some blurrier middle grounds, of course, like never-real musicians pretending to be humans. That's where the AI-generated band Velvet Sundown stands.

    The creators later admitted the band was AI-generated, but only after millions of plays from a Spotify profile showing slightly uncanny images of bandmates that didn't exist. As the music was original and not directly ripped from other songs, it wasn't a technical violation of any copyright laws. The band didn't exist, but the royalties sure did.

    I think AI has a place in music. I really like how it can help the average person, regardless of technical or musical skill, produce a song. And AI tools are making it easier than ever to generate music in the style of someone else. But with streaming platforms facing roughly 99,000 uploads a day, most of which come through third-party distributors that rely on user-submitted metadata, it's not hard to slip something fake into a real artist's profile. Unless someone notices and complains, it just sits there, posing as the real thing.

    Many fans are tricked; some believed Orca was really Emily Portman's new album. Others streamed Velvet Sundown thinking they'd stumbled onto the next Fleetwood Mac. And while there's nothing wrong with liking an AI song per se, there's everything wrong with not knowing it is an AI song. Consent and context are missing, and that fundamentally changes the listening experience.

    Now, some people argue this is just the new normal. And sure, AI can help struggling artists find new inspiration, fill in missing instrumentation, suggest chord progressions, and provide other aid. But that's not what's happening here. These are not tools being used by artists. These are thieves.

    Worse still, this undermines the entire concept of artistic ownership. If someone can make a fake Emily Portman album, any artist is at risk. The only thing keeping these scammers from doing the same to the likes of Taylor Swift right now is the threat of getting caught by high-profile legal teams. So instead, they aim lower. Lesser-known artists don't have the same protections, which makes them easier targets. And more profitable, in the long run, because there's less scrutiny.

    And there's the issue of how we as music fans are complicit. If we start valuing convenience and novelty over authenticity, we'll get more AI sludge and fewer real albums. The danger isn't just that AI can mimic artists. We also have to worry that people will stop noticing, or caring, when it does.



    ======================================================================
    Link to news story: https://www.techradar.com/ai-platforms-assistants/ai-music-is-fine-until-it-starts-pretending-to-be-real-people


    --- Mystic BBS v1.12 A49 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)