A new music group that had reached one million listeners on Spotify turned out to be a fabrication.
The band Velvet Sundown, which garnered over a million streams on Spotify, made headlines—only for it to be revealed that the group didn’t exist at all. All of the songs, promotional images, and even the band members’ backstories were entirely generated by artificial intelligence.
According to The Guardian, the project was initially introduced as “an artificial construct guided by human creativity” and released two albums, Floating On Echoes and Dust And Silence. The music resembled classic folk rock, in the vein of Crosby, Stills, Nash & Young. But the story soon became more complicated.
A person unofficially associated with the project confirmed in an interview that the songs were created using the AI tool Suno and that the entire project was meant as an “artistic prank.” At first, Velvet Sundown’s official channels denied the allegations, claiming that the group’s identity had been hijacked by others. But hours later, they issued an official statement describing the project as “not entirely human, not entirely machine,” but something in between.
The Velvet Sundown case has sparked serious discussions about transparency in the digital music industry. Critics argue that the lack of legal requirements for labeling AI-generated content on platforms like Spotify leaves audiences unaware of the true origins of what they’re hearing.
Roberto Neri, CEO of the Ivors Academy, warned that projects like Velvet Sundown—created without direct human authorship—raise critical questions about copyright, consent, and transparency. He emphasized that while AI can enhance musical creativity when used ethically, the industry is currently facing much deeper challenges.
Sophie Jones, Chief Strategy Officer at BPI (the British music industry trade body), called for clear labeling of AI-generated works: “AI should serve human creativity, not replace it.”
Meanwhile, Liz Pelly, author of Mood Machine, cautioned that independent artists are particularly vulnerable to exploitation, as AI models can be trained on their work without consent.
In the case of Velvet Sundown, it remains unclear what data the underlying AI models were trained on—a gap that critics say may signal disregard for the rights of independent musicians.
For some, the emergence of such projects represents a natural step in the fusion of music and technology. But the absence of clear regulations increases the risk of misuse. Jones stated: “Tech companies have fed artistic works into AI models without permission or compensation, in order to compete with human artists.”