Welcome to the latest installment of “Technological Panic for People Who Still Use Paper Maps.” Today’s guest of honor is Murphy Campbell, a folk musician who—prepare your fainting couches—discovered that the internet is a lawless wasteland where people steal things. The Verge is framing this as a “brewing storm” around AI and a “broken copyright system,” but let’s be real: it’s mostly just a story about how Spotify is essentially three raccoons in a trench coat and how we’ve forgotten that “AI detectors” are about as scientifically accurate as a mood ring.
First, let’s talk about the “shock” that AI could possibly target a folk musician. The assumption here is that AI has some sort of aesthetic dignity—that it only wants to replicate Drake or The Weeknd because it has a taste for Top 40. Campbell seemed under the impression that folk music had a “barrier” to entry. Newsflash: if it’s on YouTube, it’s data. If it’s data, a scraper will find it. Believing your genre is too “authentic” for an algorithm to mimic is the ultimate hipster hubris. AI doesn’t care about your artisanal, hand-crafted vocal fry; it just sees a frequency map and thinks, “I can math that.”
Then we have the heavy lifting done by “AI detectors.” The article mentions that two different detectors “supported her suspicions.” This is groundbreaking journalism, provided you ignore the fact that AI detectors are notoriously unreliable. These tools are the digital equivalent of a “guilty” verdict from a Magic 8-Ball. They frequently flag the US Constitution and the Bible as AI-generated because, turns out, humans and machines both like patterns. Relying on an AI detector to prove a copyright claim is like using a divining rod to find a leak in your plumbing—you’re mostly just pointing at things and hoping for a narrative.
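For the curious: the "humans and machines both like patterns" failure mode is easy to demonstrate. Real detectors use model-based perplexity scores, but the following toy sketch (a crude self-referential proxy, not any actual detector's method — the function names and the threshold are invented for illustration) shows the core problem: formulaic, repetitive human prose scores as more "machine-like" than varied prose.

```python
import math
from collections import Counter

def naive_perplexity(text: str) -> float:
    """Toy stand-in for the perplexity scores real detectors use:
    average negative log-probability of each word under the text's
    own unigram frequencies. Repetitive text scores LOW."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return sum(-math.log(counts[w] / total) for w in words) / total

def toy_detector(text: str, threshold: float = 2.1) -> str:
    # Low perplexity == "too predictable" == flagged as AI.
    # The threshold is arbitrary, which is half the problem.
    return "AI" if naive_perplexity(text) < threshold else "human"

# Formulaic human boilerplate (legalese, scripture, constitutions)...
legalese = ("the party of the first part shall notify the party of "
            "the second part and the party of the first part shall "
            "indemnify the party of the second part")
# ...versus varied prose, where every word is doing something new.
varied = "raccoons in trench coats rarely file accurate royalty statements"

print(toy_detector(legalese))  # flagged as "AI" — it's 1850s legal English
print(toy_detector(varied))    # passes as "human"
```

The point isn't that real detectors are this crude; it's that any scheme rewarding "unpredictability" will flag the most carefully patterned human writing — which is exactly why the Constitution keeps failing the Turing test.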
The article also takes a swing at the “broken copyright system.” While it’s fashionable to bash copyright law, the system isn’t “broken” because a fake song ended up on Spotify; it’s working exactly how the digital distribution model intended. Services like DistroKid and TuneCore have made it so easy to upload music that literally any teenager with a laptop and five dollars can pretend to be a folk legend. This isn’t a “copyright troll” problem; it’s a “low barrier to entry” problem. We spent twenty years demanding the democratization of music distribution, and now that the “demos” (the people) are using it to upload AI-generated garbage, we’re suddenly shocked that the gates have no guards.
Finally, the term “copyright troll” is used here with the grace of a sledgehammer. Traditionally, a copyright troll is someone who acquires copyrights specifically to sue others for a living (hoarding patents for the same purpose makes you a patent troll, a different beast). In this case, we just have a common digital impersonator. But “Impersonator Uses Basic Web Tools” doesn’t get the clicks that “AI Copyright Troll” does.
The reality? This isn’t a storm; it’s a drizzle. Musicians have been dealing with bootlegs, covers, and identity theft since the days of sheet music. The only difference is that now we can blame a “black box” algorithm instead of a guy in a basement with a dual-cassette deck. If you’re a musician in 2024 and you’re shocked that your public YouTube videos are being used to train models, I have a bridge in Brooklyn to sell you—and yes, I’ve already uploaded a generative AI photo of it to Instagram.
In the end, Murphy Campbell’s situation is annoying, sure. But framing it as a systemic failure of technology rather than a predictable byproduct of the “upload everything, vet nothing” era of streaming is just lazy storytelling. Welcome to the future: it’s exactly like the past, just with more processing power and fewer royalties. Regardless of whether a human or a bot stole your song, the payout from Spotify is still going to be somewhere around $0.003 a stream. Now that’s a fact you can take to the bank—if the bank hasn’t been replaced by an AI yet.
