Like many other kids in the 1970s, I learned important lessons about the perils of bootlegging and the meaning of authenticity from a two-part episode of What’s Happening!!, when Rerun is caught taping a Doobie Brothers performance in a high school gymnasium: Don’t get mixed up with the criminal syndicates that force teenagers to steal intellectual property; you may not be so fortunate as to have a cane-wielding Jeff “Skunk” Baxter come to your rescue, despite your transgressions and your ingratitude.
Presumably these episodes were part of the entertainment industry’s response to the advent of portable tape recorders; they were a kind of progenitor of the 1980s “Home Taping Is Killing Music” campaign. Media companies seemed to readily assume that only intimidation tactics would prevent kids from using technology against them — that the value of their product depended on restricted access, secured through implied force and guilt. (Michael McDonald would be so disappointed in you ...) They didn’t press the idea that taped music entails a loss in fidelity, probably because that would have hampered the nascent cassette sales market. Instead they half-heartedly tried to impose a “correct” way of using media technology on a cohort that was just getting their hands on it, doing more to enhance that technology’s apparently subversive promise than to limit how it was used. Maybe that was the idea all along.
Something similar seems to be going on with “AI-generated” media and its impact on “legitimate” media, as if these were clear and distinct opposites. A recent 404 Media report by Jason Koebler about “leaked” Harry Styles songs being sold on Discord-driven black markets revolves around this distinction, devoting many paragraphs to forensic analysis of various sound waves produced by “real” and “fake” recordings. But it is not as though sophisticated “AI” technology has ever been necessary to falsely attribute tracks to artists or to make plausible fakes. You didn’t need AI to make people in 1976 think Klaatu were the Beatles, just a story-hungry press.
And you didn’t need any sort of digital technology to create a semi-clandestine, pseudo-exclusive market for bootlegs either. Sometimes there would be a case of cassettes with blurry mimeographed labels behind the counter at a used record store, or a rack tucked in the corner of a head shop. The bootleg market back then seemed to work on some of the same principles as the leaked song market Koebler describes: It took advantage of fans who wanted to purchase their way into some new level of devotion or obsession, the possibility of belonging to a semi-illicit society that had special access. By buying a Memorex tape of Led Zeppelin’s 1979 Knebworth performance from a flea-market stall out behind the Q-Mart, 11-year-old me could suddenly belong to an in-the-know elite, so much more advanced than anyone content to just listen to In Through the Out Door. The fact that everything about the tape sounded terrible didn’t compromise its occult allure for me at first, but eventually it made me feel duped and embarrassed and I ended up taping over it. (Unfortunately I didn’t also then get over the idea that I might make up for my nonexistent personality by amassing obscure rarities to brandish. I’m still working on it.)
In Rerun’s time, it wasn’t as though the very existence of tape recorders made it possible to sell any kind of cassette as a potential bootleg, under the premise that someone could have carried it into a Doobie Brothers concert in a Panasonic home recorder. Yet the leaked-song sellers are trying a variant on that as a sales pitch. As Koebler explains:
The people selling these songs are also posting both snippets of what may be legitimate leaks as well as snippets that could be AI-generated, interspersing songs that are possibly AI-made with tracks that could possibly be real, unreleased songs. To complicate matters even more, the same people who are selling leaks they insist are genuine are also posting tracks they disclose are AI-generated in an attempt to prove their leaks are real. Their argument is that the AI tracks sound bad in contrast to the “real” leaks, therefore proving they’re authentic.
As if all fandoms don’t exist precisely to facilitate a state of perpetual squabbling, Koebler claims that “the mere specter of AI has fully split” the One Direction fandom, “driving people nuts.” But AI seems more like a red herring than a specter in this case. If anything, “AI” works here as a kind of rhetorical magic meant to authorize the same kind of determined suspension of disbelief necessary for any con to succeed. The article is pitched toward the idea that better AI means more convincing fakes, which means more confusion in the marketplace for “genuine” entertainment. But once you leave the realm of official products — and there is no confusion about which material has been officially released — you are no longer in a marketplace that works on rational claims regarding quality or integrity but one that is largely about hopes and fantasies, and murky, half-acknowledged affects. You are at the Q-Mart looking through bootlegs, wondering who they might impress.
It seems to me that the lesson of the Styles “leaks” is not that AI might blur the line between real and fake so much as redraw it more emphatically. The leakers have used the general climate of AI hype — which usually aims to dupe the much bigger fish of venture capitalists, government agencies, and institutional investors — to lull fans who already want to believe they could have special access to secret materials into accepting the limits of generative models as defining the realm of the real beyond them. The idea seems to be that we all know how powerful and unstoppable AI has become, and of course anyone trying to fabricate anything would use AI technology (and not the timeless tricks of the grifter trade). So if a given generative model can’t seem to produce a particular sort of artifact on demand, any examples of that artifact have to be genuine (whatever that could mean in such a context).
Obviously, it would take some motivated reasoning to fall for this. (What? The AI-generated Mark Rothko above has not convinced you that I have real, never-before-seen Rothkos in my possession that I’d be willing to show you — for a fee?) But it seems indicative of how the continual evocation of AI’s magical capabilities could produce a narrowing of our imaginative horizons, a retrofitting of computational capacity and technique over what we believe artists can do.
The elaborate comparison of AI Harry Styles with the “real” Harry Styles doesn’t seem to establish the priority of either but rather their mutual interdependency. “Authenticity” depends on the quality and availability of “fakes,” just as “live” music depends on recordings. We will now only know what Styles “really” sounds like if we can compare his singing with a computer simulation, much as we know what the Grand Canyon or the Mona Lisa “really” looks like only by contrasting it in our minds with all the photographs we have already seen. The “realness” is not simply a straightforward matter of identity or difference — the real must resemble the fake enough for the subtle distinctions between the two to emerge or take on a stronger meaning.
Of course, trying to fix the “reality” of an artifact in the abstract, divorced from any specific context, seems somewhat pointless. The definition of what counts as real shifts depending on the situation and the stakes; the “aura” of any particular work is always historically contingent. What if Styles admitted the leaks were “real” clips but that they were only made to seem old and rare, and were fabricated as a publicity stunt? Would they seem more or less real to fans? Would anyone care about them in the same way if Styles issued official versions tomorrow? Isn’t interest in the leaks intensified by their indeterminate status?
In Bunk, Kevin Young offers this description of the audiences willingly scammed by P.T. Barnum: “Those who paid to see the humbug surely experienced a number of things, not least of which was a feeling of being fooled, but also a not unpleasant realization at how foolish they had been to be so eager. How could I have believed in mermaids?” Can you hear the mermaids singing, or won’t they sing to you?
I experienced this same thing months ago in a server for a Frank Ocean leak. The seller was booted for allegedly hawking AI songs for money, but then the ban was overturned, and it turned out the seller was selling real leaks. But the impression that the leak was AI actually affected how people perceived it — “this is mid because it’s AI” — so when it was revealed to be real, and not AI, it felt even more disappointing. (It turns out a scrapped demo is often scrapped for a reason!) It was truly disorienting to witness, and to acknowledge that I had no way of telling by ear whether it was real or fake. I was convinced it was real, then convinced it was AI, and then convinced again that it was real. I could not trust my own judgment; I could only trust what I was being told. Obviously, like you say, it’s one thing in the realm of bootlegs and another in official releases — but it made me realize that the impression that something is falsified media can affect how you perceive it even after you learn it’s real.