Spotify selves
I've written a few things before about algorithmic recommendation, most of which boil down to the idea that such systems reshape users to experience desire on schedule and in pre-formatted ways. This standardization makes it easier for platforms to sell their users on to advertisers: What those users are presumed to "want" has been turned into a kind of rationalized, structured data and A/B tested within algorithmic feeds.
Eric Drott's paper "Music as a Technology of Surveillance" lays out how aspects of this process work on Spotify. What you stream and when you stream it and how exactly you interact with playlists and other recommendation features is turned into data that supposedly reveals things like when you are most susceptible to algorithms, what the pattern of your day looks like, what sort of music serves as what sort of emotional trigger for you — what makes you most vulnerable to persuasion and when. As Drott points out, platforms have every reason to inflate these claims about how well they can predict and manipulate users in ways their rivals can't (in some senses they are competing for the same ad dollars), but the claims nonetheless give a clear sense of what they are trying to accomplish, what sort of "problem" they are throwing all the VC money and data scientists at.
Spotify aims to turn the extent to which music has a special relationship to "who we really are," or what really moves us, into a vulnerability. In ad-funded, algorithmically driven streaming platforms, "the very qualities of music that people put to work in shaping their everyday lives and regulating their emotional lives are increasingly turned against them," Drott writes. If your taste can be made predictable, you can be controlled. But what I think is most interesting about Drott's paper is his demonstration that this sort of control involves splitting the self into a series of predictable yet discrete moments. Every ad auction is a blank slate, and every time we are the captive audience for one, we are interpellated as having certain Pavlovian responses to whatever experience (musical or otherwise) the ad is being sold against.
Drott draws on Deleuze's "control society" essay to detail these "efforts undertaken by marketers, data brokers, and commercial media outlets to disaggregate users, transforming them into what Deleuzians would refer to as a collection of dividuals: the various sub- or pre-individual elements out of which individuals are assembled (affects, behaviors, drives, habits, physiological responses, and so on)." This allows streaming services to prescribe music that corresponds to and accentuates particular elements according to trends in past data. Drott writes:
For instance, a common approach to the problem of how to automatically curate playlists sensitive to changes in listeners’ context is to subdivide a single user profile into a number of discrete profiles, differentiated according to situational factors such as time of day, ambient temperature, social setting, or geographic location. In the words of one Spotify employee: 'We believe that it’s important to recognize that a single listener is usually many listeners, and a person’s preference will vary by the type of music, by their current activity, by the time of day, and so on.'
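The mechanics of "a single listener is usually many listeners" can be sketched in a few lines of code. This is a deliberately minimal illustration, not Spotify's actual system: all the names here (ContextualProfile, log_play, the context keys) are hypothetical, and the real infrastructure would involve far more signals. The point is just how trivially an account splits into context-keyed sub-profiles — dividuals — each with its own accumulated preferences.

```python
from collections import defaultdict

class ContextualProfile:
    """One account, many listeners: preferences are tracked per context,
    not per person. Purely illustrative, not Spotify's implementation."""

    def __init__(self):
        # (time_of_day, activity) -> genre -> play count
        self.sub_profiles = defaultdict(lambda: defaultdict(int))

    def log_play(self, time_of_day, activity, genre):
        # Record a stream under the sub-profile for this situation only;
        # the "morning commuter" never learns from the "evening gym-goer."
        self.sub_profiles[(time_of_day, activity)][genre] += 1

    def recommend(self, time_of_day, activity):
        # Serve back whatever this context's dividual has played most.
        counts = self.sub_profiles.get((time_of_day, activity))
        if not counts:
            return None
        return max(counts, key=counts.get)

listener = ContextualProfile()
listener.log_play("morning", "commute", "ambient")
listener.log_play("morning", "commute", "ambient")
listener.log_play("evening", "gym", "electronic")

print(listener.recommend("morning", "commute"))  # ambient
print(listener.recommend("evening", "gym"))      # electronic
```

Note that nothing in this structure ever reconciles the sub-profiles into one taste; the fragmentation is the design, which is exactly the dynamic the next paragraph describes.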
This approach both fragments and consolidates the self: It breaks apart one unity (the self that contains different moods and variegated tastes and experiences but sustains a continuity of identity through it all) but constitutes and solidifies another (the self that is wholly identical with what it consumes, or whatever that product is supposed to represent). That version of the self can know authenticity as the experience of the right song at the right time, but it comes at the expense of the self's richness and its temporal continuity. The individual "authentic" moments never add up to an authentic-feeling life.
Seeing people as discontinuous, as more or less totally determined by their particular circumstances, allows streaming services to claim that they can essentially program people with music. Ultimately Spotify wants to turn the environmental factors associated with one's particular music consumption into behavioral triggers: A certain time of day will require certain streamed content to feel "real" or feel right — to make you feel like "yourself."
That is to say, the incentives Spotify has to make listeners into attractive targets for advertisers also make Spotify work hard to eradicate listeners' taste as it might have manifested before (as a disposition, as curiosity, as a kind of flexibility that can adapt to what it hears, etc.) and replace it with an involuted desire to know oneself (a self that changes from song to song and eludes determination from within one's consciousness). In other words, the sort of listening Spotify wants to inculcate is one in which every song announces something specific and exploitable about the listener in a given moment — there is no "disinterested" listening, nothing "aesthetic" about music consumption (at least in Kant's sense) within the listening environment Spotify sets up.
There is no "desire for music" (or anything else) that pre-exists and waits for an opportunity to be fulfilled (no ontology of taste, only emergent affect); the desire has to be constructed around what becomes available to you. When we are always having media directed at us, the efforts to construct our desires become more strenuous and contradictory. Amid these competing claims, we may want music to reign supreme and be able to "express" (that is, dictate) our true being, but in the realm of streaming (in the realm of surveilled consumption), all it can do is suggest more correlations, more momentary selves; the effectiveness of music is then in how it makes us want to hear another song, another chance at building up our being into a specific somebody.
Much as TikTok's algorithm, through the content it selects, begins to posit a self for users to become and relate to or consume as a fantasy about who they seem to be, so does Spotify's recommendation system. In other words, recommendation systems in general don't "recommend" so much as try to reformat the self so that it can power the underlying algorithms and produce structured data for ad targeting.
Sometimes I wonder whether recommendation systems can be used negatively — that you can enrich yourself by seeing what they recommend and rejecting it, forcing yourself to do something different. I tend to think that "taste" is worthless if you can fully articulate it to yourself, explain it, turn it into an algorithm in your own head. I want to think of taste as being an expression of what I don't know about myself but can only live out as experience, as affect, as feeling that hits me as being both spontaneous and from within (even though it is, of course, highly structured by how I am situated in the world, what sort of habitus I've had engrained).
I imagine that algorithmic recommendation systems are working hard to commodify that sensation. They don't care if you reject them; that just refines their data set, tells advertisers something important about the kind of person you are. Streaming platforms have no space for negativity (in the philosophical sense) — all the data captured about our behavior is taken into account, taken as meaningful fact. They will negate any attempt at negation, any effort to posit a "self" that is ineffable and exceeds its representation as manipulatable information.
The recommendation systems can't be used as a means of identity play or of structuring the self as exploratory or fluid or provisional, or experiencing "life as a journey" or anything open-ended. They reductively make every action purely functional — every listening experience is fed back into future experiences as being more of the same, so that one's listening experience is always already nostalgic — always just a repetition of an experience you already had that has already been permanently classified as making you more or less likely to listen to some ad or other.