The Old Identity God
It is easy to conclude from looking at Facebook or Twitter that these platforms are shaping our thoughts — that their interfaces' affordances, algorithms, and metrics are dictating the sorts of experiences we can have: All the millions of updates and tweets seem to prove that, while all the thoughts that take no particular form and go unrecorded prove nothing.
To put that another way, social media's vast archive encourages a behaviorist point of view. What is observable, what is captured in data, counts. Unrecordable, unobservable thoughts are basically nonexistent. Interiority is a myth; consciousness a kind of epiphenomenon, at best. It's mere metaphysical speculation to believe we can have thoughts and experiences that leave no material traces. We are equivalent to our data trail. It may seem to follow, then, that the more data we generate, the richer a human being we'll prove to be.
But people don't generally use social media to try to document everything indiscriminately. The appeal tends to be the opposite: the illusion that we can curate our presentation of self, have control over it. That control is illusory — we can't fully dictate who sees what when, or what they will make of it. Moreover, as the information we willingly provide begins to accumulate, our omissions can come to seem more glaring, more telling. Maybe we really live in those omissions. But social media companies typically want to fill in the gaps.
Social media let us think we can express how we want to be seen as something separate from who we "really are." Yet behind the scenes, social media companies work hard to augment our self-performances with information derived from our unintentional data traces, to produce a more "accurate" version of ourselves. Judith Duportail offers an example in this Guardian article, in which she procures the full dossier that Tinder keeps on her and details how the company uses data from other sources to augment her internal profile and dictate what she sees on the platform. Google and Facebook also "onboard" data from disparate sources (cell phone data, location data, purchase histories, behavior on other websites, where your mouse is on the page, where you are looking, etc.) and amalgamate it into their unitary profile of you. This data is a contemporary analogue for the unconscious; we can't access it, yet it can dictate the way we experience our lives. It is as if those companies' servers house our interiority (if you believe in such things).
From the behaviorist point of view, what we think or how we want to be seen — what we volunteer — is among the least valuable data points these companies have. What we actually do (as opposed to what we think about) is what really matters. So the profile we deliberately curate on these platforms is in some ways a distraction from the real, unseen profile being used to direct algorithms and target ads — the version of us that is up for sale, that is social media's main product. What we share only makes clear what we're trying to hide, which is where the exploitable vulnerabilities are likely to be found.
In an essay at 3 Quarks Daily, Samir Chopra takes social media behaviorism to its logical conclusion, positing a scenario in which companies develop means to harvest our thoughts before we falsify them with our conscious intentions:
In meditation and mindfulness sessions, we become aware that an endless stream of thoughts parade through our mind. This state of affairs suggests a sci-fi variant of our current social media tools: an automated Facebook status and tweet harvester, perhaps part of the software on a chip embedded in our brains at birth, which in some not-so-distant future would post them—as they occur—on the social network of that time. This would automate what we strive to do now: think angry, witty, sad things; rush to enter them on our social media statuses ... Perhaps such an embedded social media chip could ‘live-stream’ in the literal sense: a multi-media, multi-modal streaming of our stream of consciousness.
To get the real truth, we need to disintermediate our own consciousness so that our authentic thoughts can be published and our authentic feelings revealed.
This reminded me a bit of some of the arduous 1970s literary projects Chris Kraus describes in After Kathy Acker, in which writers tried to methodically describe their experiences in order to find the limits of their subjectivity. The hyperdocumentation was meant to lead not to a coherent, indexable, and searchable self-representation (what social media can seem to promise) but to a total fracture: "Crack up the old identity god," as Acker puts it in a letter. (In that spirit, here is a mashup of Kathy Acker's texts and Cathy the cartoon.)
Kraus cites Bernadette Mayer's Memory (1972), twice quoting art historian Liz Kotz's account of the project as one in which "the very intensity of surface detail paradoxically atomizes personal experience into an endless flow of pictures and recited recollections; its authorship is distributed among various functions that don't necessarily cohere into a single self." That makes the project sound less like a grueling durational performance-art piece and more like Instagram. Which suggests that conversely, we could think about Instagram as a grueling durational performance-art piece.
These projects (and social media too?) call the status of the "stream of consciousness" into question. Is it something that can be tapped, as Chopra posits — a flow that we or some transistorized thought harvesters can simply tune into? Or is it constructed by the desire to observe it? Kraus quotes a passage from an early diary of Acker's: "Magical connection (real magic) between putting-down-word & reality ... If I think too hard or too programmatically & fast I get away from what's happening." In that view, what is "happening" is precisely not the stream of consciousness, the language of thought. That language is treated instead as a set of screen memories. Thoughts in language are translations of experiences already past, already lost. Our direct experience, paradoxically enough, is inaccessible to ourselves. But the data is out there somewhere.
In a passage Kraus transcribes from the opening monologue of The Blue Tape, Acker describes "strange memory experiments" in which she was "trying to find out what the structures or structure of intentions were behind my remembering." Of course, social media have algorithms for that, surfacing "memories" and anticipating intentions based on the data they have collected about users. But do these ever ring entirely true? When I encounter these specters, they feel less like me than fictionalized representations of the past me or future me that the present me can consume as fantasies of sorts. "Inspired by your browsing history" — yes, I am inspired!
In her Guardian article, Duportail seems to be less than inspired by the experience of reading her Tinder dossier:
Reading through the 1,700 Tinder messages I’ve sent since 2013, I took a trip into my hopes, fears, sexual preferences and deepest secrets. Tinder knows me so well. It knows the real, inglorious version of me who copy-pasted the same joke to match 567, 568, and 569; who exchanged compulsively with 16 different people simultaneously one New Year’s Day, and then ghosted 16 of them.
I thought it was weird that she would regard that reading experience as capturing "the real, inglorious version" of herself. I guess she is reacting to the way her self-presentations come across to her differently after time has passed or when the contradictions in them are brought together. She sees herself as other than what she originally intended, which is another manifestation of the behaviorist bias: What is "real" about ourselves is only that which we didn't do on purpose. Our intentions screen reality. Only the interpretations from the outside, undistorted by our "hopes and fears," can grasp who we really are.
Reading the novel (of sorts) of her Tinder profile allows Duportail to have that outside view on herself. She can treat herself as a character, and characters in novels exist entirely to be known, to be decoded, to be interpreted. This makes them realistic (if not quite real).
Duportail's article plays up the angle that being processed and sorted by outside forces is a new and terrible fate:
As a typical millennial constantly glued to my phone, my virtual life has fully merged with my real life. There is no difference any more. Tinder is how I meet people, so this is my reality. It is a reality that is constantly being shaped by others – but good luck trying to find out how.
But our lives have always been constantly shaped by others. And this has always worked in ambiguous and unfair ways. It seems like the constant connectivity of phones gives a different, more tangible sense of that — a feeling of measurable interconnectedness that people generally seem to crave. Apps tend to try to legitimize their data collection with this in mind, using algorithmic processing to try to make users feel known and recognized, to facilitate their relationships with people they care about, to show them that their behavior matters and has consequences, that it makes other people react.
Consumerism tends to lionize a radical individualism that runs counter to how we are "shaped by others." This dissonance, this ambivalence is reflected in how social media platforms isolate us as discrete profiles and encourage us to express ourselves unilaterally, but then also impose attention metrics that foreground other people's reactions.
I've always had a hard time negotiating this tension. I oscillate between self-promotion and self-deprecation, between posting "for myself" and posting as an imagined service for some unknown others ("I bet people would find this interesting..."). I feel compelled to post things in search of some kind of acknowledgement, but posting also feels destabilizing to my identity, casting it into greater uncertainty as I wait for any response from who knows who. Self-expression is also self-alienation: By posting things, I turn myself into a character, but also make the separation between "me" and that character. There is a relief in that. Who I "really am" is residual, what's left after I'm done expressing things. Yet that "me" is inherently unknowable, even to me.
Chopra, in his 3 Quarks Daily essay, is afraid that "we are learning to express ourselves in Facebook statuses and tweets" and that this training is making us into "different beings." He contrasts how we used to remember ourselves with the way social media directs us to remember.
We are used to looking at older photos and exclaiming in surprise and wonder at how much we have changed; those photographs have never captured the changes in our interiority. But a history of our social media interactions most certainly will; we might be surprised to see what we are becoming and have already become.
When I look at old photos of myself, or any photos of myself, I see that nothing has changed: I think, That person is still a stranger. I think about how I'm the only person who can't see me from the outside, and that means I have only the vaguest idea of how I am in the world. The terror and wonder at being an object in other people's perception.
When I look back over my social media posts, I have a similar feeling. I don't get any sense that "changes in my interiority" have been captured, just that there is an objectified "me" that I can't see or control. But an omnipresent audience sees. This changes how I experience interiority in general, without changing anything about its moment-to-moment form or content.
An ever-present audience normalizes interiority's irrelevance, making it into something luxurious and selfish, a pretense of privileged autonomy. As if panoptic social surveillance can't touch you. Every time I look to social media to see how others or algorithms are responding to me, it reinforces the idea that those external interpretations are real, and what I choose to think about myself isn't. That can be a relief too.
Audiences were once hard to convene, hard to imagine, scarce and valuable: Kraus notes how Acker sent her early writings to a mailing list originally compiled for avant-garde "mail art" projects, characterizing this as a savvy leveraging of her social capital.
Social media obviously make it much easier to assemble our own mailing lists and realize some of our own social capital (though the algorithms that hide posts from people take some of that power away). These audiences, whether they are inferred from follower counts or subscribers or likes or retweets, help us see ourselves as characters, as knowable. But just as Duportail saw her "real self" in what she wanted to disavow in her Tinder profile, we become knowable in the same way — not through what we choose to post but in how it is misrecognized.
In Technologies of the Self Foucault writes about how religious confessional practices became means of articulating the self, of producing oneself as a "subject." But the point of these confessions was not their substance but the process. Confession, in his interpretation, revealed that you were not attached to what you were and were ready to become what the institution demands of you. "In permanently verbalizing your thoughts and permanently obeying the master," Foucault writes, "you are renouncing your will and yourself."
Penitence of sin doesn't have as its target the establishing of an identity but serves instead to mark the refusal of the self, the breaking away from self: Ego non sum, ego. This formula is at the heart of publicatio sui. It represents a break with one's past identity. These ostentatious gestures have the function of showing the truth of the state of being of the sinner. Self-revelation is at the same time self-destruction.
Foucault suggests that this changes in the modern era, but I wonder if this old formula has re-emerged, with social media structuring an ideal where that constant expression of self is a form of submission to audiences and, ultimately, algorithms.
Christian confession (in Foucault's version of it in this particular lecture) was steeped in the idea that sin placed us in a state of self-illusion that hides the truth of ourselves from ourselves. Through verbalizing rituals, that self-illusion is worn down; the audience listening reveals the truth amid all those articulated thoughts. Built into this is the assumption that the more evil a thought, the harder it is to verbalize. "The price of the permanent verbalization was to make everything that couldn't be expressed into a sin," Foucault says.
Something of this sentiment lurks behind the constant exhortations to participate in social media. Even if we don't surrender to them, they still make clear that there is an inexpressible evil within us, driving our evasions. The data collectors are trying to make it all expressible for us, to us, to carry us into heaven. "What's on your mind?"