That Feed Is Your Poem
This morning I received The Goods newsletter from Vox, which today consisted of a roundup of trends on TikTok by Rebecca Jennings. She describes the joys of “aesthetic TikTok” — “the part of the app where people put together what are essentially slideshows of Pinterest boards devoted to a certain feeling or mood,” and highlights “cottagecore” and “Dark Academia” as emerging styles, along with their offshoots: “There’s goblincore or crowcore (collections of weird shiny trinkets), meadowcore (pretty pictures of meadows), fairycore (meadowcore but with mushrooms and magic), and Light Academia (Dark Academia, but girlier and in the summer).”
This kind of content is not for me, and I generally do my best to avoid it. On the rare occasions I open TikTok, I immediately feel as though I have walked through the wrong door by mistake and entered someone else’s home. For me this feels nightmarish, but Jennings likely speaks for most people when she writes that “these cute little slideshows aren’t just an escape from the Bad Internet, they’re a reminder that another kind of life is possible.” One might say that they serve not just as an escape from the news but as an escape from the self. Perhaps this is also true for the people making these displays, turning a form of “self-expression” into a negation of the self conceived as a trap, as an unalterable destiny. The “-core” suffix is never less than ironic, in a good way.
Since the MySpace era, if not before, social media platforms have always accommodated the creation of “-cores,” ephemeral styles that are more assembled as multimedia projects than lived. They function as a kind of immanent critique of the idea of a “lifestyle,” exposing it as little more than a suite of scrims and superficial gestures, a color palette and a few tableaux vivants. Social media make the ersatz package of a “lifestyle” extremely legible; they offer tools to build such packages, means of circulation to organize them, and metrics to evaluate them. If all this makes the idea of a lifestyle seem “inauthentic,” then so much the better. If there is anything worse than an inauthentic lifestyle, it is an authentic one.
But still, there is a tension in how social media expedite this kind of expression, encouraging us to think of life as a style and positing the aestheticization of life as a means to its own end, an ouroboros of aspiration. Communication on social media platforms is so structured by their peculiar attention economics that posting can feel as though it reveals more about the nature of the medium than about yourself. What does TikTok want? Could it really be just a clever form of Chinese espionage meant to capture the minds and data of American youth, as the U.S. Secretary of State seems to be suggesting?
A recent essay by Rebecca Lemov in the Hedgehog Review speculates on whether our exposure to predictive analytics is a “mild if pervasive form of brainwashing.” Not only does she detail the 20th-century panic over supposed techniques of Chinese brainwashing (given memorable expression by the film The Manchurian Candidate); she also cites mid-20th-century studies that found that listening to the radio had the effect of making some listeners forget that they had the option of turning it off: “Audiences found themselves emotionally chained to the radio, in effect, by a mix of obligation and suspense … listeners could both know and not know they had been overtaken by a kind of compulsion.”
Obviously, that captures how many older people seem to feel about using social media platforms: that they have become an obligatory form of communication whose notifications can’t be made insistent enough, even though they are at the same time intolerable and destructive. Similar complaints were made about television, which has also been described as hypnotic, overriding the will of viewers and imposing a kind of thoughtless compliance. Often the problem with television was construed in terms of the passivity it induced; with social media, the problem appears more as a compulsive interaction with a personalized feed, on the premise that this feed represents a synthesis of our friends, family and peers along with the sorts of celebrities and influencers we might wish were among them.
TikTok prides itself on having the most fully enveloping of algorithmic feeds: its “For You” page purports to adapt itself to you as you consume content on the app, learning your preferences and catering to you. It’s “one of the defining features of the TikTok platform,” according to this post from the company’s “newsroom,” which explains how it works with this description: “When you open TikTok and land in your For You feed, you're presented with a stream of videos curated to your interests, making it easy to find content and creators you love.” But of course, it doesn’t know much about you when you first use the app; it allows you to tell it what you want only by choosing among topics and categories it has preselected. It then requires you to interact with videos in ways its interface can track, and it infers what you are about from this narrowed-down range of behaviors.
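To make that loop concrete, here is a deliberately crude sketch of the kind of system the post describes: not TikTok’s actual algorithm, just a toy model in Python in which the user’s only inputs are a choice among preselected categories and the trackable behaviors (watch time, likes) that follow. Every name, category, and scoring rule below is invented for illustration.

```python
import random
from collections import defaultdict

# Hypothetical inventory: each video carries one preselected category label.
CATEGORIES = ["cottagecore", "dark academia", "goblincore", "meadowcore"]
VIDEOS = [{"id": i, "category": random.choice(CATEGORIES)} for i in range(200)]

class ToyForYouFeed:
    """A toy engagement-driven recommender, not TikTok's real system."""

    def __init__(self, onboarding_picks):
        # Cold start: the user can express interest only by choosing among
        # the categories the platform has already laid out.
        self.scores = defaultdict(float)
        for category in onboarding_picks:
            self.scores[category] = 1.0

    def next_video(self):
        # Rank the inventory by inferred category scores, with a little
        # randomness so an untried category can occasionally surface.
        def score(video):
            return self.scores[video["category"]] + random.random() * 0.1
        return max(VIDEOS, key=score)

    def record_interaction(self, video, watch_fraction, liked=False):
        # The system "knows" the user only through trackable behaviors:
        # how long they watched, whether they tapped like.
        self.scores[video["category"]] += watch_fraction + (0.5 if liked else 0.0)

# Usage: the feed narrows toward whatever the user lingers on.
feed = ToyForYouFeed(onboarding_picks=["cottagecore"])
for _ in range(20):
    video = feed.next_video()
    watched = random.uniform(0.2, 1.0) if video["category"] == "cottagecore" else 0.1
    feed.record_interaction(video, watch_fraction=watched)
print(dict(feed.scores))
```

Run for even a few iterations, the scores collapse toward whatever the user already lingered on; the inference is only ever as wide as the behaviors the interface can measure.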
Lemov details Marshall McLuhan’s account of “thought standardization” through advertising in similar terms. In his first book, The Mechanical Bride, McLuhan
recounted a conversation he once had with an advertising expert. From this expert he learned that the key tool of the ad trade was to “standard[ize] thought by supplying the spectator with a ready-made visual image before he has time to conjure up an interpretation of his own.” In that instant before the process of making sense was completed, a presupplied image and, subsequently, a thought (not quite your own) could take hold. Thought was being standardized. There should be no mistake here, McLuhan commented: This was mental tyranny disguised as market research.
TikTok’s For You page can also be construed as “mental tyranny disguised as market research” — that is, it is better understood as TikTok barraging users with a stream of “ready-made visual imagery” that pre-empts interpretation and reshapes their sense of what it is possible for them to desire. It’s training users to want what the algorithms can easily select for and provide. It is tyranny, because given the chance to choose for yourself, you might choose “no content right now, thanks.” But the point of apps like TikTok is to create the sense in users that they “know and not know they had been overtaken by a kind of compulsion” and that turning it off is not an option, lest the system forget who you are, what you want, why you matter. We can at once claim this self and not claim it; interacting with algorithms generates plausible deniability, a comforting ambivalence about our complicity.
All algorithmic content feeds work this way by definition: They reshape users to match the content inventory, which in turn prompts content creators to make further content in the same mold, shaped by the sorts of identifying markers that algorithms are capable of picking out. These markers line up with some of the qualities that Jennings points to as “aesthetic” — repetitive elements that algorithms can identify and use to constitute audiences, groups of pseudo-collaborators.
James Bridle’s account of children’s YouTube is perhaps the canonical description of this process, but Lemov points to Caleb Cain’s account of being force-fed white-supremacist and other right-wing content:
Like nearly 70 percent of YouTube watchers, he employed the “UP NEXT” function on the site, which will “autoplay” as a default option. Unless the viewer stops the video feed, it will keep rolling. This creates an onrush of content configured to match the user’s ongoing behavior profile. Behavioral data—compiled from the individual’s choices and a vast collation of other viewers’ clicks, likes, and other habits—then informs which videos come next, with the prime value being to optimize ways to keep the watcher watching. Time on site overrides ideological content. Within a few months, having dropped out of college and with a lot of time on his hands, Cain “fell down the alt-right rabbit hole,” as he himself recalled, becoming, in his own word, “brainwashed.”
Or, as TikTok sees it, “the system is designed to continuously improve, correct, and learn from your own engagement with the platform to produce personalized recommendations that we hope inspire creativity and bring joy.”
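The mechanism Lemov describes can likewise be reduced to a few lines. The sketch below is a hypothetical stand-in for an “UP NEXT” selector, not YouTube’s real system: it pools invented watch logs from other viewers and always queues whichever candidate similar viewers watched longest, entirely indifferent to what the video is about. Topic labels appear only in the made-up video names; the selector never reads them.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical watch logs pooled from many viewers: (viewer, video, seconds watched).
WATCH_LOGS = [
    ("a", "gaming-1", 300), ("a", "react-7", 540), ("a", "alt-right-3", 900),
    ("b", "react-7", 600), ("b", "alt-right-3", 880), ("b", "alt-right-9", 950),
    ("c", "gaming-1", 200), ("c", "lecture-2", 120),
]

def predicted_watch_time(candidate, history):
    """Average how long viewers with overlapping histories watched the candidate."""
    relevant = [
        secs for viewer, video, secs in WATCH_LOGS
        if video == candidate
        and any(v == viewer and vid in history for v, vid, _ in WATCH_LOGS)
    ]
    return mean(relevant) if relevant else 0.0

def up_next(history, candidates):
    # The only objective is keeping the watcher watching: pick whichever
    # candidate similar viewers watched longest, regardless of its content.
    return max(candidates, key=lambda c: predicted_watch_time(c, history))

# A viewer who has watched a gaming video and a reaction video gets steered
# toward whatever held similar viewers the longest.
print(up_next(history={"gaming-1", "react-7"},
              candidates=["lecture-2", "alt-right-3"]))
```

Given those invented logs, the selector queues “alt-right-3” simply because it held similar viewers the longest: time on site overrides ideological content.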
Content produced for an algorithmically governed delivery system must primarily accommodate the algorithms; it can only express what the algorithms recognize. Every TikTok video, regardless of its surface-level content, ultimately must say the same thing: “an algorithm picked this for you.” You are left to interpret what that says about you; what the post says about itself doesn’t really matter. When you enjoy content on your TikTok feed, you are actually just enjoying your own taste as the platform has constituted it for you; the creativity or ingenuity of the content creators becomes a reflection of your own latent creativity — the algorithm basically implies that you called it forth. It exists so that you can know yourself; you create yourself by watching more. And you thereby learn how to make content for the platform that reflects that self-creation. You go cottagecore.
But then I’m often tempted by the line of criticism that condemns social media for making self-documentation too easy. It’s as though I want to imagine the self as this infinitely complex form that is already crystallized and that is in danger of being misrepresented in an overly simplistic way and thus betrayed. From that perspective, every formulaic post or participatory meme, every predictable or rote reaction or gesture, becomes a diminishment of the self, a falsification of its deep truth, and every moment of resisting the urge to post testifies to the richness of one’s own character — never mind that this testament is entirely solipsistic. Words and images just trivialize my Thought. Only silence can convey my profundity. No clichéd post could be more cliché than that.
Yet I wouldn’t want to surrender to a teleological view of social media, as though they are just the inevitable next step toward perfecting our means of communicating with each other. An Outside magazine article by Lisa Chase, headlined “What I Learned at the Most Instagrammed Outdoor Places,” concludes with the lesson that we have a “deeply human desire to share our most profound experiences with others,” and Instagram, for better or worse, is “just the way that looks today.”
As with TikTok, I tend to draw the opposite conclusion: Instagram and other platforms are reshaping our sense of what constitutes a profound experience, such that we depend on them to invest our lives with a sense of profundity that we can no longer generate internally or through a sense of communion with others. Instagram structures our sense of what counts as experience just as TikTok’s algorithm structures what we are expected to recognize as our own true tastes. “Taste” and “experience” don’t, can’t, preexist the media through which they are expressed and made coherent to ourselves. The same might be said of “connection” — that is, social media don’t actually “connect” us, but rather mediate the idea of connection, transforming it into a form of constant information circulation — the data that platforms are optimized to harvest.
Instagram would very much like us to believe that we have a “deeply human desire” to re-mediate experience and that its platform is just a convenient way of doing that. But it may also be regarded not simply as a means of communication but as an incitement to communicate more, to convey precisely those things it now allows you to convey. Platforms operate on the principle that more communication is better communication, more connection is better connection. Sheryl Sandberg’s ludicrous and disingenuous statement that “Facebook stands firmly against hate” rings so hollow because the company will always treat speech as data and treat more data as better data. The nature of its business demands this, but the principle finds ideological support in the misguided belief that more speech is free speech, that more speech will correct hate speech and tend toward “truth.” It has turned out that making connection easy has made it easier for strangers to band together for antisocial purposes, for conspiracies to spread, and for groups to form around a shared sense of persecution, spite, or contempt. Providing the means for more speech at larger scale has meant both devaluing it and making it easier to weaponize, as though that were the only way left to give communication value.
In her Outside essay, Chase writes of visiting Canyon de Chelly, a tourist attraction on Navajo reservation land that one can visit only when accompanied by a tribal guide. Taking photos is allowed only at specific spots, which Chase admits makes her journey there discomfiting. She seems to both know and not know that she has been overtaken by a compulsion.
If I’m being honest, this place makes me feel like a trespasser. If I’m being honest, I feel lonely here. At a wash about a mile into the canyon, there’s a Kokopelli symbol painted on the rock, which Yazzie says we may photograph. I pull out my phone a little too gratefully. It’s my connection to the world I inhabit and understand.
This suggests that people feel at home anywhere they can use their phones — that home is not a place but a way of doing things, a matter of centering oneself. This is the sense of privilege that platforms grant us. The phone allows us to document experience, but it demands that we understand that experience as property. We are encouraged to understand communication as simultaneously a form of appropriation, a way to take where you are and make it yours in the eyes of someone else, a conflation of possessive individualism with the possibility of the familial. The screen becomes a small imperial homeland “shared” between you and your “loved ones,” or at least whomever the platforms’ algorithms decide to match with you.