Last week, Emily Baker-White reported on TikTok’s internal “heating feature” — described as a “button that can make anyone go viral”:
In addition to letting the algorithm decide what goes viral, staff at TikTok and ByteDance also secretly hand-pick specific videos and supercharge their distribution, using a practice known internally as “heating.”
The selected videos are then placed onto users’ For You pages until a certain number of views is reached. The point of this, according to Baker-White’s sources, is “to court influencers and brands, enticing them into partnerships by inflating their videos’ view count.” As the word “inflating” indicates, it’s assumed that posts have some organic or natural amount of reach, and that any variance from this constitutes some sort of implicit injustice. With “shadowbanning,” posters are supposedly artificially shortchanged; with “heating” they are inappropriately rewarded. But there is no such thing as organic reach; there is no way to deduce from the content alone what amount of audience it should have. The audience is always structured by broader mechanisms of circulation, with various interested parties exercising whatever leverage they can establish within a given context. What a specific individual wants to see is a very small piece of the equation.
So it’s not surprising that social media companies manipulate the visibility of posts to serve their own ends: That is the whole point of pushing users into algorithmic feeds, if not running a social media company at all. Rather than offer a feed whose rationale the user can understand and appear to control, platforms impose “personalized” feeds that work to remold users into what is most profitable for the company. Ideally users will experience this remolding as a journey of self-discovery or an elaborate form of flattery — as Sophie Haigney described the potential appeal of interacting with AI: “The app tricks me into feeling seen, but really it is just me, trying once again to see myself.”
TikTok’s rise has often been attributed to its aggression on this front, designing its interface and affordances to foreground users’ relationship to “the Algorithm” (and thereby themselves) rather than to other users. People are encouraged to believe that the algorithm discovers something special about them and shapes their experience on the app around their proclivities. But as “heating” illustrates, it turns out that what is “For You” may have less to do with who you are and more to do with brand and creator partnerships. Not much has changed since 2020, when it was revealed that TikTok deheated videos from users deemed “poor, ugly, or disabled.” Exclusionary normalization and stereotype reinforcement are apparently “for you” and for all of us.
What is sort of surprising, if you take the rhetoric about the oracular propensities of the algorithm at face value, is that users don’t seem to notice or mind the imposition of “heated” content. Why doesn’t this material stand out as decidedly not “for you”? Why don’t we detect a bait and switch?
It may be because the algorithm is already so hit or miss that any inserted promotional content (“heated” posts are native advertising) doesn’t necessarily ring that false. All “personalization” is ultimately ad targeting: The idea is always to goad you into accepting something alien as being for you, about you, or expressive of you.
At the same time, for feeds to be compelling, they can’t be so idiosyncratic as to be isolating for users, cutting them off from what other people are seeing and talking about. Imposing virality from above militates against this, helping balance the opposing drives toward solipsism and conformity — or best of all, creating solipsistic conformists whose feelings of belonging are mediated through trends derived from their access to media feeds.
The For You page works as an alibi, allowing users to be trend followers at the very moment the interface is telling them how special they are. This helps maintain the tension in consumerism’s basic contradiction, in which individualism is recruited to motivate the ever-increasing consumption of mass-produced goods. Via mechanisms like algorithmic feeds, you become yourself by keeping up with everyone else, doing what they do and seeing what they see.
Cory Doctorow seized upon TikTok’s “heating” as an example of what he calls “enshittification,” the process by which a platform entraps and ultimately immiserates buyers and sellers (or audiences and “creators”). “First, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves.”
This sort of analysis is often applied to Amazon (as Doctorow does) and to venture-capital-funded initiatives like Uber, where the aim is to use consumer subsidies to drive competing suppliers of a service out of business so you can then use monopoly status to gouge customers who suddenly have no alternative. (The “end of cheap money” was supposed to shut down this strategy, according to an Atlantic thinkpiece from last June.)
To apply it to media platforms, Doctorow interprets algorithmic recommendation as the means by which companies like Facebook pivot from giving customers what they want (the stuff they followed intentionally) to forcing on them what advertisers want, which leads inevitably to a terminal crisis. In his account, people tolerate algorithmic feeds only because “a critical mass of people you cared about” are on particular platforms and other ways of socializing with them have been deprecated.
But that doesn’t make much sense for TikTok, which never built itself up from a “social graph.” TikTok never presented itself as a main conduit for friends-and-family content. To apply his enshitty model to TikTok, Doctorow argues that the bait and switch is played not on audiences but on content creators, who are being tricked into thinking they can go massively viral at any moment.
These [heated] videos go into Tiktok users' ForYou feeds, which Tiktok misleadingly describes as being populated by videos "ranked by an algorithm that predicts your interests based on your behavior in the app." In reality, For You is only sometimes composed of videos that Tiktok thinks will add value to your experience – the rest of the time, it's full of videos that Tiktok has inserted in order to make creators think that Tiktok is a great place to reach an audience.
While TikTok’s description of what “For You” means is certainly misleading, it doesn’t seem to qualify as a bait and switch. “Every time Tiktok shows you a video you asked to see, it loses a chance to show you a video it wants you to see,” Doctorow writes, but in general, users aren’t asking to see anything. That is the whole point of TikTok, the only reason you’d be relying on the For You page. What is “for you” is not concretely articulated in the first place; that obscurity is its main appeal: It saves you the trouble of having to worry about what sorts of things should be “for you” and lets you engage with a reflection of your own passivity, dignified by the interface into a semblance of an idiosyncratic personality.
Oddly enough, this makes “heated” content an example of what Marshall McLuhan described in the 1960s as “hot” media. To be honest, I have always had a hard time grasping what he was trying to get at with the hot/cool distinction, and the explanation in his relentlessly preposterous book Understanding Media has only ever added to my confusion. The choice of metaphor seems counterintuitive, if not altogether arbitrary. But in this case it actually seems to fit. For McLuhan, a “hot” medium conveys a more intense and overwhelming amount of information, and a “cool” medium is lower-definition, requiring more work on the part of the information receiver to put together the message or fill in its gaps. So a “cool” medium is in theory more participatory (which has always seemed to me metaphorically backward), whereas a “hot” medium, in McLuhan’s view, shocks receivers into a kind of stupor or defensive crouch.
Were we to accept fully and directly every shock to our various structures of awareness, we would soon be nervous wrecks, doing double-takes and pressing panic buttons every minute. The "censor" protects our central system of values, as it does our physical nervous system by simply cooling off the onset of experience a great deal. For many people, this cooling system brings on a lifelong state of psychic rigor mortis, or of somnambulism, particularly observable in periods of new technology.
With this theory in mind, one could interpret TikTok’s algorithmic feed as optimizing not so much for virality as for McLuhan’s sort of heat, which would induce a ricocheting psychic somnambulism. As people mentally sleepwalk through the app’s videos, they become habituated to numbness, this protective non-engagement in which the relinquishment of agency becomes equated with a feeling of protection. As TikTok turns up the heat, the user’s submission to whatever videos they are served plays out as relief. It becomes icy hot.
“The effect of electric technology had at first been anxiety,” McLuhan claims. “Now it appears to create boredom.” In their compressed expression on TikTok, these different effects become simultaneous.
McLuhan thought TV was an inherent coolant (because cathode-ray tubes were low-res by definition and viewers had to work their eyeballs harder to complete the flickering images) and believed it could be deployed programmatically for the purposes of “maintaining equilibrium in the commercial economies of the world,” chilling populations out when some other, more intense form of media (film and radio are supposedly “hot”) got them worked up. He also claims that sunglasses are “cool” because they reduce the available information about your face. The more you read of what he says about hot and cool media, the less it seems worth taking too seriously.
But his freewheeling speculation at least doesn’t understand media as just another marketplace in which a sovereign consumer exercises their conscious preferences and service providers cater to them accordingly. For algorithmic media especially, it seems inadequate to characterize them as entirely manipulative or straightforwardly gratifying, as either meeting user needs or altering those needs so that they can be instrumentally serviced. Content has no inherent reach or automatic appeal. Engagement sometimes signals boredom. Anxiety may be inseparable from pleasure. There is no heat death of the social media universe.