Got mashed potatoes
On Twitter, Shoshana Wodinsky (a must-follow on the topic of ad tech), in considering the question of whether algorithmic feeds serve more "outrage" content than chronological feeds, made the point that "engagement" and "outrage" are not monolithic concepts. As she notes, "not every flavor of outrage leads to direct engagement." Engagement seems instead a way of measuring outrage, which is otherwise difficult to quantify. Outrage seems especially volatile as an affect, likely to combine with other kinds of feeling (shame, cruelty, servility, pleasure, etc.) and to be multivalent or even self-contradictory, in the way that many people would rather keep having things to complain about than see the objects of their complaints rectified.
It's tempting to simply equate engagement with outrage (much as it is tempting to equate it with "virality"), which evades the fact that trying to define and measure these as if they were straightforward empirical things (and not social relations) ends up producing them. Wodinsky notes that "it's possible that algorithms push outrage for the sake of engagement" but because "each platform's algorithms are so unique, and 'outrage' means so many things to so many people, it's … kind of tough to prove definitively." One might instead consider that each platform's algorithm is designed to produce a proprietary form of "outrage" specific to that platform, and a certain type of "creator" who is well adapted to provide it. But that is just another way of saying that "outrage" is a pejorative word that critics might use to describe what platforms prefer to call "engagement."
Responding to Wodinsky, Ali Alkhatib argued that there was something inherently "disorienting" about algorithmic timelines, "like if my fridge constantly had different stuff in it." That is, the content itself is not more or less outrageous; rather the outrage is a matter of how algorithms necessarily function. The suspicion that people are being "shadow banned" by algorithms is one expression of this sort of "outrage"; the paranoia that phones are eavesdropping on you and then showing you content accordingly is another.
Algorithms are designed to serve content that gets people to react in ways they otherwise wouldn't. They "work" only if they produce engagement that wouldn't have occurred under other conditions: users have to engage more than they would if left to their own choices. Thus we become primed for reactivity by being in the kind of environment that algorithmic feeds establish, where everything is at once too predictable (I already know all the takes without reading them...) and also full of vaguely known unknowns, in that we know things are being left out and reprioritized in a way that necessarily excludes our direct control. Algorithms must be manipulative by definition, choosing for you what should make you feel something and what will necessarily be beneath your notice. It's not surprising that the word "outrage" would be applied to describe such an experience.