The beginning of the baseball season this year brought with it an “epidemic” of injuries to pitchers and a range of explanations for it. The Major League Baseball Players Association wanted to blame it on the recently introduced pitch clock, which could conceivably cause pitchers to rush into their motion in physiologically unsound ways. Presumably aware that the pitch clock makes the game much more palatable to spectators, the league itself loudly disagrees, pointing out that “velocity and spin increases” are “highly correlated with arm injuries.”
But it is not as though pitchers just recently decided to try to throw harder and impart “max effort,” as if that never occurred to them before. What has changed is that pitchers and coaches and the various biomechanical experts that teams hire to help refine players’ pitching motions have more measurement capabilities than ever before to track not only a pitch’s velocity but its spin rate, lateral movement, and depth of break. The ball and the players are all outfitted with trackers to convert their behavior instantly into a data stream, as if the point were the data and not the action on the field. If you go to a game at most stadiums, or watch Apple TV’s (abhorrent) baseball broadcasts, you get this information as a viewer after each pitch, as well as a battery of other “advanced” statistics that purport to capture a player’s individual talent and performance separate from luck, contingency, and the burden of their teammates. (There is no “I” in team, but nothing but “I” in xBA and xFIP.)
For spectators, the statistical trivia can be amusing (that foul ball had a launch angle of 135° with an exit velocity of 75.3 mph!) when it isn’t distracting. It seems part of the general desire to mathematize the game to make it more amenable to simulation, prediction, and gambling. Undoubtedly there are prop bets available on how many hard-hit balls there will be in an inning. Gamblers shouldn’t be limited to betting on things that are directly meaningful to who wins and loses the game but should be able to play a game of their own devising with the help of FanDuel. But it is also as though the data is there to make the game seem “more real” as it transpires, as though the numbers were necessary for fans to interpret what was really happening. Numbers, not narratives. (A wager is not a story.)
But for players, the data is disciplinary. All the performance surveillance and granular information collected makes it possible to isolate players from each other — to conceive of their performance as strictly individual and entirely reducible to what is measurable — and then home in on mechanical adjustments that can optimize for “spin increases” and the like with detailed feedback coming on every attempt. Knowing they will get paid on the basis of how well they can maximize those numbers, pitchers are deeply incentivized to develop pitching motions that accomplish that aim, and disincentivized from what old-timey broadcasters like to describe as “learning how to pitch” — outthinking the batter with pitch selection and location, making use of the defense behind them, coordinating with the catcher to steal strikes from the umpire (at least until they are replaced by cameras).
The vast increase in pitch-level data makes it compulsory to choose short-term gains over habits that might allow for more career longevity. The data makes the entire pool of potential pitchers more fungible; franchises can just discard the arms as they get used up because the data flow helps them learn how to extract more juice from the general arm supply. The ideal from that point of view is an endless parade of interchangeable “max effort” relievers who throw 100 mph for a season or two until their ligaments disintegrate, only to be replaced with another “live arm” who will have no particular name recognition value or contract-negotiation leverage. Imagine a league where every team is the Tampa Bay Rays.
The data collection is irresistible to pitchers, who might even find such self-tracking useful in achieving their personal goals. But the data mainly supports the establishment of a larger structure that makes pitchers more injury-prone and disposable. This holds even more true for those of us who are not professional athletes but are being driven by metastasizing schemes of measurement and management to understand the world in terms of endless competition and ranking — what Marion Fourcade and Kieran Healy call “the ordinal society” in their new book of the same name: “a society oriented toward, justified by, and governed through measurement.” At Citizens Bank Park, when I see those advanced statistics on the scoreboard alongside the relentless revolving advertisements, it feels like a microcosm of this; I am watching a game that has become oriented toward, justified by, and governed through measurements for their own sake. They may as well stop keeping score and just let people bet on launch angles.
“The logic of layered sorting through personalization is now built into economic infrastructures and social expectations,” Fourcade and Healy write. “The advent of fine-grained, quantified personal data has propelled a rationalization of the social stratification process and equipped it with a new kind of moral justification.” We are continually interpellated into competitions and tryouts we had no prior interest in, and the rank gamification involved works to extract our labor and compel our obedience to whatever precepts are encoded in the ranking formulas.
Fourcade and Healy describe the ordinal society as one in which people are “individually sovereign and cybernetically supervised” — each person’s performance is isolated from that of others and thus appears to be under their total control, but that personal performance data is continually leveraged against them by larger institutions pursuing their own aims. People think they can be well-classified and appropriately rewarded, until they encounter the system’s overall indifference to their condition and their specific circumstances.
The ever more elaborate data collected about us, often against our will and without our knowledge, can be mediated back to us and push us toward what appears to be self-knowledge (the algorithm really knows me!). Social media platforms can proceed as though users want to see view counts and so on — as if that forced metricization is to help us know what we want to say rather than to help them entrap us into adding more value to the platforms. But the proliferation of data is also a kind of ideological conditioning that trains us to accept opaque, biased, and even arbitrary ranking schemes as objective and just, and to see our individual behavior and capabilities as strictly isolable, as if we could be who we are and express our talents and formulate meaningful goals that make the most of them without belonging to a society. You can’t really change yourself, but you can optimize your performance along the lines demanded by various sorting protocols: Thanks to ubiquitous surveillance, life can be one endless scramble to game one system after another in a climate of universal suspicion and trustlessness. In turn, the systems continually ranking individuals will impose the consequences of their unauditable decisions on individuals as a kind of doom. If you try to evade the systems, your invisibility will be interpreted as poor performance, and you will be further punished for your recalcitrance.
Fourcade and Healy occasionally seem to suggest that people are intrinsically compelled to seek status and to prioritize convenience over other values, and that tech companies just developed technologies to exploit these flaws in our psychological makeup. They claim that “social theorists underestimate the power of delight,” which apparently should stand uninterrogated as a marker of what humans really want. They go to some lengths to try to preserve the idea that people can (or at least once could) make meaningful choices about their technology use. They argue that the tendency toward “replacement of individual liberty within a polity by pure sovereignty within the market” has been “driven by delight, convenience, and a growing demand for authenticity in identification. Individuation reduced friction in a way that people enjoyed—when it worked.”
Similarly, they seem to argue that tech companies mean well and hope only to delight us but are compelled by larger forces to track and surveil us, as in this perplexing passage:
A careless observer might attribute the practice of collecting personal data to the simple wish to make money, the unhealthy desire to pry into people’s lives, or the grand ambition to surveil and control a population. But in practice, data collection often has a ceremonial character ... Professional exhortations; conventional wisdom; falling computing and storage costs; and most recently the gigantic training demands of large language models for content production—all of these forces have pushed organizations to sweep up increasingly large quantities of bits about whatever crosses their path.
No doubt I am a careless observer, but this seems to let data collectors and data brokers off far too easily. Social media companies get off too easily as well when the status anxiety they provoke as a business model is attributed instead to human nature: there is no escape from invidious comparison, the thinking goes, because people intrinsically prefer feeling superior to feeling solidarity.
Henry Farrell, in his summary of The Ordinal Society, argues that “technologies like AI are both pernicious engines of desire-shaping, and miracle-technologies that produce just what we want, at one and the same time. We are perpetually eager to discover how we compare to others, and how others compare to us ... humans love to turn classification systems into ranking systems.” There is ample enough evidence for that, I guess, but it remains a depressing verdict on the human species and makes one wonder why anyone would bother to try to make the world better.
But it could be that the pernicious engine of desire-shaping yields the perpetual eagerness to rank people’s social fitness. Fourcade and Healy also sometimes seem to suggest that delight can be a delusion or a coping mechanism for being caught in a broken social system. The ordinal society, in their account, gathers ideological momentum that “blinds us to what we all share and chips away at solidaristic feelings … Public goods and collective goals are being dissolved in the acid bath of individualization and competition, leaving us increasingly alone in a hyperconnected world whose social ordering is precisely metered.” Individuals are compelled to participate in this system and begin to accept the consolation prizes built into it — the pleasures of “negative solidarity,” of being able to identify and punish people who are held to be lower status; the pleasures of achieving little victories in games without reward, of beating your high score in a productivity game whose fruits accrue to your bosses.
Rather than attribute this to human nature, it can be attributed to a persistent effort to normalize surveillance and highlight the supposed pleasures of competition with peers and being continually ranked. If indeed the “push for seamless and fully reliable verification and authentication is steady,” it is because trustlessness is unendingly fomented by those who stand to gain by it. This careless observer finds historicizing passages like this one far more convincing:
Commentators often speak blandly of “digital natives” as if they had a natural technical facility with computing. The reality of the idea may lie more in the tendency to accept a social ecology where everything is indexed, tracked, and measured. One is not born, rather one becomes a digital native. Organizations share this attitude, too, as advances in ML have solidified “the unnerving belief that everything is data and is there for the taking.”
Realizing this belief demands a collective, sustained overhauling of the sociomaterial environment. It means adjusting the rules of human exchange to circumvent normal expectations about privacy, drawing on an infrastructure of logins and passwords, unique device identifiers, and biometrics; routinizing the use of trackers and sensors in virtual and physical spaces; socializing people to volunteer inputs and respond to machine feedback through addictive designs; nudging them into frequent check-ins and assessments. With its algorithmically produced feed, endless scroll, automated data collection and learning, its quantified metrics and modulated interventions, the social media app exemplifies this regime more than any other mode of computer interaction. Is it surprising that social media apps increasingly resemble shopping, transportation, streaming, payment, educational, and cooking apps, which in turn all resemble social media apps?
The “rules of human exchange” are not fixed but can be “overhauled” by cabals that accrue sufficient leverage. “Training a population to embrace its own ordinalization can sometimes take the form of a bold exercise of political will,” Fourcade and Healy point out. In other words, that embrace of status-seeking must be inculcated to some degree and institutionalized; it is not simply innate and waiting merely to be more rationally exploited. “Corporations … carefully design interfaces to cultivate a sense of ignorance and helplessness.” By the same token, they use their power over the circulation of discourse to make ignorance and helplessness appear as convenience and delight.
Despite its ambivalence about where the compulsive measurement and classification schemes come from, The Ordinal Society is essential reading for its clear overview of the consequences: how all goods are turned into “services” that one must repeatedly pay for, how discrimination (price discrimination, the foreclosure of opportunity, the administration and production of different forms of stigma) becomes more pervasive and granular, how bias becomes more entrenched, how the demands for perpetual validation enable endless forms of extortion, and on and on.
Fourcade and Healy note “that a particular personal failing, some specific vulnerability, may be the most useful thing about you as far as the market is concerned.” This is what tracking and measurement are now for: finding everyone’s vulnerable points and ensuring that those weaknesses, and not their capabilities, are what the market puts to use.
Great article, not-so-great reality. Seeing the truth written so plainly, bleak as it is, always makes me feel a little less crazy. Other people can see it looming too.