Advertising is essential
"Advertising is essential to keeping the web open for everyone but the web ecosystem is at risk if privacy practices do not keep up with changing expectations." This is the opening sentence from Google's recent update on its project to replace third-party browser cookies — which track us as individuals — with something it calls FLoCs: ad hoc interest groups to which individuals would be assigned based on their browsing history. As Shoshana Wodinsky explains at Gizmodo, "Any data generated on an individual basis would be kept in-browser, and the only thing advertisers could track and target would be a 'flock' containing an aggregated group of semi-anonymized people."
This is not much of an improvement over the existing situation. "Stereotyping people behind their backs is essential to keeping the web open for everyone." Not only are we still living in the imaginatively impoverished world where "advertising is essential" (and where all information is reduced to advertising), but being typecast as a member of a particular group may actually compound the harm of being pigeonholed for one's individual behavior. When a club automatically enlists you without your knowledge, it's because its organizers believe you won't join voluntarily. Nothing about this proposal will make the adversarial assessments being made about us any more transparent or avoidable.
It is not as if you are no longer being tracked as an individual in opaque and involuntary ways. It's more that a single, massive entity is doing the tracking, assigning you an arbitrary identity of some sort, and using its leverage to make that ascription more impactful, allowing it to further occlude your horizons. As Wodinsky notes, drawing on this 2019 EFF analysis, "being a part of a flock isn’t unlike being branded with a 'behavioral credit score': one that remembers your interests, your purchase history, and a lot of what makes you you, and puts it in the hands of one extremely powerful, largely unaccountable corporation."
Often the ad industry's machinations are covered in terms of privacy tradeoffs, as if personal privacy were unilaterally controllable by individuals who make deals for themselves rather than a matter of social norms and legal determinations. Privacy is seen as a currency that users can spend a little of in order to receive more "meaningful" or "personalized" ads, which they are alleged to want. ("We surveyed people about whether they wanted 'meaningful ads' and they said yes!") The framing that "users love personalized ads but don't want advertisers to go too far" is just the advertising industry's self-representation, its pitch for self-regulation. Google's latest efforts are in keeping with that.
Advertisers also like to suggest that any evidence that ads "work" is proof that consumers want them. Revealed preference! That assumes a world without coercion, which is precisely the opposite of the "advertising is essential" world. Advertising is about overriding people's preferences. Fantasies about making advertising more effective — making it more Pavlovian in its persuasiveness — are invariably tied to various forms of identity ascription. Individuals are not treated as free beings who choose their own nature and shape their identity as a form of perpetual becoming; instead they are seen as fixed entities whose behavior can be predicted and prescribed once they are properly represented and reified in data. In other words, surveillance is substituted for self-expression. Your browsing history is treated as though you were willfully creating an identity for yourself.
To make these identities binding, people must be trapped in environments where their "scores" can tell the whole truth about them — that is, environments where the way people are defined from the outside can be used to control what they are able to experience and condition their possibilities for action. Algorithmically filtered feeds are of course one such environment. Some would say that "society" is another.
Companies will continue inventing new ways to reduce people to scores because it is one of the most efficient ways to implement discrimination and instill discipline. Emotion recognition systems (as detailed here) are one of the more extreme versions of state-of-the-art phrenology: Your internal emotional state is given a score based on externally measurable aspects that you may or may not have conscious control over. This MIT Technology Review piece reports on a few companies that are using facial analysis algorithms to make credit ratings for your face. The CEO of one of these firms argues that this provides a kind of knowledge that allows people "to determine their fate" by, say, getting plastic surgery to conform to the prevailing beauty standards that are revealed (i.e. reinforced) through algorithmic analysis. Nothing says personal freedom like conformity. I, for one, am excited for a private company to tell me "you're officially ugly" and "we're selling that information to anyone who wants to use it against you." What a feeling of power and control that will be.
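To make the pattern concrete, here is a deliberately toy sketch of what such a scoring function amounts to: externally measurable quantities collapsed into one number by weights the subject never sees. The feature names and weights below are invented for illustration and stand in for whatever proprietary model a vendor actually runs.

```python
from dataclasses import dataclass

@dataclass
class FaceMeasurements:
    # Externally measurable quantities; the specific features here are
    # invented for illustration, not taken from any vendor's product.
    smile_intensity: float  # 0.0 to 1.0
    brow_furrow: float      # 0.0 to 1.0
    eye_openness: float     # 0.0 to 1.0

# Arbitrary weights: the subject is scored by a formula they never see
# and cannot contest, based on signals they may not consciously control.
WEIGHTS = {"smile_intensity": 0.6, "brow_furrow": -0.3, "eye_openness": 0.1}

def emotion_score(m: FaceMeasurements) -> float:
    """Collapse surface measurements into one opaque number."""
    return (WEIGHTS["smile_intensity"] * m.smile_intensity
            + WEIGHTS["brow_furrow"] * m.brow_furrow
            + WEIGHTS["eye_openness"] * m.eye_openness)

print(emotion_score(FaceMeasurements(smile_intensity=0.2, brow_furrow=0.8, eye_openness=0.5)))
```

Whatever happens downstream then treats that one number as if it were the person.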