False binary
A bill currently being considered in North Carolina, as part of its effort to prohibit people under 21 from transitioning gender, would require any "government agents" with "knowledge that a minor under its care or supervision has exhibited symptoms of gender dysphoria, gender nonconformity, or otherwise demonstrates a desire to be treated in a manner incongruent with the minor's sex" to "immediately notify, in writing, each of the minor's parents, guardians, or custodians."
Presumably this mandate — which compels (as Sarah Jones notes here) state officials to out trans and queer children to their parents whenever possible — is designed to reinforce parents' authority over their children and prevent them from acting on their own behalf with regard to their gender identity. It insists on the closet. It also inadvertently makes plain that gender is not "genetically encoded into a person at the moment of conception" as the bill would have it, but is politically constructed and enforced — that is, it is assigned at birth by a political authority and must be sustained by an ideological matrix underwritten in all its moments by state power.
A detestable law like this necessitates an operational definition of what "gender conformity" concretely consists of. Unfortunately, the field of data science and "AI" is perfectly suited to this task. It routinely takes social biases and prejudices and launders them into naturalized and neutralized facts that are supposedly "discovered" by algorithmic procedures. (What is "normal" gendered behavior? Well, what does Big Data say it is?) It extracts behavior patterns based on the vagaries and assumptions already encoded in how its training data has been previously (and often arbitrarily) processed, and then rationalizes the imposition of those patterns as norms by whatever authority is in control of the process.
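The laundering mechanism described above can be made concrete with a toy sketch. Everything here is invented for illustration — the "profiles," the labeling rule, the categories — but the circularity is the point: when the training labels were themselves produced by a stereotyped annotation rule, the model "discovers" that stereotype with total confidence and presents it back as a pattern in the data.

```python
from collections import Counter, defaultdict

# Hypothetical illustration: the training labels are generated by a
# stereotyped annotation rule, so any pattern the model later "finds"
# was put there by the annotator, not discovered in the world.
def stereotyped_label(profile):
    # The annotator's prior: anyone who views "makeup" content is "female".
    return "female" if "makeup" in profile else "male"

profiles = [
    {"makeup", "news"},
    {"sports", "news"},
    {"makeup", "sports"},
    {"news"},
]
training_data = [(p, stereotyped_label(p)) for p in profiles]

# A naive-Bayes-style tally of P(label | feature) over the labeled data.
counts = defaultdict(Counter)
for features, label in training_data:
    for f in features:
        counts[f][label] += 1

def predict(feature):
    tally = counts[feature]
    total = sum(tally.values())
    return {label: n / total for label, n in tally.items()}

# The model reports with 100% certainty exactly the rule the annotator
# imposed — bias in, "objective fact" out.
print(predict("makeup"))  # {'female': 1.0}
```

Nothing in the procedure distinguishes a regularity in the world from a regularity in the labeling; the algorithm simply rationalizes whatever the data-preparation step already decided.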
This technique gives crude intolerance a scientific veneer. As Dan McQuillan argues, "The character of 'coming to know through AI' involves simplifications based on data innate to the analysis, and the reduction of social problems to matters of exclusion based on innate characteristics is precisely the politics of right-wing populism."
When North Carolina insists that there is a "congruent" way to act like a particular gender, AI could be deployed to substantiate whatever that behavior is alleged to be and to provide spurious evidence that one's gender is fixed and inalterable at the behavioral level — "at scale, men act one way and women another." Or it can be guided toward generating pseudo-objective criteria for "maleness" and "femaleness" based on how a set of algorithms has been aligned to implement specific a priori gender definitions. AI often serves to make political decisions and exclusions appear grounded in a dispassionate assessment of "the data" — Computer says no — as though the data were perfectly and unproblematically representative of reality rather than unavoidably conditioned by incompleteness and the ethical and conceptual problems that come with assembling any data set.
Data scientists have devised numerous machine-learning projects that attempt to ascribe gender to subjects based on pattern-matching analyses, dubiously labeled data, and specious assumptions about what constitutes gender identity. This may take the form of the infamous "gaydar" study conducted by Yilun Wang and Michal Kosinski that purported to link sexual preferences to a person's facial features, or it might look like data brokers assigning a probabilistic gender ("59.3% male") to web users based on their browsing activity. It might look like using facial-recognition or speech-recognition technology to assign gender, a practice that, for example, Spotify has entertained and which the advocacy group Access Now explicitly condemns here.
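What a figure like "59.3% male" amounts to can be sketched in a few lines. This is not any real broker's method — the category weights below are assumptions invented for illustration — but it shows how arbitrary the move is: hand-picked stereotype weights are summed and squashed into a single authoritative-looking percentage.

```python
import math

# Hypothetical sketch of a data broker's probabilistic gender score.
# The per-category weights are invented here for illustration; positive
# pushes toward "male," negative toward "female," by fiat of the designer.
CATEGORY_WEIGHTS = {
    "sports": 0.8,
    "automotive": 0.6,
    "beauty": -0.9,
    "parenting": -0.4,
    "news": 0.0,
}

def inferred_gender(browsing_history):
    # Sum the stereotype weights for every category the user visited...
    score = sum(CATEGORY_WEIGHTS.get(c, 0.0) for c in browsing_history)
    # ...and squash the total into a probability with a logistic function.
    p_male = 1 / (1 + math.exp(-score))
    return f"{p_male:.1%} male"

print(inferred_gender(["sports", "news", "beauty"]))  # 47.5% male
```

The decimal precision does rhetorical work: it presents a designer's stereotypes, run through one line of arithmetic, as a measurement.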
In all these cases, AI is presented as a supposed discovery tool when it in fact tries to impose a reality on whoever is subjected to it. That imposition draws its force not from the identification procedure's accuracy — gender identity has no empirical basis beyond what a subject explicitly and deliberately claims it to be; it is always potentially fluid, dynamic, open-ended; it can't be detected, only proclaimed — but from the legitimacy categorically (and erroneously) assigned to AI.
As Os Keyes, Zoë Hitzig, and Mwenza Blell explain in this paper, "Cultural mythologies and imaginaries about ‘what data can do’ reshape work, practices and values even if their promises are not (and may never be) kept ... even if research processes using AI do not produce more ‘accurate’ results, these processes may nonetheless be interpreted as grounded in additional certainty — simply by deploying the rhetoric of AI." They cite Alexander Campolo and Kate Crawford's paper on "enchanted determinism": the tendency for AI to be accepted as a magical "deep learning" process that can "give unprecedented access to people’s identities, emotions and social character" and invest those claims with a phony objectivity.
In a petition Access Now has organized to call for gender-ascription technology to be banned in the EU, it notes how "these AI systems work by sorting people into two groups – male or female," and that the "algorithms are also programmed to reinforce outdated stereotypes about race and gender that are harmful to everyone." North Carolina's proposed law offers a stark illustration of that. It's not merely an "anti-trans" bill, as many news accounts describe it, but an attempt to legislate identity, withdrawing the possibility of self-determination from all. AI should be understood similarly.