Under the dome
As the Mueller investigation into U.S. election tampering begins to yield indictments and another school shooting has prompted another predictable round of conspiracy theories and denialism, concern over attempts to troll or otherwise hack the media and tamper with people's sense of what has "really happened" (i.e. has been given the imprimatur of media coverage) feels especially intense now. It seems more commonly understood that social media and algorithmic sorting play a role in this, allowing misinformation and disinformation to find their target audiences.
But it's not clear how to change this. In lieu of a solution, algorithmic power is being mystified and romanticized, represented as though it functions at a level of complexity that makes taming it or dismantling it impossible. The machine learning and artificial intelligence that go into algorithmic systems are ascribed an amorphous agency so that tech companies can disavow their own, as though it were an accident or fate that their systems exist, and not the accreted result of decisions made to maximize profits. Fundamentally, algorithms are trade secrets: they exist to generate asymmetric knowledge and striate the marketplace. They are designed to keep people guessing.
So it is no surprise that Facebook's ongoing reprogramming of its sorting algorithms has fueled what Laura Miller in this Slate piece calls "algorumors," basically fake news about the fake-news machine. Ironically enough, a sound strategy for gaming Facebook's algorithms and gaining reach is to go meta and try to go viral with speculation on how virality happens. Given how opaque Facebook's aims are in adjusting its algorithms — they want to promote "authentic engagement" and "meaningful interactions," as if those terms aren't tautologies — they seem to invite occult theorizing. Trying to define "authentic" or "meaning" is essentially a theological question; Facebook speaks in meaningless riddles because it wants to be seen as a remote god.
Miller argues that these rumors spread because those who depend on Facebook for social participation also have little understanding of how it actually works. It is a classic "expert system" in sociologist Anthony Giddens's sense, something we have to try to trust without understanding it once it has worked to "disembed" us from local frames of reference. Facebook's asynchronous mode of communication and algorithmic sorting disorient users, undoing familiar frameworks of time and space and leaving them in the position of having to propitiate the platform like an angry god while trying to adapt to the kind of sociality it imposes: "Facebook already does choose what we see," Miller writes, "and all we have with which to placate the strange and unknowable algorithms that govern it is the ritual magic of our likes and shares."
The mystification of social media's workings makes platforms prime territory for scams, pranks, hoaxes, and other forms of abuse. They unfold a space where the more familiar methods of contextualizing and parsing social interaction are absent and the grounds for trust are muddled, obfuscated, warped, disguised — who is talking? why am I seeing this? what aren't I seeing? when did they say this? who were they talking to originally? etc. — all while that same space is intensively promoted as an ideal spot for sharing, openness, friendship, "meaningful interaction," "genuine conversations." We are supposed to accept this highly unstable space as both an ideal public sphere and a forum for deep connection simply because we can access it directly, through our phone, an interface designed and customized for one. The promise is that sociality will be so convenient that we'll accept that ease as a more important priority than trust. The protective bubble of the phone gives relief from the tentative and awkward negotiations that face-to-face communication can require. The awkwardness online takes different forms, as disinhibition or solipsism: personal boundaries are overlooked or transgressed, identity is spoofed, communication becomes ambiguously ironic. Conversational contexts become harder to identify or fix. What we want to talk about and why changes. People share articles in a way they wouldn't think to talk about them — they treat them like badges of honor or identity or loyalty without necessarily having read them.
But it is not as though we have left behind a world where trust was never abused, or where media was not manipulative. Much of today's concern with how easy it is to dupe people in digital media suggests a nostalgia for a time when the media was somehow immune to misinformation and free of its own distorting agendas. The anxiety tends toward a doomsday tone, as in this essay by Mark Pesce, "The Last Days of Reality," which comes across as a more elaborate version of an algorumor. Fake news, machine learning, neuromarketing, and augmented reality, Pesce argues, are fusing "in a tangled nexus of technology and capital, each amplifying the others almost beyond imagining, warping the fabric of reality, offering attractions so alluring many will find it difficult to resist, framing an emerging world that can only be termed ‘post-real’."
Basically, the technology exists to invent and disseminate real-seeming news items to people prone to accepting them as true: hence "the real world is about to disappear" because we will all be duped. The efforts to discredit the Parkland students who have organized against the gun lobby are an illustration. Evidence is fabricated to cast doubt on the teens' motives (the idea, for instance, that they are "crisis actors"), and then algorithmic recommendation systems are exploited to make sure this "evidence" spreads to people who might be open to believing it, or who want their prejudices confirmed.
This is terrible, but not new. It is just more overt, more public, more mechanized and rationalized as a process. Reality has always been a local phenomenon, and people have always preferred to confirm for each other "truths" they prefer to believe in gossipy conversations rather than verify everything against the "facts on the ground." Pesce argues that Facebook is incentivized to serve fake news tailored to each individual's bubble in order to keep them using Facebook, but it is not as though people were not already self-selecting dubious sources of information outside social media. What Facebook does is make that bubble a convenient, on-demand sort of escape — one that is also visible and trackable, and capable of being administered more deliberately.
Maybe because I already tend toward pessimism and have made these sorts of dire pronouncements myself, I don't find the doom-mongering all that useful. I think a more compelling commentary on the "post-real" can be drawn from the show Nathan for You, which makes media manipulation and our susceptibility to it into comedy, making us see our complicity in it through how we can be brought to laugh about it. After all, we all participate in various forms of media manipulation, presenting ourselves in a flattering light, selecting the perspectives we want to foreground over others, remaining quiet about other things. It does no good to pretend that people are only victims of it.
On the show, Nathan Fielder poses as a business consultant offering to help supposedly struggling small businesses, usually by concocting elaborate stunts to attract media attention. This makes it seem as though the only possible problem a business can face is insufficient attention; in other words, all businesses are in the attention business. Everything is a media company. (This mirrors the experience of most social media platforms, where every user is invited to think of themselves as a broadcaster trying to score high ratings.)
Much of the comedy lies in how Fielder's plans quickly become unwieldy, as the bait he comes up with to attract media attention ends up necessitating all sorts of extra layers of deception or contingency planning. Usually he hires actors from Craigslist who bring their own quests for media attention into play. Suddenly all business is theater. The goal of the schemes is usually to mislead the media into making a business into news, but Fielder typically plays up a nebulous personal agenda of his own — some awkward relationship he appears to be pursuing with one of his clients or freelancers — which casts his own representation of events, the one viewers see, into question. So even as we are in on the larger joke, there is a sense that a joke is also being played on us — and after all, like all reality TV, the entire show could be contrived, pieced together through deliberate editing.
The final episode of the show's most recent season ends with a scene that sums up a lot of its stakes. Throughout the episode, Fielder is trying to help "Bill," one of the actors from an earlier prank (he appeared as a Bill Gates impersonator), find a lost high school love in Arkansas. At one point Fielder hires Maci, a female escort, to try to take Bill's mind off the woman he is obsessed with, but soon Fielder begins seeing Maci himself. The focus shifts away from Bill's fantasy of lost love in the past to Fielder's present-day fantasy of "love" with Maci, who, like Bill ultimately, is a paid performer whose "real" feelings are obscure. The show ends with an ostensibly intimate scene between Maci and Fielder at a playground, where he tries to get serious and real with her, but then the camera pulls back and the viewer is shown just how many people are on the scene filming it, and how misleading the scene (and by extension the entire series) had been up until then. Even when viewers might feel as though they are fully in on the setup, there is always another setup.
But this reveal doesn't mean the emotions onscreen haven't been real, or that we now know the real truth about what happened. Instead it's clear that there is no real truth, no final explanation, no perspective from which everything is clear and all motives are explicit. We are always making assumptions about intentions based on how things are framed, but those framings have become easier for everyone to manipulate, to impose through the various ways we can mediate things ourselves. To escape the layers of different perspectives and points of view in play, one would have to withdraw into oneself.
***
Over the weekend we went to the Met Breuer and happened to see Anselm Kiefer's 1970 painting Everyone Stands Under His Own Dome of Heaven, which is at the top of this post. The Met's description quotes Kiefer's explanation of the painting: "Each man has his own dome, his own perceptions, his own theories. There is no one god for all. Each man has his own, and sometimes [it] overlaps with or intersects another’s." That nicely summarizes my reading of Nathan for You, but it doesn't really describe what the painting depicts. There aren't overlapping domes, but one man under one dome, giving a Nazi salute. The figure is isolated and remote, possibly even a statue.
The painting seems open to several interpretations: is it heaven under the dome because one can be a lone dictator in there, controlling everything, or is it heaven outside the dome, because one's fascist impulses are automatically quarantined? Or is it heaven because the dome we are always under is finally revealed, and is itself the dictator we salute in gratitude and relief?