Keep out
In a recent article for Buzzfeed, Anne Helen Petersen notes the rise in social-media shaming of perceived social-distancing violations. She interprets people’s chastising posts as “misguided manifestations of fear and confusion in the face of a very real vacuum of authority.” To restore a sense of order, she suggests, we seize the available means of communication to put others in their place.
Because we have received all sorts of contradictory and confusing information from the internet, Petersen suspects that we also tend to lash out through the same platform: “It’s unsurprising that the places where we’re confronted with that information, again and again, would also become the places where we become our worst selves.” This modifies the familiar critique that the internet is a site for evil because of how it divorces connectivity from reciprocity, or the idea that people go trolling because it is often a consequence-free means of getting sadistic kicks. Instead, the online coronavirus shaming is represented as an expression of an authoritarian desire going unfulfilled. What people want, by this logic, is a perfectly clear elucidation of the rules and a perfect authority enforcing them universally. When they see evidence of other people defying the rules and “getting away with it,” they shame them out of frustration with not knowing whom they could turn them in to.
The same mentality of ambient fearfulness and lashing out seems to apply to “snitch” tech like the Ring camera and its associated social-media platform, Neighbors. These are designed to integrate users with local police departments, with whom Amazon partnered to promote its devices. Ring cameras are supposed to constitute the “new neighborhood watch,” but since the pandemic began, homebound people can watch with their own eyes, so the primary function of home surveillance may be drifting. Early on in the crisis, it seemed as though neighborhood-crime apps like Neighbors and NextDoor would be repurposed for the local provision of mutual aid and emotional support, but they are also highly susceptible to the sorts of use and abuse that Petersen details. You can post complaints about how other people are not following the rules, in an effort to generate contagious outrage.
“The most effective way to diffuse collective action — and the sweeping, systemic changes it can spark — has always been to turn those who are suffering against one another,” Petersen writes. That is also a concise way of explaining what Ring cameras accomplish: They reject the possibility of collective action and embed “every person for themselves” at the level of architecture and infrastructure. The cameras alienate homeowners from their own neighborhood, from their own front porch, so that it becomes that place where we get confusing and unsettling information, and thus a “place where we become our worst selves” — defensive, sneaky, suspicious.
Home surveillance is another instance of tech companies finding a way to scale up an understandable human proclivity into something monstrous and destabilizing for everyone, regardless of whether they have bought into it. If enough people are using Facebook, or Ring cameras, everyone’s behavior is reshaped by them. This kind of “network effect” tends to be uncritically celebrated; hopefully the pandemic, which proceeds by the same principles, will change the way we perceive the weaponization of our collectivity.
With social media, the reasonable desires for social recognition and tailored information metastasized into addictive use patterns and “coordinated inauthentic behavior,” to use Facebook’s coinage. With surveillance cameras, personal security concerns aggregate into generalized paranoia. Everything brought within range of a camera is suffused with a newfound potential for crime; transgression is ambient and always imminent, always about to be seen. One can imagine the contact-tracing apps that tech companies are promising will have similar implications: Everything brought within Bluetooth range can be reinterpreted as primarily an infectious agent, while the app continually foregrounds a sense that the world is fundamentally dangerous.
Ring cameras and the like presuppose social distancing, physical isolation from one’s neighbors, paranoia, and fearfulness, in ways that now seem redundant. Surveillance cameras presume that people want separation, want mediated relations, want to have “Neighbors” only through the intercession of a phone app. Quarantine has perhaps suggested that personal interaction is not actually as inconvenient as so many distancing apps presume. It may prove a major challenge for tech companies to convince everyone that the quasi-quarantine conditions people used to opt into with things like home delivery and online shopping are actually “convenient” and “secure” rather than tedious and soul-sucking reminders of this current dark time.
Neighborhood surveillance cameras, like neighborhood watch schemes in general, produce social bonds based on generalized suspicion. But as theorist Sara Ahmed noted in her 2000 book Strange Encounters, “the signifier ‘suspicious’ does an enormous amount of work in Neighborhood Watch discourse precisely insofar as it is empty. The good citizen is not given any information about how to tell what or who is suspicious in the first place.” Ahmed argues that this undefined “suspicion” implies a shared “common sense” about who belongs and who doesn’t. Contact-tracing apps could do something similar with the virus, generating false positives and negatives that nonetheless produce a common sense about how to regard those who might potentially be sick as pariahs. The confused information about the pandemic and the absence of a coherent testing regimen work like an undefined “suspiciousness” that authorizes the kind of behavior Petersen describes — no benefit of the doubt with respect to the behavior of strangers, a determination to read it as ill-intended or destructive.
Surveillance cameras seize on the “emptiness” of suspicion and automate the recognition of suspicious behavior — detecting motion or, potentially, individuals whose faces don’t appear in a preapproved database or whose gait betokens possible malicious intent. Many can be programmed to compile highlight reels of all the moments they detected motion, and more sophisticated forms of video content analysis are being developed that use algorithmic image recognition to extract information from camera feeds and archived video footage. The automatic detection helps reinforce the “common sense” that the “suspicious people” are always out there, implacably threatening the community. Video analytics can produce suspicion on demand, generating a compilation of concerning clips whenever we feel like looking, much as social media feeds refresh for us whenever we feel like scrolling. In this way, the eagerness to protect a certain neighborhood character morphs into a desire to watch this process, foreshortened, as a form of entertainment. In the midst of the lockdown, decontextualized social media posts can appear like these surveillance feed “highlights” — inscrutable moments calling out for the viewer’s outrage. Eventually it becomes a mode of interactive entertainment to seek out something to be angry about, to find someone to blame for the crisis and its anxieties.
Social media have always been partly rooted in surveillance, which has progressively become more extensive and immediate. Platforms now routinely use various forms of facial and object detection to identify who is doing what where, with what sorts of brands involved. They have expanded to incorporate real-time video feeds to capture ever more informational capital, time-space rendered as a kind of possessable property. Cameras in general don’t secure space so much as convert it into a data flow, in a sense securitizing it, making it a kind of investment in producing and capturing a particular future, where certain people can be targeted with ads and others can be targeted as criminals. The surveillance apps being proposed for contact tracing could easily morph into data collection tools in the same way, feeding the already existing system for tracking and targeting people, not merely as individuals but as nodes in a network of connections, or as part of a kaleidoscopic variety of demographics and potential emerging lifestyle clusters.
Camera technology has made it easier than ever for busybodies to monitor their neighborhood and harder for all of us to pretend that there are not cameras pointed at us at every moment, no matter where we are. This ad hoc panopticon doesn’t simulate or facilitate solidarity; it imposes the opposite sentiment, an insistence on proper respect for each individual household’s private property above all other forms of interaction. The surveillance camera places the house within a neighborhood but against it.
The contact-tracing apps would have similar effects — teaching users to envision their bodies as separate and sacrosanct, not coextensive with those of the people around them in a shared space of “public health.” They single out other people as threats, and encourage us to imagine more comprehensive ways to keep them off our property.