Meaningful interaction
Last week, Facebook announced changes to how it planned to administer its content stream: The algorithm that decides what users see would now be weighted to "prioritize posts that spark conversations and meaningful interactions between people," according to this corporate announcement, rather than permitting what the company calls "engagement-bait."
In a post accompanying the announcement, Mark Zuckerberg claims that "passively reading articles or watching videos" — as opposed to sharing and commenting on them — leads to less "connection" and more unhappiness, so Facebook will instead force-feed users the content that makes them more responsive — for their own good, of course. "I'm changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions," Zuckerberg writes.
This is part of the company's long history of trying to impose "meaning" or "authenticity" on users through its News Feed. But as for what Facebook means by "meaning," or "authenticity," a concept it declares a "key News Feed value," who knows. These aren't defined but asserted as self-evident tautologies. "Meaningful interactions" are the interactions that people find meaningful. But "meaningful" shouldn't be confused with "engaging," because "engagement-bait" is obviously non-meaningful content that people circulate but must be protected from. And it certainly should not be confused with "informative": though Facebook chooses what users see, it disclaims any editorial or journalistic responsibility for those choices (one of the classic examples of using algorithms to evade accountability). It has no publishing vision, just metrics.
"We feel a responsibility to make sure our services aren’t just fun to use, but also good for people's well-being," Zuckerberg writes, implicitly acknowledging that the site has been engineered to be "fun" at the expense of its users' "well-being." But none of these terms are clarified either. What is "fun" here? Addictiveness? And what is "well-being," besides something "leading experts at universities" apparently know about? And why is "passively" reading things — sometimes known as "thinking" — held up for special suspicion?
It's odd for Zuckerberg to castigate "passivity" as the problem when the News Feed was initially designed to facilitate it. Rather than requiring users to gather updates on what friends were doing on the site, the News Feed passively compiled them all to save you the trouble. It took the RSS model of content delivery and adapted it to Facebook's closed platform, and then it used algorithmic sorting to try to put the most captivating content (as measured by behavior Facebook's interface could capture) on top. This essentially turned Facebook into a broadcasting platform — "social networking" became "social media."
In the wake of its role in spreading false information, the site now wants to change its content mix, so it can claim it has done duly diligent self-regulation. But it is not changing its underlying approach: collect data on people to target them with the "right" information, whatever it feels like calling "meaningful" on any given day. "Bringing people closer together" seems to be the current slogan. The tautological nature of the slogans, and "meaning" and "authenticity" and the rest, all serve as alibis for this business model: capturing audiences, collecting data about them, and then selling ad space accordingly.
Facebook's idea of "bringing people closer together" apparently involves determining which of your "friends and family" get you to use Facebook most and then (a) showing you more of their content while (b) getting them to post more of it. "Meaning," if there is one to this construct, is fully internal to Facebook and to what it measures. If it can't be captured and quantified on the platform, it is not meaningful. Meaningful interaction, ultimately, is just interaction you have on Facebook, nothing more or less.
This sort of interaction is not more active or less passive; it is a kind of collective passivity (or "interpassivity") in which people let Facebook do the interacting for them. If someone's post shows up in my News Feed, I may as well have talked to them; more so if I "like" it. If we all "interact" with each other this way, we can trap each other in Facebook's loops of programmatic connection — getting "closer" and "closer" still. It's going to mean so much, you're going to be so sick and tired of all the meaning.
And then "meaning" will change yet again. Even if you choose to believe Facebook is in good faith in its desire to use "authenticity" to "bring people closer together," even if you think authenticity has anything to do with being "closer" to people, the problem remains that what we experience as "authentic" is always a moving target. It tends to be understood in relation to what has become overfamiliar or calculated, as a form of spontaneity, which means that programming an algorithm to spot it is inherently self-defeating. So Facebook will continually change its algorithm and keep those changes as secret as possible to try to produce the illusion of serendipity.
So the parade of statements announcing Facebook's renewed dedication to a new kind of "meaningful interaction" will be endless. These periodic updates will always be meant to guarantee "authentic expression" by changing the algorithm that content producers (professional and amateur) are trying to game to get into other people's news feeds.
"Meaningful" for Facebook, then, is the fantasy of an ungameable algorithm that would allow its metrics to measure something pure. Every algorithm tweak aspires toward the ideal of being able to read measurable user behavior as a perfect expression of "who users really are" and "what users really want." But in effect, the constant algorithm changes retrain Facebook users, teaching them who they are supposed to be and what they are supposed to want to make the kind of attention Facebook affords more valuable.
An "authentic" user of Facebook, from the company's point of view, cooperates with this process and collaborates with the company's efforts to insert Facebook into our most important relationships (with "friends and family") and let it facilitate the process of "bringing them closer," as if we would be helpless to do so otherwise. But do we really want our relationships to be more dependent on Facebook? Do we want to be "closer" to the people we'd be more distant from if not for Facebook's interventions? What sort of friend talks to you through the News Feed?
***
Any interaction we have with Facebook is meaningful from Facebook's point of view. And the platform's logic is that the more data it collects on us, the better it functions in allowing us to have meaningful interactions. What, then, to make of the company's perplexing claim in its most recent statement that "space in News Feed is limited"? The News Feed is an endless scroll; its space is not limited at all but infinite.
What is limited, perhaps, is the amount of time a user has to spend scrolling, or the number of posts they will look at before closing Facebook minus any of the "meaningful interaction" that gets them to stay on even after they are no longer waiting in a line or otherwise killing time. (Who says to themselves: I want to have a meaningful interaction: wait, I know — I think I will go on Facebook!) Facebook use is typically solitary, not social. It is an individualized consumption experience.
The "limited space" claim may be a ruse to create authenticity through artificial scarcity. (It seems to work for museums.) The implicit promise is that Facebook not only has access to all the necessary "content" of your social existence but can validate a subset of that and authenticate it by showing it to you. If Facebook showed you everything from your friends in reverse chronological order, you would still have agency in choosing what to pause on — you would be required to be active in your filtering. When the algorithm puts "relevant" content closer to the top, you can restrict your activity to responding as Facebook would like you to.
But the "limited space" remark is more directed to advertisers and publishers. Facebook is basically saying, We're working hard to show people less but with more impact, and that's why you should pay more to be included with that content. But if content is "meaningful" by being "authentic" and not calculated for commercial gain, ads always degrade the value of the surrounding content, which effectively becomes ads for the ads.
This makes Facebook the principal agent responsible for making social interaction on Facebook less "meaningful" and "authentic" — it puts people's social lives up for sale. So Facebook is constantly on the lookout for scapegoats. The culprit now is "public content," a new Facebook-ism that Zuckerberg uses to describe content that is not from "friends and family" (presumably "private" content) and hence not "meaningful." This "public content" is bad because it is "crowding out the personal moments that lead us to connect more with each other."
This hinges on an incoherent distinction between public and private. Public here seems to mean material that is addressed to anyone and not just friends. It carries the sense of "being in the public interest." In other words, "public content" is conventional news, as opposed to the News Feed's original version of "news" about what our friends do on Facebook. Of course, "public" also means posts that everyone can see, which Facebook is in favor of: You should set your private content to "public" so that Facebook can "bring you closer together" with people you have friended. But don't broadcast information as though you are the Associated Press; that is public content that should be private, and it will probably make you sad and unhealthy anyway.
What seemed to shock some journalist commentators, like Joshua Benton in this Nieman Lab post, was the sweeping condemnation of reading the conventional news as "passive" and against "well-being." "Optimizing for its own definition of engagement has, of course, been the core of Facebook’s model for years," Benton notes, "But it sure is something to see reading or watching news described as bad for our health."
Reading the news without context on Facebook does, in fact, seem bad for our understanding of the world, if not our health, and the platform is probably right to discourage its use as a kind of newspaper. It may be unfathomable to journalists, but Facebook's idea of "meaning" has nothing to do with the public sphere or informing people. It assumes people can live their whole lives without ever knowing anything about what happened beyond their circle of friends. It doesn't single out certain kinds of information as being in the public interest and worthy of special publicity. Its longstanding habit of making things public by default confirms that. Facebook assumes that every post on the platform is de facto chit-chat among friends and not an attempt to report the news, and it uses algorithms to route that content to the appropriate participants.
But Facebook saw that the same apparatus it had built could also generate virality, which publishers really wanted (and advertisers, and users themselves too, to some extent). It couldn't then resist optimizing for those big numbers. Now it is trying to go back to chit-chat rather than virality ("some measures of engagement will go down," Zuckerberg writes, "but I also expect the time you do spend on Facebook will be more valuable"), which runs counter to every journalistic instinct about how to communicate. Journalism's business model assumes information is only as valuable as the level of its circulation.
News publishers know Facebook can drive traffic to their sites, so they think that it should. (Never underestimate the conventional media's sense of entitlement to people's attention.) But that traffic generation is a by-product of what Facebook wants to do, which is profile and target users with whatever content will keep them on Facebook. Publishers, as they are wont to do, assume that only their professionally made content is capable of gathering attention and holding audiences. Facebook thinks that content is essentially meaningless relative to who shares it. Put another way, Facebook doesn't want to help build the brands of other publishers; it wants to build the brands of Facebook users. Which is to say it wants to build the brand of Facebook itself, which allows those users to have that kind of "brand" in the first place. Facebook doesn't want to reinforce a taste for conventional news, let alone keep people informed with sound perspectives about current events; it wants to inculcate a taste for Facebooking with friends.
So when Benton complains that "the idea that the value of a piece of news is defined by likes and comments — that taking in information without getting into a back-and-forth with your uncle about it is somehow unworthy — is actually a profoundly ideological statement," he's pointing out just this: Facebook assumes you care more about your uncle's Facebook behavior than what a newspaper says. And of course that is "profoundly ideological" — everything in a press release is ideological. Everything Benton says is "ideological." But it is not some sort of secret code. Journalists want to sell more of their product by insisting it is good for us. Their understanding of "meaning" — no better articulated than Facebook's, for the most part — relies on a vague sense of the public sphere and our mandatory civic participation in it on their terms. They want to generate "meaningful interaction" too, but like Facebook, their intention to serve "meaning" is compromised by their imperative to sell ads and make money.
Facebook offers an alternative idea of the public sphere, which has nothing to do with deliberative democracy or good governance but is premised instead on wide-scale depoliticization. In its "public sphere" people talk for the sake of talking, for the sake of belonging and not out of any hope of changing anyone's mind. Nothing matters beyond friends and family anyway. Between the lines of Facebook's pleas for us to "get closer" is the idea that all our interaction is "meaningful" as long as it is harmless.