Welcome to Galaxy Brain — a newsletter from Charlie Warzel about technology, media, and culture. You can read what this is all about here. If you like what you see, consider forwarding it to a friend or two. You can also click the button below to subscribe. And if you’ve been reading, consider going to the paid version.
Programming note: I have to travel across the country due to a pressing family issue, so I’m not entirely sure of the newsletter cadence this week — there should be one subscriber-only newsletter and one general one coming at some point. Thanks for bearing with me.
We’re all arguing about Facebook again.
The short backstory is that on Thursday, U.S. Surgeon General Vivek Murthy spoke publicly about vaccine misinformation as a leading driver of the country’s now-stalling vaccination effort. Murthy (rightly) singled out the architecture of social media platforms, noting that they “reward engagement rather than accuracy.” His statement sparked the latest round of a ‘what do we do about Facebook?’ conversation that would be mostly unremarkable if not for the president of the United States lobbing a rhetorical grenade into the discourse. Biden’s criticism of Facebook was, uh, unequivocal: “They’re killing people,” he told a reporter.
The president’s words prompted a statement from Facebook. “We will not be distracted by accusations which are not supported by the facts,” it read, before detailing all the good that Facebook has done in the vaccine information realm (2 billion people ‘viewed’ authoritative vaccine info on the platform; 3.3 million Americans used Facebook’s vaccine finder). “The facts show that Facebook is helping save lives. Period.”
This back and forth between Facebook and the White House — conducted through the press — is a good example of an unproductive, falsely binary conversation about a complex topic that deserves far more nuance. Ironically, the argument is a great example of the social media-influenced, flattened discourse that is poisoning us all.
Given all we know about Facebook’s destabilizing effect on politics (Myanmar, the Philippines, QAnon/the MAGA movement, to name a few), it is laughable to suggest that Facebook is only “helping save lives. Period.” As Kara Swisher argued, that period in Facebook’s statement ought to be a semicolon. Of course, Facebook is and has been a hotbed for anti-vaxxers and Covid denialists — the platform is, arguably, one of the biggest vectors for Covid misinformation, just by virtue of its unprecedented size. But Facebook’s size cuts both ways. The fact that the platform is so massive also means that the company’s efforts — like its vaccine finder — will likely reach more people than many well-intentioned campaigns by public health officials and the U.S. government. This is because Facebook is absolutely huge (a problem, if you ask me!).
So, is Facebook killing people? Or saving them? Depending on how you want to assign blame to a social network (which is an information amplifier) it is either doing both or it is doing neither (a version of the ‘guns don’t kill people, people kill people’ argument). The killing/saving binary is an unhelpful frame for a conversation about Facebook because it turns an important conversation about liability into a choose-your-own-adventure argument for already interested parties. It’s a perfect little culture war scuffle that allows Joe Biden to (rightly!) pressure a company that has far too much power and far too little accountability, while allowing Facebook to play the misunderstood victim and doing little to fix its problems. Nothing changes and we move on to the next fight.
The misinformation problem is, unfortunately for us all, messy. Here’s one tiny example: after Facebook’s statement circulated, White House digital director Rob Flaherty asked the company to share how many people have seen Covid misinformation on the platform.
Flaherty’s question is completely reasonable. After all, if Facebook is willing to tell us how many people saw its helpful vaccine information, it makes logical sense to share the other side, too. Of course, that’s not going to happen. Over at Tech Policy Press, Justin Hendrix had a great line about Facebook’s PR strategy, arguing that historically, Facebook will trot out statistics and “emphasize the denominator to obscure the numerator, and vice versa when it’s more advantageous.” Flaherty is essentially asking for the denominator.
Here’s where I think it gets a bit more complicated: What if the denominator doesn’t matter like we think it does? As Siva Vaidhyanathan, a University of Virginia professor who has written extensively about Facebook, argued Sunday morning, simply knowing who has “seen” a piece of content is an imprecise metric. People see tons of terrible or potentially dangerous crap all the time and ignore it for any number of reasons — because it looks shady; because it is coming from a source they’ve never heard of and don’t trust; because they are busy; because their minds are made up; because vaccine discourse doesn’t interest them and they’re just trying to scroll down to find a post from their Facebook knitting group; because whatever.
What matters, Vaidhyanathan and others argue, is the “complex, social process” of narrative creation and the creation of trusted networks that are built to spread a specific narrative (this Twitter thread by the journalist Farai Chideya is a short but helpful encapsulation of this idea).
Basically: the quantity of vaccine misinformation matters, but it likely matters less than the way the platform allows for the creation and consolidation of bigger political and cultural narratives, which are the extremely effective delivery mechanism for the misinformation. Facebook obviously plays a big role in this process, but narrative creation is not remotely limited to Facebook (I find it wild that YouTube is, once again, hardly a part of this bigger political conversation. In March, the company announced it took down 30,000 Covid vaccine videos containing misinformation. That’s great! But it also speaks to the volume of garbage vaccine information floating around the platform).
If we’re going to scrutinize Facebook, it seems helpful to focus on the way it helps incubate these cultural and political movements. Renee DiResta, the technical research manager at Stanford’s Internet Observatory, had a fantastic thread on Twitter that focused precisely on this dynamic. For the last seven years, DiResta has studied and tracked the anti-vax movement across platforms and, as her thread notes, social media platforms became a home in the late 2000s for anti-vaccine activists who’d been largely “deplatformed” by traditional media, who stopped covering their antics.
The social media platforms allowed the groups a centralized home to develop narratives and to consolidate audiences. Having bigger, public spaces to gather online helped create a durable culture around the movement. It birthed a new set of grifters/marketers/influencers. At the same time, the platforms provided handy distribution mechanisms for these influencers and their narratives. The distribution is crucial for many reasons, one of which is that it allowed the movement to become (or at least appear) big enough to pick up the media coverage it previously couldn’t get.
As DiResta notes in her thread, Facebook and other platforms did, eventually, take meaningful steps to try and curtail these groups and their primary offenders. But taking down a Facebook group only does so much to stamp out the cultural and political movement — the one that was nurtured by these platforms.
“I feel like the bigger problem, rather than vaccines or the Big Lie, is the broader environment that facilitates our perpetual, societal dissensus,” DiResta told me on Sunday afternoon.
“It’s constant assumptions of bad faith on the other group’s part,” she continued. “Every public official’s comment is a battleground. Forming consensus to begin with is this seemingly intractable problem.” She suggested that whether the subject is election results or vaccines or even a natural disaster unfolding in front of our eyes, the structural ways in which social media pushes us into online factions make dialogue between groups with opposing ideas basically impossible.
“It’s just so challenging to get to a point where we could even have a conversation that might begin to alleviate vaccine hesitancy,” she said.
DiResta also argued that the Biden vs. Facebook conversation and its focus on the false binary of ‘is Facebook killing or saving people?’ is a good example of a broken information ecosystem. Surgeon General Murthy’s advisory on vaccine misinformation, for example, was not exclusively about the social media platforms, or Facebook — it included ways in which traditional media, the government, and individual influencers all play a role in the way information travels and is received and distorted. It was a nuanced look at a nuanced problem that was pretty swiftly decontextualized by all kinds of interested parties (tech platforms playing the victim, free speech maximalists falsely decrying censorship, the press and the White House seizing especially on Facebook’s role).
Even the statements issued by Biden (perhaps off the cuff?) and Facebook (definitely crafted) were bits of rhetoric designed to travel across this broken ecosystem. If you think of Facebook as a scourge (as I do), Biden’s comment offered a kind of populist catharsis and a confirmation of the company as a Villain Of Democracy. If you’re a Big Tech defender of sorts, Biden’s comment was an example of political scapegoating from a septuagenarian who doesn’t begin to understand the problem.
In other words, the Biden v. Facebook discourse is a flattened, unproductive argument that has been shaped by the very platforms each side is trying to critique or defend. We are all stuck arguing this false binary because this is the way we argue now. And a big reason we argue this way is the influence of big platforms like Facebook, which have outsize influence on how our media covers and amplifies stories, how our corporations try to evade responsibility, how our politicians frame issues, and how the rest of us process information and fall into camps. “These fights we’re having all have important nuances worth talking about,” DiResta told me, “and we’re having them in an information system devoid of nuance.”
Does Facebook bear some real responsibility for our vaccine hesitancy mess? Absolutely! Is it because of specific pieces of Covid-19 anti-vax material? Probably? But the real blood on Facebook’s hands is due to its years of inaction, which allowed broader political and cultural movements to incubate inside Facebook with the help of its connection and distribution tools. Did these movements exist before/without Facebook? Yes. Did Facebook supercharge them and give their biggest grifters access to large pools of money and attention? Yes. Did Facebook allow all these disparate movements (QAnon, anti-vax, etc) to find each other and consolidate into a more cohesive political ideology? Yep. And I’d argue that this movement-building is and was far more dangerous and potent than any individual acts of content moderation.
That’s because these movements now exist powerfully outside of a few social platforms. As DiResta said in our call, the misinformation has gone mainstream. Anti-vaccine narratives are now amplified by influencers who are not primarily concerned about vaccines — people like Tucker Carlson or Rep. Marjorie Taylor Greene. “These influencers have way bigger platforms than anti-vaccine influencers like RFK Jr. did before he was de-platformed,” DiResta said. “Now you have the canonical tropes of Anti-Vaccine misinformation rapidly mainstreamed by those who are not primarily defined by anti-vax beliefs. That has a phenomenal impact.”
What should we do? A lot of things, probably. If we want to have a real discussion about Facebook’s actual damage, there’s a great case for compelling Facebook to open itself up to outside auditors (social scientists, economists, epidemiologists, etc) and granting them access to data that would allow them to determine harm and liability (Hendrix’s piece is good on this). Of course, it seems very unlikely that Facebook will willingly open itself up to any kind of meaningful transparency (see Kevin Roose’s recent piece).
We should attempt to hold Facebook to account, in other words, but we should also recognize that the problem is much bigger than Facebook. This is not a reason to let Facebook off the hook; it is merely a fact. As DiResta notes in her thread, “we need government, media, social media companies, and the public — including citizen journalists and influencers — to all be working together to create an environment in which people are able to make decisions with the most accurate information possible.”
That is a huge ask — one that DiResta told me “sounds pollyannish,” given how all consequential information is immediately politicized and weaponized. What we need is something akin to new conversational and political norms. But I’m not sure what that looks like. There is no easy way out of this situation, which is why I don’t think a vague ‘Facebook is bad’ conversation does us much good anymore. I do think that one very small step forward is not to flatten our important conversations — like this one around Facebook and vaccines — into false binaries. No one needs another reason to hate Facebook; what we all need is a way forward: a way to both live in and begin to repair the broken world Facebook has helped create.
Ok! That’s it for today. I’ll be back Monday or Tuesday with a bloggy, subscriber-only post.
If you are a contingent worker or un- or under-employed, just email and I’ll give you a free subscription, no questions asked. If you’d like to underwrite one of those subscriptions, you can donate one here.
If you’re reading this in your inbox, you can find a shareable version online here. You can follow me on Twitter here, and Instagram here. Feel free to comment below — and you can always reach me at firstname.lastname@example.org.