Welcome to Galaxy Brain -- a newsletter from Charlie Warzel about technology and culture. You can read what this is all about here. If you like what you see, consider forwarding it to a friend or two. You can also click the button below to subscribe. And if you’ve been reading, consider going to the paid version.
Reminder: Galaxy Brain has audio versions of the posts for you to enjoy. I’ve partnered with the audio company Curio to make all newsletters available in audio form — and they are free for paid subscribers. Here’s the most recent audio post.
There’s been a tremendous amount of good reporting and analysis on the Facebook Whistleblower saga so I thought I’d just jot down some thoughts about the last few days and why they stand out in what has been about four years of steady bad news about the platform.
First, there are multiple types of Facebook scandals. There’s the data privacy scandal, which usually leads to familiar critiques that the company is: 1) Too big. 2) Reckless with users’ information because it prioritizes profits over people. There’s the content moderation scandal, which usually leads to the critiques that the company is: 1) Too big. 2) Reckless in optimizing for engagement over user safety. 3) Destabilizing societies across the globe. There are obviously other types of Facebook scandals, but these two tend to be the most newsworthy because they intersect with politics.
During previous Facebook scandals (especially those that focus on privacy and Facebook’s myriad content moderation issues) even the most productive conversations tend to get flattened. Reporters write more nuanced stories but the media ecosystem plays a game of telephone with the original story, stripping bits of context out until people are left with a simple narrative: ‘Facebook is dangerous and responsible for X’ [the slow erosion of democracy, political polarization, electing demagogues, helping Russia infect Americans with propaganda, Brexit, very personal invasions of privacy, including listening in on your conversations to feed you ads]. Some of these things, like the listening in on you for ads part, aren’t true. Others are basically true. Others are directionally true but the particulars are exaggerated.
A good example of this whole game of telephone at work is the Cambridge Analytica scandal. Cambridge Analytica was a data privacy scandal (and it was originally reported that way) but I’ve spent years talking to people — in the government, in the media, in everyday life — who believe that the lesson from Cambridge Analytica is that psychographic profiling is essentially akin to mind control. Others believe that Cambridge Analytica played an outsize role in getting Donald Trump elected. I think the evidence on that is pretty slim. In my opinion, the important part of the Cambridge Analytica scandal is that Facebook was caught being reckless with its data and that a clearly sketchy company with ties to defense contractors was trying to partner directly with political campaigns across the globe to elect the highest bidder.
The reason for this digression is that, during these scandals, the popular narrative tends to ignore the nuances of what Facebook is really doing to all of us. During the Cambridge Analytica scandal, aggregated headlines and stories and political grandstanding (during hearings and in endless lawmaker posts and comments) repeated the marketing company’s most salacious claims about manipulating voters. There was plenty of good reporting that psychographic profiling is far from a perfect science, as well as work from researchers like Brendan Nyhan, who argued that “it’s very hard to change people’s minds, especially when so many are already committed partisans.” But that nuance didn’t often trickle down to those who don’t pay constant attention to Facebook’s Democracy Dilemma.
Broadly, I think it’s helpful when Congress forces the Big Tech executives to come testify. I don’t think all these discussions are very productive, but the hearings are a powerful bit of optics that frame whatever is going on with the platforms as a matter of national interest. That’s good. But still, I find myself pulling my hair out over the lawmaker grandstanding and the extreme partisan divides on what the platforms are doing wrong. I worry that there’s nothing unifying for lawmakers to latch onto to create policy besides vaguely disliking Facebook. If Republicans are furious about their rather spurious claims of censorship and Democrats are anxious about finding hastily considered ways to clean up the platforms, it’s pretty clear nothing big will get done. ‘Facebook is bad’ is not the best policy frame when you’re dealing with a complicated platform with billions of people in every region of the world.
Which brings me to Tuesday and Frances Haugen. Haugen is a former Facebook product manager and a longtime tech employee. She understands the systems she’s talking about and, helpfully, has seen how they operate at multiple companies. She’s the primary source of the Wall Street Journal’s excellent investigative series, which I wrote about earlier this month. She revealed her identity on 60 Minutes on Sunday and testified before Congress on Tuesday, and her interview and testimony may have fundamentally altered the Facebook narrative in a positive way. Specifically, Haugen managed to elevate the conversation about Facebook by focusing it on the platform’s design and algorithms instead of portraying the company as a politically motivated, censorious juggernaut or an evil empire set on global destruction.
Her focus on Facebook’s algorithms was important because it was both credible (it’s her field) and specific. In her interviews and testimony, Haugen did not cite the algorithm as some magical force, but as a tool for making decisions at scale. Here’s how she explains it on 60 Minutes:
Frances Haugen: So, you know, you have your phone. You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes. But Facebook has thousands of options it could show you. The algorithm picks from those options based on the kind of content you've engaged with the most in the past.
And one of the consequences of how Facebook is picking out that content today is it is -- optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions.
If you pay close attention to this stuff, what she’s talking about is Platforms 101. But most people don’t pay close attention to this stuff. And what Haugen is doing here is articulating a very powerful point that many Facebook users still take for granted: What you see on Facebook is not an organic presentation of information. It is the result of decisions made for you by the company’s software, which follows its leaders’ directives.
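To make the mechanics concrete, here is a deliberately simplified, hypothetical sketch in Python. None of this is Facebook's actual code; the post fields, scores, and function names are invented for illustration. What it shows is the shape of Haugen's point: an engagement-optimized feed is, at bottom, a sort over predicted reactions, with no term in it for truth, kindness, or harm.

```python
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    predicted_engagement: float  # a model's guess at likes, comments, shares, angry reactions


def rank_feed(candidates: list[Post], limit: int = 100) -> list[Post]:
    """Return the `limit` posts most likely to provoke a reaction.

    Note what is absent: there is no check for accuracy, civility, or user
    well-being. The only signal is predicted engagement.
    """
    return sorted(candidates, key=lambda p: p.predicted_engagement, reverse=True)[:limit]


# Thousands of candidates compete for every scroll session; only the most
# reaction-provoking handful are ever shown.
candidates = [
    Post("aunt", "Look at my garden!", predicted_engagement=0.2),
    Post("friend", "Started a new job this week", predicted_engagement=0.5),
    Post("stranger", "You won't BELIEVE what they're hiding from you", predicted_engagement=0.9),
]
for post in rank_feed(candidates, limit=2):
    print(post.author, post.text)
```

Real ranking systems are vastly more complicated than this toy, but the incentive it encodes is the one Haugen describes: the sort order rewards whatever provokes a response.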
This is a powerful sentiment because it gives every Facebook user a tangible example of how the platform deprives them of a certain kind of agency. In 2018, when the Cambridge Analytica scandal was in its second week, I wrote that it would have staying power because it reminded regular users how platforms have “stripped us of the agency to dictate what happens with our most personal information.” I think Haugen’s testimony (and the documents that help back it up) will do something similar for people who may not have realized that Facebook is not a pure reflection of what’s happening in the lives of their friends and families but a highly curated one. Talking about Facebook from the perspective of user agency has the potential to be effective. The company isn’t all powerful and platforms aren’t mind controllers, but they do exert influence over how information is amplified. And that’s a responsibility to be held accountable for.
The algorithmic focus also turns the conversation away from the ‘WHY ARE YOU CENSORING US? THIS IS BIAS!/WHY ARE YOU LEAVING THIS CONTENT UP?’ binary — a favorite especially in congressional hearings. I thought this was most apparent during Sen. Ted Cruz’s questioning. By his standards, Cruz was rather subdued, but what was striking to me was the way that Haugen’s focus on algorithmic amplification cut off his attempt to bring the Facebook conversation back to liberal-bias territory:
Haugen: You mentioned earlier concerns around free speech. A lot of things I advocate for are around changing the mechanisms of amplification, not around picking winners or losers in the marketplace of ideas.
Cruz: Explain what that means.
Haugen: It’s like how on Twitter, you have to click through on a link before you re-share it, small actions like that don’t require picking good ideas and bad ideas, it just makes the platform less twitchy, less reactive. Facebook’s internal research says each one of those small actions dramatically reduces misinformation, hate speech, and violence-inciting content on the platform.
Cruz was out of time, but it’s also clear that Haugen’s focus is helpful in deflecting the rather disingenuous ‘politically biased censorship’ claims that lawmakers like Cruz constantly toss out against the platforms. The conversation shifts to focus on reach, not speech. This is still delicate territory — amplification decisions can ultimately be just as political as de-platforming decisions. But we are far closer to the heart of Facebook’s information problems, which center on what the company chooses to boost, not just what it hosts.
One doesn’t have to agree with everything Haugen said to appreciate the injection of nuance into the conversation. One particular response from Haugen that struck me was her discussion of standards for deciding how far a piece of content has to spread across a platform before it’s no longer considered private. Again, this is not an easy thing to decide on! But it’s more along the lines of how we could talk about social media: a conversation grounded less in moral grandstanding and easy villains and more in the platform’s architecture. The solutions may still be controversial, but they are more productive.
Lastly, I think Haugen’s testimony is helpful in that it addresses Facebook in a more empirical, technical fashion instead of simply a moral or political one. Haugen said repeatedly in her testimony that she didn’t think Facebook set out to build a destabilizing company that contributes to teen self-harm. I think that’s true. Instead, she shows (with ample evidence) the ways that the company has privileged profits over people and growth over safety again and again. And while I get that people are mad and would like to imagine Facebook leadership as psychopaths or secret Republicans trying to get Trump elected or pure chaos agents who want to overthrow the world order, I think that’s too convenient. Personally, I think the truth is more damning. Facebook’s leadership might have a negative effect on the world that amounts to evil, but if so, Facebook is evil in the way that all unregulated hyper-capitalist businesses are evil. At the end of the day, it doesn’t matter what’s in their hearts; it matters what their actions are doing to people. And it matters what we do in response. That response starts with a better, more focused conversation. I think that we’re a step closer today.
Ok! That’s it for today. If you read this newsletter and value it, consider going to the paid version, and come hang out with us on Sidechannel, the Discord you’ll get access to if you switch over to paid.
If you are a contingent worker or un- or under-employed, just email and I’ll give you a free subscription, no questions asked. If you’d like to underwrite one of those subscriptions, you can donate one here.
If you’re reading this in your inbox, you can find a shareable version online here. You can follow me on Twitter here, and Instagram here. Feel free to comment below — and you can always reach me at charliewarzel@gmail.com.
"What you see on Facebook is not organic presentation of information. It is the result of decisions made for you by the company’s software, which follows its leaders’ directives." What if a quote like this were required to be prominently displayed atop the FB feed? On Twitter? A version of it on Newspapers? Just like with cigarettes we have a prominently displayed warning from the Surgeon General, in the digital/information age we should have similar "Media Literacy" warnings placed on information sources. And the Federal Government should have some appointed official like the Surgeon General to oversee (logically should be in the FCC but that agency is so beholden to the industries they regulate it's problematic).
Yes, this. Thank you