Welcome to Galaxy Brain -- a newsletter from Charlie Warzel about technology and culture. You can read what this is all about here. If you like what you see, consider forwarding it to a friend or two. You can also click the button below to subscribe. And if you’ve been reading, consider going to the paid version.
Reminder: Galaxy Brain has audio versions of the posts for you to enjoy. I’ve partnered with the audio company Curio to make all newsletters available in audio form — and they are free for paid subscribers. Here’s the most recent audio post.
This week, the Wall Street Journal (and especially Jeff Horwitz) has been cranking out revealing pieces for a series called “The Facebook Files.” The series draws on leaked internal research and shows that the company (obviously) knows quite a bit about the negative effects its platform has on users and the world. In its first four stories, the WSJ revealed that:
Facebook found that Instagram use had strong negative effects on the body image and mental health of teen girls.
Facebook had content moderation systems in place that gave celebrities and politicians special exemptions, keeping them on the platform even when they broke the rules.
The company knew that a 2018 News Feed algorithm change caused an increase in the promotion of divisive, angry, and politically destabilizing content.
Facebook knew about and did not adequately respond to demonstrated instances of drug cartel behavior, human trafficking, and other platform violations in developing regions of the world.
There’s a lot here. I recommend taking a look at this Will Oremus piece in the Washington Post, which elucidates just how many of these problems are due to the company’s culture and organizational structure. Oremus’ framing allows us to understand that there are, indeed, good people at Facebook doing important, thoughtful work trying to understand their platform, hold it accountable, and make it less societally toxic — and that these efforts are often stymied by leadership and profit/engagement imperatives.
For this newsletter I wanted to point out a smaller detail that really stuck with me reading Wednesday’s “Facebook File,” which explained why Facebook tweaked its News Feed algorithms to promote “Meaningful Social Interactions”:
Facebook’s chief executive, Mark Zuckerberg, said the aim of the algorithm change was to strengthen bonds between users and to improve their well-being. Facebook would encourage people to interact more with friends and family and spend less time passively consuming professionally produced content, *which research suggested was harmful to their mental health*. (emphasis mine)
The research the emphasized line refers to is a December 2017 report published on Facebook’s blog. Facebook’s data scientists cited mostly external research but argued that, “in general, when people spend a lot of time passively consuming information — reading but not interacting with people — they report feeling worse afterward.”
It’s important to remember that this blog post came during a post-Trump-election moment when the social network was under intense scrutiny. Former execs like Sean Parker and Chamath Palihapitiya were publicly throwing the platform under the bus for the way it was manipulating users’ emotions and destabilizing society. These are pretty standard claims today, but in 2017 there was a freshness to the whole ‘Facebook is Bad’ discourse. Facebook admitting its role as the World Leader in Time-Wasting Browsing That Makes You Feel Like Shit felt like a meaningful step toward…something.
But never fear, Facebook said. The company had a convenient answer to this budding societal problem!
A study we conducted with Robert Kraut at Carnegie Mellon University found that people who sent or received more messages, comments and Timeline posts reported improvements in social support, depression and loneliness. The positive effects were even stronger when people talked with their close friends online.
All that connection getting you down? The answer is more connection, baby! Just a different kind this time. The rest is the kind of history that’s memorialized in the WSJ piece:
Some political parties in Europe told Facebook the algorithm had made them shift their policy positions so they resonated more on the platform, according to the documents.
“Many parties, including those that have shifted to the negative, worry about the long term effects on democracy,” read one internal Facebook report, which didn’t name specific parties.
I think it’s important to note that Facebook’s decision to incentivize ‘Meaningful Social Interactions’ wasn’t necessarily a horrible, nefarious idea. But increasing these interactions without also amplifying divisive, incendiary content (and thus making people feel awful as they consume it) is a big ask. As former Facebook Civic Engagement team leader Samidh Chakrabarti noted on Twitter, this work essentially requires a set of philosophical and ethical values about what ‘good’ and ‘bad’ content is and what the network should do to promote the former.
In other words: Facebook had an ambitious goal of increasing ‘Meaningful Social Interactions’ and, in large part, failed to implement its changes in a way that made Facebook less toxic.
Going through this history, I’m struck by how Facebook creates shitty outcomes for its users no matter how you tweak the algorithms. Pre-2018, the platform’s architecture incentivized the creation of a lot of medium- and low-quality news and entertainment content, which performed like crazy on the platform. It sucked users in and kept them engaged in a passive way that made them feel worse after a Facebook session. Then, the company tried to incentivize the opposite platform experience in order to get people to engage with each other. It turned out that this was potentially worse and definitely more (politically) destabilizing. At the end of the day, Facebook found that two people screaming at each other and accusing the other of being part of a pedophilic cult or stealing an election is, well, a Meaningful Social Interaction.
Now, I’m being a bit glib and oversimplifying a complex set of policy choices by a company that serves billions of people globally. I don’t believe that Facebook has such power that a few small algorithm tweaks will change human behavior like waving a wand. Facebook employees responded to the WSJ piece by arguing that political divisions have been building for years/decades/forever independently of Facebook. I agree with that to an extent, but it’s also a huge, disingenuous dodge. I do believe that the company’s policy decisions have a meaningful impact on the way we communicate. These algorithmic changes are, essentially, amplification incentivizers. They don’t always change the ideology, but they do change the way an ideology is communicated. That’s important, especially because, over time, repetition in the way we communicate can change the ideology itself. Divisive, dark, and angry rhetoric can dehumanize one’s opponent, as we’ve seen across political movements on Facebook.
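To make that concrete, here is a toy sketch of what an ‘amplification incentivizer’ looks like in practice: a feed ranker that scores posts by weighted engagement. The posts, weights, and numbers here are all invented for illustration (this is nothing like Facebook’s actual formula), but shifting weight away from passive views and toward comments and shares is enough to flip which post gets amplified.

```python
# Toy feed ranker. All data and weights are made up for illustration;
# this is not Facebook's actual ranking formula.

posts = [
    # (title, passive_views, likes, comments, shares)
    ("Calm news recap",      1000, 120,  5, 10),
    ("Angry political rant",  400,  60, 90, 45),
]

def score(post, w_view, w_like, w_comment, w_share):
    """Weighted engagement score: the knobs decide what gets amplified."""
    _, views, likes, comments, shares = post
    return views * w_view + likes * w_like + comments * w_comment + shares * w_share

# Passive-consumption era: raw views carry most of the weight.
old_winner = max(posts, key=lambda p: score(p, 1.0, 2.0, 2.0, 2.0))

# "Meaningful Social Interactions" era: comments and shares dominate.
new_winner = max(posts, key=lambda p: score(p, 0.1, 1.0, 15.0, 15.0))

print("Old winner:", old_winner[0])  # Calm news recap
print("New winner:", new_winner[0])  # Angry political rant
```

Note that nothing about the rant itself changed between the two rankings; turning the knobs just changed which way of communicating gets rewarded.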
This is a digression, but what I’m trying to say here is that any policy or technical decision Facebook makes will be complex in implementation, and bad things and good things will happen. And a fair amount of the criticism of particular Facebook decisions (criticism in which I’ve taken part!) implicitly assumes there is a good kind of tweak/prioritization/moderation that will drastically reduce the bad stuff and drastically increase the good stuff. But I’m not sure I fully buy that. To me, the history I laid out above shows a platform that is broken in ways no marginal changes can fix. Facebook, the passive-consumption machine, is not a great place, nor is Facebook, the fight-in-the-comments machine. What do both of these have in common? Facebook.
In this instance, I’m not trying to be glib. I’ve explicitly laid out the complexities of the platform because I believe that the company has essentially built something so big and influential that running it at the current scale all but guarantees that a lot of really awful and destabilizing stuff happens. Will some good things happen? You bet. Will they outweigh the bad things? Probably not. But also Facebook would like to think the good outweighs the bad and, well, it’s hard to know definitively and so who is to say?!! This is the hill Facebook will die on (more accurately, it is the hill the rest of us will all die on because Facebook is probably just going to wait it out and it certainly has the resources and comms apparatus to do that).
I’ve come to believe that arguments weighing Facebook’s good and bad outcomes are probably a dead end. What seems rather indisputable is that, as currently designed (to optimize for scale, engagement, and profit), there is no way to tweak the platform that doesn’t ultimately make people miserable or destabilize big areas of culture and society. The platform is simply too big. Leave it alone and it turns into a dangerous cesspool; play around with the knobs and you risk inadvertently censoring, or heaping world-historic amounts of attention onto, people or movements you never anticipated, creating yet more unintended outcomes. If there’s any shred of sympathy I have for the company, it’s that there don’t seem to be any great options.
I think there are plenty of overwrought claims about Facebook that are really not about Facebook and mostly about scoring political points. It can feel performative when people say things like “Facebook is not compatible with democracy.” But I do believe that Facebook, at its current scale and in its current design, is not really compatible with humanity.
I don’t mean to suggest we need gatekeepers everywhere or reality czars, or that democratizing speech is a de facto bad thing. It’s not! But I think the way that Facebook operates — the way it incentivizes particular ways of talking to each other, the way it can aggregate attention, the sheer number of people it allows users to reach — seems incompatible with our human ability to handle attention and information. The platform offers constant sensory, informational, and emotional overload. And for this whole historic connectivity experiment to work at scale, the platform asks us for an essentially inhuman amount of restraint — not to pile on, not to argue, to assume good faith, to always appeal to the better angels of our nature. It’s a huge, unrealistic ask of two billion-plus people.
Every day, I’m a little more confident that we’re not supposed to be connected at this kind of scale and with this intensity. And yet, here we are. And I don’t really know what to do about that. The fixes (a more ethical, less scalable business model…that basically doesn’t look like what we’ve come to know as Facebook) feel out of reach. Not because we can’t imagine a better future, but because powerful people are so heavily invested in preserving the exploitative one they’ve built.
Ok! That’s it for today. If you read this newsletter and value it, consider going to the paid version, and come hang out with us on Sidechannel, the Discord you’ll get access to if you switch over to paid.
If you are a contingent worker or un- or under-employed, just email me and I’ll give you a free subscription, no questions asked. If you’d like to underwrite one of those subscriptions, you can donate one here.
If you’re reading this in your inbox, you can find a shareable version online here. You can follow me on Twitter here, and Instagram here. Feel free to comment below — and you can always reach me at charliewarzel@gmail.com.
When I was little and would have stupid fights with my best friend, my mom would attribute it to "too much togetherness" and say we just needed some time apart.
I think social media has created a "too much togetherness" problem for all of us, and it would be better if we stepped back from this so-called connection. And that's exactly what I've noticed my more well-adjusted friends doing -- they've either left Facebook altogether or they're barely on anymore.
I'm not quite in that camp yet, but I'm getting there.
Great post, Charlie. I can’t help but wonder if the only thing that would have made a difference is *not* allowing Facebook to buy Instagram and WhatsApp, because those two acquisitions bolted on so many more users.