"What you see on Facebook is not organic presentation of information. It is the result of decisions made for you by the company’s software, which follows its leaders’ directives." What if a quote like this were required to be prominently displayed atop the FB feed? On Twitter? A version of it on newspapers? Just as cigarettes carry a prominently displayed warning from the Surgeon General, in the digital/information age we should place similar "media literacy" warnings on information sources. And the federal government should have an appointed official, akin to the Surgeon General, to oversee this (logically it should sit in the FCC, but that agency is so beholden to the industries it regulates that it's problematic).

Yes, this. Thank you

One question that occurs to me as I try (naively) to look at things from a systemic standpoint is whether Zuckerberg himself has the ability to change anything on Facebook, or whether even he is constrained by the organizational system.

It's hard to imagine what forms these constraints might take. But, for example, there's the Board of Directors and scary stock-market swings, along with the apparent belief that the stock market alone can annihilate a company. There's the notion that companies always have to grow (whatever that means) or else that's it for them.

Then, looking down the org chart, I wonder whether a well-intentioned, ethically grounded mass change in operating policies and revenue streams would require a company-wide project of such massive and risky scope that it would be prohibitive. There's just all that code! (As much as the youngsters hate the word "legacy," I suspect there are many layers of legacy software at Facebook.) Overhauling a huge system all at once is incredibly risky. Doing it in pieces and phases takes even longer, delays the public's recognition of the changes, and could produce unintended consequences that make things worse.

Haugen is giving us some high-level insights into another set of internal systems: the algorithmic ones. Charlie's comment that "it addresses Facebook in a more empirical, technical fashion instead of simply a moral or political one" is a major reason I think it's important to look at the world as a bunch of systems. It's essential to recognize the moral consequences of our systems, but expecting individuals to single-handedly change the course of even their own out-of-control offspring is unrealistic and ineffective.

(Actually, FB's best course might be to start from scratch with just the original connect-with-friends functionality, a new user-paid revenue base scaled to the range of national incomes, and strict internal data privacy rules. Then we just all migrate over.)

Without wanting to be facetious, my comment is in large part simply an excuse to use the phrase "Facebook's Pivot to Children" in a sentence.

I am very pleased that this all seems like it will greatly reduce the torque on Facebook's Pivot to Children.

Yep, sounds like unregulated hyper-capitalism. Perhaps, too, a lot that's regulated, but regulated poorly or for the wrong metrics.

"Rather disingenuous ‘politically biased censorship’ claims"??

Perhaps you missed when the tech monopolies banded together to actively censor a news story that could have been harmful to their preferred candidate days before the election?
