Fuckbook
A review of "Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets," by Jeff Horwitz
This is the best book on Facebook yet written, and it contains the full account of Frances Haugen, the whistleblower who ended up testifying before Congress. It's not perfect, but it's full of strong reporting. It's much better than the Ben Mezrich one (The Accidental Billionaires, on which Aaron Sorkin's gloss in The Social Network is based), which I suspect is mostly fake.
Nevertheless, there are shortcomings that arise from treating reporting on Facebook as a standard exercise in business journalism, when there is little reason to consider the company through such a traditional lens. My favorite anecdote is the revelation that President Obama’s choice of Moby Dick as one of his top books triggered an obscenity filter, which one imagines was designed to catch the no doubt quite tasteful nudes of a certain vegan electronic producer.
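The book doesn't say how that filter worked, but the failure mode is the classic Scunthorpe problem: a naive substring match can't tell a book title from a taboo word. A minimal sketch of the kind of filter that would produce this false positive follows; the blocklist and function are hypothetical illustrations, not anything from Facebook's codebase.

```python
# A minimal sketch of a naive substring-matching obscenity filter, the kind
# of design that would flag "Moby Dick" (the classic Scunthorpe problem).
# The blocklist and function are hypothetical illustrations.

BLOCKLIST = {"dick", "ass"}  # hypothetical entries

def is_obscene(text: str) -> bool:
    """Flag text if any blocklisted string appears anywhere inside it."""
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)

print(is_obscene("Moby Dick"))  # True -- a Great American Novel, flagged
print(is_obscene("classical"))  # True -- "ass" hides inside "classical"
```

A word-boundary match would avoid the second false positive but still flag the book title, which is why these systems eventually need human judgment.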
All the maladies of the Silicon Valley perspective are in evidence here: the desire for disaggregated "connection" with God knows who, the pursuit of growth above all else—"the ideology of a cancer cell," as the saying goes—and an unwillingness by people entrusted with resources unseen in world history to take upon themselves value judgments of any kind. All these naive and idealistic nerds, living by the maxim "assume good faith," are frequently surprised that their platform is abused, and are thwarted from taking action by the imperative of growth (a metric not at all coterminous with actual profit) and by their belief in the fiction of platform neutrality.
They realize too late that they are in a political business, and the liberal tribe most employees consider themselves part of makes it clear that it expects a thumb on the scale on its behalf, which upsets the conservatives. Finally, with everyone fed up with their holier-than-thou bullshit, the big guns come out—lots of drugs are sold on the platform, and lots of children are being abused—which provides a pretext to wield the big stick against them.
All of this is fairly par for the course in Silicon Valley. What makes Facebook unique is its role in gathering data on individuals and their networks, and a size that lets it acquire other technologies and thereby shape the market beneath it in key ways. That is the real story, and the substantive reason the company needs to be reined in.
My waggish thought halfway through the book was that Facebook badly needed a discrimination czar, a professional discriminator. The word has a bad odor, but everyone involved, all these data science geniuses, could have benefited from someone in the room with the common sense to say things like "this thing is better than that thing." This is something researchers at the company realized themselves when looking into QAnon recommendations.
There are a few of what you might call "structural features" of the company and its global interventions that are worth scrutinizing. Horwitz has the details, but he doesn't seem to think them through. Of the big platforms, Facebook is the most right-wing, and also the most Israeli. Sheryl Sandberg is from a Likud family; Adam Mosseri is Israeli; Guy Rosen, who arrived with the Onavo acquisition in 2013 and went on to become chief information security officer, is Israeli; and Jordana Cutler seems to have enjoyed a revolving door between Facebook and Netanyahu's circle. On the GOP side, there's Joel Kaplan, a George W. Bush aide, and Katie Harbath, who got her start with the Giuliani campaign. In India there's the saga of Ankhi Das, the company's one-time policy head for the country, who appears to have been too close to the BJP.
There's nothing junior about any of these people; they're all key decision-makers, and it's a team that bespeaks not just a right-leaning tilt but a bias specifically toward the Israel First style of Republican politics that has been dominant in recent decades. Moreover, a few of Facebook's early acquisitions of really key technologies, like Onavo and Face.com, are Israeli (the former is covered here because of Guy Rosen's long subsequent career at the company and because it was fairly straightforward spyware; the latter was facial recognition). Kaplan's role is pretty telling: the global public policy lead, intimately involved in technical and moderation decisions even at the individual level, cares, according to a former aide quoted here, about "the U.S., Europe, India and Israel."
The question is whether this matters for how Facebook is run. It seems to. All the events cited here as major moderation failures when it comes to extremism—the Rohingya ethnic cleansing, the BJP's vigilantism, January 6—tend in a direction that suits the foreign policy of the Likud Party, while Israeli executives make key decisions. Does that seem like a coincidence?
It's possible to continue building that case from what this book shows. In general there are two ways to know how these algorithms work: you either get your hands on them, or you test them from the outside. If you're doing the latter, you need some people to say offensive things to see what the system catches and who those statements draw out of the woodwork. A company will always prefer to blame the algorithm rather than human discretion, even though most human problems will inevitably require human solutions.
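To make the outside-in approach concrete, here is a minimal sketch of black-box probing: submit a battery of test phrases and tabulate which get flagged. The probe_filter helper, the probe set, and the toy checker are all hypothetical stand-ins, not any real Facebook interface.

```python
# A minimal sketch of black-box testing a moderation system: submit known
# probe phrases and record which ones get flagged. check_moderation is a
# hypothetical stand-in for whatever interface is being tested, e.g.
# posting from a test account and observing whether the post survives.

from typing import Callable

def probe_filter(probes: list[str],
                 check_moderation: Callable[[str], bool]) -> dict[str, bool]:
    """Return a map from each probe phrase to whether it was flagged."""
    return {text: check_moderation(text) for text in probes}

# Hypothetical probe set: phrases chosen to map the filter's decision
# boundary from the outside, from innocuous to clearly violating.
PROBES = [
    "a plainly innocuous sentence",
    "a borderline provocation",      # placeholder; real probes vary wording
    "an explicit policy violation",  # placeholder
]

if __name__ == "__main__":
    # Toy stand-in classifier so the sketch runs end to end.
    fake_checker = lambda text: "violation" in text
    for text, flagged in probe_filter(PROBES, fake_checker).items():
        print(f"{'FLAGGED' if flagged else 'passed '}: {text}")
```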
Facebook's big problem is that enough has been publicly disclosed for us to know its discretionary role in shaping the influencer economy has been quite significant, which raises the question of whether still more is going on beneath the surface. There's nothing automated about some of the most prominent programs at Facebook: the Partnerships team, which worked directly with influencers and publishers; the fact-checking program, which actually paid them; XCheck, which exempted certain influencers from fact-checks (including the Israeli-backed Tommy Robinson and Charlie Kirk, whom Joel Kaplan intervened to shield); and the Oversight Board, which ruled on moderation decisions. The Cambridge Analytica scandal itself is a story about third-party data brokers being used for ad targeting, so one gets the impression of a sprawling infrastructure of deferred responsibility.
The really interesting acquisition is CrowdTangle, which has implications for the journalism business—one detail I'm fascinated by is that both Pfizer and NPR requested pitches on the same day. It had become an industry standard in many newsrooms before Facebook acquired it in 2016. The way it tended to be used: a writer with a good nose for what would go viral would pick a story or moment beginning to trend, write up a post, and be reasonably sure it would do well. The phrases "winning the Internet" and "making Internet" are based on this kind of thing. There is of course an operant conditioning problem here, creating expectations about what people like to see, but after the acquisition you can see how the recursiveness of this piece of technology, combined with algorithm tweaks, could be used to more or less control a newsroom. Another thing not discussed here is how Facebook's algorithm changes around the "pivot to video" destroyed the balance sheets of lots of media companies. Maybe they did it on purpose.
Even downranking, which began in 2017 when the platform started to punish clickbait, seems to exist to limit the spread of things rather than to make judgment calls about what can fly and what can't. "The company would punish hoax publishers for everything," Horwitz shrewdly observes, "except the fact that their news stories weren't true."
The reality is you can fact-check almost anything, and you can find a reason to downrank or ban almost anyone. So the way to think of this architecture is as a way to select and back certain influencers and publishers. Influencer culture in general is mostly Israeli, and with Elon Musk insisting they give their IDs to Unit 8200 in order to get paid, the same is true of Twitter. The purpose is to control information on behalf of the Israeli state and cartelize the influencer economy. This is the sort of thing Taylor Lorenz ought to be writing about.
This is especially true on the right-wing side: the figures at the center of the alt-right controversies of the Trump years are almost all Israeli-backed, an irony unappreciated by the extremism-watching industry—Lauren Southern's book was ghostwritten by David Frum protégé Mytheos Holt, Tommy Robinson got money from Daniel Pipes's group, and Breitbart, by its own admission, was founded in Jerusalem. According to Horwitz, American right-wing influencers were shielded from fact-checks more than almost any other group. The implication is that if you want to be a conservative influencer, you have to get the Israeli blessing from Big Tech.
The above is a reframing of some of the upper-level story of the company. Horwitz's account is better when it comes to the increasingly dystopian experience of the average user, much of which Facebook failed to fix, and which has a lot to do with the network's declining popularity. The minimal fixes that do get tried go a very small distance toward solving the problem: the company responds to the unusually promiscuous friending behavior of a small number of accounts by capping it, which probably just makes those accounts grow more slowly.
One thing that surprised me in Horwitz's account of the 2016 election is how little was made of the alleged Russian election manipulation effort. It turned out to be pretty small—ad buys of less than a quarter million dollars—but it occupied the public conversation, certainly in DC, out of all proportion to its likely impact. The company's head of cybersecurity policy, an NSA alum hired in 2018, is quoted as saying, "The idea of applying influence operations to the internet and social media of 2016, people weren't focused on that." It seems more accurate to say they didn't want to look: yet again the DNC hack is described by Horwitz as a Russian hack-and-leak operation, when the best evidence today suggests it was Israeli.
Horwitz quotes a report showing that 64 percent of all extremist group joins came from Facebook's own recommendation tools. I would be the first to dispute the definition of an extremist group, but that's still quite astonishing. The recommendation algorithms were degrading the quality of the news people consumed. With all due caveats that the sort of person probably making these decisions considers anything conservative low-quality, that's still a problem.
The gripping final third or so of the book is the story of Frances Haugen, who struggled a bit before getting back into the tech business to work on integrity and civic matters for the company. It’s an interesting story. I do think Facebook’s failure to catch her in the months-long process of gathering leakable information, given what she worked on, would tend to vindicate rumors that she was a kind of limited hangout. That doesn’t mean she was wrong, and it’s easy to see Ben Shapiro’s self-interest—he was given special treatment on Facebook thanks to Jared Kushner—in trying to shut down Haugen’s revelations.
You could do worse than to conceptualize Facebook as an alien invasion of American media. The pod people land in the South Bay and set about trying to imitate the humans by noticing how they interact. Vaguely understanding that things like "Meaningful Social Interaction" are important, they integrate an MSI metric into the hive-mind, which encourages the humans to turn on each other by boosting acrimonious and "scummy" content. Mark Zuckerberg, the alien starlord, in imitation of the myths that seem to animate the locals, calls himself a "wartime CEO," like he's Alexander the Great. And these humans, you see, they like to read the news, and they have all these opinions about it, so the aliens devise a "Broad Trust" metric to determine which publishers are generally seen as reliable.
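The mechanics behind the allegory are simple enough to sketch. Reporting on the leaked documents describes MSI as a weighted sum over engagement types, with comments and reshares counting for much more than likes; the weights and function below are hypothetical stand-ins, not Facebook's production values, but they show why such a metric rewards acrimony.

```python
# Illustrative sketch of an MSI-style ranking score: a weighted sum over
# engagement types. The weights are hypothetical stand-ins, chosen only to
# show the dynamic: a post that provokes a comment fight outscores one
# that draws quiet likes.

MSI_WEIGHTS = {"like": 1, "reaction": 5, "reshare": 5, "comment": 30}

def msi_score(engagement: dict[str, int]) -> int:
    """Score a post by summing weighted engagement counts."""
    return sum(MSI_WEIGHTS.get(kind, 0) * count
               for kind, count in engagement.items())

calm_post  = {"like": 200, "comment": 2}   # widely liked, little discussion
angry_post = {"like": 20, "comment": 40}   # a comment-section brawl

print(msi_score(calm_post))   # 260
print(msi_score(angry_post))  # 1220 -- acrimony wins the ranking
```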
If you were to write this story, there would have to be a character, maybe a homeless person squatting in Menlo Park, who notices the dead look in their eyes and the fact that they don't quite talk like human beings. There's something off about them. The difference from an ordinary alien invasion story is that they arrived with no invidious intent, but in the course of studying the humans, and maybe even sincerely trying to help them, they drove all of us rats in their sick Skinner box completely mad.
What I would like to know, and Horwitz doesn't venture a guess, is what proportion of content on Facebook a reasonable person would consider fake. It sounds like a lot. "Six out of the top ten Black-themed pages… were troll farms," along with all of the top 14 English-language "Christian and Muslim-themed pages," which were faked, and an overwhelming majority of evangelical content. Facebook "hadn't just created bad precedents—it had created bad ecosystems." The whole platform was literally overloaded with shit.
If I may presume to speak for the humans, they have overstayed their welcome, and we would like them to leave now. They’re a menace.