That “slightly better” does a lot of work in explaining where we are today.
Gabriel was concerned about the delay and complexity that come from trying to be right from the start, as opposed to setting out on a seemingly right path. "The worse-is-better software first will gain acceptance," Gabriel predicted, "second will condition its users to expect less, and third will be improved to a point that is almost the right thing." In other words, for much of that time you live with a flawed product, though Gabriel optimistically described this stage as "half of the right thing." If all goes according to plan, you end up with very nearly the right thing, and far faster than if you had tried to get there in one fell swoop.
Evidently, we are still at the maddening stage of the social networks' evolution in which they are "half of the right thing": They provide an amazing ability to connect people across the globe and share information, ideas, and deep emotions, while also permitting rampant harassment, disinformation, and conspiracy theories. It's a hefty price, and our acceptance of it raises the question: Is worse better?
I emailed Gabriel, nearly 30 years after his essay first spread among programmers by email, to ask if he saw worse-is-better thinking in the way Facebook approaches problems like the 2020 election. He responded with a link to a 2009 keynote speech by Robert Johnson, Facebook's head of engineering at the time, titled "Moving Fast at Scale—Lessons Learned at Facebook."
These were Facebook's glory days—it had grown to 300 million users while remaining generally beloved by its audience. As Johnson told it, Facebook's insistence on moving fast and breaking things came from a place of humility, and from a drive to make something better than its leaders could even imagine. The key was not to let anything—certainly not a few broken features on a huge platform—stop you from testing a new idea.
“This notion we should slow down and get it right doesn’t actually make much sense,” Johnson said back in 2009, “because, unless you have an extremely good idea of what right is, slowing down to get something right just means you are going to take longer before you figure out that you are wrong. And, more importantly, why you are wrong, and move on to the next thing.”
The fatal flaw at Facebook and at the prominent companies that followed its move-fast-and-correct-quickly philosophy isn't simply the speed or all the broken stuff. It's that, by involving themselves so deeply in divisive politics, these companies are applying a programming philosophy to a system that is nowhere near as resilient as computer code. A few crashes of the Facebook site, or other blemishes in the user experience, may well be the price you pay to discover an important innovation in the architecture of a global platform. But when it comes to democracy and tolerance, the crashes caused by bad designs can be genuinely catastrophic, acquiring a momentum of their own. In that case, fixes sent out from headquarters are likely to be inadequate.
"I didn't design 'worse is better' to be a moral approach to designing and making things," Gabriel wrote in an email, but rather as a way for a community "to help design and build a thing that works for them." However, he wrote, "when the 'test' includes either directly or indirectly ad revenue, paid content, political considerations, patronage, and corruption, the evolutionary arc of the platform can go haywire." Had Facebook limited this fast-moving technique to back-end programming, he said, we wouldn't be in our troubling situation, in which social networks imperil free and fair elections. Worse Is Better is good for writing code, but not for coding society.