While trying to make an analogy for a smartphone review, the technology reviewer and journalist Marques Brownlee once made the following observation about the Porsche 911:
Have you ever listened to a car reviewer describe the latest generation Porsche 911? This is a car that’s looked more or less the same for the past fifty years, with slight evolutions each new generation. And literally every time you watch or read a review, they always say, every single time, “Oh, it’s so refined! This is an engineering masterpiece that has been perfected over generations! It’s a formula that’s been developed in the same direction for years!”
This, in a nutshell, captures what a certain breed of aspirational social engineers aim to do. A key advocate of this approach to social engineering was Karl Popper. In his book The Poverty of Historicism, Popper advocated for what he called “piecemeal social engineering.” In opposition to utopian social engineering, which aimed at redesigning societies according to grand blueprints and five-year plans, piecemeal social engineering focused on making small, tinkering adjustments, learning from the results, and using that information to make new adjustments. As this process iterated, it would lead to an accumulation of small improvements and refinements to social institutions, bettering the situation of a given society. As Popper described it,
The characteristic approach of the piecemeal engineer is this. Even though he may perhaps cherish some ideals which concern society “as a whole” – its general welfare, perhaps – he does not believe in the method of re-designing it as a whole. Whatever his ends, he tries to achieve them by small adjustments and re-adjustments which can continually be improved upon…The piecemeal engineer knows, like Socrates, how little he knows. He knows that we can learn only from our mistakes. Accordingly, he will make his way, step by step, carefully comparing the results expected with the results achieved, and always on the look-out for the unavoidable unwanted consequences of any reform; and he will avoid undertaking reforms of a complexity and scope which makes it impossible for him to disentangle causes and effects, and to know what he is really doing.
But how optimistic should we be about the prospects of this piecemeal engineering? It’s widely agreed that the American system of health care has serious flaws. Yet that system is itself the product of the kind of piecemeal engineering Popper describes. In their book We’ve Got You Covered: Rebooting American Healthcare, Amy Finkelstein and Liran Einav trace how it was built up exactly this way: some problem was perceived, a policy was put in place to address it, that policy had its own problems, leading to new reforms, which created new problems addressed with new policies and their own reforms, over and over again. And the end result of this process isn’t a Porsche-style “engineering masterpiece that has been perfected over generations.” The outcome resembles something more like what happens when a person with no understanding of home repair attempts a DIY project and keeps readjusting and rebuilding on top of his own fumbling attempts, creating a monstrous, lumbering result that is simultaneously overly complex and excessively fragile. (The previous description may be based on my own attempts at home DIY projects – I will neither confirm nor deny such speculation.)
Finkelstein and Einav argue that, because of this, further piecemeal engineering isn’t the way forward – the whole system needs to be rebooted. While their proposals are ultimately unconvincing, they are right that the current system is the product of precisely the sort of piecemeal engineering Popper advocated.
But clearly, small refinements and piecemeal engineering can work in some circumstances, such as with the Porsche 911 – or the Apollo space program. So what makes the difference? Here are a few points that leap to mind.
First, there’s the question of whether the social engineer can have knowledge of social problems relevantly similar to the way automotive engineers understand auto design. Popper’s view depends on the idea that social engineers can design their reforms in a way that avoids “a complexity and scope which makes it impossible for him to disentangle causes and effects, and to know what he is really doing.” That social engineers are capable of this is a pretty heroic assumption in its own right, and one that I believe Jeffrey Friedman reduced to powder in his book Power Without Knowledge.
The second issue is the type of learning environment. In a discussion with Russ Roberts on EconTalk, David Epstein talked about the difference between “kind” and “wicked” learning environments. In a kind learning environment, there are clear and reliable methods of feedback that provide useful information, and the way things worked in the past will continue to be how they work in the future. In a wicked learning environment, feedback may be absent, or may point in the wrong direction, and lessons and outcomes don’t repeat themselves the same way over time. As Epstein described it recently, “You can think of kind learning environments as situations that are governed by stable rules and repetitive patterns; feedback is quick and accurate, and work next year will look like work last year…In wicked learning environments, rules may change, if there are rules at all; patterns don’t just repeat; feedback could be absent, delayed, or inaccurate; all sorts of complicated human dynamics might be involved, and work next year may not look like work last year.”
Crucially, a “kind” learning environment doesn’t necessarily imply a given task is simple or easy. Automotive engineering can be exceedingly complex, but it still takes place in a kind learning environment. A manned mission to Mars, likewise, would be an exceptionally difficult feat, but it would still take place within a kind learning environment. Learning about the human body and treating diseases, while complex, still take place in relatively kind environments. But social engineering of an entire healthcare system across a civilization, whether wholesale or piecemeal, would take place in an extremely wicked learning environment.
Lastly, even in kind environments, accurate feedback by itself doesn’t accomplish anything if the recipient of that feedback doesn’t have an incentive to respond to it in a productive way. In markets, price signals provide both feedback and incentives. Even if you have no idea why market prices are sending you a given signal, that’s okay – you don’t need to understand why, as long as you respond.
So it seems to me that piecemeal engineering can work in contained, knowable domains, within kind learning environments, where the engineer has both accurate feedback and an incentive to respond to that feedback in a socially beneficial way. But for engineering social policy, that confluence of factors seems to be very far from the norm.