The call to empower experts, and to keep politics to a minimum, failed to trigger a clear shift in how Washington did business. But it did crystallise the assumptions of the late 1990s and early 2000s – a time when sharp criticisms of gridlock and lobbying were broadly accepted, and technocratic work-arounds to political paralysis were frequently proposed, even if seldom adopted. President Barack Obama’s (unsuccessful) attempt to remove the task of tackling long-term budget challenges from Congress by handing it off to the bipartisan Simpson-Bowles commission was emblematic of this same mood. Equally, elected leaders at least paid lip service to the authority of experts in the government’s various regulatory agencies – the Food and Drug Administration, the Securities and Exchange Commission, and so on. If they nonetheless overruled them for political reasons, it was in the dead of night and with a guilty conscience.
And so, by the turn of the 21st century, a new elite consensus had emerged: democracy had to be managed. The will of the people had its place, but that place had to be defined, and not in an expansive fashion. After all, Bill Clinton and Tony Blair, the two most successful political leaders of the time, had proclaimed their allegiance to a “third way”, which proposed that the grand ideological disputes of the cold war had come to an end. If the clashes of abstractions – communism, socialism, capitalism and so on – were finished, all that remained were practical questions, which were less subjects of political choice and more objects of expert analysis. Indeed, at some tacit, unarticulated level, a dark question lurked in educated minds. If all the isms were wasms, if history was over, what good were politicians?