The turnaround in US military fortunes in Iraq in 2006 was attributed to the replacement of bad leaders (Donald Rumsfeld and General Casey) with good ones (Robert Gates and General Petraeus). But Rumsfeld had left an important legacy with his (inadvertent?) popularisation of the language of complexity. "There are", he explained at an early press briefing, "known knowns; things we know we know. There are known unknowns; things we know we don't know. But there are also unknown unknowns; things we don't know we don't know." Unfortunately, one of Rumsfeld's (many) mistakes was believing top-down leadership was the right answer in all these domains.
'Known knowns' describe obvious domains that are clear to everyone (if you go to war, people will die). 'Known unknowns' describe complicated domains understood by only a few, usually experts (how many might die). 'Unknown unknowns' describe what actually occurred in Iraq: chaos, a place of total randomness. Yet the man who finally brought effective leadership to a dire situation, General Petraeus, did so by harnessing complexity: the 'unknown knowns'. In complex domains we may not know the right answer (for it is constantly changing) or know anyone who does (answers are widely distributed, with many different people holding small pieces of the bigger puzzle), but we know answers exist, so we must adopt a different mindset to harness and exploit them.
The General, heralded Stateside as the hero of the counter-insurgency, sought answers "further down the ranks, and outside the armed forces entirely, searching for people who had already solved parts of the problem that the US forces were facing." He understood that, faced with a seemingly intractable problem (as the insurgency seemed in 2006), very different perspectives must be sought. Petraeus moved away from expert opinion, which anticipates what might happen, to favour instead listening to the experiences of those closest to the action. This increased awareness of what was happening (replacing uncertainty with certainty), which in turn helped reveal why it was happening and what an effective response might look like in context.
Experts can rapidly provide a degree of confidence for leaders faced with difficult challenges. By collecting masses of data and applying big brains to analyse and synthesise it neatly, the expert can reduce risk. But as data is always past data, their analysis is fraught with assumptions, extrapolated to determine an unknowable future that will, willingly or otherwise, confirm their hypothesis - on which their expert status depends. Experts impose what they know and, while their advice can be useful for 'complicated' questions, in 'complex' domains it can mislead - especially if its authority renders other voices mute. Hence, as Professor Armstrong of the Wharton School pointed out, when attempting to forecast the future 'expertise beyond a minimal level is of little value', so never 'hire the best expert you can — or even close to the best. Hire the cheapest.'
Title quote attributed to (then) Colonel H.R. McMaster, whose story, along with some of the other ideas in this blog entry, is taken from Tim Harford's excellent 'Adapt: Why Success Always Starts with Failure'.