Probability estimates don’t always help decision making.
In spring 2011, Barack Obama had to decide whether to approve a US Navy SEAL raid on the compound in Abbottabad that was believed to be the location of Osama Bin Laden.
Here were the main reasons for suspicion:
- The compound was located near the Pakistan Military Academy
- It was larger than other homes in the area
- There was no telephone or internet service
- Residents burned their trash
- The perimeter had barbed wire and two security gates
- There was a seven-foot privacy wall on the third-story terrace
This constituted a rare opportunity to capture an important target and disrupt Al-Qaeda, but there was no certainty that “The Master” was in fact Bin Laden. According to John Kay and Mervyn King, a CIA team leader thought there was a 95% chance that Bin Laden was in the compound; others put the chance at 40% or 30%. Obama, to his credit, recognised that this was a situation of uncertainty. He had to make a decision without knowing whether Bin Laden was there. As he said,
“In this situation, what you started getting was probabilities that disguised uncertainty as opposed to actually providing you with more useful information.”
A scenario approach dispenses with the false precision of probability estimates and makes a judgement. Scenario planning would also focus on considering alternative outcomes, and on preparing for each of them, rather than falling in behind a single view of the situation.
What I find particularly interesting is that Obama’s decision came in the context of the US invasion of Iraq. As we know now, the US-led invasion of Iraq in 2003 was an error based on faulty intelligence and faulty judgement. But the focus on Saddam Hussein in the aftermath of 9/11 was somewhat reasonable (I recommend Season 5 of the podcast Slow Burn for more). It seems to me that if we praise Obama’s use of judgement over Bin Laden, we should also cut Bush some slack over Iraq.
Indeed, the relevance of this example is that it was precisely because of the intelligence failures surrounding the Iraq war that Obama received quantified estimates in the first place. According to John Kay and Mervyn King:
“US government agencies were required to implement a more structured process for providing advice to the President. Analysts were expected to quantify their confidence levels and express them as probabilities.”
Kay and King, 2020, p. 8
Perhaps US Defense Secretary Donald Rumsfeld had a point all along:
“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know.”
Donald Rumsfeld, February 12, 2002
All human action takes place under conditions of uncertainty. Grouping cases into classes of quantifiable risk is a triumph of human capability, but it cannot account for all of the issues we care about. We therefore need epistemic humility and a suite of tools.
Further reading:
Kay, J., and King, M., 2020, Radical Uncertainty, The Bridge Street Press
For my review see here: https://link.springer.com/article/10.1007/s11138-021-00562-9