It's no secret that intelligence services the world over don't always get things right, but what leads their highly trained analysts to the wrong conclusion about a situation? Are intelligence failures a consequence of poor data, inaccurate information or a bad source, or are they caused by analytical preconceptions, misinterpretations and flawed human cognition?
Writing in the journal Intelligence and National Security, James J. Wirtz of California's Naval Postgraduate School discusses the art of the 'intelligence autopsy': post-mortem investigations that examine the performance of intelligence gathering following intelligence failures and successes.
Wirtz focuses in particular on the contribution of the scholar Robert Jervis, who is best known for his scholarship on international relations, especially the way human cognition shapes foreign and defence policies. Jervis conducted his first 'intelligence autopsy' in the late 1970s, examining the failure of the Central Intelligence Agency's National Foreign Assessment Center to warn of the Shah of Iran's inability or unwillingness to respond forcefully to the Islamic revolution led by the Ayatollah Khomeini. His work was nothing short of ground-breaking, introducing the idea that the cognitive and methodological challenges faced by analysts can lead to intelligence failure.
In other words, Jervis discovered that what mattered wasn't just the evidence piled on the tables in front of the analysts; the views, biases and organisational flaws of the analysts could often have a huge bearing on their interpretations.
Examining Jervis' 'Iran' autopsy and a second intelligence post-mortem on the 2002 National Intelligence Estimate on Iraq's Weapons of Mass Destruction, Wirtz highlights the value of Jervis' logic and methodology.
As Wirtz writes of these post-mortems: "By focusing on analytical flaws and the very process of intelligence analysis itself, they stand in contrast to the emphasis on organizational shortcomings and reform that has emerged in the aftermath of the September 11, 2001 terror attacks."
Wirtz explains how 'prevailing beliefs' can act as a 'filter' for both supporting and contradicting information; how the principle of 'availability' leads even well-trained analysts to interpret information in light of what is on their minds at the time; and how analysts can be guilty of 'layering': uncritically piling new evidence on top of old. These 'cognitive biases', in addition to problems posed by sources and the intense pressure analysts are under, can all contribute to intelligence failures. The way to reduce failures, Jervis believed, based on his research into the effects of cognition on world politics, was to improve analysts' analytical skills rather than endlessly reorganise the bureaucracy.
This article is essential reading for those seeking to understand the successes, failures and limitations of their intelligence services, as well as those seeking to improve them. It also provides fascinating insights into two significant intelligence failures that still have consequences for analysts, and the world, to this day.