Are Intelligence Failures Inevitable?

William Thomas

Originally published in 1998/1999, Issue 1

“The possible failure of intelligence to assess a situation correctly is a danger coeval with intelligence itself.” (Abram Shulsky, Silent Warfare).

The “Great Game”, as Rudyard Kipling famously referred to the art of intelligence, is a “game” that comes with a large slice of chance, and just like any other game there is as much chance of losing as there is of winning.  As Shulsky observes above, with the pursuit of intelligence comes a cast-iron guarantee of intelligence failure.  This work will examine why this is so, drawing on the example of the Japanese attack on Pearl Harbor, a case that shows a range of failures at different levels of the intelligence process, as will be seen below.

Intelligence is a manifestly human endeavour, and consequently all intelligence activity is vulnerable to the vagaries of “the human factor”. This human factor shows itself at every level of the intelligence process: in the gathering of raw data, in how that gathering is organised, through analysis and interpretation, and onwards to the final use of intelligence by the “consumer”.  Michael Handel usefully identifies the key factors of failure as “human psychology and politics; wishful thinking; ethnocentric biases; perception and misperception of reality; conflicting interests; political competition over scarce resources; organizational biases.” However, it is possible to identify additional areas where failure can be unavoidable, the most notable being the “unknowable”: facts and circumstances that cannot possibly be available to the intelligence community of one power simply because the opposing power wishes that information to remain absolutely secret.  In attempting to maintain this secrecy the “defending” power will take significant steps, as will be seen later with regard to Pearl Harbor.  This would normally involve good “security”, but in the case of Pearl Harbor Japanese secrecy went further, limiting the risk of security failures by severely restricting both the dissemination of information and the form that dissemination took.  Deception and betrayal, the deliberate causing of failure on the part of another power, can also be considered, as in some cases (in the case of Pearl Harbor an unproven accusation levelled at Winston Churchill) such betrayal can make intelligence failure inevitable.

Another instance where the unknowable can cause intelligence failures is where circumstances change with extreme rapidity or where events take on a chaotic nature, for example with the unexpected and previously unlikely introduction of new actors.  Intelligence can, as Handel and numerous others identify, fail at “three distinct levels: acquisition (the collection of information); analysis (its evaluation); and acceptance (the readiness of politicians to make use of intelligence in the formulation of their policies)”.  Handel goes on (in his essay entitled “The Problem of Strategic Surprise”) to argue that instances of failure at the strategic level, especially with respect to military strategic surprise, are largely the result of errors and inadequacies at the levels of analysis and acceptance, and not of collection.  The present writer contests this assertion as limited in perspective: all three levels play significant roles in all types of intelligence failure, including some of the examples cited in Handel’s essay, the most notable being that of Pearl Harbor.

Of course, what is key at these three levels of failure is the underlying choices and perceptions of individual actors and groups of actors, these often being based on faulty views and preconceptions of the world.  The Japanese surprise attack on Pearl Harbor on 7th December 1941 demonstrates many aspects of why intelligence failures are inevitable.  In the decades since the attack a great deal of research and investigation has attempted to illuminate the causes of failure, and indeed some of the real causes may never be known in full.  However, enough has been learnt to provide vital lessons for the future.  Some commentators suggest that most intelligence failures result from the latter two levels of the intelligence process, that is, analysis and acceptance by the consumer.  This is not the case: at Pearl Harbor one can identify a distinct failure in the collection of intelligence as well as in the interpretation and use of data.  The failure of American intelligence at Pearl Harbor was inevitable for the following reasons, and is indicative of all intelligence failures.

The reasoning behind Japan’s attack on the United States fleet at Pearl Harbor was to strike a significant and powerful blow to the morale (and the will to fight) of the government and people of the United States.  Such a strike required complete strategic surprise, and this was achieved for a number of reasons.  Most importantly, all knowledge of plans for the attack was kept on a strict need-to-know basis.  As David Kahn notes, “Knowledge of it was limited in Tokyo to as tight a circle as possible.  Plans for it were distributed by hand to the ships of the task force.  No reference to a raid on Pearl Harbor ever went on the air, even coded.” Such tight communications security made cryptanalysis useless in foretelling the attack of December 7th, and demonstrates that secrets can be maintained and their integrity put to maximum effect in surprise attack.  This “unknowable” factor is most likely to result in surprise and the associated intelligence failure on the part of the surprised.

On the part of the United States it is possible to identify a significant lack of intelligence gathering.  Kahn states that “There was, in Wohlstetter’s terms, no signal to be detected”.  This relates to the discussion above of signals intelligence (SIGINT), but SIGINT is not the whole story.  When Wohlstetter speaks of “signals” in this sense she means “indicators”, and is referring to the lack of intelligence in general, human and electronic.  Had the U.S. had any significant intelligence assets focused on Japan and her naval activities, the likelihood of strategic surprise and intelligence failure would have been greatly diminished.  At best the failure here was a failure to organise thorough intelligence collection in the Pacific Ocean and islands, and this likely reflects a human decision over the allocation of resources.  At worst it indicates a lack of attention to the threat that the Japanese navy posed, a topic which will be examined more fully later.

Moving on, it is possible to identify how the failure at Pearl Harbor can be attributed partly to an error of analysis; that is, to a cognitive misinterpretation of “traffic analysis”.  Kahn notes that during the war scares of February and July 1941 traffic analysis identified a pattern in which elements of the Japanese navy pursued objectives outside of home waters while Japanese aircraft carriers remained in home waters to protect the main islands.

This pattern involved the usual communications activity between the active members of the fleet and zero communications from the carriers.  The error in these observations was the establishment of a pattern in the minds of the U.S. analysts and the application of this pattern in subsequent analyses, specifically that of the beginning of December 1941.  There was no substantive evidence for the establishment of a pattern in this case, as two instances of an event do not constitute a pattern as such, and consequently this cognitive failure added to the overall intelligence failure at Pearl Harbor.

However good the gathering and analysis of intelligence may be, the use of that intelligence by the consumer will ultimately determine the level of risk of intelligence failure.  It is accepted that it is with the end user of intelligence that the majority of failure takes place.  This is because it is here that there are the fewest restrictions and checks on an individual’s perceptions and preconceptions, whether they be based on ethnocentrism, racial prejudice or failures of cognition.  Within intelligence hierarchies there are periodic checks and balances on the inferences analysts draw from raw data.  These may not necessarily be institutionalised and formal, but they exist nevertheless.  At the higher levels these checks are weaker, and outside the hierarchy, within policy-making circles, they are non-existent.  The conditions that can lead to failure are therefore more likely to cause failure.  In terms of intelligence, ethnocentrism is expressed as “mirror-imaging”, that is, projecting one’s own cultural values, whether political, military or cognitive, onto the enemy.  This can lead to strategic and systemic failures of policy formulation and military planning, and so carries more significance as regards failure.  In the case of Pearl Harbor this happened in a number of circumstances and at different levels of the intelligence apparatus and government.
It was this factor above all that had the most significant effect on the failure that culminated in the surprise attack on Pearl Harbor.  In this case Kahn notes that a Western sense of rationalism was imposed on perceptions of Japanese intentions.  This instance of mirror-imaging resulted in a strategic analysis that Japan would consider it irrational to attack such a powerful adversary as the United States.

In addition, Kahn identifies a sense of racial superiority over the Japanese which impeded a more objective analysis of their capabilities.  What can be seen from this brief analysis, therefore, is that intelligence failures are inevitable.  The vital ingredient in intelligence gathering and analysis, the human intellect, is also the vital ingredient in intelligence failure, and accordingly one cannot have intelligence success without running the inherent risks of failure.