Intelligence Failure
The date was November 27, 1941. A special assessment committee from the United States State Department issued its predictions about Japan's involvement in World War II: "Were it a matter of placing bets, the undersigned would give odds of five to one that the United States and Japan will not be at war on or before December 15; would wager three to one that the United States and Japan will not be at war on or before the 15th of January; would wager even money that the United States and Japan will not be at war on or before March 1" (Wohlstetter, 1962). Ten days later, the Japanese attacked Pearl Harbor, sinking four battleships and killing over two thousand people (History, 2009). This single polarizing attack has driven the need for low-probability event planning: the discipline of planning for high-impact, low-chance events. Low probability, by definition, means anything possible but statistically unlikely. High-impact, low-probability planning rests on two main ideas. The first is thinking outside the box to come up with unlikely scenarios. The second is to start at the end of a scenario and work backwards to its beginning. These two fundamental aspects of analytical thinking could have been key to stopping Pearl Harbor. Pearl Harbor is the most referenced intelligence failure of modern times, and here is why it should never have happened.
The term "think outside the box" is thought to come from a 1970s critical thinking puzzle. The puzzle consists of nine dots and requires the player to draw lines outside the implied box to solve it (Martin, 2017). The phrase is now ingrained in the training of analysts across all intelligence agencies, and the mindset is imperative for an analyst to be effective. It is the job of the analyst reviewing material to connect dots they may or may not fully understand. Some say there was warning that Pearl Harbor was going to happen; either no one put the dots together or the dots were ignored. That very argument, ignorance versus dismissal, is debated to this day by experts. On the morning of December 7, local human intelligence in the harbor spotted a Japanese submarine. Those soldiers did their job by reporting the submarine to their chain of command, and the Japanese mini submarine was then attacked and sunk by a destroyer. "The ship's captain, W.P. Burford, ordered all engines ahead at flank speed and headed straight for the submarine to ram it, and at 8:43, the destroyer rammed and passed over the submarine while dropping two depth charges. A minute later, both charges exploded aft of Monaghan, and the submarine disappeared. Another of the kō-hyōteki had been destroyed." (NOAA, 2010) At the same time, United States intelligence had located the Japanese fleet, which was heading toward Pearl Harbor. The report of the enemy's activities was not delivered to General Marshall in time; the general was out on a long horseback ride. No one raised the general alarm, and no orders were given. These glaring errors in communication and deduction let the Japanese walk right into our Pacific fleet. The counterpoint is that no matter how much evidence there was, someone could still have dismissed it as circumstantial. Going forward, a computer network operator could use the think-outside-the-box mindset to break into the systems of enemy ships and send them off course.
Starting at the end and working backwards is a style of thinking as old as literature itself. When faced with a major threat, thinking backwards can lead to outcomes never conceived before. This approach is vital to the high-impact, low-probability model. Working backwards not only allows analysts to see things from the perspective of the enemy but often surfaces unforeseen vulnerabilities. The United States believed that Japan would not attack so early. Had an intelligence analyst run the scenario backwards from Japan winning the war, they might have seen our weakness in Hawaii. The counterpoint is that it would have taken a great deal of work and effort to make an argument convincing enough for the people in charge to act on it.
Going forward, a computer network operator could use this methodology to gain access to an end system and work backwards to find the source of the enemy's command.
The question going forward is: did the intelligence community learn anything from Pearl Harbor? Many intelligence agencies require new recruits to read about Pearl Harbor and other intelligence failures. It would appear, on the surface, that the knowledge gained from Pearl Harbor has not stopped our enemies from getting lucky. Human error will always be a factor and a problem. Case in point: September 11, 2001. According to released reports, we had eyes and ears on the people responsible for the September 11 attacks but were unable to stop them until it was too late (Kean, 2004). Many American lives were, once again, lost in a surprise attack that plunged the United States of America into a war. September 11 taught us that applying conventional tactics to unconventional enemies does not work. Once again the tactics of thinking outside the box and working backwards became relevant. The same principles apply, but now they have to apply to everyone, everywhere.
In conclusion, Pearl Harbor was in fact a preventable intelligence failure. The counterargument is: what if Pearl Harbor had never happened? Would the nation have failed to learn its lesson and allowed our enemies to strike a bigger blow? Japan could have mounted a more thoroughly planned attack that ended the United States' involvement in the war before it began. These questions make relevant points but cannot change history. The United States takes the lessons learned from its mistakes and applies them going forward. Pearl Harbor will hold a sad place in history and continue to teach valuable lessons to future intelligence analysts.