
DECISION MAKING: Key Decision Making Tragedies and What We Can Learn From Them

As we watch the horror of the oil explosion unfold in the Gulf of Mexico, with its untold effects on wildlife in the wetlands and along the Louisiana coastline, it feels as if we’ve been here before. Listening to one of the oil rig operators on 60 Minutes recently describe the errors in decision making reminded me of similar mistakes made in the Columbia and Challenger accidents and in the 1996 Mount Everest climbing tragedy. The BP operator reported that when one of the men carried a handful of rubber material to a superior, concerned that the rubber seal down in the well had been damaged, he was told, “No, that can’t be. We always get that kind of material coming up.” A similar story emerged with the Challenger, when the contractor’s engineers expressed concern that the O-rings wouldn’t seal correctly at 28 degrees. And with Columbia, engineers dismissed concerns about the size of the foam debris that struck the shuttle during launch, arguing that foam had broken away time and again before with no resulting problems. In the Mount Everest accident, the expedition leader dismissed his own earlier advice about the last acceptable time to push for the summit.

What is it about our decision making that becomes so horribly skewed in the wrong direction? Why do good leaders make bad decisions? And why do we keep repeating the same mistakes with such catastrophic results? Interestingly, experts have identified specific characteristics of human thinking that bias our minds in predictable directions. These “cognitive biases” prevent us from seeing situations as they really are; our minds make systematic errors in the way we process information. All the risk management strategies in the world won’t prevent problems if the information going in is distorted. The old saying “garbage in, garbage out” is more than apt here. Let’s take a look at a few of the dangers in our thinking.

First and foremost is the effect of what is called sunk cost: the more money we’ve already spent on something, the more difficult it is for us to change direction. BP, for example, had already sunk millions of dollars into the rig’s drilling operation. Changing direction at that point, in other words halting work because the rubber seal might be broken, was out of the question. The project was already behind schedule, so concerns had to be minimized to avoid further delays. With the Challenger space shuttle, NASA had already been criticized for delays in the program and felt significant media pressure to get the launch off on schedule.

Another bias that distorts our thinking is the overconfidence bias. When we’ve had success in the past, it’s difficult to believe that things could turn out differently. The weather on previous Everest climbs had been excellent for some time, so the climbers were not anticipating poor conditions when they reached the summit. Shuttles had shed foam and suffered tile damage numerous times before without negative consequences, so Columbia’s engineers never seriously explored the possibility of a breach in the wing. The BP Deepwater Horizon rig had been the most productive in the fleet, with a stellar safety record, up until April 20th.

Interestingly, another problem in decision making comes from what is known as complex interactions with tight coupling. Catastrophes more often than not stem from a domino chain of bad decisions rather than a single wrong choice. The Everest expedition (five climbers, including the leader, Rob Hall, died trying to descend from the summit) had been plagued by delays in a complex system of getting climbers, supplies, and guides to the right place at the right time. When delays occurred, there was no slack in the schedule to compensate, and everyone became rushed and stressed. The same was true for the Challenger, whose launch had been delayed so many times that it became an embarrassment for NASA. Once the countdown clock had started, there was no time to truly explore the new information coming in. When project timelines are so tight that no flexibility exists for a change in direction, we need to recognize the impact of the schedule itself on decision making.

Interestingly, President John Kennedy was one of the first to recognize the detrimental effects of flawed group decision making, after the Bay of Pigs failure. When he and his advisors reflected on their mistakes, they created a new process for group discussion and decision making to prevent future errors and to promote diverse perspectives. Among the dangers Kennedy came to recognize, later described in Irving Janis’s work on Groupthink, were the pressures for group conformity: self-censorship (devaluing one’s own ideas), false consensus or an illusion of unanimity, peer pressure, mindguards (members who shield the group from ideas that contradict its assumptions), and the drive to preserve an atmosphere of agreement at all costs. Kennedy implemented several methods to stimulate more open debate: mental simulation techniques, a point-counterpoint dynamic, and periodically removing himself from the room to encourage franker discussion.

More recently, organizations have begun to recognize the need to focus on the possibility of failure, to proactively look for problems rather than ignore red flags. In addition, there is a trend among leaders to move from making decisions themselves to focusing on how decisions are made throughout their organizations. Smart leaders know how to ask the right questions, especially when a threat is ambiguous and people are likely to minimize the risks, as in the Columbia accident.

The real tragedy in the BP situation, as in the Challenger, Columbia, Everest, DaimlerChrysler, Three Mile Island, the Buffalo plane crash, decision making at General Motors, and even the catastrophe of September 11, is that we are not learning fast enough from our previous mistakes. I feel so strongly about this that I plan to write a larger, more detailed article on these flaws in decision making in the near future, to raise our awareness of the tricks our minds can play and the critical strategies we must use to prevent another tragedy.

View a video on the cost of poor quality: [video embedded in the original post]

  • Great blog. A must read.

  • This commentary by Deborah hit home. I haven’t thought about Groupthink in a long time, and I witness this happening every day. People with brilliant ideas take back seats so as not to upset the apple cart. Frequently, this behavior stems from the perception that an undesirable outcome will result if they speak up, e.g., someone gets angry or people get impatient if they have to take time for discussion rather than moving through an agenda on time. I notice that when I stop, think, and have the courage to speak, I feel complete in that I’ve done my job in speaking my truth. Even if my thoughts aren’t acted upon, at least they’ve been heard. We never know how or what we’ll affect.
