The hours ticked down to launch time, and Roger Boisjoly had the weight of the world on his shoulders.
It was January of 1986, and the highly respected Morton Thiokol rocket engineer and thermodynamicist was pleading with his supervisors and NASA to postpone the launch of the Space Shuttle Challenger on that unusually cold Florida day. Icicles hung from the shuttle and the launch platform, and NASA had never before launched in such cold weather. In fact, it was 15 degrees colder that day than on any prior shuttle launch.
Boisjoly presented hard data and evidence to support his point: that the O-rings on the shuttle’s solid rocket boosters would fail, leading to an explosion and the loss of the Challenger and the astronauts on board. Less than a year earlier, Boisjoly had written a memo describing clear evidence of O-ring damage following a prior launch, and he was concerned that the extreme cold on the morning of January 28, 1986, would lead to disaster. Morton Thiokol engineers had also expressed concerns about the O-rings in a memo written seven years before the Challenger launch.
How did the normalization of deviance work its way into NASA’s decision-making? NASA and Morton Thiokol convened a teleconference in the hours leading up to the launch, and NASA pressed Morton Thiokol for data proving Boisjoly’s concern. This was an odd shift for NASA personnel, who had historically required their engineers and contractors to show evidence that launching was safe; now they reversed course and asked Morton Thiokol to prove that the launch wasn’t safe.
On the teleconference, Boisjoly described data from previous launches that showed O-ring damage, and at one point a pause in the discussion gave Roger hope that his concerns were being heard. Suddenly, however, Thiokol senior management asked the engineers to leave the room, and Boisjoly’s heart sank. He knew this meant that senior management wanted to go over the engineers’ heads and recommend launching to NASA.
Not only was Boisjoly dismissed from the room, but his warnings were dismissed as well. Roger sat in his office and waited while the countdown commenced. He believed that the O-ring failure would cause an immediate explosion when the rocket engines ignited on the launch pad. He was temporarily relieved, then, when the Challenger lifted off without incident. But 73 seconds later, as Challenger went to “throttle up,” the shuttle exploded, and he was inconsolable. Employees came to talk to Roger, but he found himself so stunned that he was unable to speak. The months and years that followed brought depression and an inability to work.
There were so many questions he wanted answered about the explosion of the shuttle and the loss of the seven astronauts on board. The most important: “Why didn’t they listen?”
What Roger didn’t realize at the time was how much he was up against. He had more than just his Morton Thiokol superiors and NASA supervisors to convince; he was also fighting a battle against human nature. Roger had all the right data and all the correct technical explanations; heck, he had identified the very problem that would cause the shuttle to explode. But on that cold January day in 1986, an understanding of social psychology might have been his best weapon.
Let’s start with the social psychology phenomenon known as the “normalization of deviance.” In layperson’s terms, it describes a situation in which an unacceptable practice has gone on for so long without a serious problem or disaster that the deviant practice actually becomes the accepted way of doing things. As far back as 1979 (two years before the first shuttle launch and seven years before the Challenger exploded), engineers warned of concerns with the O-rings.
The Rogers Commission that investigated the Challenger explosion highlighted the history of concerns with the O-rings going back to 1979, and included a copy of a 1979 Morton Thiokol memo in which an engineer wrote that the O-ring design was best suited to unmanned rockets, because he was concerned about the rings’ failure. O-ring erosion and burn-through had been documented on several past flights. But because no disaster had occurred before the Challenger launch, NASA came to accept the O-ring damage as normal.
The same social psychology phenomenon would rear its ugly head at NASA 17 years later. When a large piece of foam insulation struck the shuttle Columbia just after its 2003 launch, several NASA engineers expressed concern that a hole could have been opened in the shuttle’s wing. NASA management dismissed the concern, noting that insulation had fallen off on multiple prior launches without harming the shuttle. One NASA engineer pleaded with his superiors to obtain images of the orbiting shuttle, as he was concerned that the foam strike at launch had caused serious damage to the wing. His warnings were ignored, no photographic or thermal imaging of Columbia was performed during the flight, and the ship disintegrated upon re-entry.
Impact on your company
Business leaders should take note of the lessons of the two shuttle disasters. The normalization of deviance is one of the most dangerous aspects of human nature when it comes to preventing disasters.
If an unexpected and undesirable event is taking place in your organization, investigate and understand it thoroughly.
The absence of a disaster doesn’t mean that one won’t occur. Perhaps you’ve merely “beaten the odds” until now, but the statistics will catch up with you eventually, and the result could be tragic. If you find yourself or an employee explaining away known risks by saying, “We’ve done it this way before without problems,” the organization may be succumbing to the normalization of deviance.
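To see how quickly “beating the odds” runs out, consider a back-of-the-envelope calculation. In his appendix to the Rogers Commission report, Richard Feynman noted that working engineers put the probability of a shuttle failure at roughly 1 in 100 per flight, while management cited figures closer to 1 in 100,000. The short Python sketch below uses the 1-in-100 figure purely as an illustrative assumption, not a measured rate, to show how even a small per-event risk compounds over repeated events:

```python
# Probability of at least one failure in n independent trials,
# each with per-trial failure probability p:
#   P(at least one failure) = 1 - (1 - p) ** n

def cumulative_risk(p: float, n: int) -> float:
    """Chance of at least one failure in n trials at per-trial risk p."""
    return 1 - (1 - p) ** n

# Illustrative assumption: a 1-in-100 per-launch failure probability
# (the rough engineering estimate Feynman reported, not a measured rate).
p = 0.01

for n in (10, 25, 50, 100):
    print(f"{n:3d} launches -> {cumulative_risk(p, n):.1%} chance of at least one failure")
```

At a 1 percent per-launch risk, 25 launches already carry roughly a 22 percent chance of at least one failure. A streak of successes, in other words, says very little about the safety of the next attempt.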
Dave Yarin is a compliance and risk management consultant to senior management and directors of large and mid-size companies, and author of the soon-to-be-published book “Fair Warning — The Information Within.” Yarin follows and researches news stories regarding ignored warnings that lead to bad business outcomes, along with the social psychology theories that explain why these warnings were ignored. He lives near Boston with his fiancée and two children. For more information, visit his website, follow him on Twitter, or subscribe to his FlipBoard magazine, Fair Warning.