A warm late December evening in Miami awaited the 163 passengers traveling on Eastern Air Lines Flight 401 from New York to Florida. During the otherwise uneventful trip, marriage proposals were made, a young couple looked forward to introducing their newborn baby to awaiting grandparents, and college students headed back to school after the holidays. But within a few hours, more than 100 of the passengers and crewmembers would be dead. The tragically simple cause of their deaths will not only shock you; for the leader of an organization, it also offers a fair-warning lesson in the danger of over-focusing on one issue or problem at the expense of other risks.
On Dec. 29, 1972, Eastern Air Lines Flight 401 left New York’s JFK Airport and was approaching Miami International Airport at approximately midnight, when the nose landing gear indicator did not illuminate. The pilots had to identify whether the landing gear had indeed failed to extend, or more likely, if the indicator bulb in the cockpit had simply burned out. That bulb was a one-inch square light on the lower right side of the center instrument panel (replacement value of $12) that indicates that the nose gear is down and locked in position for landing. The light should have been on at this point but remained unlit. As a result, the pilots aborted the landing, and the first officer set the autopilot to keep the aircraft at 2,000 feet to allow the crew to sort out the problem.
Shortly afterwards, during a discussion regarding the landing gear, the flight data recorder detected slight pressure on the captain’s control column, coinciding with the captain asking the second officer to check the gear through a viewing bay window. It is likely that the captain unknowingly bumped the control column while turning to speak with the second officer, enough for the autopilot to disengage and change from altitude hold to manual wheel steering, which initiated a gradual descent of the airplane.
Difficulty changing a bulb
Meanwhile, the captain and first officer tried to replace the bulb and confirm that the original had indeed burned out. Cockpit voice recordings revealed that the crew was frustrated with the task, as the cover in the cockpit panel had jammed. As the aircraft descended 250 feet below the assigned altitude of 2,000 feet, an audio warning sounded from the second officer's speaker and was captured on the cockpit voice recorder, but the crew seemed unaware of it, and by that time the second officer was already in the viewing bay. There were at least four indications that the L-1011 was slowly descending toward the Everglades: the altimeter, the vertical speed indicator, the captain's autopilot display, and the audio warning. All of them went unnoticed by the crew.
Just a few minutes later, while the pilots were still working on the light bulb problem, the first officer noticed that the altimeter was indicating a dangerously low altitude, and the radar altimeter sounded an altitude warning. By the time the pilots realized their situation, however, it was too late; Flight 401 struck the Everglades in a left turn and began to disintegrate, the wreckage strewn across the swamp. It was the first accident involving a wide-bodied airliner and, at the time, the deadliest single-aircraft crash in the United States. Although approximately 75 passengers and crew miraculously survived, more than 100 passengers and crewmembers died.
Cognitive tunneling and crew resource management
Investigators were puzzled to discover that, apart from one burned-out bulb, there was nothing wrong with the L-1011. Tragically, the landing gear was found in the down-and-locked position. The primary cause of the accident was not the aircraft but the crew, the human factor, making Eastern Air Lines Flight 401 a landmark accident in more ways than one. Even while dealing with the landing gear indicator, the crew members still could have monitored their surroundings and the aircraft's altitude.
As long as stress levels are not too high, the average human has enough spare information-processing capacity to notice things unrelated to the current task, such as the audio altitude warning and instruments indicating a descent (Robson, 2008). When stress levels increase, however, cognitive tunneling can develop (Chou, Madhavan, & Funk, 1996): one particular task is given high priority at the expense of other tasks. This is especially dangerous when the task being focused on is actually less important than the tasks being neglected.
Initially, it may seem that the crew was presented with the simple task of changing a light bulb. However, as the cover had jammed, both the captain and first officer likely experienced cognitive tunneling as they tried to establish a way of replacing the bulb without breaking the cover. In this case, all of their attention was given to this one small problem, at the expense of flying the aircraft.
The crash led to the development of Crew Resource Management (CRM) in the airline industry, under which the captain is responsible for ensuring that the monitoring of all indicators and warning systems is delegated appropriately among the crew. CRM, a breakthrough in human factors rather than in technology, has led to major improvements in airline safety. The first crash of a wide-body airliner provided a strong catalyst for the development of CRM and has served as a powerful example of the benefits of CRM training. At the time of Flight 401, CRM was not yet a developed system, so the crew never had the opportunity to develop the effective teamwork skills of modern pilots.
A well-known example of CRM successfully deployed is US Airways Flight 1549: Capt. Chesley “Sully” Sullenberger’s successful landing of his disabled airplane in the Hudson River. While Sullenberger took immediate control of flying the airplane, First Officer Jeff Skiles focused his attention on the reference handbook containing instructions for emergency situations.
The implications of this story for company leaders are clear — cognitive tunneling can cause organizations and their employees to over-focus on one task or goal (e.g. revenue growth, acquisitions) while ignoring warning signs from other parts of the organization (e.g. safety concerns, regulatory issues), particularly when stress levels are high. Could your company over-focus on one corporate goal or objective (e.g. sales, growth) and fall victim to cognitive tunneling?
Perhaps CEOs and senior management need to ensure that their companies’ warning systems are being monitored appropriately and deploy a “personal crew” that can help in monitoring those systems — the company’s version of crew resource management, ensuring that someone is “always flying the plane.” But how do we implement and maintain an “organizational CRM”? Who are the company’s crew in this regard, and how are they best deployed to watch the various systems in your company that need attention?
Here are seven critical elements of an effective CRM program:
- High-level oversight — Does your company have an organizational structure, led by senior management, that oversees issues such as compliance, quality and safety, and that can respond quickly and appropriately to credible warnings?
- Training and education — How many of your training programs deal with issues such as safety and quality? How many deal with sales and growth? Is there a balance in your company between the messages you give employees about “grow, sell, grow” and the number of programs devoted to safety, quality and compliance?
- Auditing/monitoring and risk assessment — Risk assessment is the “radar screen” of threats to your company. How are risk assessment and, more specifically, enterprise risk management (ERM) implemented in your organization?
- Open lines of communication — Does your company have a hotline that operates 24/7/365 and is staffed by competent individuals who know how to properly receive and triage emergency reports or concerns? Yes, many of these reports will be false alarms and non-emergencies; one CFO told me that someone once ordered a pizza on his company’s hotline. But somewhere within these hotline reports may be the pearl of information that saves the company from disaster.
- Written policies and procedures — Does your company have written guidelines that make reporting concerns a job requirement and that guarantee non-retaliation for doing so?
- Investigating reports of concerns — Does your company thoroughly escalate and investigate reported concerns?
- Enforcing standards — Is your company prepared, culturally and otherwise, to both set and enforce standards related to issues such as safety, quality and compliance? I’ve often used a phrase with clients that I refer to as a “stop the presses” issue. Is your company prepared to “stop the presses” if a credible warning comes in?
Dave Yarin is a compliance and risk management consultant to senior management and directors of large and mid-size companies, and author of the soon to be published book “Fair Warning — The Information Within.” Yarin follows and researches news stories regarding ignored warnings that lead to bad business outcomes, along with the social psychology theories that explain why these warnings were ignored. He lives near Boston with his fiancée and two children. For more information, visit his website, follow him on Twitter, or subscribe to his FlipBoard magazine, Fair Warning.
1. National Transportation Safety Board. (1973). Aircraft accident report: Eastern Air Lines L-1011, N310EA, Miami, Florida, December 29, 1972. Washington, DC: Author.
2. Robson, D. (2008). Human being pilot. Cheltenham, Australia: Aviation Theory Limited.
3. Chou, C., Madhavan, D., & Funk, K. (1996). Studies of cockpit task management errors. International Journal of Aviation Psychology, 6(4), 307-320.
4. Kanki, B. G., & Palmer, M. T. (1993). Communication and crew resource management. In E. L. Wiener, B. G. Kanki, & R. L. Helmreich (Eds.), Cockpit resource management (pp. 99-136). San Diego, CA: Academic Press.