Slow-Onset Hypoxia: An insidious
killer
Captain Shem Malmquist and Dr.
Paul Buza, Florida Institute of Technology
Have you
ever run a checklist, and then looked up a few minutes later and realized that
something that you thought you had done was not
done? Is there a professional pilot
with any experience who has not had
this experience? The point is that as
pilots we often read reports and dismiss them with the thought that “it won’t
happen to me.” If you could honestly say
that you have never missed a checklist item, then you can skip this article as
you are likely not a pilot!
Hypoxia. That’s one of those things that you have to
worry about when flying little airplanes. In
our modern transport aircraft, that’s not a real risk, is it?
We all learned
about hypoxia early in our flying careers.
Part of that discussion likely involved a chart that showed the time of
useful consciousness at various altitudes in the event you lost
pressurization. Like most pilots, I ran
through the scenarios in my head. I had
climbed mountains over 14,000 feet, and that had not seemed so bad, so how bad
could it be if we lost the cabin?
Time of Useful Consciousness Chart from Pilot's Handbook of
Aeronautical Knowledge
The truth is
that a sudden, rapid loss of pressurization is not likely to lead to a problem
of hypoxia. The reason is that the
crew will have no question about what is happening and the need to don
oxygen masks. While this is the type of
event that seems like the “worst case” scenario, like many risks, the one with
the most salience is not the one that is the most likely to kill you.
On August
14, 2005, the crew of a Helios Airways B-737 were preparing for departure[i]. Things moved quickly during the preflight and
departure stage, as we have all experienced.
Flows were run, checklists completed, and the flight departed.
A few
minutes after takeoff, the Captain called back to their dispatcher to discuss a
problem that appeared to be the air-ground shift. The crew had an intermittent warning horn, a
master caution and an equipment cooling alert.
The Captain inquired as to where the circuit breakers were for the
equipment cooling, not surprising as the aircraft maintenance history showed
that it had a history of equipment cooling problems.
Helios
B-737
Expectation bias is the tendency to
believe or see results that agree with our expectations, and to disbelieve,
discard, or downgrade data
that appear to conflict with those expectations[ii].
The crew continued to trouble-shoot the issue while they
climbed. Thirteen minutes after takeoff there
were no further communications from the flight crew. The actual problem turned out to be that the
pressurization was not properly set; it
was part of the checklist, but it was not accomplished. The Captain was known for being quiet and
meticulous about procedures. According to the report, the
first officer had a history of rushing through checklists. This is not uncommon, although what is even
more common (and likely something every professional pilot has done at one time
or another), is “looking” at an item without really “seeing” it.
Unlike a rapid decompression,
the cues for “slow onset” hypoxia are not as clear. Think of it as the “boiled frog” effect. While the cabin slowly ascends there are the
usual cues. Ears need to be cleared, but
for anyone who flies often, that is pretty much an unconscious act. Other cues are more subtle, such as a gradual
decrease in the ability to discern color and a slow degradation in the ability
to think clearly. The crew of the Helios
B-737 never sorted out what was happening.
The aircraft continued to fly on autopilot, finally running out of fuel
and crashing about three hours later, killing all 121 aboard. The flight crew never regained
consciousness. It appears that a flight
attendant either maintained or regained consciousness and attempted to regain
control of the aircraft just prior to the aircraft running out of fuel.
All humans are subject to
cognitive biases of various sorts. These
biases are heuristic “short cuts” that we use to make sense of the world around
us. They are the basis of “intuition”[iii],
which allow us to make more rapid decisions in familiar situations. The more careful thinking required to
logically analyze a situation or concept requires considerably more
energy. Humans do not like to do this
under the best of circumstances (preferring the much faster and easier
intuitive method). If our mental ability
is compromised in any way, such as being fatigued, ill, impaired by medicines
or alcohol, or even with low blood sugar, we will be even more inclined to
think the “easy” way. Hypoxia will have
similar effects.
The effects of hypoxia are
subtle, while the effects of biases are very strong. Pilots have often continued flying without
seeing or hearing warnings; accident,
ASRS and ASAP databases are full of these types of events. Given a combination of circumstances, it is
not hard to see how an event such as the one that befell the Helios crew could
happen to any of us. Missing a
checklist item, a warning or alert is common.
This has been a factor in so many different types of accidents, that it
cannot be overemphasized. Regarding this issue, the Helios accident report, in
discussing missed procedures or checklist items, states:
The Board was also sensitive to the fact that automatic
execution of actions was very much affected by assumptions – in the case of
performing a large number of verification steps, the assumption that all
switches and indications were in the usual, normal for this phase of flight
position. A superfluous green indication on the pressurization panel could be
easily (inadvertently) overlooked when perception was biased by the expectation
that it should not be present.
Exacerbating this tendency (expectation bias) is the rarity
with which switches (especially, and directly relevant to this case, the
pressurization mode selector) are in other-than-their-normal position. A pilot
automatically performing lengthy verification steps, such as those during
preflight, is vulnerable to inadvertently falsely verifying the position of a
switch to its expected, usual position (i.e. the pressurization mode selector
to the expected AUTO position) – especially when the mode selector is rarely
positioned to settings other than AUTO…
…because they are performed repeatedly on the line, they are
also performed by memory, typically in time-pressured circumstances (i.e.
indirect pressure to maintain on-time
departures). For these two reasons, checklists are often performed in a
hurried, automatic
fashion. From a human factors standpoint, rushing is known
to lead to the inadequate allocation of attention to the task at hand – and
thus to errors. Furthermore, like procedures, checklists are also vulnerable to
“looking without seeing” because they are biased by the assumption that since
each item verified an action performed only moments
ago, then it must be
already in the desired position/set (pp. 118–120).
These
lessons alone can save your life. The
only way to break this tendency is to make a conscious, direct effort to focus
on each item and ensure it is accomplished.
A failure to pressurize during the climb, while the crew is attending
to other issues, represents one of the most dangerous hypoxia situations.
Having now
established how easily a crew might find themselves in a scenario where the
aircraft is not properly pressurized, let us turn our attention back to the
effects of hypoxia. Unlike the rapid
decompression situation, the “slow boil” hypoxia is not obvious, and there is
nothing that “jumps out at you;” this is the situation that killed Payne
Stewart. Any sort of depressurization is
rare, and one that is very subtle is even more dangerous. The following chart outlines the effects:
Sequence
of Events - Slow Onset Hypoxia
Notice how
subtle the effects are. The items up
through 18,000 feet could easily be ignored, and the losses of cognitive
function from 18,000 through 24,000 feet are also things that we might not
notice or might attribute to other causes.
Tunnel vision, like loss of color vision, is much less obvious while a
person is experiencing it, particularly when focused on trouble-shooting a
task. Once past that phase, things
happen rapidly. Unfortunately, the
symptoms themselves impair our ability to recognize them!
Hypoxia can
be a factor in other scenarios. Cargo
aircraft that have been depressurized as part of a smoke procedure could put
crews at risk if they elect to go back and investigate without an activated
walk-around oxygen mask. The cannula-type
supplemental oxygen used in general aviation can put a person at risk at higher
altitudes as well due to the insidiousness of the slow degradation. A similar effect could occur due to a leaking
mask or a diluted oxygen flow.
Hypoxia is
one of those threats which most professional pilots seldom consider. With modern aircraft, the chances of a rapid
decompression are very low. Many pilots
do not worry about it and approach the entire issue casually. Slow-onset hypoxia has received little
attention over the years. The thought of
the pressurization not working on climb out is not one that strikes fear in our
hearts or grabs our imagination like the thought of a door blowing off an
aircraft leading to explosive decompression.
Despite that, it is the slow one which is more likely to kill you.
The
degradation of our cognitive functioning which occurs with hypoxia can lead to
a fixation that is so strong that the last thing you will ever know is “what a
great pilot you are”.
Mitigations
Awareness is
the key. Each person’s hypoxia
experience will be unique. Crew communication (i.e., good CRM) becomes
a vital component in identifying the problem prior to a loss of situational
awareness and subsequent incapacitation.
Furthermore, flight attendants should be trained to have a fundamental
understanding of the sequence of events once the “rubber jungle” drops and
should proactively participate in the CRM with the flight crew.
[i]
Hellenic Republic Ministry of Transport & Communications. (2006). Aircraft Accident Report: Helios Airways Flight HCY522.
[ii]
Dismukes, R., Berman, B., & Loukopoulos, L. (2007). The Limits of Expertise. Ashgate: Farnham, England.
[iii]
Kahneman, D. (2012). Thinking, Fast and Slow. Farrar, Straus and Giroux: New York.