21 December 2011

How Bad Decisions Happen

Popular Mechanics has a chilling report on the cause of the crash of Air France 447, which was the result of a sequence of bad decisions by the crew. The flight recorders recovered from the bottom of the ocean reveal confusion and poor judgment among the pilots. Despite being warned more than 75 times by the plane's automatic systems that it was in a stall, the pilots never acknowledged the warning.

The article introduces a distinction between "normal law," the set of restrictions a computer places on the plane to disallow any control input that would take it out of its flight envelope, and "alternate law." Flight 447 was operating not in "normal law" but in "alternate law" -- PM explains:
Again, the stall alarm begins to sound.

Still, the pilots continue to ignore it, and the reason may be that they believe it is impossible for them to stall the airplane. It's not an entirely unreasonable idea: The Airbus is a fly-by-wire plane; the control inputs are not fed directly to the control surfaces, but to a computer, which then in turn commands actuators that move the ailerons, rudder, elevator, and flaps. The vast majority of the time, the computer operates within what's known as normal law, which means that the computer will not enact any control movements that would cause the plane to leave its flight envelope. "You can't stall the airplane in normal law," says Godfrey Camilleri, a flight instructor who teaches Airbus 330 systems to US Airways pilots.

But once the computer lost its airspeed data, it disconnected the autopilot and switched from normal law to "alternate law," a regime with far fewer restrictions on what a pilot can do. "Once you're in alternate law, you can stall the airplane," Camilleri says.

It's quite possible that Bonin had never flown an airplane in alternate law, or understood its lack of restrictions. According to Camilleri, not one of US Airway's 17 Airbus 330s has ever been in alternate law. Therefore, Bonin may have assumed that the stall warning was spurious because he didn't realize that the plane could remove its own restrictions against stalling and, indeed, had done so.
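
To make the two regimes concrete, here is a minimal sketch, in Python, of how envelope protection might differ between them. The function, the limit, and the numbers are all invented for illustration; actual Airbus control laws are far more elaborate:

    # Toy model of "normal law" vs. "alternate law". Every name and
    # number here is hypothetical; real flight control laws are not
    # this simple.
    MAX_AOA_DEG = 15.0  # pretend stall limit (angle of attack, degrees)

    def commanded_pitch(pilot_input_deg, current_aoa_deg, law="normal"):
        """Return the pitch command the computer sends to the actuators."""
        if law == "normal":
            # Envelope protection: clamp any nose-up command that would
            # push the angle of attack past the stall limit.
            headroom = max(MAX_AOA_DEG - current_aoa_deg, 0.0)
            return min(pilot_input_deg, headroom)
        # Alternate law: the protection is gone; the pilot's input passes
        # through, so a sustained pull can stall the airplane.
        return pilot_input_deg

    # The same full-back stick produces very different commands:
    print(commanded_pitch(10.0, 12.0, law="normal"))     # 3.0 (clamped)
    print(commanded_pitch(10.0, 12.0, law="alternate"))  # 10.0 (unchecked)

The point of the sketch is only that the protection lives in the computer: once the computer reverts to alternate law, the clamp simply disappears.
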
The tragic sequence of decisions on board Air France 447 will no doubt help to make flying even safer than it is today, and it should also serve as a case study in decision making more generally. Many poor decisions can be attributed to the mistaken expectation that one is operating under conditions akin to "normal law" when the actual decision context is better characterized as "alternate law."

14 comments:

  1. All this to me seems to be a way to take attention from the fact that one of the co-pilots was holding the stick back the entire time--which seems a lot more basic than "never having flown in alternate law". Yes, there was some period of incomprehension as a result of the pitot and computer problem; however, the lead pilot understood the "fix" in sufficient time to save the plane. He didn't, because of the incomprehensible actions of the co-pilot.
    There are likely more lessons to learn here. Perhaps there were too many pilots. (And the lead pilot was out of the cockpit taking a nap how long after takeoff, while transitioning through the most potentially dangerous part of the flight?)

  2. This is the second instance of this happening that I have read about. An early-model Airbus doing a flyby of an old airfield for an airshow literally failed to pull up and plowed into some trees. That time it was due to how the fly-by-wire software works: it wouldn't let the pilots pull the nose up and climb out.

  3. Pat Moffitt said... 1

    seems a lot more basic than "never having flown in alternate law".

    But that's the problem with flying on auto-pilot 99.99% of the time. The 'automatic' reactions end up lacking finesse. Every takeoff, the pilot just pulls back as far as he can on the stick because he 'knows' the computer will compensate.

    Old timers who flew before 'fly by wire' pull back on the stick just far enough to maintain a maximum climb, and they know that if they pull too hard they will do a 'maximum descent'.

  4. The situation on Flight 447 was also complicated by the behavior of the stall warning. When the pilot pulled back and the aircraft's airspeed fell below a designated threshold, the stall warning stopped. When the pilot pushed the yoke forward (the correct procedure to recover from a stall), the stall warning started again. It stopped when the pilot pulled back again (a toy sketch of this inverted logic appears at the end of this comment).

    It was also dark, nighttime. Had the pilots been able to visually verify the attitude of the aircraft, they would have lowered the nose.

    The irony remains that had the pilots done the "JC maneuver"...take your hands off the controls and put it into the hands of a supernatural power, as test pilot Joe Walker put it...the aircraft would have levelled by itself.
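
    A minimal sketch of that inverted logic, assuming a hypothetical validity threshold (the names and numbers are invented, not actual Airbus values):

        # Toy sketch of the stall-warning cutout described above.
        # Both thresholds are invented for illustration.
        VALID_AIRSPEED_KTS = 60.0  # below this, sensor data is treated as invalid
        STALL_AOA_DEG = 10.0       # pretend stall angle of attack, degrees

        def stall_warning(airspeed_kts, aoa_deg):
            # When measured airspeed is implausibly low, the angle-of-attack
            # data is deemed unreliable and the warning is suppressed.
            if airspeed_kts < VALID_AIRSPEED_KTS:
                return False
            return aoa_deg > STALL_AOA_DEG

        # Deep stall, nose held up: airspeed so low the warning goes quiet.
        print(stall_warning(45.0, 40.0))  # False: silence, despite the stall
        # Nose pushed down (the correct recovery): airspeed rises.
        print(stall_warning(80.0, 35.0))  # True: the alarm sounds again

    On this logic the correct input revives the alarm and the incorrect input silences it, exactly the inversion described above.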

  5. "Every takeoff the pilot just pulls back as far as he can on the still because he 'knows' the computer will compensate."

    Are you saying this is a potential outcome of fly by wire, or that it actually happens?

  6. Assuming the pilot was competent to fly in both modes, this would suggest that the problem stemmed from an incompletely or insufficiently characterized environment.

    First, the computer relinquished control as it recognized its own marginal awareness. Second, the pilot did not assume control, owing to improper or possibly contradictory information, which led to the confused state of the operator.

    If that was the scenario, then the issue to consider is risk management. What process and steps should the operator have followed in order to mitigate perceived and, in this case, real risk when lacking full situational awareness?

  7. More discussions here:
    http://www.pprune.org/rumours-news/466259-af447-final-crew-conversation.html

  8. I'm not a commercial pilot - maybe there is one lurking who can correct me if I am wrong - but it is my understanding that this issue is largely an Airbus issue. There is a fundamental difference between Boeing and Airbus aircraft in terms of how much control of the aircraft the flight control computer is given - or shall I say how much control over the aircraft the PILOT is given.

    It is my understanding that in Boeing aircraft the flight control computer [I'm not talking about the auto-pilot here] gives warnings and advice [e.g., "Pull Up! Pull Up!"] to the pilots, but does not exercise control authority over the pilot's actions. [There may be the digital equivalent of the old 'stick pusher,' but I'm not sure.]

    Airbus, on the other hand, has [apparently] a completely different strategy, which is to have the flight control computer [again, as distinct from the autopilot] exercise control authority over the pilot at all times, unless a fault occurs that the system is able to recognize, in which case [apparently] it cedes authority back to the pilot - which the pilot then needs to recognize and act upon. [A toy contrast of the two philosophies is sketched at the end of this comment.]

    The Airbus system can produce good results in an emergency, as it did in the case of US Airways flight 1549, which ditched in the Hudson River in 2009 after losing both engines to a bird strike shortly after takeoff. Here the full-authority system of the A320 helped pilot Capt. Chesley Sullenberger by reducing his workload in steering the plane as he and his copilot worked emergency procedures; however, the critical factor in that flight's successful ditching with no loss of life rested in the hands of the pilot: talented, experienced, and with expertise in cockpit management in emergency situations.

    I don't know about you, but I would rather have a pilot like Sully Sullenberger in full control of the aircraft and fly an airline that can boast, "Best pilots in the business," rather than, "Best software in the world."
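
    A toy contrast of the two philosophies described above (every name and number is invented; neither manufacturer's actual system works like this):

        # Hypothetical sketch: an "advisory" style warns but obeys the
        # pilot; a "full authority" style clamps input to the envelope.
        LIMIT_DEG = 15.0  # pretend envelope limit, degrees

        def advisory_style(pilot_input_deg):
            if abs(pilot_input_deg) > LIMIT_DEG:
                print("WARNING: input exceeds the envelope")
            return pilot_input_deg  # passed through untouched

        def full_authority_style(pilot_input_deg):
            return max(-LIMIT_DEG, min(pilot_input_deg, LIMIT_DEG))

        print(advisory_style(20.0))        # warns, then returns 20.0
        print(full_authority_style(20.0))  # silently returns 15.0

    The first style warns and obeys; the second obeys its own limits.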

  9. -4- Pirate,

    "The situation on Flight 447 was also complicated by the actions of stall warning. When the pilot pulled back and the aircraft airspeed went below a designated threshold, stall warning stopped. When the pilot pushed the yoke forward (correct procedure to recover from stall), the stall warning started again. It stopped when the pilot pulled back again."

    Holy smokes. That's like the accident at Three Mile Island. The pressurizer was constantly filling up. But that didn't mean that there was too much reactor coolant (as one would logically think)...it meant there was too little reactor coolant.

    Meanwhile, I'm sure in both cases lights were flashing and warning buzzers were sounding. It's not amazing that accidents like Air France 447 and Three Mile Island happen; it's amazing they don't happen more often.

  10. Hi,

    One wonderful thing about GPS is that the nice GPS lady never gets flustered.

    I had one instance where she led me into what I knew was a dead-end road. I had plenty of time that night, so I just followed her directions to see what she was getting at. It turned out that there was another dead-end road about 100 feet beyond the end of the road I was on. I guess her software said those two streets were actually joined together. All I needed to do was drive through the side yard between two houses, and her advice would have been good. ;-)

    Anyway, the point is that the GPS doesn't tell you, "You're on the wrong road. You're on the wrong road. You're on the wrong road."

    The GPS instead helpfully and patiently tells you how to get to the *right* road (although sometimes the GPS doesn't know what the right road is). It seems to me that the Airbus software could be modified from "Stall! (Cricket.)" "Stall! (Cricket.)" "Stall!...."

    ...to..."Stall."..."I notice you have the stick pulled up. This is highly irregular, Dave. Nine out of ten pilots and their dentists would push the stick *down* to recover from a stall. Have you thought about doing that?"

    Mark

    P.S. My heart goes out to the poor people who suffered from and are suffering from this tragic accident.

  11. Mark,

    Nine out of ten SURVIVING pilots push forward on the stick to recover from a stall; pilots who pull back on the stick don't often come back! ;)

    I also have to respectfully disagree with you about the GPS lady never getting flustered.
    This last Thanksgiving eve I was driving with my dad to visit some relatives in Damariscotta for dinner: full dark, a bit foggy, a foot of snow on the ground. We were nearing our destination and looking for our final turn [not 'turn to final'] when the GPS lady insisted we "Turn right now!" We turned right onto a one-lane road and proceeded less than 100 yards when the GPS lady insisted, equally urgently, "Turn around now!" We managed to turn dad's A8 around without getting stuck in the ditch on either side of the road and went back out to the main road, where she insisted we "Turn right now!" and took us a quarter mile down the road before indicating our correct final right turn. We never did get her to admit her mistake; how very like a pilot in that regard.

    Optimum culturally correct Holiday to all, and to all a good night!

    W^3

  12. There is a joke that says, "In these new computer-controlled aircraft, the optimal flight crew is the captain and a dog. The job of the captain is to feed the dog, and the job of the dog is to bite the hand of the captain if he tries to touch the controls."

    From a human factors POV, it is very difficult to craft a situation where the human does very little and then is suddenly expected to reason out the correct decision when an anomalous emergency occurs.

    Instinct is honed, in part, by experience. And, experience is the one thing that cannot be taught. So, I fear, we are setting ourselves up for future failures of this nature.

  13. Apart from technical assessments, there is the problem of communication, social norms, and social status, and how they interact, sometimes with disastrous consequences, as Air Florida flight 90 famously exemplifies. A Boeing 737 crashed in January 1982 after attempting takeoff with iced wings. The first officer was not able to get the message across to the captain.

    Jean-François Bonnefon, Aidan Feeney, and Wim De Neys (2011), "The Risk of Polite Misunderstandings," Current Directions in Psychological Science 20: 321.

    The problem is that politeness is difficult to interpret. In risky situations this can be fatal:
    "Convergent experimental findings thus suggest that whenever
    a situation crosses the politeness threshold (i.e., whenever
    there is a substantial risk that people might get upset), confusion
    arises about the meaning of statements that would otherwise
    be clear. This confusion is likely to create problems and
    misunderstandings in high-stakes situations, which are precisely
    those most likely to cross the politeness threshold.
    Worse still, it appears that processing politeness taxes cognitive
    resources."

  14. "We were nearing our destination and looking for our final turn [not 'turn to final'] when the GPS lady insisted we, "Turn right now!" We turned right onto a one lane road, proceeded less than 100 yards when the GPS lady insisted, equally urgently, 'Turn around now!'"

    My GPS lady never speaks with exclamation points. Even when I deliberately don't follow any of her directions. She just patiently says, "Recalculating..."
