“The problem with pilots today is that they’re not doing enough hand flying!” How often has this been said in recent years? It is a refrain that has become the “hypothesis du jour” for many in the aviation industry trying to explain the incidents and accidents that have occurred. The entire industry, from regulators to the various alphabet organizations, seems to have jumped onto this bandwagon: advocate more hand flying, and safety will be improved. Closely following this, and a bit curiously, is the notion that all pilots need to do is “pay attention to detail” more — that if an accident or incident is not a result of “stick and rudder” skills, it is a consequence of just not paying attention. Has anyone stopped to think about these claims?
Stick and Rudder
I have a bias. As can be seen in the photo I took over Hong Kong (above), I love hand flying. I have a love/hate relationship with autopilots and autothrottles: they can be great, but they are not nearly as much fun for me. If conditions allow, I will hand-fly the airplane up to FL250 (25,000 feet), and I also turn off the automation fairly early during the arrival. I do it because flying the airplane is fun! There is no question that hand flying is good, and the more you do it, the better you will get. However, how does this relate to the recent air carrier accidents?
A review of all of the accidents involving jet transports in the last five years fails to find any that would clearly have been averted if the pilots had done more hand-flying. Where a hand-flying aspect was present, it was not present in isolation. There are several accidents in which hand-flying has been raised as a factor, but even those discussions miss the point. The accidents most recently cited in the news to support the call for “more hand-flying” are Air France (AF) 447 and Asiana 214 in San Francisco. However, a review of AF 447 and the preliminary data on Asiana, with hindsight bias removed, makes the “easy fix” of more hand-flying much less clear.
Contrary to popular opinion, and based on the data that has been provided, these accidents do not appear to stem from a lack of hand-flying skills per se. In other words, the accidents could have happened to pilots who handle the aircraft exceptionally well. What I see in these accidents is not a lack of flying skills, but rather a lack of understanding of what the automation is providing. In other words, the pilots’ expectations of the automation differ from what the automation will actually do. This aspect is not trained as thoroughly. While pilots are trained in the basics of the automation, there are enough gaps in the training that the “What’s it doing now?” comment (typically stated in a humorous way, but meant seriously) is very common.
If a pilot does not know why the automation is doing something, there is (of course) the possibility that the computer is doing something it should not be doing. In reality, that is a very low probability; the rate of actual problems with the hardware and/or software is very, very low. It is much more probable that the pilot does not understand the automation. Therefore, a statement (or even a thought) along the lines of “What’s it doing now?” should serve as a red flag that the pilot does not fully understand the automated system. Perhaps the pilot is operating at too high a level of automation for the situation. Regardless, if the disconnect continues for more than a moment or two, the pilot should seriously consider moving to a less automated mode until things are sorted out.
At all times, the pilot should be certain of what to expect next. Unfortunately, absent significant system knowledge, most of the systems currently installed in aircraft do not do a very good job of telegraphing to the flight crew what they plan to do after the current action. Sometimes the factor is small, such as the way the aircraft adjusts to a different-than-expected wind model when it transitions from one phase of flight to another. Other times it can be a major factor, leading at the very least to a violation of a regulation or, at worst, to an accident when a crew expected the aircraft to capture a certain vertical path or airspeed.
Putting the automation aside, let’s take a look at the hand-flying aspects. In all the advocating of hand-flying, there has not been much effort to ensure that all pilots have reviewed or received recurrent training on fundamental aerodynamics. You may find that surprising. You might ask, “Don’t pilots get trained in that early on?” Well, the answer to that is a definite “maybe.” While most pilots get it at some point early in their careers, some have learned only enough to pass the written test, their formal training being generally fairly superficial. Sure, they know how the aircraft responds to their inputs and have a rudimentary concept of “why,” but that leaves out what would happen at the rarer “corner points” that are so often the scenario where accidents occur. Regardless of pilots’ backgrounds, a comprehensive review of aerodynamics should be included in training programs. We do not expect hand-flying skills or flight procedures to stay sharp absent regular training, so why is a review of aerodynamics left out of the equation?
While a comprehensive baseline understanding of aerodynamics matters only rarely, when it does matter, it can matter very much! In addition, if we expect pilots to use their hand-flying skills to get out of sticky situations, perhaps they should be allowed to practice those skills in situations where they might need them. AF 447’s crew found themselves hand-flying the aircraft in an altitude regime where full-time use of the autopilot is required, in a degraded flight control mode rarely practiced or demonstrated at any altitude.
The combination of the altitude and flight control regimes was something they would never have seen in any training, even assuming the simulator could adequately replicate the conditions (a topic unto itself!). Is it reasonable to assume that they would have the skills to hand-fly the airplane with those issues while flying at night through the upper levels of turbulent thunderstorms? It is easy to judge them in hindsight, but the scenario was far different from the comfort of judging their actions afterward from an office chair.
Attention to Detail
Thinking more about the combination of advocating hand-flying and “attention to detail,” one has to ask, “How can you pay attention to detail if you are hand-flying?” The industry originally moved toward more automation to allow for better decision making with fewer pilots in the cockpit. The concept was that we could operate very large aircraft (which used to require crews of three or more pilots) with just two pilots, because the automation would free the pilots from all that hand-flying so they could concentrate on decision making. In that process, there is no question that the pendulum swung far toward mandating high levels of automation use, and that needed to be pulled back. Now, however, it appears we are trying to go the opposite way, while admonishing pilots: “OK, hand-fly, but you’re still just as responsible to monitor!”
More to the point, though, we cannot simply command somebody to pay attention more. In the Just Culture algorithm, we divide actions that led to an adverse outcome into three categories: error, at-risk, and reckless. We define an error as an unintentional act and, by definition, if it leads to a bad outcome, it is a “system problem.” By “system,” we mean that we need to redesign the policies, procedures, equipment, displays, etc. to fix it. An at-risk action results from a person intentionally not following a policy or procedure, but doing so to work around something in order to get the job done. This is a combination of an individual and a system issue, but still largely a system problem, because the system was not designed correctly if someone has to “bend the rules” to get the job done. Reckless is a disregard of the rules for no justifiable reason.
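The Just Culture triage described above is, at heart, a small decision procedure. As a minimal sketch (the category names come from the text; the function names, inputs, and "fix" strings are my own illustrative assumptions, not part of any official Just Culture tooling):

```python
from enum import Enum


class Category(Enum):
    ERROR = "error"        # unintentional act
    AT_RISK = "at-risk"    # intentional deviation, done to get the job done
    RECKLESS = "reckless"  # intentional deviation, no justifiable reason


def classify(intentional: bool, justified_workaround: bool) -> Category:
    """Rough triage per the text: unintentional acts are errors;
    intentional deviations made to facilitate the job are at-risk;
    intentional deviations with no justification are reckless."""
    if not intentional:
        return Category.ERROR
    if justified_workaround:
        return Category.AT_RISK
    return Category.RECKLESS


def remedy(cat: Category) -> str:
    # Per the text, errors (and largely at-risk acts) are system problems:
    # the fix is redesign, not telling the individual to "pay attention".
    if cat in (Category.ERROR, Category.AT_RISK):
        return "redesign policies, procedures, equipment, displays"
    return "individual accountability"
```

The point of the sketch is the asymmetry it encodes: a lapse in “attention to detail” lands in the `ERROR` branch, so the remedy is system redesign, not admonishment.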
Coming back to this issue of “attention to detail,” where would it fall in the Just Culture matrix? It would be hard to argue that a lapse in attention to detail constitutes an intentional act to disregard procedures. Perhaps in some cases (but not many) it could fall into the at-risk area, but most of the time these lapses would clearly be errors. The fix for errors is not just to tell someone, “Don’t do that!” I am reminded of the Bob Newhart video (linked here) in which he plays a psychologist who tells his patient to “just stop it” as she describes her problems. If we are to correct these issues, it will be through system and procedural design, not by telling someone to “stop it.”
The “system” should be designed, by people with expertise in the field, around the known limits of human cognition and the inevitability of human error. Too many times we see people put in positions and given titles with no real expertise. A person who combines formal academic training in human factors with operational experience would be ideal, as that person could integrate the academic knowledge with the “real world” to create systems that truly capture error. As an alternative, two people, one with academic knowledge and one with extensive operational experience, could be paired to attack these types of problems.
Relying on people who do not possess the knowledge and experience is how we created these flawed systems in the first place. We, as an industry, can do better.