Twin Proficiency: Pilot Belief System

Confirmation bias can lead pilots to act on their belief of what is happening, despite evidence that the opposite is occurring

Say you are solo in a very familiar aircraft, approaching a busy but familiar airport. The tower controller clears you to land on Runway 20L at Orange County/John Wayne Airport (KSNA), the shorter and narrower of the two parallel runways, and you correctly read back the landing clearance. As you near the airport, however, for whatever reason you align yourself with the equal-length but even narrower Taxiway C, even though a Boeing 737 with six crew and 110 passengers aboard is stopped on Taxiway L, directly blocking what you think is your assigned runway; the flight is holding short of 20L for your landing. You see the 737 right in front of you; you even ask the tower if it’s supposed to be there. Still, focused on your landing, you overfly the jetliner, reportedly 125 feet above Taxiway L according to radar data. Despite the anomalies, you continue and make what you think is a normal landing, on Taxiway C.

Or perhaps you load a flight plan into your FMS and type the wrong identifier. After takeoff, you don’t detect your mistake until you have flown far enough that you’re committed to landing in the wrong country, on the wrong continent.

Or you’re used to departing a familiar airport westbound, but unusual weather conditions require you to take off to the east in IMC. Cleared for a SID, after departure you turn 90 degrees to the left, a turn that would be correct the way you usually depart but that this time aims you directly at a cloud-obscured mountain ridge.

Or this scenario: Cleared for the visual in night VMC, you see the runway ahead and slightly to the left. Although you have the approach programmed into your navigation system and it shows you slightly left of course, you continue to correct further to the left to line up with the runway. Touching down, you see the runway end lights racing toward you and apply maximum braking, coming to a stop with almost no runway remaining. Only then do you realize you’ve landed at the wrong airport, on a runway barely long enough for a short-field landing in your airplane.

In each of these real-world events, no one was hurt and there was no damage. No “accident” occurred. But clearly the pilots were not fully in command of each flight. How can this happen? How can we avoid similar situations?

Confirmation Bias

Figure 1: Landing on Runway 18, I have the HSI course needle set to the runway heading. My downwind course will be the reciprocal, 360. My current heading of about 045 degrees is correct for a 45-degree entry to the downwind.

Confirmation bias is defined as “the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories.” Our brains naturally tend to discard data that does not fit our preconceived notions, and grasp at little things that may support what we think we are seeing. Writing in Psychology Today, Professor Shahram Heshmat, Ph.D., notes:

“Confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea/concept to be true, they end up believing it to be true. They are motivated by wishful thinking. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views (prejudices) one would like to be true. Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it. Confirmation bias suggests that we don’t perceive circumstances objectively. We pick out those bits of data that make us feel good because they confirm our prejudices. Thus, we may become prisoners of our assumptions… In sum, people are prone to believe what they want to believe. Seeking to confirm our beliefs comes naturally, while it feels strong and counterintuitive to look for evidence that contradicts our beliefs. This explains why opinions survive and spread. Disconfirming instances are far more powerful in establishing truth. Disconfirmation would require looking for evidence to disprove it… Look for instances to prove that you are wrong.”

(Excerpted from “What Is Confirmation Bias?” Psychology Today, April 23, 2015)

Confirmation bias may be a significant issue in aircraft accident causation, and a major threat to the successful outcome of a flight. Seeing what we think we’ll see or hearing what we think we’ll hear, then doing what we think we should do, are common hazards in aviation. One of the advantages of a two-pilot crew, with one pilot flying (“PF”) and the other pilot monitoring (“PM”), is a double-check of actions, indications, and operations to overcome the effects of confirmation bias. Most of us don’t have this additional layer of vigilance watching what we do (or do not do) in the cockpit. We need to be our own quality control, the PM watching what we do as PF.

Getting It Right

As difficult as that may sound, there are some simple yet effective techniques we can employ to combat confirmation bias:

Develop and use standard operating procedures. SOPs include items like consistent power settings, flap settings, airspeeds and (as appropriate) landing gear position; they include how you set up and use navigation and communications radios, GPSs, autopilots and other avionics; and how and when you use checklists. SOPs may at first seem to contribute to the likelihood of confirmation bias, because by their nature they provide expectations. The confirmation bias-bashing benefit of SOPs, however, is that by using them many decisions are already made for you; all you need to do is confirm they are having the desired effect. Without having to “make it up as you go,” you are free to focus your surplus mental bandwidth on detecting and overcoming anomalies that, under greater workload, you might not notice.

Use checklists and cockpit flows. Although these are a category of SOPs, it’s worth extra emphasis: use printed checklists and cockpit flow checks to ensure you have not missed anything. Ironically, the busier you are, the more important it is that you make time for a printed checklist. When you’re busy you’ll be more likely to forget something, and at the same time more susceptible to confirmation bias.

Talk aloud. I talk to myself when I fly. Maybe it’s a byproduct of flight instructing for so long, but I am constantly voicing things like “one-thousand-seven-hundred for three-thousand, one-thousand-three-hundred to go” and “gear down, three green, no red [in-transit light].” I’ve found that by speaking aloud I increase my ability to “pilot monitor” my pilot-flying actions.

Confirm navigation. I usually keep my GPS moving map display on either the 35- or 50-mile scale for a useful level of detail en route. I use closer scales (usually 20 miles) near airports. But when I load my flight plan, I’ll zoom out to whatever scale it takes to see the entire programmed flight. Does it make sense? Is it taking me where I plan to go? Or did I program in an anomaly, either a typographical error on the GPS entry system or a planning error that made it into my filed route (one I’ll need to fix before takeoff)? Scaling out to check, and then zooming in to the departure or en route view, would have prevented the AirAsia 223 “landing on the wrong continent” event mentioned above.
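
To make that “does it make sense” check concrete, here is a minimal sketch in Python of the same sanity test: compare the total length of the programmed route against the direct great-circle distance from departure to destination. The waypoints and the 1.5x threshold are hypothetical, purely for illustration; this is not any avionics vendor’s logic.

```python
import math

def gc_distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine formula)."""
    r_nm = 3440.065  # Earth's mean radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_nm * math.asin(math.sqrt(a))

def sanity_check_route(waypoints, max_ratio=1.5):
    """Flag a route whose total length far exceeds the direct distance."""
    total = sum(gc_distance_nm(a[1], a[2], b[1], b[2])
                for a, b in zip(waypoints, waypoints[1:]))
    direct = gc_distance_nm(waypoints[0][1], waypoints[0][2],
                            waypoints[-1][1], waypoints[-1][2])
    if total > max_ratio * direct:
        print(f"CHECK ROUTE: {total:.0f} NM programmed vs {direct:.0f} NM direct")
    else:
        print(f"Route looks reasonable: {total:.0f} NM ({direct:.0f} NM direct)")

# Hypothetical route: (identifier, latitude, longitude)
plan = [("KICT", 37.65, -97.43), ("ICT", 37.75, -97.58), ("KDEN", 39.86, -104.67)]
sanity_check_route(plan)
```

A one-character typo in an identifier tends to show up as an absurd total distance or a leg pointing the wrong way, which is exactly what the zoomed-out map view reveals at a glance.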

Crosscheck altitudes. Look at the arrival airport information and state aloud your planned traffic pattern or approach intercept altitude. Make sure it makes sense and that you’ve not fallen into a confirmation bias trap. For example, I recently instructed the low-time pilot of a Florida-based A36 Bonanza at my home field in Wichita. Air work complete, descending into KICT while coupled to the autopilot, the pilot spun 1,500 feet into the altitude preselect; the airplane would descend to that height and then level off without additional pilot input. As we continued descending through about 3,000 feet I pointed at the numbers on the preselect panel and asked, “What is the significance of 1,500 feet?” The pilot responded, “I always descend to 1,500 feet before final descent into the pattern.” I answered, “That might work in Florida, but the field elevation at Wichita is 1,320 feet.”

The pilot was certain the altitude preselect was set correctly based on experience, but did not actively work to ensure that setting was appropriate to the current environment.
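
The arithmetic behind that trap is worth making explicit. Here is a minimal sketch, assuming a typical 1,000-foot AGL traffic pattern height (actual pattern altitudes vary by airport; the function name and thresholds are mine, for illustration only):

```python
def check_preselect(preselect_msl_ft, field_elev_ft, pattern_agl_ft=1000):
    """Compare an altitude preselect setting to the destination's pattern altitude."""
    pattern_msl = field_elev_ft + pattern_agl_ft
    agl_at_preselect = preselect_msl_ft - field_elev_ft
    print(f"Pattern altitude: {pattern_msl} ft MSL; "
          f"preselect levels off at {agl_at_preselect} ft AGL")
    if agl_at_preselect < pattern_agl_ft:
        print("CHECK: preselect is below a normal pattern altitude for this field")

# The Florida habit applied at Wichita (field elevation 1,320 feet):
check_preselect(preselect_msl_ft=1500, field_elev_ft=1320)
# -> Pattern altitude: 2320 ft MSL; preselect levels off at 180 ft AGL
```

The same 1,500 feet that gives a comfortable pattern height at a sea-level Florida field leaves only 180 feet of air over Wichita.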

Monitor groundspeed and fuel burn. As we’ve discussed many times, fuel exhaustion and starvation events are far too common. Often a pilot plans enough fuel for the flight and assumes it will be sufficient, but then doesn’t lean the mixture quite as expected, or flies at a different altitude than planned (higher means more high-power climb time; lower, in turboprops and jets, means vastly increased fuel flow).

Or the tailwind is less than expected or the headwind more than expected, and the flight takes longer than anticipated. I once listened to a Beechcraft pilot detail his pending lawsuit against the FAA, based on his having received an incorrect winds aloft forecast by telephone. He claimed his groundspeed flying down Florida’s east coast was lower than forecast, so the fuel he planned for the trip was insufficient to keep him from ditching in the Atlantic, when of course he should have simply landed for more fuel along the way. He expected he had the fuel to make it to his destination, when monitoring his groundspeed and its impact on fuel burn would have told him otherwise.
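
The crosscheck that would have saved him is simple division, done continuously. A hedged sketch with hypothetical numbers (the 45-minute reserve is an illustrative figure; use what your rules and personal minimums require):

```python
def fuel_check(fuel_remaining_gal, burn_gph, distance_nm, groundspeed_kt,
               reserve_hr=0.75):
    """Compare endurance against time to destination plus a fuel reserve."""
    endurance_hr = fuel_remaining_gal / burn_gph
    time_to_dest_hr = distance_nm / groundspeed_kt
    margin_hr = endurance_hr - time_to_dest_hr - reserve_hr
    print(f"Endurance {endurance_hr:.1f} h, time to destination "
          f"{time_to_dest_hr:.1f} h, margin beyond reserve {margin_hr:+.1f} h")
    if margin_hr < 0:
        print("LAND FOR FUEL: you cannot reach the destination with reserves intact")

# Hypothetical: flight planned at 140 knots groundspeed, but an unforecast
# headwind has the GPS showing only 110 knots over the ground
fuel_check(fuel_remaining_gal=38, burn_gph=14, distance_nm=330, groundspeed_kt=110)
```

Run with these numbers, the check shows about 2.7 hours of endurance against 3.0 hours to fly, a negative margin long before the tanks run dry; the time to discover that is abeam a fuel stop, not over the water.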

Confirmation bias may have been a contributing factor that resulted in one pilot landing on a taxiway instead of the intended runway at a busy airport in Santa Ana, California.

Orient yourself to runways. Most of the airplanes I fly have a Horizontal Situation Indicator (HSI), which is a combination heading indicator and GPS/VOR/ILS navigation display. These come in extremely handy for IFR work, but they also help me orient myself to runways, whether taxiing or coming in to land. If I’m not actively using it for anything else, I’ll dial the HSI course needle so that the arrow points to the runway heading. This makes it very easy to determine whether I’m on downwind (the tail of the needle is up, on my heading), on a 45-degree entry to the downwind (see Figure 1), or on a base or crosswind leg (my heading is perpendicular to the yellow course needle). If my direction of flight does not make sense compared to the course needle when I have the runway in sight, maybe I’m lining up on the wrong runway.

Similarly, I align the course needle with the planned runway before I taxi for departure. I can tell at a glance whether the runway I’m taxiing parallel to is the one I think it is.
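
For those who like to see the arithmetic the course needle does for you, here is a minimal sketch (runway numbers are magnetic heading rounded to the nearest ten degrees; the 45-degree entry shown matches the Figure 1 example, and the mirror-image entry from the other side would be downwind minus 45):

```python
def pattern_headings(runway_number):
    """Derive pattern-leg headings from a runway number (e.g., 18 -> 180 degrees)."""
    final = (runway_number * 10) % 360 or 360
    downwind = (final + 180) % 360 or 360
    entry_45 = (downwind + 45) % 360  # converging 45-degree entry from one side
    return final, downwind, entry_45

final, downwind, entry = pattern_headings(18)
print(f"Runway 18: final {final:03d}, downwind {downwind:03d}, 45-entry {entry:03d}")
# -> Runway 18: final 180, downwind 360, 45-entry 045
```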

Of course, there are several tricks for using geo-referenced charts and taxi diagrams that are even more precise. But just because you have something on a moving map doesn’t mean you can’t back it up in your primary scan as well.

Brief the arrival. Even if you’re on a VFR arrival, look at the airport diagram and orient yourself to the runway you’ll use. If you don’t use IFR charts, or there’s not a diagram available for your departure or destination airport, check it out on Google Earth or similar resources. Set up for the instrument approach even when you’re making a visual approach so you’ll have a navigational crosscheck. Assume nothing; resolve any discrepancies before you get too close to the ground, and don’t rely solely on memory.

Monitor the Runway. On final approach, or lining up for takeoff, make a special effort to confirm the runway in front of you is the one you think it is. Look at the runway numbers. Confirm they match your clearance or, at nontowered airports, your intentions. Crosscheck your heading as you align with the runway. This would have saved the 49 who died, and the First Officer who survived with severe injuries, when Comair Flight 5191 attempted takeoff from the wrong runway at Lexington, Kentucky. Look for obstacles in the air around the runway and on the surface itself, including airplanes or vehicles on taxiways near runway entry points.
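
That heading crosscheck boils down to one comparison: the runway number times ten should agree with your magnetic heading within a small tolerance. A minimal sketch, with an illustrative (not regulatory) tolerance:

```python
def runway_heading_check(runway_number, magnetic_heading, tolerance_deg=10):
    """Verify the aircraft heading agrees with the runway number in the clearance."""
    runway_hdg = (runway_number * 10) % 360
    # Shortest angular difference, handling the 360/000 wraparound
    diff = abs((magnetic_heading - runway_hdg + 180) % 360 - 180)
    if diff > tolerance_deg:
        print(f"STOP: heading {magnetic_heading:03d} does not match Runway "
              f"{runway_number} ({runway_hdg:03d}); off by {diff:.0f} degrees")
    else:
        print(f"Heading {magnetic_heading:03d} agrees with Runway {runway_number}")

# The Comair 5191 trap: cleared for Runway 22, but aligned with Runway 26
runway_heading_check(runway_number=22, magnetic_heading=264)
```

A 40-degree disagreement between the compass and the clearance is glaring if you look for it, and invisible if you have already decided you are on the right runway.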

If anything doesn’t check out, if anything just doesn’t make sense or for any reason you find yourself thinking, “I have a bad feeling about this,” get out of the airport environment (abort takeoff or go around as appropriate), then figure out what’s really going on.

Question Everything

In flight by reference to instruments we teach the process of scan, interpret and aircraft control. This process describes flying in visual conditions just as well: whether our eyes and mind are inside the airplane or outside, we are constantly taking in (mostly) visual information, deciding what, if anything, to do with what we see, and controlling the airplane based on those decisions. Anything that interferes with this process affects the results. Confirmation bias can cripple the first two stages of this three-stage exercise.

How do you mitigate the risks of confirmation bias?

Gather information continually.

Support or refute your expectations and decisions with real data from skeptical observations.

Actively monitor the airplane, the environment and yourself, using an active scan, interpret, aircraft control technique.

If anything doesn’t look right or doesn’t make sense, get away from the threat (example: go around) and reset.

Question everything.

Employ healthy skepticism, knowing that even the best of us are susceptible to confirmation bias. As Professor Heshmat writes, “look for instances to prove that you are wrong.” 
