From the NTSB:
While on a night instrument approach to a non-towered airport, the aircraft collided with the airport’s perimeter fence and terrain. The fence and perimeter road were parallel to, and about 750 feet east of, the runway. The last radar plot was at an altitude of 1,200 feet MSL, slightly east of the runway, and approximately 435 feet from the accident site. Sheriff’s deputies reported that the weather was “foggy.” The pilot was flying an RNAV (GPS) approach. The published minimums for the approach are a 400-foot ceiling and one mile visibility. The automated weather station about 33 miles northwest of the accident site reported calm winds, temperature 41°F, dew point 41°F, visibility less than 1/4 mile, a Runway Visual Range (RVR) of 600 feet variable to 1,200 feet in fog, and an indefinite ceiling. An examination of the airplane failed to reveal any anomalies with the airframe, structure, or systems. Under the conditions at the time, the pilot appeared to have mistaken the east perimeter road for the runway landing point. The NTSB’s probable cause: The pilot’s decision to continue the approach below minimums without visual references, and subsequent collision with the perimeter fence/terrain.
We tend to see what we want to see. And we tend to believe what we want to believe, especially under stress (like on a night approach in low IMC when we’re on a schedule to get to our destination). Research has shown this to be a common trait across all cultures and societies. We suffer from a psychological predisposition called confirmation bias.
Science Daily calls confirmation bias “a phenomenon wherein decision-makers have been shown to actively seek out and assign more weight to evidence that confirms their hypothesis, and ignore or underweigh evidence that could disconfirm their hypothesis. As such, it can be thought of as a form of selection bias in collecting evidence.” If information or an observation does not match our mindset, we tend to discount, or even completely ignore, the contrary evidence.
Now, the pilot on this fateful night was not a newcomer. A 28,000-hour ATP/CFII, he was highly experienced in the type of airplane being flown and had logged over 150 hours in the previous six months. During the flight, the pilot received the weather conditions at his intended destination and several tower-controlled alternates from Air Traffic Control. He stated that he’d do the RNAV (GPS) runway 31 approach at his filed destination, and if he couldn’t make it in, then he’d go to an alternate about 50 miles away.
Nearing the airport, the pilot contacted Center and said: “I believe I’ve got (the airport) in sight right now, but I’m gonna go ahead and do the approach to make sure.” The controller replied, “All right, that’s good, yeah, (the alternate is) showing clear, let me look and see what that weather was at (the closest reporting station). Yeah, it’s almost an hour old, but it was showing just a hundred there with quarter-mile visibility and fog.” The pilot responded, “All right, I’ve got the prison (near the airport) in sight, I know that, and it’s right there by the airport, I can see the lights at the prison.” There were no further transmissions from the accident aircraft. As the NTSB report states, the radar track and the wreckage location make it appear the pilot mistook a road for the runway.
The airport in this event was equipped with MIRL (Medium Intensity Runway Lighting) but had no touchdown lights. According to the NTSB, AirNav.com (a commercial airport guide) listed the runway markings as “in poor condition.”
Another NTSB report:
During a localizer back course approach, the airplane collided with four electronic transmission cables located 75 feet AGL and approximately 7,000 feet short of the runway. The crew executed a missed approach and landed uneventfully at an alternate airport. The NTSB found that the crew did not adequately review the approach chart. The First Officer (pilot not flying) misidentified lights on the ground, which influenced the captain’s subsequent misidentification (of highway lights for the runway environment).
This was another case of confirmation bias, this time the captain’s willingness to believe that lights not aligned with the runway were the runway lights he was looking for. The point is that even high-time professional airline crews can fall victim to confirmation bias. That, in turn, “confirms” that single-pilot operators need to actively work to avoid it as well.
Beating confirmation bias
You can beat confirmation bias by employing a little healthy skepticism when briefing yourself for an approach (or a visual night landing). Check what pattern of approach and runway lights you’ll see when you first make visual contact, using the Airport/Facility Directory, airnav.com or similar sources, and the airport view on instrument approach charts (figure 1). Compare that pattern to published examples (figure 2), and finally review the runway stripes you expect to see (figure 3).
On final approach, watch for those specific patterns and make these callouts to yourself as you progress down final. Say them aloud; there is strong evidence that speaking aloud reinforces good operating technique:
“I have the approach lights in sight.”
“I have the runway edge lights in sight.”
“I have the runway markings in sight” (this may not occur until your landing light illuminates the pavement, or not at all if the runway is covered in snow).
“Runway in sight, landing.”
It’s not enough to see lights at the Missed Approach Point; you need to see the correct lights. If you don’t see the expected sequence of lights and markings on short final, miss the approach and climb out safely to try again or proceed to an alternate. If the runway lights or markings are minimal, don’t combine low-light and poor-visibility conditions—wait for the sun to come up or the weather to improve, or go land somewhere else. Don’t just think you see the runway. You’ve got to know you see it.