Eddie sez:


Photo: XB-70 in flight, USAF photo

I feel fortunate to have started my aviation career in the United States Air Force back in the days when pilots were expendable and aircraft misbehavior was more or less tolerated. That may seem an odd statement, and it is. But having survived it, I think it gives me some perspective on how to spot organizations that have a similar mindset about pilot expendability and aircraft problems.

Rest assured the Air Force no longer has this mindset; these days even a single crash is cause to ground fleets and take a close look at everything from the bottom up. So let's see if we can examine the leadership of a flight organization to spot these attitudes before the crash.

I call these "smoke signals" because of a common warning heard during my year of pilot training. You don't want to become a . . .

"Smoking hole in the desert"


Photo: Crash of XB-70

In 1979 it wasn't uncommon for the Air Force to lose an airplane or two every month. That was the year I started, and we were averaging five airplane losses a year in our primary jet trainer, the Cessna T-37, and seven a year in the advanced trainer, the Northrop T-38. Even the operational Air Force was accustomed to these kinds of losses. The primary fighter of the era, the F-4 Phantom II, averaged two losses a year. Even the heavy aircraft world was not immune; the KC-135A tanker and the C-141 cargo transport each tended to lose an airplane every two years. As we used to say back then, "you have to expect a few losses in a big operation."

The most common refrain in class for us was, "You have to know this cold, or you will become a smoking hole in the desert." We were at the former Williams Air Force Base, near Phoenix, Arizona. Most of our flying was over the desert. To get an idea of the mindset of the Air Force back then, a good case study would be the crash of the North American XB-70 Valkyrie on June 8, 1966.

The airplane was designed to meet the requirements of a 1955 proposal for a bomber that would be fielded in 1963. The Soviets had become nuclear capable and we wanted a long range bomber that could hit them in their territory before they had a chance to hit us. The biggest threat to our bombers at the time was their fighters, so this airplane was supposed to fly very high and very fast. Nuclear bombs of the day were very large, so the airplane had to have a large payload. The XB-70 won the contract with a promised top speed of Mach 3+ and a cruise altitude of 70,000 feet.

The airplane was both ahead of and behind the times. It took six engines burning exotic fuels to give it the required speed. While the engines burned twice as much gas as a conventional bomber's, the airplane flew at four times the speed. Its fuselage was designed to funnel the supersonic shock wave under the wings to provide compression lift, further improving its speed and fuel numbers. But by the time it started test flights, the mission had already changed. In 1960, the Soviets shot down a U-2 spy plane at around 70,000 feet, demonstrating the ability of missiles to down aircraft at very high altitudes. The Air Force changed tactics to fly very low, beneath radar coverage, to penetrate enemy airspace. But once a weapon system procurement has begun, the Department of Defense is rarely willing to cancel it.

Secretary of Defense Robert McNamara, however, over the objections of the Air Force, was able to do just that, killing the program in 1962. Two XB-70s had been built and were relegated to conducting advanced studies of aerodynamics and propulsion. On June 8, 1966, someone at General Electric thought it would be a great idea to photograph the XB-70 in formation with an F-4 Phantom, an F-5, a T-38, and an F-104 Starfighter; all five were powered by GE engines, after all. After the photoshoot, the F-104 drifted too close to the XB-70, was pulled in and over, and severed the XB-70's tail. The F-104 and XB-70 crashed, killing the F-104 pilot and the XB-70 copilot. The XB-70 pilot was able to eject.

The photo of the XB-70's "smoking hole in the desert" haunts many of us from that era of the Air Force. How is it we can lose sight of the mission and, later, of our safety procedures? It seems the original mission morphs, we forget the real mission, and then the new mission blinds us to our safety procedures. It is a problem that confronts every flying organization.

Adapting the history into lessons we can use

I think we can take the lessons from the smoking hole in the desert and expand the ideas to look for signals of an organization in danger of permitting similar crashes. Retired astronaut Jim Wetherbee wrote an excellent book on the subject of risk management and says, "every potential accident gives signals before it becomes an accident." He has a list of five common conditions that existed in various organizations before they experienced major disasters or minor accidents:

[Wetherbee, pp. 12-17]

  1. Emphasized organizational results rather than the quality of individual activities.
  2. Stopped searching for vulnerabilities — Didn't think a disaster would occur.
  3. Didn't create or use an effective assurance process.
  4. Allowed violations of rules, policies, and procedures.
  5. Some leaders and operators were not sufficiently competent.

These are the conditions Wetherbee calls "Technical / Systems / Managerial." He has another five he says are on the "Social / Human / Leadership" side. But that is a study for another day.

Emphasized organizational results rather than the quality of individual activities

[Wetherbee, p. 12] Individual people in an organization don't create results; they conduct activities. . . . Results are important, but the quality of activities creates the quality of results.

A personal story

On September 4, 1980, I flew from Honolulu, Hawaii, to Andersen Air Force Base, Guam. It was a flight of 3,294 nautical miles and it took us 8.4 hours. I was looking forward to meeting a friend of mine stationed at Grand Forks Air Force Base who was due to fly in the same day I was. Keith was flying a B-52 and I thought there might be a chance my airplane would refuel his. But history had another scenario in mind.

In our tanker, day one of the trip was to be from Loring Air Force Base, Maine, to March Air Force Base, California. After a night's rest, we flew on to Honolulu, Hawaii, and got another night off. The third day of the trip was to Guam. I knew Keith was scheduled to arrive in Guam about the same time. I didn't know he was doing the trip nonstop: 5,852 nautical miles, around 20 hours, with three air refuelings and a practice low level bomb run along the way. I also didn't know that the day before he left, President Carter had told the press that the B-52 could reach any target in the world in 24 hours. The Air Force decided to prove that with Keith's trip to Guam.

The B-52 had a crew of six back then: two pilots, two navigators, an electronic warfare officer, and a gunner. Keith's base decided they should have an extra pilot and an extra navigator for the 24-hour mission. On the morning of the mission one of the pilots called in sick. The base decided they could go with just two pilots. Of course they did.

My crew showed up at Andersen Air Force Base on schedule. It was a beautiful but humid day. The rest of my crew promptly went to bed while I checked in with the command post to find out about Keith's bomber. I was told they were a little late but en route; they would be landing the next morning. The next day I heard they had landed but that the crew was restricted to quarters, pending an investigation and possible punitive actions. For the next week there was no news at all. My crew was sent to Diego Garcia and I forgot about it for a while.

When I came back to Guam I was surprised to see a note on my door from Keith, inviting me to dinner at the Officer's Club. That night he let me know what had happened.

"We checked in with command post about two hours out and gave them our ETA, which was to be right at 21 hours. They asked if we had enough gas to fly three more hours and we made the mistake of telling them yes. They told us to find a holding pattern and that under no circumstances were we to land with less than 24 hours of flight time."

Keith further explained that they decided to pull the throttles back, fly their maximum loitering speed, and let the autopilot handle the flying chores. "I guess we all fell asleep," he said. "All seven of us." The basic crew and the extra navigator, all asleep. "The base scrambled two fighters and they found us a couple of hundred miles south of Guam, headed for the South Pole. None of us heard them on the radio. The command post told the fighters to get in front of us so their jet exhaust would shake our airplane, and it did. That's what woke us up."

By the time they got back on the ground they had their 24-hour sortie, and the base was contemplating throwing the book at the crew for falling asleep while flying, which was not allowed. In the end saner heads prevailed and the Strategic Air Command decided to look the other way. Keith went on to a full career in the Air Force and is now a college professor at an Ivy League school. The Strategic Air Command, which started out in 1946 as an incompetent part of the U.S. Army Air Forces, was finally disbanded in 1992. If you think I am being unkind about SAC, please refer to any of the books written by Marshal L. Michel about Operation Linebacker II.

A case study

From the original G159 through the wildly successful G550, Gulfstream had redefined the high end of business travel. The G650 pushed the envelope further with a wider cabin, longer range, and higher speeds. But the company promised takeoff and landing field lengths more akin to those of its smaller aircraft, and it tasked its test pilots with validating those numbers, not with determining what the numbers should be.

Of course they had computer models to go on and were confident the promised results could be achieved. This was a mistake. A crew of four gave their lives trying to achieve those promised results on April 2, 2011. Gulfstream raised its numbers and the airplane has gone on to become the company's most successful type ever. I believe the process has been fixed; at least I hope it has. They now emphasize the process (activity) of testing their aircraft, not the achievement of marketing goals (results).


Photo: G650 N652GD, 3 Apr 2011, from NTSB.

More about this: Case Study Gulfstream G650 N652GD.

A lesson

Having goals is great, and a good way to measure success is with results. But leaders who emphasize results over the quality of the activities needed to achieve them risk encouraging their people to lose sight of what is important. That is especially true when the cost of the desired results is too high.

Stopped searching for vulnerabilities — Didn't think a disaster would occur

[Wetherbee, p. 12] Managers usually thought their teams were performing well before the disaster occurred.

A personal story

In 2002 I fell for one of the oldest tricks in the book used against flight examiners. It goes like this: please pass this PIC upgrade candidate; we know he or she has weaknesses, but we will only pair him or her with the strongest copilots; he or she only needs a little seasoning. I did this once in the Air Force and regretted it. I did it again while flying the Challenger 604 and it didn't work out any better.

The Compaq Computer flight department was collapsing in on itself. We had racked up several years of high-tempo operations flying all over the world without so much as a scratch on any of our airplanes or people. That record was all the more admirable considering the long distances we flew in an airplane poorly suited to the task. But once it became known the flight department would be disbanded, we started losing experienced pilots and started hiring anyone with a pulse. One such pulsing pilot I'll call Peter.

Peter was a good guy and a fair stick-and-rudder pilot, but he was a lousy decision maker. I passed him on his SIC qualification because every other pilot was of the caliber to help with Peter's seasoning. A year later, after losing four experienced pilots and hiring four new ones, there was a push to make Peter a PIC. I resisted, but when you run out of bodies, what are you going to do? He would be a domestic-only PIC, so what could go wrong?

In July of that year Peter and a contract pilot flew from Houston, Texas, to Bedford, Massachusetts, landing around 9 pm. My crew took the airplane over for the rest of the trip to Athens, Greece. The ramp was exceptionally dark, and the only thing unusual about the crew swap was that their flight attendant, let's call her Patricia, had to be helped off the airplane. I asked Peter what happened to her and he said nothing had; she was just doing her usual drama queen routine. When we landed in Ireland for fuel it was still dark. After the passengers woke from their in-flight naps they asked how Patricia was. They told us that it had been quite turbulent descending into Boston and that Patricia had been thrown about the cabin like a ping pong ball. Once we landed in Athens we learned the rest of the story: the nose of the aircraft showed evidence of hail damage.

Of course Peter denied flying through a thunderstorm. I caught up with the contract pilot, who admitted that they had flown through one, but he was just a contract pilot and "what was I supposed to do?" Nobody looks good in any of this, myself included. I should have shown more character and refused to upgrade Peter when I did.

A case study

On February 1, 2003, the Space Shuttle Columbia broke apart during atmospheric reentry, killing all seven crewmembers. A piece of foam insulation broke off the external fuel tank during launch and struck the left wing. The damage was enough to breach the integrity of the thermal protection system, and hot atmospheric gases entered the wing during reentry. The damage destroyed the internal wing structure, causing the spacecraft to become unstable and break apart.


Photo: Breakup of the Space Shuttle Columbia, NASA photo

The accident was more tragic than the sequence of events alone suggests, because this kind of damage had been noticed several times before, with results ranging from minor to nearly catastrophic. The accident investigation focused on the foam and on the organizational culture at NASA that caused the warning signs to be ignored. But the culture at NASA goes deeper still. I think if you look at the three major accidents in NASA's history, you will see this culture repeated.

The time leading up to the January 27, 1967 Apollo 1 test was one of urgency to meet President Kennedy's deadline to place a man on the moon before the decade was out. The Mercury and Gemini programs had gone very well and they were ahead of the timeline. NASA believed shortcuts in the capsule's cabin environment (100% oxygen) and materials (non-flammability was not required) were justified because the mission was very important and because they had taken adequate precautions. Nothing could go wrong.

The January 28, 1986 launch of the Space Shuttle Challenger was to be the program's 25th orbital flight. NASA's stated objective for the mission was to make shuttle flights operational and routine. They had gradually lowered the minimum acceptable launch temperature, overriding the objections of the engineers responsible for the O-rings used to join segments of the solid rocket boosters. On this particular launch, the O-rings became brittle and failed, and the shuttle broke apart 73 seconds after liftoff. Nothing could go wrong.

In all three accidents there were engineers and managers who knew something was wrong, but there were higher level managers who refused to believe it.

More about this: The Normalization of Deviance.

A lesson

In both examples, mine with Compaq and NASA's with the space shuttle, organizations get comfortable and stop thinking about what can go wrong. If you spot an organization with this level of complacency, watch out.

Didn't create or use an effective assurance process

[Wetherbee, p. 13] Prior to accidents, managers in organizations did not understand how to create an effective process of assurance, nor did they understand the value of assurance. . . . In an operational organization, providing assurance means a person is giving confidence about future performance to another person, or group, based on observations or assessments of past and current performance.

A personal story

Our EC-135J (Boeing 707) squadron in Hawaii was charged with supporting the U.S. Navy and its submarine fleet in the Pacific. In 1984 I was sent to safety school for three months while the Navy brought their submarines back from their former "westpac" orbits off the coasts of Korea and the USSR to "eastpac" orbits off the coast of California. Our mission changed from Korea, Japan, and the Philippines to California. The squadron set up a staging operation at March Air Force Base, Riverside, California. Pretty straightforward.

It would have been straightforward except the squadron had a change in leadership about a year prior, and the new squadron commander set about replacing every squadron leader who wasn't spring-loaded to the "yes sir" position. Non-sycophants were shown the door, which might explain how I ended up out of the squadron and assigned to the wing as the base's chief of flight safety, though still flying with the squadron on an attached basis. The new commander kept the new mission to himself and a chosen few. It was "need to know," so line pilots were denied a look at it until they actually flew it. Two months after returning from safety school and four months after the change in mission, I found myself at March Air Force Base in an airplane too heavy to safely take off with an engine failed at V1.

"What do you mean you can't go?" the commander asked over the phone. "My staff has gone through this backwards and forwards. You either fly it or you can consider your flying days with my squadron over."

As it turns out, the new staff didn't have much experience with obstacle performance after an engine failure and did not factor in the mountains just north of the airport at higher temperatures. The previous aircraft commanders had made sure they had the performance for their particular departure days and didn't mention the plan was flawed, since the emperor didn't like bad news. I had the first departure on a hot day since the plan had changed. The squadron commander had signed off on the plan for year-round operations; now he had to go back to the Navy and say he couldn't do it. He was obviously furious. But he was lucky that all he had to contend with was a little embarrassment, and not notifying the next of kin that one of his crews had splattered themselves into the California mountains.

A case study

During the night of May 31, 2014, the pilots of Gulfstream IV N121JM failed to rotate and ended up in a fireball off the end of Runway 11 at Hanscom Field, Bedford, Massachusetts (KBED). Tower reported that the nose never lifted off and that braking didn't start until very late in the takeoff roll. Based on that information alone, I knew what had happened but not why. The what was disturbing enough. The why made me angrier than I had been in a very long time.


Photo: N121JM Wreckage, aerial photograph, from NTSB Accident Docket, figure 6.

The NTSB called the actions of the two pilots "habitual intentional noncompliance." Many of us speculated that they trained with "Brand X," but that wasn't true; they trained with the same simulator company that we use. We also hypothesized that these pilots had never heard of a Safety Management System (SMS). Again, not true. They had been awarded their Stage II SMS rating. If only they had had a Flight Operations Quality Assurance (FOQA) system. But they did.

As the details of the crash finally came out, it came to light that these pilots put on an act during training, just to pass the checkride. They flew by their own rules, not using checklists, callouts, or common sense. They had SMS certification, but it came from a "pay your fee, get your certificate" operation. As for FOQA, it appears it was installed but never used. These were two pilots who were comfortable operating their very expensive jet the way you would a beat-up pickup truck. Now they are dead. The tragedy is that they took innocent lives with them.

More about this: Case Study Gulfstream GIV N121JM.

A lesson

Many small organizations, like my Hawaii squadron, are parts of larger organizations that set up, and even require, robust safety assurance systems. Others, like N121JM's flight department, enroll in assurance systems that are purchased. While most SMS auditors do a good job and try to get it right, some are out there simply to sell you certificates and are unwilling to criticize the people signing their paychecks. Any organization willing to pencil-whip these assurance programs is courting the next accident.

Allowed violations of rules, policies, and procedures

[Wetherbee, p. 14] After accidents, investigators usually determined that some organizational rules, policies, and procedures were violated before the accident. Often the workforce reported unofficially that some managers were cognizant of these violations in operations before the accidents.

A personal story

When I showed up as a copilot in our Hawaii Boeing 707 squadron, the biggest challenge was going to be learning how to air refuel as a receiver. Unlike the tanker, which normally flew a stable platform on autopilot, the receiver had to do this by flying formation with old-fashioned stick and rudder. Just as I was getting the hang of it, one of the pilots talked a tanker into allowing him to fly fingertip formation, something reserved for smaller aircraft.

Air refueling formation is what is called "trail formation," in that one airplane flies behind the other, albeit close enough to make physical contact. It requires a high level of training (and skill) but offers the advantage of an easily effected abort: the receiver pulls power while the tanker adds it. There is more to it than that, but you get the idea. Fingertip formation, by contrast, introduces a lot of variables from the high and low pressure zones of overlapping wings, and there have been more than a few midair collisions with one airplane quite literally sucked into another.

That was what I was thinking about as a passenger in the copilot's seat, watching the guy in the left seat fly fingertip formation with a tanker: one 200,000 lb (plus) aircraft flying so close to another weighing almost as much that our left wing was underneath and just behind their right wing. I asked our squadron commander about this and was told it was perfectly safe and that we did it to keep our flying skills sharp. A few months later there was a midair between a tanker and an AWACS airplane, and the Air Force made it clear in no uncertain terms that anyone caught flying unauthorized formations in any aircraft would get a one-way ticket to Leavenworth, the military's most infamous prison. All of a sudden, the fingertip formation program in our squadron went away.

A few years later I was hired into the Air Force's only Boeing 747 squadron (at the time), and shortly after I arrived I was medically grounded with cancer. I spent two months in a hospital, and soon after I returned the squadron commander was fired. There were videotapes circulating showing him flying fingertip formation with another of our 747s. In this case it was two 600,000 lb (plus) airplanes doing what I had seen in the smaller 707. I overheard him talking about it, acknowledging that he had been fired and forced to retire. He said, "it was worth it."

A case study

The most unkind, and most valid, insult ever given to an airline came from Robert Gandt in his excellent book Skygods: The Fall of Pan Am, when he said, "Pan Am was littering the islands of the Pacific with the hulks of Boeing jetliners." By the close of 1973, Pan American World Airways had lost ten Boeing 707s, not including one lost in a hijacking. At least seven of the ten crashes were due to pilot error. Pan Am initiated a study to find out what was wrong. While the study was being conducted, they crashed two more.


Photo: The first three Pan Am Boeing 707s (N709PA, N710PA, N711PA), Seattle, Washington, 1958, (Public Domain)

To get a feel for their culture at the time, Gandt tells the story of a captain flying a visual approach into Honolulu International Airport in the days when they did not use checklists or have callouts. The captain simply flew the airplane as he thought best while the first officer did his best not to offend the "skygod." Passing 600 feet, the first officer asked the captain if he was ready for the landing gear. The captain exploded with rage, saying, "I'll tell you when I want the landing gear." Two and a half seconds later, with a great deal of authority, he said, "Gear down!" The story doesn't end the way you would suspect. The captain reported the first officer's temerity to the chief pilot, and the chief pilot told the first officer that if he ever challenged another captain's authority he would be fired.

It is true this was a different time, but the culture at Pan Am when they transitioned from flying boats was that of the imperial captain, and it only got worse when they pioneered the jet age of airliners. It got so bad the FAA threatened to ground the airline. The company eventually got rid of all those skygods and turned into one of the safest airlines in the world. But until then, they provided case study after case study on how not to run a crew.

More about this: A CRM History Lesson: Pan American World Airways and the Boeing 707.

A lesson

In both my Boeing 707 and 747 squadrons we were what the Air Force then called "special duty" assignments. We were outside the normal assignment process, and getting hired required an interview with the squadron's command staff. In both cases we ended up with leadership that felt it was above the normal rules of the Air Force. Various Air Force rules were formally waived, and the squadrons tended to bend those already-relaxed rules further. But in both cases we were spared midair collisions because external forces managed to rein us back in. Pan American World Airways was not so lucky, but they managed to return to the fold after finally learning the lesson.

Some leaders and operators were not sufficiently competent

[Wetherbee, p. 14] Deficiencies in knowledge, skills, or attitudes at any level in the organization can result in a failure to prevent accidents. Qualified assessors should have been assigned to test knowledge, assess skills, and evaluate attitudes of all people who were contributing to hazardous missions.

A personal story and a case study

The RC-135S was a Boeing 707 variant assigned a spying mission; the "R" stands for reconnaissance. In the late seventies and early eighties, an RC-135S could usually be found sitting at Shemya Air Force Base, on Shemya Island, Alaska. The weather on this Aleutian island was usually poor, but the location was necessary to monitor a Soviet ballistic missile test area.

On March 15, 1981, a copilot ducked under a Precision Approach Radar (PAR) glide path and landed short of the runway at Shemya, destroying the airplane and killing six of the 24 crewmembers on board.


Photo: 61-2664 at Shemya, Paul Jeanes

As with most aircraft accidents, there were many related causes, but the striking fact is that this squadron appeared to have very good pilots who flew into this hazardous airport routinely and with great success. This particular copilot, however, had a history of flying below glide path, and his behavior appeared to be tolerated as the norm. Compounding the problem, the squadron's other pilots may have also tended to fly below glide path but were able to get away with it thanks to higher experience levels. The copilot survived and was asked about the prohibition in the Air Force instrument flying manual, Air Force Manual 51-37. He said he thought that manual only applied to the T-37, the airplane he flew in pilot training.

For more about this: Case Study RC-135S 61-2664.

A lesson

I went to Air Force pilot training at about the same time as this copilot and also flew the T-37. I knew full well that AFM 51-37 applied to all Air Force airplanes. It seems to me someone along the way should have realized that this pilot, and possibly others in the squadron, did not have the competence to correctly fly a PAR approach. I was flying an EC-135J when the accident report was released. The EC-135 and RC-135 are both derivatives of the C-135, a Boeing 707. Unlike the KC-135A, these airplanes were much heavier and had higher approach speeds; flying a PAR was a challenge. Many in our squadron knew some of those who had died in the RC-135, and our nonpilots wondered if our pilots were competent enough to have prevented the crash. We pilots, however, had no doubts.


Reading the signals, preventing the next accident

Of course there are countless textbooks, web posts, magazine articles, and seminars out there that tell you what not to do so you can avoid the next aircraft accident. The problem is that most operators in the organizations that will have that next accident are blind to the warnings. To them, as to most of us, it seems they are doing everything just right and the next accident will happen to the "other guy." What worries me, and should worry you, is that the other guy could be me or you. Captain Wetherbee's list of warning signals gives us something to look for:

  1. Does your organization place more importance on the desired results of your mission (getting from Point A to Point B) than on the activity required to do that (flying safely within all known procedures and regulations)?
  2. Does your organization spend time looking for weaknesses and other ways it may be vulnerable to missing something important?
  3. Does your organization earnestly and honestly use assurance programs, such as SMS and FOQA?
  4. Does your organization look the other way at violations of any rules, policies or procedures?
  5. Are your operators competent at what they do?

A case study in progress

About a year ago, as I write this, an Atlas Air Boeing 767 plunged into a muddy swamp near Houston-George Bush Intercontinental Airport, TX (KIAH), doing over 400 knots with the autothrust engaged. While the NTSB has not finished its investigation, it has released an airplane performance study. It appears that during the descent the takeoff / go-around function of the autoflight system was inadvertently activated, confusing the first officer, who was the pilot flying. He made a comment about airspeed and about the airplane stalling, though all indications were otherwise. Looking at a plot of the airplane's altitude versus airspeed and elevator position, it appears the first officer pushed the nose down aggressively while the captain pulled back. Once they popped out of the weather the first officer joined in pulling back on the elevator, but it was too late.


Photo: Elevator split, Airplane Performance Study, figure 7

The first officer had a history of failed checkrides at Atlas and at previous employers. The captain's record was only slightly better. According to the Director of Human Resources at Atlas Air, they had seen a "tough pilot market." Looking at these two pilots and their training at Atlas, in my opinion, the hiring standards were low and the company trained as best it could to fill cockpit seats. I think their culture emphasized filling cockpit seats over producing safe pilots, failed to look for weaknesses in their hiring and training processes, failed to implement or use an effective pilot evaluation system, and failed to ensure the competency of their pilots. In other words, they exhibited four of the five warning signals.

For more about this: Case Study Atlas Air 3591.

A personal story in progress

Years ago, while flying for TAG Aviation, we had a pilot retire who gave us a well-intentioned compliment during his exit interview that hit my flight department two different ways. TAG Aviation had well over 200 pilots at the time and double that number of personnel. The retiring pilot had been with TAG almost from the beginning and had been a member of several flight departments. He said at his exit interview that we were the best flight department he had ever been a part of in terms of adhering to Standard Operating Procedures, that we were "as close to being by-the-book" as he had ever seen.

Half our pilots were pleased with the compliment. The other half of us wanted to know, "what do you mean close?" He was referring to our disregard for 14 CFR 91.211: we didn't use oxygen above flight level 350 when one pilot left the cockpit. (This was in a Challenger 604 that could not go above flight level 410.) Our chief pilot would not budge on the subject. He was fired about a year later, and we immediately started flying by the book, even when it came to 14 CFR 91.211.

That was eighteen years ago. I am now starting my twelfth year leading my current flight department and am confident we have no questions on items 2, 3, 4, and 5. I worry about item 1 quite a bit. That is the nature of our business and if it doesn't worry you, it should.


Gandt, Robert, Skygods: The Fall of Pan Am, Wm. Morrow Company, Inc., New York, 2012.

Michel, Marshal L. III, The 11 Days of Christmas: America's Last Vietnam Battle, Encounter Books, New York, 2002.

Michel, Marshal L. III, Operation Linebacker II 1972: The B-52s are sent to Hanoi, Osprey Publishing, New York, 2018.

Wetherbee, Captain Jim, Controlling Risk in a Dangerous World, Morgan James Publishing, New York, 2019.