Photo: Costa Concordia, 13 June 2012, from Paolodefalco75 (Creative Commons)

Eddie Sez:

It is a sad commentary on our profession that we have had to adopt the phrase "Normalization of Deviance" as our own. It refers to pilots deviating from Standard Operating Procedures so often that the non-SOP practice becomes the new norm. It seems that every couple of years we get a new poster child in the case history books, and right now that prime example belongs to Gulfstream IV N121JM, which crashed attempting to take off from Hanscom Field, Bedford, Massachusetts, with its gust lock engaged.

There are two routes to normalizing deviance for pilots.

  • First, you can start your career free of standardization and miss the all-important introduction to SOPs. General aviation pilots who skip the military or the major airlines are particularly at risk. If you've never flown for a large organization where failing to "march in step" will jeopardize your paycheck, you will have to work extra hard to avoid this trap.

  • Second, you could make it to a high level of expertise but then succumb to the complacency fed by your successful experiences; you could be beaten down by the ever-present pressure to accomplish objectives (save time, save money, keep the schedule); or you might rebel against unrealistic SOPs and burdensome regulations.

Either way, you are living on borrowed time. Fortunately, you can change things.

  • The Academic Definition of the normalization of deviance comes from a book written by Professor Diane Vaughan following the 1986 explosion of Space Shuttle Challenger. The book is tedious reading but makes a very good case that even an organization as highly esteemed as the National Aeronautics and Space Administration (NASA) can be corrupted by the normalization of deviance.

  • The Aviator's Definition attempts to turn sociologist-speak into plain English and help us to recognize the normalization of deviance when we see it.

  • Objective-Based Normalization of Deviance — The primary way a highly professional pilot can be trapped into the normalization of deviance is to be ensnared by the pressure to achieve difficult objectives, such as making a schedule work despite unexpected weather. These objectives are usually placed on highly experienced experts, who can talk themselves into believing any normalization of deviance is justified. The Space Shuttle Challenger tragedy makes for a good case study.

  • Ego-Driven Normalization of Deviance — Experts who have achieved lofty career objectives but do not have sufficient oversight can normalize deviance simply because they can, without getting caught. The self-inflicted running aground of the cruise ship Costa Concordia is a case in point.

  • Experience-Based Normalization of Deviance — Experts who have accumulated an enviable amount of experience and a strong sense of confidence are at risk of normalizing deviance unless they have sufficient oversight and a strong peer group. The crash of Gulfstream IV N121JM makes this case.

  • Avoiding and Curing the Normalization of Deviance — Fortunately, you can avoid the normalization of deviance and even cure it after it has worked its way into your routine. But as with many things in life, avoiding it in the first place is much easier than curing it.

What follows comes from the references shown below. Where I think it helpful, I've added my own comments in blue.

I spoke to the aviation school at Bridgewater State University about this very topic in September 2016. BWSU has the leading aviation program in the Northeast, and about 80 pilots working toward airline careers were in attendance. You can view this presentation here:


Academic Definition

Photo: Challenger explosion, 28 January 1986, from NASA.

The term "normalization of deviance" was coined by sociology professor Diane Vaughn in her 1996 book, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, where she examines the tragedy of the 1986 launch of Space Shuttle Challenger. It is a useful term for us in the business of flying airplanes, but her definition (which follows) can be a bit ponderous. I've included what I could extract from the book because it is worth considering:

[Vaughan, pg. 58.]

  • Deviance refers to behavior that violates the norms of some group. No behavior is inherently deviant; rather it becomes so in relation to particular norms.

  • Deviance is socially defined: to a large extent, it depends on some questionable activity or quality being noticed by others, who react to it by publicly labeling it as deviant.

  • [In the case of the Space Shuttle Challenger incident], actions that outsiders defined as deviant after the tragedy were not defined as deviant by insiders at the time the actions took place.

[Vaughan, pg. 58.]

  • Individuals assess risk as they assess everything else—through the filtering lens of individual world view.

  • The diverse experiences, assumptions, and expectations associated with different positions can manifest themselves in distinctive world views that often result in quite disparate assessments of the same thing. When risk is no longer an immediately knowable attribute of the object and the possible harm associated with it depends on other, less knowable factors, we move into the realm of uncertainty and probabilities.

  • As experience accumulates, the formal assessment of risk will vary with changes in the technology, changes in knowledge about its operation, or comparison with other similar technologies.

  • Implied in the term "expert" is some technical skill, gained either by experience, by professional training, or by both, that differentiates the professional from the lay assessment of risk. Also implied is that professionalism will somehow result in a more "objective" assessment than that of the amateur.

  • But professional training is not a control against the imposition of particularistic world views on the interpretation of information. To the contrary, the consequence of professional training and experience is itself a particularistic world view, comprising certain assumptions, expectations, and experiences that become integrated with the person's sense of the world. The result is that highly trained individuals, their scientific and bureaucratic procedures giving them false confidence in their own objectivity, can have their interpretation of information framed in subtle, powerful, and often unacknowledged ways.

[Vaughan, pg. 65.]

  • [In the case of the space shuttle], the work group normalized the deviant performance of the SRB [Solid Rocket Booster] joint[s between sections]. By "normalized," I mean the behavior the work group first identified as technical deviation was subsequently reinterpreted as within the norm for acceptable joint performance, then finally officially labeled an acceptable risk. They redefined evidence that deviated from an acceptable standard so that it became the standard.

  • The culture of production included norms and beliefs originating in the aerospace industry, the engineering profession, and the NASA organization, then uniquely expressed in the culture of Marshall Space Flight Center. It legitimated work group decision making, which was acceptable and non-deviant in that context.

  • The second factor reinforcing work group decision making was structural, and the structure of regulatory relations perpetuated the normalization of deviance.

Aviator's Definition

You won't find this anywhere; I just made it up . . .

The normalization of deviance is the incremental change to standards we once thought inviolate, turning actions once thought unacceptable into the new norm. The path to normalizing deviance can be paved by a lack of proper training, an experience-based ego, or expertise-based over-confidence. The ease with which one slides into the normalization of deviance can be facilitated by "group think," a lack of oversight, and a poor peer group.


Objective-Based Normalization of Deviance

The primary way a highly professional pilot can be trapped into the normalization of deviance is to be ensnared by the pressure to achieve difficult objectives. These objectives are usually placed on highly experienced experts, who can talk themselves into believing any normalization of deviance is justified.

While the 1986 explosion of the Space Shuttle Challenger shortly after launch was the defining moment for the concept of the normalization of deviance, there were doubtless many examples before and after. Incredibly, Challenger was not the only space shuttle to fall victim to the normalization of deviance.

Space Shuttle Challenger

Figure: Solid motor cross section, from Space Shuttle Challenger Accident Report, Volume I, figure 14.

The Solid Rocket Boosters (SRBs) on the space shuttle were built by Morton Thiokol, quite literally the cheapest bidder. Each booster was 149' long and 12' in diameter, was manufactured in six sections, and was delivered to NASA in sets of two joined at the factory. The three combined sections were joined in the field. Each joint kept hot propellant gases on the inside with the help of two rubber-like O-rings and an asbestos-filled putty. The 1/4"-diameter O-rings each circled the rocket's entire 12' diameter. The secondary O-ring was meant to be redundant, a safety measure. But early in the program there was evidence of some "blow-by" past the primary O-ring. Engineers determined an "acceptable" amount of erosion in the O-ring, and for a while these norms held up.

Starting in 1984 the amount of damage to the primary O-rings was increasing. Engineers were alarmed but were later convinced the damage was slight enough, and the time of exposure short enough, that the risk was acceptable. In 1985 some of the SRBs returned with unprecedented damage; the majority came back with some damage, and in one case the secondary O-ring was also damaged. On one launch there was complete burn-through of a primary O-ring. In each case, the decision was to increase the amount of damage deemed acceptable and press on. When it was no longer possible to say the two O-rings were redundant, NASA decided to waive the requirement.

What also happened in 1985 was a series of launch decisions in colder and colder temperatures. While the overall shuttle program was designed with a temperature range of 31°F to 99°F as a launch criterion, the SRBs were never tested at the lower temperatures. In fact, Thiokol stated that O-ring temperatures must be at least 53°F at launch, or the rings would become brittle and allow "extreme blow-by" of the gases. There was also evidence that the O-rings could become cold-soaked and their temperatures would take time to recover from prolonged cold. But top-level NASA managers were unaware of the SRB design limitations, and the 53°F threshold didn't hold firm. For one launch the engineers said "Condition is not desirable but is acceptable."

Illustration: Pressurized joint-rotation effect (exaggerated), from NASA Teleconference, figure 11.5.

The temperature issue was made more complicated because there had been one launch where there was O-ring blow-by at 75°F, "indicating that temperature was not a discriminator." Of course this is ridiculous; temperature can be one of several discriminators. After the fact, investigators realized that "rotation" of the SRB could decrease O-ring contact with the joint. This was rotation not in the sense of one section of the SRB rotating on its circumference against the next section, but of one section rotating longitudinally. For lack of a better term, it wobbles from side to side and creates a gap, as shown above in figure 11.5. What would cause such a wobble? Perhaps shifting winds along the craft's upward path; there was indeed such a windshear the morning of Challenger's launch. Nonetheless, the 75°F blow-by gave the proponents of launching an avenue to weaken the case of those arguing for delay.

Temperatures on the morning of the launch were well below 53°F. Thiokol engineers recommended delaying the launch, but NASA managers applied pressure to Thiokol management, who were unable to convince their own engineers to budge. So Thiokol management elected to make it a "management decision" without the engineers and agreed to the launch. It was 36°F at the moment of launch. The O-rings on one of the field joints failed almost immediately, but the leak was temporarily plugged by charred matter from the propellant. About a minute after launch a continuous, well-defined plume from the joint cut into the struts holding the SRB to the main tank and the SRB swiveled free. Five seconds later the flame breached the main tank, which erupted into a fireball. The shuttle cabin remained intact until impact with the ocean, killing all on board.

Much of the reporting after the event focused on the O-rings. After the accident report was published, the focus turned to NASA managers breaking rules under the pressure of an overly aggressive launch schedule. But, as Professor Vaughan points out, they weren't breaking any rules at all. In fact, they were following the rules, rules that allowed launch criteria and other requirements to be waived. The amount of acceptable primary O-ring damage went incrementally from none, to a little, to complete burn-through. Over the years the practice of reducing safety measures with waivers had become normalized.

More about this: Report to the Presidential Commission on the Space Shuttle Challenger Accident.

Space Shuttle Columbia

Seventeen years later, on February 1, 2003, the Space Shuttle Columbia was lost as it reentered the earth's atmosphere. A piece of insulating foam from the external tank had broken away during launch and struck the leading edge of the left wing, damaging its thermal protection. When Columbia reentered the atmosphere, the damage allowed hot gases to penetrate and destroy the internal wing structure, which eventually caused the shuttle to become unstable and break apart, killing all on board. As with Challenger, NASA had failed to pay heed to engineering concerns about previous incidents of tile damage from separating foam.


Ego-Driven Normalization of Deviance

Experts who have achieved lofty career objectives but do not have sufficient oversight can normalize deviance simply because they can, without getting caught. The self-inflicted running aground of the cruise ship Costa Concordia is a case in point.

Photo: Costa Concordia, 13 June 2012, from Paolodefalco75 (Creative Commons)

On January 13, 2012, the cruise ship Costa Concordia departed Porto Civitavecchia, near Rome, Italy, heading north. Its route took it through the eight-mile stretch between Giglio Island and the Italian mainland. Captain Francesco Schettino ordered the ship to sail closer to Giglio as a "salute" to the islanders, apparently a normal practice for many cruise ship captains transiting this strait. He came too close and the left side of the ship hit the rocks, gashing the hull and sending water flooding into the ship's engine room. The ship lost power as a result and started listing to the left, where the 174-foot gash was. Winds pushed the ship back toward the island and the water in the hull caused the ship to reverse its list to the right. Meanwhile, it took an hour to begin the evacuation. Of the 3,206 passengers and 1,023 crew, all but 32 were rescued. Thirty bodies were found; two remain missing, presumed dead.

Captain Schettino was sentenced to prison for 16 years but Chief Prosecutor Beniamino Deidda wanted to elevate the blame: "For the moment, attention is generally concentrated on the responsibility of the captain, who showed himself to be tragically inadequate. But who chooses the captain?" He also noted that the crew didn't know what to do and were ill prepared for crisis management.

What goes unremarked in most of the reporting of this incident is that Captain Schettino was making this illegal and reckless "salute" in front of thousands of witnesses. If his behavior was so egregious, why had no one reported it before? It seems the entire company had succumbed to this form of deviance. The deviance had become the new norm.

More about this: "Costa Concordia: How the Disaster Unfolded," sky News, Monday 04 July 2016, http://news.sky.com/story/costa-concordia-how-the-disaster-unfolded-10371938.


Experience-Based Normalization of Deviance

Experts who have accumulated an enviable amount of experience and a strong sense of confidence are at risk of normalizing deviance unless they have sufficient oversight and a strong peer group. The crash of Gulfstream IV N121JM makes this case.

Photo: Gulfstream IV N121JM Wreckage, from NTSB Accident Docket, ERA14MA271, figure 10.

On May 31, 2014, the crew of Gulfstream IV N121JM started their engines without running the engine start checklist, neglecting the step that would have had them disengage the flight control gust lock. They then skipped the after starting engines checklist, which would have required the flight controls to be checked; had they done this, they would have realized the flight controls were locked. They also skipped the taxi and line up checklists, as well as the requirement to check the elevator's freedom of movement at 60 knots. They found they could not set takeoff thrust, realized it, and continued the takeoff anyway. The rest, unfortunately, is history.

As the news gradually leaked out from the NTSB accident investigation, we in the aviation world were stunned. How could two pilots have been so inept? But their airplane was outfitted with a quick access recorder, and we learned that this type of behavior was the norm for them. For example, the recorder revealed that they had skipped the flight control check on 98% of their previous 175 takeoffs.

These two pilots did not fly in a vacuum. They occasionally flew with contract pilots who witnessed their habitual procedural non-compliance. By tolerating the deviance, these contract crews played the role of enablers, reinforcing the behavior as normal.

More about this: Mishap / Gulfstream IV N121JM


Avoiding and Curing the Normalization of Deviance

At each stage of a new pilot's growth comes a time when he or she is tempted to think, "At last I know what I need to know." Some pilots may even arrive at the "At last I know everything there is to know" stage. With each new level of license and training, the concept of You Don't Know What You Don't Know should be reinforced. The new pilot, if not careful, may end up in the "deviant pilot" class without ever having achieved any level of expertise.

A new pilot who does not continue his or her education is at risk of stagnating into comfortable routines that do not take advantage of lessons learned by others. Poor training can be even worse than no training if you are being taught the wrong things. In either case, the pilot becomes too embarrassed to expose him or herself to further training. This is an easy trap to fall into when flying as an amateur because there is little or no oversight outside of a biennial flight review. When exposed to SOPs, these pilots are likely to react negatively: "I've been doing fine without them!"

An expert's path to deviance is different from a new pilot's; the expert should know better! The expert becomes one after significant professional training, significant and meaningful experience, and often peer recognition. These factors all solidify in the expert's mind a technical basis for deviance: they have considered all the factors and have made an expert's determination that deviance is not only acceptable but necessary for the good of the mission. In fact, the expert can rationalize that the deviance is actually safer.

With each pilot there is likely to be a combination of issues leading toward the normalization of deviance. These issues form a disease that impacts the pilot's professional health; fortunately, for each issue there are known cures.

Issue: Stagnation

Flying can be an expensive hobby, and at the point the money runs out, amateurs find themselves with no future dreams to chase. For professionals there may also come a time when there are no further ratings to achieve and the prospect of a new career chapter seems unlikely. In either case, pilots risk a mental stagnation that weakens any motivation to stay in the books or keep up with the latest techniques and procedures.

Issue: Poor Training

Your training is only as good as the instructor, and if you are taught to cut corners and to ignore all that has been learned over the years, you can be trained to deviate. This most often occurs when someone you respect, or someone in a position of authority, assumes the role of instructor but has already given in to deviant behavior. It can also occur when a professional training vendor has misguided ideas of what should and should not be taught, or does not exercise proper oversight of its instructors. Most professional pilots will agree that recurrent training is an absolute must when dealing with technologically complex systems in a dynamic environment; but how often?

Issue: Poor SOPs

We often find ourselves having to adjust, reorder, or even skip some standard operating procedures because they don't exactly fit the situation at hand, because they would take more time than a widely accepted shortcut, or because we think we have a better method. There are several problems, of course. Operating ad hoc, in the heat of the moment, we risk not carefully considering all possible factors. If we skip or reorder steps, we risk forgetting something important or failing to consider any sequential priorities. If we adjust an SOP on our own, crew resource management becomes more difficult as others have to guess about our procedures and techniques. Once we've violated the first SOP, it becomes easier to violate the second, and the third. Before too long the culture of having SOPs erodes, and when that happens, all SOPs become optional. In a small flight department, there is a low likelihood of "being caught" or challenged.

Issue: Compartmented Information

Pilots tend to compartmentalize information for a variety of reasons. On a personal level, keeping one airplane's procedures separate from another's can help keep cockpit procedures straight. We might want to insulate upper management from the nuts and bolts of what we do, reasoning that it is either too complicated to explain or uninteresting to non-practitioners. Some larger flight departments segregate fixed wing from rotary wing, Part 91 from Part 135, domestic from international, or any number of other artificial distinctions. Some pilots may even wish to keep things quiet for fear that their ignorance on some obscure topic could be revealed. Regardless of motivation, these efforts at compartmentalization give pilots a de facto secrecy. Pilots are much like engineers in this respect, and the Space Shuttle Challenger incident provides a good case study in how this kind of "structural secrecy" can lead to a normalization of deviance.

[Vaughan, pg. 238] By structural secrecy, I refer to the way that patterns of information, organizational structure, processes, and transactions, and the structure of regulatory relations systematically undermine the attempt to know and interpret situations in all organizations.

[Vaughan, pg. 391] The 31°F-99°F ambient temperature requirement for launching the entire shuttle system was created early in the program by other engineers. In a palpable demonstration of structural secrecy, many teleconference participants had jobs that did not require them to know Cape launch requirements and so were unaware that this temperature specification existed. [ . . . ] [Launch decision makers] did not know that Thiokol had not tested the boosters to the 31°F lower limit.

In the case of Space Shuttle Challenger, the entire organization was steeped in the idea that "anomalies" were to be dealt with at the working level, with only the worst problems channeled to top decision makers. The SRB anomalies were for engineers to deal with, so upper-level decision makers were never informed. Lower management levels had accepted an increasing level of deviance that upper levels might not have allowed.

A flight department can suffer the same ill effects of this de facto secrecy. Pilots could make decisions based on incomplete information or a misunderstanding of organizational goals.

Issue: Poor Peer Group

Of course no professional pilot sets out to bend the rules on the margins or flagrantly disobey all SOPs, but many end up doing just that. Good pilots can be corrupted by poor peer groups. If everyone else has already normalized deviant behavior, it will seem an impossible task to hold true to SOPs without upsetting the status quo.

Issue: Poor Priorities (Mission Objectives Over Safety)

One of the profound lessons of the Space Shuttle Challenger tragedy is that the decision makers believed they were making the right, reasoned decision each step of the way, though in hindsight they would have to agree many of those decisions were foolish. They didn't set out to doom the shuttle to its fate, but that's exactly what they did. NASA managers were under a great deal of pressure to increase the launch rate in an effort to prove the shuttle had progressed from the "test" to the "operational" phase. These managers were experts in their fields and were confident each design waiver was the right decision. As their experience with waiving criteria closer and closer to the edge of safety accumulated, their confidence increased. Professor Vaughan calls this the "Culture of Production," as opposed to the engineering culture we all assumed ruled the roost at NASA.

Pilots in the "expert" class are in remarkably similar circumstances. They are quite often under extreme pressure to minimize costs while expanding mission capabilities. While spending less and less on maintenance, training, and operating costs they were expected to fly longer distances and longer duty days. Skipping maintenance checks, training events, and checklist steps were at first approached carefully with considerable thought and consideration. Formal waivers may have been instituted in an effort to do it "just right." Before too long the envelope of what was considered a deviation and what was just "normal operating practices" started to merge. These decisions are rarely black and white and plainly labeled as "we are about to deviate from a procedure we once considered sacred." Rather, they are cleverly disguised. In fact, we can tell ourselves the deviation is for the good of the mission; we can tell ourselves we are actually being safer.

I was once in a flight department that had a strict 5,000' minimum runway length, and for good reason: the aircraft's normal landing distance was well within this if properly flown and if the runway was dry. When our new CEO made it clear he wanted to start flying into Hilton Head Island Airport, SC (KHXD), with a runway only 4,300' in length, the rank-and-file pilot answer was to suggest using Savannah International Airport, GA (KSAV), only 26 nautical miles away. Our chief pilot decided we would allow the shorter runway only if flown by a highly experienced pilot with more than 1,000 hours in type, on a VMC day, provided the runway was dry. As we started running out of such qualified pilots, the experience requirement was dropped. Before too long any pilot could be assigned the trip and the dry runway restriction was removed. After my first trip there I made it clear I would never go back, and soon every pilot in the flight department except the chief pilot had voted likewise. The chief pilot was fired (for other reasons) and his replacement took Hilton Head off our list of acceptable airports. The CEO said he was okay with this, but within a year sold the company. More about this: Pilot Psychology / The Art of Saying the Word "No".

A common problem in all types of professional aviation can be called "target fixation" or "losing the forest for the trees." We tend to become so focused on getting the job done we can lose sight of the need to do so safely.

Issue: Over Confidence

A highly trained, highly experienced, and highly praised pilot is more likely to deviate from SOPs than a novice because of a high level of confidence. Here too there are parallels with the space program.

[Vaughan, pg. 58.] Implied in the term "expert" is some technical skill, gained either by experience, by professional training, or by both, that differentiates the professional from the lay assessment of risk. Also implied is that professionalism will somehow result in a more "objective" assessment than that of the amateur. But professional training is not a control against the imposition of particularistic world views on the interpretation of information. To the contrary, the consequence of professional training and experience is itself a particularistic world view comprising certain assumptions, expectations, and experiences that become integrated with the person's sense of the world. The result is that highly trained individuals, their scientific and bureaucratic procedures giving them false confidence in their own objectivity, can have their interpretation of information framed in subtle, powerful, and often unacknowledged ways.

Issue: Complacency

For pilots who become comfortable with their day-to-day cockpit existence, there is a temptation to stop reading checklists and rely on memory, to stop taking precautions that never seemed necessary before, and to start taking the "easy way" over the "right way." Highly experienced pilots and pilots in managerial positions are at greater risk because they are less likely to receive needed oversight. More about this: Pilot Psychology / Complacency.


Concluding Thoughts on the Normalization of Deviance

The normalization of deviance is like a cancer, both for the individual and for the group as a whole. The individual is unlikely to realize he or she has been afflicted until it is too late; at that point the individual believes the deviant behavior to be normal. The cancer metastasizes quickly, and unless the group has a strong leader and a strong culture grounded in standard operating procedures, the entire group is at risk. I've seen this firsthand in the military and as a civilian. In both cases the experience level was very high and such behavior was unexpected. But in both cases one person can make a difference. More about my military experience with the normalization of deviance here: Flight Lessons 3: Experience.



Book Notes

Portions of this page can be found in the book Flight Lessons 3: Experience, Chapter 7.


References

"Costa Concordia: How the Disaster Unfolded," sky News, Monday 04 July 2016, http://news.sky.com/story/costa-concordia-how-the-disaster-unfolded-10371938

NASA Teleconference, January 27, 1986, from Vaughan, pg. 295.

NTSB Accident Docket, ERA14MA271

NTSB Aircraft Accident Report, AAR-15/03, Runway Overrun During Rejected Takeoff, Gulfstream Aerospace Corporation GIV, N121JM, Bedford, Massachusetts, May 31, 2014

Report to the Presidential Commission on the Space Shuttle Challenger Accident, June 6, 1986, Washington, D.C.

Vaughan, Diane, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, The University of Chicago Press, Chicago and London, 1996.