One could argue that the ability to make timely and correct decisions is an important part of just about any job. But few jobs would require quick and accurate decision making as a matter of life and death.
— James Albright
We, of course, have just such a job. Many believe good decision makers are born, not made. There is some truth to that; the ability to make good decisions is founded in one's biology. But even for those so gifted, decision making is an art that requires practice. Poor decision makers can get better. Great decision makers can see their skills atrophy if not practiced. So how do we improve our decision-making skills?
"Alternatives" by Daniel Oines, from Creative Commons.
Pilot decision making
I grew up in Aiea, Hawaii, a small plantation town overlooking the southern shores of Oahu. The sugar refinery is long gone, but we still have a great view of Diamond Head, Waikiki, the Honolulu International Airport, and Pearl Harbor. As a teenager growing up there, the best part of that view was the airport. I got to see the airplanes taking off and landing every day, and could harbor my secret dream of one day becoming a pilot.
And not just any pilot. But a pilot of a large airplane landing at Honolulu International.
In 1981 I got my wish. I was a second lieutenant in the US Air Force flying as a copilot in a KC-135 tanker. On the day of my first landing back home, I dialed up the ATIS and was surprised to hear the airport was IFR. The visibility was down to one mile. We called up the forecaster, who confirmed that it had been raining all day and would continue to rain all night. But no problem, airplanes were getting in off the ILS to Runway Eight Left.
And sure enough, that’s what happened. We broke out at 300 feet and I landed the airplane. I was beaming with pride and couldn’t wait to drive up the western ridge of Aiea to tell my parents what I had done. But first I had to drive up the eastern ridge to visit the in-laws.
They also had a nice view of the airport, but not tonight. It was still raining. I walked up the driveway and could see the glow of their TV through the screen door. I knocked on the front door and was greeted by a sea of in-law faces. Brother-in-law number one looked up and said, “You made it.” It was more of a question than a statement. I said, “Of course I made it.”
Brother-in-law number two pointed down the hill to where the airport should have been, but was not. “It’s been raining all day, how could you find the runway?”
ILS, from FAA Handbook 8083-15B, figure 9-33.
I explained how two radio beams were broadcast left and right of centerline and how a radio receiver in the cockpit sampled both beams and positioned a needle on an instrument that graphically showed me the extended centerline.
And I went on to explain that another set of beams was sent up at a three-degree angle, perpendicular to the first set, to give glide path information.
“Yeah,” brother-in-law number three said, “but how did you see the runway?”
I was tempted to give him the standard two-word answer for this situation: it’s magic.
But I couldn’t do that; my brothers-in-law would think I was making fun of them.
So I gave them the only other answer I could, to satisfy the question. I said, “I have really, really good eyes.”
“Ah,” they said, “we knew it!”
I was thinking about that story two weeks ago in Paris when I took this photo.
The airplane in the foreground is my Gulfstream G450, inside of which is the most advanced cockpit in the world. Or so I thought. The grayish-brown airplane in the distance is a Pilatus NG. I think the NG means next generation.
I walked over to the Pilatus to say hi and was treated to a tour, including the cockpit.
They have the Honeywell Apex system, which is equivalent to what I have in my Gulfstream, something we call PlaneView.
So for this crew, like me, the magic trick has gotten easier.
To the non-pilot world it really is magic, being able to see through the night or the clouds to find a runway.
To the novice pilot, the magic is being able to keep the needles centered when looking outside for the runway lights.
But for us, we in the professional pilot class, the real magic happens in the last two seconds.
Here is the view from the cockpit taken at 500’ above the runway, about 35 seconds from decision altitude. The weather is above minimums, but just barely.
The airplane will announce 500, 400, “approaching minimums,” and 200, at which time we will be at minimums.
For you pilots, what you are looking for is a set of MALSRs, medium intensity approach lights with a 1,000 foot roll bar. Now let’s see if we can spot the lights in time to make a decision to continue . . .
I took this video about two years ago and when I spotted the runway I landed. It seemed rather routine. I timed the video and discovered that the first approach light appeared at 220 feet, 1.7 seconds before the decision had to be made. Did you make your decision in 1.7 seconds?
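As a rough sanity check on that 1.7 seconds, consider the geometry. The 140-knot groundspeed and 3-degree glide path below are my assumptions for a typical approach, not figures from the video:

```python
import math

# How much time is there between spotting the lights at 220 feet and
# reaching a 200-foot decision altitude? Assumed numbers: ~140 kt
# groundspeed on a 3-degree glide path (typical, not from the video).
groundspeed_kt = 140
glide_path_deg = 3.0

fps = groundspeed_kt * 6076 / 3600                          # feet per second over the ground
descent_fps = fps * math.tan(math.radians(glide_path_deg))  # vertical speed, ft/s

time_available = (220 - 200) / descent_fps
print(f"Descent rate: {descent_fps * 60:.0f} fpm")
print(f"Time from 220 feet to minimums: {time_available:.1f} seconds")
```

At those assumed numbers the descent rate is about 740 fpm and the window works out to roughly a second and a half, which squares with what the video showed.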
Now, between that point and the end of the video, how many of you were thinking about Title 14 of the Code of Federal Regulations, Part 91, Section 91.175? That’s the one that says: “no pilot may operate an aircraft below the authorized MDA or continue an approach below the authorized DA/DH unless” and then gives a list of three conditions. Were you thinking about that?
It is simply amazing the amount of stuff we are responsible for knowing.
Not only do we have to know all this stuff, we have to be able to retrieve that knowledge at a moment’s notice, make a decision, and act.
In 1.7 seconds!
It is obvious we can’t know everything, but we have to know a lot. And we have to remember the important stuff and be able to use it reliably. How do we do that?
So that becomes step one in our road to becoming better decision makers . . .
Step 1: acknowledge ignorance
We have to acknowledge our ignorance.
That sounds pretty harsh, but let’s look at that word before we pass judgment . . .
The nature of aviation is that there will always be something you don’t know.
There is just so much to know; even after you learn it, you can forget it.
The condition of being uneducated – we have schools to fix that.
The condition of being uninformed – you can inform yourself with books, movies, magazines, and the web. In fact, I’ll have three suggestions for you about this in a bit.
Now what about unaware?
The problem with not knowing something is that you don’t know that you don’t know it.
I run into pilots all the time who don’t seem to grasp this idea, and others who learn it early on.
I learned this secret . . . you don’t know what you don’t know . . . while flying the T-38 during Air Force pilot training.
I soloed early and was well past the stage where landings should have been an issue when I almost washed out for ignorance . . .
That’s not what the grade sheet would have said, but it might as well have.
I had flown five or ten solo sorties, passed my aerobatics and pattern check ride, and was getting comfortable in the airplane. By comfortable, I mean I started to fly the airplane my way, not the Air Force way.
Then I started flying with my instructor again, as we graduated to the navigation and formation phase . . .
Video: T-38 Pattern, John Cabigas.
One day I was coming back from a navigation sortie when my instructor yelled at me for landing hot.
He continued to yell as I pulled up for another pattern, and I realized I had landed a little bit long. So on the next landing, I made sure I touched down very near the approach end of the runway. That only made things worse and he failed me on the ride.
That was my first failure in pilot training. I had to go up again with a different instructor who kept his mouth shut and failed me again with three words on my grade sheet: landings too hot.
So I had one more try or my flying career was over.
I flew a basic aerobatic sortie and the evaluator kept saying the same thing over and over again: “Nice.” On the pattern entry: “Nice.” Final turn: “Nice.” And then I put the airplane down on the very first micron of that runway. If it were a game of tennis, the line judge would have been tempted to say “out,” but would have to acknowledge that I had nailed it. The evaluator said, “Why did you do that? You fly such a nice airplane until touchdown.”
I said, “Because my instructor says I’m too hot on landing.” He said, “You don’t know what that means, do you?” And he was right, I didn’t. He explained that landing hot meant you were too fast. [T-38 pattern rationale] When I tried to put the airplane down earlier, I was only making matters worse. He gave me another chance and I landed the airplane in the touchdown zone, on speed. I passed the ride and got my wings later that year. It always makes me cringe, reliving that story. Thirty-six years ago my flying career could have ended before it started because of ignorance.
But you have to be on guard against ignorance, even after flying for years and years . . .
For example, the case of N128CM is an infamous one in the PC-12 community and for good reason. All on board were killed at the end of a string of very bad decisions.
It would be tempting to chalk this up to a pilot exceeding fuel imbalance limitations. But it was more than that.
A critique would go something like this:
The pilot didn’t use fuel ice inhibitor on previous flights that day.
The pilot didn’t sump the fuel prior to the accident flight, which would have let him know there was ice in the left tank.
The pilot didn’t immediately point the airplane at a runway once imbalance limitations were exceeded.
Each of these “didn’ts” violated a flight manual requirement.
NTSB Accident Report, from NTSB AAR-11/05.
If all you did after reading this NTSB report was conclude, “I would never have done that,” you are missing a lesson that might come in handy one day.
The key is to put yourself in the pilot’s shoes and ask the question, “Why?”
Why didn’t he sump the fuel?
Perhaps one day he forgot and nothing bad happened. Each “forget” incident reinforced in his mind that this mandatory procedure wasn’t so mandatory after all.
It was certainly a case of complacency, but it was also a case of ignorance because there was information out there that would have convinced him of the need to sump his fuel. . .
More about this: PC-12 N128CM.
Pilatus Pilot's Operating Manual, from PC-12 Series.
He apparently didn’t know about the case of N666M, another Pilatus from just a year prior.
That PC-12 had the same issue with different results; that pilot landed the airplane before the balance problem got out of hand. The case of N666M . . . no accident report because there was no accident. But there could have been.
For a non-Pilatus pilot, certainly, and perhaps for this pilot, there is a mystery as to why ice in one tank not only accelerates the depletion of fuel from the opposite tank but adds to the fuel in the blocked tank. Even a single pump in this airplane sends more fuel than the engine needs, and the unused fuel is returned to both tanks. The situation will get out of control even faster than one might predict.
No matter how much time you have total or in type, you need to keep plugged in, you need to keep learning.
So step 1 on your path to better decision-making is to acknowledge ignorance.
The learning never stops.
But notice that even with the appropriate knowledge, we don’t always have the time for what most would consider prudent decision-making.
Here is what a business school graduate would call a normal decision-making process. You come up with some options; the more the better, but three would be a minimum.
You develop criteria to evaluate these options.
You rank order the options.
You take the best choice.
Who can argue with that?
Step 2: understand time-critical decision-making
In aviation we try to use the normal decision-making flow when we can.
But sometimes we can’t.
Sometimes we just don’t have the time.
We pilots aren’t alone in this quandary. Consider the firefighter. . .
Research psychologist Gary Klein studied firefighters, expecting to support the long-held notion that in the heat of battle they didn’t have time to weigh many options and so considered only two. But what he found was something completely unexpected.
One of the firefighters he interviewed claimed his Extra-Sensory Perception (ESP) saved his life. Here is his story:
It was a straightforward house fire in a one-story house in a residential neighborhood. The fire was in the back, in the kitchen area. The lieutenant sent his hose crew into the house, to the back, in order to spray water on the fire, but the fire just roared back at once.
“Odd,” he thought. The water should have had more of an impact. They tried spraying it again, with the same result. They retreated a little to regroup.
Then the lieutenant started to feel that something was wrong. He didn’t have any clues; he just didn’t like the idea of staying inside the house and ordered his men out of the building – a very average house, with nothing out of the ordinary.
As soon as the men left the building, the floor they had been standing on collapsed. Had they stayed inside, they would have been plunged into the flames below.
The firefighter’s experience pattern wasn’t agreeing with the situation. The event wasn’t unfolding as his experiences would have dictated.
He didn’t know there was a basement, he didn’t know the fire was actually under the living room where he and his men were standing before they left.
The living room was hotter than they would have expected for a kitchen fire.
Fires are usually noisy and for a fire to be this hot, he would have expected more noise.
In hindsight, the events were obvious.
In the heat of the moment, it wasn’t making sense.
His reaction was to withdraw.
He didn’t ponder, he didn’t consider his options, he just made the decision.
It was his accelerated decision-making — and not ESP — that saved the day.
Source: Klein, Chapter 4.
Psychologists have come to call this process “Recognition-Primed Decision Making.”
When armed with enough experience, our brains match the events against an internal database and select an option as quickly as possible.
If the decision-maker has enough experience, he or she will then simply consider if the option is feasible, and if it is, act.
Note that the decision may not be the best choice, but if time is taken to come up with the optimal decision, it could be too late.
There isn’t enough time to make it perfect.
Some would call this intuition.
The definition of intuition, from Merriam-Webster.
This is the classic dictionary definition but medical science has come some distance since Webster thought this one up.
There is physiology to suggest we can delete the words “without any proof or evidence.”
In our profession, as with fire fighters, intuition is a skill developed from experience.
There is a time for rational decision-making but there are also times when you just don’t have the time.
But remember that time-critical decision-making works best when you have good intuition.
Step 3: improve intuition
There is no doubt that intuition works, but you need the right kind of experience to make it effective.
But experience is expensive (in terms of flying time, dollars, and lives).
Reading an accident report can be helpful . . . but you have to know how to do that to make it count . . .
Let’s try one out for size . . .
In the year 2000, Southwest Airlines Flight 1455 overran the runway while landing at Burbank, California.
It is easy to second-guess the pilots because they made so many mistakes along the way. They ended up crossing the threshold too high, too fast, and with too high a sink rate.
The NTSB said they were left with only one option: to go around. But that isn’t the decision these pilots made. You would have never done that, right?
If that is all you do when you study this mishap, you aren’t going to get anything from it.
But if you break it down into the individual decisions along the way, you might have a different opinion. Let’s do that now.
They were one of many Southwest flights landing at Burbank that day and as many of you know, the sky is crowded and ATC likes to pack them in tight all over the LA basin.
They were landing on a 6,032’ runway and the weather was good.
As they got closer and were level at 7,000 feet, they were told to “maintain 230 or greater until advised.”
Is that a big deal? We get that all the time, right?
At 10 nm they were at 6,000 feet and were offered a visual approach. Is that a big deal?
On the one hand, most aircraft descend optimally on a 3-degree glide path, which equates to about 318 feet per nautical mile. At 10 nm they would have been better off between 3,000 and 4,000 feet. So they were too high.
On the other hand, that phrase, “follow company,” means the previous Southwest 737 was doing a visual. Why can’t they?
We’ve all salvaged approaches like this before, but at 230 knots?
The pilots on Flight 1455 accepted the visual.
They were cleared for the visual approach, which technically removes the 230-knot speed restriction. The pilots missed this fact and waited another minute before starting to slow down.
Now they needed a 6° glide path and they needed to slow down to approach speed all with only 8 nm to go. Can a 737 do that?
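The geometry can be sketched quickly. The figure of roughly 5,200 feet above the field is my approximation of their situation, not a number from the report:

```python
import math

# Required glide path, given height above the runway and distance to go.
def required_glide_path_deg(height_ft, distance_nm):
    return math.degrees(math.atan2(height_ft, distance_nm * 6076))

# The 3-degree rule of thumb: about 318 feet of descent per nautical mile.
per_nm_3deg = 6076 * math.tan(math.radians(3.0))
print(f"3-degree path: {per_nm_3deg:.0f} ft per nm")

# Roughly 5,200 ft above field elevation with 8 nm to run (illustrative
# numbers approximating the SWA 1455 geometry, not exact report data).
print(f"Required: {required_glide_path_deg(5200, 8):.1f} degrees")
```

Double the normal glide path, while also needing to bleed off some 90 knots of airspeed.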
Now let’s say you are the first officer and the captain elects to continue. What would you do?
In this case, neither pilot considered going around.
Now they have a 9-degree glide path in front of them while still flying 60 knots faster than their target approach speed.
Oh yes, they also have a 5-knot tailwind.
An accident investigator would call this “pilot continuation bias.”
As pilots, we are pre-wired to get things done; we are pre-wired to land.
Passing 1,800 feet they had a VVI of nearly 3,000 fpm and were still above 200 knots; the GPWS had progressed from “sink rate, sink rate” to “whoop, whoop, pull up!” Southwest stable approach criteria are evaluated at 500 feet. They were still above 500 feet when they crossed the runway threshold.
It took them half the runway to touch down; their average speed in the flare was 195 knots. They were still doing 32 knots when they exited the runway. Nobody was hurt but the aircraft was destroyed.
The accident report cited the controller for positioning the airplane “too fast, too high, and too close to the runway to leave any safe options other than a go-around maneuver.”
Did you experience that case study and end with the thought, “I would never have done that”?
That might be true.
But I would like you to at least consider, “that could have been me.”
I have been set up before and had to, in the end, tell approach control I couldn’t help them out any more. Sometimes they back down; sometimes I end up delayed.
But in each case my intuition told me there is something else going on here . . . remember my “too hot” landing in the T-38.
I wondered why speed was so critical when there are so many other variables, but if you throw out the things you have no control over, you end up with this formula. This is the energy your airplane has crossing the threshold, and also the energy it has to get rid of to come to a safe stop.
It says the kinetic energy of an airplane is equal to one-half its mass times its velocity squared: KE = ½mv². In other words, velocity matters more.
Increase the weight by 10% and kinetic energy goes up 10%. Increase the speed by 10% and kinetic energy goes up 21%.
SW 1455’s target approach speed was 138 knots. They actually touched down at 182 knots, halfway down the runway. That extra 32% of speed meant an extra 74% of kinetic energy for the brakes to kill.
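The arithmetic behind those percentages, using the KE = ½mv² relationship:

```python
# Kinetic energy scales with the square of speed, so a modest speed
# excess becomes a large energy excess. Comparing SWA 1455's actual
# touchdown speed to its target approach speed:
target_kt = 138
actual_kt = 182

speed_increase = actual_kt / target_kt - 1          # fractional speed excess
energy_increase = (actual_kt / target_kt) ** 2 - 1  # fractional energy excess

print(f"Speed:  +{speed_increase:.0%}")   # about +32%
print(f"Energy: +{energy_increase:.0%}")  # about +74%
```

Mass and the constant one-half cancel out of the ratio, which is why only the speeds are needed.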
Doesn’t knowing this formula give you a better understanding of why approach speed is so important? Perhaps your subconscious will spring into action next time approach sets you up too high, too close, and too fast; and it will cause your conscious to say “go around.”
We know we can broaden our experience with a good case study. But there is one more trick to make this even better. . .
You need to consider these case studies with a little emotion.
I can remember exactly where I was on this day. I know which room of the house I was in, who I was with. And I remember all of this vividly even though it took place nearly fourteen years ago.
And yet, I routinely forget where I put my car keys.
Why is this?
Neurobiologists have come to call this the modulation of memory storage. Emotional events are often remembered with greater accuracy than events that lack an emotional component.
The secret to remembering something important to you is to learn it emotionally. Here is one more case study that ties ignorance, time-critical decision-making, and intuition all together.
Recall that in 1980 I was a KC-135A copilot.
One day the next year I was sitting in a hotel room in Montgomery, Alabama with nothing to do until the next day when I was to fly our airplane home to Loring Air Force Base, Maine. They had just installed an avionics upgrade.
That night I was watching TV and caught the tail end of an interview with a very distinguished-looking airline pilot, a model of a cargo DC-8 on the table behind him.
Aerial view of KJFK, Joe Mabel, from Creative Commons.
He was talking about landing his airplane at JFK airport. They had just replaced the navigators with an inertial navigation system and he was getting comfortable using it for all phases of flight.
He was especially interested in the ground speed of his airplane during final approach.
He noticed that his ground speed was not making any sense.
He had an approach speed of 145 knots, the winds were down the runway at 15, but his ground speed was only 100 knots.
He was missing 30 knots. When this happened to him, in 1975, we knew what windshear was, but didn’t really understand it. At 300 feet his 145 knots disappeared and he had to firewall all four engines just to make the runway.
After the fact, he thought that maybe he should have added that missing 30 knots to his approach speed.
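His groundspeed check can be sketched as a simple comparison. The helper function is mine, not his, but the numbers are the ones from his story:

```python
# With the wind reported straight down the runway, groundspeed on final
# should be roughly airspeed minus the headwind. Any large shortfall is
# airspeed that a windshear can take away on short final.
def shear_deficit_kt(approach_speed_kt, headwind_kt, groundspeed_kt):
    expected_gs = approach_speed_kt - headwind_kt
    return expected_gs - groundspeed_kt

# His 1975 JFK approach: 145 kt approach speed, 15 kt headwind, but only
# 100 kt over the ground.
print(shear_deficit_kt(145, 15, 100))  # 30 knots missing
```

That missing 30 knots is exactly what he thought, after the fact, he should have added to his approach speed.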
I thought that was fascinating and thought about it through the night.
The next day they released our airplane and I was pleased to see they had installed inertial navigation systems and sitting there, parked next to my knee, was an INS I could call my very own. For the first time in my flying career, I would have access to an instantaneous readout of groundspeed.
We flew north to our home base and I called in to get the weather. I was shocked by what I heard. The ceiling wasn’t too bad, around 500’, but the winds were howling at 50 knots. But at least they were straight down the runway. The pilot decided it would be a good night for copilot training, so he gave me the landing.
KC-135 ILS, from 1KC-135(A)-1.
In the tanker world we were just starting to adopt the idea of an en route descent to a low altitude instrument procedure. I normally flew the last vector to final at 200 knots, waited for the glide slope to start moving and then started adding flaps and the landing gear.
I was at 200 knots that night, on the localizer beam, but the glide slope needle wasn’t moving. It was taking forever and my normal timing cues were not working. I looked down at my fancy new INS and saw that we were only doing 60 knots over the ground.
Should we have abandoned the approach? Absolutely. Did we? No.
Keep in mind we had a cold war mind-set, our job was to send bombers over the north pole and that night our job was to get our airplane home.
As crew members, we somehow accepted the risks. Between 1958 and 1981, the year this happened, the KC-135 had suffered 58 losses, that’s 2 and a half airplanes every year.
So I was missing 90 knots. What would that DC-8 captain do?
I couldn’t put it all together so I decided to fly at the limiting speed for my final notch of flaps, which I think was 180 knots. That was an extra 45 knots of speed. The airplane sounds different at that speed and everyone in the cockpit was silent. They knew something wasn’t right.
We broke out at 500 feet and there was a collective sigh of relief from the other three. I was still worried about the ground speed, so I stayed on the gauges.
I was right on glide path at 300 feet when the bottom fell out. It was as if we were in an elevator and the cables snapped. I hauled back on the yoke. The airspeed was dropping so fast I thought the needle was broken, so I firewalled all four engines. It took the engines time to spool up, so the speed continued to unwind until about 100 feet, when they kicked in; at 50 feet they jerked the nose upward.
That was a good thing but the next thing I felt was the wheels touch ever so gently, so I pulled all four engines to idle.
We were pretty quiet in the cockpit as we taxied in. The first person to say anything was the navigator. He keyed the interphone and said, “copilot, nav.” I said, “go ahead nav.” He said, “when my son is born, I’m naming him after you.”
The next day the navigator and boom operator both admitted they thought we were dead when that initial plummet occurred. The pilot said he was wondering why I was flying so fast but then thought it was a good thing I did, without that extra 45 knots we would have been dead.
Flying Tiger Line DC-8, Aero Icarus, from Creative Commons.
Me? I was thinking about that DC-8 captain. I think about that DC-8 captain a lot. That night I didn’t catch his name or even his airline. I have made efforts, but I never knew who he was.
That was in 1981. Last month Business & Commercial Aviation Magazine asked me to write an article about windshear for the August issue. I was studying the NTSB report about Eastern Airlines Flight 66, which crashed at JFK in 1975. It made reference to a Flying Tiger Line DC-8 that landed just prior and the captain who obsessed over ground speed.
I called the Flying Tiger Line Alumni Association and they verified it was him, Captain Jack Bliss. He devoted the rest of his flying career to studying windshear and advocating some of the changes we have seen over the years. He passed away in 2010 and I dearly regret that I was never able to meet him and say thank you.
His story, on TV that night in Alabama, made an emotional connection and I think my obsession with ground speed comes directly from him.
I’ve never flown a DC-8 and I’ve only flown into JFK a handful of times. But I have that experience logged away in my subconscious.
So there you have it, three steps to learning how to make better decisions and the secret behind each.
If you want to become a better decision maker,
you need to keep learning, stimulating your subconscious with new knowledge to keep the brain excited;
you need to know when deliberate decision-making is appropriate and when time-critical decision-making is required;
you need to broaden your experience base so you can better rely on your pilot’s intuition.
I hope I have fed your subconscious today and that someday, when you need to make an accurate and timely decision without a committee, your time-critical decision making will save the day.
I get asked now and then what my aviation library looks like. Here is the bookshelf. There are also hundreds of digital materials here: References. Where I am allowed to provide the digital copy, I have. Otherwise I've done my best to describe them.
Aeronautical Information Manual (AIM), U.S. Department of Transportation
Air Force Manual (AFM) 51-9, Aircraft Performance, 7 September 1990
Air Training Command Manual 51-3, Aerodynamics for Pilots, 15 November 1963
Connolly, Thomas F., Dommasch, Daniel O., and Sheryby, Sydney S., Airplane Aerodynamics, Pitman Publishing Corporation, New York, NY, 1951.
Davies, D. P., Handling the Big Jets, Civil Aviation Authority, Kingsway, London, 1985.
Dekker, Sidney, The Field Guide to Understanding Human Error, Ashgate Publishing Limited, Hampshire, England, 2006.
Dole, Charles E., Flight Theory and Aerodynamics, John Wiley & Sons, Inc., New York, NY, 1981.
Fallucco, Sal J., Aircraft Command Techniques, Ashgate, Farnham, England, 2002.
Gann, Ernest K., Fate is the Hunter: A Pilot's Memoir, Simon & Schuster, New York, 1961.
Hage, Robert E. and Perkins, Courtland D., Airplane Performance Stability and Control, John Wiley & Sons, Inc., 1949.
Hogben, Lancelot, Mathematics for the Million, Penguin Books, Ontario, Canada, 1983.
Hurt, H. H., Jr., Aerodynamics for Naval Aviators, Skyhorse Publishing, Inc., New York NY, 2012.
Kanki, Barbara; Helmreich, Robert; and Anca, José, Crew Resource Management, Academic Press, Amsterdam, 2010.
Lehrer, Jonah. How We Decide, Mariner Books, New York, New York, 2009.
Morrison, James W., Engineering Fundamentals, Arco Publishing Company, Inc., New York, NY, 1978.
Pagen, Dennis, Performance Flying, Sport Aviation Publications, 1993.
Petroski, Henry, To Engineer is Human, Vintage Books, New York, NY, 1992.
Reason, James, Human Error, Cambridge University Press, New York, NY, 2009.
Aeromedical Training for Flight Personnel, Department of the Army Field Manual 3-04.301, 29 September 2000
FAA-H-8083-15B, Instrument Flying Handbook, U.S. Department of Transportation, Flight Standards Service, 2012
Flight Safety Foundation, Aviation Safety World, "Pressing the Approach," December 2006
Gulfstream GIV Operating Manual, Revision 9, October 11, 2002
Klein, Gary, Sources of Power: How People Make Decisions, The MIT Press, Cambridge, Massachusetts, 1999.
May Day: Who's in Control?, Cineflix, Episode 72, Season 10, 28 February 2011 (Turkish Airlines 1951)
NTSB Aircraft Accident Report, AAR-11/05, Loss of Control While Maneuvering, Pilatus PC-12/45, N128CM, Butte, Montana, March 22, 2009
Pilatus Pilot's Operating Handbook and FOCA Approved Airplane Flight Manual (also FAA approved for U.S. registered aircraft in accordance with FAR 21.29), PC-12 Series, revised 1 September 1984.
Swiss Confederation Final Report No. 1793 by the Aircraft Accident Investigation Bureau, Concerning the accident to the aircraft AVRO 146-RJ100, HB-IXM, Operated by Crossair under flight number CRX 3597, on 24 November 2001, near Bassersdorf/ZH.
Technical Order 1C-135(K)A-1, KC-135A Flight Manual, USAF Series, 25 April 1957