“This particular aircraft doesn’t have seatbelts, but we only have it for this one boogie--and we’ve never had a forced landing, anyway.”
“There’s no AAD in this rig, but I’m only going to jump it this once while my regular rig is being repacked. It’s just so I don’t miss the record attempt. I’ll be back on my regular rig on the next load.”
“We always jump in cloud here. Otherwise we’d never get to jump! The pilot has GPS, anyway, obviously, and he’s never been wrong.”
The final sentence--which always follows, right?--is the kicker:
“I’m sure it will be fine.”
Are you? Really?
USPA Director of Safety and Training Jim Crouch introduced a really important concept in his April 2017 Parachutist column (“Safety Check”). In it, he brings up The Challenger Launch Decision, written by sociologist Diane Vaughan. Vaughan very usefully summarized the kernel of this human tendency. She even coined a term for it: the “normalization of deviance.” Normalization of deviance comes up pretty much everywhere in life (foregoing your helmet just to bike down to the neighborhood park; speeding; not bothering with the condom). High-variable, high-pressure, high-safety-requisite circumstances breed the normalization of deviance like bunnies at a bunny swingers’ convention.
For some insight into how the normalization of deviance affects you in your airsports career, let NASA astronaut Mike Mullane bend your ear. Mullane was a fighter pilot in 1978, when he was selected as a Mission Specialist in the first group of Space Shuttle astronauts. He chalked up three space missions (aboard the Shuttles Discovery and Atlantis), spending more than 350 hours in the void. And, solely in the years after he celebrated his 60th birthday, Mullane summited Mt. Kilimanjaro, Mt. Rainier and 35 of Colorado’s 14,000-foot peaks. You can safely assume that the retired colonel is an expert in managing his own risk envelope and that of those around him--and, yet, even he is still influenced by the normalization of deviance. How ‘bout that.
Why is it so tough to fight creeping immunity to unacceptable risk? ‘Cause damn, it’s hard. It’s cultural; it’s about preserving a certain quality of relationship. It’s personal; it’s about preserving a certain self-image. Finally, it’s transactive; it’s about trading off a potentially good experience now for the chance to have more good experiences later, in the absence of much data at all.
“The natural human tendency,” Mullane notes, “particularly in pressured circumstances, is to want to take a safety shortcut. [You say,] ‘I’ve done a [jump] like this a thousand times in the past, and nothing bad has ever happened. I can certainly do it this one time [...] and nothing bad is going to happen. [...] The absence of something bad happening when I took this safety shortcut means that it’s safe to do so again.’”
There will always be a next time. And you’re going to be mightily tempted to do it again. When you do it--whatever ‘it’ is--enough times, the shortcut becomes the norm. The loop is reinforced. In Mullane’s words, “The deviance is now invisible to you.”
And when invisible deviance leaves a very visible mess? Well, Diane Vaughan coined another term in her book for that eventuality: a “predictable surprise.” Those involved in the Challenger debacle readily admit that the explosion (and the resulting deaths) constituted a predictable surprise. So does a catastrophic wingsuit collision in the absence of one jumper’s AAD. So does a plane full of broken jumpers after a forced seatbeltless landing (of which--make no mistake--there are very many). So does a double tandem fatality at a dropzone with an it’ll-be-fine attitude towards instructor training.
The itchy issue we face as airsports athletes is that we’re not under pressure from the government, as Mullane and NASA were. We’re not under pressure from the market. The pressure you’re under on the dropzone is your own. If you think it’s a good idea to scratch, you can damn well go ahead and scratch. You can roll your eyes at anyone who gets after you for it--the manifest, your buddy, your team at the Nationals. Most of the time, though, you don’t. You stay on the load, and--probably significantly more than nine times out of ten--you build another nanolayer on your normalization-of-deviance callus.
The old truism that familiarity breeds complacency makes a little more sense now, no? That newbies are generally more risk-averse than intermediate-to-mid-career jumpers (a trend which tends to reverse as the jumper amasses significant empirical data)? That you’re more willing to do--well--gloriously stupid shit at a dropzone you know really well, as opposed to one you’re just visiting?
Take it from Richard Feynman, who compared the practice of predictive reasoning to Russian roulette: “The fact that the first shot got off safely is little comfort for the next. [...] Nature cannot be fooled.”
In real life, of course, it’s more uncertain than that. Feynman was talking about binary predictive reasoning (with an either-A-or-B result). We’re not playing a binary game when we’re jumping and flying; we’re not playing Russian roulette. Honestly, we don’t even know how many bullets are in that gun. But we’d better remember that it is a gun, and it is loaded, somewhere in there--and the safety culture we’ve inherited is a desperate attempt to introduce proven failsafes in the face of our old nemesis, randomness.
Walking out to the pointy end is fun. Randomness is fun. Deviance is fun. That’s a big part of why we do this, right? That said: understanding why we make the decisions we make--and, perhaps, even learning to make better ones--can do much to extend a career.
For more, do yourself a solid and check out Vaughan’s The Challenger Launch Decision, the book that originally coined the phrase. It’s a riveting read--and I bet you’ll readily recognize the culture which worked to create the conditions for the tragedy.