
How Cognitive Bias Is Messing With Your Skydiving

By nettenette

Three Cognitive Tools to Help Keep You In One Piece


Image by Kenneth Grymen

Uncertainty is a foundational element of skydiving, and managing that uncertainty is one of our most important responsibilities in the sport. Right? Right. Unfortunately: what we know from laboratory experiments is that when humans come up against probabilities—all of which are, technically, conditional probabilities—our minds seize up when they try to make an inference from that data.
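Here's what that seizing-up looks like in numbers. Below is a minimal worked example of a conditional-probability inference, sketched in Python; every figure in it (the malfunction rate, the gear check's hit rate, the false-alarm rate) is invented purely for illustration, not pulled from any real incident data:

```python
# Bayes' rule: P(problem | flag) = P(flag | problem) * P(problem) / P(flag)
# All numbers below are hypothetical, chosen only to illustrate the math.

p_problem = 0.001           # prior: 1 in 1,000 jumps has a real gear problem
p_flag_if_problem = 0.99    # an (imaginary) check catches 99% of real problems
p_flag_if_fine = 0.05       # ...but false-alarms on 5% of perfectly fine rigs

# Total probability the check raises a flag, problem or not
p_flag = p_flag_if_problem * p_problem + p_flag_if_fine * (1 - p_problem)

# Posterior: how likely a flag means a real problem
p_problem_if_flag = p_flag_if_problem * p_problem / p_flag
print(f"P(problem | flag) = {p_problem_if_flag:.3f}")  # ~0.019 -- about 2%
```

Most guts read "99% accurate check" and conclude that a flag all but guarantees a problem; the arithmetic says the flag is a false alarm roughly 98% of the time, because real problems are rare to begin with. That gap between gut and math is the seizing-up in action.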

We all hold carefully constructed illusions that comfortably surround the ideas of certainty, responsibility and safety. Learning about the structures we build out of those illusions forces us to open up to the idea that we don’t know jack. And man, we don’t like to do that.

As a culture, we’re also tripped up by the fact that statistical thinking isn’t a cornerstone of our educational system. We’re friggin’ terrible at juggling statistics, in fact. (Esteemed mathematician John Allen Paulos calls this phenomenon “innumeracy”: literally, an illiteracy of numbers.)

With that in mind, co-nerd along with me on this path to understanding how we stumble around in our own heads. Clearly, a library of cognitive phenomena relate to our beloved sport; I’ve picked my favorite three biggies to start the conversation.

#1: The Dunning-Kruger Effect

If you walk away with only one new tool in your cognitive toolkit, let that be the Dunning-Kruger Effect. The Dunning-Kruger effect is a relatively new discovery, as these things go. It was set out in experiments run by David Dunning and Justin Kruger at Cornell in 1999.

The Dunning-Kruger Effect is a cognitive bias (defined as a systematic deviation from rational thinking) that prevents people from being able to know with any accuracy how skilled and/or informed they are. Basically, the research found that incompetent people overestimated (sometimes, hilariously vastly) their own abilities. Skilled people, however, tended to downplay their competency.

What repercussions does this have for you as a skydiver? Holy dachshund puppy in a hot dog bun. SO MANY. It only takes a little extra listening to recognize it in the people around you--and yourself. Our sport is--just like the rest of the world--chock-full of the “confident incompetent”: those who lack the metacognitive ability to recognize that they (at least for now) suck.

What’s a smart skydiver to do? Firstly and most importantly: Underestimate your abilities and the abilities of the jumpers around you. Get professional coaching to uplevel; that’ll come with a bonus of an outside perspective on your actual skills. Then make it a project to find truly competent fun jumpers to enrich your educational environment. Remember: People who know nothing are far more likely to make themselves be heard than people who know a lot. The fun jumpers who actually know a lot will likely be much quieter, so it’ll take more work to find them.

#2: The Stress-Influence Tendency

Every human makes decisions under varying intensities of stress; far fewer regularly and intentionally make life-and-death decisions under stress. Of that smaller population segment, a pretty large fraction does it for a living. For fun? Yeah. We’re a weird crowd. We play stress games. We need to play them as consciously as possible.

The Stress-Influence Tendency pops up when the stakes are high and there isn’t enough information (or cognitive resource) on hand to reasonably guarantee a good choice. Here, it’s the pressure that matters. High-pressure environments dramatically change human decision-making strategies. You might think that your decision-making under pressure is solid, but you’d probably be wrong: Studies that compare outcomes often show vast differences in decision-making quality between high-pressure and low-pressure environments.

Here’s why.

According to psychologist (and Nobel Prize laureate besides) Daniel Kahneman*, we humanbeans have two routes to the endgame of a decision: the fast route, labeled System 1**, and the slow route, labeled System 2***. System 1 is snappy and pretty much automatic, kicking in to respond to an external stimulus. System 1 can be the result of genetic hard-wiring (Eek! A rat! Climb up the bookcase!) or long-term, hard-practiced skillbuilding (Eek! A rogue toddler in the LZ! Braked turn!). These responses have a tendency to feel involuntary.

System 2, on the other hand, has to do a bunch of library research and take up the whole damn operating system to do its work. System 2 puts together a spreadsheet and a PowerPoint presentation of the pros and cons associated with each option. It’s the farthest thing from involuntary, but it can flexibly check, modify, and override the decisions from System 1, if given the chance. Ideally, System 1 sits down with System 2 and offers a solution, and System 2 either vetoes that judgement call or gives it the blessing of reasoning.

An overdose of stress, however, pulls that chance right out from under System 2. It diverts all the cognitive resources that System 2 needs for its ponderous function, subbing in instinct for conscious reasoning. In lots of cases, that works out just fine. The problem is that System 1 is simple. It’s a habit memory system. It’s rigid; it only has a hammer, so everything looks like a nail. System 2 can bring the rest of the toolkit to bear on the problem, but only if it has the chance to get there.
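If it helps to see that architecture laid out, here’s a loose sketch in Python. It’s an analogy only; the function names, the stress threshold, and the toy scoring are my own invention, not anything from Kahneman:

```python
# A loose analogy for the two-route architecture. Everything here
# (names, threshold, scoring) is illustrative, not a real cognitive model.

HABIT_RESPONSES = {"rogue toddler in the LZ": "braked turn"}  # System 1 lookup
STRESS_BUDGET = 0.7   # invented threshold: past this, System 2 gets starved out

def system_1(stimulus: str) -> str:
    """Fast route: near-automatic pattern match against trained responses."""
    return HABIT_RESPONSES.get(stimulus, "startle / default response")

def system_2(stimulus: str, proposal: str) -> str:
    """Slow route: costly deliberation that can veto or bless System 1."""
    options = [proposal, "go around", "land out", "flare and PLF"]
    scores = {opt: len(opt) for opt in options}  # placeholder for real reasoning
    return max(scores, key=scores.get)

def decide(stimulus: str, stress: float) -> str:
    proposal = system_1(stimulus)        # System 1 always answers first
    if stress > STRESS_BUDGET:           # overload: System 2 never gets a turn
        return proposal                  # the hammer swings, nail or not
    return system_2(stimulus, proposal)  # otherwise System 2 reviews the call
```

The point of the sketch is the gate in decide(): past the stress budget, the slow route simply never runs, which is the whole argument for limiting your variables.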

The only way to reliably get System 2 into the room is to reduce the amount of stress you’re under. Strive to limit your variables. Example: Doing your FWJC (first wingsuit jump course)? Awesome. Doing your FWJC at a new dropzone in suboptimal conditions? You just locked yourself in with System 1 and chucked the key out the window. If you need a more reasoned solution for a problem than the one System 1 throws out first, you’ll be out of luck.

Another thing: When you’re learning new skills that overlap with skills you’ve trained deeply, be mindful that your System 1 responses are going to overwhelmingly favor what you’ve trained. (This is why swoopers have a tendency to over-toggle paragliders, and why multi-thousand-jump skydivers sometimes panic-pull off of big-wall BASE exits in the opening phase of their low-speed belly careers.)

#3: The Availability Heuristic (a.k.a. The Availability-Misweighing Tendency)

 


Because our decision-making process has its roots in the systems that sent us scrambling for food and running from better-physically-adapted beasts, those systems are built for immediacy. They’re designed to make quick assumptions and finalize a decision on those scant criteria while consuming as few resources as possible in the process. The system works like a search engine, and we’re only ever really interested in the top three results. In precisely the same way a search engine ranks its listings, the system ranks the stuff that pops up according to the number of times it has been accessed. “Availability” is analogous to top-ranking search position. Top ranking alone creates the illusion of truth and reliability. It’s easy to forget that simple repetition got it there.
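As a loose illustration of that search-engine behavior, here’s a tiny Python sketch in which recall rank is driven by nothing but access count; every “memory” and number in it is invented:

```python
from collections import Counter

# Recall ranked purely by access count -- repetition, not truth or importance.
# All events and counts below are invented for illustration.
recall = Counter({
    "friend's hard-opening story, retold constantly": 40,
    "viral canopy-collision video": 25,
    "gear ad seen every month": 18,
    "forced landing that never made the news": 1,
})

def what_comes_to_mind(memory: Counter, k: int = 3) -> list[str]:
    """Return the k most-accessed memories -- the only 'page one' we read."""
    return [event for event, _ in memory.most_common(k)]

print(what_comes_to_mind(recall))
# The rarely rehearsed item never surfaces, however much it matters.
```

Nothing in that ranking knows whether an event is true, important, or even recent; access count alone decides what makes page one.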

Beliefs, in kind, propagate by repetition, and our sport is no exception. Examples abound. A nail-biting number of skydivers (and aircraft operators, besides) remain super-casual about seatbelts in the jump plane because it’s pretty rare that a forced landing makes the news, which keeps forced landings cognitively unavailable when the decision to buckle up comes around. Make no mistake: There are plenty of forced landings goin’ on. In another example, the prevalence of a certain brand of gear on your dropzone (or in the advertising you consume) will make it significantly more “available,” driving your purchase decision more than you realize. And another: A regional community’s oft-repeated mantras (Ex.: “If we didn’t jump in clouds, we’d never jump!”) are most certainly available due to repetition, not because they’re the unaltered truth.

Thinking outside of availability takes work. It takes curiosity. Often, it takes the willingness to say or do something out of lockstep. How often do you visit the second page of the search results? Maybe it’s time to start.

As I said before: There are loads of these cognitive heuristics that, in one way or another, bring their kerfuffle to bear on your skydiving (and, of course, your life at large). Most of these biases complicate the problem by tending to overlap and interweave, creating a series of false bottoms and fake doors in your thinking. Learning to recognize them is a good first step; the rest of the demolition work is up to you.

 

* Daniel Kahneman, Thinking, Fast and Slow

** “Intuition” in the “stress-induced deliberation-to-intuition” (SIDI) model

*** “Deliberation” in the SIDI model


About The Author

Annette O'Neil is a copywriter, travel journalist and commercial producer who sometimes pretends to live in Salt Lake City. When she's not messing around with her prodigious nylon collection, she's hurtling through the canyons on her Ninja, flopping around on a yoga mat or baking vegan cupcakes.


User Feedback


charliemike
Thanks Annette, perfect supplement to your article on Normalization of Deviance. Really appreciate these articles as they not only apply to this sport but many other areas involving critical decision making.

heavision
More like this please. Fucking kick ass.

heydon1960
Admitting what I don't know, and hoping it isn't so simple I should be able to reason it through: what is FWJC?

JohnMitchell
The Dunning-Kruger Effect is huge in our sport, and in life in general. Learn to recognize when you don't even know what you don't know. You'll live much longer.

argann
Well done. As mentioned before, DKE is a huge oversight many people have, and it is even more prevalent in our sport. Also, Kahneman's work is spot on as well. In more general ways, you could even bring in some of the work cited by Gladwell (like the 10,000-hour rule). If you want to read some work by Brymer, it might be a good read for you. He studies a lot of risk seekers, but his work is more qualitative. I myself am doing some research on the personality of risk seekers.
Great stuff, lots of value here for our community.

DCjumper
This is an excellent article and fair treatment of cog bias... which differs from motivated bias... in which you have a fixed opinion of things (you’re a safe, solid swooper, for example) and you tell yourself and others in order to preserve the vision you have of yourself (ego, imposter syndrome, fear). Cog bias is unmotivated, natural, and sinister in this respect because it’s operating ‘under’ the system. Very difficult to change... it begins by being self-aware. Admitting possibility, collecting up data, being open to suggestion (all Kahneman’s System 2). Again, excellent topic


