Foundational Concepts: Humility

At the heart of everything we are going to be discussing on this site is the idea of adaptability. Every one of us, as human beings, comes into the world with limits. None of us arrives having all knowledge or all skill in any domain.

One of the things I look for when I study the diverse topics that catch my interest is common themes or principles. When people from varying backgrounds are all saying the same thing (sometimes using different terminology, but speaking of the same concepts), that can be a good indication that the principle is important.

Something that has come up repeatedly in almost every subject I have studied in any depth is the importance of what I am calling here humility. In survival psychology[1], in critical incident / active shooter training[2], in the study of learning, expertise and its development, expert decision making, “flow state” and performance[3], the theme is clear. In order to learn, to thrive, to survive, it helps to be humble.

These examples are relevant to personal protection and general learning or skill building, but this idea really applies to anything you want to get good at.

In the accounts of those who study survival in extreme and unexpected situations like natural disasters or active shooter events, we see that humility is one of the most important traits possessed by those who live through them. Humility is the ability to rapidly revise your expectations when the information your senses are giving you conflicts with your current mental models or preconceptions (see “Taking the Red Pill”). We all have cognitive biases.

Cognitive biases can be generally described as systematic, universally occurring, tendencies, inclinations, or dispositions in human decision making that may make it vulnerable for inaccurate, suboptimal, or wrong outcomes.[4]

All of us are tempted to think that what we believe in any given moment is an accurate reflection of objective reality. Our decisions are based on our experiences, and while it is fair to say our assumptions have some degree of accuracy, it is uncomfortable to realize that even our most accurate models are flawed to some degree. Only some of us ever do. When information from our senses is inconsistent with our view of the situation, we have a choice. We can rationalize and try to make the information fit our existing models, or we can revise our models based on the new information. The capacity to admit that your assumptions are inaccurate and to rapidly revise your view is vital to surviving one of these incidents, when the stakes are high and the rapidly evolving situation leaves a thin margin for survival.

Humility is also the key to becoming knowledgeable or skilled in any endeavor. Right now, there is a destructive cultural trend that leads to a fear of “failure.” This fear of making mistakes produces rigid mental models and an inability to learn or grow, because the person avoids the very kinds of experiences they need in order to grow. We saw this in the post on Mindset.

I came up with a “formula” that helps describe this concept.

Skill / Wisdom = Information + Experience

For our purposes here, we can call skill and wisdom opposite sides of the same coin. Skill will represent the physical application and wisdom will represent mental application of principles in any area of interest or expertise.

The development of skill or wisdom comes from a combination of information (specific accurate knowledge of a subject) and experience (systematic ongoing testing of that knowledge against reality so that it is no longer merely theoretical and the understanding is deeper).

If you see two ditches, there is probably a road in the middle. – Dr. Craig Johnson

Ditch #1 – False confidence arising from information / knowledge without any actual experience.

If someone has only information (they read about it or watched it on YouTube, etc.), the danger of misinterpretation comes from a lack of understanding through experience. All kinds of theories can sound great until they are actually tested. Someone can sound smart and knowledgeable because they possess great theoretical information, but without experience the expertise is just an illusion that will likely collapse at the first encounter with objective reality. It is also important to understand that expertise in one domain does not necessarily carry over to another. Just because someone has a PhD in astrophysics does not mean they know the first thing about climate, economics, personal protection, music…

Ditch #2 – Misinterpretation of experience due to lack of accurate information.

For someone who has an experience in an area but lacks the specific accurate knowledge to help them interpret it, it is very easy to draw inaccurate conclusions. Several variations of the causal fallacy* are examples of this. We sometimes experience something and assume we understand why things happened the way they did. In reality, our interpretation has missed the true cause completely. The danger is that we do not really understand how to avoid a similar result in the future, though we think we do. This creates a false sense of confidence that can be disastrous.

*The causal fallacy is the logical fallacy of incorrectly making a conclusion about an event’s cause. The causal fallacy is actually a category of fallacies rather than one specific line of faulty reasoning. All of the fallacies that fit into this category are characterized by one thing: the illogical assumption that a specific factor caused a specific effect.[5]

Someone who follows less than a car length behind the vehicle in front of them at freeway speeds may feel that they are a totally safe and skilled driver because they have not been in an accident. They believe their experience is evidence of this, when the reality is that they would not have time to perceive a problem and react to it at those speeds and following distances. This person believes the lack of negative consequences points to skill and acceptable driving habits. In reality, all it truly indicates is how fortunate they have been not to have encountered a hazardous situation. This person is totally unaware that their safety on the road is utterly dependent on the well-designed highway system, the skill and awareness of the drivers around them, and the blind chance that they never encounter an unexpected hazard.
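A rough back-of-the-envelope sketch makes the point concrete. The figures below are illustrative assumptions (65 mph freeway speed, a one-car-length gap of about 15 feet, and a commonly cited perception-reaction time of roughly 1.5 seconds), not measured values:

```python
# Back-of-the-envelope: time available vs. time needed to react.
# All figures are illustrative assumptions, not measured values.

speed_mph = 65
gap_feet = 15            # roughly one car length
reaction_time_s = 1.5    # commonly cited perception-reaction time

speed_fps = speed_mph * 5280 / 3600   # convert mph to feet per second
time_available_s = gap_feet / speed_fps

print(f"Speed: {speed_fps:.1f} ft/s")
print(f"Time to cover the gap: {time_available_s:.2f} s")
print(f"Typical perception-reaction time: {reaction_time_s} s")
```

At 65 mph the car covers roughly 95 feet every second, so a 15-foot gap is gone in well under a quarter of a second, an order of magnitude less than the time needed just to perceive a hazard, before braking even begins.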

Even in the event of an accident like the one in the example we just gave, someone may not be open to the lesson. We are sometimes tragically good at making excuses and rationalizations to protect our egos rather than learning from hard lessons.

The theory of ‘confirmation bias’ (Klayman and Ha 1989) suggests that people seek information that fits their current understanding of the world. Incoming information may reinforce existing mental models or may be rejected outright.[6]

Humility is the key to learning, growing, and adaptability. Those who seek excellence have made a habit of seeking out the current limiting factors to their performance and addressing them. As they push themselves in practice in the areas they have identified as deficient (learning from their mistakes and gaining experience interpreted through the lens of accurate relevant information), they change their “system”. Once the issue has been improved, the student evaluates the new performance threshold and determines what the new limiting factor or factors are. These are the focus for the next training cycle. It is the opposite of avoiding mistakes or “failure”. These experiences are specifically sought out in an effort to continue improving.

We will be talking about all of this in much more detail in this blog and on this site as we go along.

[1] Some sources: Deep Survival – Laurence Gonzales; The Survivor Personality – Al Siebert

[2] Taught at ITTS by Scott Reitz: https://internationaltactical.com/

[3] Some sources: The Cambridge Handbook of Expertise and Expert Performance – Edited by K. Anders Ericsson, Neil Charness, Paul J. Feltovich, Robert R. Hoffman; Peak: Secrets from the New Science of Expertise – Anders Ericsson, Robert Pool; Sources of Power: How People Make Decisions – Gary Klein; Flow: The Psychology of Optimal Experience – Mihaly Csikszentmihalyi; The Gift of Fear – Gavin de Becker; Mindset – Carol S. Dweck; Building the Elite: The Complete Guide to Building Resilient Special Operators – Jonathan Pope, Craig Weller

[4] (e.g., Tversky and Kahneman, 1974; Kahneman, 2011; Korteling and Toet, 2022). Quoted from “Cognitive bias and how to improve sustainable decision making” – Johan E. (Hans) Korteling, Geerte L. Paradies, and Josephine P. Sassen-van Meer (section 1.1)

[5] From the Grammarly blog, Lindsay Kramer (https://www.grammarly.com/blog/causal-fallacy/#:~:text=The%20causal%20fallacy%20is%20the%20logical%20fallacy%20of%20incorrectly%20concluding,event%20and%20its%20supposed%20cause. )

[6] Mental Models: An Interdisciplinary Synthesis of Theory and Methods – Natalie A. Jones, Helen Ross, Timothy Lynam, Pascal Perez, Anne Leitch. Ecology and Society, Vol. 16, No. 1 (Mar 2011). https://www.jstor.org/stable/26268859

May 2, 2024