5 Cognitive Biases Impeding Your Thinking

Infinite Grey
5 min read · Dec 4, 2019


Logical fallacies and cognitive biases are embedded into every facet of our daily lives, yet most of us are unaware of their pervasiveness or consequential effects. Equipping yourself with this knowledge can be very empowering and a source of a ‘competitive edge’, both personally and professionally.

This is an original, introductory article I wrote on the topic in my spare time — hopefully the first of many. Feel free to share this with any of your friends or work colleagues you think may be succumbing to these human blind-spots!

Introduction

Following on from my article on logical fallacies, I will now examine five cognitive biases that I find to be both interesting and common to the human psyche.

Human beings act far less rationally than we would like to think. The Enlightenment may have been an intellectual revolution, but it could never alter the basic biological and psychological biases that have plagued us for millennia.

Cognitive Biases

1. Confirmation Bias

This cognitive bias is probably the most familiar to readers, and it is also one of the most challenging to overcome.

Confirmation bias is defined as the tendency to search for, interpret, favour, and recall information in a way that affirms one’s prior beliefs or hypotheses. It is those Google search suggestions as you type that prompt you to find the one article that appears to confirm your worst fears, or your most wishful thinking. Rather than base our view on a single piece of evidence, we ought to consider the meta-conclusions across time and sources.

Topics that invoke emotive responses — namely politics, food and religion — tend to be most afflicted by people’s confirmatory biases. The vegan thinks that they are just as correct as the keto proponent; the atheist as the theologian; the Republican as the Democrat. Each has ‘evidence’ to back up their point of view.

It is unrealistic to assume that we can approach a topic with a ‘blank slate’, a mind completely free from prior hypotheses. The key is to be open-minded and willing to alter our point of view when compelling evidence is presented that contradicts a prior belief. Perhaps even seek out contradictory information to formulate a more nuanced and rounded view. Break out of those ‘echo chambers’.

2. Framing Bias

If you were told that there was a 98% chance that a routine medical operation would be a success, would you consent to this procedure?

If you were told that there was a 2% chance that a routine medical operation would lead to death or serious injury, would you consent to this procedure?

In this hypothetical scenario, the procedure itself has not changed. The framing of the risks and rewards, however, has changed.

The majority of people would likely consent to the procedure when it is framed in terms of its upside, and refuse it when the framing emphasises the potential downside.

The way in which we frame an argument can have a significant effect on whether it is accepted, particularly when leveraged in interpersonal negotiations and interactions.

This idea is closely aligned with the anchoring bias, whereby people use the initial piece of information they receive as an ‘anchor’ against which they evaluate any subsequent information.

3. Dunning-Kruger Effect

“Wisdom is knowing that I know nothing” — Socrates

This is a cognitive bias in which people of low competence generally over-estimate their cognitive ability or knowledge — or at least are unable to identify their own shortcomings — while people of high competence generally under-estimate their cognitive ability or knowledge.

Anyone who has done a deep dive into a new topic comes away with a greater appreciation for just how little they actually know about that area. It’s a strange, inverse phenomenon: learning and knowledge acquisition tend to induce greater humility.

This is partially explained by the ‘four stages of competence learning model’, where one moves from unconscious incompetence > conscious incompetence > conscious competence > unconscious competence. Those suffering from this cognitive bias tend to be stuck in the first phase, whereas those with greater self-awareness tend to occupy one of the three subsequent stages of the model.

That is why they say that ignorance breeds confidence.

4. Availability Bias

This is a cognitive bias whereby we over-rely on information that immediately or readily comes to mind, which clouds our judgement and our ability to accurately evaluate the risk or likelihood of a situation.

The availability of information that most easily comes to our mind is determined by the sources of information that we habitually expose ourselves to. The news media’s business model relies on advertising revenue, which is dictated by the quantity of clicks and attention garnered. This incentivises media to focus on the most shocking 0.1% of information (known as salience bias) and to prioritise negative information over positive information (known as negativity bias). The result is that we have a distorted view of how dangerous and doomed our world is.

This is merely an illustrative example within one domain of our lives. It shows that the availability of information is not reflective of its frequency or relevance, and how important it is to be extremely selective about the kinds of information that we expose ourselves to.

5. Bias Blind Spot

This cognitive bias has been included slightly tongue-in-cheek. The bias blind spot is where we recognise the impact of biases on the judgement of others, while failing to see the impact of biases on our own judgement.

The first step to identifying one’s own cognitive biases is conscious awareness (see #3). That is the intended purpose of this article. It may be easier to spot cognitive biases in other people, at least initially, but it can be good fun to begin to turn the gaze inwards and notice where you too may be falling prey to them.

I feel that this concept is closely aligned with Soren Kierkegaard’s observation that “most people are subjective towards themselves and objective towards all others, frightfully objective sometimes — but the task is precisely to be objective toward oneself and subjective toward all others”. We have natural blind-spots that make it difficult to fully admit to our own flaws and imperfections because doing so can be psychologically uncomfortable. It is easier to critique other people or the world around us than it is to apply the same objective, stringent framework to our own behaviour.

Applying empathy and non-judgement towards other people, and to oneself, is a winning strategy.

Summary

There are real-world consequences when cognitive biases go unchecked. This can happen at both the micro and the macro level.

Identifying these biases is the first step (moving from unconscious unawareness to conscious awareness). Disciplining oneself to avoid engaging in these biases is the second step. Challenging biases that go unchecked is the third step, especially when they have tangible (and potentially negative) consequences.
