
The Curious Case of Cognitive Biases – Human Condition to $c#*w up?

In his classic talk “The Psychology of Human Misjudgment”, given to a Harvard University audience in 1995, Charlie Munger, the legendary business partner of Warren Buffett at Berkshire Hathaway, describes the terrible ignorance and patterned, extreme irrationality he observed in himself and others soon after he graduated from Harvard Law School. He adds that he made careful notes, mostly from observation and some from reading, and used these as principles and tools for his life and work. Broadly, what Charlie Munger calls human misjudgments are now categorized as cognitive biases in behavioral economics and psychology.

Cognitive Biases

A cognitive bias is a systematic pattern of deviation from objective thinking or rationality. Individuals perceive and filter inputs through their own “subjective reality” rather than taking in the objective input. This has a profound effect on our thinking, understanding, actions, and behavior. Cognitive biases can therefore lead to inaccurate judgment, illogical interpretation, and poor decisions. In this blog, I would like to touch upon four common biases that affect our everyday work and personal lives.

Sunk cost fallacy – The sunk cost fallacy is the general tendency to continue pursuing something once we have invested time, effort, resources, or money in it, even when we realize midway that we are in a hole and things aren’t likely to get better. It leads us to make investments based not on future value but on what we have already invested. The thought process is: “we can’t quit now; we are too invested in it.”

This bias is pervasive in all walks of life – personal commitments like romantic relationships, marriages, careers, and educational courses are as prone to it as professional areas like projects and businesses. The longer we have been invested in something, the harder it is to break off, sometimes with disastrous consequences.

We see this bias quite a lot in choices and investments around product features and functionality. Think of a product or project where we have already invested significantly but are not getting adequate returns, or where the project is sinking. In most cases we tend to pour in more effort and resources to revive it even though we see no clear, objective signs of a return. This often happens at the cost of investing in newer areas.

This is also quite common with technology choices. Once we pick a technology stack, we tend to stick with it even after we learn about its limitations and about better available options, simply because of the effort and cost already sunk into training and development.
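To make the remedy concrete, here is a minimal sketch in Python of sunk-cost-free decision making. All the figures are hypothetical; the point is only that options are compared on expected future value, and the amount already spent never enters the calculation.

```python
# A minimal sketch of sunk-cost-free decision making: compare options
# only on expected FUTURE value, never on what was already spent.
# All numbers are hypothetical, for illustration only.

def expected_future_value(future_benefit: float, future_cost: float) -> float:
    """Value of an option looking forward; past spend is deliberately absent."""
    return future_benefit - future_cost

# Option A: keep funding the struggling project
# (the $500k already spent is deliberately irrelevant)
continue_project = expected_future_value(future_benefit=200_000, future_cost=300_000)

# Option B: redirect the team to a new initiative
new_initiative = expected_future_value(future_benefit=400_000, future_cost=250_000)

best = max(
    [("continue current project", continue_project),
     ("start new initiative", new_initiative)],
    key=lambda option: option[1],
)
print(f"Rational choice: {best[0]} (expected future value: ${best[1]:,})")
```

The point is not the arithmetic but the discipline: what has already been spent appears nowhere in the comparison.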

Substitution Bias – In this bias, we substitute a complex question with a simpler one. As Daniel Kahneman puts it: “When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.” We crave neat, easy explanations and struggle with complex, often unexplained situations. Our minds look for black-and-white answers and find it very difficult to deal with the grey.

An example of a hard question may be – “How happy are we with our life these days?”. Our minds may substitute this complex question with a much easier question – “How is my mood right now?”

This is a fairly common problem at work, and we see it often when projects fail. We assign reasons that are easy to point out rather than really digging deeper. We blame the usual suspects – lack of resources, skills, requirement understanding, and so on – and perhaps never dig deep enough to understand the actual causes. Projects are complex, and finding the real reasons for their failure is equally complex. The same is true for product releases, new features, and so forth.

Confirmation Bias – This is another common and powerful bias in our lives. It is the tendency to be selective in looking for information that confirms our position, belief, hypothesis, or assumption – a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. It makes us blind to evidence that contradicts those beliefs or hypotheses. This is perhaps why context is so important in everything we think and do.

This bias takes many forms, such as sample bias and results bias in statistical experiments. In teams and projects it can manifest in multiple ways. For example, if a stakeholder hears and believes unwarranted negative things about a project and forms preconceptions before hearing a single project briefing, then by the time the project is formally presented he will most likely see only what he already believes and not give it a fair evaluation – all the while unaware that he is already predisposed to be dissatisfied with the project’s progress.
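To see why a biased sample misleads, here is a small, purely hypothetical Python simulation: an observer who mostly collects the reports that confirm a negative belief ends up with a very different picture of the project than an observer who samples at random.

```python
# A hypothetical simulation of statistical sample bias, one form of
# confirmation bias: collecting mostly the reports that confirm a belief
# ("the project is in trouble") badly skews the estimate of the true state
# compared with an unbiased random sample.
import random

random.seed(42)

# True state: 70% of status reports are actually positive.
reports = [1] * 700 + [0] * 300  # 1 = positive, 0 = negative
random.shuffle(reports)

# Unbiased observer: reads 50 reports chosen at random.
random_sample = random.sample(reports, 50)

# Biased observer: mostly notices reports that confirm the negative belief.
negative_reports = [r for r in reports if r == 0]
positive_reports = [r for r in reports if r == 1]
biased_sample = random.sample(negative_reports, 40) + random.sample(positive_reports, 10)

print(f"True share of positive reports: {sum(reports) / len(reports):.0%}")
print(f"Random sample estimate:         {sum(random_sample) / len(random_sample):.0%}")
print(f"Confirmation-biased estimate:   {sum(biased_sample) / len(biased_sample):.0%}")
```

Both observers read the same number of reports; only the selection rule differs, and that alone is enough to flip the conclusion.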

Image courtesy: JamesClear.com

A large part of project management is people management, and this bias also manifests in managing teams and emotions – if a person has low self-esteem and is highly sensitive to being ignored by other team members or by the manager, they will constantly look for signs and behaviors that confirm exactly that.

Another variant of this bias is the “law of the instrument”, which warns that no single method, tool, or technique is appropriate for all problems. It is commonly summed up as “if our tool of choice is a hammer, then every problem starts to look like a nail.”

Planning fallacy – This one is common among us managers. It is the phenomenon in which estimates of how much time a future task will need display an optimism bias and understate the time required. It occurs even when the individual knows that past tasks of a similar nature took longer than planned. We routinely underestimate the effort and time it takes to complete tasks and projects. A close cousin is the optimism bias: the tendency to be overly optimistic about planned outcomes rather than accepting that things can go wrong and that risk is part of every project.
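One well-known antidote is reference class forecasting, which takes the “outside view”: scale the gut-feel estimate by how much similar past tasks actually overran. Below is a minimal Python sketch; the overrun ratios are hypothetical.

```python
# A minimal sketch of reference class forecasting, an antidote to the
# planning fallacy: instead of trusting the inside-view estimate, scale
# it by how similar past tasks actually turned out. Hypothetical data.

naive_estimate_days = 20  # the team's gut-feel ("inside view") estimate

# Actual-vs-estimated ratios from comparable past tasks (the "reference class").
past_overruns = [1.4, 1.8, 1.1, 2.0, 1.5]  # e.g. 1.4 = took 40% longer than planned

typical_overrun = sorted(past_overruns)[len(past_overruns) // 2]  # median ratio
adjusted_estimate = naive_estimate_days * typical_overrun

print(f"Inside-view estimate: {naive_estimate_days} days")
print(f"Median historical overrun: {typical_overrun:.0%} of plan")
print(f"Outside-view estimate: {adjusted_estimate:.0f} days")
```

The median is used rather than the mean so that one catastrophic past overrun does not dominate the correction.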

Why do these biases happen? Cognitive biases arise as an adaptation to conserve our limited mental energy. Deep thinking and reasoning are exhausting; the brain already accounts for roughly 20% of the body’s energy consumption. Our default thinking is therefore intuitive, sloppy, quick, and energy-cheap. It allows us to make faster decisions and absorb large amounts of information, and it is a very good adaptive mechanism for conserving our limited energy. In the complex world we navigate today, however, it often causes expensive errors. Our default laziness makes it hard to put in the effort and time required for our reasoning system to monitor and correct the intuitive system and carry out a proper, “unbiased” reasoning process.

Based on this, here are a few things I can think of to mitigate cognitive biases:

Be still and quiet: A state in which we are alert yet still and quiet creates the optimal conditions to pay attention, focus, and catch a bias in the act. Many of these biases are second nature to us; they are difficult even to identify and usually slip away unnoticed.

Lower the ego: A lot of these biases are hardwired and part of who we are. Acknowledging them requires that we lower our ego and allow ourselves to be vulnerable and human. This lets us observe ourselves, and take feedback from others, without being defensive and with an open and curious mindset. It may mean simply having the humility to invite the possibility of being wrong.

Be aware: Simply being aware of how our cognitive processes work, and of the pitfalls we are prone to, allows us to consciously watch out for these traps. Metacognition – consciously thinking about our own thought processes – has been advocated as an antidote to cognitive bias. Journaling and writing down our thoughts can force us to think, declutter, and give ourselves a second chance.

Ask “could it be something else?”: Forcing ourselves, or forming a habit, of asking this question is a way of double-checking our initial intuitive feel and getting our reasoning system involved in the cognitive process. It routinely comes into play in medical diagnosis, where getting a second opinion is common practice.

Formal teaching and role play of these skills as part of ongoing executive education: For much of our lives we fly blind on these biases. It makes sense to formally categorize and teach them in a company’s ongoing executive education programs; leadership and training departments can incorporate them into their curricula.

Make behavioral frameworks part of regular employee evaluation and appraisal: A formal framework for evaluating these biases, by our supervisors as well as our peers, can be helpful. As with many other things, these biases are easier to spot in others than in ourselves.

Avoid fatigue / maintain energy and robustness: Fatigue pushes us toward wrong decisions and behavior, and our susceptibility to cognitive bias rises when energy levels are low. It is best to be aware of this and consciously wait until our energy is back up before confronting a situation where we are likely to be biased.

In conclusion, it is fair to say that human behavior is prone to cognitive biases. They make us wrongly believe we know things we don’t, and they lead to poor decisions. Awareness of these biases, and some of the approaches above for mitigating them, can help us navigate the complex world of projects and business.

I would love to hear your views on this.
