This is the final part of a three-part article on Risk, Decision Making and Critical Thinking. In the first part I highlighted that, in order to make better decisions in all aspects of our lives, and especially in times of change and adversity, we need to understand three elements:
- The nature of risk
- The processes involved in decision making
- The part that unconscious psychological biases play in influencing our choices.
The first challenge we face is to understand the nature of risk, which has two essential components:
- The judgment of the likelihood or ‘probability’ of a given outcome
- The value or, as it’s known in psychology, the ‘utility’ placed on that outcome
In essence there are three hurdles to overcome in analysing and managing risk (a short worked example follows this list):
A) The probability of success/failure
B) The cost of failure (utility) if the event turns out badly
C) The benefits of success (utility) if the event turns out well
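To make these three hurdles concrete, here is a minimal sketch in Python of how probability and utility combine into a single 'expected utility' figure. The projects and numbers are purely illustrative assumptions, not data from any real decision:

```python
# Minimal sketch: combining probability and utility into a single expected
# utility figure. All numbers below are illustrative assumptions.

def expected_utility(p_success, benefit_of_success, cost_of_failure):
    """Weigh the upside and downside of a decision by their probabilities."""
    p_failure = 1 - p_success
    return p_success * benefit_of_success - p_failure * cost_of_failure

# Hypothetical comparison: a risky project with a 60% chance of success,
# a £200,000 benefit if it succeeds and a £150,000 cost if it fails,
# versus a safer, smaller project.
risky_project = expected_utility(0.60, 200_000, 150_000)  # 120,000 - 60,000 = 60,000
safe_project = expected_utility(0.95, 50_000, 10_000)     # 47,500 - 500 = 47,000

print(f"Risky project expected utility: £{risky_project:,.0f}")
print(f"Safe project expected utility:  £{safe_project:,.0f}")
```

Framed this way, the option that looks riskier can still carry the higher expected utility, which is why probability and utility always need to be judged together rather than in isolation.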
Part 1 concluded that there are two ways we can make decisions:
- Using rational, logical analysis
- Using intuition and smart rules of thumb (known in psychology as heuristics)
In Part 2 we looked more deeply at when each of these two sets of tools should be used in decision making:
- Known Risk – where risks are known, good decisions require logical and statistical thinking
- Unknown Risk (Uncertainty) – where some risks are unknown, good decisions also require intuition and smart rules of thumb
Most of the time a combination of both is needed.
Quick decision making and simple models are appropriate:
- In highly complex situations
- Where there are lots of ‘unknowns’
- Where there is a need to reduce complexity by focussing on key known factors
- Where ‘experts’ are making experience based decisions
- When using intuition (emotional intelligence) and heuristics (rules of thumb); see the sketch after these lists
- When knowing what data to ignore
- Where analysing all the data would compound the variance of many factors (the bias-variance problem)
Slow thinking and complex models are appropriate:
- Where there are fewer factors, i.e. relatively simple situations
- Where we can apply analytical, rational and statistical thinking
- Where there is a lot of data available
- When we can model the decision making process
- Where we can test for cognitive biases (thinking errors)
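To make 'smart rules of thumb' and 'knowing what data to ignore' more tangible, here is a minimal sketch of take-the-best, a well-documented heuristic from the fast-and-frugal heuristics literature: compare two options on the most reliable cue first and decide as soon as one cue separates them, ignoring everything else. The cue names and their ordering are illustrative assumptions:

```python
# Minimal sketch of the take-the-best heuristic: work through the cues in
# order of how reliable they have been in the past, and decide on the first
# cue that discriminates between the two options, ignoring all the rest.
# The cue names and their ordering are illustrative assumptions.

CUE_ORDER = ["existing_customer", "positive_reference", "large_budget"]

def take_the_best(option_a, option_b, cue_order=CUE_ORDER):
    """Return the option favoured by the first discriminating cue, or None."""
    for cue in cue_order:
        a, b = option_a.get(cue, False), option_b.get(cue, False)
        if a != b:                 # this cue separates the options: decide now
            return option_a if a else option_b
    return None                    # no cue discriminates: defer or guess

deal_a = {"name": "Deal A", "existing_customer": True, "large_budget": False}
deal_b = {"name": "Deal B", "existing_customer": False, "large_budget": True}

chosen = take_the_best(deal_a, deal_b)
print(chosen["name"] if chosen else "No cue discriminates")  # prints "Deal A"
```

The point is not that this particular rule suits any given decision, but that a good heuristic deliberately ignores most of the available data instead of trying to weigh all of it.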
In Part 3 of this article I will be looking at Critical Thinking, which is all about being aware of the unconscious psychological biases that sometimes lead us to make poor decisions.
Critical Thinking
Critical thinking is about being conscious of our own thought processes. If we are able to constantly challenge those processes, we are much less likely to fall into the trap of cognitive biases.
Cognitive biases colour our emotions, influence our thinking and can cause us to make poor decisions. There are myriad different cognitive biases, but here are a few of the more common ones.
Over-optimism – Almost all of us believe we're in the top 20% of the population for driving, pleasing a partner or managing a business. In the corporate world there is no shortage of organisations willing to spend colossal amounts of money acquiring another company, despite the fact that a study in Harvard Business Review estimated that between 70% and 90% of acquisitions failed to meet their stated objectives.
Principal-agent misalignment – the principal is the company and the agent is the individual. How often have we come across a salesperson who wants to pursue a risky deal because he'll hit his number and get paid his bonus, or a finance director who avoids taking a reasonable risk because she's fearful that if it goes wrong she'll get fired? Both will try to convince themselves they are making the decision in the best interests of the company; they are not, but the decision is in their own best interests.
Groupthink (sometimes referred to as sunflower management) – a charismatic leader demonstrates high levels of enthusiasm for a particular decision and more junior (but sometimes more qualified) individuals allow themselves to be swayed.
Loss aversion – or, to put it more simply, not wanting to feel that we've 'lost out'. There are two types of loss aversion:
- Losing value (utility): most people go with the all-inclusive deal on the basis that they don't want to feel they might miss out, even if it's actually more expensive. This applies to paying for more data than you'll ever use on your phone or paying over the odds for an all-inclusive holiday. It can also mean losing potential utility – being greedy just because everyone else is being greedy.
- Protecting the utility already invested: we find it very difficult to walk away from a 'sunk cost'. The stock exchange traders' maxim is 'your first loss is your best loss', yet many investors continue to hold on to poor investments in the hope that they will eventually regain their lost utility.
Recognition bias – this is where we infer that a recognised option is better than an unrecognised one. The old advertising slogan captured this perfectly: 'No-one ever got fired for choosing IBM'. This can also be linked to another bias…
Choosing the second-best option – a form of defensive decision making. For example, the well-known accountancy firm KPMG are going to charge my company £100,000 for an audit, whereas Smith and Jones, who seem like a perfectly reputable local firm of accountants, are only going to charge £50,000. But if I choose Smith and Jones and it goes wrong, I'll get the blame; if I choose KPMG and it all goes wrong, KPMG will get the blame.
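The audit example can be expressed in the same expected-utility terms. The fees come from the example above; the probability of the audit going wrong and the personal 'blame' cost are hypothetical assumptions, added only to illustrate why the individual's calculation can diverge from the company's:

```python
# Illustrative sketch of defensive decision making: the same choice looks
# different to the company and to the individual once personal blame is
# factored in. The probability and blame figures are hypothetical.

P_GOES_WRONG = 0.10      # assumed chance the audit goes badly
BLAME_COST = 600_000     # assumed personal/career cost of taking the blame

def company_cost(fee):
    """The company only sees the audit fee."""
    return fee

def agent_cost(fee, blamed_if_wrong):
    """The individual also weighs the chance of personally taking the blame."""
    return fee + (P_GOES_WRONG * BLAME_COST if blamed_if_wrong else 0)

# If KPMG fails, KPMG takes the blame; if Smith and Jones fail, I do.
print(f"Company view: KPMG £{company_cost(100_000):,}, "
      f"Smith and Jones £{company_cost(50_000):,}")
print(f"Agent's view: KPMG £{agent_cost(100_000, False):,.0f}, "
      f"Smith and Jones £{agent_cost(50_000, True):,.0f}")
```

On the company's numbers the local firm is clearly cheaper; once the individual adds the personal cost of taking the blame, the big-name choice looks 'safer', which is exactly the defensive, principal-agent pattern described above.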
Summary
In times of adversity and volatility we often see both individuals and corporations become more risk-averse. However, if we really understand risk we need not fall into the trap of being either overly optimistic or overly pessimistic. Probability and expected utility are the keys to understanding risk: by using some simple models to map the probability of success or failure against the cost of failure (negative utility) and the benefits of success (positive utility), we can make more sense of the risk.
The decision-making process has to take into account which elements require slow thinking (simple situations with fewer factors, more knowns and plenty of data to analyse, where we can build complex models) and which require fast thinking (complex situations with multiple factors, plenty of unknowns and insufficient data, where simple models fare best and expert intuition is needed to know which factors to ignore).
Critical thinking does what it says on the tin: be critical of your own thinking processes in order to surface (and therefore avoid) the many unconscious biases to which we are all potentially subject.
If we can master and apply the knowledge of risk, decision making and critical thinking, especially in times of uncertainty, not only can we avoid making costly errors, we can actually tap into windows of opportunity.
Summary Points
- Critical thinking is necessary to reduce poor decisions
- Critical thinking skills can be learned
- Balanced thinking is the key