Risk, Decision Making and Critical Thinking In Times of Change and Adversity Part 2

Introduction

In Rick’s last article, the first of three parts, he highlighted that in order to make better decisions in all aspects of our lives, and especially in times of change and adversity, we need to understand three elements:

  1. The nature of risk
  2. The processes involved in decision making
  3. The part that unconscious psychological biases play in influencing our choices.

Rick suggested that the first challenge we face is to understand the nature of risk which has two essential components:

  1. The judgment of the likelihood or ‘probability’ of a given outcome
  2. The value or, as it’s known in psychology, the ‘utility’ placed on that outcome

To put it more simply, there are three hurdles to overcome in analysing and managing risk:

A) The probability of success/failure

B) The impact of failure (utility) if the event turns out badly

C) The level of reward (utility) if the event turns out well

In this second part he identifies the key points in the decision making process.

Decision Making

It helps if we can analyse each of these three hurdles separately and then try to find ways to eliminate or manage the risk: decreasing the probability of failure, reducing the impact of failure, and increasing the level of reward.
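To make this concrete, here is a minimal sketch in Python of how the three levers interact. The probabilities and figures are invented purely for illustration:

```python
# Expected value of a risky decision: probability-weighted reward
# minus probability-weighted impact of failure.
def expected_value(p_success: float, reward: float, impact: float) -> float:
    return p_success * reward - (1 - p_success) * impact

baseline = expected_value(p_success=0.6, reward=100_000, impact=80_000)

# Lever A: decrease the probability of failure (raise p_success)
lever_a = expected_value(p_success=0.8, reward=100_000, impact=80_000)
# Lever B: reduce the impact of failure (e.g. insure part of the downside)
lever_b = expected_value(p_success=0.6, reward=100_000, impact=40_000)
# Lever C: increase the level of reward
lever_c = expected_value(p_success=0.6, reward=150_000, impact=80_000)

print(f"baseline: {baseline:,.0f}")  # 28,000
print(f"lever A:  {lever_a:,.0f}")   # 64,000
print(f"lever B:  {lever_b:,.0f}")   # 44,000
print(f"lever C:  {lever_c:,.0f}")   # 58,000
```

In this toy example every lever improves the expected value, and raising the probability of success improves it most; the point is simply that the three hurdles can be analysed, and managed, one at a time.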

We concluded at the end of Part 1 that there are two ways we can make decisions:

– Using rational, logical analysis

– Using intuition and smart rules of thumb (known in psychology as heuristics)

Corporations often use very complex processes to build risk models and make predictions, and the more complex the model, the more confidence they tend to place in it. Yet virtually every bank failed to predict the scale of the 2007/08 financial crisis, despite having some very complex risk models.

The reason for this failure is that humans appear to have a need for certainty, but certainty is an illusion. There are things we can know, but we must also be able to recognise when we cannot know something; and the first step to living with uncertainty is to understand the distinction between known risks and unknown risks.

Two sets of tools are required when making decisions:

  1. Known Risk – where risks are known, good decisions require logical and statistical thinking.
  2. Unknown Risk (Uncertainty) – where some risks are unknown, good decisions also require intuition and smart rules of thumb.

Most of the time a combination of both is needed.

 

Goldman Sachs’ CFO David Viniar reported that in 2007, when the financial crisis first hit the markets, Goldman Sachs’ risk models were taken totally by surprise by a series of unexpected ‘25-sigma events’. According to their value-at-risk models, a 3-sigma event should occur on roughly one day every two years; a 5-sigma event on one day since the last ice age; a 7- to 8-sigma event on just one day since the big bang. But the unthinkable, a 25-sigma event, occurred not on just one day but on several days during the financial crash of 2007!
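To see just how damning a ‘25-sigma day’ is, here is a short Python sketch. It assumes, as those value-at-risk models did, that daily returns are normally distributed, and takes roughly 252 trading days per year:

```python
import math

TRADING_DAYS_PER_YEAR = 252  # rough convention for equity markets

for k in (3, 5, 7, 25):
    # One-tailed probability of a move beyond k standard deviations: P(Z > k)
    p = 0.5 * math.erfc(k / math.sqrt(2))
    wait_years = 1 / (p * TRADING_DAYS_PER_YEAR)
    print(f"{k:>2}-sigma: daily probability {p:.2e}, "
          f"expected once every {wait_years:.2e} years")
```

Under these assumptions a 25-sigma day comes out at roughly once every 10^135 years, vastly longer than the age of the universe. Seeing several in one week tells you the model, not the world, is wrong.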

The problem was that the banks’ models used an improper risk measure: they wrongly assumed known risks in a world of uncertainty, and the precise numbers those models generated produced an illusion of certainty.

The banks were accused of operating like casinos; but as Mervyn King, the former governor of the Bank of England, noted, if they had been operating like casinos it would at least have been possible to calculate the risk!

In 1988 the first international regulation was created to define the amount of capital a bank needs to hold to ensure that it is unlikely to go bust. Known as Basel I, it was 30 pages long, and its calculations could be done with pen and paper.

In 2004 Basel I was criticised as too simplistic and was revised into Basel II, a version incorporating new, complex risk models, which ran to 347 pages.

Following Basel II’s failure to prevent the catastrophe of 2007/08, Basel III was subsequently created, running to 616 pages.

Yet in 2013, in an interview with Professor Gerd Gigerenzer, a psychologist and risk expert, Mervyn King was asked which simple rules would reduce the danger of another crisis. King answered almost immediately with one rule:

Don’t use leverage ratios above 10:1.

For example, insist on no less than a 10% deposit on a mortgage. Subsequent studies showed that leverage ratios could be used to predict which large banks would fail, something the complicated risk-based models of Basel II could not do. Canadian banks survived the credit crunch reasonably well because they were restrained by leverage ratios and had tougher lending requirements.
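King’s rule is simple enough to sketch in a few lines of Python; the balance-sheet figures below are invented purely for illustration:

```python
# Hypothetical balance sheets: (total assets, equity), in billions.
banks = {
    "Bank A": (500, 60),
    "Bank B": (800, 25),
    "Bank C": (300, 30),
}

MAX_LEVERAGE = 10.0  # King's rule: no leverage ratios above 10:1

for name, (assets, equity) in banks.items():
    leverage = assets / equity  # pounds of assets per pound of equity
    verdict = "within the rule" if leverage <= MAX_LEVERAGE else "FLAGGED"
    print(f"{name}: {leverage:.1f}:1 -> {verdict}")
```

One number per bank and no model to misestimate: that is exactly what makes the heuristic robust under uncertainty.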

So, when are simple decision-making models more effective than complex models and vice-versa?

Thinking, Fast and Slow was the title of the best-selling book by psychologist Daniel Kahneman. The message of the book was very clear: almost every time we try to come up with a quick answer (because we are inherently lazy), we get the answer wrong.

Kahneman provides plenty of evidence for why we should slow down our thinking processes, take longer to analyse problems and not rush to what appears to be the obvious answer. When it comes to thinking slowly Kahneman is absolutely correct; after all, he is one of the very few psychologists ever to have won a Nobel Prize (for economics, in 2002).

So slow, analytical thinking based on complex models is best.

Risk Savvy is an equally excellent book by Professor Gerd Gigerenzer. One of the key themes of the book is the value of making fast decisions using expert intuition, which he argues is often both quicker and more accurate than formal analysis.

When it comes to thinking fast Gigerenzer is absolutely correct; after all, he is currently director of the Center for Adaptive Behavior and Cognition (ABC) at the world-famous Max Planck Institute for Human Development.

So fast, intuitive thinking based on simple models is best.

But how can they both be right? Well, it all comes down to the type of situation, and which form of decision making it calls for:

Thinking fast and simple models are appropriate:

  • In highly complex situations
  • Where there are lots of ‘unknowns’
  • Where there is a need to reduce complexity by focussing on key known factors
  • Where ‘experts’ are making experience based decisions
  • When using intuition (emotional intelligence) and heuristics (rules of thumb)
  • When knowing what data to ignore
  • When trying to analyse all the data would lead to excess variance (errors compound as the model tries to track the noise in many factors)

A key element of being able to think fast is that it requires you to be an expert on the subject. So what does being an ‘expert’ mean? A widely cited definition is that becoming an expert requires 10,000 hours of incremental practice; in other words, 1,000 hours of the same experience repeated 10 times does not make someone an expert – the practice needs to become increasingly difficult. Practising the same few simple songs on the piano over and over again will not produce an expert pianist, even after 10,000 hours. But if the practice is varied and increasingly difficult, then someone who practises for 10,000 hours will eventually become an expert.

10,000 hours is the equivalent of 5 years of practice at 40 hours per week (40 hours × 50 weeks = 2,000 hours a year) – in other words, it’s a full-time job, but an increasingly varied and difficult one.

A great example of people using their expert intuition is the TV show Dragons’ Den. In the show five self-made business people are presented with a series of entrepreneurs, each asking the ‘dragons’ to invest many thousands of pounds in their fledgling businesses. On average the dragons have about 10 minutes to listen to the entrepreneur’s sales pitch and ask questions. They don’t have access to any business plans, profit-and-loss accounts or sales data that they could pore over for hours in order to decide whether to invest their cash – they have to use their intuition and smart rules of thumb: the heuristics built up over thousands of hours of building their own successful businesses.

One particular episode demonstrated how each of the dragons made their quick decisions. Two entrepreneurs came to the den with a business named ‘The Snaffling Pig’. It was the brand name for a range of interesting and innovative variations on the humble pork scratching snack.

After sampling the products, listening to the pitch and asking a few questions, each of the dragons made a pretty quick decision on whether to invest.

Deborah Meaden liked the entrepreneurs and thought the snacks were tasty, but she declined simply because they had quite a high fat content and she has a rule of not getting involved with companies supplying unhealthy food.

Touker Suleyman liked both the entrepreneurs and the idea but as a Muslim he could not be involved with a company supplying pork products.

Peter Jones liked the entrepreneurs and the product, but not the idea; he had previously had a similar experience of investing in a company that was rebranding a traditional food and felt that it would not scale because the big players would start to come up with their own versions.

Sarah Willingham liked the entrepreneurs, but she didn’t like the taste of the product. Her simple rule was that if she didn’t personally like something she wouldn’t invest.

Nick Jenkins, on the other hand, liked the entrepreneurs, loved the product and thought it was a great business idea; with no more information to hand, and despite the other dragons’ objections, he invested.

Each of the dragons used their own heuristics to make a quick decision with very limited information and a large number of unknowns; but because they were all experts, they were comfortable with the decisions they had made.

Thinking slowly and complex models are appropriate:

  • Where there are fewer factors i.e. relatively simple situations
  • Where we can apply analytical, rational and statistical thinking
  • Where there is a lot of data available
  • When we can model the decision making process
  • Where we can test for cognitive biases (thinking errors)

A great example of this is the modelling that goes into casinos, and in particular slot machines (known in UK betting shops as fixed odds betting terminals). Careful analysis of gamblers’ patterns of behaviour, coupled with the fact that it is relatively easy to work out the odds of winning combinations, leads to machines designed to keep punters playing for as long as possible and therefore to extract the maximum amount of money from them. The people who produce these machines deliberately use their knowledge of cognitive biases to keep punters playing.
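The ‘relatively easy to work out’ part can be shown with a toy example: given a machine’s reels and paytable, the long-run return to the player (RTP) is just a probability-weighted sum. The reel layout and payouts below are invented purely for illustration:

```python
from itertools import product

# One simplified 5-stop reel, used for all three reel positions.
REEL = ["cherry", "bell", "seven", "blank", "blank"]

# Payout per unit staked for each winning line (invented figures).
PAYOUTS = {("seven",) * 3: 80, ("bell",) * 3: 25, ("cherry",) * 3: 12}

spins = list(product(REEL, repeat=3))          # all 125 equally likely outcomes
total_paid = sum(PAYOUTS.get(spin, 0) for spin in spins)

rtp = total_paid / len(spins)                  # return to player per unit staked
print(f"RTP: {rtp:.1%}, house edge: {1 - rtp:.1%}")  # RTP: 93.6%, house edge: 6.4%
```

The designer’s problem is then the opposite of the punter’s: tune the paytable so the edge is large enough to profit while the small wins keep people playing.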

Bill Friedman, a gambling addict who became a professor of casino management at the University of Nevada, Las Vegas, as well as a casino executive and consultant, studied over eighty Nevada casinos, trying to determine what brought people to them. The games themselves are the same everywhere, so the difference had to be the package. He found that the only relevant considerations for casino design were two questions: what percentage of visitors gamble, and what percentage return to gamble?

Friedman’s analysis led to thirteen principles designed to extract money from gamblers. Some examples are:

  • Principle 2: Gambling Equipment Immediately Inside Casino Entrances Beats Vacant Entrance Landings and Empty Lobbies
  • Principle 4: The Maze Layout Beats Long, Wide, Straight Passageways and Aisles
  • Principle 8: Low Ceilings Beat High Ceilings
  • Principle 9: Gambling Equipment As the Décor Beats Impressive and Memorable Decorations
  • Principle 11: Pathways Emphasizing the Gambling Equipment Beat the Yellow Brick Road

If you’ve ever been to a hotel where the casino was right there as you entered, the ceilings were low, there was little décor beyond the tables and machines, and the layout somehow always led you back to the casino, then you’ve seen this ‘gaming design’ in action.

So, when it comes to decision making there are two approaches, thinking fast and thinking slow, and the situation determines which is appropriate.

Critical Thinking

In the final article of this three-part series, Rick will be talking about Critical Thinking, which will help explain the psychological biases that sometimes lead us to make poor decisions.

Summary points

  • Thinking fast has its place
  • Thinking slowly has its place
  • Practice and time make perfect
  • You only have to be comfortable with your decisions; they don’t have to be perfect.
