Cognitive bias and risky behaviour

How do we really make decisions?  Horizon (BBC2, 25/2/14) well and truly trounced the idea that most are made through logical, rational thought processes. The vast majority are made using intuition. If this weren’t the case – if we used a Spock-like thought process much more of the time – we’d explode with the sheer impossibility of it all.

Intuition comes from a “fast”, “Type 1” way of thinking – reaching for the easiest available answer. “Slow”, “Type 2” thinking, on the other hand, is hard work.  So, of the tens of thousands of decisions we make every day of our lives, most are dominated by our intuition.

Daniel Kahneman’s Nobel Prize-winning research argued that this battle affects all kinds of decisions – what we eat, what we believe, who we fall in love with, who we blame for a crime.  Each time, our decision is based not only on rational facts or logic, but on something deeper that is often difficult to pin down. These are cognitive biases – and they are responsible for errors of judgement.

Kahneman’s research highlighted some important areas of governance where these innate cognitive biases are cause for concern.  One is the financial services industry; another is state security.

Money

Where money is concerned, the rules that govern our decision-making change.  Humans are more likely to engage in risky behaviour when faced with a loss than when faced with a gain: most of us will take a certain £50 rather than gamble on a 50:50 chance of £100, yet will happily gamble to avoid a certain £50 loss.  How this plays out in the financial sector is all too apparent in the system crashes that brought the City to its knees.

So, what can we do about it?  The researchers wondered how far back in our evolutionary history this trait emerged.  In a study which required monkeys to make similar decisions (involving a loss or a gain of a food currency), they found the same cognitive bias.  The fact that we share this behaviour with monkeys led the researchers to conclude that it is not something we can easily unlearn or switch off – it is deep within our DNA.

Who’s the baddie?

Just as alarming was the research conducted in the complex world of state security.  Working through a lifelike scenario involving a terrorism threat to a fictitious US city, twelve CIA analysts were asked to review all the available data and identify the most likely suspect.  The group included both novices and highly trained analysts.

Of the twelve analysts, only one got the correct answer. The others all fell prey to “confirmation bias” – that murky factor that gives rise to presumed guilt rather than presumed innocence.

Implications for learning and development

So what does this mean for those of us who are engaged in developing people and organizations? Can we all learn to be more logical when making decisions? The insight from this research is that this isn’t something we can teach or learn.

But we can (and must) do something about the system.  We can identify the behaviours most at risk from cognitive bias and develop checks and balances that minimize its impact in critical, high-stakes work.

Systems-thinking might involve looking at:

  • what needs to change about the system to reduce the risk of bias clouding our judgements, bringing us to the wrong answer?
  • what might be the implications (for our customers, target groups, results) of errors caused by different types of bias?
  • how could we encourage and remind people to be mindful of bias?  Would the CIA analysts’ decisions have changed if they had been asked to consciously consider different types of bias first?

Who’s on the case?
