‘Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. [It] is risky when the situation is unfamiliar, [and] the stakes are high… These are the circumstances in which intuitive errors are probable, which may be prevented by a deliberate intervention of [our conscious minds].’

Daniel Kahneman, ‘Thinking, Fast and Slow’, Chapter 7, pp 78ff.

It’s hard to think of a more accurate description of the early months of the pandemic in 2020 than an ‘unfamiliar’ situation in which ‘the stakes are high’ – just the circumstances in which it’s dangerous to take decisions guided by unconscious intuition rather than conscious analysis. Yet that is exactly what both the UK’s politicians and its scientific advisers were doing during those pivotal weeks. If our hyper-rational world experts could be in thrall to their unconscious minds at such a crucial time, then clearly the rest of us need to be on our guard.

How intuition systematically leads us astray

Misleading intuitions are just as convincing as accurate ones, and we all fall into the particular unconscious traps I want to look at in this article, all of the time. They are ‘salience’ bias (what Kahneman refers to as the ‘availability heuristic’), ‘confirmation’ bias, ‘groupthink’ and ‘overoptimism’.

They are part of how we think, intrinsic to our unconscious ‘operating systems’ even when we believe we are at our most rational and rigorous. Evolution has hardwired them into us, either because they provide processing shortcuts or comforting emotional shields that enable us to cope psychologically with the chaos of reality, or both. It’s important that we are aware of them, and do what we can to counter them by engaging our conscious, critical minds.

In what follows I draw not only on the work of Kahneman but also on that of Tali Sharot, who has studied the evolutionary biology and neuroscience which underlie unconscious bias.

The Covid traps

The UK’s late lockdown in early 2020 was described by Parliament’s Health and Social Care and Science and Technology Committees as ‘one of the worst public health failures in UK history’. Our death toll in the first wave was the highest in Europe and our economy was very badly affected. Italy locked down on 9 March, Germany on 22 March and New Zealand on 25 March, while we waited until 26 March. Experts have suggested that the Government could have reduced the impact of Covid on public health and the economy very significantly if it had introduced stringent measures – closing borders, schools and businesses, restricting mixing, imposing social distancing, and increasing testing and tracing capacity – even a week earlier than it did.

Why didn’t the UK Government react faster? We had the advantage of forewarning: the infection didn’t reach the UK until two months after the outbreak in China and several weeks after that in Italy and other European countries. We had time to prepare and learn from the experience of others. The World Health Organisation was urging quick and decisive action to contain the disease and other countries were rapidly locking down and introducing testing.

‘Salience’ or ‘Availability’: a processing shortcut

The Government had long recognised the risk posed by a global pandemic. It had occupied the top place on the Government’s ‘Risk Register’ – on the basis of combined probability and seriousness – for a number of years. Whitehall departments are regularly required to update the Register and demonstrate that they have the necessary measures in place to mitigate the risks, and the Department of Health had carried out two simulations to test its pandemic response. But, despite the fact that one of those simulations had been based around a coronavirus, the UK’s practical preparations focused on flu and the lessons of the 2009 flu pandemic.

The UK hadn’t experienced the recent epidemics of other coronaviruses, SARS and MERS, which had taken hold in Asia. Its preoccupation with flu, to the exclusion of other viruses, was attributable to what some psychologists refer to as ‘salience bias’ and Kahneman calls the ‘availability heuristic’: our intuitive tendency to judge the frequency of an event by the ease with which instances come to mind rather than by examining the facts and evidence. This unconscious shortcut is why we fear plane crashes more than car accidents, though we’re much less likely to be involved in the former: air disasters are rare, involve a lot of people, are reported very graphically and make a vivid impression on us, while car accidents are a daily occurrence, mostly unreported.

Even our rational scientific advisers had been led astray by intuition, with serious consequences. Flu is less infectious and less deadly than Covid, and so requires a less robust response. As a result we were poorly prepared for Covid: we hadn’t given adequate consideration to the logistics of lockdowns and testing, or to the need for a large supply of PPE for medical staff and carers.

Confirmation bias: protecting ourselves from doubt

Many epidemiologists therefore remained wedded to their flu-based modelling in the crucial early weeks after the first cases were identified in 2020. They assumed that there was plenty of time to prepare and even, at one stage, that it would be unwise to suppress the virus altogether and desirable to achieve some degree of herd immunity. Their reassurances were of course what the politicians wanted to hear.

Confirmation bias prevents us from reacting to evidence that contradicts our assumptions. Our brains protect us from ‘cognitive dissonance’, the psychological unease of wondering whether we might be wrong. We are programmed to seek out information and interpret it in a way that strengthens our pre-established opinions, and to discount evidence that contradicts them. We’re more sensitive to information showing that other people have come to similar conclusions and less sensitive to information that others dissent. And, paradoxically, the cleverer someone is, the better at analysis, the more able they are to organise the evidence to suit their case.

It seems an odd way for evolution to work – to select for people who won’t change their minds in the face of contrary evidence. But Tali Sharot explains that, like our other unconscious shortcuts, it is helpful to us – ‘adaptive’ – because most of the time, in our normal lives, when we encounter a piece of information which contradicts what we believe about the world, that piece of information is wrong. If we gave all information equal weight, and continually second-guessed ourselves, we’d never make decisions and wouldn’t be able to get on with our lives. But the consequence is that it’s very difficult to let go of a pre-established opinion, or to change someone else’s mind about something, even when we are sure they are wrong.

Groupthink and the illusion of control: avoiding ostracism

The UK’s advisers had publicly espoused a strategy that differed from most other nations’ approaches. This exposure appears to have entrenched them in their decision: no one felt able to challenge the group consensus. They even explicitly maintained that, while they knew their strategy seemed counter-intuitive given what others were doing, nevertheless they were right and others wrong. They continued to rely on it until the evidence from Italy demonstrated incontrovertibly that Covid was much more dangerous than flu. They were clever people, adept at interpreting data, but their emotions were involved: they had staked their reputations on the flu scenario.

Our unconscious minds place us at the centre of our worlds; they give us the illusion of control. We consistently overestimate the effect of our own actions on events and underestimate the effect of external influences. This tendency may have been reinforced by the make-up of the group: the UK’s scientific advisers were almost all men. Research has shown that men are more likely to make categorical decisions, while women tend to see things in a more nuanced way – shades of grey versus black and white. Society expects men to be decisive and prepared to take risks, while women are conditioned to be more thoughtful and receptive to others’ views.

Overoptimism: protecting ourselves against reality

And finally, as in the US, there was a hidden, comforting, emotional assumption pervading the culture in which decisions were made: that the UK was exceptional and impregnable. Neither the US nor the UK could believe that great Western nations could possibly be affected; surely this was something which only happened in Asia? Like President Trump, the UK Prime Minister, Boris Johnson, was very reluctant to face up to the reality and be the bringer of bad news.

Johnson was famed for his ‘boosterism’. Most of us have a rose-tinted view of ourselves and the future – which is why we cleave to politicians who encourage us in this view. Tali Sharot has made a study of our species’ bias towards over-optimism – our inclination to overestimate the likelihood of encountering positive events in the future and underestimate the likelihood of encountering negative ones. The data show that most people overestimate their chances of professional success, the abilities of their children, their future health prospects and likely lifespan, and hugely underestimate their likelihood of divorce, unemployment and serious illness. The exception is people who are moderately depressed, who have a more realistic view of the future.

Sharot suggests that this bias is evolution’s way of protecting us from falling into despair and inertia because we all know that inevitably one day we will die. Our brains are programmed so that we can imagine sought-after events much more richly and vividly than adverse ones, which seem vague and blurry. We find it easy to imagine what success will be like, but when we think about the possibility of failure the images seem much less compelling. And indeed, in general this bias is benign. If our view of the future is rosy, we suffer less stress and our mental and physical health is better. The evidence shows that moderate optimists live longer, are healthier and happier, make better financial plans and are more successful.

The problem arises when we allow our comforting assumption that ‘everything will be fine’ to blind us to the possibility that actually it may not; and it’s compounded by our ability to maintain our rosy view of things in the face of mounting contrary evidence. We see this bias in operation every day in the way in which the costs of large projects – the ill-fated HS2, for example – continually escalate and deadlines drift as initial budgets are revealed to be unrealistic and obstacles which should have been foreseen come to light.

Kahneman characterises the way in which our emotions bias our thinking as substituting the question ‘How do I feel?’ about an issue for ‘What do I think?’ about it. The UK’s 2016 vote to leave the European Union is another example of politicians encouraging us in this bias. A survey in December 2023 found that 55% of people in Great Britain thought it was wrong to leave the European Union, compared with 33% who thought it was the right decision. ‘Bregret’ has set in because the benefits promised by those advocating leaving in 2016 have not been realised. And indeed it is those who voted for Brexit – mainly the less prosperous and less well-educated – whose lives have been most damaged.

People wanted change, to defy the liberal elite, and the prospect of ‘taking back control of our laws and borders’ seemed very attractive. But there was no analysis of the evidence for the prospective benefits or of the economic downside of leaving our biggest trading partner; indeed, the then Prime Minister, David Cameron, actually forbade the civil service to carry out any analysis of the effects of leaving, so confident was he that the vote would be to remain. The cheering assumption that all would be for the best was never subjected to proper conscious challenge.

Invoking the Engineer

As this account shows, it is extremely difficult to counter the ways in which our unconscious minds systematically undermine our rationality. Mostly we don’t notice it’s happening, and when we do we are very adept at rationalisation, at constructing logical arguments to demonstrate that our unconscious promptings are right.

Our only weapon in this war is to engage our conscious, analytical minds; to invoke the dual process and check our intuitive promptings against facts, evidence and logic.

It helps greatly, in dealing with the slipperiness of intuition, to have some formal tools and to get into the habit of applying them: simple algorithms, checklists and procedures. When the advisers were captured by the salience of flu, for example, it might have helped if they had ‘triangulated’, ie made a deliberate attempt to give due consideration to all sources of information, particularly external ones – the advice of the WHO and the responses in Asia and other parts of the world, as well as the UK’s own experience and modelling.

The optimism bias is particularly difficult to deal with because of its emotional content. To have any chance of countering it we need to engage our imagination and emotions as well as our rational faculties. Gary Klein has devised an exercise that seeks to do this by engaging the project sponsors in envisaging, not the best outcome, but everything that might go wrong – he calls it a ‘pre-mortem’. He suggests that when a team is converging on a decision, but has not yet formally committed to it, the members should get together and carry out the following exercise:

‘Imagine that we’re a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Take 5-10 minutes to write a brief history of that disaster.’

In the next article I will provide more practical advice on using your conscious mind to validate your intuitions and how you can train your intuitive ‘muscle’.

Sources
Kahneman, D, Thinking, Fast and Slow, Allen Lane, 2011
Coronavirus: Lessons Learned to Date, House of Commons joint report by the Health and Social Care Committee and the Science and Technology Committee, October 2021
Sharot, T, The Optimism Bias, Robinson, 2012, and The Influential Mind, Abacus, 2018
Shrira, I, Women More Likely than Men to See Nuance When Making Decisions, Scientific American, September 2011

