
24th June 2019

When to trust gut instinct… Some principles to consider when making decisions in the workplace.

We all have them – intuitions and gut feelings….

We all have them – intuitions and gut feelings, but whether one should trust them is another matter. There is a fair bit of scientific investigation in this area and, interestingly, two key experts who hail from different schools of thought on the topic agree on one thing: that when it comes to trusting our intuition there is no hard and fast answer, but rather that ‘it depends’.

By way of background, my interest in this area has come from running mindfulness courses with professionals from the nuclear power industry. Anchor Point has adapted a conventional mindfulness course to fit the client (EDF), with alterations to take account of the scientific background of many attendees and the safety-critical nature of their workplace. The outcomes have been excellent in terms of gains in wellbeing and awareness. One aspect that has been particularly interesting to explore is how far to include work on the ‘embodied’ self as part of the course.

The reality, of course, is that we are all ‘embodied’ in the sense that we all inhabit our bodies. But taking this further and asking engineers and scientists to ‘tune into their bodies and feelings’ has required a lot of careful thought and reflection on my part. I have very much been feeling my way – no pun intended! The intention of this article is to share some of the milestones of my learning in this area, and to provide some insight into the related question of whether, and to what extent, we should trust our gut instinct.

I feel, therefore I am….

 


 

My starting point for understanding this area more fully has been the work of the neuroscientist Antonio Damasio. His work has been fundamental in challenging the idea that humans operate from a purely rational, logical perspective. In his aptly named 1994 book, ‘Descartes’ Error’, Damasio suggests that Descartes’ axiom, ‘I think, therefore I am’, should be replaced by ‘I feel, therefore I am’. In an experiment that, in part, led to this bold statement, Damasio and his colleagues had participants choose playing cards on a computer screen from a set of positively and negatively rigged decks. They discovered that participants began to generate a measurable stress response in their bodies when their cursor hovered over the negatively rigged decks, long before they consciously worked out which decks were or weren’t advantageous. It’s useful to be reminded that our bodies are more than just a skeletal framework for transporting the brain around. There is a more collaborative process going on. Feelings and intuition, such as a gut sense that something isn’t right, undoubtedly have a role in guiding our behaviour, and tuning into this might well be of value.

 

 

So, what is intuition? 

Many key thinkers in this area agree on the starting point: that humans are sense-making organisms who constantly seek out patterns in what they experience. These patterns, once identified, are stored away in our long-term memory; the next time we detect one of them, or something similar, our brain retrieves it and it shapes our subsequent behaviour. It is this act of recollection that forms the basis of intuition.

Should we trust intuition? Two schools of thought.

Two key thinkers in this area, Daniel Kahneman and Gary Klein, present very different starting points as to whether this phenomenon called intuition should be trusted. Kahneman is a psychologist and founding figure of behavioural economics, winner of the Nobel Prize for Economics in 2002 and author of the influential book ‘Thinking, Fast and Slow’. He suggests that intuition is part of ‘System 1’ thinking. This is automatic, fast, associative and memory-based. It is, Kahneman asserts, ‘the secret author of much of our lives’. Kahneman’s work centres on the problematic nature of ‘System 1’ thinking, which is prone to errors such as confirmation bias (looking for evidence to reinforce what we think we already know), over-simplification and a tendency to overstate our ability to predict likely outcomes. Worse still, Kahneman argues, much of this biased thinking is unconscious – we are unaware that we are getting things wrong. This he names ‘unconscious bias’. His assertion is that to counteract unconscious bias successfully we should engage ‘System 2’ thinking, where we consciously work in a deliberate, effortful way. Such thinking has many advantages and is a useful buffer against unconscious bias, but, as Kahneman points out, it requires effort and it’s unrealistic to apply it to all situations.

 

 

Gary Klein comes at intuition from a very different angle. He has spent much of his career thinking about ways to promote reliance on expert intuition in executive decision making. Much of his initial research looked at decision making amongst firefighters. He labels himself part of a group of academics who study decision making in real-world, fast-moving and stressful situations (an approach labelled ‘Naturalistic Decision Making’, or NDM). Their work has pointed to commonalities between the decision-making processes of professionals such as firefighters, nuclear power plant controllers, offshore oil rig controllers and military personnel, amongst others. These experts make decisions in the face of uncertainty, time pressure, high stakes and shifting conditions. What many of these professionals exhibited were recognition-primed decision-making (RPD) strategies – in other words, decision making that draws, in part, on a professional’s prior knowledge and experience.

Those undertaking naturalistic decision making often make use of their intuition, and they use it best when they have had years in which to build skills within their specific environments. Whether they are chess players, firefighters or nurses, what is key is the time and opportunity to build expertise in a specific domain. This allows decisions to be made, in part, by recognising patterns from similar past experiences.

So when should we trust intuition? 

What’s refreshing about Kahneman and Klein, who have very different starting points on whether we should trust intuition, is that they have spent time together exploring the common ground (Klein, 2009, and Kahneman, ‘Thinking, Fast and Slow’, Chapter 22). Here are some of their key areas of agreement – an interesting list that might help each of us explore when to trust our own intuition or that of others.

  1. The level of confidence a person expresses is an unreliable indication of whether their intuition should be trusted, and that includes our own. As Kahneman points out, intuition has a bedfellow: the illusion of validity. True experts, it is said, know when they don’t know. Without meaning to sound too much like Donald Rumsfeld, non-experts certainly do not know when they don’t know. There are some interesting links here to the four stages of competence model.
  2. So, if subjective levels of confidence cannot be trusted, what can we trust? The key is to take account of the environment in which the decision is being made and the conditions in which the skill or expertise has been acquired.
  3. Taking these in turn, the first is the environment in which the decision is being taken. Does it have a degree of predictability? A firefighter, for example, faces a much smaller range of possible outcomes when tackling a fire than a political or economic forecaster looking a few years ahead. In other words, it’s an environment where recollection of past experiences counts. For a firefighter, experience can help them anticipate how flames might spread through a building, or notice the signs that a house might be about to collapse. Such environments are labelled ‘high-validity environments’. Political and economic analysts, on the other hand, are to a greater degree shooting in the dark.
  4. The second element to evaluate carefully is the level of personal experience. How much relevant practice has the decision maker had, and has that practice been deliberate? Central to this sense of deliberate practice is working in an environment where quality feedback is freely available. To take one specific example, an anaesthetist works in an environment of rapid feedback – the effects of their actions quickly become evident in the physiological changes of the patient before them. Experience can be built much more effectively when such feedback is available.
  5. Finally, there needs to be caution around the transferability of expertise. Expertise is ‘fractionated’: being highly skilled in one area of a domain does not guarantee skill in another, yet experts in one field are sometimes called upon to make judgements in areas in which they have no real skill or expertise.

What does this mean for our own intuition in our own workplaces? Again, it depends….

It is clear that in most cases, taking a slow and deliberate decision and challenging our biases is of great value. Wherever possible, we should aim to use Kahneman’s ‘System 2’ thinking, looking at the facts in a rounded, considered and deliberate way. Even when we think we can trust our intuition, we should be cautious. There are, however, situations when a more intuitive approach to decisions could be the right thing to do. When to trust this intuitive sense can be summarised by saying ‘it depends’; it depends upon several things that we need to consider carefully. Levels of confidence in intuition are a poor guide, particularly when we are looking to our own intuition. Instead, we need to consider the level of expertise, whether that expertise truly applies to the case at hand, whether experience has been built up in an environment that allows for feedback and subsequent adjustment, and finally, whether the environment has a degree of predictability.

 

 

Kahneman suggests that an anaesthetist is someone likely to fulfil many of the above conditions: they have built up experience in a known environment, and their expertise has been built through direct feedback between actions and outcomes. If an experienced anaesthetist says, “I have a feeling that something is wrong”, Kahneman suggests that everyone in the operating theatre should prepare for an emergency.

Thinking Fast and Slow… Is it a binary choice?

Sabrina Cohen-Hatton is a very unconventional firefighter. Her backstory is worth a read (spanning homelessness through to completing a doctorate in behavioural neuroscience and becoming a Deputy Assistant Commissioner in the London Fire Brigade). Her doctoral research, carried out by videoing and analysing live firefighting decision making, turned up some interesting conclusions. It tended to support the view that in fast-changing situations, such as when fire service commanders respond to emergencies, a ‘System 1’ approach to decision making tends to prevail.

‘Traditionally, in the fire and rescue service, we believed that our decisions were very analytical and followed a set sequence of gathering information (situational analysis), evaluating options (plan formation) and enacting a plan (plan execution). However, the result of research with fire commanders across the UK showed that they made these thoughtful, considered decisions only 20 per cent of the time. Instead, commanders were making use of their previous experiences – consciously or unconsciously – and making instinctive, gut decisions 80 per cent of the time.’

Recognising the importance of intuitive ‘System 1’ thinking and the reality of its commonplace use in decision making, Cohen-Hatton developed a framework to help commanders guard against decision traps when relying on gut instinct in live, time-critical situations. The simple framework provides a set of checks and balances that buffer against rash decisions:

  • Consider the goal – what do I want this decision to achieve?
  • Check against expectations – what do I expect to happen as a result?
  • Weigh risks against benefits – do the benefits outweigh the risks?

What Sabrina Cohen-Hatton and her research team discovered was that ‘the technique worked: commanders linked their actions to their plans and situational awareness increased significantly. Most importantly, the technique didn’t slow down decision making.’ (The Future of Incident Command, National Fire Chiefs Council, 2015)

I have noticed interesting similarities between Cohen-Hatton’s framework and the human performance tools used in the nuclear power plants where I deliver mindfulness training. Here too, there are practices that aim constructively to challenge unchecked ‘System 1’ thinking. For example, each control room in the power stations encourages a ‘STAR’ framework (‘Stop – Think – Act – Review’) as a set of checks and balances for decision making. In these environments I’ve equally noticed that value is placed upon experience, and there is a recognition of the time it takes to build expertise through day-to-day work, relevant training and simulation.

So, what to make of all of this? 

 

 

Maybe the first port of call for each of us should be to sometimes put the brakes on, to add a pause and a decisive question or two into the mix before trusting our intuitive gut reactions.

There is a useful phrase that resonates with me, again taken from my work with the nuclear sector: ‘healthy unease’. The reason I like it is that the very act of being ‘uneasy’ about something is a body-based, intuitive feeling. As such, our feelings do have a role to play in making decisions. Equally, from another angle, it’s a lovely reminder to treat all the information we receive (including our own intuition) with caution, and to be careful to consider its provenance. With regard to intuitive feelings, we may usefully ask ourselves whether they come from a place of genuine experience, and whether the situation at hand is sufficiently predictable to make that prior experience truly relevant.

We can value our experience-based intuition, but we should not do so unquestioningly.

The final word goes to Al Pittampalli, who summed up this sense of self-awareness succinctly:

‘We’re likely to have reliable intuitions in certain domains and unreliable ones in others. Think of intuition as a compass, and the world as a vast land dotted with areas of high magnetic resonance. The compass is invaluable in certain areas, but, corrupted by the magnetic field, can be misleading in others. One of the most important tasks of professionals is to draw a map for yourself, so you know when to trust the compass and when to put it away.’

Martin works with clients to bring mindfulness interventions into the workplace. He works in many sectors, including safety-critical environments. For further information, see his contact details here.