I’ve recently read “How Doctors Think” by Jerome Groopman, MD. It is a fascinating book in which Professor Groopman explores the thinking processes that are taught to and practised by doctors. In medical school, students are taught to use logical, sequential processes to reach diagnoses, sometimes based on Bayesian statistical decision-making. Usually the thinking processes behind this are implicit rather than explicit.
However, once they graduate and face real clinical situations, doctors generally use thinking processes based on pattern recognition. The time constraints and pressures of everyday clinical practice appear to offer little alternative. Groopman recounts from his own experience that in medicine thinking is inseparable from acting – as Professor Donald Schön of MIT has said, medicine is “thought-in-action.”
Groopman describes a number of unconscious biases in medical decision-making. An example is confirmation bias, where the obvious explanation – suggested by the immediate presenting information or by others’ opinions – guides decision-making and narrows thinking to focus on a particular explanation. He tells the story of a woman who suffered for 15 years with a series of stomach problems. She was diagnosed as having anorexia nervosa with bulimia and, latterly, irritable bowel syndrome. This was confirmed by a series of doctors, both in primary care and specialists. Finally, as she came close to death and weighed only 5 stone, she saw a new specialist. He asked her to tell her story again from the beginning. He slowly and painstakingly realised that the problem was something completely different – and after some tests identified that she had coeliac disease. With the right advice and treatment she soon recovered. She nearly died as much from unconscious biases in medical decision-making as from the underlying disease.
There are many other unconscious biases that can affect medical decision-making, including the impact of emotion, marketing and affection. This is not surprising, as the work of Daniel Kahneman and others has identified over 100 different forms of unconscious bias in decision-making.
Groopman helpfully gives a list of questions which patients and their relatives can use to help guide their doctors away from cognitive traps. He suggests offering to tell your story afresh and telling the doctor what you are most worried about. Asking “what else could it be?”, “is there anything that doesn’t fit with the diagnosis?” or “is it possible that I may have more than one problem?” can help steer your doctor through decision-making traps. It’s a good read and well worth equipping yourself with!
Reading Groopman got me thinking about how I think in the various roles that I play: as a consultant, non-executive director and leader. Interestingly, in Tricordant we often use the human body as a metaphor for an organisation and its problems. We have even published a paper with Cambridge academics on the subject! But in using such a metaphor, even though it illustrates key aspects of complex adaptive systems, there is a danger that the focus falls on what’s wrong (pathology) rather than what’s working well (health and wellbeing) – or that we focus on the body, or organisation, rather than the wider environment or strategic context.
My own thinking process (I think) naturally tends to be a rapid collation and integration of multiple information sources to create a picture and narrative for an organisation and its presenting issues. This ‘heuristic’ draws on a range of data sources, both so-called ‘hard’ and ‘soft.’ I draw on finance and activity information as well as the stories I hear and the pictures I see. I absorb the feel of the place, the smiles (or lack of them) and the behaviours of staff.
Within an overall story or metanarrative, I typically identify the key issues that appear to be the symptoms or surface issues and what I think (believe?) are the underlying causes. There will often be a story in my mind about how things connect up.
Such a view is usually formed at an (almost) subconscious level. It focuses on the high level. It’s usually there within a few minutes or hours of ‘meeting an organisation.’ Recently I had a very notable exception when it took me nearly six months to make sense of a client – which really stood out (and bugged me!).
What I’ve learned is that any story or picture of an organisation needs to be ‘held lightly.’ And I need to embrace the tensions and lack of clarity. As Francis Baldwin said recently, ‘we dance on the nest of ambiguity.’ I need to resist the urge to make everything fit into place before its time – even if the client is desperately seeking reassurance that I know what I’m talking about! And they expect us to bring an accurate ‘diagnosis’ [more medicine!] and propose interventions [prescriptions!].
Other biases I see commonly in consulting include:
• Bias to the cognitive – ignoring emotions
• Valuing hard quantitative data above soft qualitative data
• Endless analysis of data
• Attributing cause-and-effect too easily
• Denial of context
• Blindness to culture and history
• Listening only to what is said, and not to what is not said or indeed cannot be said
The work of David Rock and others points to the links with underlying neuroscience. I could go on (I really could!), and you can easily see how commercial considerations might cloud consulting processes. Many of these biases are most powerful because they are unspoken and unnoticed.
So perhaps, like Groopman, we need some questions we can bring out as leaders, non-executive directors, consultants or clients. These can help both us and those we are listening to. Here are a few to get us going.
• ‘What is your model or frame of reference?’
• ‘Who have you talked to and why?’
• ‘What might X say if they were here?’ – where X might be a customer, a known dissident or even a union colleague.
• ‘What other stories or patterns explain what you have presented?’
• ‘Can you present your findings in a completely different way – words, stories, or pictures?’
Perhaps more than anything we should be asking ourselves these questions!