Omnia Health is part of the Informa Markets Division of Informa PLC


Critical meta-thinking is essential in clinical diagnosis


The de-biasing of decision-making is neuroscience’s call to action for improved patient outcomes.

Diagnosis and treatment plans are at the heart of clinical practice, but decision-making determines the success of the diagnostic process. Surprisingly, a correct diagnosis is not reached as often as is commonly assumed. Diagnostic error has multiple underlying causes, including no-fault errors, system-related errors, and, most commonly, cognitive errors. A no-fault error occurs when the disease is masked or presents in an uncommon form, or when the patient is uncooperative or provides misleading information. System-related errors stem from technical failures and equipment problems; flawed organisational processes, policies, and procedures; and inefficiencies in teamwork and communication.

The most common factor in diagnostic error, however, is cognitive error, involving faulty synthesis, gaps in knowledge or data gathering, and flawed cognitive judgment. In most cases this leads to premature closure: an initial diagnosis is reached without reasonable alternatives being considered. Common illnesses are frequently misdiagnosed because their signs and symptoms overlap with those of numerous other diseases; ruling out those other diseases too early can, in many cases, prove fatal.

Understanding errors

According to research, system errors contributed to diagnostic error in 65 per cent of cases and cognitive factors in 74 per cent. A diagnosis usually reflects the clinician's knowledge, clinical acumen, and problem-solving skills; yet it is rarely a lack of knowledge that leads to failure. Over the past half-century, cognitive psychologists and neuroscientists have shown how vulnerable the cognitive and affective states of the human mind are to memory fallacies, cognitive biases, incorrect assumptions, and other reasoning failures.

Cognitive failures are best understood in the context of how our brains manage and process information through the reasoning process.

Contemporary theories of clinical reasoning, covering memory, attention, judgment, and decision-making, adopt the dual-process model, which distinguishes two systems of thinking. Type 1 is autonomous, subconscious, fast, and intuitive; it is generally either hard-wired or acquired through repeated experience and does not require working memory. It operates independently of cognitive ability, mostly serves us well, and is indispensable for getting through life on fixed-action, near-automatic patterns. Type 2, on the other hand, is conscious, controlled, slow, reflective, logical, and analytical; it requires working memory, follows the laws of science and logic, and correlates with cognitive ability. The operating characteristics of this dual-process system provide a useful starting point for learning about medical decision-making.

Generally, it seems that much of our everyday thinking is flawed: it is a mixture of Type 1 and Type 2 reasoning, and clinicians are not immune to the problem.

Addressing biases

Typically, diagnostic error is viewed as a cognitive failing, and cognitive theories of human memory propose that such errors may arise from both Type 1 and Type 2 reasoning. However, Type 1, the intuitive mode, is the primary source of the cognitive failures that lead to biases, fallacies, and thinking errors.

More than 100 information-processing biases can interfere with sound clinical reasoning and decision-making across the medical disciplines, with at least 50 cognitive biases applicable in medicine. Anchoring and confirmation bias, however, are the most prominent in cognitive errors.

For example, when a patient undergoes an analytic assessment for chest pain in a cardiac clinic that culminates in angiography, the conclusion is invariably correct.

By contrast, when physicians prioritise information and data that support their initial impressions or beliefs, an anchoring bias has occurred.

In another case, a 60-year-old male arrives in the emergency room (ER) with flank pain and hematuria. The fast and frugal approach to his complaint relies primarily on a pattern-recognition heuristic that leads to a diagnosis of renal colic. This may well be correct. Occasionally, however, the cause is a dissecting abdominal aortic aneurysm, and the heuristic will have failed catastrophically. When the symptoms of common illnesses overlap with those of other diseases, the heuristics or mental shortcuts used in clinical decision-making often serve well, but occasionally fail (Pat Croskerry, 2005).

Another is countertransference, an affective bias that can be triggered by past experiences: a patient's behaviour or appearance may evoke the memory of a similar previous encounter and produce a biased response, or the clinician may project their own unresolved conflicts onto the patient.

Although diagnostic errors and analytic failures are commonly multifactorial in origin, typically involving both system-related and cognitive factors such as biases, cognitive overload, fatigue, sleep deprivation, and emotional agitation also play a major role. Physicians' affective states are as vulnerable to mood alterations as anyone else's, and their decision-making and judgment are affected accordingly; yet the impact of the affective state on decision-making, such as in a caregiving role, has received little attention to date.

Changing organisational conditions, such as the ongoing deployment of new technology systems, change management, and business transformation, where physicians race to keep up with the demands of digital transformation, together with interpersonal conflicts and a lack of effective leadership in the workplace, may lead to temporary or ongoing changes in physicians' affective state. Perception, attention, memory, and reasoning performance all become impaired.

Stress, fatigue, and profound psychological effects such as anger, guilt, inadequacy, and depression are well known to produce irritability, intolerance, and other mood changes that also exert an influence on judgment. The impact of diagnostic failure on patient safety does not yet appear to be fully recognised, and we remain unrealistic about acknowledging the effect of cognitive biases and affective states on clinical reasoning. Despite the substantial influence of our evolving understanding of cognitive psychology, organisational psychology, and neuroscience on academic medicine over the past 20 to 30 years, the major social sciences have not historically been considered within the remit of medicine.


Maha Chehab, Business Psychologist and Organisational Neuroscience Specialist

Securing optimised performance

Fortunately, cognitive psychology and organisational neuroscience provide insights into how to prevent biases and drive critical meta-cognition, and it would be beneficial to include them in the medical school curriculum. Part of the solution is to maintain a culture that works toward the psychological safety of physicians at work and the safety of patients in clinics.

Recognising that such cognitive errors are not inevitable, organisational neuroscientists and psychologists can help:

  • Curate educational programmes, cognitive tutoring systems, training, and coaching for medical students, residents, and fellows on cognitive biases and the role they play in diagnostic and treatment errors.
  • Help physicians become more self-aware and familiar with the many types of cognitive biases, build effective debiasing strategies, and provide interventions against the psychological defence mechanisms that keep us from examining our own thinking, motivations, and desires too closely.
  • Build a critical-thinking programme of neuroscience and coaching that teaches how decision-making works and where cognitive biases arise, showing how cognition, memory, and attention apply to clinical cases and how to de-bias oneself during diagnosis.
  • Help build the ability to maintain keen vigilance and mindfulness and to engage in purposeful, self-regulatory judgment of one's own thinking.
  • Provide interventions that build habitual memory focus through self-directed mind-ware awareness, enhancing the ability to retrieve rules, knowledge, procedures, and strategies for decision-making and problem-solving as soon as a situation arises.
  • Work closely with leadership to foster physicians' psychological safety and well-being in the workplace, strengthening emotional and attentional intelligence. It is important to note that the affective state is inseparable from thinking and plays an integral part in our ability to process information meaningfully, make good decisions, drive optimised performance, and avoid affective pitfalls.

In the age of digital transformation, clinical decision support software and artificial intelligence (AI) methods, in particular machine learning (ML), reinforcement learning, and deep learning, which have been present in medicine for the past 15 years and are nothing new to the field, are well suited to aiding physicians in evaluating multiple outcomes to optimise diagnosis.

However, deep learning models are less easily interpretable, may themselves be biased, and still require an immense amount of data to establish causal links. Any biased data learned by an AI system remains a weak point in the decision support chain: AI algorithms are fed by humans who are mostly unaware of their own biases, so those biases determine the quality of the clinical decision and can create a new form of bias, data bias, within medicine. Hence the need to focus on stabilising human bias before a dissonance is created between human and AI biases.
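The mechanism by which human bias becomes data bias can be made concrete with a minimal sketch. The data, the symptom, and the diagnoses below are hypothetical and chosen only to echo the renal colic example above: if the clinicians who labelled the training cases anchored on the common diagnosis, a model that simply learns label frequencies faithfully reproduces that anchor.

```python
# Minimal sketch (hypothetical data, illustration only): labelling bias in
# training data propagates into a decision-support model.
from collections import Counter

def train_majority_classifier(records):
    """For each presenting symptom, learn the most frequent training label."""
    by_symptom = {}
    for symptom, label in records:
        by_symptom.setdefault(symptom, Counter())[label] += 1
    return {s: c.most_common(1)[0][0] for s, c in by_symptom.items()}

# Hypothetical training set: 95 of 100 flank-pain cases were labelled
# "renal colic", including some dissecting aortic aneurysms the labelling
# clinicians missed (the anchoring bias baked into the data).
training = ([("flank pain", "renal colic")] * 95
            + [("flank pain", "aortic aneurysm")] * 5)

model = train_majority_classifier(training)
print(model["flank pain"])  # prints "renal colic": the model inherits the anchor
```

Real clinical models are far more sophisticated than this frequency lookup, but the failure mode is the same: the system can only be as unbiased as the human judgments encoded in its training labels.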

Bridging social sciences such as psychology and neuroscience into our social ecosystems, including all forms of organisations and our educational and medical systems, and raising awareness to build a lifelong commitment to self-de-biasing and to safeguarding one's own critical thinking and cognition, is the major goal to focus on in our transition towards cognitive and digital societies.
 

References

Diagnostic Error in Internal Medicine | Health Care Safety | JAMA Internal Medicine | JAMA Network
Diagnostic Failure: A Cognitive and Affective Approach - Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology) - NCBI Bookshelf (nih.gov)
https://journalofethics.ama-assn.org/article/believing-overcoming-cognitive-biases/2020-09
https://www.ncbi.nlm.nih.gov/books/NBK20487/
Medical Error Reduction and Prevention - StatPearls - NCBI Bookshelf (nih.gov)
The Causes of Errors in Clinical Reasoning: Cognitive Biases, Knowledge Deficits, and Dual Process Thinking - PubMed (nih.gov)


Maha Chehab is a Business Psychologist and Organisational Neuroscience Specialist | Change and Transformation | Cognitive and Digital Enterprises.

TAGS: Clinical