Property | Type 1 processes | Type 2 processes |
---|---|---|
Decision making | Intuitive | Analytical |
 | Heuristic | Normative |
 | Associative | Deductive |
 | Concrete | Abstract |
Proportion of cognitive time | High | Low |
Awareness | Low | High |
Automaticity, reflexivity | High | Low |
Speed of response | Fast | Slow |
Effort required | Minimal | Considerable |
Resource cost | Low | High |
Vulnerability to bias | Yes | Less so |
Errors | Common | Few |
Affective involvement | Often | Less so |
Importance of context | High | Low |
Hard-wired | Sometimes | No |
Able to be overridden | Yes | Yes |
 | Impulsivity | Metacognition |
 | Mindlessness | Mindfulness |
 | Dysrationalia | Decoupling |
A variety of approaches to decision making and reasoning have been proposed, but most are compatible with this dual process model [5]. The first application of the model in medicine appears to have been made by Dawson [8], and it has since been adopted as a universal approach to clinical decision making (CDM) [9] (Fig. 33.1). The model is relatively straightforward and can be easily taught.
Fig. 33.1
The dual process model of decision making. The multiple arrows shown for Type 1 processing indicate different subgroups. Many Type 1 decisions will be acted upon without calibration
Type 1 Decision Making
As soon as one sees, smells, hears, tastes, or feels anything, the brain’s automatic and initial response is to try to match it to a familiar pattern. These patterns may be hard-wired or acquired through experience. If a matching pattern can be found, an automatic response results, and Type 1 processing occurs. Importantly, it is reflexive and subconscious. We can make decisions and act on Type 1 output, but we do not deliberately reason in Type 1. Stanovich refers to these processes as The Autonomous Set of Systems [10]. In medicine, it is the system that delivers the Augenblick (“blink of an eye”) response—when a pattern of symptoms or signs presents itself and the clinician reflexively makes the diagnosis [11]. Often, the diagnosis is correct, especially for highly pathognomonic cases, but trusting completely in such spot diagnoses can be dangerous. The eminent surgeon Cope observed: “Spot diagnosis may be magnificent, but it is not sound diagnosis. It is impressive but unsafe” [12].
Most of our decision making occurs in Type 1. Psychologists say that we spend about 95 % of our waking lives there [2], and mostly it serves us well. Type 1 processing includes creativity, imagination, inspiration, romance, and other activities vital for life. The necessity for Type 1 has been succinctly and elegantly described by Smallberg [13]. Most importantly, prevailing dispositions (or biases) to respond to salient features of the environment in predictable ways save us from having to re-invent every wheel in our lives. These biases give us the ability to perform a wide variety of simple to complex acts automatically, allowing us to achieve much of what needs to be done through serial associations [14]. However, as Denes-Raj and Epstein note (referring to Type 1 as the experiential system): “Although experiential processing is highly efficient and adaptive in most circumstances, in other circumstances it is error-prone and a source of maladaptive biases” [15]. Such bias, notes Smallberg, “is the thumb that experience puts on the scale” [13]. In short, we cannot live without Type 1 processing, but we must be vigilant when we use it.
Type 2 Thinking
If the sensory input is not matched with any existing pattern, we default to Type 2 thinking. Now, we seek to understand the stimulus in a conscious, deliberate, rational manner, generating and testing hypotheses. This is a more reliable way of making decisions and generally results in fewer errors. It is rational and therefore follows the laws of logic and science, but it can still be seen as a form of pattern matching: fundamentally, scientific enquiry represents an effort to make sense of and find patterns in data. Clinicians who follow the rules of science, logic, rationality, and critical thinking get the most out of Type 2 processing. Clinicians who do not may expose themselves—and their patients—to peril. Importantly, cognitive debiasing depends upon Type 2 thinking—the means by which we can mitigate bias in decision making. However, Type 2 thinking is time-consuming and resource-intensive, nor is it completely error-free. Healthcare leaders, for example, may rationally deliberate and decide on policies that turn out to be fundamentally flawed. When error does occur in Type 2 thinking, the consequences may be far-reaching.
Expertise
It is important to know how the Dual Process Model deals with the development of expertise. In its simplest form, this process can be depicted as the acquisition of a habit or skill through repeated presentations to Type 2 processing until the response is eventually relegated to Type 1 processing (Fig. 33.1). Consider, for example, learning the skill of intubation. Initial efforts at intubation, usually on a mannequin, require holding a strange instrument in the non-dominant hand while attempting to visualize an anatomically indistinct area to which an endotracheal tube must be directed with the dominant hand. It is a complex, visual-motor-haptic skill that takes many repetitions to accomplish smoothly and become what is often a life-saving maneuver in real patients. With experience comes expertise, depicted in pathway A of Fig. 33.2. However, as with many skills, some individuals become more proficient than others, and experience does not always lead to expertise. Pathway B represents someone who has become experienced through multiple repetitions of the behavior but who has not become expert, perhaps because of poor instruction, a poor learning environment, or other factors [16]. In the longer term, there is some evidence that as physicians age, they spend less time in Type 2 and more in Type 1 thinking.
Fig. 33.2
Acquisition of expertise (a) versus becoming an experienced non-expert (b)
The Executive Override Ability
Several more properties of the model need to be explained. Although Type 1 processes are reflexive, this does not mean that the decision maker is unaware of them or that they cannot be changed. Say, for example, a middle-aged patient presents to an emergency department with flank pain, vomiting, and hematuria. The physician’s immediate response may be that the patient has a kidney stone: an Augenblick diagnosis based on an extremely familiar pattern. But the physician does not need to commit to it. He can observe his own response and may be reminded that he recently saw a case presented at morbidity and mortality rounds in which a patient had similar signs and symptoms but was actually suffering from an abdominal aortic dissection. His Type 2 thinking therefore overrides the initial Type 1 response, resulting in purposeful thinking to exclude other diagnoses. This check is referred to as executive override, also known as reflective thinking, mindful practice, and metacognition; it is also the basis for cognitive debiasing, discussed below. After the override, the clinician may return to Type 1 thinking for more inspiration or might establish a differential diagnosis and systematically work through the options.
Irrational Decision Making
Type 1 processing can also override Type 2 thinking. Despite knowing the most rational thing to do in a particular situation, the clinician may do something else, often following his or her intuition. For example, if a clinician assesses a patient with a neck injury and finds that the physical examination and mechanism of injury do not warrant a cervical radiograph according to published decision rules but orders a radiograph anyway, he or she is overriding a well-researched rule, typically with high sensitivity, that would probably outperform his or her own judgment on most days. Stanovich refers to this Type 1 override of Type 2 as dysrationalia [17]. Historically, it is also known as akrasia [18], irrational behavior, dysfunctional decision making, or weakness of will [19, 20]. Road rage, binge eating, drinking to excess, gambling, and a variety of other human behaviors are vivid examples in everyday life.
Dynamic Calibration of the Two Systems
In many schematic presentations of the dual process model, there is a start point at which the initial stimulus is presented and an end point at which a decision is made. Thus, the model appears linear. However, an important feature is that the model is inherently dynamic. The first dynamic point, already mentioned, is the executive override junction, where a transition from Type 1 to Type 2 processing occurs; this may then be followed by a return to Type 1 and perhaps a reactivation of Type 2. This exchange can be thought of as a “toggle function” that allows for dynamic oscillation between the two types. There has been debate about whether Type 1 and Type 2 processing lie on a continuum or whether they toggle back and forth [21], but the consensus is that the two types are parallel and distinct from each other.
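The control flow described above—pattern match first, default to analysis when no match exists, and allow an executive override of the reflexive response—can be sketched in code. This is purely an illustrative caricature of the model, not part of the original chapter; all function names, the pattern dictionary, and the `red_flags` parameter are hypothetical devices for exposition.

```python
# Illustrative sketch of the dual process flow (hypothetical names throughout).
# The model itself is conceptual; this merely mirrors its decision points.

def known_pattern(stimulus, patterns):
    """Type 1: reflexive match of the stimulus against familiar patterns."""
    return patterns.get(stimulus)  # None when no pattern matches

def analytic_workup(stimulus):
    """Type 2: deliberate hypothesis generation and testing (stubbed here)."""
    return f"differential diagnosis for {stimulus}"

def decide(stimulus, patterns, red_flags=()):
    # Initial response: try to match the stimulus to a familiar pattern (Type 1).
    impression = known_pattern(stimulus, patterns)
    if impression is None:
        # No match: default to Type 2 processing.
        return analytic_workup(stimulus)
    # Executive override junction: Type 2 may veto the reflexive output,
    # e.g. when the presentation has known dangerous mimics.
    if stimulus in red_flags:
        return analytic_workup(stimulus)
    # Otherwise the Type 1 decision is acted upon, often without calibration.
    return impression

patterns = {"flank pain + hematuria": "renal colic"}
print(decide("flank pain + hematuria", patterns))   # reflexive Type 1 match
print(decide("flank pain + hematuria", patterns,
             red_flags={"flank pain + hematuria"}))  # override into Type 2
```

A real decision maker, of course, may oscillate between the two branches repeatedly (the “toggle function”) rather than pass through them once, which a single function call cannot capture.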
The final part of the model is a calibration junction. The mark of a well calibrated thinker is the ability to balance the right blend of intuition and analytical reasoning in decision making for a particular situation.
The Cognitive Miser Function
The brain generally seeks to conserve energy. It can do so by defaulting to Type 1 processing, where very little effort is required to keep the cognitive wheels turning. The evolutionary imperative for this conservation reflects the brain’s metabolic demand: it consumes about 20 % of resting metabolic energy despite comprising only 2 % of total body weight. In our ancient past, when calories were hard to come by, those who conserved energy had a selective advantage and were more likely to get their genes into the next generation. In modern times, several phenomena are associated with cognitive miserliness. For example, the prevailing tendency is to resist change, preserve the status quo, and avoid the uptake of new information, all of which require Type 2 thinking. One of the reasons for not dealing with decisions in Type 2 thinking, notes Kahneman, is that it requires cognitive effort [14].
There is an overwhelming tendency to revert to Type 1 decision making, becoming what Pink Floyd called “comfortably numb,” or, as the Foundation for Critical Thinking describes it, “living an unexamined life … in a more or less automated, uncritical way” [22]. An example is given in Fig. 33.3, which shows a side view of a school bus, and the question is: Which direction is the bus travelling? Most people, seeing the symmetrical figure, conclude that it is impossible to say, yet preschoolers usually get it right and say the bus is going left. Their reason, which escapes most adults, is that they cannot see the entry door (in countries where traffic drives on the left, the bus is going right). For the preschooler, the door is extremely important for getting on the bus, but adults, too far removed from the problem and reluctant to make the cognitive effort, will fail. Under various conditions (fatigue, sleep deprivation, cognitive overload, and negative moods) there is an increased tendency to default to Type 1 processing [23] and take the path of least resistance.
Fig. 33.3
Cognitive miser function. In which direction is the bus travelling? There are two possible answers: left or right
Individual Differences in Decision Making
Medical educators assume, perhaps naively, that decision makers are more consistent than they probably are and that, allowing for some basic variance, a medical class is reasonably homogeneous in rationality, logic, and intellect. Thus, if a particular topic is adequately covered, the decision making performance of the class should be reasonably predictable. In practice, however, things may turn out a little differently. We tend not to acknowledge that sex, age, intellectual ability, rationality, personality, and other individual characteristics may all substantially influence people’s decisions [24]. Further, individual decision making is influenced by discipline-specific training, a process that may result in a predictable distortion of the way in which clinicians see the world.
The Importance of Context and Ambient Conditions
Decisions are not made in isolation; each has some sort of context [25]. Decision making typically involves detecting a signal and distinguishing it from interference or noise; signals rarely arise without some noise attached to them. Noise is not simply acoustic; it may occur in any of the five senses and influence perception. Hogarth’s “wicked” environments [15] typically have marked amounts of noise that interfere with the ability to accurately interpret particular signals. This concept must be considered in any process that examines decision making out of context. A good example is morbidity and mortality rounds. Although they provide some of the best clinical learning opportunities, the signal is typically separated from the context in which it was originally perceived, and most of the surrounding context is usually lost. A further complication is that the outcome is usually known and, through hindsight bias, may distort our perception of the original decision making. Further, the impact of fatigue, sleep deprivation, sleep debt, dysphoria, cognitive overloading, interruptions, distractions, and other factors that influence decision making is rarely considered and difficult to assess once the original environment is left. Critical incident reviews and root-cause analyses, always performed after the fact, suffer similar shortcomings as well as memory failures.
Cognitive Failure
A variety of factors lead to cognitive failure in the individual decision maker: cognitive and affective biases, reasoning failures, knowledge deficits, and others. Cognitive laziness is uncommon in medicine, but certain conditions may lend themselves to slipping into the cognitive miser mode. Mindlessly adopting strategies to conserve thinking effort can lead to problems: failure to take a thorough history and perform a physical exam, accepting biased or gratuitous comments from others, accepting verbatim the information given at handover, cutting and pasting someone else’s history and physical, deferring to authority without question, adopting a non-skeptical attitude, and many others. Healthcare providers cannot afford to be comfortably numb when patient care is at stake. By far the most important problem, however, appears to be the influence of cognitive and affective bias on decision making.
The Bias Problem
A universal feature of human decision making is its vulnerability to bias. More than 100 cognitive, affective, and social biases have been described [26–30]. According to the Foundation for Critical Thinking: “Everyone thinks; it is our nature to do so. But much of our thinking, left to itself, is biased, distorted, partial, uninformed, or downright prejudiced. Yet the quality of our life and that of what we produce, make or build depends precisely on the quality of our thought… Excellence in thought, however, must be systematically cultivated” [31]. These problems in thinking are compounded by clinicians being unaware of the many biases that affect their decision making, a condition known as the “bias blind spot.” Several other impediments to optimal decision making are listed in Table 33.2. Bias should be considered a normal operating characteristic of the human brain – biases are everywhere and have the potential to influence almost every decision we make [32].
Table 33.2
Impediments to the awareness and understanding of cognitive biases in clinical judgment
Impediment | Effect |
---|---|
Lack of perceived clinical relevance | Medical undergraduates are not explicitly exposed to cognitive training in decision making. Historically, this area has not been seen as relevant to clinical performance and calibration |
Lack of awareness | Although the lay press has heavily promoted the impact of cognitive and affective biases on everyday decision making, clinicians are generally unaware of their potential impact on medical decision making |
Invulnerability | Even where awareness does exist, physician hubris, overconfidence, and lack of intellectual humility may deter them from accepting that they are just as vulnerable as others to biased judgments |
Status quo bias | It is always easier for clinicians to continue to make decisions as they have done in the past. There is a prevailing tendency against learning de-biasing strategies and executing them, given the additional required cognitive effort and time |
Vivid-pallid dimension | Cognitive and affective processes are mostly invisible and, at present, can only be inferred from outcomes or the clinician’s behavior. Descriptions of them are invariably abstract and uninteresting. They typically lack the vividness and concrete nature of clinical disease presentations that are far more meaningful and appealing to the medically trained mind |