You might be more familiar with emotional hijacking than you are with the cognitive variety. Emotional hijacking occurs when we get scared or angry, diminishing our ability to access the executive functions of the prefrontal cortex. When this happens, we lose awareness, go on automatic (become mindless), and can infect others with these emotions.
Cognitive hijacking occurs when unconscious biases and impaired brain functioning distort how we perceive, think, and make decisions. It is caused in part by emotional hijacking, but it can also be triggered by more subtle threats to our sense of self and safety.
Cognitive hijacking takes a number of forms. Here I explore “confirmation bias.” Because so many of us, myself included, seem to behave these days as if we are entitled not only to our own opinions but also to our own facts, I intend to explore the other forms of cognitive hijacking over the next few weeks, including “loss aversion,” “intolerance of cognitive dissonance,” and “identity-protective cognition.” Don’t these ways we fool ourselves sound fascinating?
Confirmation bias is the tendency to seek out and interpret evidence—even new evidence—that confirms or supports what we already think or believe. We ignore or distrust evidence that contradicts it. Another term for this is “motivated reasoning” in which, to borrow an analogy from psychologist Jonathan Haidt, we act more as lawyers looking for evidence to prove our case than we do as scientists looking for evidence to find out what is true.
A classic experiment illustrates the influence of prior beliefs on people’s response to “objective data.” Participants in the study are shown the results of two fake studies: one set of findings supports the idea that capital punishment prevents violent crime, and the other contradicts it. When the data conflict with participants’ beliefs, they criticize the findings and methodology of the study; when the data confirm their beliefs, they find the findings convincing. These results have been replicated over many decades in ensuing studies. The evidence that we all are subject to confirmation bias and motivated reasoning is compelling. Here’s the scary thing: you might not believe this evidence if it does not confirm your beliefs about human beings!
Confirmation bias interferes with our ability to come to a common understanding of the issues at hand in our conversations at work and in our communities. This is why it is critical to include in our meetings:
- Stakeholders who represent diversity of all kinds, including ideological and intellectual;
- Unbiased education for all stakeholders about the issues at hand;
- Use of diverse small groups (maximum of five or six) in meetings so people can engage meaningfully and safely with one another;
- Ground rules to guide the conduct of the interactions and help people feel safe.
In small groups, ask people to (1) share how they came to believe what they believe and why it is important to them; (2) share the impact of what they hear from others in non-judgmental, responsible ways (e.g., “When you say X, I feel anxious because I am afraid Y will happen,” not “How can you possibly think that? That is so typical of you people.”); (3) set aside their own beliefs and positions for a time and focus on understanding those of others; and (4) check in mindfully on their internal atmosphere at regular intervals, i.e., notice whether they are getting emotionally and/or cognitively hijacked.
These four requests help people become aware of their biases and choose to honestly consider information and points of view that challenge their beliefs. In addition, avoid asking people to justify their beliefs—this provokes defensiveness. Instead, ask them to explain how their ideas would work in real life. This invites people to investigate their beliefs and feel safer while they do it.