We are all biased. To make matters worse, we are largely unaware of our biases and of the ways our prior attitudes and beliefs shape our reasoning. The political psychologists Charles Taber and Milton Lodge call this motivated reasoning: our thinking is guided by our purposes or goals, and we try to confirm what we already believe. In other words, we are highly motivated to defend our existing beliefs. These processes of selective attention, exposure, and judgment powerfully influence how and what we think.
Several recent books also highlight the range of biases that often guide our thinking. Daniel Kahneman’s book, Thinking, Fast and Slow, demonstrates how we typically rely on our fast thinking, which is more intuitive and emotional and results in snap judgments based on stereotypes and biases. This is true even for the most expert thinkers among us. As Kahneman notes, we tend to see patterns and causation in randomness; we “see the world as more tidy, simple, predictable, and coherent than it really is” (p. 204). For Kahneman, we do not pay enough attention to the reliability and accuracy of information and so end up with a much simpler view of the world than the data justify.
The psychologist Jonathan Haidt, in his book The Righteous Mind, makes a similar case. Haidt argues that our intuitions come first and strategic reasoning comes later. According to Haidt, intuition and “groupish righteousness” tend to drive our reasoning and make us blind to our own biases. The political polarization we see in American culture exemplifies this dynamic.
Michael Shermer’s 2011 book, The Believing Brain, also highlights the different ways we form beliefs and then rationalize, justify, and defend them by seeking confirmatory facts and explanations that support them. Among other biases, Shermer outlines the following powerful tendencies in our thinking:
- Patternicity: the tendency to find meaningful patterns in even meaningless data;
- Agenticity: the tendency to infuse random occurrences or complex multi-causal events with human intention and agency;
- Confirmation bias: the tendency to seek and find confirmatory evidence to support already existing beliefs;
- Hindsight bias: the tendency to make the past fit with our present knowledge.
We might add to this list disconfirmation bias, which means we are quick to criticize, denigrate, and reject ideas that run counter to our existing beliefs.
As these authors note, we have a low tolerance for ambiguity; skepticism and critical thinking are difficult work. The slower, more deliberative, and careful thinking that might check or counter bias is hard for most of us.
However, the political scientist James Druckman offers some hope for good thinking. He draws on the work of Houston and Fazio (1989) to note that bias can be countered when people are directed to focus on the nature of the judgmental process. This work suggests that motivated reasoning and bias may be reduced when people reflect on their reasoning processes. Druckman argues that the motivation to be accurate can help people process information more consciously and in more even-handed ways. Writing about the role of unbiased and critical thinking in democracies, Druckman argues that “citizens should aim to process information consciously and to consider multiple perspectives… In sum, the motivation to be accurate serves as a realistic and flexible standard by which one can evaluate democratic competence.”
As educators, it is our responsibility to motivate and guide students to be accurate in their reasoning. We can do so by providing them with clear standards for accurate thinking and by teaching them to use processes of careful and critical reasoning through explicit instruction, modeling, and the use of key questions and procedures to guide their thinking. We also need to help students become more aware of their biases and tendencies by teaching reflective thinking processes.
In our work, for example, we have proposed reader reflexivity questions to help readers focus on the beliefs, biases, values, and emotions they may bring to particular texts. These guiding questions include:
- What prior knowledge, personal experiences, and other texts help me make sense of this text?
- What affects the way I read this text (e.g., prior experiences and learning; my values, opinions, emotions; my background and culture)?
- What additional thoughts or questions do I have about the text? What additional information is necessary to help me understand the text?
- How might people from different backgrounds and with different experiences read this text (e.g., from different ethnic, cultural, national, age, gender, political perspectives)?
As students get into the practice of asking these questions, they may become more aware of their own thinking and of the ways their beliefs and biases shape how they engage with different texts. These questions ask students to slow down and check their thinking, to become more aware of how their own attitudes and beliefs might affect the ways they read and think about new information. Undoubtedly, it will take a great deal of guidance and practice to help students become more reflective and careful thinkers.