Much of what passes for thinking consists of unconscious, not conscious, mental processes. When it comes to taking in information and deciding what to believe and what not to believe, for example, we are appallingly predictable. We are most likely to believe:
What Is Familiar
Information that feels familiar is easier to absorb and believe than information that is unfamiliar. The information could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This phenomenon is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.
Even if we’re aware of the mere-exposure effect, we probably think we’re immune to it because we’re more sophisticated than that. Believing we’re immune to it, however, might make us even more susceptible to it than we would be if we simply recognized it.
What Is Easy
Information that is easy to understand gives us a sense of cognitive ease. Information that is difficult to understand requires greater cognitive effort to process. Our brain prefers to chill out, so it just says “no” to exerting additional cognitive effort.
Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A over Idea B simply because Idea A is easier for you to swallow. This is especially likely when you are already experiencing some degree of cognitive strain or when your conscious (System 2) attention is depleted. You’ve undoubtedly had the experience of feeling “brain dead” after a mentally fatiguing effort. That’s when you’re most susceptible to believing what is easy.
What Validates Our Preexisting Beliefs
Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe. At the very least, we hold inconsistent information up to greater scrutiny. So we have different standards for evaluating information based on the level of cognitive ease it generates. And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great.
The easy acceptance of information that validates what we already believe is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. For example, people who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs [see What Is Familiar and What Is Easy, above].
It’s easy to believe what’s familiar, what’s easy to grasp, and what validates our preexisting beliefs. No critical thinking or cognitive effort is required. Actual thinking, on the other hand, as Dan Ariely says, is difficult and sometimes unpleasant.