What Is Cognitive Ease—and Why Should You Be Wary of It?
Everyone wants to be right and to feel certain about things. These are built-in biological drives, not character flaws. When we think we’re right and when we feel certain, we experience a sense of cognitive ease. The world makes sense to us. And that puts us in a good mood.
Cognitive ease feels good, but it gives us a false sense of security because it makes us think we understand far more than we actually do.
Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. —Daniel Kahneman, Thinking, Fast and Slow
Comfortably Numb
When it comes to taking in information and deciding what to believe and what not to believe, we are appallingly predictable. We are most likely to believe:
- What Is Familiar
Information that feels familiar is easier to absorb and believe than information that is unfamiliar. It could be familiar because it’s associated with other beliefs we have or because it comes from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect. Advertisers make use of it, but they aren’t the only ones.
- What Is Easy
Information that is easy to understand also gives us a sense of cognitive ease. Information that is difficult to understand requires more cognitive effort to process, and our brain’s preference is to take it easy. Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. Does that give you a sense of cognitive dis-ease?
- What Validates Our Preexisting Beliefs
Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe, or at least to hold it up to greater scrutiny. We have different standards for evaluating information depending on the level of cognitive ease it generates.
And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great. For example, it is easier to believe that What You See Is All There Is (WYSIATI), even after being confronted with evidence that you have missed something that was right in front of your face, than it is to believe that you are aware of only a tiny fraction of what is going on around you.
Cognitive Biases
We use cognitive biases as shortcuts to help us understand the world. We don’t have to use any critical thinking skills. No cognitive effort is required. We aren’t forced to reevaluate our existing beliefs. Because of our cognitive biases, we make snap judgments, form quick impressions or opinions, and operate on autopilot.
The bad news is that, since cognitive biases are by their nature distortions or errors in thinking, they actually decrease our understanding all the while giving us that feel-good sense of cognitive ease.
That’s just fine with the conscious part of our brain, which is slow and kind of lazy and doesn’t want to work if it doesn’t have to. It’s happy to let the unconscious handle as much of the load as possible. Because cognitive biases operate at the unconscious level, unless we make an effort to recognize them, we aren’t aware of them. We will even deny we have them.
To have a human brain is to be subject to cognitive biases. Some of the most common are:
- Confirmation Bias
The easy acceptance of information that validates what we already believe (as described in What Validates Our Preexisting Beliefs, above) is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs. Example: People who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves.
- The Halo Effect
The tendency to view other people as all good (or all bad) is the result of a cognitive bias called the halo effect. When we consider someone to be a good person, we find it easier to excuse or ignore behavior that is inconsistent with being a good person. Good people can do no wrong. On the other hand, if we consider someone to be a bad person, we find it hard to accept that he or she has any positive qualities. Bad people can do no good. In either case, we ignore evidence that contradicts our general impression of the person. The halo effect requires black and white thinking. Example: People tend to have a completely positive view of the political party they support and a completely negative view of the political party they don’t support.
- Negativity Bias
Our brains are wired to notice negative events more than positive events, so we give them more attention. This leads us to believe that more negative events are taking place than positive events. It also leads us to give more credence to negative claims about people with whom we disagree. Negativity bias is responsible for the fears we have about some things that are disproportionate to the actual likelihood of their occurring. Bad stuff seems to have more of an impact on us than good stuff, and we are quicker to react to it. This bias can make us susceptible to fear-mongering. Examples: (1) The news. (2) People tend to pay more attention—and give more weight—to critical comments than to praise.
- Impact Bias
We think we can predict how we will react to potential events, both good and bad, and reliably estimate the impact they will have on us. But in making such predictions, we routinely overestimate how good we will feel (and for how long) after a positive event and how bad we will feel (and for how long) after a negative event. Although we are extremely poor fortune tellers, that doesn’t stop us from being certain about how we will feel in the future. In reality, our excitement over something good will likely dim faster than we predict, and we are likely to rebound from a loss sooner than we predict. Example: People tend to believe a positive change, such as marriage, a new job, a bigger house, or winning the lottery, will make them feel better—and for a longer time—than it actually will.
- Hindsight Bias
In retrospect everything seems inevitable. The hindsight bias (“I knew it all along”) makes us think the world is more predictable than it actually is. After the fact, we selectively reconstruct events to make it appear the outcome was inevitable. In doing so, we also exaggerate how likely we considered the outcome to be before it occurred. If the outcome is a negative one, we think someone should have foreseen it and prevented it. Example: After 9/11, many people thought the attacks by al-Qaeda could have been prevented based on the available information. However, the information was not, at that time, as clear as it appeared to be in hindsight.
- Outcome Bias
The outcome bias leads us to evaluate a decision based on the eventual results or outcome of the decision rather than on the soundness or quality of the decision at the time it was made. If something works out, we think it was a great decision (genius, even), although the reasoning that led to it may have been flawed. Conversely, if something doesn’t work out, we think it was a bad decision, although the reasoning that led to it may have been entirely sound. Example: People tend to think that if something goes wrong during a low-risk surgical procedure, the decision to do the procedure was a bad one.
- Hidden (or Implicit) Bias
Hidden biases are attitudes or stereotypes we have, both favorable and unfavorable, particularly about other people in regard to race, gender, age, etc. We don’t all have the same hidden biases, but everyone has them. However, because they are hidden—primarily from ourselves—we are unaware of them, even though they affect our feelings, our behavior, and our reactions. Hidden biases may be at odds with our conscious attitudes and feelings. But some of our hidden biases may be apparent to others.
We can’t find out about hidden biases through introspection, although we may be able to learn something about them by observing ourselves. Harvard University has also developed an implicit association test, available online (https://implicit.harvard.edu/implicit/), so you can test yourself for your own hidden biases.
Hidden biases contribute to a sense of cognitive ease by tending to confirm that whatever groups we belong to (ethnic, racial, age, income, etc.) are the best groups because they have more positive characteristics than those other groups have.
Cognitive Distortions
Cognitive distortions are habitual ways of thinking that alter our perception. Many, although not all, cognitive distortions are negative. But even negative cognitive distortions contribute to a sense of cognitive ease just because they are habitual. If you are used to thinking about yourself in a negative way or seeing the world in a negative way, that will feel more comfortable than trying to see things in a different (more positive) way.
Cognitive distortions are common, and there are many different ones. However, not everyone is subject to them—or at least not to the same degree. A few common cognitive distortions are:
- Mindreading: believing you know what other people are thinking or what their motives are
- Overgeneralizing: drawing too broad a conclusion from a single event or piece of information or from limited information
- Catastrophizing: imagining worst case scenarios; exaggerating the likelihood of negative or disastrous outcomes
- All or Nothing Thinking (also called Black and White Thinking): thinking in extremes without allowing for complexity (shades of gray); believing that if something isn’t perfect or the best, it’s worthless
The Cognitive Ease Continuum
According to Daniel Kahneman, cognitive ease is both a cause and a consequence of a pleasant feeling. Cognitive ease makes us feel more favorable toward things that are familiar, easy to understand, and easy to see or read. We feel less favorable toward what is unfamiliar, difficult to understand, or difficult to see or read. We don’t even have to be consciously aware that something is familiar to us in order to feel good about it. The feel-good response comes from the unconscious part of our brain. It’s part of our hardwiring for survival. A good mood tells our brain everything is OK and we can let our guard down.
Being in a good mood is associated with intuition, creativity, gullibility, and increased reliance on the unconscious part of the brain. At the other end of the continuum are sadness, vigilance, suspicion, an analytic approach, and increased effort.
We can’t worry when we’re happy. But because we’re less vigilant when in a good mood, we’re more prone to making logical errors. We’re more susceptible to cognitive biases. We think we understand more than we do. We even think we’re thinking.
The Fruits of a Lesser Discontent
I don’t mean to imply that all great ideas or outcomes—or at least all of my ideas or outcomes—arise from states of discontent. Some have been the result of a logical progression of thought or activity. Others have come from Aha! moments when my unconscious connected some previously unconnected or unrecognized dots.
But just as a moment of deep existential discontent started me on the path of creating Farther to Go!, a moment of lesser discontent led to the creation of the What Do You Want? course. And weather played a role that time, too.
One overcast and unusually cool early fall day, I rebelled against immersing myself in the tasks I needed to complete. Imagine me mentally stamping my foot and scowling. This isn’t a particularly common occurrence, but it’s definitely more likely to happen on gray days than on sunny ones. In this instance, I decided to make myself a cup of coffee to generate some motivation or at least a small burst of energy.
While I was waiting for the water to boil, I asked myself, out of the blue, what I wanted to do instead of all the boring and tedious stuff. What did I really want to do? If I could do anything. And then it happened! I found myself answering a different question instead, an easier one: What do I want to do that’s practical?
By then I was familiar with the brain’s tendency to substitute an easier question for a hard one and to answer the easier question. But I had never before been aware of it as it happened, and I was kind of stunned. Why couldn’t I answer the original question? What made it too hard to answer? I should know what I want, right?
Well, maybe. Later that day, I decided to try to find out. I set myself the task of asking and answering the question “What do I really want?” every day for 30 days. Not just once, but multiple times, using 5×8 index cards. I ended up with nearly 500 answers, including several surprises. Obviously I hadn’t known everything I wanted.
Afterward, I put the individual items into general categories. That was even more illuminating. But the final step was what made the process priceless. I realized that all the items on my list fit under the umbrella of one or more of what I came to call Big Picture Wants. As I wrote out the words and phrases—in my case 12—of my own Big Picture Wants, I knew I was on to something huge. I had been able to identify everything I wanted to have in my life.
Now that I’ve done this, I can’t imagine not being clear about what those things are. How can I set goals, make decisions or choices, or work on habits and intentions without knowing how they fit into the bigger picture? How can anyone?
When discontent strikes, we can try to make it go away quickly, or we can use it as motivation to dig deeper and examine our assumptions. If I were given a choice between being discontent and being complacent, I’d choose being discontent every time.
The Gift of Existential Discontent
Spring in New Mexico brings longer, brighter days, but those days seem to be carried in on incessant, howling, nasty winds. Two years ago, I was out for a walk on one of those very windy spring days. It was so windy that each step I took was an effort, and effort seemed to accurately describe my entire existence at that point. Abruptly, I thought, If this is how it’s going to be, I’m not interested.
Unhappiness and dissatisfaction are associated with the brain triggering a release of cortisol. Cortisol makes us want to do something to change how we’re feeling. A low level of cortisol—indicating a low level of discontent—triggers us to do something we know will make us feel better. Immediately! Whether that response is eating something sweet, going for a run, or surfing the internet, it’s automatic. No conscious thought is involved.
Cortisol also makes us pay attention. But more than a little cortisol has to be released before we actually sit up and pay conscious attention to our discontent. Otherwise the stimulus-response of cortisol and self-soothing behavior just runs in the background—at least until we start to notice all the weight we’ve gained or the time we’ve lost.
The amount of existential discontent I experienced that day did not feel good at all. I definitely wanted to do something about it! But I knew there was no easy response or quick fix. I couldn’t just go home and lose myself in a good book or have a glass of wine or play with my cat and expect to forget about it.
If this is how it’s going to be, I’m not interested was the impulse—the inciting incident, you could say—that eventually launched Farther to Go! I didn’t just want to feel better; I wanted to be better. I had a variety of tools to work with, processes and techniques I’d used before, but I quickly recognized none would do the trick this time. So I began carving out a path, hacking through my own wilderness, to find a way to be better.
I was kind of excited about my discoveries (if you know me, feel free to laugh here) and shared them with anyone who would listen. After a few months I began getting together twice a month with several other women. The members of the group changed, and as a result of my ongoing explorations, so did our focus. It was a few months before I found my way to learning about how the brain works and the revelation that underlies Farther to Go!
Trying to understand and change behavior without taking the brain into account is like trying to bake a cake without understanding that baking involves chemical reactions.
Two years ago, I had a general idea of what cortisol was, and since I had been a substance abuse counselor, I knew a little about serotonin and dopamine. But I had no idea how fortunate I was on that windy spring day to experience enough existential discontent that the amount of cortisol my brain released made it impossible to ignore.
Challenging Conventional Wisdom
Remember that we treat ideas like possessions, and it will be hard for us to part with them. —Nassim Nicholas Taleb, The Black Swan
Many of our ideas are based on what could be called common sense or conventional wisdom. They just seem so obvious we never consider questioning them. Because they make sense to us, we operate as if they are factual. We don’t need to know if there’s any evidence to support them. But those kinds of ideas are actually beliefs: things we accept or trust to be true. And when it comes to beliefs, trust generally trumps the need for evidence.
Here are two recent examples where evidence doesn’t support the conventional wisdom. Both involve children and child-rearing attitudes.
The conventional wisdom is that parents’ involvement with their children’s schooling is advantageous to their children’s education. That just seems like common sense. But this belief had never actually been tested or measured until recently. And it turns out that the conventional wisdom is not all that wise.
Don’t Help Your Kids with Their Homework and other insights from a ground-breaking study of how parents impact children’s academic achievement: Parents can impact their kids’ academic success, but not by helping them with their homework, especially once the kids reach middle school.
Other conventional wisdom in regard to kids is that the world is a more dangerous place than it used to be, and the primary job of adults is to keep kids safe. This also seems obvious. But it’s also a belief that isn’t often examined. It turns out that the world may not be that much more dangerous than it used to be, and the zealous overprotection of kids may be doing them more harm than good.
The Overprotected Kid: A preoccupation with safety has stripped childhood of independence, risk taking, and discovery—without making it safer: Kids need to have time away from the watchful eyes of their parents or other adults, and they need to experience a feeling of being in danger in order to develop into competent adults.
There are many more examples of evidence not supporting the conventional wisdom in other areas, especially aging and behavior. In the two instances cited above, I think it’s interesting to consider how these beliefs may have been formed and how they became so widely accepted. It’s generally harder to find a middle ground when beliefs are involved because beliefs have such a strong emotional component.
And that’s another area in which common sense or conventional wisdom fails us. We think the level of confidence we have in a belief has some positive correlation with the accuracy of the belief. But it doesn’t. In fact, there’s probably little evidence to support many of our beliefs.
Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true. —Daniel Kahneman, Thinking, Fast and Slow
The bottom line is that our brain craves certainty, and beliefs provide us with a feeling of certainty. If we want to use our brain, however, we need to challenge some of our own deeply held beliefs instead of doing everything we can to shore them up. That’s easier said than done, of course.