It’s fascinating to explore the effects of cognitive biases on our behavior. Here’s a short video that explains the self-serving bias.
WHEN You Choose Can Impact WHAT You Choose
People tend to favor maintaining the status quo to such an extent that it’s a recognized cognitive bias, one of many systematic distortions of thinking we’re prone to. The status-quo bias goes hand-in-hand with the loss-aversion bias, which leads us to pay more attention to what we might lose than to what we might gain. The status quo often feels less risky, whether or not it actually is.
It stands to reason, then, that when faced with making a choice between an option that maintains the status quo (the default option) and an alternative option, we’d be more likely to choose the default option. And we are—but only if we make the choice immediately. If we delay making a choice that we could have made immediately, we’re much more likely to choose the alternative option.
There have been a number of studies over the past 25 years, all showing the same result: simply delaying a decision we could have made immediately decreases the likelihood we’ll choose the default option. It doesn’t matter what the options are or which option, if either, is the better choice. Delay itself casts doubt on the default option.
Failure to Choose
This isn’t hard to understand. If we could have made a choice immediately, then why didn’t we? The know-it-all interpreter—or explainer—inside our head has an answer for this, as it does for just about everything: obviously there’s some doubt as to the appeal of the default option. Otherwise, based on the status quo bias, we would have chosen it immediately.
It also turns out that being in a state of doubt about something that is completely unrelated to the choice at hand can have the same impact on our choice. Doubt, in general, influences us to choose the alternative option rather than the default option.
Delay and doubt are factors we should take into consideration when we’re faced with making a choice between a default option and an alternative option. The conventional wisdom is that taking time to make a choice leads to making better choices. That seems reasonable, but it isn’t entirely accurate. Yes, delay has an effect; it’s just not the effect we may have attributed to it.
If we’re aware that delay tends to make the default option seem less appealing, we can factor that in when choosing when to choose. We can mitigate some of the effect of delaying choice just by knowing the effect is there.
Thinking, Fast and Slow (animated)
What Is Cognitive Ease—and Why Should You Be Wary of It?
Everyone wants to be right and to feel certain about things. These are built-in biological drives, not character flaws. When we think we’re right and when we feel certain, we experience a sense of cognitive ease. The world makes sense to us. And that puts us in a good mood.
Cognitive ease feels good, but it gives us a false sense of security because it makes us think we understand far more than we actually do.
Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. —Daniel Kahneman, Thinking, Fast and Slow
Comfortably Numb
When it comes to taking in information and deciding what to believe and what not to believe, we are appallingly predictable. We are most likely to believe:
- What Is Familiar
Information that feels familiar is easier to absorb and believe than information that is unfamiliar. It could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.
- What Is Easy
Information that is easy to understand also gives us a sense of cognitive ease. Information that is difficult to understand requires more cognitive effort to process, and our brain’s preference is to take it easy. Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. Does that give you a sense of cognitive dis-ease?
- What Validates Our Preexisting Beliefs
Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe, or at least to hold it up to greater scrutiny. We have different standards for evaluating information depending on the level of cognitive ease it generates.
And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great. For example, it is easier to believe that What You See Is All There Is (WYSIATI), even after being confronted with evidence that you have missed something that was right in front of your face, than it is to believe that you are aware of only a tiny fraction of what is going on around you.
Cognitive Biases
We use cognitive biases as shortcuts to help us understand the world. We don’t have to use any critical thinking skills. No cognitive effort is required. We aren’t forced to reevaluate our existing beliefs. Because of our cognitive biases, we make snap judgments, form quick impressions or opinions, and operate on autopilot.
The bad news is that, since cognitive biases are by their nature distortions or errors in thinking, they actually decrease our understanding, all the while giving us that feel-good sense of cognitive ease.
That’s just fine with the conscious part of our brain, which is slow and kind of lazy and doesn’t want to work if it doesn’t have to. It’s happy to let the unconscious handle as much of the load as possible. Because cognitive biases operate at the unconscious level, unless we make an effort to recognize them, we aren’t aware of them. We will even deny we have them.
To have a human brain is to be subject to cognitive biases. Some of the most common are:
- Confirmation Bias
The easy acceptance of information that validates what we already believe (as described in What Validates Our Preexisting Beliefs, above) is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs. Example: People who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves.
- The Halo Effect
The tendency to view other people as all good (or all bad) is the result of a cognitive bias called the halo effect. When we consider someone to be a good person, we find it easier to excuse or ignore behavior that is inconsistent with being a good person. Good people can do no wrong. On the other hand, if we consider someone to be a bad person, we find it hard to accept that he or she has any positive qualities. Bad people can do no good. In either case, we ignore evidence that contradicts our general impression of the person. The halo effect requires black and white thinking. Example: People tend to have a completely positive view of the political party they support and a completely negative view of the political party they don’t support.
- Negativity Bias
Our brains are wired to notice negative events more than positive events, so we give them more attention. This leads us to believe that more negative events are taking place than positive events. It also leads us to give more credence to negative claims about people with whom we disagree. Negativity bias is responsible for the fears we have about some things that are disproportionate to the actual likelihood of their occurring. Bad stuff seems to have more of an impact on us than good stuff, and we are quicker to react to it. This bias can make us susceptible to fear-mongering. Examples: (1) The news. (2) People tend to pay more attention—and give more weight—to critical comments than to praise.
- Impact Bias
We think we can predict how we will react to potential events, both good and bad, and reliably estimate the impact they will have on us. But in making such predictions, we routinely overestimate how good we will feel (and for how long) after a positive event and how bad we will feel (and for how long) after a negative event. Although we are extremely poor fortune tellers, that doesn’t stop us from being certain about how we will feel in the future. In reality, our excitement over something good will likely dim faster than we predict, and we are likely to rebound from a loss sooner than we predict. Example: People tend to believe a positive change, such as marriage, a new job, a bigger house, or winning the lottery, will make them feel better—and for a longer time—than it actually will.
- Hindsight Bias
In retrospect everything seems inevitable. The hindsight bias (“I knew it all along”) makes us think the world is more predictable than it actually is. After the fact, we selectively reconstruct events to make it appear the outcome was inevitable. In doing so, we also exaggerate how likely we considered the outcome to be before it occurred. If the outcome is a negative one, we think someone should have foreseen it and prevented it. Example: After 9/11, many people thought the attacks by al-Qaeda could have been prevented based on the available information. However, the information was not, at that time, as clear as it appeared to be in hindsight.
- Outcome Bias
The outcome bias leads us to evaluate a decision based on the eventual results or outcome of the decision rather than on the soundness or quality of the decision at the time it was made. If something works out, we think it was a great decision (genius, even), although the reasoning that led to it may have been flawed. Conversely, if something doesn’t work out, we think it was a bad decision, although the reasoning that led to it may have been entirely sound. When outcomes are good, we think the decisions were good; when outcomes are bad, we think the decisions were bad. Example: People tend to think that if something goes wrong during a low-risk surgical procedure, the decision to do the procedure was a bad one.
- Hidden (or Implicit) Bias
Hidden Biases are attitudes or stereotypes we have, both favorable and unfavorable, particularly about other people in regard to race, gender, age, etc. We don’t all have the same hidden biases, but everyone has them. However, because they are hidden—primarily from ourselves—we are unaware of them, even though they affect our feelings, our behavior, and our reactions. Hidden biases may be at odds with our conscious attitudes and feelings. But some of our hidden biases may be apparent to others.
We can’t find out about hidden biases through introspection, though we may be able to learn something about them by observing ourselves. Also, Harvard University has developed an Implicit Association Test that is available online (https://implicit.harvard.edu/implicit/), so you can test yourself for your own hidden biases.
Hidden biases contribute to a sense of cognitive ease by tending to confirm that whatever groups we belong to (ethnic, racial, age, income, etc.) are the best groups because they have more positive characteristics than those other groups have.
Cognitive Distortions
Cognitive distortions are habitual ways of thinking that alter our perception. Many, although not all, cognitive distortions are negative. But even negative cognitive distortions contribute to a sense of cognitive ease just because they are habitual. If you are used to thinking about yourself in a negative way or seeing the world in a negative way, that will feel more comfortable than trying to see things in a different (more positive) way.
Cognitive distortions are not uncommon, and there are many different ones. However, not everyone is subject to them—or at least not to the same degree. A few of the more common ones are:
- Mindreading: believing you know what other people are thinking or what their motives are
- Overgeneralizing: drawing too broad a conclusion from a single event or piece of information or from limited information
- Catastrophizing: imagining worst case scenarios; exaggerating the likelihood of negative or disastrous outcomes
- All or Nothing Thinking (also called Black and White Thinking): thinking in extremes without allowing for complexity (shades of gray); believing that if something isn’t perfect or the best, it’s worthless
The Cognitive Ease Continuum
According to Daniel Kahneman, cognitive ease is both a cause and a consequence of a pleasant feeling. Cognitive ease makes us feel more favorable toward things that are familiar, easy to understand, and easy to see or read. We feel less favorable toward what is unfamiliar, difficult to understand, or difficult to see or read. We don’t even have to be consciously aware that something is familiar to us in order to feel good about it. The feel-good response comes from the unconscious part of our brain. It’s part of our hardwiring for survival. A good mood tells our brain everything is OK and we can let our guard down.
Being in a good mood is associated with intuition, creativity, gullibility, and increased reliance on the unconscious part of the brain. At the other end of the continuum are sadness, vigilance, suspicion, an analytic approach, and increased effort.
We can’t worry when we’re happy. But because we’re less vigilant when in a good mood, we’re more prone to making logical errors. We’re more susceptible to cognitive biases. We think we understand more than we do. We even think we’re thinking.
The Gift of Existential Discontent
Spring in New Mexico brings longer, brighter days, but those days seem to be carried in on incessant, howling, nasty winds. Two years ago, I was out for a walk on one of those very windy spring days. It was so windy that each step I took was an effort, and effort seemed to accurately describe my entire existence at that point. Abruptly, I thought, If this is how it’s going to be, I’m not interested.
Unhappiness and dissatisfaction are associated with a release of cortisol by the brain. Cortisol makes us want to do something to change how we’re feeling. A low level of cortisol—indicating a low level of discontent—triggers us to do something we know will make us feel better. Immediately! Whether that response is eating something sweet, going for a run, or surfing the internet, it’s automatic. No conscious thought is involved.
Cortisol also makes us pay attention. But more than a little cortisol has to be released before we actually sit up and pay conscious attention to our discontent. Otherwise the stimulus-response of cortisol and self-soothing behavior just runs in the background—at least until we start to notice all the weight we’ve gained or the time we’ve lost.
The amount of existential discontent I experienced that day did not feel good at all. I definitely wanted to do something about it! But I knew there was no easy response or quick fix. I couldn’t just go home and lose myself in a good book or have a glass of wine or play with my cat and expect to forget about it.
If this is how it’s going to be, I’m not interested was the impulse—the inciting incident, you could say—that eventually launched Farther to Go! I didn’t just want to feel better; I wanted to be better. I had a variety of tools to work with, processes and techniques I’d used before, but I quickly recognized none would do the trick this time. So I began carving out a path, hacking through my own wilderness, to find a way to be better.
I was kind of excited about my discoveries (if you know me, feel free to laugh here) and shared them with anyone who would listen. After a few months I began getting together twice a month with several other women. The members of the group changed, and as a result of my ongoing explorations, so did our focus. It was a few months before I found my way to learning about how the brain works and the revelation that underlies Farther to Go!
Trying to understand and change behavior without taking the brain into account is like trying to bake a cake without understanding that baking involves chemical reactions.
Two years ago, I had a general idea of what cortisol was, and since I had been a substance abuse counselor, I knew a little about serotonin and dopamine. But I had no idea how fortunate I was on that windy spring day to experience enough existential discontent that the amount of cortisol my brain released made it impossible to ignore.