It’s fascinating to explore the effects of cognitive biases on our behavior. Here’s a short video that explains the self-serving bias.
WHEN You Choose Can Impact WHAT You Choose
People tend to favor maintaining the status quo to such an extent that it’s a recognized cognitive bias, one of many systematic distortions of thinking we’re prone to. The status-quo bias goes hand-in-hand with the loss-aversion bias, which leads us to pay more attention to what we might lose than to what we might gain. The status quo often feels less risky, whether or not it actually is.
It stands to reason, then, that when faced with making a choice between an option that maintains the status quo (the default option) and an alternative option, we’d be more likely to choose the default option. And we are—but only if we make the choice immediately. If we delay making a choice that we could have made immediately, we’re much more likely to choose the alternative option.
There have been a number of studies over the past 25 years, all of which show the same result. Simply delaying a decision we could have made immediately decreases the likelihood that we’ll choose the default option. It doesn’t matter what the options are or which option, if either, is the better choice. Delay itself casts doubt on the default option.
Failure to Choose
This isn’t hard to understand. If we could have made a choice immediately, then why didn’t we? The know-it-all interpreter—or explainer—inside our head has an answer for this, as it does for just about everything: obviously there’s some doubt as to the appeal of the default option. Otherwise, based on the status quo bias, we would have chosen it immediately.
It also turns out that being in a state of doubt about something that is completely unrelated to the choice at hand can have the same impact on our choice. Doubt, in general, influences us to choose the alternative option rather than the default option.
Delay and doubt are factors we should take into consideration when we’re faced with making a choice between a default option and an alternative option. The conventional wisdom is that taking time to make a choice leads to making better choices. That seems reasonable, but it isn’t entirely accurate. Yes, delay has an effect; it’s just not the effect we may have attributed to it.
If we’re aware that delay tends to make the default option seem less appealing, we can factor that in when choosing when to choose. We can mitigate some of the effect of delaying choice just by knowing the effect is there.
When the Going Gets Grueling
No one struggles to get through the good times or looks for strategies to cope with them. But the tough or unpleasant times are different. The attitudes or strategies we use when things are going great don’t necessarily work—or work the same—when things are not so great. What does it actually take to get through those difficult days or weeks or months?
I’m someone who is 100% responsible for every single aspect and task in my life, as are many other people. I’m also someone who operates a business on my own and is 100% responsible for every single aspect and task of the business, as are more and more other people these days. The number of things to do and things to keep track of when you’re 100% responsible for everything doesn’t just feel overwhelming at times, it is overwhelming. All the time. I sometimes wonder if people like me—and there are many of us—have some kind of a glutton-for-punishment gene.
While many of the things I do are stimulating and satisfying, there are plenty of other things that are some combination of boring, difficult, and exhausting. I’m sure this is true for everyone, whether you’re running a business solo or simply living your life that way.
While I don’t always think I get through the difficult times as well as I could, I generally do get through them. Recently, I finished 10+ days focused entirely on organizing course materials and office systems (well, that and all the things I need to do to keep the rest of my life running). I desperately want a clutter-free office, but I also desperately dislike putting time and attention into this kind of stuff. I realize this isn’t equivalent to putting in hard labor, but still, dislike is putting it mildly.
It had to be done, though. Investing the time and energy now in something I don’t like doing will make it possible for me to spend more time down the road doing what I do like doing. But it was pretty grueling. I forced myself to find a place for every single piece of paper or index card or else toss it out. I updated and printed copies of all the course materials that have been finalized. I made sure the systems I set up worked, and if they didn’t, I tweaked them until they did.
In order to hunker down and finish this project, I gave up going to the gym for a week. I didn’t do any writing of any kind or any research. All of my conscious (System 2) attention went to dealing with these organizational details; nothing was left over for anything else. And, of course, none of the many other things on my to-do list, all of which are equally important, got done. By the end of each day, I was stiff, tired, and out of sorts.
But I saw the project through to the end. It was well worth the diversion of time and effort, the sacrifice of small pleasures, and the multiple calluses on my mouse hand. There’s no way I could have accomplished this by doing a little bit here and there. Once I freed up a few brain cells, I started thinking about the question in the first paragraph: What attributes or characteristics make it easier to stick with something unpleasant or difficult long enough to achieve a degree of success?
This is what I came up with:
- Fortitude (Don’t Leave Home Without It)
- Focus (Keep Your Eyes on the Prize)
- Patience (Learn to Play the Waiting Game)
- Embrace Uncertainty
- Know When to Get Assistance
- Reduce the Clutter in Your Life
I think anyone—no matter what their individual circumstances—would benefit from having or developing these attributes. So I’ll be writing a post on each one—including how our brain can help or hinder us—on consecutive Mondays beginning next week.
As I said, my organizational stint doesn’t qualify as hard labor. It was sort of like putting myself in the penalty box for a period and having to sit out a portion of the game.
What kind of work or tasks do you have that (when you’re doing them) make you feel like you’re in the penalty box for a period?
Thinking, Fast and Slow (animated)
What Is Cognitive Ease—and Why Should You Be Wary of It?
Everyone wants to be right and to feel certain about things. These are built-in biological drives, not character flaws. When we think we’re right and when we feel certain, we experience a sense of cognitive ease. The world makes sense to us. And that puts us in a good mood.
Cognitive ease feels good, but it gives us a false sense of security because it makes us think we understand far more than we actually do.
Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. —Daniel Kahneman, Thinking, Fast and Slow
Comfortably Numb
When it comes to taking in information and deciding what to believe and what not to believe, we are appallingly predictable. We are most likely to believe:
- What Is Familiar
Information that feels familiar is easier to absorb and believe than information that is unfamiliar. It could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.
- What Is Easy
Information that is easy to understand also gives us a sense of cognitive ease. Information that is difficult to understand requires more cognitive effort to process, and our brain’s preference is to take it easy. Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. You’re much more likely to accept Idea A over Idea B simply because Idea A is easier for you to swallow. Does that give you a sense of cognitive dis-ease?
- What Validates Our Preexisting Beliefs
Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe, or at least to hold it up to greater scrutiny. We have different standards for evaluating information depending on the level of cognitive ease it generates.
Evidence that conflicts with what we believe has precious little impact on us, simply because the cognitive strain of processing it is too great. For example, it is easier to believe that What You See Is All There Is (WYSIATI), even after being confronted with evidence that you have missed something that was right in front of your face, than it is to believe that you are aware of only a tiny fraction of what is going on around you.
Cognitive Biases
We use cognitive biases as shortcuts to help us understand the world. We don’t have to use any critical thinking skills. No cognitive effort is required. We aren’t forced to reevaluate our existing beliefs. Because of our cognitive biases, we make snap judgments, form quick impressions or opinions, and operate on autopilot.
The bad news is that, since cognitive biases are by their nature distortions or errors in thinking, they actually decrease our understanding, all while giving us that feel-good sense of cognitive ease.
That’s just fine with the conscious part of our brain, which is slow and kind of lazy and doesn’t want to work if it doesn’t have to. It’s happy to let the unconscious handle as much of the load as possible. Because cognitive biases operate at the unconscious level, unless we make an effort to recognize them, we aren’t aware of them. We will even deny we have them.
To have a human brain is to be subject to cognitive biases. Some of the most common are:
- Confirmation Bias
The easy acceptance of information that validates what we already believe (as described in What Validates Our Preexisting Beliefs, above) is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs. Example: People who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves.
- The Halo Effect
The tendency to view other people as all good (or all bad) is the result of a cognitive bias called the halo effect. When we consider someone to be a good person, we find it easier to excuse or ignore behavior that is inconsistent with being a good person. Good people can do no wrong. On the other hand, if we consider someone to be a bad person, we find it hard to accept that he or she has any positive qualities. Bad people can do no good. In either case, we ignore evidence that contradicts our general impression of the person. The halo effect requires black and white thinking. Example: People tend to have a completely positive view of the political party they support and a completely negative view of the political party they don’t support.
- Negativity Bias
Our brains are wired to notice negative events more than positive events, so we give them more attention. This leads us to believe that more negative events are taking place than positive events. It also leads us to give more credence to negative claims about people with whom we disagree. Negativity bias is responsible for the fears we have about some things that are disproportionate to the actual likelihood of their occurring. Bad stuff seems to have more of an impact on us than good stuff, and we are quicker to react to it. This bias can make us susceptible to fear-mongering. Examples: (1) The news. (2) People tend to pay more attention—and give more weight—to critical comments than to praise.
- Impact Bias
We think we can predict how we will react to potential events, both good and bad, and reliably estimate the impact they will have on us. But in making such predictions, we routinely overestimate how good we will feel (and for how long) after a positive event and how bad we will feel (and for how long) after a negative event. Although we are extremely poor fortune tellers, that doesn’t stop us from being certain about how we will feel in the future. In reality, our excitement over something good will likely dim faster than we predict, and we are likely to rebound from a loss sooner than we predict. Example: People tend to believe a positive change, such as marriage, a new job, a bigger house, or winning the lottery, will make them feel better—and for a longer time—than it actually will.
- Hindsight Bias
In retrospect everything seems inevitable. The hindsight bias (“I knew it all along”) makes us think the world is more predictable than it actually is. After the fact, we selectively reconstruct events to make it appear the outcome was inevitable. In doing so, we also exaggerate how likely we considered the outcome to be before it occurred. If the outcome is a negative one, we think someone should have foreseen it and prevented it. Example: After 9/11, many people thought the attacks by al-Qaeda could have been prevented based on the available information. However, the information was not, at that time, as clear as it appeared to be in hindsight.
- Outcome Bias
The outcome bias leads us to evaluate a decision based on the eventual results or outcome of the decision rather than on the soundness or quality of the decision at the time it was made. If something works out, we think it was a great decision (genius, even), although the reasoning that led to it may have been flawed. Conversely, if something doesn’t work out, we think it was a bad decision, although the reasoning that led to it may have been entirely sound. When outcomes are good, we think the decisions were good; when outcomes are bad, we think the decisions were bad. Example: People tend to think that if something goes wrong during a low-risk surgical procedure, the decision to do the procedure was a bad one.
- Hidden (or Implicit) Bias
Hidden biases are attitudes or stereotypes we have, both favorable and unfavorable, particularly about other people in regard to race, gender, age, etc. We don’t all have the same hidden biases, but everyone has them. However, because they are hidden—primarily from ourselves—we are unaware of them, even though they affect our feelings, our behavior, and our reactions. Hidden biases may be at odds with our conscious attitudes and feelings. But some of our hidden biases may be apparent to others.
We can’t find out about hidden biases through introspection, though we may be able to learn something about them by observing ourselves. Harvard University has also developed an Implicit Association Test, available online (https://implicit.harvard.edu/implicit/), so you can test yourself for your own hidden biases.
Hidden biases contribute to a sense of cognitive ease by tending to confirm that whatever groups we belong to (ethnic, racial, age, income, etc.) are the best groups because they have more positive characteristics than those other groups have.
Cognitive Distortions
Cognitive distortions are habitual ways of thinking that alter our perception. Many, although not all, cognitive distortions are negative. But even negative cognitive distortions contribute to a sense of cognitive ease just because they are habitual. If you are used to thinking about yourself in a negative way or seeing the world in a negative way, that will feel more comfortable than trying to see things in a different (more positive) way.
Cognitive distortions are not uncommon, and there are a lot of different ones. However, not everyone is subject to them—or at least not to the same degree. A few common cognitive distortions are:
- Mindreading: believing you know what other people are thinking or what their motives are
- Overgeneralizing: drawing too broad a conclusion from a single event or piece of information or from limited information
- Catastrophizing: imagining worst case scenarios; exaggerating the likelihood of negative or disastrous outcomes
- All or Nothing Thinking (also called Black and White Thinking): thinking in extremes without allowing for complexity (shades of gray); believing that if something isn’t perfect or the best, it’s worthless
The Cognitive Ease Continuum
According to Daniel Kahneman, cognitive ease is both a cause and a consequence of a pleasant feeling. Cognitive ease makes us feel more favorable toward things that are familiar, easy to understand, and easy to see or read. We feel less favorable toward what is unfamiliar, difficult to understand, or difficult to see or read. We don’t even have to be consciously aware that something is familiar to us in order to feel good about it. The feel-good response comes from the unconscious part of our brain. It’s part of our hardwiring for survival. A good mood tells our brain everything is OK and we can let our guard down.
Being in a good mood is associated with intuition, creativity, gullibility, and increased reliance on the unconscious part of the brain. At the other end of the continuum are sadness, vigilance, suspicion, an analytic approach, and increased effort.
We can’t worry when we’re happy. But because we’re less vigilant when in a good mood, we’re more prone to making logical errors. We’re more susceptible to cognitive biases. We think we understand more than we do. We even think we’re thinking.