Bats, Balls, and Biases

Critical thinking is the ability to think clearly, rationally, and objectively and to understand the logical connection between ideas. It’s an active rather than a passive process. Because it requires System 2 (conscious) attention, it doesn’t come naturally to us and it isn’t easy.

In some instances, we equate difficult with boring. In fact, after reading the short paragraph above, you may already be bored. Critical thinking? Who cares and why bother?

Well, for one thing, it’s possible that improving your critical thinking skills might help you become a better person. But more importantly, it might help you get more of what you want and less of what you don’t want. That’s because good critical thinking skills are essential if you want to master the art and science of change. And unless you master the art and science of change, you’ll continue to be stuck with whatever the status quo happens to be—or to become.

On the BIAS

We all view what happens in the world and what happens to us through our own individual perspectives (our mental models). That means we are all biased.

Here’s an easy way to remember bias:

Beliefs and Values
Interpretations
Assumptions
Stereotypes

Beliefs are ideas or principles we have come to accept as true.
Values are our personal principles or standards.
Interpretations are our explanations or understandings.
Assumptions are suppositions: what we take for granted or assume.
Stereotypes are generalizations and oversimplifications.

All of these elements operate in the background (System 1) so we aren’t usually consciously aware of them. Being biased is the normal state of affairs. We don’t have to make an effort to be biased. We have to make an effort to become aware of our biases so we have a fighting chance to act in our own best interest rather than automatically.

One of the most fascinating aspects of the human condition is that we think of the conscious part of the brain (System 2) as “I.” Yet it’s the biased unconscious part of the brain (System 1) that usually runs us. It takes no time or effort to come up with a System 1 reaction or response to a situation, question, or event because System 1 is fast, vast, and always on.

As Daniel Kahneman says in Thinking, Fast and Slow:

Everybody recognizes the difference between thoughts that come to mind automatically and thoughts that you need to produce. That is the distinction.

System 1 has an answer for everything. And its answers are correct often enough to lull us into accepting them unconditionally most of the time. But you’re not going to get change from System 1; you’re going to get same old/same old.

In addition to understanding our own biases, we also need to develop the capacity to know when it’s OK to go along with System 1’s response and when it isn’t. Well-developed critical thinking skills can help us make important decisions and solve significant problems by allowing us to effectively evaluate both the information at hand and the “intuitive” suggestions spontaneously arising from System 1.

Do I need an umbrella?

If you look outside and observe rain falling, you could safely jump to the conclusion that you need to take an umbrella with you when you go outside. You would not increase your chances of making the best decision by checking the weather report on your smartphone (getting more information) or analyzing your interpretation that rain falling means you’re likely to get wet if you go out in it.

How much does the ball cost?

On the other hand, you may not want to count on the first response that comes to mind as an answer to the following question:

A bat and a ball cost $1.10.
The bat costs one dollar more than the ball.
How much does the ball cost?

If you jump to the conclusion that the ball costs 10 cents, you would be wrong—no matter how confident you might feel about your conclusion.

That’s because if the bat costs one dollar more than the ball and the ball costs 10 cents, the bat would cost $1.10 for a total of $1.20. So the correct answer is that the ball costs 5 cents and the bat costs $1.05 for a total of $1.10.
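The arithmetic above can be double-checked with a few lines of Python. (This is just an illustrative sketch, not part of the original puzzle.)

```python
# The puzzle: bat + ball = 1.10, and bat = ball + 1.00
# Substituting the second equation into the first:
#   (ball + 1.00) + ball = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05
total = 1.10       # combined cost of bat and ball
difference = 1.00  # the bat costs this much more than the ball

ball = (total - difference) / 2
bat = ball + difference

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

Notice that the intuitive answer (10 cents) fails the check: a 10-cent ball plus a $1.10 bat adds up to $1.20, not $1.10.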

Did you do the math, so to speak, or did you jump to the quick—and erroneous—conclusion? If you jumped to the wrong conclusion, how confident did you feel about your answer? And does it make you feel any better to know that between 50% and 80% of college students also come up with the wrong answer?

Civil Discourse, Critical Thinking…and Facebook


I’m going to tell you about my abortion, but first some background. Facebook has only been accessible to the general public for about 10 years, but it’s hard to remember life before it—probably impossible for those of a certain age. As a social networking site, it started out as a place to share personal information and photos and to meet like-minded others. It developed a reputation for focusing excessively on “what people had for lunch,” which is how those who disdain it still think of it.

Honestly, I can’t remember why I joined Facebook, and there have been periods when I haven’t paid much attention to it. It’s a curious phenomenon. At my Monthly Meeting of the Mind (& Brain) last month, I asked everyone to share the first word or phrase that came to mind when I said “Facebook.” The responses varied; mine was information. That’s primarily why I use it now, and why I would be loath to give it up. So many educational, scientific, and just plain thought-provoking sources update their Facebook feeds on a regular basis that it allows me to keep current without having to spend hours going to individual websites or searching the internet for what I might not even know is available.

But more and more Facebook is also becoming a place for us to let everyone know where we stand in matters political, social, religious, moral, dietary, and in regard to the age-old question: which makes a better pet, a cat or a dog? I guess this is only natural, a logical outcome of the sharing we do of our favorite movies, the books we’ve read, the sports teams we follow, and the posts we “like.”

The cat vs. dog argument rarely gets ugly. The same can’t be said for our stances in those other, more highly charged, areas. That’s because we don’t simply want to let others know our position. Kind of like chest-beating apes, we want to proclaim our superiority. We want to demonstrate how right we are and how wrong those who disagree with us are. As a result, many such posts amount to a whole lot of signifying, righteous indignation, and extreme disdain for those on the other side. Because if we’re on one side, there has to be another side. And if we’re right, those others have to be wrong.

This cognitive bias is known as black and white thinking. It’s a simplistic way of viewing an issue that doesn’t allow for shades of gray. Imagine two groups of people shouting at each other across a vast chasm. Neither group is listening to the other; no difference is being made. But everyone has a sense of satisfaction as a result of expressing their opinion.

The problem is that although no practical difference is being made, this state of affairs is not innocuous. If one person proclaims his or her position and implies that those on the other side are misinformed, mentally challenged, or flat-out evil, you can bet those others are going to react. It’s like throwing a metaphorical hand grenade into a crowd. Most likely, those on the receiving end are going to respond as if they’re being threatened. How do you react when you’re being threatened? Do you stop to evaluate the merits of your aggressor’s point of view?

On the Other Hand…

I have this crazy idea that Facebook could be a possibility for civil discourse between people of opposing views, so every once in a while I attempt to engage with someone who clearly doesn’t see things the way I do.

One of my friends shared a recent meme suggesting that men who want to purchase guns should be required to go through the same hoops women seeking abortions have to go through. One of her Facebook friends commented that most people who buy guns never kill anything, but every woman who has an abortion kills a human being.

I had an abortion many years ago, and this woman’s assertion hit me hard. After taking a deep breath, I decided to respond. I replied that what she’d said was a generalization that wasn’t true. Her response was that regardless of my opinion, abortion was MURDER (caps hers). She also indicated she had children, who had been “valuable human beings from the moment of conception.” At that point I realized it was the ideology talking, so it was futile to pursue a dialogue. I told her she was fortunate to have been able to conceive and bear children, which wasn’t the case for some of the rest of us. And I ended the interaction.

But it was painful to have this woman who knows nothing about me or my experience make unwarranted assumptions about me and obliquely, at least, cast me in the role of a murderer. Every woman who has an abortion kills a human being. Based purely on the fact that I’d disagreed with her, she determined I was on the opposite side, and therefore in favor of abortion. Since, in her mind, there are only two sides (you’re either with us or against us), that meant I was wrong and she was not interested in hearing anything I had to say.

I’m pro-choice. That doesn’t mean what anti-abortionists think it means. In my case, having an abortion was not only the last thing I wanted to do, it didn’t actually involve much choice on my part.

Failure to Conceive

I’d always assumed I would have children—that it would just happen when I was ready to start a family. I don’t recall ever thinking otherwise. Both of my younger brothers had married and had children by the time I got married. But it never happened for me. I spent more than two years going through fertility treatments, seeing different doctors, taking my basal body temperature every day, having to show up at the gynecologist’s office at the crack of dawn, and facing the same disappointment month after month. I joined the subculture of women who are consumed by their attempts to get pregnant. And I mean consumed. Getting pregnant was the number one focus of our attention, the main thing we read about, talked about, and thought about.

Finally, as a result of the most painful medical procedure I’ve ever had, it was determined that my Fallopian tubes were blocked, which meant all of my efforts of the previous two years had been futile. I could not get pregnant. There might have been further treatment available. I can’t remember. I do know I was worn out from the ordeal by then. So I came to terms with the situation and settled into my childless life. No one had told me it was possible for Fallopian tubes to become unblocked all by themselves—which is what happened to me a few years before menopause.

I’d been feeling tired and run down, but didn’t think anything of it at first. I wasn’t nauseous. I just didn’t feel like myself. Then I noticed I hadn’t had a period for two months. A couple of friends kept asking me if I’d had a pregnancy test, and I remember being quite irritated. So late one Friday evening, I drove to a drugstore and bought a test solely to be able to prove they were wrong. The insert that came with the test showed a pale pink “positive” response. What I got was hot pink—closer to fuchsia. I was stunned.

I wasn’t working at that time and had no health insurance. I was also in my mid-40s, and my then partner (the same person I’d been married to previously—another story altogether) was 14 years older than me. But after the shock wore off, I immediately started trying to figure out how I could do this thing. I checked out every pregnancy book the local library had on its shelves. I told several of my friends—and everyone I spoke to offered to help me any way they could. One person gave me the name and number of her gynecologist, and early on Monday I called for an appointment. They got me in within a couple of days, by which time I already knew there were potential problems and risks associated with having a first pregnancy at my age.

Happy Valentine’s Day

I had an amniocentesis. My fingers were crossed for the better part of a week. If the baby was OK, I would have it. I would be a mother! But that wasn’t the way this story played out. When the doctor called to give me the test results, he told me that I would eventually miscarry, and if I waited for that to happen, it could be dangerous, even life-threatening. He wanted me to have the abortion procedure as soon as possible. In fact, he had a cancellation that week. It was on Valentine’s Day. I took the appointment.

I’m sure the Facebook commenter who thinks all women who get abortions are murderers is quite confident she is in the right. For my part, it’s hard to imagine how someone who was able to have children—something I may have spent more time, attention, energy, and money attempting to do than she did—could possibly have anything but sympathy for me. It’s true she doesn’t know the particulars of my situation. But that’s exactly my point. When you’re shouting (MURDER) across the chasm at the other side, you don’t need to be bothered by particulars. There are no shades of gray.

It doesn’t matter what the issue is or what side of the political spectrum we’re on or how confident we feel about our beliefs or positions. If we’re participating in this shouting match, we’re part of what’s wrong in the world. It’s so easy to share things on Facebook or to dash off a righteously indignant comment that we don’t even have to think about it. But we ought to think about it. We ought to engage the conscious part of our brain for a few seconds to ask ourselves what we’re doing. Do we really need to keep shouting and lobbing metaphorical hand grenades at each other? Is that the best we can do?

Think You’re Thinking?


Much of what passes for thinking consists of unconscious, not conscious, mental processes. When it comes to taking in information and deciding what to believe and what not to believe, for example, we are appallingly predictable. We are most likely to believe:

What Is Familiar

Information that feels familiar is easier to absorb and believe than information that is unfamiliar. The information could be familiar because it’s associated with other beliefs we have or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.

Even if we’re aware of the mere-exposure effect, we probably think we’re immune to it because we’re more sophisticated than that. Believing we’re immune to it, however, might make us even more susceptible to it than we would be if we simply recognized it.

What Is Easy

Information that is easy to understand gives us a sense of cognitive ease. Information that is difficult to understand requires greater cognitive effort to process. Our brain prefers to chill out, so it just says “no” to exerting additional cognitive effort.

Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. This is especially likely to be the case if you are already experiencing some degree of cognitive strain or if your conscious (System 2) attention is depleted. You’ve undoubtedly had the experience of feeling “brain dead” following a mentally fatiguing effort. That’s when you’re most susceptible to believing what is easy.

What Validates Our Preexisting Beliefs

Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe. At the very least, we hold inconsistent information up to greater scrutiny. So we have different standards for evaluating information based on the level of cognitive ease it generates. And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great.

The easy acceptance of information that validates what we already believe is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. For example, people who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs [see What is Familiar and What is Easy, above].

It’s easy to believe what’s familiar, what’s easy to grasp, and what validates our pre-existing beliefs. No critical thinking or cognitive effort is required. On the other hand, actual thinking, as Dan Ariely says, is difficult and sometimes unpleasant.

What Is Cognitive Ease—and Why Should You Be Wary of It?


Everyone wants to be right and to feel certain about things. These are built-in biological drives, not character flaws. When we think we’re right and when we feel certain, we experience a sense of cognitive ease. The world makes sense to us. And that puts us in a good mood.

Cognitive ease feels good, but it gives us a false sense of security because it makes us think we understand far more than we actually do.

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. —Daniel Kahneman, Thinking, Fast and Slow

Comfortably Numb

When it comes to taking in information and deciding what to believe and what not to believe, we are appallingly predictable. As described above, we are most likely to believe what is familiar, what is easy, and what validates our preexisting beliefs.

And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great. For example, it is easier to believe that What You See Is All There Is (WYSIATI), even after being confronted with evidence that you have missed something that was right in front of your face, than it is to believe that you are aware of only a tiny fraction of what is going on around you.

Cognitive Biases

We use cognitive biases as shortcuts to help us understand the world. We don’t have to use any critical thinking skills. No cognitive effort is required. We aren’t forced to reevaluate our existing beliefs. Because of our cognitive biases, we make snap judgments, form quick impressions or opinions, and operate on autopilot.

The bad news is that, since cognitive biases are by their nature distortions or errors in thinking, they actually decrease our understanding all the while giving us that feel-good sense of cognitive ease.

That’s just fine with the conscious part of our brain, which is slow and kind of lazy and doesn’t want to work if it doesn’t have to. It’s happy to let the unconscious handle as much of the load as possible. Because cognitive biases operate at the unconscious level, unless we make an effort to recognize them, we aren’t aware of them. We will even deny we have them.

To have a human brain is to be subject to cognitive biases. Some of the most common are:

  • Confirmation Bias

The easy acceptance of information that validates what we already believe (as described in What Validates Our Preexisting Beliefs, above) is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs. Example: People who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves.

  • The Halo Effect

The tendency to view other people as all good (or all bad) is the result of a cognitive bias called the halo effect. When we consider someone to be a good person, we find it easier to excuse or ignore behavior that is inconsistent with being a good person. Good people can do no wrong. On the other hand, if we consider someone to be a bad person, we find it hard to accept that he or she has any positive qualities. Bad people can do no good. In either case, we ignore evidence that contradicts our general impression of the person. The halo effect requires black and white thinking. Example: People tend to have a completely positive view of the political party they support and a completely negative view of the political party they don’t support.

  • Negativity Bias

Our brains are wired to notice negative events more than positive events, so we give them more attention. This leads us to believe that more negative events are taking place than positive events. It also leads us to give more credence to negative claims about people with whom we disagree. Negativity bias is responsible for the fears we have about some things that are disproportionate to the actual likelihood of their occurring. Bad stuff seems to have more of an impact on us than good stuff, and we are quicker to react to it. This bias can make us susceptible to fear-mongering. Examples: (1) The news. (2) People tend to pay more attention—and give more weight—to critical comments than to praise.

  • Impact Bias

We think we can predict how we will react to potential events, both good and bad, and reliably estimate the impact they will have on us. But in making such predictions, we routinely overestimate how good we will feel (and for how long) after a positive event and how bad we will feel (and for how long) after a negative event. Although we are extremely poor fortune tellers, that doesn’t stop us from being certain about how we will feel in the future. In reality, our excitement over something good will likely dim faster than we predict, and we are likely to rebound from a loss sooner than we predict. Example: People tend to believe a positive change, such as marriage, a new job, a bigger house, or winning the lottery, will make them feel better—and for a longer time—than it actually will.

  • Hindsight Bias

In retrospect everything seems inevitable. The hindsight bias (“I knew it all along”) makes us think the world is more predictable than it actually is. After the fact, we selectively reconstruct events to make it appear the outcome was inevitable. In doing so, we also exaggerate how likely we considered the outcome to be before it occurred. If the outcome is a negative one, we think someone should have foreseen it and prevented it. Example: After 9/11, many people thought the attacks by al-Qaeda could have been prevented based on the available information. However, the information was not, at that time, as clear as it appeared to be in hindsight.

  • Outcome Bias

The outcome bias leads us to evaluate a decision based on the eventual results or outcome of the decision rather than on the soundness or quality of the decision at the time it was made. If something works out, we think it was a great decision (genius, even), although the reasoning that led to it may have been flawed. Conversely, if something doesn’t work out, we think it was a bad decision, although the reasoning that led to it may have been entirely sound. When outcomes are good, we think the decisions were good; when outcomes are bad, we think the decisions were bad. Example: People tend to think that if something goes wrong during a low-risk surgical procedure, the decision to do the procedure was a bad one.

  • Hidden (or Implicit) Bias

Hidden biases are attitudes or stereotypes we have, both favorable and unfavorable, particularly about other people in regard to race, gender, age, etc. We don’t all have the same hidden biases, but everyone has them. However, because they are hidden—primarily from ourselves—we are unaware of them, even though they affect our feelings, our behavior, and our reactions. Hidden biases may be at odds with our conscious attitudes and feelings. But some of our hidden biases may be apparent to others.

We can’t find out about hidden biases through introspection, though we may be able to learn something about them by observing our own behavior. Harvard University has also developed an implicit association test, available online (https://implicit.harvard.edu/implicit/), so you can test yourself for your own hidden biases.

Hidden biases contribute to a sense of cognitive ease by tending to confirm that whatever groups we belong to (ethnic, racial, age, income, etc.) are the best groups because they have more positive characteristics than those other groups have.

Cognitive Distortions

Cognitive distortions are habitual ways of thinking that alter our perception. Many, although not all, cognitive distortions are negative. But even negative cognitive distortions contribute to a sense of cognitive ease just because they are habitual. If you are used to thinking about yourself in a negative way or seeing the world in a negative way, that will feel more comfortable than trying to see things in a different (more positive) way.

Cognitive distortions are not uncommon, and there are a lot of different ones. However, not everyone is subject to them—or at least not to the same degree. A few common cognitive distortions are:

  • Mindreading: believing you know what other people are thinking or what their motives are
  • Overgeneralizing: drawing too broad a conclusion from a single event or piece of information or from limited information
  • Catastrophizing: imagining worst case scenarios; exaggerating the likelihood of negative or disastrous outcomes
  • All or Nothing Thinking (also called Black and White Thinking): thinking in extremes without allowing for complexity (shades of gray); believing that if something isn’t perfect or the best, it’s worthless

The Cognitive Ease Continuum

According to Daniel Kahneman, cognitive ease is both a cause and a consequence of a pleasant feeling. Cognitive ease makes us feel more favorable toward things that are familiar, easy to understand, and easy to see or read. We feel less favorable toward what is unfamiliar, difficult to understand, or difficult to see or read. We don’t even have to be consciously aware that something is familiar to us in order to feel good about it. The feel-good response comes from the unconscious part of our brain. It’s part of our hardwiring for survival. A good mood tells our brain everything is OK and we can let our guard down.

Being in a good mood is associated with intuition, creativity, gullibility, and increased reliance on the unconscious part of the brain. At the other end of the continuum are sadness, vigilance, suspicion, an analytic approach, and increased effort.

We can’t worry when we’re happy. But because we’re less vigilant when in a good mood, we’re more prone to making logical errors. We’re more susceptible to cognitive biases. We think we understand more than we do. We even think we’re thinking.
