Farther to Go!

Brain-Based Transformational Solutions


Miswanting: The Problems with Affective Forecasting

July 20, 2016 by Joycelyn Campbell


Affective forecasting refers to our attempt to imagine a future event and predict how we’re going to feel about it when it occurs. The term and the research on it may be relatively new, but we engage in the process whenever we attempt to determine a course of action. The results of numerous studies on affective forecasting reveal that (1) we’re not very good at it, (2) we don’t know we’re not very good at it, and so (3) we keep making the same mistakes when pursuing what we think will make us happy. Daniel T. Gilbert and Timothy D. Wilson coined the term for this phenomenon: miswanting.

The reason we’re not very good at predicting our future feelings is that we routinely make all kinds of errors, some of which are described below. First the good news: we’re generally good at predicting whether a future experience will be positive or negative. And when we make short-term (tomorrow) versus long-term (a year from now) predictions, we’re pretty good at accurately identifying the emotions we’re likely to feel when we experience an event.

Impact

What we’re not very good at is predicting how intense our feelings will be and how long they will last. This prediction error is known as the impact bias.

Whether people overestimate how good or bad they will feel, overestimate how quickly those feelings will arise, or underestimate how quickly they will dissipate, the important point is that they overestimate how powerfully the event will impact their emotional lives—Timothy D. Wilson, Daniel T. Gilbert (2003)

So we tend to believe that both positive and negative events will affect us more intensely and that the duration of those effects will be longer than they’re likely to be. We think that getting the new job, the guy/girl, the new house/car, or winning the lottery will cause us to feel fantastic for the foreseeable future. We think not getting the job, failing a test, losing a friend, or experiencing a financial setback will cause us to feel devastated for the foreseeable future.

Big vs. Small

We believe that a bigger problem will have a bigger negative effect on us than a smaller, chronic problem or minor annoyance will. But that doesn’t turn out to be the case for a couple of reasons. One is that we tend to respond to and take care of the bigger problems but often let the smaller ones drag on and annoy us indefinitely. The other is that we have a so-called psychological immune system that’s triggered by big problems to help us cope with them.

Misconstrual

In order to predict how we’re likely to feel about something, we need to be able to imagine the event. That’s easier to do if we’ve experienced it or something similar in the past. If we’ve been to a lot of parties, we can imagine—in general—how we’ll feel about attending a party on Saturday. If we’ve cleaned out the garage before, we can imagine how we’ll feel about doing that on Saturday, too. But if we haven’t experienced an event, what we imagine or expect may not bear much resemblance to the way the actual event unfolds. Thinking we can predict the future leads us to believe in the veracity of what we imagine.

Memory

Even if we’re able to imagine an event because we’ve experienced it before, our memory of it—and how we felt at the time—may be faulty simply because it’s the nature of memory to be faulty. And the feelings we experience when remembering a past event are not necessarily the same feelings we had when the event took place. Additionally, when we don’t recall actual details of an event, we may come to rely instead on our beliefs or theories about how such an event will make us feel.

Variability

When trying to decide where to vacation, which movie to see, or which house to buy, we tend to focus on, compare, and overestimate the differences between various options and underestimate their similarities. Furthermore, the order in which people are asked to think about differences vs. similarities has been found to influence the accuracy of their affective forecasting. Those who thought last about the similarities tended to be happier about their choices.

Hot vs. Cold States

When we’re in a “hot” emotional state (anxious, fearful, hungry, courageous, or sexually excited, for example), we have a hard time predicting what we will want when we’re in a “cold” (more rational) state—and vice versa. That means when we’re in a cold state—satiated, for example—we’re likely to predict we’ll have enough willpower to avoid binging on the bag of potato chips we’re picking up at the supermarket. But later that evening, when we’re hungry—in a hot state—we do, in fact, binge eat.

These mistakes—which arise because of the way we’re wired, not because there’s something wrong with us—aren’t the only mistakes we make when trying to predict what will make us happy or sad in the future. But hopefully they help clarify why it’s so hard to make accurate predictions and why we’re often disappointed by the choices we make.

Next time: The Other Problem with Affective Forecasting


Upheaval Is Easy;
Sustained Change Is Hard

July 13, 2016 by Joycelyn Campbell


Although we have a fundamental belief in human rationality, upon which our laws are based, the evidence is mounting that we are, as psychologist Dan Ariely says, “predictably irrational.” On the one hand, this explains quite a lot about the way things play out in the wider world. When you recognize how irrational we actually are, you’re less likely to be surprised by the things people do and say and think. On the other hand, if you’re in favor of fairness and justice, the situation is extremely troubling.

The path to correcting society’s most significant ills may need to begin with questioning some of our basic assumptions about human nature.

The Status Quo Is Status Quo

We have a hard time making behavior changes in our own lives, yet we’re often surprised that enacting social change is so frustrating, difficult, and time consuming. But the situation isn’t remotely surprising. Change is difficult and slow because our brain is wired to maintain the status quo, and it is we—people with brains wired to maintain the status quo—who put into place and are then affected by laws and social policies.

One part of our brain (System 2) can see the benefit of change and wants to make changes. The other part of the brain (System 1) actively resists change. The part of the brain that can see the benefit of change is slow, lazy, and easily depleted. Much of the time it’s offline. The part of the brain that resists change is fast, vast, and always on. When System 2 is depleted, we revert to operating not logically and rationally, but on autopilot.

Furthermore, laws and social policies are based on the idea that people are rational actors who respond to incentives in straightforward ways. We believe that education, awareness, and clearly defined negative consequences are effective strategies. This is a very logical position to take. It’s also one of the reasons why our laws and policies don’t work the way we expect them to work.

Many of our social institutions—and laws in particular—implicitly assume that human actions are largely the product of conscious knowledge and intention. We believe that all we need for a law-abiding society is to let people know what is right and what is wrong, and everything will follow from there. Sure, we make exceptions for people with grave mental disorders, but we assume most human behavior is conscious and intentional. Even when we acknowledge the power of unconscious influence, we believe it can be overcome by willpower or education.—Shankar Vedantam, The Hidden Brain

The hidden brain, as Shankar Vedantam refers to System 1, doesn’t operate logically or rationally. It isn’t necessarily up to the same thing the conscious part of our brain, System 2, is up to. For example:

  1. System 1 focuses on survival and detecting threats to our survival.
  2. System 1 can’t handle complexity, so it generalizes instead.
  3. System 1 is biased because biases make it easier to decide what we think.

Threat Detection

The brain is, first and foremost, a survival tool, and the way that it has found to be most effective at guaranteeing survival is through the threat and reward response. Put simply, your brain will cause you to move away from threats and move toward rewards. —Dr. David Rock, author of Your Brain at Work

This sounds reasonable and not particularly problematic until you realize that, in addition to actual survival needs (food, water, shelter, etc.) and actual physical threats, each of us has personalized our threat-detection system to include situations we have defined as threatening. And once the brain gets the idea that something is a threat, it responds as if it is facing a literal threat to our physical survival.

How logical do you tend to be when you’re facing a threat to your survival?

When the brain is under severe threat, it immediately changes the way it processes information, and starts to prioritize rapid responses. “The normal long pathways through the orbitofrontal cortex, where people evaluate situations in a logical and conscious fashion and [consider] the risks and benefits of different behaviors— that gets short circuited,” says Dr. Eric Hollander, professor of psychiatry at Montefiore/Albert Einstein School of Medicine in New York.  Instead, he says, “You have sensory input right through the sensory [regions] and into the amygdala or limbic system.”

This dramatically alters how we think, since the limbic system is deeply engaged with modulating our emotions. “The neural networks in the brain that are involved in rational, abstract cognition— essentially, the systems that mediate our most humane and creative thoughts— are very sensitive to emotional states, especially fear.” So when people are terrorized, “Problem solving becomes more categorical, concrete and emotional [and] we become more vulnerable to reactive and short-sighted solutions.” —Maia Szalavitz, neuroscience journalist

When we feel threatened, logic and rationality go offline.

Generalization

Statistical facts don’t come to people naturally. Quite the opposite. Most people understand the world by generalizing personal experiences which are very biased. In the media the “news-worthy” events exaggerate the unusual and put the focus on swift changes. Slow and steady changes in major trends don’t get much attention. Unintentionally, people end-up carrying around a sack of outdated facts that we got in school (including knowledge that often was outdated when acquired in school). —gapminder.org/ignorance

System 1 processes data and information through association. It sees patterns and makes connections, whether or not the patterns and connections actually exist. It is, as Daniel Kahneman (Thinking, Fast and Slow) writes, “radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” As a result, System 1 accepts anecdotal evidence as being as valid as verified evidence.

Seeing patterns and finding connections makes it easy to come up with sometimes sweeping generalizations.

One example: Person A is similar to Person B in some particular way; therefore, Person B is probably similar to Person A in other ways. Since I know Person A, I now believe I also know and understand Person B. And I see all of the people who share some of these same characteristics as being alike. This leads me to believe I understand more than I do and know more than I know about Person B and other people who bear some similarity to Person B.

Another example: Extrapolating from my own personal experience to assume that everyone thinks the way I think, feels the way I feel, or would respond the way I respond.

Generalizing can be useful when we need to make quick assessments. But it’s a lazy way of thinking that can be dangerous when used in important or critical situations.

It’s easy to find examples of generalizing in the opinions we have and the alliances we form around hot-button social topics such as climate change, GMOs, vaccines, immigration, and Planned Parenthood. It can also be seen in how people line up in the pro- or anti-science camps.

When we generalize, we make assumptions and draw conclusions from limited data or evidence.

Implicit Biases

Critical thinking doesn’t come naturally. Since we need to make all kinds of assessments and decisions in the course of our lives—and since the part of the brain that can think critically is often offline—we use mental shortcuts instead of thinking most things through.

[Implicit] biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control. Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness. Rather, implicit biases are not accessible through introspection.

The implicit associations we harbor in our subconscious cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance.  These associations develop over the course of a lifetime beginning at a very early age through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.

A Few Key Characteristics of Implicit Biases

  • Implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality such as judges.
  • Implicit and explicit biases are related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other.
  • The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
  • We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.
  • Implicit biases are malleable. Our brains are incredibly complex, and the implicit associations that we have formed can be gradually unlearned through a variety of debiasing techniques.

Source: kirwaninstitute.osu.edu. Note: Harvard University has developed an implicit association test that is available online so you can test yourself for your own hidden biases.

Now What?

Change is hard because of the way we’re wired. If we can come to terms with the fact that we operate less rationally than we think we do, we might be able to create or modify laws and public policies to be more effective for more people.

Things to remember:

  • System 1’s agenda is to maintain the status quo, so most of the time that’s our agenda and everyone else’s, too. If it’s difficult for us to make personal changes, imagine how difficult it is to make changes that involve large groups of people—or to change other people’s minds.
  • System 1 is primarily a threat-detector. When we feel threatened, we are not going to be thinking or behaving logically, and we should expect the same to be true of others. People who feel threatened are easier to manipulate, and they may take actions that are not in their own best interest.
  • We generalize because System 1 doesn’t handle complexity well. Generalizing leads to a feeling of cognitive ease because we think we know more than we do and understand more than we do. That may not be a problem in trivial matters, but it has huge implications when it comes to laws and public policies.
  • We are all at the effect of implicit biases. Because we aren’t directly aware of them, it’s easy for us to deny we have them. That doesn’t make them go away, however. The best thing to do is to pay attention to how we act and react to other people so we can begin to recognize, acknowledge, and eventually neutralize some of these biases.

Making the unconscious conscious is difficult because the central obstacle lies within ourselves. But putting reason ahead of instinct and intuition is also what sets us apart from every other species that has ever lived. Understanding the hidden brain and building safeguards to protect us against its vagaries can help us be more successful in our everyday lives. It can aid us in our battle against threats and help us spend our money more wisely. But it can also do something more important than any of those things: It can make us better people. —Shankar Vedantam, The Hidden Brain

Note: This is a slightly modified version of my 10/16/15 post as a response to recent events not only in the U.S. but around the world.


The Enneagram:
Use It; Don’t Abuse It

June 29, 2016 by Joycelyn Campbell


The Enneagram is a fascinating and powerful tool for understanding ourselves and others better—but only when it’s used wisely.

Any system or method of classifying people has the potential to be used in harmful ways. But classifying things and people is one of the ways in which we organize and make sense of the world. Our brains do most of this classifying on their own without our conscious intervention. It isn’t possible or even desirable to dispense with our classifying behavior.

In terms of the physical/material world, it’s good to know which classifications of mushrooms are safe to eat and which are not, which insects have a deadly sting and which are harmless, which sounds and smells signal danger and which are innocuous.

In terms of people, things can get a bit dicey. We have all kinds of classifications for people based on nationality, religion, race, gender, age, level of education, make and model of car, number of children, physical appearance, language, whether or not they just cut you off in traffic, where they live, and even whether or not you know them. We also, of course, classify people by their personalities or temperaments.

Types and Stereotypes

We can get into trouble with temperament or personality typing when we forget about individual exceptions. Just because something is true in general for an entire group of people doesn’t mean it is true for every single individual within that group.

We can also get into trouble by using types or stereotypes against people. Although the most egregious examples are racism, sexism, religious persecution, and the like, we can also use personality stereotypes against other people.

But “stereotypes are observations…neither good nor bad, desirable nor undesirable, moral nor immoral,” according to evolutionary psychologist Satoshi Kanazawa. “Stereotypes tell us what groups of people tend to be or do in general; they do not tell us how we ought to treat them.”

Each of us views the world through our own set of filters, biases, opinions, judgments, personal experiences, and type. We make snap judgments, jump to conclusions, and react emotionally. When conflicts arise or someone says or does something we don’t like, it can be tempting to blame their behavior on their personality type.

While the Enneagram can definitely help us understand ourselves and others better, type is only part of the picture. We can never fully know someone else’s story. If we judge them solely on their personality type, we’re doing ourselves, the other person, and even the Enneagram a disservice.


Is What You See All There Is?

December 11, 2015 by Joycelyn Campbell


As you move through the world, you probably have the sense that you’re aware of whatever there is to be aware of as it is. This applies not only to the sensory world, but also to events, situations, interpersonal interactions—actually to everything that exists or occurs within your world. But the capacity of conscious attention is much too limited for this to even be possible. The 11,000,000 bits of information being processed by the unconscious part of the brain at any given moment need to be considerably (and swiftly) condensed and summarized into the 40 bits you can process consciously.
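To put those two figures side by side, here is a quick back-of-the-envelope calculation (using the 11,000,000 and 40 bits-per-moment estimates cited above, which are rough figures from the consciousness literature, not precise measurements):

```python
# Rough scale of the brain's compression of information before it
# reaches conscious awareness (both figures are commonly cited
# estimates, not exact measurements).
unconscious_bits = 11_000_000  # processed unconsciously at any moment
conscious_bits = 40            # available to conscious attention

compression_ratio = unconscious_bits / conscious_bits
print(f"Roughly {compression_ratio:,.0f} to 1")  # about 275,000 to 1
```

In other words, for every bit that reaches conscious attention, on the order of a quarter-million bits have already been filtered out.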

Consciousness is a way of projecting all the activity in your nervous system into a simpler form. [It] gives you a summary that is useful for the larger picture, useful at the scale of apples and rivers and humans with whom you might be able to mate. —David Eagleman, Incognito

What You See Is All There Is (WYSIATI)*

Your brain maintains a model of the world that represents what’s normal in it for you. The result is that you experience a stripped-down, customized version of the actual world. To a great extent, each of us really does inhabit our own world. But it would be incorrect to say that we create our reality; rather, our brain creates our reality for us.

Much, if not most, of what you do, think, and feel consists of automatically generated responses to internal or external stimuli. And it isn’t possible to consciously mediate all of your responses. It wouldn’t even be a good idea to try.

In addition to helping you navigate the world, your mental model gives rise to your sense of the way things should be. It generates expectations (that are either confirmed or denied), assumptions, biases, etc. that determine what you pay attention to, what you perceive (even what you are able to perceive), how you interpret and respond to what you perceive, and the meaning you make of it all. Your mental model is the result of your genes and your experiences, of both intention and accident. Your brain has been constructing your particular model of the world since your birth, and it is continually updating and modifying it—most of the time entirely outside your awareness.

But while the contents of your particular mental model determine what you think, feel, do, and say, you can’t search them—or follow a bread-crumb trail backward through them—to find out precisely which aspects (and when and how they came to be) give rise to any specific facet of who you are and how you react now.

The significance of your mental model in your life can’t be overstated. Although you aren’t consciously aware of it, your mental model circumscribes not only every aspect of your present experience but also what is possible for you to do and be. It determines what you see and how you see the world, both literally and figuratively, as well as how you see yourself.

But…Your Brain Can Get It Wrong

System 1, the unconscious part of your brain, uses associative thinking to develop and maintain your model of the world. However, there are some problems with associative thinking. For example:

  • It sacrifices accuracy for speed.
  • It doesn’t discriminate very well.
  • It takes cognitive shortcuts (aka cognitive biases).

Your mental model can—and sometimes does—lead to erroneous conclusions and inappropriate responses. It’s the job of consciousness (System 2) to check the impulses and suggestions it receives from System 1, but consciousness is slow, lazy, and easily depleted. Most of the time, it’s content to go along with System 1, which means it’s susceptible to cognitive biases. By definition, cognitive biases are distortions or errors in thinking. They actually decrease your understanding while giving you a feel-good sense of cognitive ease.

Confirmation bias is the easy acceptance of information that validates what you already believe. It causes you to selectively notice and pay attention to what confirms your beliefs and to ignore what doesn’t. It underlies the discomfort you feel around people who disagree with you and the ease you feel around people who share your beliefs.

Information that confirms what you already believe to be true makes you feel right and certain, so you’re likely to accept it uncritically. On the other hand, you’re more likely to reject information that is inconsistent with what you already believe, or at least to hold it up to greater scrutiny. You have different standards for evaluating information depending on the level of cognitive ease it generates.

Evidence has precious little impact on any of us if it conflicts with what we believe simply because the cognitive strain of processing it is too great. To a very real extent, we don’t even “see” conflicting evidence. While total commitment to our particular worldview (mental model) makes us feel more confident, it narrows—rather than expands—our possibilities. That means it limits our powers of discernment, our ability to increase our understanding of the world around us, and our creative potential. It closes the world off for us instead of opening it up.

The often-quoted statement is true: we don’t see things as they are; we see them as we are. If we want to live fuller lives, if we want to be more effective or useful or loving in the world, we first need to recognize that our greatest constraints are imposed by our own mental models.

It’s important to remember that what you see is not all there is.

*Daniel Kahneman, Thinking, Fast and Slow


Predictably Irrational: The Hidden Brain and Social Change

October 16, 2015 by Joycelyn Campbell Leave a Comment

resistance

We have a difficult time making behavior changes in our own lives, yet we’re often surprised that enacting social change is so frustrating, difficult, and time consuming. But the situation isn’t remotely surprising. Change is difficult and slow because our brain is wired to maintain the status quo, and it is we—people with brains wired to maintain the status quo—who put into place and are then affected by laws and social policies.

One part of our brain (System 2) can see the benefit of change and wants to make changes. The other part of the brain (System 1) actively resists change. The part of the brain that can see the benefit of change is slow, lazy, and easily depleted. The part of the brain that resists change is fast, vast, and always on. When System 2 is depleted–which is often–we revert to operating not logically and rationally, but on autopilot.

Furthermore, laws and social policies are based on the idea that people are rational actors, who respond to incentives in straightforward ways. We believe that education, awareness, and clearly defined negative consequences are effective strategies. This is a very logical position to take. It’s also one of the reason why our laws and policies don’t work the way we expect them to work.

Many of our social institutions—and laws in particular—implicitly assume that human actions are largely the product of conscious knowledge and intention. We believe that all we need for a law-abiding society is to let people know what is right and what is wrong, and everything will follow from there. Sure, we make exceptions for people with grave mental disorders, but we assume most human behavior is conscious and intentional. Even when we acknowledge the power of unconscious influence, we believe it can be overcome by willpower or education.—Shankar Vedantam, The Hidden Brain

The hidden brain, as Shankar Vedantam refers to System 1, doesn’t operate logically or rationally. It isn’t necessarily up to the same thing the conscious part of our brain, System 2, is up to. For example:

  1. System 1 focuses on survival and detecting threats to our survival.
  2. System 1 can’t handle complexity, so it generalizes instead.
  3. System 1 is biased because biases make it easier to decide what we think.
Threat Detection

The brain is, first and foremost, a survival tool, and the way that it has found to be most effective at guaranteeing survival is through the threat and reward response. Put simply, your brain will cause you to move away from threats and move toward rewards. —Dr. David Rock, author of Your Brain at Work

This sounds reasonable and not particularly problematic until you realize that, in additional to actual survival needs (food, water, shelter, etc.) and actual physical threats, each of us has personalized our threat-detection system to include situations we have defined as threatening. And once the brain gets the idea that something is a threat, it responds as if it is facing a threat to our physical survival.

How logical do you tend to be when you’re facing a threat to your survival?

When the brain is under severe threat, it immediately changes the way it processes information, and starts to prioritize rapid responses. “The normal long pathways through the orbitofrontal cortex, where people evaluate situations in a logical and conscious fashion and [consider] the risks and benefits of different behaviors— that gets short circuited,” says Dr. Eric Hollander, professor of psychiatry at Montefiore/Albert Einstein School of Medicine in New York.  Instead, he says, “You have sensory input right through the sensory [regions] and into the amygdala or limbic system.”

This dramatically alters how we think, since the limbic system is deeply engaged with modulating our emotions.  “The neural networks in the brain that are involved in rational, abstract cognition— essentially, the systems that mediate our most humane and creative thoughts— are very sensitive to emotional states, especially fear.” So when people are terrorized, “Problem solving becomes more categorical, concrete and emotional [and] we become more vulnerable to reactive and short-sighted solutions.” —Maia Szalavitz , neuroscience journalist

When we feel threatened, logic and rationality go offline.

Generalization

Statistical facts don’t come to people naturally. Quite the opposite. Most people understand the world by generalizing personal experiences which are very biased. In the media the “news-worthy” events exaggerate the unusual and put the focus on swift changes. Slow and steady changes in major trends don’t get much attention. Unintentionally, people end-up carrying around a sack of outdated facts that we got in school (including knowledge that often was outdated when acquired in school). —gapminder.org/ignorance

System 1 processes data and information through association. It sees patterns and makes connections, whether or not the patterns and connections actually exist. It is, as Daniel Kahneman (Thinking, Fast and Slow) writes, “radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” As a result, System 1 accepts anecdotal evidence as being as valid as verified evidence.

Seeing patterns and finding connections makes it easy to come up with sometimes sweeping generalizations.

One example: Person A is similar to Person B in some particular way; therefore, Person B is probably similar to Person A in other ways. Since I know Person A, I now believe I also know and understand Person B. And I see all of the people who share some of these same characteristics as being alike. This leads me to believe I understand more than I do and know more than I know about Person B and other people who bear some similarity to Person B.

Another example: Extrapolating from my own personal experience to assume that everyone thinks the way I think, feels the way I feel, or would respond the way I respond.

Generalizing can be useful when we need to make quick assessments. But it’s a lazy way of thinking that can be dangerous when used in important or critical situations.

It’s easy to find examples of generalizing in the opinions we have and the alliances we form around hot-button social topics such as climate change, GMOs, vaccines, immigration, and Planned Parenthood. It can also be seen in how people line up in the pro- or anti-science camps.

When we generalize, we make assumptions and draw conclusions from limited data or evidence.

Implicit Biases

Critical thinking doesn’t come naturally. Since we need to make all kinds of assessments and decisions in the course of our lives—and since the part of the brain that can think critically is often offline—we use mental shortcuts instead of thinking most things through.

[Implicit] biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control. Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness. Rather, implicit biases are not accessible through introspection.

The implicit associations we harbor in our subconscious cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance. These associations develop over the course of a lifetime, beginning at a very early age, through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.

A Few Key Characteristics of Implicit Biases

  • Implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality such as judges.
  • Implicit and explicit biases are related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other.
  • The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
  • We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.
  • Implicit biases are malleable. Our brains are incredibly complex, and the implicit associations that we have formed can be gradually unlearned through a variety of debiasing techniques.

Source: kirwaninstitute.osu.edu. Note: Harvard University has developed an implicit association test that is available online (https://implicit.harvard.edu/implicit/) so you can test yourself for your own hidden biases.

Now What?

Change is hard because of the way we’re wired. If we can come to terms with the fact that we operate less rationally than we think we do, we might be able to create or modify laws and public policies to be more effective for more people.

Things to remember:

  1. System 1’s agenda is to maintain the status quo, so most of the time that’s our agenda and everyone else’s, too. If it’s difficult for us to make personal changes, imagine how difficult it is to make changes that involve large groups of people—or to change other people’s minds.
  2. System 1 is primarily a threat-detector. When we feel threatened, we are not going to be thinking or behaving logically, and we should expect the same to be true of others. People who feel threatened are easier to manipulate, and they may take actions that are not in their own best interest.
  3. We generalize because System 1 doesn’t handle complexity well. Generalizing leads to a feeling of cognitive ease because we think we know more than we do and understand more than we do. That may not be a problem in trivial matters, but it has huge implications when it comes to laws and public policies.
  4. We are all subject to implicit biases. Because we aren’t directly aware of them, it’s easy for us to deny we have them. That doesn’t make them go away, however. The best thing to do is to pay attention to how we act and react to other people so we can begin to recognize, acknowledge, and eventually neutralize some of these biases.

Filed Under: Beliefs, Brain, Cognitive Biases, Living, Unconscious Tagged With: Dan Ariely, Predictably Irrational, Shankar Vedantam, Social Change, System 1, System 2, the Hidden Brain
