Farther to Go!

Brain-Based Transformational Solutions


Predictably Irrational: The Hidden Brain and Social Change

October 16, 2015 by Joycelyn Campbell


We have a difficult time making behavior changes in our own lives, yet we’re often surprised that enacting social change is so frustrating, difficult, and time-consuming. But the situation isn’t remotely surprising. Change is difficult and slow because our brain is wired to maintain the status quo, and it is we—people with brains wired to maintain the status quo—who put into place and are then affected by laws and social policies.

One part of our brain (System 2) can see the benefit of change and wants to make changes. The other part (System 1) actively resists change. The part that can see the benefit of change is slow, lazy, and easily depleted. The part that resists change is fast, vast, and always on. When System 2 is depleted, which is often, we revert to operating not logically and rationally but on autopilot.
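To make that depletion dynamic concrete, here is a toy sketch in Python. It is a caricature of the two-system model, not a claim about how the brain is actually implemented; the budget size and the questions are invented for illustration.

```python
# Toy caricature of the two-system model described above; it is not a
# claim about neural anatomy, and all numbers are invented.

class Mind:
    def __init__(self, budget: int = 3):
        self.budget = budget  # System 2's limited, depletable capacity

    def decide(self, question: str) -> str:
        if self.budget > 0:
            self.budget -= 1  # deliberate thought costs capacity
            return f"System 2 reasons carefully about {question!r}"
        # System 1 is always on and always has an answer
        return f"System 1 answers {question!r} on autopilot"

mind = Mind()
for q in ["breakfast", "route to work", "new policy", "budget vote", "headline"]:
    print(mind.decide(q))
# The first three questions get deliberate attention; once the budget
# is depleted, everything else falls through to the fast default.
```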

Furthermore, laws and social policies are based on the idea that people are rational actors who respond to incentives in straightforward ways. We believe that education, awareness, and clearly defined negative consequences are effective strategies. This is a very logical position to take. It’s also one of the reasons our laws and policies don’t work the way we expect them to.

Many of our social institutions—and laws in particular—implicitly assume that human actions are largely the product of conscious knowledge and intention. We believe that all we need for a law-abiding society is to let people know what is right and what is wrong, and everything will follow from there. Sure, we make exceptions for people with grave mental disorders, but we assume most human behavior is conscious and intentional. Even when we acknowledge the power of unconscious influence, we believe it can be overcome by willpower or education.—Shankar Vedantam, The Hidden Brain

The hidden brain, as Shankar Vedantam refers to System 1, doesn’t operate logically or rationally. It isn’t necessarily up to the same things as System 2, the conscious part of our brain. For example:

  1. System 1 focuses on survival and detecting threats to our survival.
  2. System 1 can’t handle complexity, so it generalizes instead.
  3. System 1 is biased because biases make it easier to decide what we think.

Threat Detection

The brain is, first and foremost, a survival tool, and the way that it has found to be most effective at guaranteeing survival is through the threat and reward response. Put simply, your brain will cause you to move away from threats and move toward rewards. —Dr. David Rock, author of Your Brain at Work

This sounds reasonable and not particularly problematic until you realize that, in addition to actual survival needs (food, water, shelter, etc.) and actual physical threats, each of us has personalized our threat-detection system to include situations we have defined as threatening. And once the brain gets the idea that something is a threat, it responds as if it is facing a threat to our physical survival.

How logical do you tend to be when you’re facing a threat to your survival?

When the brain is under severe threat, it immediately changes the way it processes information, and starts to prioritize rapid responses. “The normal long pathways through the orbitofrontal cortex, where people evaluate situations in a logical and conscious fashion and [consider] the risks and benefits of different behaviors— that gets short circuited,” says Dr. Eric Hollander, professor of psychiatry at Montefiore/Albert Einstein School of Medicine in New York.  Instead, he says, “You have sensory input right through the sensory [regions] and into the amygdala or limbic system.”

This dramatically alters how we think, since the limbic system is deeply engaged with modulating our emotions. “The neural networks in the brain that are involved in rational, abstract cognition—essentially, the systems that mediate our most humane and creative thoughts—are very sensitive to emotional states, especially fear.” So when people are terrorized, “Problem solving becomes more categorical, concrete and emotional [and] we become more vulnerable to reactive and short-sighted solutions.” —Maia Szalavitz, neuroscience journalist

When we feel threatened, logic and rationality go offline.

Generalization

Statistical facts don’t come to people naturally. Quite the opposite. Most people understand the world by generalizing personal experiences, which are very biased. In the media the “news-worthy” events exaggerate the unusual and put the focus on swift changes. Slow and steady changes in major trends don’t get much attention. Unintentionally, people end up carrying around a sack of outdated facts that they got in school (including knowledge that often was outdated when acquired in school). —gapminder.org/ignorance

System 1 processes data and information through association. It sees patterns and makes connections, whether or not the patterns and connections actually exist. It is, as Daniel Kahneman (Thinking, Fast and Slow) writes, “radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” As a result, System 1 accepts anecdotal evidence as being as valid as verified evidence.

Seeing patterns and finding connections makes it easy to come up with sometimes sweeping generalizations.

One example: Person A is similar to Person B in some particular way; therefore, Person B is probably similar to Person A in other ways. Since I know Person A, I now believe I also know and understand Person B. And I see all the people who share some of these same characteristics as being alike. This leads me to believe I know and understand more than I actually do about Person B and anyone who bears some similarity to Person B.

Another example: Extrapolating from my own personal experience to assume that everyone thinks the way I think, feels the way I feel, or would respond the way I respond.
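Here is a toy sketch of the first example (Python; the people and traits are invented for illustration): a System 1-style matcher that declares two people alike as soon as any single trait matches, compared with how much they actually overlap.

```python
# Toy illustration of System 1-style generalization from a single
# shared trait; the people and traits are invented for the example.

person_a = {"hometown": "Austin", "diet": "vegetarian",
            "politics": "left", "hobby": "chess"}
person_b = {"hometown": "Austin", "diet": "omnivore",
            "politics": "right", "hobby": "running"}

def system1_alike(a: dict, b: dict) -> bool:
    """Declare two people 'alike' as soon as any one trait matches."""
    return any(a[k] == b[k] for k in a.keys() & b.keys())

def actual_overlap(a: dict, b: dict) -> float:
    """Fraction of comparable traits that actually match."""
    shared = a.keys() & b.keys()
    return sum(a[k] == b[k] for k in shared) / len(shared)

print(system1_alike(person_a, person_b))   # True: one match is "enough"
print(actual_overlap(person_a, person_b))  # 0.25: only 1 of 4 traits match
```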

Generalizing can be useful when we need to make quick assessments. But it’s a lazy way of thinking that can be dangerous when used in important or critical situations.

It’s easy to find examples of generalizing in the opinions we have and the alliances we form around hot-button social topics such as climate change, GMOs, vaccines, immigration, and Planned Parenthood. It can also be seen in how people line up in the pro- or anti-science camps.

When we generalize, we make assumptions and draw conclusions from limited data or evidence.

Implicit Biases

Critical thinking doesn’t come naturally. Since we need to make all kinds of assessments and decisions in the course of our lives—and since the part of the brain that can think critically is often offline—we use mental shortcuts instead of thinking most things through.

[Implicit] biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control. Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness. Rather, implicit biases are not accessible through introspection.

The implicit associations we harbor in our subconscious cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance.  These associations develop over the course of a lifetime beginning at a very early age through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.

A Few Key Characteristics of Implicit Biases

  • Implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality such as judges.
  • Implicit and explicit biases are related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other.
  • The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
  • We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.
  • Implicit biases are malleable. Our brains are incredibly complex, and the implicit associations that we have formed can be gradually unlearned through a variety of debiasing techniques.

Source: kirwaninstitute.osu.edu. Note: Harvard University has developed an implicit association test that is available online (https://implicit.harvard.edu/implicit/) so you can test yourself for your own hidden biases.
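To give a rough sense of what such a test measures: implicit association tests compare how quickly you sort words when categories are paired in a way that matches your implicit associations versus a way that conflicts with them. The sketch below (Python) computes a simplified score of that kind: the gap in mean response times, scaled by their overall spread. It is only a schematic of the idea, with invented response times; the actual test uses a more elaborate scoring procedure.

```python
# Schematic sketch of IAT-style scoring, not the scoring procedure the
# Harvard test actually uses; the response times (ms) are invented.
from statistics import mean, stdev

# Block where the category pairing matches the implicit association:
congruent_ms = [612, 580, 655, 598, 640, 605, 590, 628]
# Block where the pairing conflicts with it (slower if a bias is present):
incongruent_ms = [748, 790, 702, 765, 810, 730, 755, 772]

def d_score(congruent: list, incongruent: list) -> float:
    """Mean latency gap scaled by the pooled standard deviation.
    Larger positive values suggest a stronger implicit association."""
    pooled_sd = stdev(congruent + incongruent)
    return (mean(incongruent) - mean(congruent)) / pooled_sd

print(round(d_score(congruent_ms, incongruent_ms), 2))  # 1.81 here
```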

Now What?

Change is hard because of the way we’re wired. If we can come to terms with the fact that we operate less rationally than we think we do, we might be able to create or modify laws and public policies to be more effective for more people.

Things to remember:

  1. System 1’s agenda is to maintain the status quo, so most of the time that’s our agenda and everyone else’s, too. If it’s difficult for us to make personal changes, imagine how difficult it is to make changes that involve large groups of people—or to change other people’s minds.
  2. System 1 is primarily a threat-detector. When we feel threatened, we are not going to be thinking or behaving logically, and we should expect the same to be true of others. People who feel threatened are easier to manipulate, and they may take actions that are not in their own best interest.
  3. We generalize because System 1 doesn’t handle complexity well. Generalizing leads to a feeling of cognitive ease because we think we know more than we do and understand more than we do. That may not be a problem in trivial matters, but it has huge implications when it comes to laws and public policies.
  4. We are all at the effect of implicit biases. Because we aren’t directly aware of them, it’s easy for us to deny we have them. That doesn’t make them go away, however. The best thing to do is to pay attention to how we act and react to other people so we can begin to recognize, acknowledge, and eventually neutralize some of these biases.

Filed Under: Beliefs, Brain, Cognitive Biases, Living, Unconscious Tagged With: Dan Ariely, Predictably Irrational, Shankar Vedantam, Social Change, System 1, System 2, the Hidden Brain

Think You’re Thinking?

May 26, 2014 by Joycelyn Campbell

Uriah Heep from “David Copperfield”, ink and wash drawing (Photo credit: Wikipedia)

Much of what passes for thinking consists of unconscious, not conscious, mental processes. When it comes to taking in information and deciding what to believe and what not to believe, for example, we are appallingly predictable. We are most likely to believe:

What Is Familiar

Information that feels familiar is easier to absorb and believe than information that is unfamiliar. The information could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true, because familiarity generates a sense of cognitive ease. This is known as the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.

Even if we’re aware of the mere-exposure effect, we probably think we’re immune to it because we’re more sophisticated than that. Believing we’re immune to it, however, might make us even more susceptible to it than we would be if we simply recognized it.

What Is Easy

Information that is easy to understand gives us a sense of cognitive ease. Information that is difficult to understand requires greater cognitive effort to process. Our brain prefers to chill out, so it just says “no” to exerting additional cognitive effort.

Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. This is especially likely to be the case if you are already experiencing some degree of cognitive strain or if your conscious (System 2) attention is depleted. You’ve undoubtedly had the experience of feeling “brain dead” following a mentally fatiguing effort. That’s when you’re most susceptible to believing what is easy.

What Validates Our Preexisting Beliefs

Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe, or at the very least to hold it up to greater scrutiny. So we have different standards for evaluating information based on the level of cognitive ease it generates, and evidence that conflicts with what we believe has precious little impact on us, simply because the cognitive strain of processing it is too great.

The easy acceptance of information that validates what we already believe is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. For example, people who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs [see What is Familiar and What is Easy, above].
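That double standard is easy to caricature in code. In the toy simulation below (Python; the weights and the evidence stream are invented), evidence that agrees with the current leaning is accepted at face value while contradicting evidence is heavily discounted, and a perfectly balanced stream of evidence still hardens the initial belief.

```python
# Toy caricature of confirmation bias: a double standard for evidence.
# The weights and the evidence stream are invented for illustration.

def update(belief: float, evidence: float,
           confirm_weight: float = 0.10, challenge_weight: float = 0.02) -> float:
    """Nudge belief (0..1) toward a piece of evidence (0.0 or 1.0).
    Confirming evidence gets full weight; contradicting evidence is
    held to 'greater scrutiny' and mostly discounted."""
    confirms = (evidence >= 0.5) == (belief >= 0.5)
    weight = confirm_weight if confirms else challenge_weight
    return belief + weight * (evidence - belief)

belief = 0.55  # start with a mild leaning toward the claim
for supporting, opposing in [(1.0, 0.0)] * 50:  # perfectly balanced stream
    belief = update(belief, supporting)
    belief = update(belief, opposing)
print(round(belief, 2))  # 0.83: the mild leaning hardened anyway
```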

It’s easy to believe what’s familiar, what’s easy to grasp, and what validates our preexisting beliefs. No critical thinking or cognitive effort is required. On the other hand, actual thinking, as Dan Ariely says, is difficult and sometimes unpleasant.

Filed Under: Beliefs, Brain, Cognitive Biases, Mind, Unconscious Tagged With: beliefs, Believing, Cognition, Cognitive bias, Confirmation bias, Critical thinking, Dan Ariely, Thinking

Thinking Is Difficult

April 5, 2014 by Joycelyn Campbell

[Image: thinking is difficult]

Filed Under: Brain, Living, Mind Tagged With: Brain, Dan Ariely, Mind, Predictably Irrational, Thinking
