Farther to Go!

Brain-Based Transformational Solutions


Predictably Irrational: The Hidden Brain and Social Change

October 16, 2015 by Joycelyn Campbell


We have a difficult time making behavior changes in our own lives, yet we’re often surprised that enacting social change is so frustrating, difficult, and time-consuming. But the situation isn’t remotely surprising. Change is difficult and slow because our brain is wired to maintain the status quo, and it is we—people with brains wired to maintain the status quo—who put into place and are then affected by laws and social policies.

One part of our brain (System 2) can see the benefit of change and wants to make changes. The other part of the brain (System 1) actively resists change. The part of the brain that can see the benefit of change is slow, lazy, and easily depleted. The part of the brain that resists change is fast, vast, and always on. When System 2 is depleted, which is often, we revert to operating not logically and rationally, but on autopilot.

Furthermore, laws and social policies are based on the idea that people are rational actors who respond to incentives in straightforward ways. We believe that education, awareness, and clearly defined negative consequences are effective strategies. This is a very logical position to take. It’s also one of the reasons why our laws and policies don’t work the way we expect them to work.

Many of our social institutions—and laws in particular—implicitly assume that human actions are largely the product of conscious knowledge and intention. We believe that all we need for a law-abiding society is to let people know what is right and what is wrong, and everything will follow from there. Sure, we make exceptions for people with grave mental disorders, but we assume most human behavior is conscious and intentional. Even when we acknowledge the power of unconscious influence, we believe it can be overcome by willpower or education.—Shankar Vedantam, The Hidden Brain

The hidden brain, as Shankar Vedantam refers to System 1, doesn’t operate logically or rationally. It doesn’t necessarily pursue the same agenda as System 2, the conscious part of our brain. For example:

  1. System 1 focuses on survival and detecting threats to our survival.
  2. System 1 can’t handle complexity, so it generalizes instead.
  3. System 1 is biased because biases make it easier to decide what we think.

Threat Detection

The brain is, first and foremost, a survival tool, and the way that it has found to be most effective at guaranteeing survival is through the threat and reward response. Put simply, your brain will cause you to move away from threats and move toward rewards. —Dr. David Rock, author of Your Brain at Work

This sounds reasonable and not particularly problematic until you realize that, in addition to actual survival needs (food, water, shelter, etc.) and actual physical threats, each of us has personalized our threat-detection system to include situations we have defined as threatening. And once the brain gets the idea that something is a threat, it responds as if it is facing a threat to our physical survival.

How logical do you tend to be when you’re facing a threat to your survival?

When the brain is under severe threat, it immediately changes the way it processes information, and starts to prioritize rapid responses. “The normal long pathways through the orbitofrontal cortex, where people evaluate situations in a logical and conscious fashion and [consider] the risks and benefits of different behaviors— that gets short circuited,” says Dr. Eric Hollander, professor of psychiatry at Montefiore/Albert Einstein School of Medicine in New York.  Instead, he says, “You have sensory input right through the sensory [regions] and into the amygdala or limbic system.”

This dramatically alters how we think, since the limbic system is deeply engaged with modulating our emotions. “The neural networks in the brain that are involved in rational, abstract cognition— essentially, the systems that mediate our most humane and creative thoughts— are very sensitive to emotional states, especially fear.” So when people are terrorized, “Problem solving becomes more categorical, concrete and emotional [and] we become more vulnerable to reactive and short-sighted solutions.” —Maia Szalavitz, neuroscience journalist

When we feel threatened, logic and rationality go offline.

Generalization

Statistical facts don’t come to people naturally. Quite the opposite. Most people understand the world by generalizing personal experiences which are very biased. In the media the “news-worthy” events exaggerate the unusual and put the focus on swift changes. Slow and steady changes in major trends don’t get much attention. Unintentionally, people end-up carrying around a sack of outdated facts that we got in school (including knowledge that often was outdated when acquired in school). —gapminder.org/ignorance

System 1 processes data and information through association. It sees patterns and makes connections, whether or not the patterns and connections actually exist. It is, as Daniel Kahneman (Thinking, Fast and Slow) writes, “radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” As a result, System 1 accepts anecdotal evidence as being as valid as verified evidence.

Seeing patterns and finding connections makes it easy to come up with sometimes sweeping generalizations.

One example: Person A is similar to Person B in some particular way; therefore, Person B is probably similar to Person A in other ways. Since I know Person A, I now believe I also know and understand Person B. And I see all of the people who share some of these same characteristics as being alike. This leads me to believe I understand more than I do and know more than I know about Person B and other people who bear some similarity to Person B.

Another example: Extrapolating from my own personal experience to assume that everyone thinks the way I think, feels the way I feel, or would respond the way I respond.

Generalizing can be useful when we need to make quick assessments. But it’s a lazy way of thinking that can be dangerous when used in important or critical situations.

It’s easy to find examples of generalizing in the opinions we have and the alliances we form around hot-button social topics such as climate change, GMOs, vaccines, immigration, and Planned Parenthood. It can also be seen in how people line up in the pro- or anti-science camps.

When we generalize, we make assumptions and draw conclusions from limited data or evidence.

Implicit Biases

Critical thinking doesn’t come naturally. Since we need to make all kinds of assessments and decisions in the course of our lives—and since the part of the brain that can think critically is often offline—we use mental shortcuts instead of thinking most things through.

[Implicit] biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control. Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness. Rather, implicit biases are not accessible through introspection.

The implicit associations we harbor in our subconscious cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance.  These associations develop over the course of a lifetime beginning at a very early age through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.

A Few Key Characteristics of Implicit Biases

  • Implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality such as judges.
  • Implicit and explicit biases are related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other.
  • The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
  • We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.
  • Implicit biases are malleable. Our brains are incredibly complex, and the implicit associations that we have formed can be gradually unlearned through a variety of debiasing techniques.

Source: kirwaninstitute.osu.edu. Note: Harvard University has developed an implicit association test that is available online (https://implicit.harvard.edu/implicit/) so you can test yourself for your own hidden biases.

Now What?

Change is hard because of the way we’re wired. If we can come to terms with the fact that we operate less rationally than we think we do, we might be able to create or modify laws and public policies to be more effective for more people.

Things to remember:

  1. System 1’s agenda is to maintain the status quo, so most of the time that’s our agenda and everyone else’s, too. If it’s difficult for us to make personal changes, imagine how difficult it is to make changes that involve large groups of people—or to change other people’s minds.
  2. System 1 is primarily a threat-detector. When we feel threatened, we are not going to be thinking or behaving logically, and we should expect the same to be true of others. People who feel threatened are easier to manipulate, and they may take actions that are not in their own best interest.
  3. We generalize because System 1 doesn’t handle complexity well. Generalizing leads to a feeling of cognitive ease because we think we know more than we do and understand more than we do. That may not be a problem in trivial matters, but it has huge implications when it comes to laws and public policies.
  4. We are all at the effect of implicit biases. Because we aren’t directly aware of them, it’s easy for us to deny we have them. That doesn’t make them go away, however. The best thing to do is to pay attention to how we act and react to other people so we can begin to recognize, acknowledge, and eventually neutralize some of these biases.

Filed Under: Beliefs, Brain, Cognitive Biases, Living, Unconscious Tagged With: Dan Ariely, Predictably Irrational, Shankar Vedantam, Social Change, System 1, System 2, the Hidden Brain

Brain Dead: Is Your Mind Temporarily Offline?

September 4, 2015 by Joycelyn Campbell


Your brain has two systems for processing the stimuli and experiences of your life and determining how you act upon them.

Conscious: The processing system you’re aware of is called System 2. It is logical and intentional and sometimes referred to as “true reasoning.” (A formal outline is a good example.) It is also slow, limited, and easily depleted. It processes about 40 bits of information at a time.

Unconscious: The processing system you’re not aware of is called System 1. It is associative, which means it sees patterns and connects dots. (A mindmap is a good example.) It is fast, vast, and always on. It processes about 11,000,000 bits of information at a time.

If System 1 were to go offline, you would, too. Game over! But you can still function when System 2 is temporarily offline, even for long periods of time, such as when you’re asleep. So when you think or talk about being temporarily brain dead, you’re talking about exhausting System 2 attention.

If you’re in good health, there’s not much you can do to tax or exhaust the capacity of System 1—and there are things you can do to enhance its functioning. However, your supply of System 2 attention is always limited, and anything that occupies your working memory reduces it. Some examples of things that tax System 2 attention are:

  • Physical illness (even minor), injury, or lack of sleep
  • Making numerous trivial decisions throughout the day
  • Stress, anxiety, and worry
  • Exercising will power (forcing yourself to do something you don’t want to do or to not do something you do want to do)
  • Monitoring your behavior
  • Monitoring your environment if it is new or you consider it unsafe
  • Learning something new, traveling an unfamiliar route, etc.
  • Completing a complex computation
  • Trying to tune out distractions
  • A long period of concentrated or focused attention
  • Trying to remember dates, numbers, or unrelated facts
  • Listening to me talk

Since System 1 is fast, vast, always on, and has an answer for almost everything—and since you don’t need System 2 attention for most of what you do when you’re awake—what’s the big deal if you run out of System 2 attention from time to time?

Three Categories of Errors

Optimally, the two systems work together, and neither type of processing is superior. However, System 1 is more useful in some situations, while in others System 2 is not only more useful but required.

System 1 is pretty good at what it does because its models of familiar situations are accurate, so its short-term predictions tend to be reliable. But that’s not always the case. System 1 sacrifices accuracy for speed, meaning it jumps to conclusions. It also has biases and is prone to making logical errors.

One of System 2’s jobs is to detect System 1’s errors and adjust course by overriding System 1’s impulses. As Daniel Kahneman says in Thinking, Fast and Slow:

There are vital tasks that only System 2 can perform because they require effort and acts of self-control in which the intuitions and impulses of System 1 are overcome.

Bear in mind that System 1 is not rational. If System 2 is depleted and can’t veto or modify the non-rational impulses of System 1, those impulses then turn into actions (or speech).

There are three categories of errors you tend to make when System 2 is depleted.

Logical Errors

System 1 thinking uses shortcuts. System 2 thinking takes the long (logical/linear) way home. So when you’re out of System 2 attention, you’re more prone to making mistakes in anything that requires logical, linear thinking. Errors of intuitive thought can be difficult for System 2 to catch on a good day. When System 2 is offline, you automatically assume them to be correct. As a result:

  • You will have trouble making, following, or checking the validity of a complex logical argument. You’ll be more likely to be led by the cognitive biases and distortions System 1 uses because they don’t require any effort and give you a comforting sense of cognitive ease.
  • You will have difficulty comparing the features of two items for overall value. If you have to make a choice, you’ll be more likely to go with what intuitively feels right or the item that has some emotionally compelling attribute (it reminds you of the one your mother had, for example, or reminds you of your mother).
  • You will be more gullible. You’ll be more likely to believe things you wouldn’t otherwise believe or be persuaded by empty messages, such as in commercials. System 2 is the skeptic, so the best time for someone to take advantage of you is when it is offline.

Intention or Response Errors

System 1 continuously picks up on cues and triggers in your environment to determine what situation you’re in and to predict what’s next. Any deviation from the norm requires System 2 attention. If it isn’t available, you’re likely to do not what you intended to do but whatever is normal for you in that situation. And without System 2 attention, you’re much more likely to respond automatically (habitually) to the stimulus (cue or trigger).

  • System 2 is in charge of self-control, continuously monitoring your behavior, keeping you polite, for example, when you’re angry. In the heat of the moment, when you’re out of System 2 attention, you’re much less likely to be able to suppress your immediate emotional reactions to people and situations.
  • System 1 has an answer for almost everything. But when it encounters a surprising situation (something it hasn’t previously encountered or that is unusual in that situation), it notifies System 2. You don’t need System 2 attention to drive a familiar route, but if you encounter an obstacle along that route, you need System 2 to figure out what it is and to respond appropriately to it.
  • System 2 is also in charge of will power. If you are in the process of trying to stop doing something you habitually do (such as raiding the refrigerator in the evening), you need System 2 to belay the impulse from System 1 to see if there’s more pie. Without System 2, you’re more likely to give in, look for the pie…and eat it.
  • You need System 2 if you want to take a different route from your usual one or make an extra stop you don’t normally make. Without adequate System 2 attention, you’re likely to find yourself taking the usual route and forgetting to make that stop.

Gatekeeping Errors

We all have biases, whether or not we’re aware of them and whether or not we want to admit it. While it’s easy to spot overt biases and prejudices in other people, most of your own biases are hidden even from you. In the case of biases toward specific groups of people, you’ve likely come to a reasoned conclusion that they’re wrong and have chosen not to think about or treat other people based on stereotypes. But that doesn’t mean the biases have disappeared. They’re still part of System 1’s associative processing operations. It’s just that when System 1 suggests a biased response to System 2, System 2 normally overrides it. Per Daniel Kahneman:

Conflict between an automatic reaction (System 1) and an intention to control it (System 2) is common in our lives.

When System 2 is depleted, there is no one at the gate to keep the biased or prejudiced responses from getting through. You may simply have a biased thought. You may say something in the presence of others that you wouldn’t normally say. Or you may respond to another person based on a group stereotype. The thought, comment, or behavior may be something you later regret. If you were to claim it doesn’t represent what you believe or the way you really feel or think, you’d most likely be right.

But when you see a blatant expression of bias or prejudice in someone else—especially a celebrity—you might have a different reaction. You might assume their true colors are showing. We think that what we see in other people when their guard is down and they’re pushed or stressed reveals the truth about them. But the actual truth is that to the extent we have any civility at all, it’s because System 2 maintains it. Without System 2, you and I would have no ability to question our biases or prejudices, no ability to come to reasoned conclusions about them, and no ability to monitor and veto System 1’s automatic reactions.

Conclusion

It isn’t always necessary, advisable, or even possible to override System 1. But when you deplete System 2, you can’t override it even when you want or need to. Without System 2, you can’t think straight (logically and linearly). So:

  • Don’t try to make important decisions of any kind when you feel brain dead.
  • Don’t assume you’ll feel or think the same way about something the next day as you do when you’re stressed or sick, have just completed your annual tax return, or have recently fallen in love.
  • Don’t stay up late to watch the QVC channel unless you have a lot of money you’re trying to unload.
  • Don’t keep pie around if you’re trying not to eat it.
  • Don’t get into debates about complex issues after you’ve had a few beers.
  • Don’t tax your working memory with details you can keep track of some other way.
  • Don’t take System 2’s censoring of your biases and prejudices for granted. And don’t assume other people’s mental lapses reveal deep-seated truths about them.

Filed Under: Attention, Brain, Cognitive Biases, Consciousness, Living, Memory, Mind, Unconscious Tagged With: Brain, Brain Dead, Cognitive Biases, Daniel Kahneman, Fast and Slow, Mind, Predictably Irrational, System 1, System 2, Thinking

Thinking Is Difficult

April 5, 2014 by Joycelyn Campbell


Filed Under: Brain, Living, Mind Tagged With: Brain, Dan Ariely, Mind, Predictably Irrational, Thinking
