Although we have a fundamental belief in human rationality, on which our laws are based, the evidence is mounting that we are, as psychologist Dan Ariely says, “predictably irrational.” On the one hand, this explains quite a lot about the way things play out in the wider world. When you recognize how irrational we actually are, you’re less likely to be surprised by the things people do and say and think. On the other hand, if you’re in favor of fairness and justice, the situation is extremely troubling.
The path to correcting society’s most significant ills may need to begin with questioning some of our basic assumptions about human nature.
The Status Quo Is Status Quo
We have a hard time making behavior changes in our own lives, yet we’re often surprised that enacting social change is so frustrating, difficult, and time consuming. But the situation isn’t remotely surprising. Change is difficult and slow because our brain is wired to maintain the status quo, and it is we—people with brains wired to maintain the status quo—who put into place and are then affected by laws and social policies.
One part of our brain (System 2) can see the benefit of change and wants to make changes. The other part of the brain (System 1) actively resists change. The part of the brain that can see the benefit of change is slow, lazy, and easily depleted. Much of the time it’s offline. The part of the brain that resists change is fast, vast, and always on. When System 2 is depleted, we revert to operating not logically and rationally, but on autopilot.
Furthermore, laws and social policies are based on the idea that people are rational actors who respond to incentives in straightforward ways. We believe that education, awareness, and clearly defined negative consequences are effective strategies. This is a very logical position to take. It’s also one of the reasons our laws and policies don’t work the way we expect them to.
Many of our social institutions—and laws in particular—implicitly assume that human actions are largely the product of conscious knowledge and intention. We believe that all we need for a law-abiding society is to let people know what is right and what is wrong, and everything will follow from there. Sure, we make exceptions for people with grave mental disorders, but we assume most human behavior is conscious and intentional. Even when we acknowledge the power of unconscious influence, we believe it can be overcome by willpower or education. —Shankar Vedantam, The Hidden Brain
The hidden brain, as Shankar Vedantam refers to System 1, doesn’t operate logically or rationally. It isn’t necessarily up to the same thing the conscious part of our brain, System 2, is up to. For example:
- System 1 focuses on survival and detecting threats to our survival.
- System 1 can’t handle complexity, so it generalizes instead.
- System 1 is biased because biases make it easier to decide what we think.
Threat Detection
The brain is, first and foremost, a survival tool, and the way that it has found to be most effective at guaranteeing survival is through the threat and reward response. Put simply, your brain will cause you to move away from threats and move toward rewards. —Dr. David Rock, author of Your Brain at Work
This sounds reasonable and not particularly problematic until you realize that, in addition to actual survival needs (food, water, shelter, etc.) and actual physical threats, each of us has personalized our threat-detection system to include situations we have defined as threatening. And once the brain gets the idea that something is a threat, it responds as if it were facing a literal threat to our physical survival.
How logical do you tend to be when you’re facing a threat to your survival?
When the brain is under severe threat, it immediately changes the way it processes information, and starts to prioritize rapid responses. “The normal long pathways through the orbitofrontal cortex, where people evaluate situations in a logical and conscious fashion and [consider] the risks and benefits of different behaviors— that gets short circuited,” says Dr. Eric Hollander, professor of psychiatry at Montefiore/Albert Einstein School of Medicine in New York. Instead, he says, “You have sensory input right through the sensory [regions] and into the amygdala or limbic system.”
This dramatically alters how we think, since the limbic system is deeply engaged with modulating our emotions. “The neural networks in the brain that are involved in rational, abstract cognition—essentially, the systems that mediate our most humane and creative thoughts—are very sensitive to emotional states, especially fear.” So when people are terrorized, “Problem solving becomes more categorical, concrete and emotional [and] we become more vulnerable to reactive and short-sighted solutions.” —Maia Szalavitz, neuroscience journalist
When we feel threatened, logic and rationality go offline.
Generalization
Statistical facts don’t come to people naturally. Quite the opposite. Most people understand the world by generalizing personal experiences which are very biased. In the media the “news-worthy” events exaggerate the unusual and put the focus on swift changes. Slow and steady changes in major trends don’t get much attention. Unintentionally, people end-up carrying around a sack of outdated facts that we got in school (including knowledge that often was outdated when acquired in school). —gapminder.org/ignorance
System 1 processes data and information through association. It sees patterns and makes connections, whether or not the patterns and connections actually exist. It is, as Daniel Kahneman (Thinking, Fast and Slow) writes, “radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” As a result, System 1 accepts anecdotal evidence as being as valid as verified evidence.
Seeing patterns and finding connections makes it easy to come up with sometimes sweeping generalizations.
One example: Person A is similar to Person B in some particular way; therefore, Person B is probably similar to Person A in other ways. Since I know Person A, I now believe I also know and understand Person B. And I see all of the people who share some of these same characteristics as being alike. This leads me to believe I understand and know more than I actually do about Person B and about other people who bear some similarity to Person B.
Another example: I extrapolate from my own personal experience and assume that everyone thinks the way I think, feels the way I feel, or would respond the way I respond.
Generalizing can be useful when we need to make quick assessments. But it’s a lazy way of thinking that can be dangerous when used in important or critical situations.
It’s easy to find examples of generalizing in the opinions we have and the alliances we form around hot-button social topics such as climate change, GMOs, vaccines, immigration, and Planned Parenthood. It can also be seen in how people line up in the pro- or anti-science camps.
When we generalize, we make assumptions and draw conclusions from limited data or evidence.
Implicit Biases
Critical thinking doesn’t come naturally. Since we need to make all kinds of assessments and decisions in the course of our lives—and since the part of the brain that can think critically is often offline—we use mental shortcuts instead of thinking most things through.
[Implicit] biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control. Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness. Rather, implicit biases are not accessible through introspection.
The implicit associations we harbor in our subconscious cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance. These associations develop over the course of a lifetime beginning at a very early age through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.
A Few Key Characteristics of Implicit Biases
- Implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality, such as judges.
- Implicit and explicit biases are related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other.
- The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
- We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.
- Implicit biases are malleable. Our brains are incredibly complex, and the implicit associations that we have formed can be gradually unlearned through a variety of debiasing techniques.
Source: kirwaninstitute.osu.edu. Note: Harvard University has developed an implicit association test that is available online so you can test yourself for your own hidden biases.
Now What?
Change is hard because of the way we’re wired. If we can come to terms with the fact that we operate less rationally than we think we do, we might be able to create or modify laws and public policies to be more effective for more people.
Things to remember:
- System 1’s agenda is to maintain the status quo, so most of the time that’s our agenda and everyone else’s, too. If it’s difficult for us to make personal changes, imagine how difficult it is to make changes that involve large groups of people—or to change other people’s minds.
- System 1 is primarily a threat-detector. When we feel threatened, we are not going to be thinking or behaving logically, and we should expect the same to be true of others. People who feel threatened are easier to manipulate, and they may take actions that are not in their own best interest.
- We generalize because System 1 doesn’t handle complexity well. Generalizing leads to a feeling of cognitive ease because we think we know more than we do and understand more than we do. That may not be a problem in trivial matters, but it has huge implications when it comes to laws and public policies.
- We are all subject to implicit biases. Because we aren’t directly aware of them, it’s easy for us to deny we have them. That doesn’t make them go away, however. The best thing to do is to pay attention to how we act and react to other people so we can begin to recognize, acknowledge, and eventually neutralize some of these biases.
Making the unconscious conscious is difficult because the central obstacle lies within ourselves. But putting reason ahead of instinct and intuition is also what sets us apart from every other species that has ever lived. Understanding the hidden brain and building safeguards to protect us against its vagaries can help us be more successful in our everyday lives. It can aid us in our battle against threats and help us spend our money more wisely. But it can also do something more important than any of those things: It can make us better people. —Shankar Vedantam, The Hidden Brain
Note: This is a slightly modified version of my 10/16/15 post as a response to recent events not only in the U.S. but around the world.