Farther to Go!

Brain-Based Transformational Solutions


Hunting for Foxhogs,
I Find a Foxcat Instead

September 10, 2020 by Joycelyn Campbell 1 Comment

In the last chapter of Curious, Ian Leslie lays out “seven ways to stay curious.” Item number three on the list is “forage like a foxhog.” The foraging he refers to is for information. The question under consideration is whether it’s better to have a depth of knowledge (specialize) or a breadth of knowledge (generalize).

Eventually he connects these two approaches to a quote from Greek poet Archilochus:

The fox knows many things, but the hedgehog knows one big thing.

Leslie suggests that these animals represent two different ways of thinking, neither of which is really better than the other: the hedgehog knows a lot about a little, while the fox knows a little about a lot.

The thinkers best positioned to thrive today and in the future will be a hybrid of these two animals. In a highly competitive, high-information world, it’s crucial to know one or two big things and to know them in more depth and detail than most of your contemporaries. But to really ignite that knowledge, you need the ability to think about it from a variety of eclectic perspectives and to be able to collaborate fruitfully with people who have different specializations. —Ian Leslie

So by combining the fox and the hedgehog, we get the “foxhog.”

Leslie devotes six pages to this discussion, at the end of which I was not entirely clear about the distinctions he was making beyond specialization vs. generalization. So I did a little research of my own.

Assumptions Were Made (but not by me)

The first thing I discovered was that this concept of the hedgehog and the fox is fairly widely used. That was a little surprising. Also surprising was the fact that although people seem to have definite ideas about what the concept means, it doesn’t appear to mean the same thing to everyone.

I listened to a 38-minute podcast of The Hidden Brain titled The Fox and the Hedgehog, which I found interesting and worth listening to. But it did not advance my understanding at all.

It turns out that Archilochus may have been the source of the quote, but we have no elaboration from him on its meaning. That credit goes to philosopher Isaiah Berlin and his essay The Hedgehog and the Fox, published as a book in 1953. It was Berlin who first classified various philosophers, writers, and scientists as either hedgehogs or foxes. But the focus of the essay was Leo Tolstoy, whom Berlin conceived of as that hybrid creature, the “foxhog” (although he did not, of course, use that term).

According to Berlin, Tolstoy was really a fox who wanted to be a hedgehog, and this internal dissonance was a source of distress to him. That would make Tolstoy a bad example of a “foxhog,” but Leslie does give us a few positive role models.

After checking out Berlin, I understood that Shankar Vedantam (the Hidden Brain podcast) had based his understanding of the concept of the hedgehog and the fox on Berlin’s essay. But other people had somewhat different ideas, and I was still trying to understand it in terms of types of thinkers—or leaders—or learners. The characteristics associated with foxes and hedgehogs by various proponents didn’t really hang together.

Enter the Foxcat

Eventually, I came across a different perspective based on an Aesop’s fable. It turns out there is a fable titled The Fox and the Hedgehog, but the moral of that story doesn’t seem to have anything to do with what Berlin or Leslie or any of the others are talking about. The fable that does connect is The Fox and the Cat.

This fable sees the fox and cat discussing the various tricks and dodges they know: the fox has many, while the cat says he has just one. The fox appears to have the advantage, until a pack of wild dogs attacks them both. The cat’s one bright idea—climb a tree to get out of harm’s way—rewards him by saving him from the dogs, while the fox—busy chewing over which of his bright ideas to act upon—remains rooted to the spot and is torn apart by the hounds.

Clearly there’s a moral there: act quickly and decisively when you have to, rather than endlessly turning over the various options in your head. —interestingliterature.com

In this story, the fox represents System 2, conscious processing, which allows for more possibilities but is also slow and energy intensive. The cat represents System 1, unconscious processing, which is fast because it acts based on habit and instinct: what worked in the past. (I especially like this because I frequently use my cat as an example of a creature who acts exclusively on System 1 impulses.)

Is There a Moral to This Story?

Neither the fable of The Fox and the Hedgehog nor the fable of The Fox and the Cat is directly relevant to Leslie’s idea about foraging for information. (I don’t think they’re relevant to Isaiah Berlin’s ideas about Tolstoy, either, but that’s another rabbit hole.) In terms of staying curious, I definitely agree with Leslie that breadth is as important as depth. “T-shaped knowledge” combines specialization (the vertical axis) with broad understanding in other areas (the horizontal axis).

The same could be said of System 1 and System 2 thinking: one is as important as the other. It’s important to know when to apply logical, linear, critical thinking and when to allow unconscious associative thinking.

But the moral of the story is that there’s no good reason for us to believe that we know what we’re talking about—or what anyone else is talking about, for that matter. We take the world at face value when we ought to question our assumptions.

Sure, curiosity may have killed the cat. But satisfaction brought it back.

Filed Under: Clarity, Curiosity, Living, Stories Tagged With: Aesop's Fables, Curiosity, Curious, Isaiah Berlin, The Hedgehog and the Fox, the Hidden Brain, Thinking

Predictably Irrational: The Hidden Brain and Social Change

October 16, 2015 by Joycelyn Campbell Leave a Comment

We have a difficult time making behavior changes in our own lives, yet we’re often surprised that enacting social change is so frustrating, difficult, and time consuming. But the situation isn’t remotely surprising. Change is difficult and slow because our brain is wired to maintain the status quo, and it is we—people with brains wired to maintain the status quo—who put into place and are then affected by laws and social policies.

One part of our brain (System 2) can see the benefit of change and wants to make changes. The other part of the brain (System 1) actively resists change. The part of the brain that can see the benefit of change is slow, lazy, and easily depleted. The part of the brain that resists change is fast, vast, and always on. When System 2 is depleted, which is often, we revert to operating not logically and rationally, but on autopilot.

Furthermore, laws and social policies are based on the idea that people are rational actors who respond to incentives in straightforward ways. We believe that education, awareness, and clearly defined negative consequences are effective strategies. This is a very logical position to take. It’s also one of the reasons why our laws and policies don’t work the way we expect them to work.

Many of our social institutions—and laws in particular—implicitly assume that human actions are largely the product of conscious knowledge and intention. We believe that all we need for a law-abiding society is to let people know what is right and what is wrong, and everything will follow from there. Sure, we make exceptions for people with grave mental disorders, but we assume most human behavior is conscious and intentional. Even when we acknowledge the power of unconscious influence, we believe it can be overcome by willpower or education.—Shankar Vedantam, The Hidden Brain

The hidden brain, as Shankar Vedantam refers to System 1, doesn’t operate logically or rationally. It isn’t necessarily up to the same thing the conscious part of our brain, System 2, is up to. For example:

  1. System 1 focuses on survival and detecting threats to our survival.
  2. System 1 can’t handle complexity, so it generalizes instead.
  3. System 1 is biased because biases make it easier to decide what we think.

Threat Detection

The brain is, first and foremost, a survival tool, and the way that it has found to be most effective at guaranteeing survival is through the threat and reward response. Put simply, your brain will cause you to move away from threats and move toward rewards. —Dr. David Rock, author of Your Brain at Work

This sounds reasonable and not particularly problematic until you realize that, in addition to actual survival needs (food, water, shelter, etc.) and actual physical threats, each of us has personalized our threat-detection system to include situations we have defined as threatening. And once the brain gets the idea that something is a threat, it responds as if it is facing a threat to our physical survival.

How logical do you tend to be when you’re facing a threat to your survival?

When the brain is under severe threat, it immediately changes the way it processes information, and starts to prioritize rapid responses. “The normal long pathways through the orbitofrontal cortex, where people evaluate situations in a logical and conscious fashion and [consider] the risks and benefits of different behaviors— that gets short circuited,” says Dr. Eric Hollander, professor of psychiatry at Montefiore/Albert Einstein School of Medicine in New York. Instead, he says, “You have sensory input right through the sensory [regions] and into the amygdala or limbic system.”

This dramatically alters how we think, since the limbic system is deeply engaged with modulating our emotions. “The neural networks in the brain that are involved in rational, abstract cognition— essentially, the systems that mediate our most humane and creative thoughts— are very sensitive to emotional states, especially fear.” So when people are terrorized, “Problem solving becomes more categorical, concrete and emotional [and] we become more vulnerable to reactive and short-sighted solutions.” —Maia Szalavitz, neuroscience journalist

When we feel threatened, logic and rationality go offline.

Generalization

Statistical facts don’t come to people naturally. Quite the opposite. Most people understand the world by generalizing personal experiences which are very biased. In the media the “news-worthy” events exaggerate the unusual and put the focus on swift changes. Slow and steady changes in major trends don’t get much attention. Unintentionally, people end-up carrying around a sack of outdated facts that we got in school (including knowledge that often was outdated when acquired in school). —gapminder.org/ignorance

System 1 processes data and information through association. It sees patterns and makes connections, whether or not the patterns and connections actually exist. It is, as Daniel Kahneman (Thinking, Fast and Slow) writes, “radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions.” As a result, System 1 accepts anecdotal evidence as being as valid as verified evidence.

Seeing patterns and finding connections makes it easy to come up with sometimes sweeping generalizations.

One example: Person A is similar to Person B in some particular way; therefore, Person B is probably similar to Person A in other ways. Since I know Person A, I now believe I also know and understand Person B. And I see all of the people who share some of these same characteristics as being alike. This leads me to believe I understand more than I do and know more than I know about Person B and other people who bear some similarity to Person B.

Another example: Extrapolating from my own personal experience to assume that everyone thinks the way I think, feels the way I feel, or would respond the way I respond.

Generalizing can be useful when we need to make quick assessments. But it’s a lazy way of thinking that can be dangerous when used in important or critical situations.

It’s easy to find examples of generalizing in the opinions we have and the alliances we form around hot-button social topics such as climate change, GMOs, vaccines, immigration, and Planned Parenthood. It can also be seen in how people line up in the pro- or anti-science camps.

When we generalize, we make assumptions and draw conclusions from limited data or evidence.

Implicit Biases

Critical thinking doesn’t come naturally. Since we need to make all kinds of assessments and decisions in the course of our lives—and since the part of the brain that can think critically is often offline—we use mental shortcuts instead of thinking most things through.

[Implicit] biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control. Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness. Rather, implicit biases are not accessible through introspection.

The implicit associations we harbor in our subconscious cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance.  These associations develop over the course of a lifetime beginning at a very early age through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.

A Few Key Characteristics of Implicit Biases

  • Implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality such as judges.
  • Implicit and explicit biases are related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other.
  • The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
  • We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.
  • Implicit biases are malleable. Our brains are incredibly complex, and the implicit associations that we have formed can be gradually unlearned through a variety of debiasing techniques.

Source: kirwaninstitute.osu.edu. Note: Harvard University has developed an implicit association test that is available online (https://implicit.harvard.edu/implicit/) so you can test yourself for your own hidden biases.

Now What?

Change is hard because of the way we’re wired. If we can come to terms with the fact that we operate less rationally than we think we do, we might be able to create or modify laws and public policies to be more effective for more people.

Things to remember:

  1. System 1’s agenda is to maintain the status quo, so most of the time that’s our agenda and everyone else’s, too. If it’s difficult for us to make personal changes, imagine how difficult it is to make changes that involve large groups of people—or to change other people’s minds.
  2. System 1 is primarily a threat-detector. When we feel threatened, we are not going to be thinking or behaving logically, and we should expect the same to be true of others. People who feel threatened are easier to manipulate, and they may take actions that are not in their own best interest.
  3. We generalize because System 1 doesn’t handle complexity well. Generalizing leads to a feeling of cognitive ease because we think we know more than we do and understand more than we do. That may not be a problem in trivial matters, but it has huge implications when it comes to laws and public policies.
  4. We are all at the effect of implicit biases. Because we aren’t directly aware of them, it’s easy for us to deny we have them. That doesn’t make them go away, however. The best thing to do is to pay attention to how we act and react to other people so we can begin to recognize, acknowledge, and eventually neutralize some of these biases.

Filed Under: Beliefs, Brain, Cognitive Biases, Living, Unconscious Tagged With: Dan Ariely, Predictably Irrational, Shankar Vedantam, Social Change, System 1, System 2, the Hidden Brain
