Farther to Go!

Brain-Based Transformational Solutions


Hunting for Foxhogs, I Find a Foxcat Instead

September 10, 2020 by Joycelyn Campbell

In the last chapter of Curious, Ian Leslie lays out “seven ways to stay curious.” Item number three on the list is “forage like a foxhog.” The foraging he refers to is for information. The question under consideration is whether it’s better to have a depth of knowledge (specialize) or a breadth of knowledge (generalize).

Eventually he connects these two approaches to a quote from the Greek poet Archilochus:

The fox knows many things, but the hedgehog knows one big thing.

Leslie suggests that these animals represent two different ways of thinking, neither of which is really better than the other: the hedgehog knows a lot about a little, while the fox knows a little about a lot.

The thinkers best positioned to thrive today and in the future will be a hybrid of these two animals. In a highly competitive, high-information world, it’s crucial to know one or two big things and to know them in more depth and detail than most of your contemporaries. But to really ignite that knowledge, you need the ability to think about it from a variety of eclectic perspectives and to be able to collaborate fruitfully with people who have different specializations. —Ian Leslie

So by combining the fox and the hedgehog, we get the “foxhog.”

Leslie devotes six pages to this discussion, at the end of which I was not entirely clear about the distinctions he was making beyond specialization vs. generalization. So I did a little research of my own.

Assumptions Were Made (but not by me)

The first thing I discovered was that this concept of the hedgehog and the fox is fairly widely used. That was a little surprising. Also surprising was the fact that although people seem to have definite ideas about what the concept means, it doesn’t appear to mean the same thing to everyone.

I listened to a 38-minute episode of the Hidden Brain podcast titled The Fox and the Hedgehog, which I found interesting and worth listening to. But it did not advance my understanding at all.

It turns out that Archilochus may have been the source of the quote, but we have no elaboration from him on its meaning. That credit goes to philosopher Isaiah Berlin and his essay The Hedgehog and the Fox, published as a book in 1953. It was Berlin who first classified various philosophers, writers, and scientists as either hedgehogs or foxes. But the focus of the essay was Leo Tolstoy, whom Berlin conceived of as that hybrid creature, the “foxhog” (although he did not, of course, use that term).

According to Berlin, Tolstoy was really a fox who wanted to be a hedgehog, and this internal dissonance was a source of distress to him. That would make Tolstoy a bad example of a “foxhog,” but Leslie does give us a few positive role models.

After checking out Berlin, I understood that Shankar Vedantam (host of the Hidden Brain podcast) had based his interpretation of the hedgehog and the fox on Berlin’s essay. But other people had somewhat different ideas, and I was still trying to understand the concept in terms of types of thinkers—or leaders—or learners. The characteristics various proponents associated with foxes and hedgehogs didn’t really hang together.

Enter the Foxcat

Eventually, I came across a different perspective based on an Aesop’s fable. It turns out there is a fable titled The Fox and the Hedgehog, but the moral of that story doesn’t seem to have anything to do with what Berlin or Leslie or any of the others are talking about. The fable that does connect is The Fox and the Cat.

This fable sees the fox and the cat discussing the various tricks and dodges they know: the fox has many, while the cat says he has just one. The fox appears to have the advantage, until a pack of wild dogs attacks them both. The cat’s one bright idea—climbing a tree to get out of harm’s way—saves him from the dogs, while the fox—busy chewing over which of his many bright ideas to act upon—remains rooted to the spot and is torn apart by the hounds.

Clearly there’s a moral there: act quickly and decisively when you have to, rather than endlessly turning over the various options in your head. —interestingliterature.com

In this story, the fox represents System 2, conscious processing, which allows for more possibilities but is also slow and energy intensive. The cat represents System 1, unconscious processing, which is fast because it acts based on habit and instinct: what worked in the past. (I especially like this because I frequently use my cat as an example of a creature who acts exclusively on System 1 impulses.)

Is There a Moral to This Story?

Neither the fable of The Fox and the Hedgehog nor the fable of The Fox and the Cat is directly relevant to Leslie’s idea about foraging for information. (I don’t think they’re relevant to Isaiah Berlin’s ideas about Tolstoy, either, but that’s another rabbit hole.) In terms of staying curious, I definitely agree with Leslie that breadth is as important as depth. “T-shaped knowledge” combines specialization (the vertical axis) with broad understanding in other areas (the horizontal axis).

The same could be said of System 1 and System 2 thinking: one is as important as the other. It’s important to know when to apply logical, linear, critical thinking and when to allow unconscious associative thinking.

But the moral of the story is that there’s no good reason for us to believe that we know what we’re talking about—or what anyone else is talking about, for that matter. We take the world at face value when we ought to question our assumptions.

Sure, curiosity may have killed the cat. But satisfaction brought it back.

Filed Under: Clarity, Curiosity, Living, Stories Tagged With: Aesop's Fables, Curiosity, Curious, Isaiah Berlin, The Hedgehog and the Fox, the Hidden Brain, Thinking

Think INside the Box

May 4, 2017 by Joycelyn Campbell

The concept of thinking outside the box is a metaphor for thinking differently, unconventionally, or from a new perspective. It’s also a cliché about clichéd thinking. You can’t actually think outside the box, anyway, since you are constrained by the mental model your brain constructs and maintains for you. The mental model is the box, and you are always inside it. Contrary to some branches of popular thought, that’s not a bad thing.

Here’s a story that’s meant to illustrate thinking outside the box but that’s actually an excellent example of just the opposite—thinking inside the box.

Island of Safety*

On August 5, 1949, 15 firefighters and their foreman, Wag Dodge, were airlifted to Mann Gulch in Montana to extinguish what they thought would be a relatively small brush fire on one side of the gulch. They parachuted onto the opposite side of the gulch, joined one fire guard, and began descending with the wind at their backs.

Suddenly and unexpectedly, the wind reversed, and the fire jumped over to ignite the grass on their side. As the flames rapidly approached them, the men began to climb the slope to try to outrun the fire, pausing only to drop their heavy equipment.

But Dodge, the foreman, realized the fire was moving too quickly for that to work. He stopped and lit the grass in front of him with a match. The dry grass immediately caught fire, and the wind blew the flames up the side of the gulch, away from him. That left a patch of charred ground onto which Dodge crawled. When the advancing fire arrived, it flowed around and then away from his island of safety.

The other men misunderstood what he was doing and, in spite of his exhortations for them to join him, continued up the slope. Only two, who had found shelter in a narrow crevice, survived.

Notice that it was the foreman who had the idea to fight fire with fire.

As the foreman, Dodge presumably had more experience and knowledge than the men he was supervising. The other firefighters not only didn’t come up with the idea, they also didn’t understand it when he showed it to them. The “box” Dodge was thinking inside was different from the boxes of the other men.

While you can’t escape thinking from inside your own box, you can continually remodel and expand it, thereby increasing your possibilities for original, innovative, and creative thinking.

Here’s another thinking-inside-the-box example.

WALL-E*

Andrew Stanton of Pixar Animation Studios was working on the screenplay for WALL-E, about the last robot left on a hopelessly polluted earth abandoned by humans. He was struggling with the design of WALL-E’s face, which he wanted to be both machinelike and expressive.

At a baseball game one day, he borrowed binoculars from someone sitting next to him. When he mistakenly turned them around so that the lenses were on the wrong side, he realized the binoculars looked like a face. After flexing the inner hinges several times to create different facial expressions, he decided WALL-E would look like a “binocular on a stem.”

Stanton had been writing and directing animated films for 20 years by the time he started working on WALL-E. He had already framed—and attempted to solve—the problem of WALL-E’s appearance before his binocular incident. And just as Wag Dodge did, he had a vast reservoir of experience and knowledge to draw upon.

The contents and the connections inside his box made it possible for him to come up with the solution.

The best things you can do for yourself to live a healthy (on every level) life also happen to be the best things you can do to expand your mental model: learn, move, create, challenge yourself; repeat.


*The two stories were drawn from The Eureka Factor by John Kounios and Mark Beeman.

Filed Under: Beliefs, Brain, Clarity, Living Tagged With: Creativity, Inside the Box, Mental Model, Thinking

T Is for Thinking

March 15, 2017 by Joycelyn Campbell

What exactly is thinking? It turns out this is an area where you can’t trust dictionaries to provide meaningful definitions. If you consider the various definitions of the word—or the process—you’re likely either to be confused or to grab one that fits your existing concept so you (and your not-necessarily-thinking brain) can move on.

I don’t want to get philosophical about it, but I think there’s value in acknowledging the confusion. Being able to think clearly and effectively is essential for anyone who wants to lead a satisfying and meaningful life. It’s the difference between using your brain and letting your brain use you.

Warning! Metacognition* Ahead.

One definition equates thinking and opinion. But that source also equates opinion and judgment, so my opinion is that their thinking is sloppy and can’t be trusted. Are they referring to opinions and judgments rendered as a result of careful deliberation or are they referring to off-the-cuff (and often off-the-wall) moment-to-moment opinions and judgments that result from jumping to conclusions based on little or no evidence?

Another source says thinking is the action of using one’s mind to produce thoughts. This sounds reasonable, but I’m not sure what they mean by “using one’s mind.” Based on the way the two parts of the brain work, we know that the majority of thoughts we have are suggestions from System 1 (the unconscious) rather than the result of conscious deliberation.

Yet another definition equates thinking with having a conscious mind. But there’s a difference between consciousness and both the contents of consciousness (what you’re aware of—see above) and conscious processes. You’re conscious of all kinds of things you’ve never given any particular thought to.

For example, I’m aware that I dislike the color pink and rainy climates. I’m also aware that I’m suspicious of people who prefer rainy climates. But I’m not under the impression that any actual thinking was involved in the development of those so-called “thoughts.”

How Do I Think? Let Me Count the Ways.

Some of the confusion undoubtedly results from the fact that, as with memory, there are so many different types of thinking that the term needs adjectives to clarify and differentiate them. Variations on the theme of thinking include:

  • Critical thinking
  • Associative thinking
  • Ruminative thinking
  • Creative thinking
  • Default-mode thinking
  • Counterfactual thinking
  • Overthinking
  • Positive thinking

Critical thinking is the ability to think clearly, rationally, and objectively and to understand the logical connection between ideas. It’s an active rather than a passive process. Because it requires System 2 (conscious) attention, it doesn’t come naturally and isn’t easy. In order to make an important decision or solve a significant problem, you need well-developed critical thinking skills so you can effectively evaluate both the information at hand and the “intuitive” suggestions spontaneously arising from System 1.

Associative thinking is the process System 1 (the unconscious) uses to link one thing (thought, idea, experience, etc.) to another. Associative thinking is much faster than logical, linear thinking, and there are times and places when quick, non-reflective responses are required. But there are some built-in problems with associative thinking. It sacrifices accuracy for speed, so the patterns it sees and the connections it makes don’t always lead to useful conclusions. It doesn’t discriminate very well, preferring clear-cut distinctions to shades of gray. And it takes numerous cognitive shortcuts known as cognitive biases.

Ruminative thinking is the tendency to passively think about the meaning, origins, and consequences of negative emotions (Nolen-Hoeksema, 1991). One negative incident or thought leads to another, and the escalating intensity of negative thoughts can result in depression, aggression, or even an increase in physical pain. You can ruminate about situations, other people, or about yourself (self-rumination). Rumination can feel like problem-solving, but all it does is keep you focused on the problem. The danger is that it can become a habit—and habits are notoriously difficult to change.

Creative thinking (or creativity) is the ability to see what already exists in a new light, to think of new ideas, and to make new things. This is less a talent or gift than an approach to life, and it provides many rewards apart from the products of creativity. Creative thinkers are less likely to be bored, more likely to have greater problem-solving abilities, and are very likely to get more general enjoyment out of life. The key to creative thinking is to know when to use logical, linear (System 2) thinking and when to use associative (System 1) thinking.

Counterfactual thinking is thinking that runs counter to the facts. It consists of imagining outcomes other than the ones that occurred: the way things could have been—or should have been—different from the way they turned out.  Being able to imagine different outcomes is an enormous evolutionary and practical advantage. It’s integral to being creative or inventive and in not continuing to make the same mistakes over and over again. Counterfactual thinking can be either functional (helps you figure out what to do next time) or nonfunctional (leads to blame, stress, anxiety, etc.). And it can be either upward (how could things have gone better?) or downward (how could things have gone worse?).

Default-mode thinking is the opposite of mindfulness. Although you can sometimes direct your mind to focus on what you want it to focus on, at other times it just wanders along a winding path on a trajectory of its own. That’s because whenever you’re not focused on an external task—and even sometimes when you are—the network of brain structures referred to as the Default Mode Network (DMN) is active. Mind wandering isn’t the same as being distracted. In fact, default-mode thinking is essential for consolidating memory and maintaining your sense of self (who you are).

Overthinking is often the result of believing you can fully determine—or even guarantee—an outcome based on the amount of thinking you do about it. It often consists of making multiple lists of pros and cons, running through if/then scenarios, trying to gather as much information as possible, or attempting to approach an issue from every conceivable angle. This is not an effective approach to planning or decision making because thinking more or thinking harder doesn’t lead to clarity, only to confusion and possibly a headache. Too much logical, linear thinking can be as bad as too little.

Positive thinking is usually defined as a mental attitude that accentuates the positive and eliminates the negative. Supposedly, positive thinking can help you succeed and better deal with life’s upsets and challenges. However, a considerable amount of research has come to a different conclusion, which is that positive thinking may be more of a hindrance to success than a help. Positive thinking isn’t the same as optimism, which is a character trait. Positivity and optimism are desirable, but not to the point where your glasses become so rose-colored you’re unable to see through them.

*Metacognition means thinking about thinking as opposed to reacting to it or being at the effect of it. The part of the brain that runs you most of the time (the unconscious) initiates both thoughts and actions that serve to maintain your personal status quo. So if you want to change the status quo, you need to determine what kind of thinking you’re doing—or what kind of thinking is “doing” you.


Part of the series A-Z: An Alphabet of Change.

Filed Under: Alphabet of Change, Brain, Clarity, Cognitive Biases, Consciousness, Creating, Living, Mind, Unconscious Tagged With: Change, Metacognition, System 1, System 2, Thinking

Brain Dead: Is Your Mind Temporarily Offline?

September 4, 2015 by Joycelyn Campbell


Your brain has two systems for processing the stimuli and experiences of your life and determining how you act upon them.

Conscious: The processing system you’re aware of is called System 2. It is logical and intentional and sometimes referred to as “true reasoning.” (A formal outline is a good example.) It is also slow, limited, and easily depleted. It processes about 40 bits of information at a time.

Unconscious: The processing system you’re not aware of is called System 1. It is associative, which means it sees patterns and connects dots. (A mindmap is a good example.) It is fast, vast, and always on. It processes about 11,000,000 bits of information at a time.

If System 1 were to go offline, you would, too. Game over! But you can still function when System 2 is temporarily offline, even for long periods of time, such as when you’re asleep. So when you think or talk about being temporarily brain dead, you’re talking about exhausting System 2 attention.

If you’re in good health, there’s not much you can do to tax or exhaust the capacity of System 1—and there are things you can do to enhance its functioning. However, your supply of System 2 attention is always limited, and anything that occupies your working memory reduces it. Some examples of things that tax System 2 attention are:

  • Physical illness (even minor), injury, or lack of sleep
  • Making numerous trivial decisions throughout the day
  • Stress, anxiety, and worry
  • Exercising will power (forcing yourself to do something you don’t want to do or to not do something you do want to do)
  • Monitoring your behavior
  • Monitoring your environment if it is new or you consider it unsafe
  • Learning something new, traveling an unfamiliar route, etc.
  • Completing a complex computation
  • Trying to tune out distractions
  • A long period of concentrated or focused attention
  • Trying to remember dates, numbers, or unrelated facts
  • Listening to me talk

Since System 1 is fast, vast, always on, and has an answer for almost everything—and since you don’t need System 2 attention for most of what you do when you’re awake—what’s the big deal if you run out of System 2 attention from time to time?

Three Categories of Errors

Optimally, the two systems work together, and neither type of processing is superior. However, System 1 is more useful in some situations, while System 2 is not only more useful but also required in other situations.

System 1 is pretty good at what it does: its models of familiar situations are accurate, so its short-term predictions tend to be accurate as well. But that’s not always the case. System 1 sacrifices accuracy for speed, meaning it jumps to conclusions. It also has biases and is prone to making logical errors.

One of System 2’s jobs is to detect System 1’s errors and adjust course by overriding System 1’s impulses. As Daniel Kahneman says in Thinking, Fast and Slow:

There are vital tasks that only System 2 can perform because they require effort and acts of self-control in which the intuitions and impulses of System 1 are overcome.

Bear in mind that System 1 is not rational. If System 2 is depleted and can’t veto or modify the non-rational impulses of System 1, those impulses then turn into actions (or speech).

There are three categories of errors you tend to make when System 2 is depleted.

Logical Errors

System 1 thinking uses shortcuts. System 2 thinking takes the long (logical/linear) way home. So when you’re out of System 2 attention, you’re more prone to making mistakes in anything that requires logical, linear thinking. Errors of intuitive thought can be difficult for System 2 to catch on a good day. When System 2 is offline, you automatically assume them to be correct. As a result:

  • You will have trouble making, following, or checking the validity of a complex logical argument. You’ll be more likely to be led by the cognitive biases and distortions System 1 uses because they don’t require any effort and give you a comforting sense of cognitive ease.
  • You will have difficulty comparing the features of two items for overall value. If you have to make a choice, you’ll be more likely to go with what intuitively feels right or the item that has some emotionally compelling attribute (it reminds you of the one your mother had, for example, or reminds you of your mother).
  • You will be more gullible. You’ll be more likely to believe things you wouldn’t otherwise believe or be persuaded by empty messages, such as in commercials. System 2 is the skeptic, so the best time for someone to take advantage of you is when it is offline.

Intention or Response Errors

System 1 continuously picks up on cues and triggers in your environment to determine what situation you’re in and to predict what’s next. Any deviation from the norm requires System 2 attention. If it isn’t available, you’re likely to do not what you intended to do but whatever is normal for you in that situation. And without System 2 attention, you’re much more likely to respond automatically (habitually) to the stimulus (cue or trigger).

  • System 2 is in charge of self-control, continuously monitoring your behavior, keeping you polite, for example, when you’re angry. In the heat of the moment, when you’re out of System 2 attention, you’re much less likely to be able to suppress your immediate emotional reactions to people and situations.
  • System 1 has an answer for almost everything. But when it encounters a surprising situation (something it hasn’t previously encountered or that is unusual in that situation), it notifies System 2. You don’t need System 2 attention to drive a familiar route, but if you encounter an obstacle along that route, you need System 2 to figure out what it is and to respond appropriately to it.
  • System 2 is also in charge of will power. If you are in the process of trying to stop doing something you habitually do (such as raiding the refrigerator in the evening), you need System 2 to belay the impulse from System 1 to see if there’s more pie. Without System 2, you’re more likely to give in, look for the pie…and eat it.
  • You need System 2 if you want to take a different route from your usual one or make an extra stop you don’t normally make. Without adequate System 2 attention, you’re likely to find yourself taking the usual route and forgetting to make that stop.

Gatekeeping Errors

We all have biases, whether or not we’re aware of them and whether or not we want to admit it. While it’s easy to spot overt biases and prejudices in other people, most of your own biases are hidden even from you. In the case of biases toward specific groups of people, you’ve likely come to a reasoned conclusion that they’re wrong and have chosen not to think about or treat other people based on stereotypes. But that doesn’t mean the biases have disappeared. They’re still part of System 1’s associative processing operations. It’s just that when System 1 suggests a biased response to System 2, System 2 normally overrides it. Per Daniel Kahneman:

Conflict between an automatic reaction (System 1) and an intention to control it (System 2) is common in our lives.

When System 2 is depleted, there is no one at the gate to keep the biased or prejudiced responses from getting through. You may simply have a biased thought. You may say something in the presence of others that you wouldn’t normally say. Or you may respond to another person based on a group stereotype. The thought, comment, or behavior may be something you later regret. If you were to claim it doesn’t represent what you believe or the way you really feel or think, you’d most likely be right.

But when you see a blatant expression of bias or prejudice in someone else—especially a celebrity—you might have a different reaction. You might assume their true colors are showing. We think that what we see in other people when their guard is down and they’re pushed or stressed reveals the truth about them. But the truth is that to the extent we have any civility at all, it’s because System 2 maintains it. Without System 2, you and I would have no ability to question our biases or prejudices, no ability to come to reasoned conclusions about them, and no ability to monitor and veto System 1’s automatic reactions.

Conclusion

It isn’t always necessary, advisable, or even possible to override System 1. But when you deplete System 2, you can’t override it even when you want or need to. Without System 2, you can’t think straight (logically and linearly). So:

  • Don’t try to make important decisions of any kind when you feel brain dead.
  • Don’t assume you’ll feel or think the same way about something the next day as you do when you’re stressed, sick, just completed your annual tax return, or have recently fallen in love.
  • Don’t stay up late to watch the QVC channel unless you have a lot of money you’re trying to unload.
  • Don’t keep pie around if you’re trying not to eat it.
  • Don’t get into debates about complex issues after you’ve had a few beers.
  • Don’t tax your working memory with details you can keep track of some other way.
  • Don’t take System 2’s censoring of your biases and prejudices for granted. And don’t assume other people’s mental lapses reveal deep-seated truths about them.

Filed Under: Attention, Brain, Cognitive Biases, Consciousness, Living, Memory, Mind, Unconscious Tagged With: Brain, Brain Dead, Cognitive Biases, Daniel Kahneman, Fast and Slow, Mind, Predictably Irrational, System 1, System 2, Thinking

Think You’re Thinking?

May 26, 2014 by Joycelyn Campbell

[Image: Uriah Heep from “David Copperfield,” ink and wash drawing (Photo credit: Wikipedia)]

Much of what passes for thinking consists of unconscious, not conscious, mental processes. When it comes to taking in information and deciding what to believe and what not to believe, for example, we are appallingly predictable. We are most likely to believe:

What Is Familiar

Information that feels familiar is easier to absorb and believe than information that is unfamiliar. The information could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.

Even if we’re aware of the mere-exposure effect, we probably think we’re immune to it because we’re more sophisticated than that. Believing we’re immune to it, however, might make us even more susceptible to it than we would be if we simply recognized it.

What Is Easy

Information that is easy to understand gives us a sense of cognitive ease. Information that is difficult to understand requires greater cognitive effort to process. Our brain prefers to chill out, so it just says “no” to exerting additional cognitive effort.

Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. This is especially likely to be the case if you are already experiencing some degree of cognitive strain or if your conscious (System 2) attention is depleted. You’ve undoubtedly had the experience of feeling “brain dead” following a mentally fatiguing effort. That’s when you’re most susceptible to believing what is easy.

What Validates Our Preexisting Beliefs

Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe. At the very least, we hold inconsistent information up to greater scrutiny. So we have different standards for evaluating information based on the level of cognitive ease it generates. And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great.

The easy acceptance of information that validates what we already believe is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. For example, people who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs [see What is Familiar and What is Easy, above].

It’s easy to believe what’s familiar, what’s easy to grasp, and what validates our pre-existing beliefs. No critical thinking or cognitive effort are required. On the other hand, actual thinking, as Dan Ariely says, is difficult and sometimes unpleasant.Enhanced by Zemanta

Filed Under: Beliefs, Brain, Cognitive Biases, Mind, Unconscious Tagged With: beliefs, Believing, Cognition, Cognitive bias, Confirmation bias, Critical thinking, Dan Ariely, Thinking

