Farther to Go!

Brain-Based Transformational Solutions

Count your Yesses

May 28, 2015 by Joycelyn Campbell

As Rick Hanson famously says, “Your brain is like Velcro for negative experiences and Teflon for positive ones.” That’s because your brain’s primary concern is your survival, so it’s primed to pay more attention to the negative. Positive things may indeed help you survive. But negative things can kill you. As far as your brain is concerned, it’s definitely better to be safe than sorry. It’s better to expect and prepare for a possible threat (there might be a tiger behind that bush) than to be surprised (and wounded or eaten) by that tiger.

It’s easy to forget that we’re operating with essentially the same brain our ancestors on the savanna had. But if you want to overcome your brain’s negativity bias, it’s important to remember that System 1, the unconscious part of your brain that runs you most of the time, doesn’t always deal effectively with the stimulation, stressors, and sheer volume of information you have to contend with in your daily life.

It’s easier for all of us to pay attention to the negative: the threats, the slights, the hurts, the things that fall apart or don’t go our way. We don’t have to make a point of looking for what isn’t working in order to find it. Our brain does that automatically. Another aspect of our survival-based brain—its associative method of “thinking”—makes it easy to get on a negative track and stay there. One darn thing leads to another: each negative thought calls up another one like it. Before you know it, your mood and your attitude have soured, and your ability to refocus your attention has evaporated.

You can’t stop your brain from noticing the negative, and it wouldn’t even be a good idea to try. But neither do you have to give in to it. The advice to count your blessings comes to mind, but I find blessings to be a loaded word on several levels. I prefer to count my yesses. It’s a great way to turn the tide when I notice I’ve mentally started traveling down that road to nowhere.

Although I tend to be pretty optimistic and upbeat, the first thing I noticed when I began this practice was how much easier it is to count my nos. Because the nos are brought to our attention by System 1, the unconscious part of our brain that is always on and processes 11,000,000 bits of information at a time, they come to mind immediately and automatically without any effort on our part. Counting yesses, on the other hand, requires intention, which is a function of System 2, the conscious part of the brain that is slow, lazy, and easily depleted.

The process of shifting my attention doesn’t just change the mental track I’m on; it also causes me to be aware of how influential my mental model of the world is at any given moment.

We don’t see things as they are; we see them as we are.

That quote has been attributed to several different people, but regardless of who said it, it’s true.

When you’re tired, stressed, or sick—or when life has dealt you some kind of blow—you simply have less System 2 attention available. So it’s easy for the nos to get the upper hand. A couple of weeks ago I went through a bout of food poisoning. During the illness itself and the two days that followed, the nos were abundant. I observed the downward trend in my thoughts, but I also understood what was happening. I was pretty sure my perspective would change once I got better (which it did), so I didn’t let the nos carry me too far downstream.

Someone I know regularly posts what she calls “The Daily Yes” on Facebook. It’s a prompt that works well for me because I don’t have a regular schedule for accessing Facebook, so I don’t always see it at the same time of day. But every time I do see it, I stop to read it. It doesn’t matter what the specific content is. It’s the word yes that’s my cue to pay attention to what’s juicy and zesty and working in my life—to who and what has said yes to me and who and what I’ve said yes to.

It’s easy for one no to outweigh many yesses, so much so that we may not even notice the yesses when they occur. That’s why I’ve found it helpful to make a list, whether it’s on paper or just a mental list. It reminds me that my brain does have a negativity bias—but that I don’t have to agree with it or go along for that particular ride.

Filed Under: Attention, Brain, Cognitive Biases, Consciousness, Living, Mind, Unconscious Tagged With: Attention, Brain, Intention, Mind, Negativity Bias

Overthinking: Don’t Get Stuck in Analysis Paralysis

December 1, 2014 by Joycelyn Campbell

It’s one thing to look before you leap. It only makes sense to consider the potential outcome or consequences of an action you’re about to take. But it’s another thing altogether to believe you can fully determine—or even guarantee—the outcome based on the amount of thinking you do about it.

Overthinking often consists of making multiple lists of pros and cons, running through if/then scenarios, trying to gather as much information as possible, or attempting to approach the issue from every conceivable angle. The process of trying to make a decision becomes overwhelming. Worse, it drains conscious (System 2) attention for as long as you’re trying to make the decision. So the more thinking you do, the less effective your thinking becomes. You can find yourself going around and around in mental circles, either unable to make the decision or just taking a stab at something—anything—because you can’t stand thinking about it any longer.

Overthinking also begets second-guessing, in which you get to run through several rounds of “if only/then” scenarios.

Overthinking is driven by your brain’s craving for certainty. But thinking harder or longer about something won’t necessarily get you closer to an answer. Here’s why:

  • In spite of your best efforts, your information will always be incomplete. There are things you don’t know, can’t know, or won’t know at the time you’re trying to decide, and any of those things could be important enough to affect the outcome. Unfortunately, we don’t know what we don’t know, and so we don’t take it into consideration.
  • Even if you were to have access to all of the information, because you’re human you’re subject to numerous cognitive biases, which means you won’t be able to view it entirely objectively. For example, you will overweight some information and underweight, or even ignore, other information. System 2 thinking may be what you’re aware of, but System 1 still has plenty of input, and System 1 makes mistakes.
  • You can’t account for randomness. The very idea of randomness makes your brain a little crazy, so it refuses to accept it. Your brain is under the impression it can find a cause-and-effect link for anything and everything. The consequences of randomness, according to physicist Leonard Mlodinow, are counterintuitive. (Nobody expects the Spanish Inquisition!)
  • You can’t predict the future. Even more to the point, you can’t predict how you’re going to feel in the future. Daniel Gilbert, in Stumbling on Happiness, says we tend to think the future will be a lot like today…only different. But the future is fundamentally different from today, and the way you feel right now when you think about the consequences of taking some action is not necessarily the way you will feel when you are living with the consequences of that action.
  • Taking any action can have unexpected results and undesired consequences. Although you can anticipate that such things might occur, you can’t plan for them because you won’t know what they are until after they happen.

Too much logical, linear thinking is as bad as too little. After framing the problem or situation and considering possible solutions, turn it over to your unconscious (System 1) for a while and see what it comes up with. Let your mind wander instead of keeping it on a tight leash. The sudden insight, moment of clarity, or change in perspective you get may surprise you. But this is the way the creative process works, and it’s a great way to use both parts of your brain to your advantage.

Additional reading: Intuition: Knowing without Knowing How We Know.

Filed Under: Brain, Choice, Clarity, Cognitive Biases, Living, Mind, Unconscious Tagged With: Analysis Paralysis, Clarity, Decision-making, Overthinking, System 1, System 2

Why All the News Is Bad: Our Negativity Bias

September 16, 2014 by Joycelyn Campbell


Our brain’s own hardwiring for survival makes us vulnerable to stress and anxiety. It evolved to quickly detect threats in the environment and sound the alarm: time to fight or flee now! When we were facing multiple life-or-death threats a million years ago, it was definitely better to err on the safe side. If we reacted to something that didn’t turn out to be a real threat, no significant harm was done. But if we failed to react to something that did turn out to be a serious threat, it could mean the end of us.

The unconscious part of our brain was all about survival a million years ago, and it’s still all about survival today. Although the world we live in has changed radically, our brain has a ways to go to catch up. Operating at a much faster speed than we can consciously keep up with, making connections and seeing patterns that might or might not be there, the unconscious brain signals red alert at the slightest indication of trouble, setting into motion a cascade of physiological effects.

Sometimes this works for us, keeping us safe from actual harm; however, there are far more false alarms than real ones. And we pay a heavy price when this threat-detection system runs unchecked. It’s at the root of what is called the negativity bias. It’s why we notice, react to, and remember negative events to a much greater degree than we do positive ones.

The brain is like Velcro for negative experiences but Teflon for positive ones. –Rick Hanson, Ph.D.

System 1: Danger, Danger, Will Robinson

Our unconscious shrugs off neutral or positive news or experiences, sometimes barely registering them, and homes in on the negative stuff. We have a stronger emotional reaction to negative stimuli, which increases the likelihood we’ll remember them. Negative experiences also get stored in memory faster than positive ones, which means our unconscious has more negative memories to draw on than positive ones when it’s evaluating information. And negative experiences affect us longer.

As a result, we are extremely sensitive to perceived or apparent threats. These days, those threats are less likely to be to our immediate survival. But that doesn’t make any difference to our brain. We react just the same whether the threat is to our ideas and beliefs, to our physical or emotional well-being, to our self-esteem, or to a freedom we hold dear.

We all have the same hardwiring. We are all primed to pay attention to the negative. At this point in time, the danger we’re facing is less a result of threats from the environment and more directly a result of our negativity bias. Whether in our intimate relationships, our international relations, or our personal health and well-being, the actual and potential costs of operating from the negativity bias are enormous.

So what can we do?

System 2: Belay that Order

One thing we can’t do is eliminate the negativity bias. It’s up to evolution to modify our perception of and reaction to threats. Hopefully that will happen before it becomes a moot point.

What we can do is develop an awareness of our predisposition to pay attention to and accentuate the negative. We can use System 2—our conscious attention—to:

  1. Notice the negativity bias in ourselves. It’s not easy to be aware of a cognitive bias in the moment, so often the noticing occurs after the fact. But that’s OK. If we continue paying attention, we’ll get faster at spotting the negativity bias in action. We’ll be less at the effect of it.
  2. Notice the negativity bias in others. The point isn’t to call other people out on it. We’re all operating on autopilot most of the time, and when we’re on autopilot we don’t think things through. If we’re aware that someone else is operating from the negativity bias, we don’t have to get caught up in the fear. We don’t have to react.
  3. Ask: Is there a real threat here or only a perceived threat? Once we become familiar with how the negativity bias works, we can develop the habit of evaluating our reactions and calming ourselves.
  4. Intend to pay attention to positive events and experiences. Yes, our attention naturally goes to the negative, but we can train ourselves to focus on positive things. We can intentionally include more pleasure, joy, and laughter in our lives.

Just because we have a negativity bias doesn’t mean we have to give in to it and keep feeding it. Let’s keep reminding ourselves that more often than not the threat really is all in our head.

Filed Under: Attention, Brain, Cognitive Biases, Consciousness, Habit, Memory, Unconscious Tagged With: Brain, Cognitive bias, Fear, Memory, Mind, Negativity Bias, Survival

How to Beat the Planning Fallacy

August 28, 2014 by Joycelyn Campbell


The planning fallacy is a tendency to “describe plans and forecasts that are unrealistically close to best-case scenarios.” [Daniel Kahneman and Amos Tversky] In other words, people tend to make plans, set goals, schedule their time, etc., based on an assumption that everything will go smoothly, easily, and according to the plan they have created.

One effect of the planning fallacy is underestimating how long something will take to complete. If a deadline is involved, the result can range from a period of burning the midnight oil to catch up to a major catastrophe—depending on the situation.

Another effect is an inability to tolerate the inevitable delays and obstacles that are a normal part of any project or process—and a tendency to interpret them to mean that something must be terribly wrong or someone must be to blame (because things haven’t gone according to the plan).

The way to beat the planning fallacy is to focus on process rather than on outcome.

Concentrating on process—the steps or activities necessary to achieve the desired result—helps people focus their attention, leads to more realistic expectations, and reduces anxiety. This allows people to anticipate potential problems as well as potential solutions.

Of course, it’s important to identify the desired outcome so you know where you’re headed. But once you have done that, if you keep your attention on what it will take to get there, you’re much more likely to arrive and to maintain your sanity.

Filed Under: Attention, Cognitive Biases, Creating, Mind, Mindfulness Tagged With: Attention, Best-Case Scenario, Goals, Outcome, Planning, Planning fallacy, Plans, Process

Think You’re Thinking?

May 26, 2014 by Joycelyn Campbell

Uriah Heep from “David Copperfield”, ink and wash drawing (Photo credit: Wikipedia)

Much of what passes for thinking consists of unconscious, not conscious, mental processes. When it comes to taking in information and deciding what to believe and what not to believe, for example, we are appallingly predictable. We are most likely to believe:

What Is Familiar

Information that feels familiar is easier to absorb and believe than information that is unfamiliar. The information could be familiar because it’s associated with other beliefs we hold, because it comes from a trusted source, or simply because we’ve come across it before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true, because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.

Even if we’re aware of the mere-exposure effect, we probably think we’re immune to it because we’re more sophisticated than that. Believing we’re immune to it, however, might make us even more susceptible to it than we would be if we simply recognized it.

What Is Easy

Information that is easy to understand gives us a sense of cognitive ease. Information that is difficult to understand requires greater cognitive effort to process. Our brain prefers to chill out, so it just says “no” to exerting additional cognitive effort.

Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A over Idea B simply because Idea A is easier to swallow. This is especially likely to be the case if you’re already experiencing some degree of cognitive strain or if your conscious (System 2) attention is depleted. You’ve undoubtedly had the experience of feeling “brain dead” following a mentally fatiguing effort. That’s when you’re most susceptible to believing what is easy.

What Validates Our Preexisting Beliefs

Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe. At the very least, we hold inconsistent information up to greater scrutiny. So we have different standards for evaluating information based on the level of cognitive ease it generates. And evidence that conflicts with what we believe has precious little impact on us, simply because the cognitive strain of processing it is too great.

The easy acceptance of information that validates what we already believe is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. For example, people who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs [see What is Familiar and What is Easy, above].

It’s easy to believe what’s familiar, what’s easy to grasp, and what validates our preexisting beliefs. No critical thinking or cognitive effort is required. On the other hand, actual thinking, as Dan Ariely says, is difficult and sometimes unpleasant.

Filed Under: Beliefs, Brain, Cognitive Biases, Mind, Unconscious Tagged With: beliefs, Believing, Cognition, Cognitive bias, Confirmation bias, Critical thinking, Dan Ariely, Thinking
