Farther to Go!

Brain-Based Transformational Solutions


Feedback Loops: Use Them or Be Used by Them

July 17, 2015 by Joycelyn Campbell

It isn’t too much of a stretch to say that feedback loops make the world go round. Among other things, feedback loops keep machinery—both digital and analog—running smoothly, moderate our weather, and maintain homeostasis in our bodies. Feedback loops also function to either maintain or disrupt the status quo within businesses and other organizations, in politics, in the economy, in interpersonal relationships, and even in regard to our own behavior.

David DiSalvo calls feedback loops “the engines of your adaptive brain.” He says research across multiple disciplines—psychology, sociology, economics, engineering, epidemiology, and business strategy, for example—has validated feedback loops as a solid governing principle.

Day in and day out, we make decisions based on the results of feedback loops that run in our minds without our noticing. None of us stops to think through each stage of the loop—how the data we’ve gathered is being processed to lead us to our next action. And yet, even without our conscious monitoring, the loops just keep moving.

Decision-making requires conscious thought. So it may be more accurate to say we react based on feedback loops rather than that we make decisions. In the same way that our brain has criteria for evaluating the data provided by physiological feedback loops (in order to, say, maintain our body temperature and signal when we need to eat or drink—or stop eating or drinking), it also has criteria for evaluating the data provided by our mental, emotional, and behavioral feedback loops. The problem is that these criteria are part of our mental model of the world, much of which is unconscious and therefore outside our awareness.

If we don’t stop to think through “how the data we’ve gathered is being processed,” we’re more likely to maintain the very habits of thinking and behaving we’re trying to change.

What Exactly Is a Feedback Loop?

The four stages of a feedback loop, as described by science writer Thomas Goetz in Wired magazine, are:

  • Evidence
  • Relevance
  • Consequence
  • Action

A feedback loop involves four distinct stages. First comes the data: A behavior must be measured, captured, and stored. This is the evidence stage.
Second, the information must be relayed to the individual, not in the raw-data form in which it was captured but in a context that makes it emotionally resonant. This is the relevance stage.
But even compelling information is useless if we don’t know what to make of it, so we need a third stage: consequence. The information must illuminate one or more paths ahead.
And finally, the fourth stage: action. There must be a clear moment when the individual can recalibrate a behavior, make a choice, and act. Then that action is measured, and the feedback loop can run once more, every action stimulating new behaviors that inch us closer to our goals.

When it comes to behavior-related feedback loops, such as changing an old habit or starting a new one, the sequence looks more like this:

  • Action
  • Evidence
  • Relevance
  • Consequence
  • New Action (or Reaction)

Just about any activity generates feedback of some sort. The result of an action can be large or infinitesimal, desirable or undesirable. Ideally, you notice what happens and use the feedback to determine what to do next. If you’re driving your car along a snowy road and it begins to skid, the skid is evidence that road conditions require you to make some type of adjustment to your driving. The evidence is relevant to you because you want to avoid an accident, which is a potential consequence of not paying attention to the evidence. Your reaction might be to slow down.
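To make the stages concrete, here is a minimal Python sketch that walks the skidding-car scenario above through the loop. The stage names come from the lists above; the function names, scenario details, and the "slow down" rule are illustrative assumptions, not a formal model.

```python
# A minimal, illustrative sketch (not a formal model): the skidding-car
# scenario walked through Action -> Evidence -> Relevance -> Consequence ->
# New Action. All function names and details are hypothetical.

def run_feedback_loop(action, observe, interpret, anticipate, choose):
    evidence = observe(action)           # Evidence: what actually happened
    relevance = interpret(evidence)      # Relevance: why it matters to you
    consequence = anticipate(relevance)  # Consequence: what it implies ahead
    return choose(consequence)           # New Action (or Reaction)

# The skid example from the text:
action = "driving at highway speed on a snowy road"
observe = lambda a: {"skidding": True}                      # the skid is the evidence
interpret = lambda e: e["skidding"]                         # relevant: you want to avoid an accident
anticipate = lambda r: "possible accident" if r else None   # the potential consequence
choose = lambda c: "slow down" if c else "carry on as before"

print(run_feedback_loop(action, observe, interpret, anticipate, choose))
# -> slow down
```

The point of the sketch is only that each stage transforms the output of the previous one; in real life, the interpretation step is where our mental model quietly does its work.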

The skid is a fairly straightforward example. Another driving-related example, one you may have encountered and which Goetz wrote about in Wired, involves “dynamic speed displays,” also called driver feedback signs. These speed limit signs include radar sensors attached to digital readouts that flash your vehicle’s speed once you come within range. Driver feedback signs have been so successful at decreasing speeding that they’re springing up in more and more locations.

The basic premise is simple. Provide people with information about their actions in real time (or something close to it), then give them an opportunity to change those actions, pushing them toward better behaviors. Action, information, reaction. 
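As a rough illustration of that premise (not how any particular sign is actually implemented), here is a small Python sketch of the action-information-reaction cycle: the sign displays the measured speed, and the driver recalibrates. The speed limit, starting speed, and the 3 mph adjustment rule are invented for illustration.

```python
# Illustrative only: a toy simulation of a driver feedback sign.
# The numbers and the driver's adjustment rule are invented; real signs
# simply display your speed and leave the reaction up to you.

speed_limit = 30   # mph, posted on the sign
speed = 41         # mph, the driver's current action

while speed > speed_limit:
    print(f"Sign flashes: {speed} mph (limit {speed_limit})")  # information
    speed -= 3                                                 # reaction: ease off the gas
print(f"Settled at {speed} mph")
```

Each pass through the loop is one round of action, information, and reaction; the next measurement starts the loop again.
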
The Premise May Be Simple, But the Process Isn’t.

The apparent result of an action we’ve taken—the evidence—must first be interpreted before we can proceed through the steps of the feedback loop to determine how to react. A roadside sign that tells you both the speed limit and your current speed provides you with straightforward, unambiguous evidence. If all the evidence we were faced with was similarly unambiguous, our lives would be much less complex and our decisions would be much easier to make. Alas, such is not the case.

As stated above, DiSalvo says we make decisions based on the results of feedback loops, but even in cases where we’re making decisions rather than simply reacting, it would be more accurate to say we make decisions based on our interpretation of the results of feedback loops.

Because we perceive the world through our particular mental model, we’re predisposed to interpret the results of our actions in certain ways. This can be problematic in general, but it’s especially so when we’re presented with negative evidence. Things didn’t work out the way we planned; we did something other than what we intended or wanted to do; or we’re faced with unexpected obstacles. The most useful way to respond to such information is to look at it objectively. We tried something and it didn’t work. We can then try to figure out why it didn’t work and decide whether to try it again or to try something else.

Instead of viewing the negative results of our actions objectively, however, we’re prone to interpreting them as evidence of failure. Once we interpret the results as evidence of failure, we’re much less likely to try to figure out what didn’t work and what to do next, and we’re much more likely to give up. At that point, the habit or behavior we were trying to change becomes even more entrenched than it was before we attempted to do something about it. And the goal we were trying to achieve seems even more distant.

A student in one of my classes reported struggling for several years with a particular task: documenting, in detail, the time she spent caring for a family member. Every time she tried and failed to find a system that worked, she interpreted it as evidence of personal failure. One day in class, she outlined something new to try. When she returned the following week, she was very excited, but not because the new system had worked. It hadn’t. What she was excited about was that when she realized that particular system didn’t work, rather than viewing it as more evidence of failure, she was able to view it objectively. Because she was able to view it objectively, she didn’t waste time beating herself up over it. Instead, she immediately decided to try something else, and that new system did work.

Confirmation bias is very powerful. If we believe we’re lazy or incapable or don’t follow through on anything, we’re likely to view the negative results of our actions as confirmation of our preexisting belief and then behave as though that belief is reality. So it’s important to remember that our automatic interpretations can’t always be trusted; sometimes we need to slow down long enough to question them.

Not everything you try is going to go smoothly or work out the way you hoped it would. Sometimes the road is slippery, under construction, or takes a detour. Noticing that what you tried simply didn’t work will allow you to use the information as feedback to help you determine the best way to correct your course—or to chart a brand new one.


Think You’re Thinking?

May 26, 2014 by Joycelyn Campbell

[Image: Uriah Heep from “David Copperfield,” ink and wash drawing (Photo credit: Wikipedia)]

Much of what passes for thinking consists of unconscious, not conscious, mental processes. When it comes to taking in information and deciding what to believe and what not to believe, for example, we are appallingly predictable. We are most likely to believe:

What Is Familiar

Information that feels familiar is easier to absorb and believe than information that is unfamiliar. The information could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.

Even if we’re aware of the mere-exposure effect, we probably think we’re immune to it because we’re more sophisticated than that. Believing we’re immune to it, however, might make us even more susceptible to it than we would be if we simply recognized it.

What Is Easy

Information that is easy to understand gives us a sense of cognitive ease. Information that is difficult to understand requires greater cognitive effort to process. Our brain prefers to chill out, so it just says “no” to exerting additional cognitive effort.

Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. This is especially likely to be the case if you are already experiencing some degree of cognitive strain or if your conscious (System 2) attention is depleted. You’ve undoubtedly had the experience of feeling “brain dead” following a mentally fatiguing effort. That’s when you’re most susceptible to believing what is easy.

What Validates Our Preexisting Beliefs

Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe. At the very least, we hold inconsistent information up to greater scrutiny. So we have different standards for evaluating information based on the level of cognitive ease it generates. And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great.

The easy acceptance of information that validates what we already believe is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. For example, people who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs [see What is Familiar and What is Easy, above].

It’s easy to believe what’s familiar, what’s easy to grasp, and what validates our preexisting beliefs. No critical thinking or cognitive effort is required. On the other hand, actual thinking, as Dan Ariely says, is difficult and sometimes unpleasant.


What Is Cognitive Ease—and Why Should You Be Wary of It?

April 18, 2014 by Joycelyn Campbell


Everyone wants to be right and to feel certain about things. These are built-in biological drives, not character flaws. When we think we’re right and when we feel certain, we experience a sense of cognitive ease. The world makes sense to us. And that puts us in a good mood.

Cognitive ease feels good, but it gives us a false sense of security because it makes us think we understand far more than we actually do.

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. —Daniel Kahneman, Thinking, Fast and Slow

Comfortably Numb

When it comes to taking in information and deciding what to believe and what not to believe, we are appallingly predictable. We are most likely to believe:

  • What Is Familiar

Information that feels familiar is easier to absorb and believe than information that is unfamiliar. It could be familiar because it’s associated with other beliefs we have, or it could come from a trusted source. On the other hand, it could simply be something we’ve come across before—especially if we’ve come across it multiple times. Frequent repetition can be enough to convince people to believe things that are not true because familiarity generates a sense of cognitive ease. This is called the mere-exposure effect; advertisers make use of it, but they aren’t the only ones.

  • What Is Easy

Information that is easy to understand also gives us a sense of cognitive ease. Information that is difficult to understand requires more cognitive effort to process, and our brain’s preference is to take it easy. Say you’re faced with choosing between two concepts, ideas, or explanations. Idea A is easy to understand, while Idea B is more difficult. Statistically speaking, you’re much more likely to accept Idea A instead of Idea B simply because Idea A is easier for you to swallow. Does that give you a sense of cognitive dis-ease?

  • What Validates Our Preexisting Beliefs

Information that confirms what we already believe to be true makes us feel right and certain, so we’re likely to accept it uncritically. On the other hand, we’re more likely to reject information that is inconsistent with what we already believe or at least we hold inconsistent information up to greater scrutiny. We have different standards for evaluating information depending on the level of cognitive ease it generates.

And evidence has precious little impact on us if it conflicts with what we believe simply because the cognitive strain of processing it is too great. For example, it is easier to believe that What You See Is All There Is (WYSIATI), even after being confronted with evidence that you have missed something that was right in front of your face, than it is to believe that you are aware of only a tiny fraction of what is going on around you.

Cognitive Biases

We use cognitive biases as shortcuts to help us understand the world. We don’t have to use any critical thinking skills. No cognitive effort is required. We aren’t forced to reevaluate our existing beliefs. Because of our cognitive biases, we make snap judgments, form quick impressions or opinions, and operate on autopilot.

The bad news is that, since cognitive biases are by their nature distortions or errors in thinking, they actually decrease our understanding all the while giving us that feel-good sense of cognitive ease.

That’s just fine with the conscious part of our brain, which is slow and kind of lazy and doesn’t want to work if it doesn’t have to. It’s happy to let the unconscious handle as much of the load as possible. Because cognitive biases operate at the unconscious level, unless we make an effort to recognize them, we aren’t aware of them. We will even deny we have them.

To have a human brain is to be subject to cognitive biases. Some of the most common are:

  • Confirmation Bias

The easy acceptance of information that validates what we already believe (as described in What Validates Our Preexisting Beliefs, above) is a result of confirmation bias. Confirmation bias causes us to selectively notice and pay attention to what confirms our beliefs and to ignore what doesn’t. Confirmation bias underlies the discomfort we feel around people who disagree with us and the ease we feel around people who share our beliefs. Example: People who favor gun control pay more attention to stories about injuries and deaths resulting from gun use; people who are against gun control pay more attention to stories about people using guns to defend themselves.

  • The Halo Effect

The tendency to view other people as all good (or all bad) is the result of a cognitive bias called the halo effect. When we consider someone to be a good person, we find it easier to excuse or ignore behavior that is inconsistent with being a good person. Good people can do no wrong. On the other hand, if we consider someone to be a bad person, we find it hard to accept that he or she has any positive qualities. Bad people can do no good. In either case, we ignore evidence that contradicts our general impression of the person. The halo effect requires black and white thinking. Example: People tend to have a completely positive view of the political party they support and a completely negative view of the political party they don’t support.

  • Negativity Bias

Our brains are wired to notice negative events more than positive events, so we give them more attention. This leads us to believe that more negative events are taking place than positive events. It also leads us to give more credence to negative claims about people with whom we disagree. Negativity bias is responsible for the fears we have about some things that are disproportionate to the actual likelihood of their occurring. Bad stuff seems to have more of an impact on us than good stuff, and we are quicker to react to it. This bias can make us susceptible to fear-mongering. Examples: (1) The news. (2) People tend to pay more attention—and give more weight—to critical comments than to praise.

  • Impact Bias

We think we can predict how we will react to potential events, both good and bad, and reliably estimate the impact they will have on us. But in making such predictions, we routinely overestimate how good we will feel (and for how long) after a positive event and how bad we will feel (and for how long) after a negative event. Although we are extremely poor fortune tellers, that doesn’t stop us from being certain about how we will feel in the future. In reality, our excitement over something good will likely dim faster than we predict, and we are likely to rebound from a loss sooner than we predict. Example: People tend to believe a positive change, such as marriage, a new job, a bigger house, or winning the lottery, will make them feel better—and for a longer time—than it actually will.

  • Hindsight Bias

In retrospect, everything seems inevitable. The hindsight bias (“I knew it all along”) makes us think the world is more predictable than it actually is. After the fact, we selectively reconstruct events to make it appear the outcome was inevitable. In doing so, we also exaggerate how likely we considered the outcome to be before it occurred. If the outcome is a negative one, we think someone should have foreseen it and prevented it. Example: After 9/11, many people thought the attacks by al-Qaeda could have been prevented based on the available information. However, the information was not, at that time, as clear as it appeared to be in hindsight.

  • Outcome Bias

The outcome bias leads us to evaluate a decision based on the eventual results or outcome of the decision rather than on the soundness or quality of the decision at the time it was made. If something works out, we think it was a great decision (genius, even), although the reasoning that led to it may have been flawed. Conversely, if something doesn’t work out, we think it was a bad decision, although the reasoning that led to it may have been entirely sound. When outcomes are good, we think the decisions were good; when outcomes are bad, we think the decisions were bad. Example: People tend to think that if something goes wrong during a low-risk surgical procedure, the decision to do the procedure was a bad one.

  • Hidden (or Implicit) Bias

Hidden biases are attitudes or stereotypes we have, both favorable and unfavorable, particularly about other people in regard to race, gender, age, etc. We don’t all have the same hidden biases, but everyone has them. However, because they are hidden—primarily from ourselves—we are unaware of them, even though they affect our feelings, our behavior, and our reactions. Hidden biases may be at odds with our conscious attitudes and feelings. But some of our hidden biases may be apparent to others.

We can’t find out about hidden biases through introspection. We may be able to learn something about them by observing ourselves. Harvard University also hosts an implicit association test online (https://implicit.harvard.edu/implicit/), so you can test yourself for your own hidden biases.

Hidden biases contribute to a sense of cognitive ease by tending to confirm that whatever groups we belong to (ethnic, racial, age, income, etc.) are the best groups because they have more positive characteristics than those other groups have.

Cognitive Distortions

Cognitive distortions are habitual ways of thinking that alter our perception. Many, although not all, cognitive distortions are negative. But even negative cognitive distortions contribute to a sense of cognitive ease just because they are habitual. If you are used to thinking about yourself in a negative way or seeing the world in a negative way, that will feel more comfortable than trying to see things in a different (more positive) way.

Cognitive distortions are not uncommon, and there are a lot of different ones. However, not everyone is subject to them—or at least not to the same degree. A few common cognitive distortions are:

  • Mindreading: believing you know what other people are thinking or what their motives are
  • Overgeneralizing: drawing too broad a conclusion from a single event or piece of information or from limited information
  • Catastrophizing: imagining worst case scenarios; exaggerating the likelihood of negative or disastrous outcomes
  • All or Nothing Thinking (also called Black and White Thinking): thinking in extremes without allowing for complexity (shades of gray); believing that if something isn’t perfect or the best, it’s worthless

The Cognitive Ease Continuum

According to Daniel Kahneman, cognitive ease is both a cause and a consequence of a pleasant feeling. Cognitive ease makes us feel more favorable toward things that are familiar, easy to understand, and easy to see or read. We feel less favorable toward what is unfamiliar, difficult to understand, or difficult to see or read. We don’t even have to be consciously aware that something is familiar to us in order to feel good about it. The feel-good response comes from the unconscious part of our brain. It’s part of our hardwiring for survival. A good mood tells our brain everything is OK and we can let our guard down.

Being in a good mood is associated with intuition, creativity, gullibility, and increased reliance on the unconscious part of the brain. At the other end of the continuum are sadness, vigilance, suspicion, an analytic approach, and increased effort.

We can’t worry when we’re happy. But because we’re less vigilant when in a good mood, we’re more prone to making logical errors. We’re more susceptible to cognitive biases. We think we understand more than we do. We even think we’re thinking.


