Farther to Go!

Brain-Based Transformational Solutions

Intention Seekers
(Conspiracy Part 2)

June 16, 2020 by Joycelyn Campbell

People who believe in conspiracy theories (conspiracists) are motivated by the same thing that motivates everyone: the drive to understand and make sense of the world we live in. Failing to understand what’s happening around us or how things work could jeopardize our survival.

So from an early age, we begin developing and testing theories to increase our understanding. The brains of both conspiracists and non-conspiracists are always trying to connect the dots. System 1 (the unconscious) operates by making associations: detecting patterns and making connections. It functions at a rapid pace and uses heuristics (mental shortcuts) to make determinations. As a result, it jumps to conclusions, seeing patterns that may not be there and making connections that may not exist. Again, this is true for everyone.

It’s System 2’s job to scrutinize questionable System 1 conclusions. But as we know, System 2 is slow, lazy, easily depleted, and may be otherwise occupied; it misses a lot.

Conspiracists appear to be both more likely to see patterns and connections and less likely to question them, especially when they support preexisting beliefs. In The Believing Brain, Michael Shermer says:

Why do people believe in highly improbable conspiracies? I contend that it is because their pattern-detection filters are wide open, thereby letting in any and all patterns as real, with little to no screening of potential false patterns.

All Explanatory Theories Are Not Equal

Conspiracy theories differ from other theories in a number of ways:

  • They aren't falsifiable: because they can't be disproved, they can't be proved either.
  • They are apparent only to those who are in the know or can see through the purported cover-ups.
  • They represent a gloomy, sometimes sinister, worldview.
  • They tend to be vast, far-reaching, and complex.
  • They leave no room for random or accidental events or occurrences.

Conspiracy theories can’t be proved because they are rarely based on verifiable evidence. A lack of evidence would disqualify most other types of theories, but in the case of conspiracy theories the lack of evidence is itself taken as evidence of the conspiracy.

In addition to having wide-open pattern-detection filters, people who believe in conspiracy theories tend to be more suspicious, untrusting, and eccentric than their non-conspiracist counterparts. They have a need to feel special and tend to regard the world as an inherently dangerous place. They are also more likely to infer meaning and motive where others do not.

Several other personality characteristics and cognitive biases have been linked with the tendency to endorse conspiracy theories, including:

  • openness
  • neuroticism
  • authoritarianism
  • mild paranoia
  • confirmation bias
  • the conjunction fallacy
  • the proportionality bias
  • projection
  • attributions of intentionality
  • decreased sense of personal agency
  • traditionalism
  • rejection of science and/or experts
  • confidence in one’s beliefs

Two additional factors were identified in research reported by Lehigh University in 2018.

  1. People who overestimate how well they understand politics are more likely to believe that hidden actors or clandestine groups are conspiring in wide-ranging activities to influence important world actions, events, and outcomes.

  2. People who identify with traditional values and systems they believe are under siege due to social change also tend to adopt conspiracy theory thinking.

Intention Seeking

Just as both conspiracists and non-conspiracists are driven to understand the world in which they live, both are also attempting to discern the intentions of others—again because not being able to do so accurately can have significant negative consequences. Our ability to quickly discern intentionality develops rapidly during childhood. Like pattern-detection, it is an automatic function of System 1, the unconscious. And System 1 can make the same kinds of mistakes in discerning intentions as it does in detecting patterns.

The fast and automatic operation of intentionality-seeking cognitive processes allows us to quickly make inferences about the mental states of those around us—an important evolutionary adaptation. However, as is the case with other low-level cognitive processes, inferences of intentionality may be subject to biases and heuristics. Not only are we sensitive to the intentions of others, but we may be overly sensitive, biased towards perceiving or inferring intentionality even where such an attribution may not be warranted. —Robert Brotherton and Christopher C. French, PLoS One

One series of studies reported in 2008 suggested that our brain automatically attributes intentionality to all actions, even those we know are not intentional. System 2 has to override this automatic process in order for us to recognize the lack of intention.

Judging an action to be unintentional requires more cognitive resources, takes longer, and results in increased ease of recall compared to judging the same action to be intentional. —E. Rosset, Cognition

This is an intriguing area of research given that we now know how little of our behavior, moment-to-moment, is in fact either rational or intentional. The consistent, coordinated, intentional action of multiple individuals over time and across distance for agreed-upon nefarious purposes isn’t impossible, of course. But it is highly improbable.

Nevertheless, as Brotherton and French state in their PLoS One article:

To the extent that an individual tends to regard ambiguous events or situations generally as having been intended, conspiracy theories may appear more plausible than alternative explanations.

Next time: Part 3: Conspiracy Theories and the Storytelling Mind
Last time: Part 1: Conspiracy: Making Distinctions

Filed Under: Beliefs, Brain, Cognitive Biases, Consciousness, Learning, Mind, Unconscious Tagged With: Cognitive Biases, Conspiracy Theories, Intention Seeking, Pattern-Detection

Is What You See All There Is?

December 11, 2015 by Joycelyn Campbell

As you move through the world, you probably have the sense that you’re aware of whatever there is to be aware of as it is. This applies not only to the sensory world, but also to events, situations, interpersonal interactions—actually to everything that exists or occurs within your world. But the capacity of conscious attention is much too limited for this to even be possible. The 11,000,000 bits of information being processed by the unconscious part of the brain at any given moment need to be considerably (and swiftly) condensed and summarized into the 40 bits you can process consciously.

Consciousness is a way of projecting all the activity in your nervous system into a simpler form. [It] gives you a summary that is useful for the larger picture, useful at the scale of apples and rivers and humans with whom you might be able to mate. —David Eagleman, Incognito

What You See Is All There Is (WYSIATI)*

Your brain maintains a model of the world that represents what’s normal in it for you. The result is that you experience a stripped-down, customized version of the actual world. To a great extent, each of us really does inhabit our own world. But it would be incorrect to say that we create our reality; rather, our brain creates our reality for us.

Much, if not most, of what you do, think, and feel consists of automatically generated responses to internal or external stimuli. And it isn’t possible to consciously mediate all of your responses. It wouldn’t even be a good idea to try.

In addition to helping you navigate the world, your mental model gives rise to your sense of the way things should be. It generates expectations (that are either confirmed or denied), assumptions, biases, etc. that determine what you pay attention to, what you perceive (even what you are able to perceive), how you interpret and respond to what you perceive, and the meaning you make of it all. Your mental model is the result of your genes and your experiences, of both intention and accident. Your brain has been constructing your particular model of the world since your birth, and it is continually updating and modifying it—most of the time entirely outside your awareness.

But while the contents of your particular mental model determine what you think, feel, do, and say, you can’t search them—or follow a bread-crumb trail backward through them—to find out precisely which aspects (and when and how they came to be) give rise to any specific facet of who you are and how you react now.

The significance of your mental model in your life can’t be overstated. Although you aren’t consciously aware of it, your mental model circumscribes not only every aspect of your present experience but also what is possible for you to do and be. It determines what you see and how you see the world, both literally and figuratively, as well as how you see yourself.

But…Your Brain Can Get It Wrong

System 1, the unconscious part of your brain, uses associative thinking to develop and maintain your model of the world. However, there are some problems with associative thinking. For example:

  • It sacrifices accuracy for speed.
  • It doesn’t discriminate very well.
  • It takes cognitive shortcuts (aka cognitive biases).

Your mental model can—and sometimes does—lead to erroneous conclusions and inappropriate responses. It’s the job of consciousness (System 2) to check the impulses and suggestions it receives from System 1, but consciousness is slow, lazy, and easily depleted. Most of the time, it’s content to go along with System 1, which means it’s susceptible to cognitive biases. By definition, cognitive biases are distortions or errors in thinking. They actually decrease your understanding while giving you a feel-good sense of cognitive ease.

Confirmation bias is the easy acceptance of information that validates what you already believe. It causes you to selectively notice and pay attention to what confirms your beliefs and to ignore what doesn’t. It underlies the discomfort you feel around people who disagree with you and the ease you feel around people who share your beliefs.

Information that confirms what you already believe to be true makes you feel right and certain, so you’re likely to accept it uncritically. On the other hand, you’re more likely to reject information that is inconsistent with what you already believe, or at least to hold it up to greater scrutiny. You have different standards for evaluating information depending on the level of cognitive ease it generates.

Evidence has precious little impact on any of us if it conflicts with what we believe simply because the cognitive strain of processing it is too great. To a very real extent, we don’t even “see” conflicting evidence. While total commitment to our particular worldview (mental model) makes us feel more confident, it narrows—rather than expands—our possibilities. That means it limits our powers of discernment, our ability to increase our understanding of the world around us, and our creative potential. It closes the world off for us instead of opening it up.

The often-quoted statement is true: we don’t see things as they are; we see them as we are. If we want to live fuller lives, if we want to be more effective or useful or loving in the world, we first need to recognize that our greatest constraints are imposed by our own mental models.

It’s important to remember that what you see is not all there is.

*Daniel Kahneman, Thinking, Fast and Slow

Filed Under: Attention, Beliefs, Brain, Cognitive Biases, Consciousness, Mind, Unconscious Tagged With: Brain, Cognitive Biases, Mental Model, Mind, Perception, Reality

Brain Dead: Is Your Mind Temporarily Offline?

September 4, 2015 by Joycelyn Campbell

Your brain has two systems for processing the stimuli and experiences of your life and determining how you act upon them.

Conscious: The processing system you’re aware of is called System 2. It is logical and intentional and sometimes referred to as “true reasoning.” (A formal outline is a good example.) It is also slow, limited, and easily depleted. It processes about 40 bits of information at a time.

Unconscious: The processing system you’re not aware of is called System 1. It is associative, which means it sees patterns and connects dots. (A mindmap is a good example.) It is fast, vast, and always on. It processes about 11,000,000 bits of information at a time.

If System 1 were to go offline, you would, too. Game over! But you can still function when System 2 is temporarily offline, even for long periods of time, such as when you’re asleep. So when you think or talk about being temporarily brain dead, you’re talking about exhausting System 2 attention.

If you’re in good health, there’s not much you can do to tax or exhaust the capacity of System 1—and there are things you can do to enhance its functioning. However, your supply of System 2 attention is always limited, and anything that occupies your working memory reduces it. Some examples of things that tax System 2 attention are:

  • Physical illness (even minor), injury, or lack of sleep
  • Making numerous trivial decisions throughout the day
  • Stress, anxiety, and worry
  • Exercising will power (forcing yourself to do something you don’t want to do or to not do something you do want to do)
  • Monitoring your behavior
  • Monitoring your environment if it is new or you consider it unsafe
  • Learning something new, traveling an unfamiliar route, etc.
  • Completing a complex computation
  • Trying to tune out distractions
  • A long period of concentrated or focused attention
  • Trying to remember dates, numbers, or unrelated facts
  • Listening to me talk

Since System 1 is fast, vast, always on, and has an answer for almost everything—and since you don’t need System 2 attention for most of what you do when you’re awake—what’s the big deal if you run out of System 2 attention from time to time?

Three Categories of Errors

Optimally, the two systems work together, and neither type of processing is superior. However, System 1 is more useful in some situations, while System 2 is not only more useful but also required in other situations.

System 1 is pretty good at what it does: its models of familiar situations are accurate, so its short-term predictions tend to be accurate, too. But that’s not always the case. System 1 sacrifices accuracy for speed, which means it jumps to conclusions. It also has biases and is prone to making logical errors.

One of System 2’s jobs is to detect System 1’s errors and adjust course by overriding System 1’s impulses. As Daniel Kahneman says in Thinking, Fast and Slow:

There are vital tasks that only System 2 can perform because they require effort and acts of self-control in which the intuitions and impulses of System 1 are overcome.

Bear in mind that System 1 is not rational. If System 2 is depleted and can’t veto or modify the non-rational impulses of System 1, those impulses then turn into actions (or speech).

There are three categories of errors you tend to make when System 2 is depleted.

Logical Errors

System 1 thinking uses shortcuts. System 2 thinking takes the long (logical/linear) way home. So when you’re out of System 2 attention, you’re more prone to making mistakes in anything that requires logical, linear thinking. Errors of intuitive thought can be difficult for System 2 to catch on a good day. When System 2 is offline, you automatically assume them to be correct. As a result:

  • You will have trouble making, following, or checking the validity of a complex logical argument. You’ll be more likely to be led by the cognitive biases and distortions System 1 uses because they don’t require any effort and give you a comforting sense of cognitive ease.
  • You will have difficulty comparing the features of two items for overall value. If you have to make a choice, you’ll be more likely to go with what intuitively feels right or the item that has some emotionally compelling attribute (it reminds you of the one your mother had, for example, or reminds you of your mother).
  • You will be more gullible. You’ll be more likely to believe things you wouldn’t otherwise believe or be persuaded by empty messages, such as in commercials. System 2 is the skeptic, so the best time for someone to take advantage of you is when it is offline.

Intention or Response Errors

System 1 continuously picks up on cues and triggers in your environment to determine what situation you’re in and to predict what’s next. Any deviation from the norm requires System 2 attention. If it isn’t available, you’re likely to do not what you intended to do but whatever is normal for you in that situation. And without System 2 attention, you’re much more likely to respond automatically (habitually) to the stimulus (cue or trigger).

  • System 2 is in charge of self-control, continuously monitoring your behavior, keeping you polite, for example, when you’re angry. In the heat of the moment, when you’re out of System 2 attention, you’re much less likely to be able to suppress your immediate emotional reactions to people and situations.
  • System 1 has an answer for almost everything. But when it encounters a surprising situation (something it hasn’t previously encountered or that is unusual in that situation), it notifies System 2. You don’t need System 2 attention to drive a familiar route, but if you encounter an obstacle along that route, you need System 2 to figure out what it is and to respond appropriately to it.
  • System 2 is also in charge of will power. If you are in the process of trying to stop doing something you habitually do (such as raiding the refrigerator in the evening), you need System 2 to belay the impulse from System 1 to see if there’s more pie. Without System 2, you’re more likely to give in, look for the pie…and eat it.
  • You need System 2 if you want to take a different route from your usual one or make an extra stop you don’t normally make. Without adequate System 2 attention, you’re likely to find yourself taking the usual route and forgetting to make that stop.

Gatekeeping Errors

We all have biases, whether or not we’re aware of them and whether or not we want to admit it. While it’s easy to spot overt biases and prejudices in other people, most of your own biases are hidden even from you. In the case of biases toward specific groups of people, you’ve likely come to a reasoned conclusion that they’re wrong and have chosen not to think about or treat other people based on stereotypes. But that doesn’t mean the biases have disappeared. They’re still part of System 1’s associative processing operations. It’s just that when System 1 suggests a biased response to System 2, System 2 normally overrides it. Per Daniel Kahneman:

Conflict between an automatic reaction (System 1) and an intention to control it (System 2) is common in our lives.

When System 2 is depleted, there is no one at the gate to keep the biased or prejudiced responses from getting through. You may simply have a biased thought. You may say something in the presence of others that you wouldn’t normally say. Or you may respond to another person based on a group stereotype. The thought, comment, or behavior may be something you later regret. If you were to claim it doesn’t represent what you believe or the way you really feel or think, you’d most likely be right.

But when you see a blatant expression of bias or prejudice in someone else—especially a celebrity—you might have a different reaction. You might assume their true colors are showing. We think that what we see in other people when their guard is down and they’re pushed or stressed reveals the truth about them. But the actual truth is that to the extent we have any civility at all, it’s because System 2 maintains it. Without System 2, you and I would have no ability to question our biases or prejudices, no ability to come to reasoned conclusions about them, and no ability to monitor and veto System 1’s automatic reactions.

Conclusion

It isn’t always necessary, advisable, or even possible to override System 1. But when you deplete System 2, you can’t override it even when you want or need to. Without System 2, you can’t think straight (logically and linearly). So:

  • Don’t try to make important decisions of any kind when you feel brain dead.
  • Don’t assume you’ll feel or think the same way about something the next day as you do when you’re stressed or sick, have just completed your annual tax return, or have recently fallen in love.
  • Don’t stay up late to watch the QVC channel unless you have a lot of money you’re trying to unload.
  • Don’t keep pie around if you’re trying not to eat it.
  • Don’t get into debates about complex issues after you’ve had a few beers.
  • Don’t tax your working memory with details you can keep track of some other way.
  • Don’t take System 2’s censoring of your biases and prejudices for granted. And don’t assume other people’s mental lapses reveal deep-seated truths about them.

Filed Under: Attention, Brain, Cognitive Biases, Consciousness, Living, Memory, Mind, Unconscious Tagged With: Brain, Brain Dead, Cognitive Biases, Daniel Kahneman, Fast and Slow, Mind, Predictably Irrational, System 1, System 2, Thinking

The Self-Serving Bias

April 26, 2014 by Joycelyn Campbell

It’s fascinating to explore the effects of cognitive biases on our behavior. Here’s a short video that explains the self-serving bias.

Filed Under: Beliefs, Brain, Cognitive Biases, Living, Mind, Unconscious Tagged With: Brain, Cognitive Biases, Mind, Self-Serving Bias, Thinking

WHEN You Choose Can Impact WHAT You Choose

April 24, 2014 by Joycelyn Campbell

People tend to favor maintaining the status quo to such an extent that it’s a recognized cognitive bias, one of many systematic distortions of thinking we’re prone to. The status-quo bias goes hand-in-hand with the loss-aversion bias, which leads us to pay more attention to what we might lose than to what we might gain. The status quo often feels less risky, whether or not it actually is.

It stands to reason, then, that when faced with making a choice between an option that maintains the status quo (the default option) and an alternative option, we’d be more likely to choose the default option. And we are—but only if we make the choice immediately. If we delay making a choice that we could have made immediately, we’re much more likely to choose the alternative option.

There have been a number of studies over the past 25 years, all of which show the same results. Simply delaying making a decision we could have made immediately decreases the likelihood we’ll choose the default option. It doesn’t matter what the options are or which option, if either, is the better choice. Delay itself casts doubt on the default option.

Failure to Choose

This isn’t hard to understand. If we could have made a choice immediately, then why didn’t we? The know-it-all interpreter—or explainer—inside our head has an answer for this, as it does for just about everything: obviously there’s some doubt as to the appeal of the default option. Otherwise, based on the status quo bias, we would have chosen it immediately.

It also turns out that being in a state of doubt about something that is completely unrelated to the choice at hand can have the same impact on our choice. Doubt, in general, influences us to choose the alternative option rather than the default option.

Delay and doubt are factors we should take into consideration when we’re faced with making a choice between a default option and an alternative option. The conventional wisdom is that taking time to make a choice leads to making better choices. That seems reasonable, but it isn’t entirely accurate. Yes, delay has an effect; it’s just not the effect we may have attributed to it.

If we’re aware that delay tends to make the default option seem less appealing, we can factor that in when choosing when to choose. We can mitigate some of the effect of delaying choice just by knowing the effect is there.Enhanced by Zemanta

Filed Under: Brain, Choice, Cognitive Biases, Mind, Unconscious Tagged With: Brain, Choice, Cognitive Biases, Decision Delay, Mind, Status-Quo Bias, Thinking
