
All the Biases: Too Much Information, part 1
July 2, 2022

Posted by Ubi Dubium in Brain Glitches.

I’m going to start a new occasional series, based on this graphic, which takes Wikipedia’s list of cognitive biases and sorts them into groupings:

I realize that you can’t possibly read that at this size, so you can see the full version here.  The original story where I found this graphic is here.

Now, cognitive biases are one of my favorite subjects.  Understanding how our own brains mess up and lead us astray can really help us when we are striving for clearer thinking. So I thought I’d start working through this chart one group at a time, as a good chance to review individual cognitive biases, learn about some new ones, and see if I agree with the groupings in the graphic.

As a reminder, a cognitive bias is a systematic error in thinking, something that gets in the way of our rationality.  A lot of them stem from our need to be able to make quick judgments without complete information, and to take mental shortcuts because we just don’t have the brain capacity to think through every decision we have to make.  But sometimes they can lead us to make serious and costly errors, and it’s worthwhile to try to understand them as well as we can, and to catch ourselves when we are committing one.

So, starting at the top of this chart, and working clockwise, the first large grouping is labeled “Too Much Information” and the first subgroup is labeled “We notice things already primed in memory or repeated often.”  Here’s the list of what they place under that category:

  • Availability Heuristic
  • Attentional Bias
  • Illusory Truth Effect
  • Mere-Exposure Effect
  • Context Effect
  • Cue Dependent Forgetting
  • Mood Congruent Memory Bias
  • Frequency Illusion
  • Baader-Meinhof Phenomenon
  • Empathy Gap
  • Omission Bias
  • Base Rate Fallacy

(A “heuristic” is a mental shortcut, a “rule of thumb” if you will.)

The world is too full of information, and our little monkey-brains can’t deal with all of it at once.  We have to filter it, to choose what we will take in, think about, and base our decisions on.  Some of that choice is deliberate on our part, but our brains also do a lot of filtering without our even realizing it.  Let’s take a quick look at each of the biases listed above:

  • Availability Heuristic

When we are evaluating information, we tend to give more weight to what easily comes to mind.  The things we learned about recently, or hear about a lot, or recall easily, seem more important than other information.  Example:  People who have recently seen a news report about a shark attack are more likely to overestimate the actual risk of shark attacks as compared to other dangers.  Or, a magistrate talking to a class full of schoolchildren on career day asks “What’s the most common thing people are arrested for?” and the class answers “Murder!”  They’ve been influenced by watching TV news and crime dramas, because the most common causes of arrests are much more prosaic: public drunkenness, DUI, and domestic assaults. Those don’t make the news, though, which affects the kids’ assessment.

  • Attentional Bias

What we perceive is affected by what is already on our minds. Example:  Someone who is trying to quit smoking notices cues in their environment related to smoking more often than a non-smoker does.

Or one from my own experience:  When a car goes by, someone who is really into cars will see all kinds of detail, and might be able to tell you the make, model and year of what they saw.  Whereas, since I have no interest in the subject, if you asked me what kind of car it was I might say “A big one.  And it was blue maybe?”  And that’s all the information that I would have registered, even though I saw the very same car.

  • Illusory Truth Effect

The more you hear a statement, the more likely you are to accept it as true.  Even when a statement is blatantly false, constant repetition makes it feel more familiar, and the brain mistakes familiarity for truth.  Purveyors of fake news take advantage of this, and are constantly trying to insert their false assertions into the public discourse as often as they can.  Even if the mention is along the lines of “false statement X is not true,” the fact that “false statement X” has been repeated still produces the effect.

  • Mere-Exposure Effect

Also called the “familiarity principle”.  We develop a preference for things that we are used to, even if being “used to” something simply means having seen it a few times before, and even if we aren’t really consciously aware that we have seen it.  Advertisers take advantage of this all the time, with product placement and logos plastered everywhere.

  • Context Effect

This is a pretty broad one.  How we perceive something depends on the surrounding environment.  This could include useful things, like being able to figure out an illegible word in a message based on the rest of the text.  Or whether a smell is perceived as pleasant can be affected by whether it’s coming from old socks or from aged cheese.  But it could also include perceptions based on something in your environment that’s totally irrelevant.  For example, how much people are enjoying a TV show affects their evaluation of the quality of the commercials played during it.  How much you are willing to pay for a product is affected by the retail environment of the store, or the prices of the products around it (anchoring bias).  Our brains are primed to see things in a certain way, by factors that we often aren’t aware of.

  • Cue Dependent Forgetting

We store our memories by association with other memories.  We may not be able to remember a whole category of things well, or at all, unless a particular cue is given.  Perhaps you can’t remember the details of a long-ago vacation until someone mentions one thing that happened, and a whole set of memories is then triggered for you.  Or, my own example, as a choral singer my head is full of songs from the years of singing I’ve done.  But I may not be able to sing you a particular song from years ago, until I actually hear a snippet of it.  Then it will come flooding back and I’ll find myself singing parts of it all day.

This is also a reason to study for exams in the same sort of room that you will be taking the exam in.

(This one is actually about memory retrieval, not how we filter incoming information.  I think it belongs in the section “We store memories differently based on how they were experienced.”  I’ll try to remember to talk about it again when I get to that section.)

  • Mood Congruent Memory Bias

We are better at remembering things that match our current emotional state.  It’s easier to remember the good parts of our lives when we are relaxed and happy, and when we are angry or depressed we are more likely to recall bad events.

(This, like the one just above, is also about memory retrieval, and I’m going to move it to that section.)

  • Frequency Illusion
  • Baader-Meinhof Phenomenon

This graphic lists these two as separate entries, but as far as I can tell they are the same thing.  When you become aware of a thing for the first time, you may suddenly notice it more often, giving you the illusion that it has suddenly become much more frequent.  But the only thing that changed was your awareness of it.  As an example, when I watch a well-known movie for the first time, afterwards I’ll notice a lot of quotes from that movie popping up.  Of course, people around me were already quoting it; I just didn’t recognize what it was before.

  • Empathy Gap

Sometimes we fail to correctly understand another person’s emotional state, and miss important information because of it.  This often happens when we are thinking about members of a hated outgroup, and not considering that they are also people with their own motivations.  I think this is similar to the Fundamental Attribution Error (FAE): “I do the things I do because I have specific reasons, but the other guy does what he does because of the kind of person he is.”

(This one is in the wrong category too, as I would put it under “We fill in characteristics from stereotypes, generalities and prior histories.”)

  • Omission Bias

When evaluating choices, we more readily excuse a choice that leads to a bad result through inaction than a choice that leads to a bad result through action.  Trolley problems often explore this idea: someone is more willing to let the onrushing trolley be stopped by a person already on the track than to push that person onto the track themselves, even though the end result is the same.

(Again, I think this one is in the wrong category.  This is about how we make choices about actions, not about how we filter incoming information.  I’m going to move this one to one of the categories under “We need to act fast.”)

  • Base Rate Fallacy

We give more importance to the specific example in front of us, and less importance to what is known to be generally true.  False positives in testing are a classic example.  Suppose an employer is giving a drug test to his staff of 1001.  One of these people is actually a drug user, and the test he is using is 99% accurate.  The chances are 99% that the drug user will have a positive test.  But of the 1000 non-drug-using staff, 1% of them (ten people) receive false positive tests.  So the employer now receives a report listing eleven employees with a positive drug test.  For each of them, the probability is really only about one in eleven that they are a drug user, but will the boss take the time to understand the base rate and realize this?  More often he would just treat them all as miscreants.
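
To make the arithmetic concrete, here’s a minimal sketch in Python (my own illustration, not from the chart or its source; the variable names are mine) that works the same example through Bayes’ theorem, assuming “99% accurate” means both a 99% detection rate and a 1% false-positive rate:

    # Base rate fallacy: the drug-testing example above.
    # Assumed: 1001 staff, exactly 1 actual user, and a test with a
    # 99% detection rate and a 1% false-positive rate.
    staff = 1001
    users = 1
    accuracy = 0.99

    expected_true_positives = users * accuracy                   # 0.99
    expected_false_positives = (staff - users) * (1 - accuracy)  # 10

    # Bayes: P(user | positive) = true positives / all positives
    p_user_given_positive = expected_true_positives / (
        expected_true_positives + expected_false_positives
    )

    print(f"Expected positive tests: {expected_true_positives + expected_false_positives:.2f}")
    print(f"P(actual user | positive test): {p_user_given_positive:.3f}")  # ~0.090, about 1 in 11

The boss sees eleven names on the report, but each name carries only about a 9% chance of guilt. That’s what paying attention to the base rate buys you.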


So many of the entries here belong under other categories on the chart! I don’t have the graphic skills to rework the chart myself.  So, what I think I’ll do is begin a running list on my Brain Glitches page, with the name of each bias and a one-sentence description, putting each bias in the group where I think it actually belongs.

Anybody else have any examples of any of these biases in action?


Comments»

1. Daniel Digby - July 3, 2022

You mentioned this diagram once before a few years ago, but I never followed up on it. When I saw omission bias, I misinterpreted it as an example of something else, where consideration of a subject is thought to be too complex to be worth the time, and so understanding the situation is not even an option. I obviously didn’t know the meaning of omission bias, but is there a name for what I thought it was?

There is something else, usually called the recency illusion, where, on encountering something for the first time, you think it’s a new phenomenon. For me, it happened in 1948 when I heard “Take Me Out to the Ball Game” for the first time and thought it was a new song. In reality it dated back to the early 1900s, shortly after the invention of Cracker Jacks. Another example is in “Just Between Dr. Language and I” at http://itre.cis.upenn.edu/~myl/languagelog/archives/002386.html. Does this fit somewhere else in the diagram?

Thanks for taking the time to break this down.


2. Ubi Dubium - July 3, 2022

Both of those may show up as I work through the other biases on this chart. I haven’t read through all of them yet; I’ll do that as I go. The “recency illusion” certainly sounds like it should be on here.


3. Brent - July 5, 2022

Cool! Thanks for doing this; I try to go through these periodically and am always happy to do it again.


4. Headless Unicorn Guy - August 1, 2022

Illusory Truth Effect:

“A lie, repeated often enough, becomes Truth.”
— Reichsminister Josef Goebbels


