Every cognitive bias exists for a reason—primarily to save our brains time or energy.
I’ve spent many years referencing Wikipedia’s list of cognitive biases whenever I have a hunch that a certain type of thinking might be an official bias but can’t recall its name or details. Yet despite trying to absorb the information on this page many times over the years, very little of it seems to stick.
I decided to try to absorb and understand this list more deeply by coming up with a simpler, clearer organizing structure. If you look at these biases according to the problem they’re trying to solve, it becomes much easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) they introduce.
Four problems that biases help us address:
Information overload, lack of meaning, the need to act fast, and knowing what to remember for later.
Problem 1: Too much information
There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely going to be useful somehow.
- We notice things that are already primed in memory or repeated often. Our brains are simply more likely to notice things related to information that has recently been loaded into memory.
- See Availability heuristic, Attentional bias, Illusory truth effect, Mere exposure effect, Context effect, Cue-dependent forgetting, Mood-congruent memory bias, Frequency illusion, Baader-Meinhof Phenomenon, Empathy gap, Omission bias, or the Base rate fallacy.
- Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things. Our brains tend to boost the importance of things that are unusual or surprising. Alternatively, we tend to skip over information that we think is ordinary or expected.
- See Bizarreness effect, Humor effect, Von Restorff effect, Picture superiority effect, Self-relevance effect, or Negativity bias.
- We notice when something has changed—and we’ll generally tend to weigh the significance of the new value by the direction the change happened (positive or negative) more than re-evaluating the new value as if it had been presented alone. This also applies to when we compare two similar things.
- See: Anchoring, Contrast effect, Focusing effect, Money illusion, Framing effect, Weber–Fechner law, Conservatism, or Distinction bias.
- We are drawn to details that confirm our own existing beliefs. This is a big one. As is the corollary: we tend to ignore details that contradict our own beliefs.
- See Confirmation bias, Congruence bias, Post-purchase rationalization, Choice-supportive bias, Selective perception, Observer-expectancy effect, Experimenter’s bias, Observer effect, Expectation bias, Ostrich effect, Subjective validation, Continued influence effect, or Semmelweis reflex.
- We notice flaws in others more easily than flaws in ourselves. Yes, before you see this entire article as a list of quirks that compromise how other people think, realize that you are also subject to these biases.
- See Bias blind spot, Naïve cynicism, or Naïve realism.
Problem 2: Not enough meaning
The world is very confusing, and we end up seeing only a tiny sliver of it—but we need to make some sense of it to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with what we already think we know, and update our mental models of the world.