
Cognitive biases are predictable patterns of mental shortcutting that lead to systematic, non-random errors in judgment. Once you understand them, you can learn to recognize them efficiently. They're essentially the brain's default settings, running subconsciously. Think of them as a predictable ghost in the machine of human reasoning, one that guides us toward expediency over accuracy. That's not always a good thing.

Behavioral scientists understand the phenomenon through a fundamental dichotomy between two modes of thinking: Type 1 and Type 2.

  • Type 1: This type of thinking is automatic and requires little energy. It's built for rapid decision-making under uncertainty and is the fastest path to a reaction when we face threats. That said, it's awful at handling probabilistic complexity.

  • Type 2: This is a slower thinking process. It's more deliberate and analytical, and it takes noticeably more effort. We use this mode of thinking to work through complex situations and equations.

The modern scientific exploration of this structural limitation began in 1972 with the groundbreaking work of psychologists Amos Tversky and Daniel Kahneman. Their initial findings centered heavily on innumeracy: the fundamental inability of people to intuitively grasp basic probabilistic and statistical concepts, especially at scale.

The genesis of bias in innumeracy highlights a critical point. The human mind is adapted for small-scale, ancestral environments where risk and probability were simpler, yet modern society demands that we handle far more complex scenarios.

We've dragged this ancient, energy-efficient, almost software-like thinking mechanism into a modern context it was never built for, and the result is systemic failure. The bias itself is an adaptive mechanism failing outside its original context. That's why I believe understanding the many forms of cognitive bias requires exploring both functional and symptomatic categories.

The Total Count of Cognitive Biases

The question, "How many cognitive biases are there?" does not generate a simple numerical answer. The true count depends entirely on whether you're listing every observable symptom or mapping the underlying functional structure.

If you want a comprehensive census of observed errors, one of the most cited illustrative resources, the Cognitive Bias Codex (a radial dendrogram popularized by John Manoogian III and Buster Benson), lists approximately 188 distinct cognitive biases.

It's a massive catalog that emphasizes the complexity and pervasiveness of our cognitive shortcomings, detailing everything from the Bandwagon Effect to the Ambiguity Effect.

Still, relying solely on this high count frames the human mind as "woefully muddled" or fundamentally irrational. It's a conclusion that risks leading to a state of debilitating cynicism.

That's why I believe attempting to compile a list of all cognitive biases can be overwhelming without a structural system.

A powerful counter-perspective, rooted in evolutionary psychology, argues for a more subtle understanding. This adaptationist view suggests that much of bias research can be "profitably reframed and understood in evolutionary terms."

Here, biases are not design failures to be cataloged under some system, but adaptive heuristics that were "remarkably well designed for important problems of survival and reproduction" in the ancestral environment.

This perspective reduces the overwhelming catalog into a functional taxonomy. It suggests biases fall into three major categories:

  • Heuristics (shortcuts)

  • Error Management Effects (systems designed to minimize the most costly type of error; a sketch of this logic follows the list)

  • Experimental Artifacts (errors specific to research conditions)
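
To make the middle category concrete, here is a minimal sketch of the error-management logic in Python. Every number, cost, and signal distribution below is invented for illustration; the point is only that when missing a real threat costs far more than a false alarm, the cost-minimizing decision threshold shifts toward jumpiness. What looks like a bias is the optimal setting for an asymmetric problem.

```python
# A toy error-management model: when missing a real threat costs far more
# than a false alarm, the cost-minimizing threshold becomes "jumpy".
# All distributions, priors, and costs are invented for illustration.
from statistics import NormalDist

THREAT = NormalDist(mu=0.7, sigma=0.15)     # danger signal when a threat is real
NO_THREAT = NormalDist(mu=0.3, sigma=0.15)  # danger signal when it's nothing
P_THREAT = 0.05                             # real threats are rare

def expected_cost(threshold, cost_miss, cost_false_alarm):
    """Expected cost of fleeing only when the signal exceeds `threshold`."""
    p_miss = THREAT.cdf(threshold)                # real threat, signal too weak
    p_false_alarm = 1 - NO_THREAT.cdf(threshold)  # no threat, signal too strong
    return (P_THREAT * p_miss * cost_miss
            + (1 - P_THREAT) * p_false_alarm * cost_false_alarm)

def best_threshold(cost_miss, cost_false_alarm):
    grid = [t / 1000 for t in range(1001)]
    return min(grid, key=lambda t: expected_cost(t, cost_miss, cost_false_alarm))

print("symmetric costs (1:1):    flee above", best_threshold(1, 1))    # ≈ 0.67
print("asymmetric costs (100:1): flee above", best_threshold(100, 1))  # ≈ 0.41
```

Under these made-up parameters, the cost asymmetry alone drags the optimal threshold from roughly 0.67 down to roughly 0.41. The "paranoid" setting is the rational one, which is exactly why error management effects look like flaws but behave like features.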

The difference between these two systems carries enormous strategic weight. If biases are simple errors, the logical solution is... logic, but better logic. 

That said, if biases are adaptive mechanisms, particularly Error Management Effects, then those wanting to manipulate you do not attack a flaw. Instead, they press a feature. Understanding the bias's original, protective purpose is essential to defending against its modern exploitation.

| Classification System | Strategic Focus | Approximate Count/Categories |
| --- | --- | --- |
| The Cognitive Bias Codex | Defines Vulnerability (how the mind can fail) | ~188 distinct errors |
| Evolutionary Taxonomy (Adaptive Rationality) | Defines the Lever (the adaptive function that can be exploited) | 3 categories (Heuristics, Error Management, Artifacts) |


Types of Cognitive Biases You Should Know

Regardless of the classification system you prefer, a few fundamental biases persist across all of them, and they're worth knowing.

One of the most common cognitive bias types is the anchoring effect, and you've almost certainly experienced it in everyday life. The first piece of information received, known as the "anchor," exerts an outsized and often irrational influence on subsequent decisions. This initial information becomes a reference point that distorts perception: an expensive item seems more reasonable when compared to an even higher anchor price, and an initial salary offer below your worth starts to look acceptable.
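
A common way to formalize this is the anchoring-and-adjustment account: you start at the anchor and adjust toward your own appraisal, but the adjustment stops short. Here is a minimal sketch of that idea; the fair value, the anchor prices, and the 0.6 adjustment factor are all invented for illustration.

```python
# Anchoring-and-adjustment in one line of arithmetic: the estimate starts
# at the anchor and moves toward an unbiased appraisal, but only part of
# the way. All numbers are invented for illustration.

def anchored_estimate(anchor, unbiased_estimate, adjustment=0.6):
    """Insufficient adjustment: any 0 <= adjustment < 1 leaves residual pull."""
    return anchor + adjustment * (unbiased_estimate - anchor)

fair_value = 100  # what a careful, anchor-free appraisal would conclude

for anchor in (40, 100, 300):
    print(f"anchor {anchor:>3} -> estimate {anchored_estimate(anchor, fair_value):.0f}")

# anchor  40 -> estimate 76    (dragged low)
# anchor 100 -> estimate 100   (no distortion)
# anchor 300 -> estimate 180   (the $300 "original price" makes $180 feel fair)
```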

In 2020, research confirmed the power of the anchoring effect in consumer price decisions. Researchers noted that participants were anchored not only by previous numeric values but also by non-quantitative elements such as "experience perception" tied to a product's design concept or innovation. In that context, the feeling of quality becomes the anchor. It shows that emotional and aesthetic input can exert the same rigid gravitational pull as a numerical data point.

Perhaps even more interesting, increasing the scale of data collection does not create greater rationality. Instead, it can magnify existing cognitive errors. Researchers warn that relying on "big data" can inadvertently create "big inferential errors," especially when the dataset is riddled with underlying sampling error or measurement bias. That's a distinctly modern problem, with so much now riding on big data analytics and the datasets behind large language models.

This phenomenon perfectly echoes the infamous 1936 Literary Digest poll. That poll gathered a huge sample of 2.4 million respondents to predict the presidential election, yet it was catastrophically wrong. The flaw was not the size; it was structural sampling bias. The survey reached primarily magazine readers, automobile owners, and telephone subscribers, a demographic that skewed heavily toward the losing candidate. Despite the colossal sample, the error was fundamental.
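
A toy simulation makes the lesson concrete. The support figures below are invented rather than historical, but the structure mirrors the Digest's failure: an enormous sample drawn from a skewed frame converges, very precisely, on the wrong answer, while a small random sample lands near the truth.

```python
# Sample size cannot fix sampling bias: a huge poll drawn from a skewed
# subpopulation converges confidently on the wrong answer. Illustrative
# numbers only; these are not the actual 1936 figures.
import random

random.seed(1936)

TRUE_SUPPORT = 0.60    # true population support for candidate A
SKEWED_SUPPORT = 0.43  # support within the reachable subpopulation
                       # (magazine readers, phone and car owners)

def poll(n, p):
    """Survey n voters, each backing candidate A with probability p."""
    return sum(random.random() < p for _ in range(n)) / n

big_biased = poll(2_400_000, SKEWED_SUPPORT)  # enormous n, wrong sampling frame
small_random = poll(1_000, TRUE_SUPPORT)      # tiny n, correct sampling frame

print(f"biased poll of 2.4M:  {big_biased:.3f} (precisely wrong)")
print(f"random poll of 1,000: {small_random:.3f} (roughly right; truth is {TRUE_SUPPORT})")
```

The big poll's standard error is microscopic, which only makes its wrong answer more convincing. That is the "big inferential error" in miniature.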

When external actors, such as media outlets, sophisticated marketers, or political operators, use targeted information (selective exposure) to exploit these internal, adaptive biases, and simultaneously rely on flawed big data systems that amplify those errors, predictable problems arise. The biggest? You experience gaslighting.

Defense against this requires more than better facts; it requires a radical form of self-awareness. Recognizing that your mind's adaptive defaults are now strategic targets is a necessity detailed in works on vulnerability and power, such as my own The Black Book of Power. Defending against manipulation demands recognizing where and how your psychological wiring can be exploited.

Why So Many Cognitive Biases Exist

The persistence of cognitive bias becomes easier to understand when you consider the physical demands of human thought.

The brain naturally favors the fast, intuitive Type 1 process because it demands minimal metabolic expenditure, and you're running it all the time without thinking about it. Overriding these fast pathways requires activating the slower, high-energy executive functions housed primarily in the prefrontal cortex (PFC), the center of self-regulation and metacognition.

The scientific community is getting closer to pinpointing the physical machinery of cognitive failure and distortion. Recent research is specifically focused on identifying neural biomarkers of confirmation bias.

One striking finding from neural population activity analysis complicates the popular idea that bias is simply the mind rejecting or ignoring contrary facts. Studies found that the neural encoding of belief-consistent evidence was often indistinguishable from the encoding of belief-inconsistent evidence.

Interestingly, that means the brain receives contradictory facts and affirming facts comparably at the sensory level. The systematic deviation, therefore, doesn't reside in the initial reception of data. It resides in the rapid, emotional "belief" weighting assigned immediately after processing.
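
One way to picture that distinction is a toy belief-updating model where the bias lives in the weighting step rather than the reception step. This is a speculative sketch of the idea, not a published model; the evidence values and the 0.3 discount applied to belief-inconsistent evidence are invented.

```python
# Bias as weighting, not reception: every piece of evidence arrives intact,
# but evidence against the current belief is discounted after processing.
# Illustrative sketch only; the 0.3 weight is an invented parameter.
import math

def update_log_odds(log_odds, evidence_log_lr, inconsistent_weight=0.3):
    """Bayesian-style update that down-weights belief-inconsistent evidence."""
    favors_belief = (evidence_log_lr > 0) == (log_odds > 0)
    weight = 1.0 if favors_belief else inconsistent_weight
    return log_odds + weight * evidence_log_lr

belief = 2.0  # log-odds; the holder already strongly favors the hypothesis

# Equally strong evidence for and against arrives at the senses intact...
for evidence in (+1.0, -1.0, -1.0, +1.0, -1.0):
    belief = update_log_odds(belief, evidence)

# ...but the asymmetric weighting means the belief barely budges.
print(f"final log-odds {belief:.1f}, P(belief) ≈ {1 / (1 + math.exp(-belief)):.2f}")
# An unbiased updater (weight 1.0 throughout) would end at log-odds 1.0;
# this one ends at 3.1, more convinced than when it started.
```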

The central implication is that simply introducing more facts will not reliably dismantle a deeply held bias. The rational override is the act of resolving the conflict between logic and belief.

It requires intense self-regulation, facilitated by metacognition. This structural difficulty explains why changing a deep conviction feels like physical work. You're literally forcing your energy-efficient brain to perform a resource-heavy Type 2 calculation, against its default nature.

And we can all admit that sounds like effort just reading about it, let alone doing it. It's true: the cognitive effort is real, and the brain is programmed to conserve that energy.

The strategic defense against cognitive exploitation begins with a precise map of your own cognitive weak points. However, one of the most insidious effects of bias is that most people cannot accurately self-assess their own susceptibility: we observe irrationality in others but believe ourselves to be logical, a pattern known as the bias blind spot.

Modern psychological research is addressing this measurement challenge by developing scales focusing on perception rather than just performance. The Perception of Cognitive Biases in Decision-Making Scale (PCBDM-S) is one of the most interesting. Initially validated in mid-2025, it provides a psychometric tool designed to measure an individual's self-awareness regarding their cognitive bias tendencies.

This acknowledgment that awareness is the first step toward change aligns with earlier critical distinctions in clinical tools. For example, the Cognitive Biases Questionnaire for psychosis (CBQp) suggested that many common distortions are not errors of pure reasoning but biases of interpretation. If your problem is how you assign meaning to incoming information (the subjective filter), then simply acquiring more objective facts will be insufficient.

When interpretation itself is biased, the facts will naturally be twisted to fit the pre-existing emotional system.

The ultimate strategic intervention must be internal. It must focus on managing the interpretive frame. To map precisely how external forces target and exploit these hidden interpretive flaws and your individual patterns, specialized self-assessment is essential.

Since biases are adaptive structures and cannot be simply erased, you must strategically manage them. Practical application relies on preemptive effort.

If you can, learn to acknowledge the onset of decision fatigue. It's a state known to severely impair judgment and force an immediate retreat to the Type 1 default. That awareness should trigger a pause, preventing high-stakes decisions from being ceded to the efficient but error-prone system.

And if possible, institute a delay between hearing critical information and making a final judgment. I know this isn't always an option in everyday life; not everyone will give you a few minutes to pause mid-conversation. Still, this deliberate friction forces the expensive Type 2 override, giving the prefrontal cortex the time it needs to resolve the conflict of belief.

Finally, counter the powerful draw of attitude-congruent information. If you can, actively seek and consume high-quality, reasoned arguments that attack your deeply held beliefs. Do this not so that you are immediately convinced, but to strengthen your metacognitive muscles. You can treat the contradictory argument as a weight to be lifted by your regulatory functions.

Before attempting any intervention, knowing your archetype helps you understand which strategies will actually work for your psychological wiring.

One Last Point

Counting cognitive biases isn't really the point. The core significance of asking how many there are, whether the answer is 3 functional categories or 188 descriptive errors, is that the human mind is a system of brilliant, high-speed compromises. Those adaptations make us incredibly efficient but structurally vulnerable in the abstract, high-stakes context of modern power relations.

This understanding should never lead you into the rationalist's trap: a state of cynicism where you conclude that all thought is fatally flawed and therefore meaningless. I prefer to think of this extensive catalog of biases as a detailed operational manual for the machinery of the self. It exposes exactly how your psychological defenses were designed to run and, consequently, where they can be most easily targeted.

Externally, the structures of influence will always find the path of least resistance and exploit the adaptive errors encoded in your natural wiring. Internally, the true measure of strategic intelligence is not avoiding mistakes, but recognizing them the moment they happen and applying the necessary metacognitive effort to refine your judgment.

As I detailed in The Black Book of Power, the power to command others begins with the painful but necessary effort of commanding your own attention.