20+ popular heuristics and cognitive biases

By the Bitbrain team
May 2, 2019

We believe that reality is exactly what we perceive, but this is in fact an illusion created by our own brain. It happens because the brain takes shortcuts to interpret information and adapt to our surroundings: heuristics on which it relies to make sense of the reality it perceives. But can we always trust our brain when it uses these shortcuts?

The brain represents only 2-3% of our body weight but consumes approximately 20% of our energy. It is a very “expensive” organ that is constantly looking for ways to save energy and avoid high cognitive loads. Because of this, many neuroscientists refer to the brain as a “cognitive miser”.

This behavior directly affects our daily decision-making, which is not carried out as rationally as we like to believe. In many cases, our decisions are based on heuristics: procedures that help us when solving a problem. Thanks to these brain shortcuts we think fast, forming intuitive judgments from partial information, past experience, or assumptions. Nevertheless, these heuristics can lead us to incorrect judgments and decisions. This is where biases appear: cognitive prejudices and pre-existing beliefs that lead to errors and can make us act irrationally or interpret reality illogically (logical fallacy).

Classification of heuristics and cognitive biases

There are more than 200 types of heuristics and cognitive biases in the scientific literature, and classifying them is complicated. Buster Benson proposed a simple classification based on the problem each shortcut helps the brain solve:

When there is too much information

We are surrounded by more than 11 million bits of information per second, and it is impossible to process them all consciously. Our brain employs mental shortcuts to select only the information it considers useful. For example:

  1. The brain concentrates more on things nonconsciously associated with concepts we use frequently or have used recently (availability heuristic). This is why, when we are facing the shelves at a supermarket, we quickly find the product we are looking for, or why, when we are expecting a child, we see pregnant women everywhere (selective attention bias).

  2. The brain pays more attention and assigns more value to things that are odd, funny, striking, or surprising, and we generally ignore information we consider ordinary or expected (von Restorff effect). For example, in the image, neuromarketing techniques such as eye tracking show that the zone of maximum interest is the person holding the sign, who is seen by 100% of viewers and receives the most viewing time. The remaining elements are viewed by fewer than 25% of people, and for less than 0.25 seconds.

visual attention in advertising

  3. The brain is skilled at detecting that something has changed, and we tend to evaluate the novelty by the effect of the change (positive or negative) rather than by its absolute value, that is, the value it would have on its own (anchoring effect). This is why, when WhatsApp decided to charge 0.99 € for its service, many users felt cheated. The problem was not the price, but the fact that the previous price was 0 €.
  4. The brain usually focuses attention on whatever confirms our opinions and beliefs and ignores whatever contradicts them (confirmation bias). In other words, confirmation bias is the tendency to give much more credibility to what is aligned with our way of thinking, and it leads us to give more weight to information from one media outlet than from another.

  5. The brain detects flaws in other people much more easily than our own (bias blind spot). Thus we think that other people are much more easily influenced than we are, for example, by advertising. Related to the blind spot bias is the self-serving bias, which makes us claim more responsibility for successes than for failures.

When we don’t know how to make sense of what surrounds us

As we only process a small part of the information needed for a completely objective view of the world, we fill in the gaps to give meaning to what surrounds us.

  1. The brain finds stories and patterns even in sparse data (clustering illusion, or apophenia). This is natural: it helps us avoid the feeling of unfamiliarity, of something we dislike or that makes us feel insecure. It is why we see shapes in clouds.

  2. The brain fills in missing information with its best guesses (stereotypes, generalizations, our own or other people’s past experiences), but we then forget which parts were real and which were guesses. For example, we rate a hotel on Booking as very good because of highly positive reviews from other people, rather than because of the objective information provided by the establishment (bandwagon effect, or social proof).

  3. The brain gives more value to people or things we are familiar with. In other words, we fold assumptions into the evaluation of what we see. For example, we tend to think that attractive people are more intelligent and kind than less attractive people, because we generalize one positive trait to the whole person (halo effect). Related to this is the correspondence bias, or fundamental attribution error, which explains behavior on the basis of the “type” of person involved rather than the social and environmental factors that surround and influence that person.

  4. The brain simplifies probabilities and calculations so they are easier to think about. Nevertheless, we are poor intuitive statisticians and make serious mistakes (law of small numbers). For example, when playing roulette we don’t want to bet on “red” if the five previous results were red, even though the wheel has no memory (see the simulation sketch after this list).

  5. The brain makes us believe that we know what others think, and we model other people’s minds on our own (false consensus bias and projection bias). This is why we believe that everyone will love the movie we enjoyed so much.

  6. The brain projects our current mentality onto the past or the future, making us believe that it is, or was, easy to anticipate events (hindsight bias). Once something happens, we say and feel that “I knew it all along,” although that was probably not true.
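A minimal simulation sketch of the roulette example above, written in Python with only the standard library (the streak length, number of spins, and random seed are illustrative choices, not anything from the literature):

```python
import random

# Gambler's fallacy sketch: on a European roulette wheel (18 red, 18 black,
# 1 green pocket) P(red) = 18/37 on every spin, no matter what came before.
# We check the colour that follows a streak of five reds: if the fallacy
# were right, red would become less likely after the streak. It does not.
random.seed(7)  # fixed seed so the sketch is reproducible

def spin() -> str:
    pocket = random.randrange(37)  # pockets 0..36; colour counts match a real wheel
    if pocket == 0:
        return "green"
    return "red" if pocket <= 18 else "black"

spins = [spin() for _ in range(1_000_000)]
after_streak = [spins[i] for i in range(5, len(spins))
                if all(colour == "red" for colour in spins[i - 5:i])]

print(f"P(red) overall:            {spins.count('red') / len(spins):.3f}")
print(f"P(red | five reds before): {after_streak.count('red') / len(after_streak):.3f}")
```

Both printed proportions come out at roughly 18/37 ≈ 0.486: the previous five spins tell us nothing about the next one.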

When we have to act fast

We are limited by time and the amount of information that we can process, but we cannot let this paralyze us.

  1. The brain gives us excessive confidence in our own capabilities so that we can act (overconfidence bias, optimism bias, actor-observer bias, Dunning-Kruger effect). This is why we often think we can handle a conflict that, in the end, gets out of hand.

  2. To keep us focused on action, the brain favors what is immediate and close over what is distant in time or space (hyperbolic discounting). We value things more in the present than in the future. This makes us abandon our diet, because the reward of eating a pastry right now feels far more irresistible than the reward of losing weight, which will only arrive months from now (a worked example follows this list).

  3. The brain motivates us to complete tasks in which we have already invested time and energy. This helps us finish things, even when we have plenty of reasons to give up. For example, the sunk cost fallacy leads a student to keep studying a degree they hate because they have already spent two years on it. The better decision would be to let go (which this bias makes difficult) and find a genuinely exciting career, rather than dragging through two or three more years of a boring course (more information on the decision-making process).

  4. The brain helps us avoid irreversible decisions (status quo bias). If we have to choose, we usually pick the option perceived as less risky or the one that preserves the status quo: “better the devil you know.” This makes it hard to leave our comfort zone and is a clear enemy of innovation.
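One common way to formalize the present-over-future preference in item 2 is the hyperbolic discounting model, in which a reward of size A delayed by D time units feels today like V = A / (1 + kD). The numbers below are purely illustrative assumptions: with a discount rate of k = 0.1 per day, a small pastry available right now ends up with the same subjective value as a ten-times-larger weight-loss reward that is 90 days away:

```latex
V = \frac{A}{1 + kD},
\qquad
V_{\text{pastry}} = \frac{10}{1 + 0.1 \cdot 0} = 10,
\qquad
V_{\text{diet}} = \frac{100}{1 + 0.1 \cdot 90} = 10
```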

When we have to choose what to remember

We can only afford to remember the bits of information that are likely to be useful in the future. We constantly have to bet on what to remember and what to forget. We highlight the following types of memory bias:

  1. We reinforce memories after a related event occurs and, in the process, some details of the memory can change without our being aware of it (reconstructive memory, or false memory bias). For example, some people who were not present at the 9/11 attacks were, a year later, completely convinced that they had actually been there, as demonstrated by researchers of the 9/11 Memory Consortium. Closer to home, this is why when you run into a friend you haven’t seen for a while and you both recall a shared event, your versions may not match. Nobody is lying, at least not consciously.

  2. We discard specific data to form generalities (implicit stereotype bias). This happens out of necessity, but the result of lumping everything together includes implicit associations, stereotypes, and prejudice. For example, if we think about how Spain performed in the most recent World Cup, we only remember how bad it was and forget the specific good moments. And when we vote for a political party, we are guided by what we associate with “conservatives” and “liberals” rather than by a concrete program of action.

  3. We reduce events and lists to their key elements. Since it is difficult to reduce them to generalities, we select a few elements to represent the whole. For example, after watching a movie, the peak-end rule leaves us with an overall impression based on the moment of highest emotional intensity and on how it ended (see the sketch after this list).

  4. We store memories according to the context in which they were acquired, regardless of their value. For example, the Google effect leads us to forget any information we believe we can find on the internet. Even if the information is relevant, we forget it anyway because we can easily look it up again. This is why we no longer memorize telephone numbers or the directions to a specific place.
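A small sketch of the peak-end rule in Python; the moment-by-moment ratings are invented for the example:

```python
# Peak-end rule sketch: the remembered evaluation of an experience is well
# approximated by the average of its most intense moment and its final
# moment, largely ignoring everything in between (and its duration).
moments = [3, 8, 2, 9, 4]  # emotional intensity over time (illustrative values)

actual_mean = sum(moments) / len(moments)
remembered = (max(moments) + moments[-1]) / 2  # peak + end approximation

print(f"moment-by-moment average: {actual_mean:.1f}")  # 5.2
print(f"peak-end approximation:   {remembered:.1f}")   # 6.5
```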

A video on the BBC website also explains what cognitive biases are and how they influence our perception of reality.


The origin of the idea of heuristics and cognitive biases

During the 19th century, human decision-making was assumed to be essentially rational: a utility-maximizing human being carrying out near-perfect cost-benefit evaluations in every situation. In 1971, however, Tversky and Kahneman published the study “Belief in the law of small numbers”, which challenged the prevailing decision-making model. In it, 84 participants at the annual meetings of the Mathematical Psychology Society and the American Psychological Association were asked about the robustness of statistical estimates and the replicability of results. Although the mathematical psychologists who took part had the background and skills to answer the questions correctly, most of them got the answers completely wrong.

This study suggested that people, even well-trained scientists, hold strong but mistaken intuitions about random sampling: we assume that a small random sample is highly representative of the entire population, making a reasoning error without noticing it. For example: “The average intelligence quotient (IQ) of the sixth-grade students of the elementary schools in a given city is 100. You select a random sample of 50 sixth-grade students. The first child in the sample has an IQ of 150. What is the expected average IQ of the whole sample of 50 children?”

The correct answer is 101, yet most people answer 100, assuming that a high outlier (IQ = 150) will probably be compensated by a low outlier elsewhere in the sample. But chance does not work that way: deviations are not compensated, they are merely diluted as the sample grows. It is therefore a mistake to assume that the law of large numbers (which explains why the average of a large random sample is close to the average of the entire population) also applies to small samples.
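The arithmetic behind the answer: the first child’s IQ is fixed at 150, while the expected IQ of each of the remaining 49 children is still the population mean of 100, so the +50 deviation is diluted across the 50 children rather than cancelled out:

```latex
E[\bar{X}] = \frac{150 + 49 \times 100}{50} = \frac{5050}{50} = 101
```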

This belief gives rise to the “gambler’s fallacy”, reported in earlier experiments (Tune, 1976), which makes us believe that a streak is more likely to end than pure chance dictates. For example, when predicting the results of flipping a coin, we expect the proportions of heads and tails to be very similar. This is not necessarily true when the number of tosses is small.
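A minimal coin-toss simulation of this point, written in Python with only the standard library (the sample sizes and random seed are illustrative choices):

```python
import random

# Law of large numbers sketch: with few tosses the proportion of heads often
# strays far from the expected 0.5; with many tosses it converges. Early
# streaks are not "compensated" by later tosses, they are simply diluted.
random.seed(42)  # fixed seed so the output is reproducible

for n_tosses in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    print(f"{n_tosses:>9} tosses: proportion of heads = {heads / n_tosses:.4f}")
```

With only 10 tosses the proportion can easily land at 0.3 or 0.7; with a million it sits very close to 0.5.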

Tversky and Kahneman highlight how the cognitive bias that makes us believe in a “law of small numbers” can badly distort the conclusions drawn in science, and they encourage researchers to replace intuition with computation as much as possible.

Building on this work, Tversky and Kahneman published further studies in the same field (in what they called the “heuristics and biases program”), arguing that most important human decisions are based on a limited number of heuristic principles rather than on a formal analysis of the problem, contradicting the rational decision-making model of the time. The new approach triggered an avalanche of research in cognitive psychology and spread to other fields such as economics, law, sociology, medicine, and political science. This body of research was instrumental in Daniel Kahneman receiving the Nobel Prize in Economics in 2002.

In summary, our brain employs mental shortcuts to make decisions faster and to react, develop, and adapt to its surroundings more nimbly. In doing so, it generates a series of mental strategies, along with other psychological effects, that increase our chances of survival.

We have described some of the most common cognitive biases, but is knowing them enough to avoid being tricked by them? Regrettably, no. Our brain will keep employing heuristics and mental shortcuts and, inevitably, will sometimes make systematic errors nonconsciously, and these will affect our decision-making. Sometimes we can recognize these biases, but most of the time we cannot, and our point of view is affected anyway.
