Cognitive Biases

A cognitive bias is a deviation from how one should reason.[1] Such deviations can lead to errors in judgment, resulting in bias and noise in forecasts. Many cognitive biases are systematic, meaning that when placed in the same context, many people commit the same error in the same way.[2] The existence of cognitive biases is therefore a central concern for making better forecasts.

Errors in reasoning

A cognitive bias is not necessarily a deviation from the correct answer, but a deviation from how one should reason. For example, since Bayes' Theorem is a Normative Model (a model of how one should reason), any deviation from it is considered a cognitive bias regardless of the truth value of the resulting judgment. To say that someone is wrong because they exhibited a cognitive bias would therefore be an instance of the Fallacy Fallacy. This distinction is important because in forecasting the correct answer is not known, yet it is still possible to identify cognitive biases, that is to say, errors in reasoning. This makes cognitive bias different from bias, which can only be identified once the true answer is known, though it is generally assumed that cognitive biases will lead to biases in forecasts.
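
For reference, Bayes' Theorem prescribes how a probability estimate should be updated in light of new evidence:

\[ P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)} \]

where \(P(H)\) is the prior probability of the hypothesis, \(P(E \mid H)\) is the likelihood of the evidence under that hypothesis, and \(P(H \mid E)\) is the posterior. A forecaster who ignores the prior \(P(H)\) when updating is committing Base-Rate Neglect, whether or not their final number happens to land near the truth.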

Due to disagreements about how one should reason, a cognitive bias from one perspective can be correct reasoning from another. There is ongoing research and debate into the best ways to reason, but the literature generally focuses on a few normative models, including Bayes' Theorem, Formal Logic, Probability Theory, and Expected Utility Theory, among others. This list of models has been heavily criticized by various theorists and researchers, such as Nassim Taleb, Gerd Gigerenzer, Gary Klein, and many others. Strictly speaking, some variation in the normative models used can improve forecasts, so long as the errors of the models are not correlated and the models are able to pick up on signal. See Fox vs Hedgehog and the Diversity Prediction Theorem, which is stated below.
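
The Diversity Prediction Theorem is an identity: for individual forecasts \(s_1, \dots, s_n\) of a true value \(\theta\), with crowd average \(\bar{s}\),

\[ (\bar{s} - \theta)^2 \;=\; \frac{1}{n}\sum_{i=1}^{n} (s_i - \theta)^2 \;-\; \frac{1}{n}\sum_{i=1}^{n} (s_i - \bar{s})^2 \]

In words: the squared error of the crowd's average forecast equals the average squared error of the individual forecasts minus the diversity (variance) of those forecasts. The crowd average is therefore never worse than the average individual, and it is strictly better whenever the forecasts disagree.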

Causes of bias

The most common reason for the existence of a cognitive bias is that the individual is using a heuristic to reason, because doing so is less cognitively demanding and faster than applying a Normative Model. For example, rather than finding a good base rate, a forecaster may consult their memory for similar cases they can think of. This is known as the Availability Heuristic, and a toy illustration of it is sketched below. Many such heuristics have been identified in the literature.
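
As a minimal sketch, assuming (for illustration only) that vivid event cases are ten times as likely to come to mind as unremarkable ones, the following simulation contrasts an estimate built from the full base rate with one built from whatever memories are most available:

```python
import random
from statistics import mean

random.seed(42)

BASE_RATE = 0.02      # true frequency of the event in the population
VIVIDNESS_BOOST = 10  # assumption: event cases are 10x easier to recall

# Population of past cases: True = the event occurred, False = it did not.
population = [random.random() < BASE_RATE for _ in range(100_000)]

# Normative approach: estimate from the full record, i.e. the base rate.
base_rate_estimate = mean(population)

# Availability Heuristic: sample "memories", with event cases
# disproportionately likely to come to mind because they are vivid.
weights = [VIVIDNESS_BOOST if case else 1 for case in population]
memories = random.choices(population, weights=weights, k=50)
availability_estimate = mean(memories)

print(f"Base-rate estimate:    {base_rate_estimate:.3f}")    # close to 0.02
print(f"Availability estimate: {availability_estimate:.3f}")  # typically far higher
```

Because memorable cases are oversampled, the availability-based estimate systematically overshoots the true base rate even though no individual memory is wrong.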

Examples of cognitive biases

Since a cognitive bias is merely a deviation from how one should reason, and there is no widespread agreement on how one should reason, a comprehensive list of biases is not possible. Indeed, the list of biases is always growing as researchers discover new ways in which people fail to reason according to some normative model. Nevertheless, a prioritized list of biases for the forecasting context is included below.

- Affect Bias

- Anchoring

- Availability Bias

- Base-Rate Neglect

- Confirmation Bias

- Disbelief in the law of large numbers

- Dunning-Kruger Effect

- Law of Small Numbers

- Recency Bias

- Representativeness Bias

- Selective Perception

Effects of cognitive bias

Cognitive biases lead to bias when making a forecast, and to both bias and noise when combining forecasts.
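
These two components of error are commonly summarized with the decomposition of mean squared error, where bias is the average error across forecasts and noise is the standard deviation of those errors:

\[ \text{MSE} \;=\; \text{Bias}^2 + \text{Noise}^2 \]

Reducing either component improves overall accuracy, but they call for different remedies: debiasing targets the first term, while aggregating independent forecasts targets the second.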

Cognitive biases likely lead to flawed forecasts. The problem is multiplied when multiple forecasts are combined and all the forecasters are reasoning in the same flawed way, for example by using the same heuristic. The more people commit the same error in reasoning, the more problematic it is for the combined forecast. Since such correlation in reasoning patterns is expected, debiasing is an important component of improving forecasts.

Alternatively, the errors in reasoning that forecasters commit may not be correlated. For example, one forecaster may be overconfident and another underconfident. When this happens, the biases may cancel each other out in the combined forecast, as the sketch below illustrates.
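
A minimal sketch of both cases, with made-up numbers: independent errors wash out under averaging, while a bias shared by every forecaster survives aggregation untouched:

```python
import random
from statistics import mean

random.seed(0)

TRUE_VALUE = 0.30   # the quantity being forecast
N_FORECASTERS = 100

# Case 1: independent errors -- each forecaster adds their own noise.
independent = [TRUE_VALUE + random.gauss(0, 0.10) for _ in range(N_FORECASTERS)]

# Case 2: correlated errors -- everyone uses the same flawed heuristic,
# which adds a shared bias on top of the individual noise.
SHARED_BIAS = 0.15
correlated = [TRUE_VALUE + SHARED_BIAS + random.gauss(0, 0.10)
              for _ in range(N_FORECASTERS)]

print(f"True value:                   {TRUE_VALUE:.3f}")
print(f"Independent errors, averaged: {mean(independent):.3f}")  # close to 0.30
print(f"Correlated errors, averaged:  {mean(correlated):.3f}")   # near 0.45
```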

References

  1. Baron, J. (2006). Thinking and Deciding. Cambridge University Press. https://doi.org/10.1017/cbo9780511840265
  2. Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.