Cognitive Biases

From Forecasting Wiki
{{Banner|wip}}

A cognitive bias represents an error in reasoning, or a ''deviation from how one should reason''.<ref>[https://www.cambridge.org/highereducation/books/thinking-and-deciding/D61CB67A2638AAA41BF5C23DF7629C15#overview] - Baron, J. (2006). Thinking and Deciding. https://doi.org/10.1017/cbo9780511840265</ref> Cognitive biases can lead to errors in judgment, resulting in [[bias]] and [[noise]] in forecasts. Many cognitive biases have been found to be systematic, meaning that when put in the same context, many people commit the same error in the same way.<ref>[https://psycnet.apa.org/record/2011-26535-000] - Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.</ref> The existence of cognitive biases is a central concern for making better forecasts.


==Errors in reasoning==

A cognitive bias is not necessarily a deviation from the correct answer, but an error in reasoning. While there is no consensus on what constitutes ''correct reasoning'', we can identify cognitive biases by comparing observed reasoning with so-called [[Normative Model|normative models]] (models of how one should reason). One such normative model is Bayes' Theorem, and any deviation from it could be considered a cognitive bias, regardless of the truth value of the resulting judgment. To say that someone is wrong simply because they exhibited a cognitive bias would be an example of the [[Fallacy Fallacy]]. This distinction matters because in forecasting the correct answer is not yet known, yet it is still possible to identify cognitive biases, that is, errors in reasoning. This distinguishes cognitive biases from [[bias]], which can only be identified once the true answer is known, although it is generally assumed that cognitive biases will lead to biases in forecasts.

Due to disagreements about how one should reason, a cognitive bias from one perspective can be correct reasoning from another. There is ongoing research and debate into the best ways to reason, but a few normative models receive most of the attention in the literature, including [[Bayes Theorem]], Formal Logic, Probability Theory, and Expected Utility Theory, among others. These models have been heavily criticized by various theorists and researchers, such as [[Nassim Taleb]], [[Gerd Gigerenzer]], and [[Gary Klein]]. Strictly speaking, some variation in the normative models used can improve forecasts, so long as the errors in the models are not correlated and the models are able to pick up on [[signal]]. See [[Fox vs Hedgehog]] and the [[Diversity Prediction Theorem]].
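
The [[Diversity Prediction Theorem]] can be checked numerically: under squared error, the crowd's error equals the average individual error minus the diversity (variance) of the individual forecasts around the crowd mean. A minimal sketch, with made-up forecast values:

```python
# Diversity Prediction Theorem (squared error):
#   crowd error = average individual error - diversity
forecasts = [0.2, 0.5, 0.9]   # hypothetical probability forecasts
truth = 0.7                   # realized outcome

crowd = sum(forecasts) / len(forecasts)
crowd_error = (crowd - truth) ** 2
avg_individual_error = sum((f - truth) ** 2 for f in forecasts) / len(forecasts)
diversity = sum((f - crowd) ** 2 for f in forecasts) / len(forecasts)

# The identity holds exactly for squared error:
assert abs(crowd_error - (avg_individual_error - diversity)) < 1e-12
```

Because diversity is never negative, the crowd forecast can never do worse than the average individual forecast under squared error.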

==Causes of bias==

The most common reason for the existence of a cognitive bias is that the individual is using a [[heuristic]] to reason because it is less cognitively demanding, and faster, than using a [[Normative Model]]. For example, rather than finding a good [[base-rate]], a forecaster may consult their memory for similar cases they can think of. This is known as the Availability Heuristic. Many such heuristics have been identified in the literature.
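
To see how a heuristic judgment can depart from the normative answer, consider a textbook-style base-rate problem (all numbers hypothetical): a test that is 90% sensitive with a 10% false-positive rate, for a condition with a 1% base rate. Intuition anchored on the test's accuracy suggests a positive result makes the condition likely; Bayes' Theorem gives a far smaller posterior:

```python
# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical numbers for a screening-test-style problem.
prior = 0.01          # base rate of the condition
sensitivity = 0.90    # P(positive | condition)
false_pos = 0.10      # P(positive | no condition)

p_positive = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # 0.083 -- far below the intuitive ~0.9
```

Neglecting the 1% prior and answering "about 90%" is the Base-Rate Fallacy: a deviation from the normative model regardless of whether this particular patient turns out to have the condition.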

==Examples of Cognitive Biases==

Because a cognitive bias is defined as an error in reasoning, and there is no widespread agreement on how one should reason, a comprehensive list of biases is not possible. Indeed, the list of biases is always growing as researchers discover new ways in which people do not reason according to some normative model. The list below therefore includes only a few examples:

- [[Affect Bias]]

- [[Anchoring]]

- [[Availability Bias]]

- [[Base-Rate Neglect]]

- [[Confirmation Bias]]

- [[Nonbelief in the law of large numbers|Disbelief in the law of large numbers]]

- [[Dunning-Kruger Effect]]

- [[Law of Small Numbers]]

- [[Recency Bias]]

- [[Representativeness Bias]]

- [[Selective Perception]]

==Effects of cognitive bias==

Cognitive biases usually lead to [[bias]] when making a forecast, or [[bias]] and [[noise]] when combining forecasts.

Cognitive biases likely lead to flawed forecasts. The problem is compounded when multiple forecasts are combined and all the forecasters are reasoning in the same flawed way, for example by using the same heuristic. The more people commit the same error in reasoning, the more problematic it is for the combined forecast. Since such correlation of reasoning patterns is expected, [[debiasing]] is an important component of improving forecasts.

Alternatively, the errors in reasoning that forecasters commit may not be correlated. For example, one forecaster may be overconfident, and another underconfident. When this happens, the biases may cancel each other out.
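
The contrast between correlated and independent errors can be sketched with a small simulation (all parameters illustrative): each forecast's error is a shared draw, which survives averaging, plus an idiosyncratic draw, which tends to cancel out:

```python
import random

random.seed(0)
TRUTH = 0.6
N = 1000  # forecasters per crowd

def crowd_error(shared_sd: float, idio_sd: float) -> float:
    """Squared error of the mean forecast when each forecaster's
    error = one shared draw + an independent idiosyncratic draw."""
    shared = random.gauss(0, shared_sd)  # common bias, does not average out
    forecasts = [TRUTH + shared + random.gauss(0, idio_sd) for _ in range(N)]
    mean = sum(forecasts) / N
    return (mean - TRUTH) ** 2

# Average crowd error over many repetitions of each scenario.
independent = sum(crowd_error(0.0, 0.2) for _ in range(200)) / 200
correlated = sum(crowd_error(0.2, 0.2) for _ in range(200)) / 200
print(independent < correlated)  # True: shared errors do not cancel
```

With purely idiosyncratic errors the crowd's squared error shrinks roughly in proportion to 1/N; a shared error component puts a floor under it that no amount of averaging removes.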

== Reducing noise and bias ==

=== The crowd forecasting approach ===
If forecasts are sufficiently diverse and independent, they can be pooled, or aggregated, to form a single crowd forecast that outperforms the average individual forecast and often outperforms even the best individual. Crowd forecasting in this manner has helped public health experts at the Johns Hopkins Center for Health Security to predict the spread and severity of infectious diseases such as COVID-19.<ref>https://bmcpublichealth.biomedcentral.com/articles/10.1186/s12889-021-12083-y</ref>

Methods range from simple averaging to an optimized weighted average in which the best-performing forecasters (as measured by their Brier score or calibration) carry more weight. Additional tweaks can further improve a crowd forecast, such as applying a form of recency decay in which only forecasts more recent than a given threshold are taken into account.
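
A minimal sketch of such an aggregation pipeline, with hypothetical forecasts, weights, and cutoff (real platforms tune all of these choices empirically):

```python
from datetime import datetime, timedelta

# (probability forecast, forecaster's historical Brier score, timestamp)
now = datetime(2022, 6, 10)
forecasts = [
    (0.70, 0.10, now - timedelta(days=1)),
    (0.55, 0.25, now - timedelta(days=3)),
    (0.90, 0.20, now - timedelta(days=40)),  # stale: will be dropped
]

MAX_AGE = timedelta(days=30)  # recency threshold

recent = [(p, brier) for p, brier, t in forecasts if now - t <= MAX_AGE]

# Simple unweighted average of recent forecasts
simple = sum(p for p, _ in recent) / len(recent)

# Skill-weighted average: lower Brier score -> higher weight
weights = [1.0 / brier for _, brier in recent]
weighted = sum(p * w for (p, _), w in zip(recent, weights)) / sum(weights)

print(round(simple, 3), round(weighted, 3))  # 0.625 0.657
```

Here the skill weighting pulls the pooled forecast toward the historically better-calibrated forecaster; the inverse-Brier weight is just one simple choice of weighting scheme.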

=== Individual forecasting training ===


== References ==
<references />
[[Category:Biases]]
[[Category:Concepts]]

Latest revision as of 12:16, 10 June 2022
