{{Banner|wip}}
A cognitive bias is a deviation from how one should reason.<ref>Baron, J. (2006). ''Thinking and Deciding''. Cambridge University Press. https://doi.org/10.1017/cbo9780511840265</ref> Such deviations can lead to errors in judgment, resulting in [[bias]] and [[noise]] in forecasts. Many cognitive biases have been found to be systematic, meaning that when put in the same context, many people commit the same error in the same way.<ref>Kahneman, D. (2011). ''Thinking, Fast and Slow''. Farrar, Straus and Giroux. https://psycnet.apa.org/record/2011-26535-000</ref> The existence of cognitive biases is a central concern for making better forecasts.
==Errors in reasoning==
A cognitive bias is not necessarily a deviation from the correct answer, but a deviation from a normative model of how one should reason. Because there is disagreement over how one should reason, a cognitive bias from one perspective can be correct reasoning from another. There is ongoing research and debate into the best ways to reason, but the literature generally focuses on a few normative models, including [[Bayes Theorem]], Formal Logic, Probability Theory, and Expected Utility Theory, among others. This list of models has been heavily criticized by various theorists and researchers, such as [[Nassim Taleb]], [[Gerd Gigerenzer]], [[Gary Klein]], and many others. Strictly speaking, some variation in the normative models used can improve forecasts, so long as the errors in the models are not correlated and the models are able to pick up on [[signal]]. See [[Fox vs Hedgehog]] and the [[Diversity Prediction Theorem]].
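The Diversity Prediction Theorem referenced above is an exact algebraic identity: the crowd's squared error equals the average individual squared error minus the diversity (the variance of the forecasts around the crowd mean). The sketch below verifies it numerically with made-up illustrative forecasts; the identity holds for any numbers you substitute.

```python
# Numerical check of the Diversity Prediction Theorem:
#   (crowd error)^2 = average individual squared error - diversity
# The forecasts and outcome below are hypothetical illustrative values.

forecasts = [0.2, 0.5, 0.9]   # hypothetical individual forecasts
outcome = 0.7                 # hypothetical realized value

crowd = sum(forecasts) / len(forecasts)
crowd_error = (crowd - outcome) ** 2
avg_individual_error = sum((f - outcome) ** 2 for f in forecasts) / len(forecasts)
diversity = sum((f - crowd) ** 2 for f in forecasts) / len(forecasts)

# The identity holds exactly (up to floating-point rounding)
assert abs(crowd_error - (avg_individual_error - diversity)) < 1e-12
```

Because diversity is never negative, the crowd's squared error can never exceed the average individual squared error, which is why diverse, uncorrelated models can only help an aggregate forecast.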
==Examples of Cognitive Biases==
Since a cognitive bias is any systematic deviation from a normative model of reasoning, many distinct biases have been catalogued. Examples include:
- [[Affect Bias]]
- [[Anchoring]]
- [[Confirmation Bias]]
- [[Nonbelief in the law of large numbers|Disbelief in the law of large numbers]]
- [[Dunning-Kruger Effect]]
==Effects of cognitive bias==
Cognitive biases usually lead to [[bias]] when making a forecast.
Cognitive biases likely lead to flawed forecasts. The problem is compounded when multiple forecasts are combined and all of the forecasters reason in the same flawed way by using the same heuristic: the more people committing the same error in reasoning, the more it distorts the aggregate forecast. Since such correlation of reasoning patterns is expected, [[debiasing]] is an important component of improving forecasts.
Alternatively, the errors in reasoning that forecasters commit may not be correlated. For example, one forecaster may be overconfident and another underconfident; when this happens, the biases may cancel each other out.
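The contrast between correlated and uncorrelated errors can be made concrete with a small simulation. In the sketch below (all parameters are made-up assumptions for illustration), each forecaster's estimate is the true value plus an individual bias plus noise. When biases are independent, averaging largely cancels them; when every forecaster shares the same bias, the crowd inherits it in full.

```python
import random

random.seed(0)
TRUE_VALUE = 50.0
N_FORECASTERS, N_QUESTIONS = 100, 200

def mean_crowd_error(shared_bias):
    """Average absolute error of the crowd mean over many questions."""
    total = 0.0
    for _ in range(N_QUESTIONS):
        if shared_bias:
            # every forecaster commits the same error (correlated reasoning)
            biases = [5.0] * N_FORECASTERS
        else:
            # independent biases: some overconfident, some underconfident
            biases = [random.gauss(0.0, 5.0) for _ in range(N_FORECASTERS)]
        forecasts = [TRUE_VALUE + b + random.gauss(0.0, 2.0) for b in biases]
        crowd = sum(forecasts) / N_FORECASTERS
        total += abs(crowd - TRUE_VALUE)
    return total / N_QUESTIONS

err_independent = mean_crowd_error(shared_bias=False)
err_shared = mean_crowd_error(shared_bias=True)

# Independent biases mostly cancel; a shared bias does not.
assert err_independent < err_shared
```

This is why correlation of reasoning patterns, not individual error alone, is the key threat to a pooled forecast.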
== Reducing noise and bias ==
=== The crowd forecasting approach ===
If forecasts are sufficiently diverse and independent, they can be pooled, or aggregated, into a single crowd forecast that always outperforms the average individual forecast and often outperforms the best one. Crowd forecasting in this manner has helped public health experts at the Johns Hopkins Center for Health Security predict the spread and severity of infectious diseases such as COVID-19.<ref>https://bmcpublichealth.biomedcentral.com/articles/10.1186/s12889-021-12083-y</ref>
Aggregation methods range from simple averaging to an optimized weighted average in which the best-performing forecasters (as measured by their Brier score or calibration) receive more weight. Additional tweaks can further improve a crowd forecast, such as applying a form of decay in which only forecasts more recent than a given threshold are taken into account.
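The pooling methods above can be sketched as follows. This is a minimal illustration, not a production aggregator: the forecaster names, track records, and the 14-day recency cutoff are all made-up assumptions, and the weighting scheme (weight = 1 minus mean Brier score) is just one simple choice.

```python
def brier(prob, outcome):
    """Brier score for a binary forecast: lower is better."""
    return (prob - outcome) ** 2

# Hypothetical track records: lists of (past forecast, past outcome)
history = {
    "alice": [(0.8, 1), (0.7, 1), (0.2, 0)],
    "bob":   [(0.5, 1), (0.5, 0), (0.5, 1)],
}

# Mean Brier score per forecaster, turned into a weight (better score -> more weight)
scores = {name: sum(brier(p, o) for p, o in fs) / len(fs)
          for name, fs in history.items()}
weights = {name: 1.0 - s for name, s in scores.items()}

# Current forecasts on an open question: (forecaster, probability, age in days)
current = [("alice", 0.9, 0), ("bob", 0.6, 2), ("alice", 0.4, 30)]
MAX_AGE_DAYS = 14  # decay: drop forecasts older than this threshold

recent = [(name, p) for name, p, age in current if age <= MAX_AGE_DAYS]

simple = sum(p for _, p in recent) / len(recent)
weighted = (sum(weights[name] * p for name, p in recent)
            / sum(weights[name] for name, _ in recent))
```

Here the stale 30-day-old forecast is discarded by the decay rule, and the weighted pool leans toward the forecaster with the better historical Brier score.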
=== Individual forecasting training ===
== References ==
<references />
[[Category:Biases]]
[[Category:Concepts]]