
A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, which may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.[1][2][3] Implicit in the concept of a "pattern of deviation" is a standard of comparison with what is normatively expected; this may be the judgment of people outside those particular situations, or may be a set of independently verifiable facts. A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics.

Some cognitive biases are presumably adaptive, for example, because they lead to more effective actions in a given context or enable faster decisions when timeliness is more valuable than accuracy (heuristics). Others presumably result from a lack of appropriate mental mechanisms (bounded rationality), or simply from a limited capacity for information processing.

Overview

[Image: Daniel Kahneman]

Bias arises from various processes that are sometimes difficult to distinguish. These include information-processing shortcuts (heuristics),[4] mental noise and the mind's limited information-processing capacity,[5] emotional and moral motivations,[6] and social influence.[7]

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972[8] and grew out of their experience of people's innumeracy, or inability to reason intuitively with greater orders of magnitude. They and their colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. They explained these differences in terms of heuristics, rules which are simple for the brain to compute but introduce systematic errors.[8] One example is the availability heuristic, in which the ease with which something comes to mind is used to indicate how often (or how recently) it has been encountered.

These experiments grew into the heuristics and biases research program, which spread beyond academic psychology into other disciplines including medicine and political science.[9] It was a major factor in the emergence of behavioral economics, earning Kahneman a Nobel Prize in 2002.[10] Tversky and Kahneman developed prospect theory as a more realistic alternative to rational choice theory.[citation needed]

Critics of Kahneman and Tversky, such as Gerd Gigerenzer, argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive of rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.[11]
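As a toy illustration of the availability heuristic described above, the following sketch (hypothetical event names and numbers, not taken from any of the cited studies) simulates a memory in which vivid events are easier to retrieve than mundane ones, so a frequency estimate based on ease of recall systematically overstates the vivid category:

```python
import random

random.seed(42)

# Hypothetical values for illustration only: the relative frequency of each
# event type in experience, and an assumed "vividness" multiplier that makes
# an event easier to recall.
TRUE_RATES = {"car crash": 0.97, "plane crash": 0.03}
VIVIDNESS = {"car crash": 1.0, "plane crash": 20.0}

# Build a memory trace of 10,000 events matching the true rates.
memory = [event for event, rate in TRUE_RATES.items()
          for _ in range(int(rate * 10_000))]

# Retrieval is biased: the chance of recalling an event is weighted by its
# vividness, not just by how often it was encountered.
weights = [VIVIDNESS[event] for event in memory]
recalled = random.choices(memory, weights=weights, k=1_000)

for event in TRUE_RATES:
    estimate = recalled.count(event) / len(recalled)
    print(f"{event}: true rate {TRUE_RATES[event]:.2f}, "
          f"recall-based estimate {estimate:.2f}")
```

The distortion here is entirely in the retrieval step: the estimate tracks how easily events come to mind rather than how often they occurred, which is exactly the substitution the heuristic describes.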

Types of cognitive biases

Biases can be distinguished on a number of dimensions. For example, there are biases specific to groups (such as the risky shift) as well as biases at the individual level.

Some biases affect decision-making, where the desirability of options has to be considered (e.g., the sunk cost fallacy). Others, such as illusory correlation, affect judgments of how likely something is, or of whether one thing is the cause of another. A distinctive class of biases affects memory,[12] such as consistency bias (remembering one's past attitudes and behavior as more similar to one's present attitudes).

Some biases reflect a subject's motivation,[13] for example, the desire for a positive self-image leading to egocentric bias[14] and the avoidance of unpleasant cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories, and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", as motivated reasoning can involve a state of arousal.

Among the "cold" biases, some are due to ignoring relevant information (e.g., neglect of probability), whereas others involve a decision or judgment being affected by irrelevant information (for example, the framing effect, where the same problem receives different responses depending on how it is described) or by giving excessive weight to an unimportant but salient feature of the problem (e.g., anchoring).

The fact that some biases reflect motivation, and in particular the motivation to have positive attitudes toward oneself,[14] accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias, projection bias). There are also biases in how subjects evaluate in-groups or out-groups: evaluating in-groups as more diverse and "better" in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Some cognitive biases belong to the subgroup of attentional biases, which refer to the paying of increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests used to measure these biases are the Stroop task[15][16] and the dot-probe task; a rough sketch of a Stroop-style trial appears after the list below.

The following is a list of the more commonly studied cognitive biases. For other noted biases, see List of cognitive biases.

Framing by using a too-narrow approach and description of the situation or issue.
Hindsight bias, sometimes called the "I-knew-it-all-along" effect, is the inclination to see past events as having been predictable.
Fundamental attribution error is the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
Confirmation bias is the tendency to search for or interpret information in a way that confirms one's preconceptions; this is related to the concept of cognitive dissonance.
Self-serving bias is the tendency to claim more responsibility for successes than for failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.
Belief bias is when one's evaluation of the logical strength of an argument is biased by one's belief in the truth or falsity of the conclusion.
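The following console sketch is a loose, hypothetical mock-up of the Stroop task mentioned above; real Stroop studies render the word in coloured ink with calibrated presentation software, which a text console cannot do, so this only mimics the structure of congruent versus incongruent trials and the timing of responses:

```python
import random
import time

COLOURS = ["red", "green", "blue", "yellow"]

def run_trial(congruent: bool) -> float:
    """Run one trial and return the response time in seconds."""
    word = random.choice(COLOURS)
    ink = word if congruent else random.choice([c for c in COLOURS if c != word])
    # A real task would display the word rendered in the ink colour;
    # here the ink is merely described in text.
    print(f'\nThe word "{word.upper()}" is printed in {ink} ink.')
    start = time.perf_counter()
    answer = input("Name the INK colour (not the word): ").strip().lower()
    elapsed = time.perf_counter() - start
    verdict = "correct" if answer == ink else f"wrong (ink was {ink})"
    print(f"{verdict} in {elapsed:.2f} s")
    return elapsed

if __name__ == "__main__":
    # Incongruent trials typically take longer: the Stroop interference effect.
    run_trial(congruent=True)
    run_trial(congruent=False)
```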
A 2012 Psychological Bulletin article suggests that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism.[17] The article shows that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, Bayesian conservatism, illusory correlations, the better-than-average and worse-than-average effects, the subadditivity effect, exaggerated expectation, overconfidence, and the hard-easy effect.
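The core of this idea can be sketched in a few lines. The following simulation is my own toy reconstruction of the general mechanism, not the model from the article: unbiased noise is added when converting objective evidence into a subjective estimate that is bounded to the probability scale, and averaged over many judgments the extreme values regress toward the middle, the signature of regressive conservatism:

```python
import random

random.seed(0)

def subjective_estimate(evidence: float, noise_sd: float = 0.15) -> float:
    """Noisy conversion of objective evidence into a bounded subjective estimate."""
    e = evidence + random.gauss(0.0, noise_sd)     # unbiased memory noise
    return min(1.0, max(0.0, e))                   # estimates cannot leave [0, 1]

N = 100_000
for true_value in (0.05, 0.50, 0.95):
    mean_est = sum(subjective_estimate(true_value) for _ in range(N)) / N
    print(f"objective evidence {true_value:.2f} -> mean estimate {mean_est:.3f}")
# Pattern: 0.05 is over-estimated, 0.95 under-estimated, 0.50 unbiased,
# so judgments look conservative even though the noise itself is symmetric.
```

Because the noise is unbiased, the distortion comes entirely from the bounds of the scale: near the edges, noise can only push an estimate inward.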

Practical significance

Many social institutions rely on individuals to make rational judgments. A fair jury trial, for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly, and resist fallacies such as appeal to emotion. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things.[18] However, they fail in systematic, directional ways that are predictable.[19]

Cognitive biases are also related to the persistence of superstition and to large social issues such as prejudice, and they work as a hindrance to public acceptance of non-intuitive scientific knowledge.[20]

See also

List of biases in judgment and decision making
Bounded rationality
Cognitive bias mitigation
Cognitive dissonance
Cognitive distortion
Cognitive psychology
Cognitive traps for intelligence analysis
Critical thinking
Emotional bias
Evolutionary psychology
Expectation bias
Fallacy
Prejudice
Realism theory

References

1. Kahneman, D.; Tversky, A. (1972). "Subjective probability: A judgment of representativeness". Cognitive Psychology 3 (3): 430–454. doi:10.1016/0010-0285(72)90016-3.
2. Baron, J. (2007). Thinking and Deciding (4th ed.). New York, NY: Cambridge University Press.
3. Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York, NY: HarperCollins.
4. Kahneman, D.; Slovic, P.; Tversky, A. (1982). Judgment Under Uncertainty: Heuristics and Biases (1st ed.). Cambridge University Press.
5. Simon, H. A. (1955). "A behavioral model of rational choice". The Quarterly Journal of Economics 69 (1): 99–118. doi:10.2307/1884852.
6. Pfister, H.-R.; Böhm, G. (2008). "The multiplicity of emotions: A framework of emotional functions in decision making". Judgment and Decision Making 3: 5–17.
7. Wang, X. T.; Simons, F.; Brédart, S. (2001). "Social cues and verbal framing in risky choice". Journal of Behavioral Decision Making 14 (1): 1–15. doi:10.1002/1099-0771(200101)14:1<1::AID-BDM361>3.0.CO;2-N.
8. a b Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Gilovich, T.; Griffin, D.; Kahneman, D. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 51–52. ISBN 978-0-521-79679-8.
9. Gilovich, Thomas; Griffin, Dale (2002). "Heuristics and Biases: Then and Now". In Gilovich, T.; Griffin, D.; Kahneman, D. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 1–4. ISBN 978-0-521-79679-8.
10. Nobelprize.org
11. Gigerenzer, G. (2006). "Bounded and Rational". In Stainton, R. J. Contemporary Debates in Cognitive Science. Blackwell. p. 129. ISBN 1-4051-1304-9.
12. Schacter, D. L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience". American Psychologist 54 (3): 182–203. doi:10.1037/0003-066X.54.3.182. PMID 10199218.
13. Kunda, Z. (1990). "The Case for Motivated Reasoning". Psychological Bulletin 108 (3): 480–498. doi:10.1037/0033-2909.108.3.480. PMID 2270237.
14. a b Hoorens, V. (1993). "Self-enhancement and Superiority Biases in Social Comparison". In Stroebe, W.; Hewstone, M. European Review of Social Psychology 4. Wiley.
15. Jensen, A. R.; Rohwer, W. D. (1966). "The Stroop color-word test: a review". Acta Psychologica 25 (1): 36–93. doi:10.1016/0001-6918(66)90004-7. PMID 5328883.
16. MacLeod, C. M. (March 1991). "Half a century of research on the Stroop effect: an integrative review". Psychological Bulletin 109 (2): 163–203. doi:10.1037/0033-2909.109.2.163. PMID 2034749. http://content.apa.org/journals/bul/109/2/163.
17. Hilbert, Martin (2012). "Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making". Psychological Bulletin 138 (2): 211–237; free access to the study at martinhilbert.net/HilbertPsychBull.pdf.
18. Sutherland, Stuart (2007). Irrationality: The Enemy Within (2nd ed.; 1st ed. 1994). Pinter & Martin. ISBN 978-1-905177-07-3.
19. Ariely, Dan (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. HarperCollins. p. 304. ISBN 978-0-06-135323-9.
20. Radden, Günter; Cuyckens, H. (2003). Motivation in Language: Studies in Honor of Günter Radden. John Benjamins. p. 275. ISBN 978-1-58811-426-6. http://books.google.com/books?id=qzhJ3KpLpQUC&pg=PA275&dq=essentialism+definition&lr=&cd=3#v=onepage&q=essentialism%20definition&f=false.

Further reading

Eiser, J. R.; van der Pligt, Joop (1988). Attitudes and Decisions. London: Routledge. ISBN 978-0-415-01112-9.
Fine, Cordelia (2006). A Mind of Its Own: How Your Brain Distorts and Deceives. Cambridge, UK: Icon Books. ISBN 1-84046-678-2.
Gilovich, Thomas (1993). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press. ISBN 0-02-911706-2.
Haselton, M. G.; Nettle, D.; Andrews, P. W. (2005). "The evolution of cognitive bias". In Buss, D. M. (Ed.), Handbook of Evolutionary Psychology (pp. 724–746). Hoboken: Wiley.
Heuer, Richards J. Jr. (1999). Psychology of Intelligence Analysis. Central Intelligence Agency. http://www.au.af.mil/au/awc/awcgate/psych-intel/art5.html
Kahneman, D.; Slovic, P.; Tversky, A. (Eds.) (1982). Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press. ISBN 978-0-521-28414-1.
Kahneman, Daniel (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. ISBN 978-0-374-27563-1.
Kida, Thomas (2006). Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. New York: Prometheus. ISBN 978-1-59102-408-8.
Nisbett, R.; Ross, L. (1980). Human Inference: Strategies and Shortcomings of Human Judgment. Englewood Cliffs, NJ: Prentice-Hall. ISBN 978-0-13-445130-5.
Piattelli-Palmarini, Massimo (1994). Inevitable Illusions: How Mistakes of Reason Rule Our Minds. New York: John Wiley & Sons. ISBN 0-471-15962-X.
Stanovich, Keith (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven, CT: Yale University Press. ISBN 978-0-300-12385-2.
Sutherland, Stuart (2007). Irrationality: The Enemy Within (2nd ed.; 1st ed. 1994). Pinter & Martin. ISBN 978-1-905177-07-3.
Tavris, Carol; Aronson, Elliot (2007). Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts. Orlando, FL: Harcourt Books. ISBN 978-0-15-101098-1.
Funder, David C.; Krueger, Joachim I. (June 2004). "Towards a balanced social psychology: Causes, consequences, and cures for the problem-seeking approach to social behavior and cognition". Behavioral and Brain Sciences 27 (3): 313–376. PMID 15736870. http://132.74.59.154/internal/wiki/images/3/35/%D7%A4%D7%A1%D7%99%D7%9B%D7%95%D7%9C%D7%95%D7%92%D7%99%D7%94_3.pdf. Retrieved 3 May 2011.

External links

The Roots of Consciousness: To Err Is Human
Cognitive bias in the financial arena
A Visual Study Guide to Cognitive Biases
