Bias

From LessWrong
Bias, or cognitive bias, is a systematic deviation from rationality committed by our cognition — a specific, predictable error pattern in the human mind [1]. The heuristics and biases program in cognitive psychology has documented hundreds of reproducible errors, often large ones, and this continues to be a highly active area of investigation in cognitive psychology.

In our evolutionary past, it was not enough for a cognitive algorithm to solve a given problem correctly: the solution also had to satisfy a large number of constraints, such as time and energy costs. The algorithm did not need to be perfect, only good enough to ensure the survival and reproduction of the individual: “What selective pressures impact on decision mechanisms? Foremost is selection for making an appropriate decision in the given domain. This domain-specific pressure does not imply the need to make the best possible decision, but rather one that is good enough (a satisficing choice, as Herbert Simon, 1955, put it) and, on average, better than those of an individual’s competitors, given the costs and benefits involved.” [2]

The human brain therefore solves cognitive tasks through ‘shortcuts’ that work well in some cases but fail in others. Since the cognitive modules that perform these tasks are universal across the human species, how and where those shortcuts lead to mistakes is also regular. The study of why, how and where such errors arise is the field of cognitive bias research. Understanding cognitive biases and trying to defend against their effects has been a basic theme of Less Wrong since the days it was part of Overcoming Bias.

Starting points

Blog posts on the concept of "bias"

Blog posts about known cognitive biases

  • Scope Insensitivity - The human brain can't represent large quantities: an environmental measure that will save 200,000 birds doesn't conjure anywhere near a hundred times the emotional impact and willingness-to-pay of a measure that would save 2,000 birds.
  • Correspondence Bias, also known as the fundamental attribution error, refers to the tendency to attribute the behavior of others to intrinsic dispositions, while excusing one's own behavior as the result of circumstance.
  • Confirmation bias, or positive bias, is the tendency to look for evidence that confirms a hypothesis, rather than for disconfirming evidence.
  • Hindsight Bias describes the tendency for events to seem much more likely in hindsight than they could have been predicted to be beforehand.
  • Planning Fallacy - We tend to plan envisioning that everything will go as expected. Even assuming that such an estimate is accurate conditional on everything going as expected, things will not go as expected. As a result, we routinely see outcomes worse than the ex ante worst case scenario.
  • Conjunction Fallacy - Elementary probability theory tells us that the probability of one thing (we write P(A)) is necessarily greater than or equal to the conjunction of that thing and another thing (write P(A&B)). However, in the psychology lab, subjects' judgments do not conform to this rule. This is not an isolated artifact of a particular study design. Debiasing won't be as simple as practicing specific questions, it requires certain general habits of thought.
  • We Change Our Minds Less Often Than We Think - we all change our minds occasionally, but we don't constantly, honestly reevaluate every decision and course of action. Once you think you believe something, the chances are good that you already do, for better or worse.
  • Priming and Contamination - Even slight exposure to a stimulus is enough to change the outcome of a decision or estimate. See also Never Leave Your Room by Yvain, and Cached Selves by Salamon and Rayhawk.
  • Do We Believe Everything We're Told? - Some experiments on priming suggest that mere exposure to a view is enough to get one to passively accept it, at least until it is specifically rejected.
  • Illusion of Transparency - Everyone knows what their own words mean, but experiments have confirmed that we systematically overestimate how much sense we are making to others.
  • Self-Anchoring - Related to contamination and the illusion of transparency, we "anchor" on our own experience and underadjust when trying to understand others.
  • Affect Heuristic - Positive and negative emotional impressions exert a greater effect on many decisions than does rational analysis.
  • Evaluability - It's difficult for humans to evaluate an option except in comparison to other options. Poor decisions result when a poor category for comparison is used. Includes an application for cheap gift-shopping.
  • Unbounded Scales, Huge Jury Awards, and Futurism - Without a metric for comparison, estimates of, e.g., what sort of punitive damages should be awarded, or when some future advance will happen, vary widely simply due to the lack of a scale.
  • The Halo Effect - Positive qualities seem to correlate with each other, whether or not they actually do.
  • Asch's Conformity Experiment - The unanimous agreement of surrounding others can make subjects disbelieve (or at least, fail to report) what's right before their eyes. The addition of just one dissenter is enough to dramatically reduce the rates of improper conformity.
  • The Allais Paradox (and subsequent followups) - Offered choices between gambles, people make decision-theoretically inconsistent decisions.
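The conjunction rule behind the Conjunction Fallacy entry above — P(A) ≥ P(A&B) — can be checked numerically. A minimal Python sketch: the events and their base rates (0.30 and 0.60) are hypothetical stand-ins for the categories in a Linda-style problem, and the events are assumed independent purely for illustration.

```python
import random

random.seed(0)
trials = 100_000
count_a = 0   # occurrences of A
count_ab = 0  # occurrences of A and B together

for _ in range(trials):
    a = random.random() < 0.30  # event A, e.g. "is a bank teller" (illustrative rate)
    b = random.random() < 0.60  # event B, e.g. "is active in a movement" (illustrative rate)
    count_a += a
    count_ab += a and b

# The conjunction can never be more frequent than either conjunct alone.
assert count_ab <= count_a
print(count_a / trials, count_ab / trials)
```

However the probabilities are chosen, the count of A-and-B outcomes is a subset of the A outcomes, so the inequality holds by construction — which is exactly why judging the conjunction as more probable is an error.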

References

  1. Pohl, Rüdiger (ed.) (2005). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Psychology Press. p. 2.
  2. Buss, David (ed.) (2005). The Handbook of Evolutionary Psychology. Wiley, New Jersey. p. 778.
