Thinking, fast and slow

Thinking, fast and slow is a book by the eminent psychologist Daniel Kahneman that presents his view of how the mind works. It draws on recent developments in cognitive and social psychology, and includes as an appendix the article "Judgement under uncertainty: heuristics and biases", written with Amos Tversky, part of the work on judgement and decision-making for which he was awarded the Nobel Prize in economics. The word fast in the title refers to "system 1" thinking, which operates automatically with little or no effort and no sense of voluntary control. The word slow refers to "system 2", the mental activities that require concentration, effort and self-control. The book examines the evidence concerning the circumstances under which system 1 supplies false information to system 2.

Part I. Two systems

Part I presents the basic elements of Daniel Kahneman's two-system approach to judgement and choice. Its purpose is to introduce a vocabulary for thinking about the mind.

The cognitive effort and self-control of system 2 are shown to draw upon a limited resource of "mental energy", and even to deplete the glucose in the blood. The concept of "cognitive strain" is introduced as a response to effort and unmet demands; its absence is termed "cognitive ease". Cognitive ease is shown to be both a cause and a consequence of pleasant feelings: when in a good mood, people become more intuitive and more creative, but also less vigilant and more prone to logical errors.

System 1 is seen as conserving mental energy while maintaining and updating a model of its possessor's personal world by forming associations between regularly occurring events and outcomes. It operates on the assumption that "what you see is all there is" (WYSIATI), constructing the best story it can from the information that is available and making no allowance for the existence of information that it does not have. When information is scarce, as it often is, system 1 acts as a "machine for jumping to conclusions", putting together a coherent story without reservations about the quality and quantity of the information on which it is based. Much of the time, the coherent story that it creates is close enough to reality to provide a reasonable basis for action, but its dependence upon WYSIATI can lead to a wide variety of errors of judgement and choice.

Part II. Heuristics and biases

Part II explores some of the ways in which judgements and choices can be distorted by interactions between system 1 and system 2. The distortions are attributed either to system 2's "laziness" in resorting to an uncritical dependence upon system 1, or to its "ignorance" in being unaware of the shortcomings of system 1.

The deliberate use of heuristics to get rough-and-ready answers to difficult questions is a well-known system 2 strategy. An example, suggested by the eminent mathematician George Polya, is the substitution of an easier question [1] (for example, responding to the question "who will win next year's presidential election?" by giving the answer to the question "who has been doing best in this year's polls?"). As an example of the involuntary (system 1) use of a similar strategy, Daniel Kahneman cites Paul Slovic's "affect heuristic" [2], in which people let their likes and dislikes determine their beliefs about the world, as well as his own research (with Amos Tversky) on the "anchoring" and "availability" heuristics [3].

Part III. Overconfidence

Part IV. Choices

Part V. Two selves

Appendix A. Judgement under uncertainty

Appendix B. Choices, values and frames

Reviews

References

  1. George Polya: How to Solve It: A New Aspect of Mathematical Method, Princeton University Press, 2004
  2. Paul Slovic, Melissa Finucane, Ellen Peters, & Donald G. MacGregor: The Affect Heuristic, 2003. http://people.usd.edu/~xtwang/DM%28GuangHua%29/Readings%28GuangHua%29/AffectHeuristic.pdf
  3. Amos Tversky and Daniel Kahneman: Judgment under Uncertainty: Heuristics and Biases, Science, 1974. http://www.math.mcgill.ca/vetta/CS764.dir/judgement.pdf (reproduced as Appendix A of Thinking, Fast and Slow)