Saturday, June 9, 2012

Thinking, Fast and Slow: Chapters 7, 8 and 9

Nobel laureate Daniel Kahneman wrote this book on the behavioural sciences. Combining his own lifelong research with that of many other leaders in the field, he discusses some of the systematic mental glitches that cause us to stray from rationality, often completely unbeknownst to us. Thinking, Fast and Slow is full of illustrative experiments and examples that you can even try on yourself!

System 1 is quick to jump to conclusions. If System 2 is lazy or otherwise occupied (with other tasks, for instance), we become susceptible to letting our intuition (System 1) guide our thoughts on all sorts of subjects. The problem is that System 1 has a number of weaknesses.

For one thing, System 1 seeks confirmatory evidence when it is presented with a question. For example, if we are asked "Is Greg friendly?", System 1 will attempt to recall a number of instances where Greg acted friendly, and will form a conclusion based on its ability to make this recollection. This can lead to a different answer than would be the case if the question were "Is Greg unfriendly?" So whereas scientists attempt to find evidence that contradicts their hypotheses, the mind works to confirm its hypotheses, which is not ideal.

Since System 1 jumps so quickly to conclusions, it also suffers from the Halo Effect: if we like (or dislike) one attribute of a person or thing, we are likely to like (or dislike) its other attributes as well. An elegant experiment demonstrated that people who saw the same description of a person came to different conclusions about whether they would like him, depending on the order in which the person's attributes were presented. For example, someone described as intelligent and stubborn could be seen as justifiably stubborn, whereas someone described as stubborn and intelligent would be seen as a jerk, because early impressions colour everything that follows.

System 1 also has trouble discerning the quality of sources. Strong impressions are formed whether or not a credible source is reporting on an event. Kahneman calls this What You See Is All There Is (WYSIATI), since System 1 doesn't look for what isn't present, but forms its opinions based only on what's available.

Finally, System 1 always wants to have an opinion, whether one is appropriate or not. If System 1 can't answer a question, it often substitutes a different question, one that it can answer! For example, if the question is "How much would you contribute to save an endangered species?", the question we might actually answer is "How much emotion do I feel when I think of dying dolphins?", with the answer then transformed into some dollar figure.
