Saturday, June 16, 2012

Thinking, Fast and Slow: Chapters 13, 14 and 15

Nobel Prize recipient Daniel Kahneman is the author of this book on the behavioural sciences. Combining his own lifelong research with that of many other leaders in the field, he discusses some of the systematic mental glitches that cause us to stray from rationality, often completely unbeknownst to us. Thinking, Fast and Slow is full of illustrative experiments and examples that you can even try on yourself!

These chapters discuss some of our peculiar behaviours relating to the availability and representativeness biases. The availability bias refers to our judging the probability of events by how easily we can recall examples of them. For example, we believe accidental deaths are far more common than they are (whereas in our society, death by disease occurs about 18 times more frequently than death by accident) because the media constantly provides us with fresh stories of accidents that we can thus recall easily.

This bias also leads to high demand for insurance after disastrous events. In some cases, an availability cascade can occur, in which an incident made salient by the availability bias is picked up by more and more people (driven by emotion and amplified by the media) until political action is taken on the basis of recallability rather than logic.

Today, for example, terrorism is used to incite fear and drive political action. But even in countries where incidents of terrorism are considered high (e.g. Israel), the weekly number of casualties due to terrorism doesn't come close to the number of traffic deaths!

Our representativeness bias causes us to use stereotypes to judge probabilities. In many cases such stereotypes can lead to reasonable probability estimates (e.g. young men really are more likely to speed than elderly women), but in other cases they can result in logical fallacies.

For example, when subjects are given a description of a person and then asked to select that person's occupation from a list, they immediately jump to the occupation that best matches the stereotype evoked by the description. Unfortunately, this completely ignores base rates, that is, the proportion of people in society who belong to each occupation. This is our System 1 at work, using its "What You See Is All There Is" mentality.
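
To see how much the base rate can matter, here is a minimal sketch in Python with entirely made-up numbers (the population shares and the strength of the stereotype match are assumptions for illustration). Bayes' rule weighs the stereotype fit against the base rate, and the base rate can easily win:

```python
# Toy base-rate example with invented numbers: a description "sounds like"
# a librarian, but farmers vastly outnumber librarians.
base_rate = {"librarian": 0.002, "farmer": 0.020}  # assumed population shares
fit = {"librarian": 0.8, "farmer": 0.2}            # assumed P(description | occupation)

# Bayes' rule: P(occupation | description) is proportional to
# P(description | occupation) * P(occupation).
posterior = {occ: fit[occ] * base_rate[occ] for occ in base_rate}
total = sum(posterior.values())
for occ in posterior:
    print(f"{occ}: {posterior[occ] / total:.2f}")
# Prints librarian: 0.29, farmer: 0.71 -- the stereotype alone points
# to "librarian", but the base rate makes "farmer" far more probable.
```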

An example of such a fallacy is as follows. When given a description of someone who could be construed as a feminist, subjects actually rate "feminist bank teller" as a more probable description of the person than simply "bank teller". This, of course, makes no sense, since the former is a subset of the latter; this error is known as the conjunction fallacy!
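
As a quick sanity check on the subset argument, here is a small sketch with an invented toy population (the 5% and 30% rates are arbitrary assumptions): counting directly shows the conjunction can never be the more probable description.

```python
import random

random.seed(0)
# Invented toy population: each person is (is_bank_teller, is_feminist).
people = [(random.random() < 0.05, random.random() < 0.30) for _ in range(100_000)]

p_teller = sum(teller for teller, _ in people) / len(people)
p_both = sum(teller and feminist for teller, feminist in people) / len(people)

# Every feminist bank teller is also a bank teller, so this must hold.
assert p_both <= p_teller
print(f"P(bank teller)          = {p_teller:.4f}")
print(f"P(feminist bank teller) = {p_both:.4f}")
```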
