Nobel laureate Daniel Kahneman wrote this book on the behavioural sciences. Combining his own lifelong research with that of many other leaders in the field, he discusses some of the systematic mental glitches that cause us to stray from rationality, often completely unbeknownst to us. Thinking, Fast and Slow is full of illustrative experiments and examples that you can even try on yourself!
Because of our propensity to create causal links, we often have the illusion of understanding things we really don't. System 1 builds stories out of only what we hear or "know" (WYSIATI: what you see is all there is), filling in the blanks in ways that may not be correct.
This can lead to hindsight bias, where we think we knew something all along only because our story is coherent - even though our views may have changed (to fit the story) without our knowing! When subjects were asked to make predictions and then, after the events, asked to reconstruct those predictions, they were way off in their reconstructions, believing themselves to be more prescient than they actually were.
Hindsight bias in turn leads to judging outcomes rather than processes, which produces all sorts of inefficiencies. Praise is heaped on the lucky and scorn on the unlucky, and processes are altered in ways that lower overall outcomes. For example, a surgeon may order extra tests or turn down risky procedures because a jury, judging with hindsight, will make an adverse outcome reflect poorly on him (and on his ability to earn a living). Agents get particularly singled out: they are blamed for poor outcomes but receive little credit (it is allocated to the principal) when outcomes are good.
Kahneman takes shots at books like Built To Last that attempt to draw systematic inferences from what are mostly lucky successes. Since the study for that book was completed, the companies featured in it have shown only average profitability and returns. Humans look for causal narratives (e.g. "the great firms got complacent!"), but Kahneman argues it is simply regression to the mean.
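To see regression to the mean in action, here is a tiny simulation (my own illustration, not from the book): every firm has identical underlying quality, and observed performance is just quality plus luck. Select the top performers in one period and, on average, they fall back toward the pack in the next, with no change in the firms themselves.

```python
import random

random.seed(0)

# Hypothetical illustration: every "firm" has the same underlying quality;
# observed performance = quality + luck (random noise).
N_FIRMS = 1000
quality = 0.0          # identical true skill for every firm
noise_sd = 1.0

period1 = [quality + random.gauss(0, noise_sd) for _ in range(N_FIRMS)]
period2 = [quality + random.gauss(0, noise_sd) for _ in range(N_FIRMS)]

# Pick the "great" firms: roughly the top 10% of performers in period 1.
cutoff = sorted(period1, reverse=True)[N_FIRMS // 10]
winners = [i for i in range(N_FIRMS) if period1[i] >= cutoff]

avg_p1 = sum(period1[i] for i in winners) / len(winners)
avg_p2 = sum(period2[i] for i in winners) / len(winners)

print(f"Top firms, period 1 average: {avg_p1:.2f}")   # well above the mean
print(f"Same firms, period 2 average: {avg_p2:.2f}")  # back near the mean (0)
```

No complacency story is needed: the "great" firms were selected partly for their luck, and luck does not repeat.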
Finally, Kahneman makes the point that simple statistical formulas routinely outperform human expert predictions. Therefore, instead of relying on intuition to reach conclusions (such as a medical diagnosis or a hire from a pool of job candidates), we should identify the factors that matter for the decision, score each one separately, and combine the scores mechanically, setting intuition aside. This produces less biased, more consistent choices.
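Here is a minimal sketch of that kind of mechanical scoring applied to hiring (the trait names and scores are my own hypothetical example, not Kahneman's): pick a handful of relevant traits, rate each candidate on every trait independently on a fixed scale, and commit to the highest total rather than an overall impression.

```python
# Hypothetical traits; in practice you would choose the few factors that
# actually predict success in the role.
TRAITS = ["technical skill", "reliability", "communication",
          "initiative", "teamwork", "domain knowledge"]

def total_score(scores: dict) -> int:
    """Sum the per-trait scores (each on a 1-5 scale); no intuitive override."""
    assert set(scores) == set(TRAITS), "score every trait, and only these traits"
    assert all(1 <= s <= 5 for s in scores.values())
    return sum(scores.values())

# Example candidates with made-up ratings.
candidates = {
    "Candidate A": {"technical skill": 5, "reliability": 3, "communication": 4,
                    "initiative": 2, "teamwork": 4, "domain knowledge": 3},
    "Candidate B": {"technical skill": 4, "reliability": 4, "communication": 4,
                    "initiative": 4, "teamwork": 3, "domain knowledge": 4},
}

best = max(candidates, key=lambda name: total_score(candidates[name]))
print(best, "wins with", total_score(candidates[best]), "points")
```

The point of the exercise is discipline: by scoring each factor on its own and adding them up, you deny your intuitive, story-building System 1 the chance to override the evidence.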