What you could learn from ‘Thinking, Fast and Slow’ by Daniel Kahneman (2012, 300 pages)
What is intuition? How do humans think? Are humans rational or irrational and can we predict them?
In the past, economists built models that assumed human beings are perfectly rational (what behavioural scientists jokingly refer to as ‘Homo Economicus’). Daniel Kahneman, a Nobel Prize-winning psychologist, showed that Homo Economicus does not exist and that humans are actually highly irrational, but that their irrationality is predictable.
One of Daniel’s main findings was that humans have two decision-making processes, what he classified as System 1 and System 2.
- System 1 (intuition) is automatic, emotional and fast. It is often triggered by stress and is usually the first response to a new situation.
- System 2 (rational thinking) is deliberate, logical and slow. System 2 thinking takes effort and time.
Through a lifetime of experiments, Daniel Kahneman and his collaborator Amos Tversky demonstrated that System 1 thinking is very powerful, and often the default. System 1 relies on heuristics (rules of thumb) and produces predictable biases, some of which are listed below:
- The anchoring effect is our tendency to be influenced by irrelevant numbers. Subjects shown higher/lower numbers gave correspondingly higher/lower estimates.
- The availability heuristic is a mental shortcut in which people judge the probability of events by how easily examples come to mind.
- Optimistic bias, which “may well be the most significant of the cognitive biases.” This bias generates the illusion of control: the sense that we have substantial control over our lives.
- Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery given a 90% “survival” rate, while others were told the mortality rate is 10%. The first framing increased acceptance, even though the two situations are identical.
- The sunk cost fallacy shows that, rather than consider the odds that an incremental investment would produce a positive return, people tend to “throw good money after bad” and continue investing in projects with poor prospects that have already consumed significant resources. In part, this is to avoid feelings of regret.
- Overconfidence: people often overestimate how much they understand about the world and underestimate the role of chance in particular.
System 2 thinking is more deliberate and rational, but it is expensive – it requires considerable effort and concentration. System 2 can help by:
- Forcing you to think about what information you do not currently have. System 1 makes decisions only on the available information: what you see is all there is (Kahneman refers to this as WYSIATI)
- Comparing multiple options against a set of criteria
- Avoiding binary decision-making (i.e., yes or no) by allowing you to creatively come up with other options
Thinking, Fast and Slow is one of my top books of all time. It made me think about thinking. Here are the top lessons I learnt from the book:
- Be honest with yourself on your level of expertise – have you spent over 10,000 hours honing your skill in an environment with high-quality and rapid feedback? (if no, don’t rely on intuition)
- Understand your environment. Is your environment deterministic and stable (is it easy to link cause and effect)? (if no, don’t rely on intuition)
- Check your biases. Check yourself against the top biases when you are making important decisions
- Think about failure in advance. If you allow yourself and others to see how your idea could fail, you are more likely to be realistic and have good contingencies
- Reflect on your decisions. After one day, one week and one month, revisit your big decisions. Are you consistently right? If not, you need to adjust
You can find Thinking, Fast and Slow on Amazon UK here (all proceeds from the referral go to maintaining this site)