Summary of Thinking, Fast and Slow

PART I: TWO SYSTEMS

1: The Characters of the Story (System 1 & 2)

2: Attention and Effort

3: The Lazy Controller

4: The Associative Machine

5: Cognitive Ease

6: Norms, Surprises, and Causes

7: A Machine for Jumping to Conclusions

8: How Judgments Happen

9: Answering an Easier Question

PART II: HEURISTICS AND BIASES

10: The Law of Small Numbers

11: Anchors

12: The Science of Availability

13: Availability, Emotion, and Risk

Victims and near-victims are intensely concerned after a disaster, and they stay diligent… for a while. The result is a recurring cycle: disaster, concern, growing complacency.

We struggle to imagine what we’ve never seen before. We tend to assume that we can safely build beyond the high-water mark of rivers that periodically flood, because we forget that the river may some day flood beyond its earlier limits.

Strokes cause almost twice as many deaths as all accidents combined, yet we tend to be more afraid of accidents than of strokes. We should direct more of our preventive effort toward strokes, asthma, heart disease, and other diseases.

Our estimates of the causes of death are warped by media coverage, which is biased toward novelty and poignancy. The media shapes public interest and is in turn shaped by it. We give disproportionate attention to whatever has emotional intensity: Boston is more terrifying than Syria, because it was not part of the plan.

We make decisions by consulting our emotions. This is called the affect heuristic: we substitute the easy question “How do I feel about this?” for the hard question “What do I think about this?”

The emotional tail wags the rational dog.

What you like can hurt you (for me: cigarettes). What you dislike can help you (for me: routines, discipline).

Changing the emotional appeal of something changes the public’s perception of both its risks and its benefits.

There is no such thing as real or objective risk. Defining risk is an exercise in power. Every policy question involves assumptions about human nature.

Existing regulatory systems tend to set priorities poorly, reacting to public pressure rather than following careful, objective analysis. Risk assessment is highly debatable, but some degree of objectivity is always achievable through science, expertise, and careful deliberation. (This is the opposing view, which the book associates with Cass Sunstein, against Paul Slovic’s claim above that risk is inherently subjective. I am personally reminded of Sam Harris’s Moral Landscape (TED Talk), which compares the management of societies to health: there are many ways to run a society, just as there are many ways to be healthy or unhealthy, yet we can generally agree on what is healthy and what is not.)

Our minds are bad at dealing with small risks: we either ignore them altogether or give them far too much weight, with nothing in between. (Parents waiting up for a teenager who is late home from a party know this: they know it’s probably not a big deal, but images of disaster inevitably creep into the mind.)
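This “ignore it or overweight it” pattern is what prospect theory’s probability weighting formalizes. Below is a minimal Python sketch, assuming the one-parameter weighting function from Tversky and Kahneman’s 1992 cumulative prospect theory paper (gamma = 0.61 is their estimate for gains; the sample probabilities are my own illustrative choices, not figures from this chapter):

```python
def decision_weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman (1992) probability weighting:
    w(p) = p^g / (p^g + (1 - p)^g)^(1/g)."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

# Compare stated probabilities with the "felt" decision weights.
for p in [0.001, 0.01, 0.05, 0.50, 0.95, 0.99]:
    print(f"stated probability {p:>5.3f} -> decision weight {decision_weight(p):.3f}")
```

Run it and a 1% chance carries a felt weight of roughly 5.5%, several times too large, while the everyday alternative is to round the same risk down to zero. There is no well-calibrated middle ground, which is exactly the waiting-up parent’s predicament.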

Terrorists are the most significant practitioners of the art of inducing availability cascades. (About 3,000 people were killed on 9/11; over 100,000 have been killed in domestic gun violence in the US since then.) The difference is how available, how vivid and retrievable, each risk is.

Rational or not, fear is painful and debilitating, and policy makers must endeavor to protect the public not only from real dangers but also from fear itself. (“The only thing we have to fear is fear itself.”)

Democracy is inevitably messy. Psychology should inform the design of risk policies that combine the experts’ knowledge with the public’s emotions and intuitions.

14: Tom W’s Specialty

15: Linda: Less Is More

16: Causes Trump Statistics

17: Regression to the Mean

18: Taming Intuitive Predictions

PART III: OVERCONFIDENCE

19: The Illusion of Understanding

20: The Illusion of Validity

21: Intuitions vs. Formulas

22: Expert Intuition: When Can We Trust It?

23: The Outside View

24: The Engine of Capitalism

PART IV: CHOICES

25: Bernoulli’s Errors

26: Prospect Theory

27: The Endowment Effect

28: Bad Events

29: The Fourfold Pattern

30: Rare Events

31: Risk Policies

32: Keeping Score

33: Reversals

34: Frames and Reality

PART V: TWO SELVES

35: Two Selves

36: Life as a Story

37: Experienced Well-Being

38: Thinking About Life