Thinking, Fast and Slow

Part I. Two Systems

We can be blind to the obvious, and we are also blind to our blindness.
... learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.
... the influencing of an action by the idea.
Act calm and kind regardless of how you feel.
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.
Cognitive strain, whatever its source, mobilizes System 2 [deliberative], which is more likely to reject the intuitive answer suggested by System 1 [reflexive].

Part II. Heuristics and Biases

...intuitive predictions tend to be overconfident and overly extreme.
Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1. It is natural for the associative machinery to match the extremeness of predictions to the perceived extremeness of evidence on which it is based — this is how substitution works. And it is natural for System 1 to generate overconfident judgments, because confidence, as we have seen, is determined by the coherence of the best story you can tell from the evidence at hand. Be warned: your intuitions will deliver predictions that are too extreme and you will be inclined to put far too much faith in them.

Part III. Overconfidence

Any recent salient event is a candidate to become the kernel of a causal narrative.
Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.
The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.
Asked to reconstruct their former beliefs, people retrieve their current ones instead — an instance of substitution — and many cannot believe that they ever felt differently.
... that reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable outcomes.
The line that separates the possibly predictable future from the unpredictable distant future is yet to be drawn.
Because you have little direct knowledge of what goes on in your mind, you will never know that you might have made a different judgment or reached a different decision under very slightly different circumstances.
...formulas that assign equal weights to all the predictors are often superior, because they are not affected by accidents of sampling.
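The equal-weight idea above (Robyn Dawes's "improper linear models") can be sketched as follows. This is a minimal illustration with hypothetical candidate data, not a claim about any particular dataset: standardize each predictor and average the z-scores, so the weights cannot be distorted by accidents of sampling in a small training set.

```python
# Sketch of an "improper" equal-weight linear model (after Dawes):
# standardize each predictor, then simply average the z-scores.
from statistics import mean, stdev

def z_scores(values):
    """Standardize a list of raw scores to mean 0, sd 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def equal_weight_score(case, columns):
    """Average of one case's z-scores across all predictor columns."""
    return mean(z_scores(col)[case] for col in columns)

# Hypothetical predictors for 5 candidates (test score, interview
# rating, years of experience) -- illustrative numbers only.
tests      = [70, 85, 60, 90, 75]
interviews = [3.0, 4.5, 2.5, 4.0, 3.5]
experience = [2, 6, 1, 8, 4]
columns = [tests, interviews, experience]

ranking = sorted(range(5), key=lambda c: equal_weight_score(c, columns),
                 reverse=True)
print(ranking)  # candidate indices, best first
```

Because the weights are fixed at equality, the ranking depends only on the cases being scored, never on which training sample happened to be collected.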
The aversion to algorithms making decisions that affect humans is rooted in the strong preference that many people have for the natural over the synthetic or artificial.
If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. [maybe]
... skill does not become perfect all at once, and that on the way to near perfection some mistakes are made with great confidence.
... the planning fallacy describes plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases.

Part IV. Choices

If he owns it, he considers the pain of giving up the bottle. If he does not own it, he considers the pleasure of getting the bottle. The values were unequal because of loss aversion: giving up a bottle of nice wine is more painful than getting an equally good bottle is pleasurable.
The slope of the function is steeper in the negative domain; the response to a loss is stronger than the response to a corresponding gain.
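The steeper negative slope can be made concrete with the prospect-theory value function. The parameter values below are the median estimates reported by Tversky and Kahneman (1992); the code is an illustrative sketch of the shape of the curve, not of any experiment.

```python
# Prospect-theory value function:
#   v(x) = x**alpha            for gains (x >= 0)
#   v(x) = -lam * (-x)**beta   for losses (x < 0)
# lam ~ 2.25 captures loss aversion: the curve is steeper for losses.

ALPHA = 0.88  # curvature for gains
BETA = 0.88   # curvature for losses
LAM = 2.25    # loss-aversion coefficient

def value(x):
    """Subjective value of a gain or loss x, relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAM * (-x) ** BETA

gain, loss = value(100), value(-100)
print(f"v(+100) = {gain:.1f}, v(-100) = {loss:.1f}")
# With alpha == beta, the loss looms exactly 2.25 times larger
# than the equal-sized gain.
```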
Loss aversion is built into the automatic evaluations of System 1.
No endowment effect is expected when owners view their goods as carriers of value for future exchanges, a widespread attitude in routine commerce and in financial markets.
The self is more motivated to avoid bad self-definitions than to pursue good ones. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.
The emotional arousal is associative, automatic, and uncontrolled, and it produces an impulse for protective action. System 2 may “know” that the probability is low, but this knowledge does not eliminate the self-generated discomfort and the wish to avoid it. System 1 cannot be turned off. The emotion is not only disproportionate to the probability, it is also insensitive to the exact level of probability.
Although overestimation and overweighting are distinct phenomena, the same psychological mechanisms are involved in both: focused attention, confirmation bias, and cognitive ease.
The probability of a rare event is most likely to be overestimated when the alternative is not fully specified.
The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects.
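Denominator neglect can be shown with the urn example Kahneman uses: many people prefer drawing from an urn with 8 winning marbles out of 100 over one with 1 out of 10, because attention goes to the numerator (8 winners versus 1) and neglects the denominator. A quick arithmetic check:

```python
# Denominator neglect: the "better-looking" urn is objectively worse.
from fractions import Fraction

small_urn = Fraction(1, 10)   # 1 winner in 10 marbles  -> 10%
large_urn = Fraction(8, 100)  # 8 winners in 100 marbles -> 8%

print(f"small urn: {float(small_urn):.0%}, large urn: {float(large_urn):.0%}")
assert small_urn > large_urn  # the small urn gives the better chance
```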
Unless the rare event comes to your mind explicitly, it will not be overweighted. ... The emotional evaluation of “sure gain” and “sure loss” is an automatic reaction of System 1, which certainly occurs before the more effortful (and optional) computation of the expected values of the two gambles.
... it is costly to be risk averse for gains and risk seeking for losses. These attitudes make you willing to pay a premium to obtain a sure gain rather than face a gamble, and also willing to pay a premium (in expected value) to avoid a sure loss.
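The cost of that premium is simple arithmetic. The numbers below are hypothetical, chosen only to show how forgoing expected value on each choice accumulates across many similar choices:

```python
# Cost of paying a premium for certainty (hypothetical numbers):
# gamble: 50% chance of $1000, else $0 -> expected value $500.
# A risk-averse chooser who takes a sure $460 instead gives up $40
# in expected value each time; over many such choices it adds up.

def expected_value(outcomes):
    """Expected value of a list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

gamble_ev = expected_value([(0.5, 1000), (0.5, 0)])
sure_thing = 460
premium = gamble_ev - sure_thing
print(f"premium paid per choice: ${premium:.0f}")
print(f"expected cost over 100 similar choices: ${100 * premium:.0f}")
```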
... broad framing blunted the emotional reaction to losses and increased the willingness to take risks.
The outside view is a broad frame for thinking about plans. A risk policy is a broad frame that embeds a particular risky choice in a set of similar choices.
The outside view and the risk policy are remedies against two distinct biases that affect many decisions: the exaggerated optimism of the planning fallacy and the exaggerated caution induced by loss aversion. The two biases oppose each other. Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.
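The effect of a broad frame can be computed directly for the kind of repeated gamble Kahneman discusses (Samuelson's example): a single 50-50 bet to lose $100 or win $200 feels risky, but bundled over many plays the chance of ending up behind shrinks quickly. A sketch of that calculation:

```python
# Probability of an overall loss across n independent 50-50 gambles,
# each paying +$200 on a win and -$100 on a loss.
from math import comb

def prob_overall_loss(n, win=200, lose=100):
    """P(total payoff < 0) over n independent 50-50 gambles."""
    total = 0.0
    for wins in range(n + 1):
        payoff = wins * win - (n - wins) * lose
        if payoff < 0:
            total += comb(n, wins) * 0.5 ** n
    return total

print(f"1 gamble  : P(loss) = {prob_overall_loss(1):.3f}")
print(f"10 gambles: P(loss) = {prob_overall_loss(10):.3f}")
```

Seen one at a time, each bet carries a 50% chance of losing; seen as a bundle of ten, the chance of a net loss falls to about 17%, which is why a risk policy of broad framing makes the gambles easier to accept.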
The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects.
Regret is one of the counterfactual emotions that are triggered by the availability of alternatives to reality.
The common feature of these poignant stories is that they involve unusual events — and unusual events are easier than normal events to undo in imagination. Associative memory contains a representation of the normal world and its rules. An abnormal event attracts attention, and it also activates the idea of the event that would have been normal under the same circumstances.
Regret and blame are both evoked by a comparison to a norm, but the relevant norms are different.
The key is not the difference between commission and omission but the distinction between default options and actions that deviate from the default.
We spend much of our day anticipating, and trying to avoid, the emotional pains we inflict on ourselves. How seriously should we take these intangible outcomes, the self-administered punishments (and occasional rewards) that we experience as we score our lives?
If, when things go badly, you can remember that you considered the possibility of regret carefully before deciding, you are likely to experience less of it.
My personal hindsight-avoiding policy is to be either very thorough or completely casual when making a decision with long-term consequences.
Decision makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative.
Your moral feelings are attached to frames, to descriptions of reality rather than to reality itself. 
The message about the nature of framing is stark: framing should not be viewed as an intervention that masks or distorts an underlying preference.

Part V. Two Selves

Peak-end rule: The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end.
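The rule can be stated as a one-line formula. The pain series below are illustrative numbers on a 0-10 scale, echoing the structure of the cold-hand experiment rather than its actual data:

```python
# Peak-end rule sketch: the remembered rating of an episode tracks the
# average of its worst moment and its final moment, neglecting duration.

def remembered_pain(series):
    """Peak-end prediction: mean of the worst and the last reported pain."""
    return (max(series) + series[-1]) / 2

short_trial = [2, 5, 8]        # ends at the peak
long_trial  = [2, 5, 8, 6, 4]  # same peak, milder ending, more total pain

print(remembered_pain(short_trial))  # (8 + 8) / 2 = 8.0
print(remembered_pain(long_trial))   # (8 + 4) / 2 = 6.0
```

The longer trial contains strictly more pain yet is predicted to be remembered as the better experience: duration neglect, exactly the pattern that made subjects choose to repeat the longer cold-hand trial.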
The experiencing self is the one that answers the question: “Does it hurt now?” The remembering self is the one that answers the question: “How was it, on the whole?” Memories are all we get to keep from our experience of living, and the only perspective that we can adopt as we think about our lives is therefore that of the remembering self.
Confusing experience with the memory of it is a compelling cognitive illusion — and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions.
Nothing in life is as important as you think it is when you are thinking about it.
The main exceptions are chronic pain, constant exposure to loud noise, and severe depression. Pain and noise are biologically set to be signals that attract attention, and depression involves a self-reinforcing cycle of miserable thoughts. There is therefore no adaptation to these conditions.
Adaptation to a new situation, whether good or bad, consists in large part of thinking less and less about it. In that sense, most long-term circumstances of life, including paraplegia and marriage, are part-time states that one inhabits only when one attends to them.
Daniel Gilbert and Timothy Wilson introduced the word miswanting to describe bad choices that arise from errors of affective forecasting. This word deserves to be in everyday language. The focusing illusion (which Gilbert and Wilson call focalism) is a rich source of miswanting. In particular, it makes us prone to exaggerate the effect of significant purchases or changed circumstances on our future well-being.
The focusing illusion creates a bias in favor of goods and experiences that are initially exciting, even if they will eventually lose their appeal. Time is neglected, causing experiences that will retain their attention value in the long term to be appreciated less than they deserve to be.
The mistake that people make in the focusing illusion involves attention to selected moments and neglect of what happens at other times. The mind is good with stories, but it does not appear to be well designed for the processing of time.


Ultimately, a richer language is essential to the skill of constructive criticism. Much like medicine, the identification of judgment errors is a diagnostic task, which requires a precise vocabulary.