How to Decide: Simple Tools for Making Better Choices
Annie Duke
INTRODUCTION
There are only two things that determine how your life turns out: luck and the quality of your decisions.
- You have control over only one of those two things.
- The only thing you have control over that can influence the way your life turns out is the quality of your decisions.
Any decision is, in essence, a prediction about the future. When you’re making a decision, your objective is to choose the option that gains you the most ground in achieving your goals, taking into account how much you’re willing to risk.
Because there are so many possible futures, making the best decision depends on your ability to accurately imagine what the world might look like if you were to choose any of the options you’re considering. That means the ideal decision tool would be a crystal ball.
Determining whether a decision is good or bad means examining the quality of the beliefs informing the decision, the available options, and how the future might turn out given any choice you make.
Chapter 1
Resulting: Outcomes in the Rearview Mirror May Appear Larger Than They Are
Resulting Wrap-up
• Resulting is the tendency to look at whether a result was good or bad to figure out whether a decision was good or bad.
• Outcomes cast a shadow over the decision process, leading you to overlook or distort information about the process, making your view of decision quality fit with outcome quality.
• In the short term, for any single decision, there is only a loose relationship between the quality of the decision and the quality of the outcome. The two are correlated, but the relationship can take a long time to play out.
• Luck is what intervenes between your decision and the actual outcome. Resulting diminishes your view of the role of luck.
• You can’t tell that much about the quality of a decision from a single outcome, because of luck.
• When you make a decision, you can rarely guarantee a good outcome (or a bad one). Instead, the goal is to try to choose the option that will lead to the most favorable range of outcomes.
• Making better decisions starts with learning from experience. Resulting interferes with that learning, causing you to repeat some low-quality decisions and stop making some high-quality decisions. It also keeps you from examining good-quality/good-outcome decisions (as well as bad-quality/bad-outcome decisions), which still offer valuable lessons for future decisions.
• Resulting reduces compassion when it comes to how we treat others and ourselves.
RESULTING CHECKLIST
- How much is the outcome clouding your judgment (or the judgment of someone you’ve observed) about the quality of the decision?
- Even if bad decisions preceded a bad outcome, can you identify some good decisions made along the way?
- Can you identify some ways in which the process of coming to the decision was good?
- Even if good decisions preceded a good outcome, can you identify some ways the decision could have been better?
- Can you identify some ways in which the process of coming to the decision could be improved?
- What are the factors outside the control of the decision-maker (who might be you), including the actions of other people?
- What are the other ways things could have turned out?
Chapter 2
As the Old Saying Goes, Hindsight Is Not 20/20
Hindsight Bias Wrap-up
• Hindsight bias is the tendency to believe that an outcome, after it occurs, was predictable or inevitable.
• Hindsight bias, like resulting, is a manifestation of the outsize influence of outcomes. In this case, the outcome casts a shadow over your ability to accurately remember what you knew at the time of the decision.
• Hindsight bias distorts the way you process outcomes in two ways: Should have known and Knew it all along.
• Hindsight bias is frequently connected with a set of verbal or mental cues. (See the exercises in sections (2) and (4) for examples you’ve identified, as well as the checklist to follow.)
• Once you know how a decision turns out, you can experience memory creep, where the stuff that reveals itself after the fact creeps into your memory of what you knew, or what was knowable, before the decision.
• To learn from your choices and their outcomes, you need to strive to be accurate about what you knew at the time of your decision.
• The Knowledge Tracker is a tool that can help separate what you knew from what you subsequently learned.
• Hindsight bias leads us to lack compassion for ourselves and others.
HINDSIGHT BIAS CHECKLIST
- Identify the bias. “I should have known it.” “I told you so.” “I knew it all along.”
- Address the bias. (1) Was there any information that was revealed after the fact? (2) Was that information reasonably knowable at the time of the decision? (If you have a journal record of what you knew at the time of the decision, refer back to it.) (3) Was the conclusion about the predictability of the result based on information that was unknowable at the time of the decision? (4) After addressing the first three questions, reassess how predictable the outcome was.
Chapter 3
The Decision Multiverse Wrap-up
• The paradox of experience: Experience is necessary for learning, but individual experiences often interfere with learning. This is partly because of biases, like resulting and hindsight bias, that cause us to fit our view of decision quality to outcome quality.
• Viewing the outcome that occurred in the context of other potential outcomes at the time of decision can help to resolve this paradox.
• There are many possible futures but only one past. Because of this, the past feels inevitable.
• Recreating a simplified version of a decision tree puts the actual outcome in its proper context.
• Exploring the other possible outcomes is a form of counterfactual thinking. A counterfactual is an outcome that did not happen but could have, or an imagined state of the world.
• Our willingness to examine outcomes is asymmetrical. We are more eager to put bad outcomes in context than good ones. Becoming a better decision-maker requires us to try (difficult though it may be) to put those good outcomes in perspective.
THE DECISION MULTIVERSE CHECKLIST
- When evaluating whether the outcome provides a lesson about decision quality, create a simplified decision tree, starting with the following: Identify the decision. Identify the actual outcome. Along with the actual outcome, create a tree with other reasonable outcomes that were possible at the time of the decision.
- Explore the other possible outcomes to understand better what is to be learned from the actual outcome you got.
Chapter 4
The Three Ps: Preferences, Payoffs, and Probabilities
The Three Ps Wrap-up
• Incorporating preferences, payoffs, and probabilities into a decision tree is an integral part of a good decision process.
• Preference is individual to you, dependent on your goals and values.
• The payoff is how an outcome affects your progress toward or away from a goal.
• Some possibilities will have payoffs where you gain something you value. These comprise the upside potential of a decision.
• Some possibilities will have payoffs where you lose something you value. These comprise the downside potential of a decision.
• Risk is your exposure to the downside.
• Payoffs can be measured in anything you value (money, time, happiness, health, the happiness or health or wealth of others, social currency, etc.).
• When you’re figuring out whether a decision is good or bad, you’re comparing the upside to the downside. Does the upside potential compensate for the risk of the downside potential?
• Probabilities express how likely something is to occur.
• Combining probabilities with preferences and payoffs helps you to better resolve the paradox of experience, allowing you to get out from under the shadow of the particular result that you are dealt.
• Combining probabilities with preferences and payoffs helps you more clearly evaluate and compare options.
• A pros and cons list is flat. It lacks information about both the size of payoffs and the probability of any pro or con occurring. Because of that, it is a low-quality decision tool for evaluating options and comparing them to each other.
• Most people are reluctant to estimate the likelihood of something happening in the future. (“That’s speculative.” “I don’t know enough.” “I’d just be guessing.”)
• Even though your information is usually imperfect, you know something about most things, enough to make an educated guess.
• The willingness to guess is essential to improving decisions. If you don’t make yourself guess, you’ll be less likely to ask “What do I know?” and “What don’t I know?”
• You can start expressing probabilities by using common terms. That gets you thinking about how often outcomes will occur, presents a view of relative likelihood, and gives you a snapshot of the overall likelihood of the best and worst outcomes.
THE THREE PS CHECKLIST (Preferences, Payoffs, Probabilities)
- When evaluating a past decision or making a new decision, refer to the Six Steps to Better Decision-Making (a code sketch of the steps follows this list):
- Step 1—Identify the reasonable set of possible outcomes. These outcomes can be general scenarios or be focused on particular aspects of the outcomes that you especially care about.
- Step 2—Identify your preference for each outcome—to what degree you like or dislike each outcome, given your values. These preferences will be driven by the payoffs associated with each outcome. Gains comprise the upside and losses comprise the downside. Include this information in your decision trees.
- Step 3—Estimate the likelihood of each outcome unfolding. As a start, use common terms that express probabilities. Don’t be afraid to guess.
- Step 4—Assess the relative likelihood of outcomes you like and dislike for the option under consideration.
- Step 5—Repeat Steps 1–4 for other options under consideration.
- Step 6—Compare the options to each other.
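The Six Steps amount to building a small expected-value model of each option. Here is a minimal sketch (mine, not the book’s; the airport-shortcut option names, payoffs, and probabilities are invented for illustration) of how the steps might look in code:

```python
# A sketch of the Six Steps: encode each option's reasonable outcomes
# (Step 1) with a payoff reflecting your preferences (Step 2) and an
# estimated probability (Step 3), then weigh and compare (Steps 4-6).
# All option names and numbers below are hypothetical.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str    # Step 1: a reasonable possible outcome
    payoff: float       # Step 2: gain (+) or loss (-) in units you value
    probability: float  # Step 3: your educated guess of its likelihood

def expected_payoff(outcomes: list[Outcome]) -> float:
    """Step 4: weigh each payoff by its likelihood and sum."""
    assert abs(sum(o.probability for o in outcomes) - 1.0) < 1e-9
    return sum(o.payoff * o.probability for o in outcomes)

# Step 5: repeat for every option under consideration.
options = {
    "shortcut to the airport": [
        Outcome("arrive early", payoff=+10, probability=0.50),
        Outcome("arrive on time", payoff=0, probability=0.35),
        Outcome("stuck in traffic, miss flight", payoff=-40, probability=0.15),
    ],
    "take the highway": [
        Outcome("arrive on time", payoff=0, probability=0.90),
        Outcome("arrive late", payoff=-40, probability=0.10),
    ],
}

# Step 6: compare the options to each other.
for name, outs in options.items():
    print(f"{name}: expected payoff = {expected_payoff(outs):+.1f}")
```

A single expected-payoff number is only one lens; per the wrap-up, you still want to ask whether the upside potential compensates for your exposure to the downside.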
Chapter 5
Taking Dead Aim at the Future: The Power of Precision
Taking Dead Aim Wrap-up
• Natural language terms that express likelihoods, like “very likely” and “unlikely,” are useful but blunt instruments.
• The drive to improve on your initial estimates is what motivates you to check your information and learn more. If you hide behind the safety of a general term, there’s no reason to improve on it or calibrate.
• Terms that express likelihoods mean very different things to different people.
• Using ambiguous terms can lead to confusion and miscommunication with people you want to engage for help.
• Being more precise, by expressing probabilities as percentages, makes it more likely you’ll uncover information that can correct inaccuracies in your beliefs and broaden your knowledge.
• You can use your answers to the Mauboussins’ survey to help you convert natural language terms to exact probabilities.
• In addition to making precise (bull’s-eye) estimates, offer a range around that estimate to express your uncertainty. Do this by including a lower and upper bound that communicate the size of your target.
• The size of the range signals what you know and what you don’t know. The larger the range, the less information or the lower the quality of the information informing your estimate, and the more you need to learn.
• Communicating the size of the range also signals to others that you need their knowledge and perspective to narrow the range.
• Use the shock test to determine if your upper and lower bounds are reasonable: Would you be really shocked if the correct answer was outside that boundary? Your goal should be to have approximately 90% of your estimates capture the objectively true value.
• Develop a habit of asking yourself, “What information could I find out that would tell me that my estimate or my belief is wrong?”
TAKING DEAD AIM CHECKLIST
Improve your estimates by taking dead aim in the following ways:
- Take the Mauboussins’ survey on the meaning of common terms for probability.
- If you are uncomfortable making specific estimates, use the term that comes to mind and convert it to a specific bull’s-eye estimate by referring to your answers from the survey. In addition, make a target estimate: a range around the bull’s-eye with a reasonable lower and upper bound.
- Test the reasonableness of your upper and lower bounds with the shock test (a calibration sketch follows this list).
- Ask yourself, “What information could I discover that would make me change my mind?”
- If the information is available, go find it. If it isn’t, keep an eye out for it in the future.
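One way to make the shock test concrete (my sketch, not the book’s; the questions and numbers are invented placeholders) is to keep a running log of your range estimates and check what fraction of true answers land inside your bounds, against the roughly 90% goal:

```python
# Hypothetical calibration log: each entry is a range (target) estimate
# plus the true value once you learn it. The shock-test goal is that
# about 90% of true values fall inside your ranges.

estimates = [
    # (question, lower bound, upper bound, true value)
    ("adult bison weight (lbs)",          800,  2500, 1400),
    ("world population (billions)",       6.5,   9.0,  8.0),
    ("year the fax machine was patented", 1900, 1960, 1843),
]

hits = sum(lo <= truth <= hi for _, lo, hi, truth in estimates)
rate = hits / len(estimates)
print(f"{hits}/{len(estimates)} inside your bounds ({rate:.0%}); goal is ~90%")
if rate < 0.9:
    print("Ranges are too narrow: you're overselling what you know.")
```

In this toy log the fax-machine range misses badly, which is exactly the overconfidence signal the shock test is meant to surface.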
Chapter 6
Turning Decisions Outside In
Turning Decisions Outside In Wrap-up
• The inside view is the view of the world through your own perspective, your own beliefs, and your own experiences.
• Many common cognitive biases are, in part, the product of the inside view.
• Pros and cons lists amplify the inside view.
• The outside view is the way that others would see your situation, or what’s true of the world in general, independent of your own perspective.
• It’s important to explore the outside view even if you think you’ve got your facts straight, because it’s possible other people could look at the same facts and come to different conclusions.
• The outside view acts to discipline the biases and inaccuracies that live in the inside view, which is why you want to anchor first to the outside view.
• Accuracy lives at the intersection between the inside view and the outside view. The things that are particular to your situation matter, but those particulars should be married with the things that are true of the world in general.
• When it comes to reasoning about the world, your beliefs are in the driver’s seat.
• Motivated reasoning is the tendency to process information to get a conclusion we want rather than to discover what is true.
• Smart people aren’t immune from motivated reasoning and the inside view. In fact, being smart can make it worse because smart people have more confidence in the truth of their beliefs and can spin better narratives to sway other people (and themselves) toward their point of view.
• A good way to get to the outside view is to look for any base rates that might apply to your situation.
• Another way to get to the outside view is to seek out other people’s perspectives and feedback. It’s important, however, that they feel comfortable expressing disagreement or a perspective that might cast you in an unflattering light. Otherwise, they’re only amplifying the inside view, strengthening your belief in your accuracy because it feels certified by others. You should be eager to hear people disagree with you and motivate them to do so.
• Perspective Tracking is a good decision habit to develop. Intentionally considering your situation entirely from the outside view and then entirely from the inside view can get you to a more accurate view that incorporates both.
TURNING DECISIONS OUTSIDE IN CHECKLIST
- Describe your situation entirely from the outside view. The outside view should include (a) applicable base rates, and (b) perspectives provided by other people. (A base-rate sketch follows this checklist.)
- Describe your situation entirely from the inside view. Find the intersection of the outside view and the inside view to land on a more accurate narrative.
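One simple (and admittedly crude) way to anchor first to the outside view is to start from a base rate and adjust only partway toward your inside-view estimate. This sketch is mine, not a formula from the book; the gym-membership numbers and the 70/30 weighting are invented assumptions:

```python
# Illustrative blend of outside and inside views for a gym-membership
# decision. All numbers here are hypothetical assumptions.

base_rate = 0.20        # outside view: share of new members still going at 6 months
inside_estimate = 0.90  # inside view: "I'm different, I'll definitely keep going"
weight_on_outside = 0.7 # anchor first, and mostly, to the outside view

blended = weight_on_outside * base_rate + (1 - weight_on_outside) * inside_estimate
print(f"Blended estimate of sticking with it: {blended:.0%}")  # 41%
```

The point is not the particular weights; it is that the particulars of your situation get married to what is true of people in general, rather than replacing it.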
Chapter 7
Breaking Free from Analysis Paralysis: How to Spend Your Decision-Making Time More Wisely
Breaking Free from Analysis Paralysis Wrap-up
• We spend an enormous amount of time on routine, inconsequential decisions. The average person spends 250–275 hours per year deciding what to eat, watch, and wear. That’s equivalent to six or seven 40-hour workweeks.
• There is a time-accuracy trade-off: Increasing accuracy costs time. Saving time costs accuracy.
• The key to balancing the trade-off between time and accuracy is figuring out the penalty for not getting the decision exactly right.
• Getting an initial understanding of the impact of your decision (through the framework of evaluating possibilities, payoffs, and probabilities) will identify situations in which the penalty is small or nonexistent, giving you leeway to sacrifice accuracy in favor of deciding faster.
• Recognizing when decisions are low impact also maximizes opportunities to poke at the world, which increases your knowledge and helps you learn more about your preferences, improving the quality of all future decisions.
• You can identify low-impact decisions with the Happiness Test: ask yourself whether the way your decision turns out is likely to affect your happiness in a week, a month, or a year. If the answer is no, the type of thing you are deciding about passes the Happiness Test and you can go fast.
• If a decision passes the Happiness Test and the options repeat, you can go even faster.
• A freeroll is a situation in which there is limited downside. Save time deciding whether to seize a freeroll; take time in deciding how to execute it.
• When you have multiple options that are close in potential payoffs, the decision is a sheep in wolf’s clothing. Close calls for high-impact decisions tend to induce analysis paralysis, but the indecision is, in itself, a signal that you can go fast.
• To determine if a decision is a sheep in wolf’s clothing, use the Only-Option Test, asking yourself for each option, “If this were the only option I had, would I be happy with it?” If your answer is yes for more than one option, you could flip a coin since you can’t be that wrong whichever option you pick.
• Allocate your decision time using the menu strategy. Spend time sorting, determining which options you like. Once you have options you like, save time picking.
• When you pick an option, you’re passing on the potential gains associated with the options you don’t pick. This is known as opportunity cost. The higher the opportunity cost, the higher the penalty for making choices that are less certain.
• You can defray opportunity cost and decide faster by being quit-to-itive, looking at decisions through the framework of whether you can change your mind, quit your choice, and choose something else at a reasonable cost.
• Decisions with a low cost to quit, known as two-way-door decisions, also provide you with low-cost opportunities to make experimental decisions to gather information and learn about your values and preferences for future decisions.
• When you’re facing a decision with a high or prohibitive cost of changing your mind, try decision stacking, making two-way-door decisions ahead of the one-way-door decision.
• You can also defray opportunity cost if you can exercise multiple options in parallel.
• Because you can rarely approach perfect information or be certain of the outcome of your decision, you will make most decisions while still uncertain. To figure out when additional time is no longer likely to increase accuracy in a worthwhile way, ask yourself, “Is there additional information (available at a reasonable cost) that would establish a clearly preferred option or, if there is already a clearly preferred option, cause me to change it?” If yes, go find it. If no, decide and move on.
PARALYSIS CHECKLIST
To determine whether you can decide faster, ask yourself the following questions (a triage sketch in code follows the list):
- Does the type of thing you’re deciding about pass the Happiness Test? If yes, go fast. Does it pass the Happiness Test with repeating options? If yes, go even faster.
- Are you freerolling? If yes, go fast in seizing the opportunity but take time in the execution.
- Is your decision a sheep in wolf’s clothing, with multiple options that pass the Only-Option Test? If yes, go fast, even flipping a coin to make your choice.
- Can you quit your choice and pick a different option at a reasonable cost? If yes, go fast. If no, can you decision stack?
- Can you exercise multiple options in parallel? If yes, go fast.
- Is there additional information (available at a reasonable cost) that would establish a clearly preferred option or, if there is already a clearly preferred option, change your preference? If yes, go find it. If no, decide.
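Here is a minimal triage sketch of this checklist (my illustration; the field and function names are not the book’s):

```python
# Each checklist question becomes a yes/no input; the answers tell you
# whether to go fast or slow. A sketch, not a complete decision procedure.

from dataclasses import dataclass

@dataclass
class Decision:
    passes_happiness_test: bool   # outcome won't move happiness in a week/month/year
    options_repeat: bool
    is_freeroll: bool             # downside is limited
    options_passing_only_option_test: int
    cheap_to_quit: bool           # a two-way-door decision

def triage(d: Decision) -> str:
    if d.passes_happiness_test:
        return "Go even faster." if d.options_repeat else "Go fast."
    if d.is_freeroll:
        return "Go fast in seizing it; take time on the execution."
    if d.options_passing_only_option_test > 1:
        return "Sheep in wolf's clothing: go fast, even flip a coin."
    if d.cheap_to_quit:
        return "Two-way door: go fast; you can quit at a reasonable cost."
    return "High impact, hard to reverse: take time (or decision stack)."

print(triage(Decision(False, False, False, 2, False)))
# -> Sheep in wolf's clothing: go fast, even flip a coin.
```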
Chapter 8
The Power of Negative Thinking
The Power of Negative Thinking Wrap-Up
• We are pretty good at setting positive goals for ourselves. Where we fall flat is at executing the things we need to do to achieve them. The gap between the things we know we should do and the decisions we later make is known as the behavior gap.
• The message of the power of positive thinking is that you’ll succeed if you imagine yourself succeeding. Whether explicitly or by reasonable inference, the message is also that failure is the result of thinking about failure.
• Despite the importance of setting positive goals, positive visualization alone won’t give you the best route to success. Negative thinking helps you identify things that might get in your way so you can consider ways to reach your destination more efficiently.
• Thinking about how things can go wrong is known as mental contrasting. You imagine what you want to accomplish and confront the barriers in the way of accomplishing it.
• You can identify more potential obstacles by combining mental contrasting with mental time travel, picturing yourself in the future having failed to achieve a goal, and then looking back at what got you to that outcome.
• Looking back from an imagined future at the route that got you there is called prospective hindsight.
• A premortem combines prospective hindsight with mental contrasting. To do a premortem, you place yourself in the future and imagine that you have failed to achieve your goal. You then consider the potential reasons things worked out poorly.
• In addition to helping individuals, premortems can help teams minimize groupthink and maximize access to the outside view by eliciting the diversity of opinions. This is especially true if team members do the premortem independently before discussing as a group.
• A companion technique to a premortem is backcasting, where you work backward from a positive future to figure out why you succeeded.
• You can turn the output of premortems and backcasts, for easy reference, into a Decision Exploration Table, which also includes an estimate of the chances that each reason for failure or success will occur.
• Given what you’ve learned from creating a Decision Exploration Table, the first thing to ask is whether you should modify your goal or change your decision.
• Once you’ve established that you’re sticking with your goal or decision, you can create precommitment contracts, which raise barriers to behavior that interferes with your success, or lower barriers to encourage behavior that promotes your success.
• You can also prepare for your reaction to setbacks along the way to your goal. People compound negative outcomes by making poor decisions after a bad result. Common in-the-moment reactions that make things worse include tilt; the what-the-hell effect and the sunk cost fallacy are examples of tilt. Planning for your reaction allows you to create precommitments, establish criteria for changing course, and dampen your emotional reaction in the wake of a setback.
• The Dr. Evil game helps identify and address additional ways your behavior in the future might undermine your success. In the game, you note the ways that Dr. Evil would control your mind to make you fail through decisions that are justifiable as one-offs but unjustifiable over time.
• The Dr. Evil game can encourage you to adopt a precommitment called a category decision, where you decide in advance what options you can and cannot choose when you face a decision that falls within that category.
• You can also address potential bad luck by hedging, paying for something that mitigates the impact of a downside event occurring.
THE POWER OF NEGATIVE THINKING CHECKLIST
Attempt to improve your likelihood of success for a goal you set or a decision that involves future execution by doing the following:
- Conduct a premortem by (a) figuring out a reasonable period for achieving the goal or for the decision playing out; (b) imagining it’s the day after that period and you didn’t achieve the goal, or the decision worked out poorly; (c) looking back from that point in the future and coming up with reasons why you failed, divided into “skill” (within your control) and “luck” (outside your control).
- Conduct a backcast by going through the same exercise but imagining you achieved the goal or succeeded with the decision.
- Combine the output of the premortem and backcast in a Decision Exploration Table, including an estimate of the probability that each item in the table will happen (see the sketch after this checklist).
- Ask whether you should modify your goal or change your decision based on the output of the premortem and backcast.
- Determine whether there are any precommitment contracts you can create to reduce the chances of making bad decisions and increase the chances of making good ones.
- Plan ahead for how you’ll proceed if any of the reasons for failure you identified through a premortem happen.
- Play the Dr. Evil game to determine how you might fall short of your goal by making future decisions that are individually justifiable but in aggregate will cause you to fail.
- Consider adopting category decisions that will reduce the chances you make Dr. Evil decisions.
- Assess what you can do to hedge against the impact of bad luck.
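A Decision Exploration Table is easy to keep as plain data. A minimal sketch (the goal, reasons, and probabilities below are invented placeholders, not examples from the book):

```python
# Premortem/backcast output for a hypothetical fitness goal, with each
# reason tagged by source, skill vs. luck, and an estimated probability.
# The probabilities are per-reason guesses and need not sum to 1.

exploration_table = [
    # (source, skill/luck, reason, estimated probability)
    ("premortem", "skill", "skipped workouts when work got busy", 0.40),
    ("premortem", "luck",  "injury interrupted training",         0.10),
    ("backcast",  "skill", "scheduled workouts like meetings",    0.50),
    ("backcast",  "luck",  "a friend joined and kept me going",   0.25),
]

for source, control, reason, p in exploration_table:
    print(f"{source:9} | {control:5} | p={p:.0%} | {reason}")

# Per the checklist: given these reasons and their likelihoods, should you
# modify the goal, change the decision, or add precommitment contracts?
```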
Chapter 9
Decision Hygiene: If You Want to Know What Someone Thinks, Stop Infecting Them with What You Think
Decision Hygiene Wrap-up
• One of the best ways to improve the quality of your beliefs is to get other people’s perspectives. When their beliefs diverge from yours, it improves your decision-making by exposing you to corrective information and the stuff you don’t know.
• Beliefs are contagious. Informing somebody of your belief before they give their feedback significantly increases the likelihood that they will express the same belief back to you.
• Exercise decision hygiene to stem the infection of beliefs.
• The only way somebody can know that they’re disagreeing with you is if they know your opinion first. Keep your opinions to yourself when you elicit feedback.
• The frame you choose can signal whether you have a positive or negative view about what you’re trying to get feedback on. Stay in neutral as much as possible.
• The word “disagree” has very negative connotations. Using “divergence” or “dispersion” of opinion instead of “disagreement” is a more neutral way of talking about places where people’s opinions differ.
• Outcomes can also infect the quality of feedback. Quarantine others from the way things turned out while eliciting their feedback.
• When you’re asking for feedback about something that happened in the past and several outcomes are sitting in the way, iterate the feedback.
• For feedback of any kind, put the person, as closely as possible, into the state of knowledge you were in when you made the decision.
• Group settings offer the potential of improving decision quality if you can access the different perspectives of the group. Often, this potential is undermined by the tendency of groups to coalesce around consensus quickly, discouraging members with information or opinions that disagree with the consensus from sharing them.
• Groups can better fulfill their decision-making potential by exercising group decision hygiene, soliciting initial opinions and rationales independently before sharing with the group.
• Due to the halo effect, opinions from high-status members of the group are especially contagious.
• Anonymizing feedback on the first pass allows ideas to be better considered on their merits rather than according to the status of the individual who holds the belief.
• For lower-impact, easier-to-reverse decisions, the group can still contain the contagion through a quick-and-dirty version of this process, where group members write down their opinions and someone reads them aloud or writes them on a whiteboard before discussion, or where members read their own opinions aloud in reverse order of seniority.
• The quality of feedback is limited by the quality of the input into the feedback elicitation process. We tend to spin narratives that highlight, lowlight, or even omit information, nudging the person giving feedback toward the conclusion we would like them to reach.
• Give the other person what they need to know to give you a quality opinion and no more.
• Access the outside view by asking yourself, “If someone came to me asking my opinion about this kind of decision, what would I need to know to give good advice?”
• Build a checklist of relevant details for repeating decisions and make that checklist before you’re in the midst of a decision. Such a list should focus on the applicable goals, values, and resources, along with the details of the situation.
• Members of a group should hold each other accountable to the checklist. If someone is eliciting feedback and they can’t provide details on the checklist, there should be an agreement not to give feedback.
DECISION HYGIENE CHECKLIST
When you’re seeking feedback from others, exercise good decision hygiene in the following ways:
- Quarantine others from your opinions and beliefs when asking for feedback.
- Frame your request for feedback in a neutral fashion, to keep from signaling your conclusions. Quarantine others from outcomes when asking about past decisions.
- If you’re asking for feedback involving multiple outcomes, iterate the feedback. Explain the form of the output you’re seeking.
- Prior to being in the midst of a decision, make a checklist of the facts and relevant information you would need to provide feedback for such a decision.
- Have the people seeking and giving feedback agree to be accountable: the person seeking feedback provides all the relevant information, and the person giving feedback asks for anything that has not been provided, refusing to give feedback if the relevant information can’t be supplied.
- When you’re involved in a group setting, exercise the following additional forms of decision hygiene: Solicit feedback independently, before the group discusses or members express their views to one another. Anonymize the sources of the views and distribute a compilation to group members for review in advance of group meetings or discussion. (A sketch of this process follows.)
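The group steps can be made mechanical. A sketch (mine; the member names and opinions are made up) of collecting views independently, anonymizing them, and shuffling before any discussion:

```python
# Collect opinions independently, strip the authors' (possibly high-status)
# names to blunt the halo effect, and shuffle so order doesn't hint at
# seniority, then distribute the compilation before the group meets.

import random

def compile_anonymized_feedback(opinions: dict[str, str]) -> list[str]:
    """opinions maps member name -> independently written opinion."""
    anonymized = list(opinions.values())  # drop the names
    random.shuffle(anonymized)
    return anonymized

opinions = {
    "CEO":     "We should launch in Q3.",
    "Analyst": "The data suggests waiting until Q4.",
    "PM":      "Q3 works only if hiring closes on time.",
}
for view in compile_anonymized_feedback(opinions):
    print("-", view)
```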
Additional Notes
Chapter 1
- Resulting. When people result, they look at whether the result was good or bad to figure out if the decision was good or bad. (Psychologists call this “outcome bias,” but I prefer the more intuitive term “resulting.”) We take this resulting shortcut because we can’t clearly “see” whether the decision was good or bad, especially after the fact, but we can clearly see if the outcome was good or bad. Resulting is a way to simplify complex assessments of decision quality.
- Simple isn’t always better. Decision quality and outcome quality are, of course, correlated. But not perfectly, at least not in most decisions we make, and certainly not when we have only one try at the decision. The relationship between the two can take a long time to play out.
- RESULTING A mental shortcut in which we use the quality of an outcome to figure out the quality of a decision.
- A necessary part of becoming a better decision-maker is learning from experience. Experience contains the lessons for improving future decisions. Resulting causes you to learn the wrong lessons.
- The quality of the result filters how we view the decision, even when we have identical details about the decision process, because the quality of the outcome drives how we interpret those details. This is the power of resulting. When the outcome turns out poorly, it’s easy to focus on the details that suggest the decision process was poor. We think we are seeing the decision quality rationally because the bad process is obvious. But once the outcome is flipped, we discount or reinterpret the information about the decision quality because the outcome drives us to write a story that fits the ending.
- We want outcome quality to align with decision quality. We want the world to make sense in this way, to be less random than it is. In trying to get this alignment, we lose sight of the fact that there are more possible futures than the one that actually happens.
- Luck is what intervenes between your decision (which has a range of possible outcomes) and the outcome that you actually experience.
- Luck exerts its influence between your decision and which of the possible paths you end up on. It is the element you have no control over that determines which of the possible outcomes you actually observe in the short run.
- Because any decision determines only the set of possible outcomes (some good, some bad, some in between), this means good outcomes can result from both good and bad decisions and bad outcomes can result from both good and bad decisions.
- In order to become a better decision-maker, it’s important to actively explore all four of the ways that decision quality and outcome quality relate to each other. It’s not easy to be willing to give up the credit that comes from feeling like you made good things happen, but it is worth it in the long run. Small changes in how much you notice the luck that you would otherwise overlook will have a big influence on the way your life turns out. Those small changes act like compounding interest that pays big dividends on your future decision-making. Left to our own devices, we will notice some of the bad luck but overlook most of the dumb luck.
- Striving to improve means being willing to fight the complacency that can come from a good decision leading to a good result. Learning from experience is what allows you to make better decisions as you go along. Resulting keeps you from sharpening the view into your crystal ball, making you a worse predictor of the future because you skip lessons you could get from the past. Don’t assume you can’t find worthwhile lessons while taking a victory lap.
Chapter 2
- HINDSIGHT BIAS The tendency to believe an event, after it occurs, was predictable or inevitable. It’s also been referred to as “knew-it-all-along” thinking or “creeping determinism.”
- MEMORY CREEP When what you know after the fact creeps into your memory of what you knew before the fact.
- Using a Knowledge Tracker reduces hindsight bias by clarifying what you did and didn’t know at the time of decision. Detailing what you knew and when you knew it helps prevent stuff that revealed itself after the fact from reflexively creeping into the before-the-fact box.
- Hindsight bias and compassion. Hindsight bias, like resulting, makes us lack compassion for ourselves and others. In order to think about what was reasonable for someone to know, we have to have empathy for them. We don’t generally take the time to do this, and instead make snap judgments. We are quick to blame the decision-maker for a bad outcome, neglecting to put ourselves in their shoes at the time of the decision (e.g., “You made us late to the airport with your stupid shortcut. How could you not have known there would be such bad traffic?”). That’s true even when we’re the decision-maker and we’re in our own shoes. This lack of empathy isn’t limited to bad outcomes. Hindsight bias makes us unduly punish ourselves and others for being careful or agonizing over a decision that works out well (e.g., “Why did I waste so much time worrying about the weather?”).
Chapter 3
- THE PARADOX OF EXPERIENCE Experience is necessary for learning, but individual experiences often interfere with learning.
- COUNTERFACTUAL A what-if. A possible outcome of a decision that is not the one that actually occurred. An imagined, hypothetical state of the world.
- It might feel good in the moment to accept your success without qualification or examination, but you’re going to lose out on so many learning opportunities by doing so. You’ll miss seeing the ways the outcome could have been even better. You’ll miss exploring whether a different decision might have increased the chances of the outcome you got. Or the better outcomes that were possible. Or the worse outcomes that might have happened. You’ll miss the chance to see when the result you got was lucky. We have to see outcomes for what they are, no more and no less, and that’s true whether what happened was great or terrible. We have to find symmetry in our willingness to explore all outcomes.
Chapter 4
- Unexpectedness is really hard to evaluate in retrospect. But if you do the work in advance, not only will your decisions get better because you will be laser focused on how the future might unfold, but you will also be able to tell when you didn’t anticipate the way things might turn out because you will actually have a record of what you were thinking at the time you made the decision.
SIX STEPS TO BETTER DECISION-MAKING
- Step 1—Identify the reasonable set of possible outcomes.
- Step 2—Identify your preference for each outcome—to what degree do you like or dislike each outcome, given your values?
- Step 3—Estimate the likelihood of each outcome unfolding.
- Step 4—Assess the relative likelihood of outcomes you like and dislike for the option under consideration.
- Step 5—Repeat Steps 1–4 for other options under consideration.
- Step 6—Compare the options to one another.
- Advice can be an excellent decision tool as long as you are explicit about your goals and values when you are seeking that advice. Otherwise, you run the risk that the person whose advice you are seeking will assume you share their preferences and will answer accordingly.
- For almost any decision you make, there are some outcomes you hope for and some you don’t. By explicitly adding preferences to the tree, you can see at a glance how many of the possible outcomes you like and how many you don’t. That’s why it’s helpful to order the possibilities by preference.
- Assessing the quality of a decision involves figuring out whether going for the upside is worth risking the downside.
- Pros and cons lists are flat, as if (payoff) size doesn’t matter. Because it is merely in list form, a pros and cons list treats the chance of an early arrival as equal to the possibility of getting into a serious traffic accident. Without explicit information about size, about the magnitude of any pro or con, it is unclear how you would compare the positive and negative sides of the list. If there are ten pros and five cons, does that mean you should go with the decision? It is impossible to say without information about the size of the payoffs, because without that you can’t figure out if the upside potential outweighs the downside.
- To figure out whether a decision is good or bad, you need to know not just the things that might reasonably happen and what could be gained or lost, but also the likelihood of each possibility unfolding. That means, to become a better decision-maker, you need to be willing to estimate those probabilities.
- This way of thinking, that there is only “right” and “wrong” and nothing in between, is one of the biggest obstacles to good decision-making. Because good decision-making requires a willingness to guess.
- All your knowledge, imperfect as it might be, means that your guess isn’t random. Although you don’t have perfect information, you have a lot more than no information about what the bison weighs.
- You almost always know something, and something is better than nothing. You might not get it perfect, but when it comes to decision-making, you get credit for showing your work.
- The knowledge that you can apply to any single estimate might be small, but it will make a difference in the quality of your decisions. Those differences, even if they are modest, will add up over time. Like compounding interest, those small increases in decision quality will pay big dividends in the long run.
- Your job as a decision-maker is to figure out two things: (1) What do I already know that will make my guess more educated? (2) What can I find out that will make my guess more educated?
- Just like decision-making, archery is not all or nothing, where you get points only for hitting the bull’s-eye and everything else is a miss. An archer gets points for hitting the target at all.
- The value of guessing isn’t in whether the guess is “right” or “wrong.” Your guesses are like the archer’s arrows. If you were omniscient and your guesses were always exactly right, you’d score all bull’s-eyes. When you make an educated guess, you’re aiming at the bull’s-eye and, though it’s likely you’ll miss the exact answer, like the archer you will still score points for getting in the vicinity. It’s okay to acknowledge that you’re not usually going to hit the bull’s-eye; the important thing is to take aim. Aiming for that bull’s-eye by making an educated guess gets you closer to a precise hit because it motivates you to assess what you know and what you don’t know. It motivates you to learn more. Recognizing the value in taking aim is the archer’s mind-set. Recognizing that guesses aren’t random, that all guesses are educated guesses, is the archer’s mind-set. Otherwise, your decision-making will more closely resemble a game of pin the tail on the donkey. You’ll be purposely blindfolding yourself to the target.
- The advantages of using terms that express probabilities Adding probability estimates to the decision tree will significantly improve the quality of your decisions versus simply identifying the possibilities and your preferences. To make better decisions, you have to consider the likelihood of any outcome occurring, including the ones you prefer and the ones you want to avoid. Without taking this extra step, it is difficult to assess the quality of any option on its own, and even harder to compare options.
- If you don’t ask a question, you won’t get an answer. One of the biggest benefits of the Archer’s Mind-set, of making yourself take aim, is that it prompts you to ask yourself those two questions we discussed earlier about the value of guessing: 1. What do I already know that will make my guess more educated? (And how can I apply that knowledge?) 2. What can I find out that will make my guess more educated? Taking aim will make you hungry to answer these questions, moving things from the “stuff you don’t know” box to the “stuff you know” box.
- Your chief weapon to improve your decisions is turning some of the “stuff you don’t know” into “stuff you know.”
- We have two main problems when it comes to the stuff we know. First, we just don’t know very much. Learning new stuff strengthens the foundation, making it sturdier. Second, the stuff we do know is riddled with inaccuracies. A lot of our beliefs are not perfectly true. We can think about these inaccuracies as cracks in the foundation. The only way to fix those cracks and shore up the foundation is to find the inaccuracies in our beliefs. And the only place we are going to find that information is in the universe of stuff we don’t know.
Chapter 5
- A lot of the stuff you don’t know lives in other people’s heads. That’s why getting feedback from other people about the things you believe and the decisions you make will be one of your best tools for extracting knowledge from the world.
- When you use these blunt terms, you and the other people in the conversation are often speaking different languages without even knowing it. If you communicate what you mean with precision, using probabilities expressed as percentages, the disagreement is immediately revealed. If I say something has a 30% chance of happening and you say it has a 70% chance of happening, we know we disagree. There is no ambiguity.
- Precision uncovers disagreement. It uncovers places where your belief is different from someone else’s belief. And that’s good, because you want to find out when you have something wrong. It gives you the chance to get it right.
- This imprecision also makes it harder for you to hold yourself accountable. The broader you allow the target area to be, the less likely you are to search for information that will help get you to a more precise answer. The leeway lets you off the hook not just with other people but also with yourself. That’s why precision matters.
- The range around your bull’s-eye estimate defines the size of your target area and serves a key purpose: it signals, to yourself and others, how uncertain you are of your guess. It reveals where you sit on the continuum between no knowledge and perfect knowledge. The further you are from perfect information, the larger the target you’re defining. The closer you are to having perfect information, the smaller the target you’re defining. On the rare occasions when you have perfect information and no uncertainty, your target will be all bull’s-eye.
- Offering a range around your estimate implies a question to the listener: Can you help me with this? When you create an upper and lower bound you express that you’re sitting somewhere between no knowledge and perfect knowledge. Letting the listener know that you’re unsure makes them more likely to share helpful information and perspectives because you have asked them for help.
- The goal is to set the narrowest range you can, where you would still be pretty shocked if the bull’s-eye wasn’t in the range.
- That is what reasonable means. Wharton professor Abraham Wyner suggests that a good way to get to a reasonable upper and lower bound is to ask yourself, “Would I be pretty shocked if the answer fell outside this range?” If you use that as your standard, your range will naturally reflect how much “educated” there is in your educated guess. “Pretty shocked” strikes a good balance between being overly exact (when you’re actually pretty uncertain) and having a range that’s so comfortably wide that the answer never falls outside the boundaries.
- What this shows is that we’re generally overselling our knowledge rather than underselling it. We’re usually much more certain of our guesses than the accuracy of our beliefs warrants.
- Approach the quality of the stuff you think you know with more skepticism. That skepticism will make you more willing to question your own beliefs and more eager to seek out what other people know. And that will improve the quality of your decisions.
- The tendency toward overconfidence vexes decision-making. In general, we don’t question our own beliefs enough. We have too much confidence in what we think we know and we don’t have a realistic view of what we don’t know. Whether it’s about the things we believe to be true, our opinions, or how we think the future might unfold, we could all use a healthy dose of skepticism.
- Making it a habit to ask yourself, “If I were wrong, why would that be?” helps get you to approach your own beliefs with more skepticism, disciplining your naturally overly optimistic view of what you know and getting you more focused on what you don’t know. Asking yourself why you might be wrong will also increase the accuracy of the things you believe, the opinions you hold, and how you think the future might unfold. That’s because when you ask yourself what information you could discover that would make you change your mind, you can actually go find out some of that stuff. And in asking and answering the question, you are just more likely to go look for it. Even when the information that would change your mind isn’t readily available, it might be available in the future.
Chapter 6
- It doesn’t matter how good the quality of your decision process is if the input into that process is junk. That input is your beliefs, and there is a lot of junk in there. It’s like you have a KICK ME sign on your back when it comes to identifying inaccuracies in what you know and believe.
- Resulting is a good example of an inside view problem. The outcomes that you happen to observe cast a shadow over your ability to see those outcomes in the context of all the things that objectively could have happened. This affects the quality of the lessons you learn. If you experienced a different outcome, you would learn a different lesson. If you experienced a different outcome, you would assess the quality of the decision that preceded the outcome differently.
- Part of why it’s easier to see other people more objectively than you can see yourself is that you are motivated to protect your beliefs when it comes to reasoning about your own situation. Your beliefs form the fabric of your identity.
- You aren’t motivated in the same way when reasoning about other people’s problems because you aren’t endowed to other people’s beliefs the way you are endowed to your own.
- OUTSIDE VIEW What is true of the world, independent of your own perspective. The way that others would view the situation you’re in. What you’ve probably figured out by now is that the remedy for the inside view is to open yourself up as much as possible to other people’s perspectives and what’s true of the world in general, independent of your own experiences, because that’s where the corrective information lives. That’s the outside view.
- The value of getting other people’s perspectives is not just that they know facts that you don’t know that might be helpful to you. It is not just that they might be able to correct inaccuracies in the facts you think you know. It is that even if they had the exact same facts as you, they might view those facts differently. They might come to a very different conclusion given the exact same information.
- More than 90% of professors rate themselves as better-than-average teachers. About 90% of Americans rate their driving ability as better than average. Only 1% of students think their social skills are below average. Obviously, it’s impossible for over 90% of the population to be better than the median at something. Yet even though we know that half the population must be, by definition, below the median (the outside view), we seem to rarely think that we could be part of that half (the inside view). This phenomenon is called the better-than-average effect.
- Accuracy lives in the intersection of the outside view and the inside view. The outside view disciplines the distortions that live in the inside view. That’s why it’s important to start with the outside view and anchor there, considering things like what’s true of the world in general or the way someone else would view your situation.
- Being smart doesn’t make you less susceptible to the inside view. If anything, it makes it worse. It straps your beliefs into the driver’s seat more firmly. Research across a variety of settings has shown that being smart makes you better at motivated reasoning, the tendency to reason about information to confirm your prior beliefs and arrive at the conclusion you desire. And just to be clear, in this case “better” is not a good thing.
- If you think about it, this makes a lot of sense. Smart people often think more highly of their beliefs and opinions. They are less likely to think the stuff they know needs correcting. They have more confidence in what their intuition or gut tells them. After all, they’re really smart. Why wouldn’t they have more confidence in those things? When you’re smart, you’re naturally less skeptical about the things you believe to be true. Smart people are also better at constructing convincing arguments that support their views and reinforce the things they believe to be true. Smart people are better at spinning narratives that convince other people that they are right, not in the service of misleading those people but in the service of keeping the fabric of their own identity from tearing. The combination of motivated reasoning, the propensity to mislead yourself, and an overconfidence in intuition makes smart people less likely to seek feedback. When they do seek feedback, their ability to spin a persuasive narrative makes other people less likely to challenge them. That means that the smarter you are, the more vigilant you have to be about getting to the outside view.
- Base rates: An easy way to get the outside view One way to get to the outside view is to make it a habit, as part of your decision process, to ask yourself what is true of the world in general, independent of anyone’s point of view. BASE RATE How likely something is to happen in situations similar to the one you’re considering.
- Educating yourself about what is true of most people in your situation will give you a glimpse into the outside view that will improve your ability to compare options (such as buying in-home equipment, joining a gym, or doing something else).
- Just as Knowledge Tracking gets you thinking about what you know and don’t know, motivates you to go find out more, and creates a record of your beliefs at the time of your decision (which creates accountability and prevents memory creep), Perspective Tracking has many of the same benefits. Incorporating a habit of Perspective Tracking into your decision process helps scooch your beliefs out of the driver’s seat. It helps you view your gut feelings with more skepticism. Perspective Tracking forces you to consider the outside view. And to consider the outside view, you must seek it out: both how other people might view the decision and what is true of the world in general.
- The outside view brings luck more into focus. That’s why Perspective Tracking is so important when it comes to success.
Chapter 7
- Here’s how long the average person spends per week on these decisions: • What do you want to eat?—150 minutes a week. • What do you want to watch on Netflix?—50 minutes a week. • What do you want to wear?—90 to 115 minutes a week. This means that if you’re like most people, you’re spending a lot of time in analysis paralysis. The time the average person spends deciding what to eat, watch, and wear adds up to 250 to 275 hours per year. That’s a lot of time spent on decisions that intuitively feel like they are inconsequential.
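A quick check that the weekly figures square with the yearly total (assuming 52 weeks per year):

```python
# Converting the per-week minutes above into hours per year.
low  = (150 + 50 + 90)  * 52 / 60   # ~251 hours/year
high = (150 + 50 + 115) * 52 / 60   # 273 hours/year
print(f"{low:.0f} to {high:.0f} hours per year")  # 251 to 273
```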
- It might be counterintuitive, but the decision-making framework offered in this book will actually help you go faster, and here’s why: The key to achieving the right time-accuracy balance is figuring out what the penalty is for making a lower-quality decision than you would have if you had taken more time.
- The smaller the impact of a poor outcome, the faster you can go. The bigger the impact, the more time you should take.
- A recurring theme of this book has been that you should be laser focused on looking for ways to extract information from the world, transforming some of the universe of stuff you don’t know into stuff you do know. The information you gather is not just about learning new facts, or figuring out how things work, or refining your estimates of how things might turn out. It’s also about figuring out your own preferences, your own likes and dislikes.
- One of the best ways to figure out your likes and dislikes is to try stuff. The faster you make decisions, the more stuff you can try. That means more opportunities to experiment and poke at the world. That means more opportunities for you to learn new stuff, including new stuff about yourself.
- REPEATING OPTIONS When the same type of decision comes up over and over again, you get repeated chances to choose options, including options you may have rejected in the past. Decisions that repeat also provide opportunities for choosing things you are less certain about, like a food you’ve never tried or a new TV show, because you don’t get penalized as heavily for taking those gambles. At little cost, you get information in return about your likes and dislikes, and you might find some surprises in there.
- FREEROLL A situation where there is an asymmetry between the upside and the downside because the potential losses are limited. The bigger that asymmetry, the more you have to gain and the bigger the freeroll. What gets missed is that, for freerolls, the potential big impact on your happiness is one-sided in your favor.
- When considering whether a decision has limited downside potential, it’s crucial to think about the cumulative effects of making the same decision repeatedly, rather than focusing on just the one-time, short-run potential harm.
- When a decision is hard, that means it’s easy. The very thing that slows you down—having multiple options that are very close in quality—is actually a signal that you can go fast, because this tells you that whichever option you choose, you can’t possibly be that wrong, since both options have similar upside and downside potential. Instead of thinking about the similarity between options in terms of their overall potential payoffs, both the positives and the negatives, we mostly get focused on anxiety about the downside.
- This relates to the paradox of choice (popularized by psychologist Barry Schwartz): this kind of sheep-in-wolf’s-clothing decision is more likely to come up the more options you have to choose from. The greater the number of available options, the greater the likelihood that more than one of those options will look pretty good to you. The more options that look pretty good to you, the more time you spend in analysis paralysis. That’s the paradox: more choice, more anxiety.
- That’s why identifying low-impact decisions, especially ones that repeat, is so important. Those types of low-risk decisions give you the opportunity to experiment. Experimentation gets the world to tell you what works and what doesn’t work and helps you figure out your preferences, your likes and dislikes.
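The experiment-on-cheap-repeating-decisions idea maps neatly onto a classic explore/exploit strategy. A minimal sketch assuming an epsilon-greedy rule; this framing (and the dinner example) is my illustration, not the book’s:

```python
import random

def choose(options: dict[str, float], epsilon: float = 0.2) -> str:
    """Epsilon-greedy: usually pick the option with the best rating so far,
    but sometimes try something at random to learn your preferences."""
    if random.random() < epsilon:
        return random.choice(list(options))  # explore: a low-cost gamble
    return max(options, key=options.get)     # exploit: current favorite

# Running average of how much you liked each dinner option so far.
ratings = {"thai": 7.5, "pizza": 6.0, "new ramen place": 0.0}
print(choose(ratings))
```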
- Being quit-to-itive is not intuitive Because of the way the human mind works, we tend to view decisions as permanent and final, particularly if they are high impact. We don’t think much in advance about the option to quit. But once you look at decisions through the frame of quit-to-itiveness, you’ll find that for many decisions you thought (or simply assumed) you couldn’t unwind, the cost isn’t prohibitively high.
Chapter 8
Think Positive, but Plan Negative: Identifying Our Difficulties in Executing on Our Goals
- Imagining how you might fail doesn’t make failure materialize. In fact, there’s a lot of value in picturing the obstacles that might slow you down or get you lost, preventing you from reaching your destination. You can think about this value as the difference between using an old-fashioned paper map and using a navigation app such as Waze. The paper map allows you to see your destination and the different routes that could lead you there. But all those routes appear as clear roads. A paper map can’t show you road closures, heavy traffic, accidents, or speed traps. It can’t show you the obstacles that might impede your progress. But Waze can. That’s why people rarely use paper maps anymore. When it comes to navigation, negative thinking gets you to your destination more reliably.
- Gabriele Oettingen, a professor of psychology at New York University, has conducted over two decades of research showing that anticipating the ways things might go wrong on the path to achieving your goals helps you more successfully reach your destination. For example, among people in a program trying to lose at least fifty pounds, those who imagined the ways in which they might fail lost twenty-six pounds more on average than those who engaged in positive visualization alone. She found that mental contrasting provided a similar boost across a variety of domains, including getting better grades, finishing school projects on time, finding a job, recovering from surgery, or even following through with asking a crush out on a date. MENTAL CONTRASTING Imagining what you want to accomplish and confronting the obstacles that might stand in the way of accomplishing it.
- But the mental contrasting research tells us that the temporary discomfort from imagining failure is worth it, because embracing that discomfort makes it more likely that you’ll actually experience success. Mental pain leads to real-world gains.
- PROSPECTIVE HINDSIGHT Imagining yourself at some time in the future, having succeeded or failed at a goal, and looking back at how you arrived at that destination.
- STATUS QUO BIAS Our tendency to believe that the way things are today will remain the same in the future. When you look forward from the present, status quo bias distorts your view. But if you plan from an imagined point in the future and look back toward the present, you can improve your ability to see beyond what’s immediately in front of you—not just the obstacles farther along the route but how conditions might change.
- Looking at a past version of yourself is a little more like seeing a different person, like when you’re listening to your friend complain about all the jerks they’ve dated. You see the situation in a more objective and detached way. That’s why it’s easy to come up with a list of beliefs that a long-ago version of you held that present you realizes aren’t so solid. Prospective hindsight allows you to imagine your future self looking back at your present self. You can think about the goals and decisions of “that person” more clearly from that vantage point than when you’re caught in the gravitational pull of the present moment.
- Premortems and Backcasting: Whether you deserve an autopsy or a parade, you should know why in advance.
- STEPS FOR A PREMORTEM (1) Identify the goal you’re trying to achieve or a specific decision you’re considering. (2) Figure out a reasonable time period for achieving the goal or for the decision to play out. (3) Imagine it’s the day after that period of time and you didn’t achieve the goal, or the decision worked out poorly. Looking back from that imagined point in the future, list up to five reasons why you failed due to your own decisions and actions or those of your team. (4) List up to five reasons why you failed due to actions outside your control. (5) If you’re doing this as a team exercise, have each member do steps (3) and (4) independently, prior to a group discussion of reasons.
- Premortem reasons fall into two categories: • Stuff within your control—your own decisions and actions or, as is often the case in a business setting, the decisions and actions of your team. • Stuff outside of your control—in addition to luck, the decisions and actions of people you have no influence over. An effective premortem should produce reasons for failure within each category.
- Research suggests that when you combine mental time travel and mental contrasting, you can produce 30% more reasons why something might fail.
- BACKCASTING Imagining yourself at some time in the future, having succeeded at achieving a goal, and looking back at how you arrived at that destination. In a backcast, you imagine your decision has worked out or you’ve reached your goal and ask, “Why did that happen?” The steps for backcasting are similar to those for a premortem. STEPS FOR A BACKCAST (1) Identify the goal you’re trying to achieve or a specific decision you’re considering. (2) Figure out a reasonable time period for achieving the goal or for the decision to play out. (3) Imagine it’s the day after that period of time and you achieved the goal, or the decision worked out well. Looking back from that imagined point in the future, list up to five reasons why you succeeded due to your own decisions and actions or those of your team. (4) List up to five reasons why you succeeded due to actions outside of your control. (5) If you’re doing this as a team exercise, have each member do steps (3) and (4) independently, prior to a group discussion of reasons.
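Since the premortem and the backcast share the same scaffold and differ only in the imagined outcome, one template can generate both sets of prompts. A minimal sketch of that shared structure (the wording of the prompts is mine):

```python
def mental_time_travel_prompts(goal: str, horizon: str, succeeded: bool) -> list[str]:
    """Generate premortem (succeeded=False) or backcast (succeeded=True)
    prompts following steps (1)-(5) above."""
    outcome = "achieved" if succeeded else "failed to achieve"
    return [
        f"Imagine it is the day after {horizon} has passed and you {outcome} your goal: {goal}.",
        "List up to five reasons rooted in your own decisions and actions (or your team's).",
        "List up to five reasons rooted in things outside your control, luck included.",
        "If this is a team exercise, collect everyone's lists before the group discussion.",
    ]

for prompt in mental_time_travel_prompts("launch the product", "six months", succeeded=False):
    print(prompt)
```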
- Now that you have the output, how can you use what you’ve learned in advance to improve your likelihood of succeeding? The first thing you should always consider after doing these exercises is whether you want to modify your goal or change your decision, given what you’ve just learned.
- Ulysses contracts can involve three types of advance commitments: • Like Ulysses, you can physically prevent yourself from making poor decisions. • You can raise barriers, making it harder to execute on actions that will defeat your goals. When you raise barriers, you’re not physically preventing yourself from acting, like when you tether yourself to a mast. But you are increasing friction to make it more difficult to tamper with your plans. Raised barriers also provide you with a moment to stop and think before acting. • You can lower barriers, reducing the friction to execute on actions that advance you toward success.
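A toy illustration of the “raise barriers” flavor: wrap a tempting action in friction, a forced pause plus an explicit confirmation. The delay and the prompt are my invention, purely to make the idea concrete:

```python
import time

def with_raised_barrier(action, pause_seconds: int = 30):
    """Add friction before an action that could defeat your goals:
    a cooling-off pause, then an explicit confirmation."""
    def guarded(*args, **kwargs):
        print(f"Cooling off for {pause_seconds}s before '{action.__name__}'...")
        time.sleep(pause_seconds)
        if input("Still sure? Type YES to proceed: ").strip() == "YES":
            return action(*args, **kwargs)
        print("Skipped. The barrier held.")
    return guarded

@with_raised_barrier
def impulse_buy():
    print("Order placed.")
```

Lowering barriers works the same way in reverse: remove the pause and confirmation from actions that advance your goals, so the easy path is the one you precommitted to.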
- The Dr. Evil Game: Outthinking the evil genius making sure you fail (P.S. The evil genius is you) When you conduct a premortem, you are considering an unintentional future failure. Your goal was to succeed but you failed. But what if you imagined the ways in which you could intentionally make yourself fail? That would be the ultimate exercise in negative thinking, building on what you learned from prospective hindsight. The decision tool for doing this is the Dr. Evil game.
- STEPS TO PLAY THE DR. EVIL GAME (1) Imagine a positive goal. (2) Imagine that Dr. Evil has control of your brain, causing you to make decisions that will guarantee failure. (3) Imagine the decisions Dr. Evil would have you make; any given instance must have a good enough rationale that it won’t be noticed by you or others examining that decision. (4) Write down those decisions.
- CATEGORY DECISION When you identify a category of poor decisions that will be hard to spot except in the aggregate, you can decide in advance what options you can and cannot choose that fall within that category.
- A common practice of successful professional investors is to make category decisions to avoid investments outside their circle of competence. In facing an opportunity outside their realm of expertise, particularly one that promises juicy returns, investors run the risk of fooling themselves into thinking they can make a winning decision. The temptation to wander outside the circle of competence is especially strong if those boundaries are not in place. On the other hand, if they say “I’m a seed investor,” or “I invest only in assets of REITs that are in restructuring or bankruptcy,” they are less likely to consider anything else that comes their way.
- HEDGE: When you identify the possibility of bad luck, there are things you can do in advance to soften the impact of that bad luck. These things are called hedges. There are three key features of a hedge: 1. A hedge reduces the impact of bad luck when it occurs. 2. A hedge has a cost. 3. You hope you never use it.
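The three features translate directly into expected-value arithmetic: a hedge is a known small cost traded against a rare large loss. A sketch with made-up, insurance-style numbers:

```python
# Made-up numbers: a 2% chance of a $50,000 loss, hedged for a $1,500 premium.
p_bad, loss, premium = 0.02, 50_000, 1_500

expected_cost_unhedged = p_bad * loss  # $1,000 on average, but $50,000 when it hits
expected_cost_hedged = premium         # fixed and known in advance

print(f"unhedged: expect {expected_cost_unhedged:,.0f}, worst case {loss:,}")
print(f"hedged:   pay {expected_cost_hedged:,} for certain, worst case {premium:,}")
# The hedge costs more in expectation (feature 2) but caps the damage when
# bad luck occurs (feature 1); in most years you never use it (feature 3).
```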
Chapter 9
If You Want to Know What Someone Thinks, Stop Infecting Them with What You Think
- The problem with offering your opinion first when soliciting someone’s advice is that it significantly increases the likelihood that they will express the same belief back to you.
- One of the best tools for improving your decision-making is to get other people’s perspectives. But you can only do that if you get their actual perspective, instead of your perspective parroted back to you.
- If there’s one thing you’ve learned from this book, it’s that the way you naturally interact with the world makes it much more likely you’re going to see the places where the maps overlap, both noticing the stuff that agrees with you and actively seeking it out. You’ve probably also figured out that the exciting stuff happens where those maps diverge. That’s where you find corrective information, and the stuff you don’t know. Exploring that divergence allows you to get closer to what’s objectively true. Where the maps diverge and your opinion and somebody else’s are far apart, three things might be true, and they are all good for improving the quality of your decisions:
1. The objective truth lies somewhere between the two beliefs. When two people are equally well informed and hold opposite opinions, the truth most likely lies between the two. When that’s the case, it’s obvious why both people benefit from having discovered the divergence: both get the opportunity to moderate their beliefs and get closer to the objective truth.
2. You could be wrong, and the other person could be right. If you hold an inaccurate belief, the quality of any decision informed by that belief will suffer. A rational person would welcome the chance to change an inaccurate belief, but we know people like learning they’re wrong about as much as those doctors liked Semmelweis telling them they were killing patients by not washing their hands.
3. You could be right, and the other person could be wrong. When this is the case, you might think that only the person who is wrong benefits by getting the chance to reverse an inaccurate belief, because your belief was right and will remain unchanged. But you benefit from the exchange as well, because the act of explaining your belief and conveying it to someone else improves how well you understand it. The better you understand why you believe the things you do, the higher in quality those beliefs become.
- The only way somebody can know that they’re disagreeing with you is if they know what you think first. Keeping that to yourself when you elicit feedback makes it more likely that what they say is actually what they believe.
- First, it’s hard for people to publicly disagree with team members who have more expertise or are of higher status, whether it’s Einstein or the CEO. But it’s worse than that because of the halo effect, which is the tendency to give the opinions of highly successful people much more weight across the board, even in areas in which they have no expertise. No one wants to step up and contradict Einstein’s feedback, whether it’s about the theory of relativity or whether to sue the landlord.
- HALO EFFECT A cognitive bias in which a positive impression of a person in one area causes you to have a positive view of that person in other, unrelated, areas. Second, for all the value that expertise provides, subject-matter experts aren’t impervious to bias. As Philip Tetlock has shown, when you’re a subject-matter expert, you have a tendency to get entrenched in your worldview, which makes it harder to climb out from that trench and see things from a perspective that differs from your own robust model of the world.
- Group members with lower status may have different, valuable perspectives. Sometimes, they see innovative solutions others don’t see because they aren’t as anchored to the status quo. In a macro sense, when you look at the history of the world, each subsequent generation, by offering its different view, becomes responsible for innovative leaps and paradigm shifts. By initially anonymizing feedback, those outside-the-box perspectives get a chance to breathe.
- If people don’t have the relevant information to provide high-quality feedback, all the quarantining in the world won’t help your decisions. Their feedback will only be as good as the input you give them. In other words, junk in, junk out—quarantines or otherwise.
- We’ve identified a key bottleneck in the decision process: the quality of the feedback you get is limited by the quality of the information you put into the process. Our narratives naturally live in the inside view, biased to support our own perspective of the world. It logically follows that one way to address this problem is to get to the outside view, putting yourself in the shoes of the person giving feedback instead of the feedback seeker. You can get to the outside view by asking yourself, “If somebody were seeking my opinion about this type of decision, what are the things that I would need to know to feel like I could give them high-quality feedback?” Make a checklist of those details and then provide them to anyone you are seeking advice from. You can do this for any decision, but it’s especially useful for decisions that repeat because you can think about it before you’re facing any particular instance of that decision. When you’re in the midst of a decision, you’ve probably already formed an opinion about your preferred option. Once that happens, your preference will distort what information you think you would need to know.
- By building this checklist in advance, you won’t be as influenced by the specifics of a decision on which you’ve already formed an opinion, making it easier to be objective and get to the outside view.
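A sketch of such a checklist for one repeating decision type, a hypothetical job-offer decision; the items are examples I’ve invented, not the book’s list:

```python
# Details an advisor would need before they could give high-quality
# feedback on a job-offer decision (hypothetical checklist).
JOB_OFFER_CHECKLIST = [
    "compensation and benefits, including equity",
    "commute and remote-work policy",
    "growth path and who you'd report to",
    "why you're considering leaving your current role",
    "your other live options, including staying put",
]

def briefing_is_complete(provided: set[str]) -> bool:
    """Only ask for advice once every checklist item has been shared."""
    missing = [item for item in JOB_OFFER_CHECKLIST if item not in provided]
    for item in missing:
        print("still missing:", item)
    return not missing
```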
- You’re going to make thousands and thousands of decisions across your lifetime, some of which will work out and some of which won’t. The goal of good decision-making can’t be that every single decision will work out well. Because of the intervention of luck and incomplete information, that’s an impossible goal. The decisions you make are like a portfolio of investments. Your goal is to make sure that the portfolio as a whole advances you toward your goals, even though any individual decision in that portfolio might win or lose.
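That portfolio framing is easy to check by simulation. A minimal sketch, assuming each decision is an independent bet with a modest positive edge; the 55% win rate and unit payoffs are illustrative parameters, not figures from the book:

```python
import random

def simulate_lifetime(n_decisions=1_000, p_good=0.55, gain=1.0, loss=1.0):
    """One 'lifetime' of decisions: each works out with probability p_good."""
    return sum(gain if random.random() < p_good else -loss
               for _ in range(n_decisions))

lifetimes = [simulate_lifetime() for _ in range(10_000)]
print(f"average result: {sum(lifetimes) / len(lifetimes):+.0f}")
print(f"worst lifetime: {min(lifetimes):+.0f}")
# ~45% of individual decisions lose, yet nearly every simulated lifetime
# ends ahead: judge the process, not any single outcome.
```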
GENERAL REFERENCES AND SUGGESTED FURTHER READING
Ariely, Dan. Predictably Irrational: The Hidden Forces That Shape Our Decisions. Revised and expanded edition. New York: HarperCollins, 2009.
Brockman, John, ed. Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction. New York: Harper Perennial, 2013.
Cialdini, Robert. Influence: The Psychology of Persuasion. Revised edition. New York: HarperCollins, 2009.
Dalio, Ray. Principles: Life and Work. New York: Simon & Schuster, 2017.
Duhigg, Charles. The Power of Habit: Why We Do What We Do in Life and Business. New York: Random House, 2012.
———. Smarter Faster Better: The Secrets of Being Productive in Life and Business. New York: Random House, 2016.
Ellenberg, Jordan. How Not to Be Wrong: The Power of Mathematical Thinking. New York: Penguin, 2014.
Epstein, David. Range: Why Generalists Triumph in a Specialized World. New York: Riverhead, 2019.
Feynman, Richard. “Cargo Cult Science.” Engineering and Science 37, no. 7 (June 1974): 10–13.
———. The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman. New York: Perseus Publishing, 1999.
Firestein, Stuart. Ignorance: How It Drives Science. New York: Oxford University Press, 2012.
Gilbert, Daniel. Stumbling on Happiness. New York: Alfred A. Knopf, 2006.
Haidt, Jonathan. The Righteous Mind: Why Good People Are Divided by Politics and Religion. New York: Pantheon Books, 2012.
Holmes, Jamie. Nonsense: The Power of Not Knowing. New York: Crown, 2015.
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
Kahneman, Daniel, and Amos Tversky. “On the Psychology of Prediction.” Psychological Review 80, no. 4 (July 1973): 237–51.
Levitt, Steven, and Stephen Dubner. Freakonomics: A Rogue Economist Explores the Hidden Side of Everything. New York: HarperCollins, 2005.
Loewenstein, George, Daniel Read, and Roy Baumeister, eds. Time and Decision: Economic and Psychological Perspectives on Intertemporal Choice. New York: Russell Sage Foundation, 2003.
Marcus, Gary. Kluge: The Haphazard Evolution of the Human Mind. New York: Houghton Mifflin, 2008.
Marcus, Gary, and Ernest Davis. Rebooting AI: Building Artificial Intelligence We Can Trust. New York: Pantheon, 2019.
Mauboussin, Michael. The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. Boston: Harvard Business Review Press, 2012.
———. Think Twice: Harnessing the Power of Counterintuition. Boston: Harvard Business School Publishing, 2009.
Mauboussin, Michael, Dan Callahan, and Darius Majd. “The Base Rate Book: Integrating the Past to Better Anticipate the Future.” Credit Suisse Global Financial Strategies, September 26, 2016.
Merton, Robert K. “The Normative Structure of Science,” 1942. In The Sociology of Science: Theoretical and Empirical Investigations, edited by Norman Storer. Chicago and London: University of Chicago Press, 1973.
Mill, John Stuart. On Liberty. London: John W. Parker and Son, 1859.
Moore, Don. Perfectly Confident: How to Calibrate Your Decisions Wisely. New York: Harper Business, 2020.
Page, Scott. The Model Thinker: What You Need to Know to Make Data Work for You. New York: Hachette, 2018.
Parrish, Shane. The Great Mental Models: General Thinking Concepts. Ottawa, Canada: Latticework, 2020.
Pink, Daniel. When: The Scientific Secrets of Perfect Timing. New York: Riverhead, 2018.
Pinker, Steven. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. New York: Viking, 2018.
Rescher, Nicholas. Luck: The Brilliant Randomness of Everyday Life. New York: Farrar, Straus and Giroux, 1995.
Shermer, Michael. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies: How We Construct Beliefs and Reinforce Them as Truths. New York: Times Books, 2011.
Silver, Nate. The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t. New York: Penguin, 2012.
Surowiecki, James. The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations. New York: Random House, 2004.
Taleb, Nassim. Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. New York: Random House, 2004.
Tetlock, Philip. Expert Political Judgment: How Good Is It? How Can We Know? Princeton, NJ: Princeton University Press, 2005.
Tetlock, Philip, and Dan Gardner. Superforecasting: The Art and Science of Prediction. New York: Crown, 2015.
Thaler, Richard. Misbehaving: The Making of Behavioral Economics. New York: W. W. Norton & Co., 2015.
Thaler, Richard, and Cass Sunstein. Nudge: Improving Decisions About Health, Wealth, and Happiness. New York: Penguin, 2008.
Tversky, Amos, and Daniel Kahneman. “Judgment Under Uncertainty: Heuristics and Biases.” ONR Technical Report (August 1973).
Von Neumann, John, and Oskar Morgenstern. Theory of Games and Economic Behavior. Princeton, NJ: Princeton University Press, 2004.
Weinberg, Gabriel, and Lauren McCann. Super Thinking: The Big Book of Mental Models. New York: Penguin/Portfolio, 2019.
SELECTED REFERENCES
Arbesman, Samuel. The Half-Life of Facts: Why Everything We Know Has an Expiration Date. New York: Current, 2012.
Ariely, Dan, and Jeff Kreisler. Dollars and Sense: How We Misthink Money and How to Spend Smarter. New York: Harper, 2017.
Ariely, Dan, and Klaus Wertenbroch. “Procrastination, Deadlines, and Performance: Self-Control by Precommitment.” Psychological Science 13, no. 3 (2002): 219–24.
Arkes, Hal, and Catherine Blumer. “The Psychology of Sunk Cost.” Organizational Behavior and Human Decision Processes 35, no. 1 (1985): 124–40.
Arvai, Joseph, and Ann Froschauer. “Good Decisions, Bad Decisions: The Interaction of Process and Outcome in Evaluations of Decision Quality.” Journal of Risk Research 13, no. 7 (October 2010): 845–59.
Asch, Solomon. “Opinions and Social Pressure.” Scientific American 193, no. 5 (1955): 31–35.
Bar-Eli, Michael, Ofer Azar, Ilana Ritov, Yael Keidar-Levin, and Galit Schein. “Action Bias Among Elite Soccer Goalkeepers: The Case of Penalty Kicks.” Journal of Economic Psychology 28, no. 5 (October 2007): 606–21.
Baron, Jonathan, and John Hershey. “Outcome Bias in Decision Evaluation.” Journal of Personality and Social Psychology 54, no. 4 (1988): 569–79.
Browne, Basil. “Going on Tilt: Frequent Poker Players and Control.” Journal of Gambling Behavior 5, no. 1 (March 1989): 3–21.
Burch, E. Earl, and William Henry. “Opportunity Costs: An Experimental Approach.” Accounting Review 45, no. 2 (1970): 315–21.
Cavojova, Vladimira, Jakub Srol, and Magdalena Adamus. “My Point Is Valid, Yours Is Not: Myside Bias in Reasoning About Abortion.” Journal of Cognitive Psychology 30, no. 7 (2018): 656–69.
Chapman, Gretchen, and Eric Johnson. “Anchoring, Activation, and the Construction of Values.” Organizational Behavior and Human Decision Processes 79, no. 2 (August 1999): 115–53.
Clear, James. Atomic Habits: An Easy & Proven Way to Build Good Habits & Break Bad Ones. New York: Avery, 2018.
Cochran, Winona, and Abraham Tesser. “The ‘What the Hell’ Effect: Some Effects of Goal Proximity and Goal Framing on Performance.” In Striving and Feeling: Interactions Among Goals, Affect, and Self-Regulation, edited by L. Martin and Abraham Tesser. New York: Lawrence Erlbaum Associates, 1996.
Coyle, Daniel. The Culture Code: The Secrets of Highly Successful Groups. New York: Bantam, 2018.
Cross, K. Patricia. “Not Can, But Will College Teaching Be Improved?” New Directions for Higher Education 17 (1977): 1–15.
De Wit, Frank, Lindred Greer, and Karen Jehn. “The Paradox of Intragroup Conflict: A Meta-Analysis.” Journal of Applied Psychology 97, no. 2 (2012): 360–90.
Dekking, F. M., C. Kraaikamp, H. P. Lopuhaä, and L. E. Meester. A Modern Introduction to Probability and Statistics: Understanding Why and How. London: Springer Science & Business Media, 2005.
Dion, Karen, Ellen Berscheid, and Elaine Walster. “What Is Beautiful Is Good.” Journal of Personality and Social Psychology 24, no. 3 (1972): 285–90.
Duarte, Jose, Jarret Crawford, Charlotta Stern, Jonathan Haidt, Lee Jussim, and Philip Tetlock. “Political Diversity Will Improve Social Psychological Science.” Behavioral and Brain Sciences 38 (January 2015): 1–58.
Dunning, David. “The Dunning–Kruger Effect: On Being Ignorant of One’s Own Ignorance.” In Advances in Experimental Social Psychology, volume 44. San Diego: Academic Press, 2011.
Edwards, Kari, and Edward Smith. “A Disconfirmation Bias in the Evaluation of Arguments.” Journal of Personality and Social Psychology 71, no. 1 (1996): 5–24.
Eskreis-Winkler, Lauren, Katherine Milkman, Dena Gromet, and Angela Duckworth. “A Large-Scale Field Experiment Shows Giving Advice Improves Academic Outcomes for the Advisor.” PNAS 116, no. 30 (July 23, 2019): 14808–810.
Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1957.
Fischhoff, Baruch. “Hindsight Is Not Equal to Foresight: The Effect of Outcome Knowledge on Judgment Under Uncertainty.” Journal of Experimental Psychology: Human Perception and Performance 1, no. 3 (August 1975): 288–99.
Franz, Timothy, and James Larson. “The Impact of Experts on Information Sharing During Group Discussion.” Small Group Research 33, no. 4 (August 2002): 383–411.
Frederick, Shane, George Loewenstein, and Ted O’Donoghue. “Time Discounting and Time Preference: A Critical Review.” Journal of Economic Literature 40, no. 2 (June 2002): 351–401.
Friedman, Jeffrey. War and Chance: Assessing Uncertainty in International Politics. New York: Oxford University Press, 2019.
Friedman, Jeffrey, and Richard Zeckhauser. “Handling and Mishandling Estimative Probability: Likelihood, Confidence, and the Search for Bin Laden.” Intelligence and National Security 30 (2015): 77–99.
Gawande, Atul. The Checklist Manifesto: How to Get Things Right. New York: Metropolitan Books, 2009.
Gigerenzer, Gerd, Ulrich Hoffrage, and Heinz Kleinbölting. “Probabilistic Mental Models: A Brunswikian Theory of Confidence.” Psychological Review 98, no. 4 (1991): 506–28.
Gigone, Daniel, and Reid Hastie. “The Common Knowledge Effect: Information Sharing and Group Judgment.” Journal of Personality and Social Psychology 65, no. 5 (1993): 959–74.
Gilbert, Daniel. “How Mental Systems Believe.” American Psychologist 46, no. 2 (February 1991): 107–19.
Gilbert, Daniel, Roman Tafarodi, and Patrick Malone. “You Can’t Not Believe Everything You Read.” Journal of Personality and Social Psychology 65, no. 2 (August 1993): 221–33.
Gino, Francesca, Don Moore, and Max Bazerman. “No Harm, No Foul: The Outcome Bias in Ethical Judgments.” Harvard Business School NOM Working Paper 08-080, 2009.
Gino, Francesca, and Gary Pisano. “Why Leaders Don’t Learn from Success.” Harvard Business Review 89, no. 4 (April 2011): 68–74.
Godker, Katrin, Peiran Jiao, and Paul Smeets. “Investor Memory.” July 2019 draft. www.uibk.ac.at/credence-goods/events/sfb-seminar/documents/sfb_seminar_19_smeets_paper.pdf.
Gollwitzer, Peter, and Paschal Sheeran. “Implementation Intentions and Goal Achievement: A Meta-Analysis of Effects and Processes.” Advances in Experimental Social Psychology 38 (2006): 69–119.
Haidt, Jonathan, and Richard Reeves, eds. All Minus One: John Stuart Mill’s Ideas on Free Speech Illustrated. New York: Heterodox Academy, 2018.
Hammond, John, Ralph Keeney, and Howard Raiffa. “The Hidden Traps in Decision Making.” Harvard Business Review 76, no. 5 (September–October 1998): 47–58.
Harford, Tim. “Why Living Experimentally Beats Taking Big Bets.” Financial Times, February 14, 2019.
Hastorf, Albert, and Hadley Cantril. “They Saw a Game: A Case Study.” Journal of Abnormal and Social Psychology 49, no. 1 (January 1954): 129–34.
Heath, Chip, and Dan Heath. Decisive: How to Make Better Choices in Life and Work. New York: Crown, 2013.
Heck, Patrick, Daniel Simons, and Christopher Chabris. “65% of Americans Believe They Are Above Average in Intelligence: Results of Two Nationally Representative Surveys.” PLoS ONE 13, no. 7 (2018): e0200103.
Horowitz, Kate. “Why Making Decisions Stresses Some People Out.” MentalFloss.com, February 27, 2018.
Hughes, Jeffrey, and Abigail Scholer. “When Wanting the Best Goes Right or Wrong: Distinguishing Between Adaptive and Maladaptive Maximization.” Personality and Social Psychology Bulletin 43, no. 4 (2017): 570–83.
Jarrett, Christian. “‘My-Side Bias’ Makes It Difficult for Us to See the Logic in Arguments We Disagree With.” BPS Research Digest, October 9, 2018.
Johnson, Hollyn, and Colleen Seifert. “Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Journal of Experimental Psychology: Learning, Memory, and Cognition 20, no. 6 (November 1994): 1420–36.
Johnson-Laird, Philip. “Mental Models and Probabilistic Thinking.” Cognition 50, no. 1 (June 1994): 189–209.
Kahan, Daniel, David Hoffman, Donald Braman, Danieli Evans, and Jeffrey Rachlinski. “They Saw a Protest: Cognitive Illiberalism and the Speech-Conduct Distinction.” Stanford Law Review 64 (2012): 851–906.
Kahan, Daniel, and Ellen Peters. “Rumors of the ‘Nonreplication’ of the ‘Motivated Numeracy Effect’ Are Greatly Exaggerated.” Cultural Cognition Project, Working Paper No. 324, 2017.
Kahan, Daniel, Ellen Peters, Erica Dawson, and Paul Slovic. “Motivated Numeracy and Enlightened Self-Government.” Behavioural Public Policy 1, no. 1 (May 2017): 54–86.
Kahneman, Daniel. “Maps of Bounded Rationality: A Perspective on Intuitive Judgment and Choice.” American Economic Review 93, no. 5 (December 2003): 1444–75.
Kahneman, Daniel, and Gary Klein. “Conditions for Intuitive Expertise: A Failure to Disagree.” American Psychologist 64, no. 6 (September 2009): 515–26.
Kahneman, Daniel, Jack Knetsch, and Richard Thaler. “The Endowment Effect, Loss Aversion, and Status Quo Bias.” Journal of Economic Perspectives 5, no. 1 (Winter 1991): 193–206.
Kahneman, Daniel, Paul Slovic, and Amos Tversky, eds. Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982.
Kahneman, Daniel, and Amos Tversky. “Choices, Values, and Frames.” American Psychologist 39, no. 4 (April 1984): 341–50.
———. “Intuitive Prediction: Biases and Corrective Procedures.” Defense Advanced Research Projects Agency, Technical Report PTR-1042-77-6, June 1977.
———. “Prospect Theory: An Analysis of Decision Under Risk.” Econometrica 47, no. 2 (March 1979): 263–91.
Khazan, Olga. “The Power of ‘Good Enough.’” TheAtlantic.com, March 10, 2015.
Klein, Gary. “Performing a Project Premortem.” Harvard Business Review 85, no. 9 (September 2007): 18–19.
Klein, Gary, Paul Sonkin, and Paul Johnson. “Rendering a Powerful Tool Flaccid: The Misuse of Premortems on Wall Street.” February 2019 draft. capitalallocatorspodcast.com/wp-content/uploads/Klein-Sonkin-and-Johnson-2019-The-Misuse-of-Premortems-on-Wall-Street.pdf.
Laakasuo, Michael, Jussi Palomäki, and Mikko Salmela. “Emotional and Social Factors Influence Poker Decision Making Accuracy.” Journal of Gambling Studies 31, no. 3 (2015): 933–47.
Langer, Ellen. “The Illusion of Control.” Journal of Personality and Social Psychology 32, no. 2 (1975): 311–28.
Larson, James, Pennie Foster-Fishman, and Christopher Keys. “Discussion of Shared and Unshared Information in Decision-Making Groups.” Journal of Personality and Social Psychology 67, no. 3 (1994): 446–61.
Lerner, Jennifer, and Philip Tetlock. “Accounting for the Effects of Accountability.” Psychological Bulletin 125, no. 2 (March 1999): 255–75.
———. “Bridging Individual, Interpersonal, and Institutional Approaches to Judgment and Decision Making: The Impact of Accountability on Cognitive Bias.” In Emerging Perspectives on Judgment and Decision Research, edited by S. Schneider and J. Shanteau. Cambridge, UK: Cambridge University Press, 2003.
Levitt, Steven, and Stephen Dubner. Think Like a Freak. New York: HarperCollins, 2014.
Levy, Dan, Joshua Yardley, and Richard Zeckhauser. “Getting an Honest Answer: Clickers in the Classroom.” Journal of the Scholarship of Teaching and Learning 17, no. 4 (October 2017): 104–25.
Lyon, Don, and Paul Slovic. “Dominance of Accuracy Information and Neglect of Base Rates in Probability Estimation.” Acta Psychologica 40, no. 4 (August 1976): 287–98.
MacCoun, Robert, and Saul Perlmutter. “Hide Results to Seek the Truth: More Fields Should, Like Particle Physics, Adopt Blind Analysis to Thwart Bias.” Nature 526, no. 7572 (October 8, 2015): 187–90.
———. “Blind Analysis as a Correction for Confirmatory Bias in Physics and in Psychology.” In Psychological Science Under Scrutiny: Recent Challenges and Proposed Solutions, edited by Scott Lilienfeld and Irwin Waldman. Oxford, UK: Wiley Blackwell, 2017.
Mauboussin, Andrew, and Michael Mauboussin. “If You Say Something Is ‘Likely,’ How Likely Do People Think It Is?” HBR.org, July 3, 2018.
Mauboussin, Michael, Dan Callahan, and Darius Majd. “The Base Rate Book: Integrating the Past to Better Anticipate the Future.” Credit Suisse Global Financial Strategies, September 26, 2016.
Mitchell, Deborah, J. Edward Russo, and Nancy Pennington. “Back to the Future: Temporal Perspective in the Explanation of Events.” Journal of Behavioral Decision Making 2, no. 1 (January 1989): 25–38.
Mitchell, Terence, and Laura Kalb. “Effects of Outcome Knowledge and Outcome Valence on Supervisors’ Evaluations.” Journal of Applied Psychology 66, no. 5 (1981): 604–12.
Moore, Don, and Derek Schatz. “The Three Faces of Overconfidence.” Social and Personality Psychology Compass 11, no. 8 (August 2017): e12331.
Morse, Mitch. “Thinking in Bets: Book Review and Thoughts on the Interaction of Uncertainty and Politics.” Medium.com, December 9, 2018.
Murdock, Bennett. “The Serial Position Effect of Free Recall.” Journal of Experimental Psychology 64, no. 5 (1962): 482–88.
Nickerson, Raymond. “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises.” Review of General Psychology 2, no. 2 (1998): 175–220.
O’Brien, Michael, R. Alexander Bentley, and William Brock. The Importance of Small Decisions. Cambridge, MA: MIT Press, 2019.
Oettingen, Gabriele. Rethinking Positive Thinking: Inside the New Science of Motivation. New York: Current, 2014.
Oettingen, Gabriele, and Peter Gollwitzer. “Strategies of Setting and Implementing Goals.” In Social Psychological Foundations of Clinical Psychology, edited by J. Maddux and J. Tangney. New York: Guilford Press, 2010.
Pachur, Thorsten, Ralph Hertwig, and Florian Steinmann. “How Do People Judge Risks: Availability Heuristic, Affect Heuristic, or Both?” Journal of Experimental Psychology: Applied 18, no. 3 (2012): 314–30.
Phillips, Katherine, Katie Liljenquist, and Margaret Neale. “Is the Pain Worth the Gain? The Advantages and Liabilities of Agreeing with Socially Distinct Newcomers.” Personality and Social Psychology Bulletin 35, no. 3 (2009): 336–50.
Price, Vincent, Joseph Cappella, and Lilach Nir. “Does Disagreement Contribute to More Deliberative Opinion?” Political Communication 19, no. 1 (January 2002): 95–112.
Rapp, David. “The Consequences of Reading Inaccurate Information.” Current Directions in Psychological Science 25, no. 4 (2016): 281–85.
Richards, Carl. The Behavior Gap: Simple Ways to Stop Doing Dumb Things with Money. New York: Penguin/Portfolio, 2012.
Robinson, John. “Unlearning and Backcasting: Rethinking Some of the Questions We Ask About the Future.” Technological Forecasting and Social Change 33, no. 4 (July 1988): 325–38.
Roese, Neal, and Kathleen Vohs. “Hindsight Bias.” Perspectives on Psychological Science 7, no. 5 (2012): 411–26.
Ross, Michael, and Fiore Sicoly. “Egocentric Biases in Availability and Attribution.” Journal of Personality and Social Psychology 37, no. 3 (March 1979): 322–36.
Russo, J. Edward, and Paul Schoemaker. Winning Decisions: Getting It Right the First Time. New York: Doubleday, 2002.
Samuelson, William, and Richard Zeckhauser. “Status Quo Bias in Decision Making.” Journal of Risk and Uncertainty 1 (1988): 7–59.
Schkade, David, and Daniel Kahneman. “Does Living in California Make People Happy? A Focusing Illusion in Judgments of Life Satisfaction.” Psychological Science 9, no. 5 (September 1998): 340–46.
Schoemaker, Paul, and Philip Tetlock. “Superforecasting: How to Upgrade Your Company’s Judgment.” Harvard Business Review 94 (May 2016): 72–78.
Schwardmann, Peter, and Joel van der Weele. “Deception and Self-Deception.” Nature Human Behaviour 3, no. 10 (2019): 1055–61.
Schwartz, Barry. The Paradox of Choice: Why More Is Less. New York: HarperCollins, 2003.
Schwartz, Barry, Andrew Ward, John Monterosso, Sonja Lyubomirsky, Katherine White, and Darrin Lehman. “Maximizing Versus Satisficing: Happiness Is a Matter of Choice.” Journal of Personality and Social Psychology 83, no. 5 (2002): 1178–97.
Schwartz, Janet, Daniel Mochon, Lauren Wyper, Josiase Maroba, Deepak Patel, and Dan Ariely. “Healthier by Precommitment.” Psychological Science 25, no. 2 (2014): 538–46.
Sheikh, Hasan, and Cass Sunstein. “To Persuade As an Expert, Order Matters: ‘Information First, then Opinion’ for Effective Communication.” October 24, 2019 draft. ssrn.com/abstract=3474998.
Simonson, Itamar. “The Influence of Anticipating Regret and Responsibility on Purchase Decisions.” Journal of Consumer Research 19, no. 1 (June 1992): 105–18.
Slovic, Paul, Melissa Finucane, Ellen Peters, and Donald MacGregor. “The Affect Heuristic.” European Journal of Operational Research 177, no. 3 (2007): 1333–52.
Smets, Koen. “More Indifference: Why Strong Preferences and Opinions Are Not (Always) for Us.” Medium.com, May 3, 2019.
Stanovich, Keith, and Richard West. “On the Failure of Cognitive Ability to Predict Myside and One-Sided Thinking Biases.” Thinking & Reasoning 14, no. 2 (2008): 129–67.
Stark, Emily, and Daniel Sachau. “Lake Wobegon’s Guns: Overestimating Our Gun-Related Competences.” Journal of Social and Political Psychology 4, no. 1 (2016): 8–23.
Stasser, Garold, and William Titus. “Pooling of Unshared Information in Group Decision Making: Biased Information Sampling During Discussion.” Journal of Personality and Social Psychology 48, no. 6 (1985): 1467–78.
Staw, Barry. “The Escalation of Commitment to a Course of Action.” Academy of Management Review 6, no. 4 (1981): 577–87.
Stone, Peter. The Luck of the Draw: The Role of Lotteries in Decision Making. New York: Oxford University Press, 2011.
Sturm, Mike. “Satisficing: A Way Out of the Miserable Mindset of Maximizing.” Medium.com, March 28, 2018.
Sunstein, Cass. “Historical Explanations Always Involve Counterfactual History.” Journal of the Philosophy of History (November 2016): 433–40.
Sunstein, Cass, and Reid Hastie. Wiser: Getting Beyond Groupthink to Make Groups Smarter. Boston: Harvard Business Press, 2014.
Svenson, Ola. “Are We All Less Risky and More Skillful Than Our Fellow Drivers?” Acta Psychologica 47 (1981): 143–48.
Sweeney, Joseph. “Beyond Pros and Cons—Start Teaching the Weight and Rate Method.” Medium.com, October 22, 2018.
Thaler, Richard. “Mental Accounting and Consumer Choice.” Marketing Science 4, no. 3 (1985): 199–214.
———. “Mental Accounting Matters.” Journal of Behavioral Decision Making 12, no. 3 (1999): 183–206.
Thorpe, Clare. “A Guide to Overcoming FOBO, the Fear of Better Options.” Medium.com, November 19, 2018.
Trope, Yaacov, and Ayelet Fishbach. “Counteractive Self-Control in Overcoming Temptation.” Journal of Personality and Social Psychology 79, no. 4 (2000): 493–506.
Trouche, Emmanuel, Petter Johansson, Lars Hall, and Hugo Mercier. “The Selective Laziness of Reasoning.” Cognitive Science (2015): 1–15.
Tversky, Amos, and Daniel Kahneman. “Advances in Prospect Theory: Cumulative Representation of Uncertainty.” Journal of Risk and Uncertainty 5, no. 4 (1992): 297–323.
———. “Availability: A Heuristic for Judging Frequency and Probability.” Cognitive Psychology 5, no. 2 (1973): 207–32.
———. “Evidential Impact of Base Rates.” ONR Technical Report (May 1981).
———. “Extensional Versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment.” Psychological Review 90, no. 4 (October 1983): 293–315.
———. “The Framing of Decisions and the Psychology of Choice.” Science 211 (January 30, 1981): 453–58.
———. “Loss Aversion in Riskless Choice: A Reference-Dependent Model.” Quarterly Journal of Economics 106, no. 4 (November 1991): 1039–61.
———. “Rational Choice and the Framing of Decisions.” Journal of Business 59 (1986): 251–78.
Ullmann-Margalit, Edna, and Sidney Morgenbesser. “Picking and Choosing.” Social Research 44, no. 4 (Winter 1977): 757–85.
Weller, Chris. “A Neuroscientist Explains Why He Always Picks the 2nd Menu Item on a List of Specials.” Business Insider, July 28, 2017.
West, Richard, Russell Meserve, and Keith Stanovich. “Cognitive Sophistication Does Not Attenuate the Bias Blind Spot.” Journal of Personality and Social Psychology 103, no. 3 (September 2012): 506–19.
Wheeler, Michael. “The Luck Factor in Great Decisions.” HBR.org, November 18, 2013.
———. “The Need for Prospective Hindsight.” Negotiation Journal 3, no. 7 (January 1987): 7–10.
Zeckhauser, Richard. “Investing in the Unknown and Unknowable.” Capitalism and Society 1, no. 2 (2006): 1–39.
Zweig, Jason, and Phil Tetlock. “The Perilous Task of Forecasting.” Wall Street Journal, June 17, 2016.