Thomas Gilovich looks at why we form questionable or inaccurate beliefs and why we hold onto them. Through research and examples, he identifies cognitive biases we all share that distort our beliefs and decisions.
The Notes
- “It ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so.” — Artemus Ward
- We can hold questionable beliefs despite seeing evidence to the contrary, not because we are stupid or gullible, but because the judgment tools we use to process information can be misapplied in ways that produce biased conclusions.
- “We hold many dubious beliefs, in other words, not because they satisfy some important psychological need, but because they seem to be the most sensible conclusions consistent with the available evidence. People hold such beliefs because they seem, in the words of Robert Merton, to be the ‘irresistible products of their own experience.’ They are the products, not of irrationality, but of flawed rationality.”
- “The world does not play fair. Instead of providing us with clear information that would enable us to ‘know’ better, it presents us with messy data that are random, incomplete, unrepresentative, ambiguous, inconsistent, unpalatable, or secondhand… It is often our flawed attempts to cope with precisely these difficulties that lay bare our inferential shortcomings and produce the facts we know that just ain’t so.”
- “When people learn no tools of judgment and merely follow their hopes, the seeds of political manipulation are sown.” — Stephen Jay Gould
- “The human understanding supposes a greater degree of order and equality in things than it really finds; and although many things in nature be sui generis and most irregular, will yet invest parallels and conjugates and relatives where no such thing is.” — Francis Bacon
- “We are predisposed to see order, pattern, and meaning in the world, and we find randomness, chaos, and meaninglessness unsatisfying. Human nature abhors a lack of predictability and the absence of meaning. As a consequence, we tend to “see” order where there is none, and we spot meaningful patterns where only the vagaries of chance are operating.”
- Seeing order and patterns allows for connecting dots and discovery through rigorous tests, but it fails us when we treat apparent “order” in random noise as fact.
- Hot Hand Fallacy = the belief that success breeds success or failure breeds failure.
- For example, a basketball player may believe they are more likely to make their next shot after making their previous one, i.e., that they shoot in streaks. Numerous factors determine the chance of a player making a shot, but the previous shot is not one of them.
- Why do people believe in the hot hand?
- Sequences of hits or misses stand out and are more memorable than alternating hits and misses, even though the streaks are no longer than chance alone would produce. It’s easier to credit confidence than randomness.
- People have a poor understanding of what a random sequence looks like. The clustering illusion is the expectation that random events like coin flips should alternate between heads and tails more than they actually do (see the sketch below). The clustering illusion stems from the representativeness bias: because we expect the proportion of heads to tails to be close to 50/50 over a long sequence of tosses, we expect it to hold throughout short sequences too.
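A minimal sketch of the clustering point, with invented parameters (a hypothetical 50% shooter taking 20 shots per game): streaks of four or more consecutive hits appear in nearly half of purely random sequences.

```python
import random

# Longest run of consecutive hits in a sequence of shots.
def longest_streak(shots):
    best = run = 0
    for hit in shots:
        run = run + 1 if hit else 0
        best = max(best, run)
    return best

# Hypothetical 50% shooter, 20 shots per game, many simulated games.
random.seed(1)
games = [[random.random() < 0.5 for _ in range(20)] for _ in range(10_000)]
frac = sum(longest_streak(g) >= 4 for g in games) / len(games)
print(frac)  # ~0.48: nearly half of random games contain a 4+ hit streak
```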
- Representativeness Bias = “the reflexive tendency to assess the similarity of outcomes, instances, and categories on relatively salient and even superficial features, and then to use these assessments of similarity as a basis of judgment. People assume that ‘like goes with like.'”
- Hindsight lets us see patterns where only randomness exists. Without thorough testing, anyone can treat a chance anomaly as fact and come to believe things that are not true. We’re also great at inventing a reason why a false anomaly exists.
- “Once a person has (mis)identified a random pattern as a “real” phenomenon, it will not exist as a puzzling, isolated fact about the world. Rather, it is quickly explained and readily integrated into the person’s pre-existing theories and beliefs. These theories, furthermore, then serve to bias the person’s evaluation of new information in such a way that the initial belief becomes solidly entrenched.”
- Statistical Regression
- “Whenever any two variables are imperfectly correlated, extreme values of one of the variables are matched, on the average, by less extreme values of the other.”
- We tend to be overly aggressive (not conservative or regressive enough) when making predictions. We expect similarly extreme performance to continue rather than regress toward the mean.
- Regression Fallacy = the tendency to fail to recognize regression and instead explain it away through complex theories, reflecting a need to add meaning to a random event (see the simulation below).
- “By developing elaborate explanations for phenomena that are the predictable result of statistical regression, people form spurious beliefs about phenomena and causal relations in everyday life.”
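A minimal sketch of regression toward the mean, using invented numbers: the same underlying skill is measured twice with independent noise, and the top scorers on the first measurement come back toward average on the second.

```python
import random

random.seed(0)
# Underlying skill plus independent measurement noise, twice.
skill = [random.gauss(0, 1) for _ in range(100_000)]
score1 = [s + random.gauss(0, 1) for s in skill]
score2 = [s + random.gauss(0, 1) for s in skill]

# Select the extreme performers on the first measurement.
top = [i for i, x in enumerate(score1) if x > 2.0]
avg1 = sum(score1[i] for i in top) / len(top)
avg2 = sum(score2[i] for i in top) / len(top)
print(round(avg1, 2), round(avg2, 2))  # ~2.6, then ~1.3: halfway back
# toward the mean, because the two scores correlate at only 0.5
```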
- The Illusion of Validity = a belief held on inadequate evidence yet treated as a logical conclusion drawn from objective evidence.
- We over-rely on information that confirms a relationship and overlook disconfirming information.
- Confirming information is easier to deal with than non-confirming information, especially when the non-confirming information is framed negatively. E.g., “How much easier it is to comprehend the statement ‘All Greeks are mortals’ than ‘All non-mortals are non-Greeks.'”
- Symmetric Variables = variables where both levels represent the presence of some attribute or set of attributes (male or female).
- Asymmetric Variables = variables where one level is the absence of the other (it rained or it did not rain).
- With asymmetric variables, we are swayed most by joint occurrences (the presence/presence cell) that confirm the belief; absences are harder to notice and process (see the 2×2 example below).
- “The influence of confirmatory information is particularly strong when both variables are asymmetric because in such cases three of the four cells contain information about the nonoccurrence of one of the variables, and, once again, such negative or null instances have been shown to be particularly difficult to process.”
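A sketch of the four-cell point, with invented counts: the treated-and-improved cell is the largest and feels like proof, but comparing rates across all four cells shows the treatment adds nothing.

```python
# Invented 2x2 counts for "did the treatment work?"
table = {
    ("treated", "improved"): 200,
    ("treated", "not_improved"): 75,
    ("untreated", "improved"): 50,
    ("untreated", "not_improved"): 15,
}

def improvement_rate(group):
    yes = table[(group, "improved")]
    no = table[(group, "not_improved")]
    return yes / (yes + no)

print(round(improvement_rate("treated"), 2))    # 0.73
print(round(improvement_rate("untreated"), 2))  # 0.77: improvement is
# slightly *more* likely without treatment; the big first cell misled us
```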
- “It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives.” — Francis Bacon
- We tend to seek information that confirms our beliefs rather than information that disconfirms it. It’s not always because of a desire to “prove” our belief true. It can be a byproduct of asking one-sided confirmatory questions.
- “When testing a hypothesis of similarity, people look for evidence of similarity rather than dissimilarity, and when testing a hypothesis of dissimilarity, they do the opposite. The relationship one perceives between two entities, then, can vary with the precise form of the question that is asked.”
- “A fundamental difficulty with effective policy evaluation is that we rarely get to observe what would have happened if the policy had not been put into effect. Policies are not implemented as controlled experiments, but as concerted actions. Not knowing what would have happened under a different policy makes it enormously difficult to distinguish positive or negative outcomes from good or bad strategies. If the base rate of success is high, even a dubious strategy will seem wise; if the base rate is low, even the wisest strategy can seem foolish.”
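A minimal illustration of that last point, with invented numbers: a do-nothing policy in a world where 80% of cases resolve on their own still posts an impressive “success rate” when there is no control group to reveal the counterfactual.

```python
import random

random.seed(2)
base_success = 0.80  # invented: most cases resolve regardless of policy

# Outcomes under a policy that has no effect at all.
outcomes = [random.random() < base_success for _ in range(1_000)]
print(sum(outcomes) / len(outcomes))  # ~0.8, credited to the policy
```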
- Self-Fulfilling Prophecies
- When expectations lead people to act in ways that cause those expectations to be realized. We often accept the outcome at face value rather than consider what might have happened had we acted differently.
- Ex: a rumor of a bank’s insolvency leads to a bank run and insolvency.
- Often contain a kernel of truth and can lead to extreme outcomes.
- “Self-fulfilling prophecies generally turn little effects into big effects, rather than create effects from scratch.”
- “Life is a series of trade-offs. For every benefit gained, there is usually some cost.”
- “When making judgments and decisions, we employ a variety of informal rules and strategies that simplify fundamentally difficult problems and allow us to solve them without excessive effort and stress. These strategies are generally effective, but the benefit of simplification is paid for at the cost of occasional systematic error. There is, in other words, an ease/accuracy trade-off in human judgment.”
- Ambiguous Information = often perceived in a way that fits our preconceptions. Ex: black clothing is perceived as aggressive or bad, e.g., the bad guys always wear black hats in films.
- Unambiguous Information = we scrutinize information that is inconsistent with our expectations more than information that is consistent. We only look for additional information, or add new meaning to what we have, when outcomes fail to line up with expectations.
- Study on Losing:
- The researchers studied gamblers, asking them to track their thoughts after winning and losing bets.
- Gamblers spent more time thinking about losses than wins.
- Gamblers viewed wins as something that was expected.
- Gamblers viewed losses as flukes or one-off unlucky breaks. Losses were often seen as “near wins.”
- The “near wins” were reason enough to continue gambling, with a slight tweak to the strategy.
- How Scientific Procedures Help Weed out Bias:
- Use simple statistical tools to avoid misperceiving random sequences.
- Use control groups and random sampling to avoid forming conclusions from incomplete data.
- Use blind observers to avoid influence.
- Require the meaning of outcomes to be specifically defined in advance (if possible) and objectively determined. Precisely defining what success and failure are helps avoid interpreting outcomes by adding meaning (our own narrative) that fits expectations.
- “Sir Peter Medawar has noted, science works ‘…in a rapid reciprocation of guesswork and checkwork, proposal and disposal, conjecture and refutation.'”
- “When asked on a talk show to explain the secret of his success, two-time Nobel Laureate Linus Pauling once replied that ‘…you need to have a lot of ideas, and then you have to throw away the bad ones.'”
- “We humans seem to be extremely good at generating ideas, theories, and explanations that have the ring of plausibility. We may be relatively deficient, however, in evaluating and testing our ideas once they are formed. One of the biggest impediments to doing so is our failure to realize that when we do not precisely specify the kind of evidence that will count as support for our position, we can end up ‘detecting’ too much evidence for our preconceptions.”
- Barnum Effect = we tend to accept generally worded descriptions as being about ourselves if we believe they were produced specifically for us, like a horoscope.
- In any long, generally worded description, some statements are bound to match the person’s characteristics, and the wording is vague enough that much of it will ring true.
- It’s produced to apply to anyone.
- “Whenever a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones.” — Charles Darwin
- Two-Sided Events:
- Events that stand out and register as a belief regardless of how they turn out.
- Ex: a bet on a football match has two outcomes — win or loss — with emotional significance for the bettor.
- Both outcomes are likely to be remembered.
- In some cases, the unfavorable outcome is more memorable because of “what could have been.”
- One-Sided Events:
- Events that stand out and register as a belief only when they turn out one way.
- Ex: the belief that strange things happen when there’s a full moon. If you expect strange things to happen during full moons, you’re likely watching out for strange things, thus remembering them. As opposed to not paying attention to the strange things that happen every other day because there’s no full moon.
- One-sided events that support expectations tend to stand out. Those events tend to be better remembered.
- Events that confirm our beliefs tend to stand out more than those that offer non-confirmation.
- Asymmetries: One-sided negative events can stand out in a way that appears to be “bad streaks.” Like noticing how all the buses/trains/cabs are going in the opposite direction.
- Pattern Asymmetries: numerical, spatial, or temporal patterns produced by various outcomes. Winning streaks are more memorable despite being mere coincidence.
- Base Rate Departure: Outliers are more memorable than frequent “normal” occurrences.
- Negative Eventful Actions = actions (customs) that are so common that we only notice when someone fails to follow them. In this case, the disconfirmation of expectations stands out.
- “These asymmetries tend to accentuate information that is consistent with a person’s expectations and pre-existing beliefs. As a result, people tend to see in a body of evidence what they expect to see. What people expect to see, furthermore, is often what they want to see, and so the biasing effect of their preconceptions is often exacerbated by the biasing effect of their preferences and motives.”
- Self-Serving Beliefs
- Lake Wobegon Effect = The average person tends to believe they are above average in every way. They believe they are more socially desirable, more intelligent, more objective, less prejudiced, better leaders, better at their jobs, and more skilled drivers than average.
- Endowment Effect = we put a higher value on things we own than we would if we didn’t own them, which creates inertia in economic transactions.
- We tie successes to our own doing and failures to external circumstances.
- Our Motives Influence Beliefs:
- When seeking information, we naturally ask: What supports this belief? The natural framing of questions is slanted toward confirming evidence because it’s more comforting than disconfirming evidence.
- We naturally gravitate toward people whose opinions line up with our beliefs. We’re able to hear what we want to hear.
- When searching for information we tend to stop after finding confirming evidence. We often dig deeper if disconfirming evidence is found first, in the hopes of finding confirmation.
- We use different criteria to our own advantage when evaluating given traits.
- “Everybody ranks himself high in qualities he values: careful drivers give weight to care, skillful drivers give weight to skill, and those who think that, whatever else they are not, at least they are polite, give weight to courtesy, and come out high on their own scale. This is the way that every child has the best dog on the block.” — Thomas Schelling
- We believe we are above average on ambiguous traits until we are asked for specific definitions of those traits.
- “Beliefs are like possessions.” — Robert Abelson
- Narrative Distorts Information
- Most of what we know comes from second (third, fourth, etc.) hand sources. It’s not from personal experience.
- A good story can carry misleading information.
- Most messages are not passed along verbatim. Information leaks. As messages are passed along, they get edited: simplified and cleaned up, with important details sharpened and less essential details deemphasized or removed. That distorts what we learn secondhand.
- We tend to develop extreme impressions of people we have heard about but never met. We define those people by their actions or behavior in a story, even if it was a one-off event.
- Most secondhand stories are not secondhand but third-, fourth-, etc. hand. We generally don’t know how remote the story is. We’re blind to how distorted a message may be.
- Informative stories often stretch the facts to make a point.
- Entertaining stories often sacrifice accuracy for entertainment. Literary license.
- The media plays up amazing/surprising/negative stories over balanced accounts to draw more eyeballs.
- We distort messages for our own self-interest, unbeknownst to the audience.
- Untrue, but plausible, stories are more easily retold because they could be true.
- Of course, personal experiences can offer anecdotal evidence that is equally untrue.
- Base Rates
- “Those who study human judgment and decision-making urge us to give less weight to our own impressions and to assign more weight to the “base rate,” or general background statistics… Because personal experience is not an infallible guide to the truth, we must augment it (augment it more than we apparently do) with relevant background statistics.”
- “Distinguishing what we know well from what we only think is true is itself an important advance.”
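A sketch of what weighting the base rate looks like in practice, using hypothetical numbers: even a fairly accurate test for a rare condition yields mostly false positives, so a vivid positive result should move us less than it feels like it should.

```python
base_rate = 0.01       # hypothetical: 1% of people have the condition
sensitivity = 0.90     # P(positive test | condition)
false_positive = 0.09  # P(positive test | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive
print(round(posterior, 2))  # ~0.09: a positive result still leaves the
# condition unlikely, because true cases are swamped by false positives
```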
- Basic Tests
- Consider the Source = How credible is the source of the information? Is it quoted or merely implied?
- Trust Facts, Distrust Projection = Give more weight to facts than projections.
- Watch Out for Sharpening/Leveling = Do claims or estimates include a range or confidence interval, or are they watered down with “as many as” type statements?
- Be Wary of Testimonials = One person’s experience can deviate from the average person’s experience. Testimonials are anecdotal.
- Imagined Social Support
- “My opinion, my conviction, gains infinitely in strength and success, the moment a second mind has adopted it.” — Novalis
- Our beliefs are influenced by what we think other people believe. Yet, we are poor guesstimators of what other people think which leads to mistaken beliefs.
- We also project our beliefs onto others.
- False Consensus Effect = the tendency for our own beliefs to bias our views on how widely those beliefs are shared by others.
- The effect is relative. It exaggerates our estimates of how common a belief is relative to the estimates of the average person.
- May be driven by the need to boost self-esteem by knowing that our beliefs are mainstream.
- It’s more present with beliefs we’re emotionally invested in.
- Since we tend to associate with people who think as we do, it’s easier to think of people we know who share the same belief, which makes it easier to overestimate how many others believe the same thing.
- When external events influence us, we believe those events have a similar influence on others.
- The perceived higher social support allows us to hold onto our beliefs even tighter.
- Groupthink
- We are generally reluctant to openly question other people’s beliefs. It’s a way to avoid uncomfortable disagreements i.e. nobody wants to be the bearer of bad news.
- “Members of highly cohesive advisory groups who are under considerable pressure to devise effective courses of action can become overly concerned with maintaining apparent consensus within the group and will sometimes censor their personal reservations to accomplish it. Disastrous policies sometimes result.”
- The absence of argument can lead to a false sense of social support.
- “Just as our actions can convince others about what we believe, they can also convince us.”
- Post Hoc Fallacy
- Post hoc ergo propter hoc: “after this, therefore because of this.”
- The erroneous conclusion that since event Y followed X, X must have caused Y to happen.
- It is the failure to consider other reasons for an outcome.
- Ex: in medicine, if a cure follows an “alternative treatment,” it’s almost impossible to dissuade someone that something else restored their health.
- “Things that ought to be true often are. But many times our sense of what ought to be true obscures our vision of what is actually the case, particularly when the underlying theories that generate this sense of plausibility are rather superficial.”
- “Causes often do resemble their effects, of course, but there are more than enough exceptions to warrant a little caution and a little healthy skepticism.”
- “Very few of the most extreme predictions of any emerging field turn out to be true. The ‘smart money’ generally lies on the more modest claims.”
- Self-Handicapping
- An attempt to manage how others perceive us by drawing attention to attributes, things, or skills that might lead to poor performance or failure. The hope is that the other person discounts the failure or sees success as more impressive.
- Real Self-Handicapping = putting a real obstacle in the path to success, making success less likely.
- Feigned Self-Handicapping = simply claiming difficult obstacles exist (excuses), to excuse poor performance.
- The opposite of self-handicapping is name-dropping, boasting, showing off, etc. in an attempt to make us appear more skilled and/or boost social status.
- Feigned self-handicapping is generally ineffective. Name-dropping, boasting, etc. generally backfire. Yet people continue to use such strategies because of biased feedback, a biased view of success rate, and not knowing the effectiveness of alternative strategies.
- “This tendency to focus too heavily on the occasional success is helped along by an asymmetry in the way we evaluate success and failure. A single success generally does more to confirm a strategy’s effectiveness than a single failure does to disconfirm it. Indeed, successes tend to be taken as prima facie evidence that the strategy is effective… Successes, in other words, are generally seen as confirmations of one’s underlying strategy, whereas failures tend to be thought of only as failures of outcome, not as failures of strategy.”
- Replicability in Science
- “Faulty research designs can obscure phenomena that are really there, or lead us to believe in phenomena that are not. Finally…the existence of a large number of studies does not by itself compensate for their lack of quality. Adding together a set of similarly-flawed investigations does not produce an accurate assessment of reality. As statisticians like to say, sample size does not overcome sample bias.”
- “No single experiment, or no set of experiments carried out in one laboratory, can ever stand as definitive evidence. Science requires that a phenomenon be reliably produced in different laboratories for it to be accepted as genuine. Whoever claims to have discovered a phenomenon must describe in sufficient detail how it was produced so that other investigators, following similar steps, can reproduce it themselves. This requirement of replicability applies to all fields of science.”
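A sketch of the “sample size does not overcome sample bias” point, with invented numbers: heights drawn from a population averaging 170 cm, but sampled so taller people are likelier to be included. A larger sample just converges more tightly on the wrong answer.

```python
import random

random.seed(3)

def biased_sample(n):
    """Draw n heights, where taller people are likelier to be included."""
    out = []
    while len(out) < n:
        x = random.gauss(170, 10)  # true population: mean 170, sd 10
        if random.random() < min(1.0, 0.2 + (x - 150) / 50):
            out.append(x)
    return out

for n in (100, 100_000):
    s = biased_sample(n)
    print(n, round(sum(s) / len(s), 1))  # both land near 173, not 170
```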
- Coincidences
- “The improbable is extremely probable.” — Aristotle
- What appears to be an extraordinary coincidence can actually be common. Failing to know how common — misunderstanding probabilities — leads us to believe some other phenomenon is at work.
- The odds are about 50% that two people in a group of 23 share a birthday, and they rise to about 81% in a group of 35 (sketch below). “Many people approach the problem with a fairly accurate sense of the long odds against a particular pair of people having the same birthdate (approximately 1/365), but they fail to appreciate how many different pairs of people there are (253) in a group of 23.”
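A minimal sketch of the birthday arithmetic, assuming 365 equally likely birthdays and ignoring leap years:

```python
from math import comb

def shared_birthday_prob(n):
    """P(at least two of n people share a birthday)."""
    p_no_match = 1.0
    for k in range(n):
        p_no_match *= (365 - k) / 365
    return 1 - p_no_match

for n in (23, 35):
    print(n, comb(n, 2), round(shared_birthday_prob(n), 3))
# 23 253 0.507 -> ~50% despite long odds for any particular pair
# 35 595 0.814 -> ~81%
```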
- “Time converts the improbable to the inevitable — give me a million years and I’ll flip a hundred heads in a row more than once.” — Stephen Jay Gould
- Unlike coin flipping, where we watch the same distribution repeat over and over, real life samples from many distributions at once, which makes the repeated-sampling process less obvious. The odds of a specific coincidence might be low, but the odds of some coincidental event are much higher. We shouldn’t be surprised by “unlikely” surprises.
- “Given the vastness of our experience (how many thoughts we have, how many people we come in contact with, etc.) numerous coincidental events are bound to happen in a lifetime.”
- “Because ‘big’ events are thought to require ‘big’ causes, purely random coincidence is considered by many to be an unacceptable explanation of such a compelling and evocative occurrence.”
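The same arithmetic explains why some coincidence is near certain: with n independent opportunities for an event of probability p, P(at least one) = 1 - (1 - p)^n. Invented numbers below.

```python
p = 0.001  # a "1 in 1,000" coincidence, invented for illustration
for n in (100, 1_000, 10_000):
    print(n, round(1 - (1 - p) ** n, 3))
# 100 -> 0.095, 1000 -> 0.632, 10000 -> 1.0 (more precisely ~0.99995)
```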
- Prophecies and Premonitions
- “Some prophecies are so vague that they can be ‘fulfilled’ by almost any outcome.”
- The successes stand out and the failures are forgotten.
- “All superstition is much the same whether it be that of astrology, dreams, omens, retributive judgment, or the like,…the deluded believers observe events which are fulfilled, but neglect or pass over their failure, though it be much more common.” — Francis Bacon
- “Logicians and philosophers are in virtually unanimous agreement that the burden of proof on any question lies not with the skeptic, but with the person making the positive assertion.”
- “The underlying causes of faulty reasoning and erroneous beliefs will never be eliminated. People will always prefer black-and-white over shades of grey, and so there will always be the temptation to hold overly-simplified beliefs and to hold them with excessive confidence. People will always be tempted by the idea that everything that happens to them is controllable. Likewise, the tendency to impute structure and coherence to purely random patterns is wired deep into our cognitive machinery, and it is unlikely to ever be completely eliminated. The tendency to be more impressed by what has happened than by what has failed to happen, and the temptation to draw conclusions from what has occurred under present circumstances without comparing it to what would have occurred under alternative circumstances, seem to be similarly ingrained.”
- To Avoid Erroneous Beliefs:
- Avoid drawing conclusions from incomplete and unrepresentative evidence. Be aware of how often life presents us with biased information samples. When drawing conclusions, ask: Is there any information you’re overlooking?
- Avoid the desire to explain outcomes in terms of our pre-existing beliefs. Use a “consider the opposite” strategy: if the opposite outcome had occurred, how would I explain it? What alternate theory could account for it?
- Avoid taking secondhand information at face value. Be aware of how information distorts the further down the communication chain it goes. Ask what is the source, what is that source’s source, and so on, to understand the amount of distortion in the message.
- Avoid seeing things in black and white. Think in terms of probabilities.
- “The more science one learns, the more one becomes aware of what is not known, and the provisional nature of much of what is… This general intellectual outlook, this awareness of how hard it can be to really know something with certainty, while humbling, is an important side benefit of participating in the scientific enterprise.”