Fallacies, Illusions, and Biases (Part 2)

I’m working my way through Rolf Dobelli’s The Art of Thinking Clearly by reading a few sections each morning. Below are my notes on sections 12-23. Read 1-11 here.

  1. “It’ll-get-worse-before-it-gets-better” fallacy: A variant of confirmation bias. An expert who predicts things must get worse before they get better can’t lose: if the problem gets worse, the prediction is confirmed; if the situation improves unexpectedly, the customer is happy and the expert attributes it to his prowess. Look for verifiable cause-and-effect evidence instead.
  2. Story bias: We tend to impose meaning on things, especially things that seem connected. Stories are more interesting than details. Our lives are mostly a series of unconnected, unplanned events and experiences; looking back at them ex post facto and constructing an overarching narrative is disingenuous. The problem with stories is that they give us a false sense of understanding, which leads us to take bigger risks and urges us to take a stroll on thin ice. Whenever you hear a story, ask: Who is the sender, what are his intentions, and what does this story leave out or gloss over?
  3. Hindsight bias: Possibly a variant of story bias. In retrospect, everything seems clear and inevitable. It makes us think we are better predictors than we actually are, causing us to be arrogant about our knowledge and take too much risk. To combat this, read diaries, listen to oral histories, and read news stories from the period you are studying. Check out predictions made at the time. And keep your own journal with your own predictions about your life, career, and current events. Compare them later to what actually happened to see what poor predictors we all are.
  4. Overconfidence effect: We systematically overestimate our knowledge and our ability to predict, and on a massive scale. The difference between what we know and what we think we know is huge. Be aware that you tend to overestimate your knowledge. Be skeptical of predictions, especially from so-called experts. With all plans, favor the pessimistic scenario.
  5. Chauffeur Knowledge: There are two types of knowledge: real knowledge (deep, nuanced understanding) and chauffeur knowledge (enough to put on a show, but not enough understanding to answer questions or make connections). Distinguishing between the two is difficult if you don’t understand the topic yourself. One method is the circle of competence. True experts understand the limits of their competence: the perimeter of what they do and do not know. They are more likely to say “I don’t know.” The chauffeurs are unlikely to do this.
  6. Illusion of Control: Similar to the placebo effect. The tendency to believe that we can influence something over which we have absolutely no sway. Sports, gambling, etc. Also: elevator door-close buttons, crosswalk buttons, fake thermostat dials. This illusion led prisoners (like Frankl, Solzhenitsyn, etc.) not to give up hope in concentration camps. The Federal Reserve’s federal funds rate is probably a fake dial, too. The world is mostly an uncontrollable system at the level we currently understand it; the things we can influence are very few.
  7. Incentive Super-Response Tendency: People respond to incentives by doing whatever is in their best interest. Extreme examples: Hanoi residents breeding rats to collect a per-rat bounty; Dead Sea Scroll finders tearing scrolls apart because they were rewarded per fragment. Good incentive systems take into account both intent and reward. Poor incentive systems often overlook, and can even corrupt, the underlying aim. “Never ask a barber if you need a haircut.” Try to ascertain what actions are incentivized in any situation.
  8. Regression to Mean: A cousin of the “It’ll-get-worse-before-it-gets-better” fallacy and the Illusion of Control. Extreme performances tend to be interspersed with less extreme ones; there are natural variations in performance. Students are rarely consistently high or low performers. They cluster around the mean. Thinking we can influence these high and low performers is an illusion of control.
  9. Outcome Bias: We tend to evaluate decisions based on the result rather than the decision process. This is a variant of Hindsight Bias. Only in retrospect do signals seem clear. When samples are too small, the results are meaningless. A bad result does not necessarily indicate a bad decision, and vice versa. Focus on the reasons behind actions: were they rational and understandable?
  10. Paradox of Choice: A large selection leads to inner paralysis and also poorer decisions. Think about what you want before inspecting existing offers. Write down the criteria and stick to them rigidly. There are never perfect decisions. Learn to love a good choice.
  11. Liking Bias: The more we like someone, the more inclined we are to buy from or help that person. We see people as pleasant if (a) they are outwardly attractive, (b) they are similar to us, or (c) they like us. This is why salespeople copy body language, why multi-level marketing schemes work, and why advertising employs likable figures. If you are a salesperson, make people like you. If you are a consumer, judge the product independent of the seller, and pretend you don’t like the seller.
  12. Endowment effect: We consider things to be more valuable the moment we own them. If we are selling something, we charge more for it than we ourselves would be willing to pay. We are better at holding on to things than getting rid of them. The effect works on auction participants, too, which drives up bidding, and it explains why late-stage interview rejections sting: the candidates already felt the job was theirs. Don’t cling to things; view them instead as the universe temporarily bestowing them on you.
