How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life by Thomas Gilovich
The Essence
Why do our friends always call us while we’re in the shower? Is that basketball player really ‘hot’ when he makes a bunch of baskets in a row? Can people who claim to have ESP, supposedly backed by empirical study, really read our minds? Social psychologist Thomas Gilovich straightens out the ‘why’ behind our explanations for phenomena like these, and behind many of the other lines of reasoning and deduction we use when attempting to make sense of the world.
Things like our tendency to be insufficiently regressive when predicting what comes next. Or our failure to ever recognize such regression for what it is, causing us to invent ad hoc justifications for what may be nothing more than a fluctuation in factors beyond our current methods of making sense of the phenomenon.
Going further than just identifying such errors, Gilovich provides us with some useful heuristics for avoiding them in the future. For instance, when considering beliefs we hold on the basis of positive reports alone—“I've seen it before”, “I know someone who did”—Gilovich urges us to question whether a handful of confirming instances is really sufficient grounds for belief; go seek out the other side.
How We Know What Isn’t So is a primer on faulty reasoning and its underlying cognitive correlates. It may just be that what we know as truth could be misinterpreted or perhaps even misperceived. Gilovich takes our hand and leads us back on a path towards rationality.
How We Know What Isn't So Journal Entry Notes:
This is my book summary of How We Know What Isn't So. My notes are a reflection of the journal write-up above. Written informally, the notes contain a mesh of quotes and my own thoughts on the book. The journal write-up also includes important messages and crucial passages from the book.
Cognitive Determinants of Questionable Beliefs
• “We are predisposed to see order, pattern, and meaning in the world, and we find randomness, chaos, and meaninglessness unsatisfying”. Perhaps it is the discomfort that arises from uncertainty that draws us to such conclusions.
• The regression fallacy: failing to recognize statistical regression to the mean. When we get a good result after giving a reward and then fail to replicate it, we assume punishment will be more effective, when in reality this just explains away the regression. Chance thus serves to punish the administration of reward and to reward the administration of punishment, even though research tells us that rewarding desirable responses is generally more effective in shaping behavior than punishing undesirable ones. (See the sketch following this list.)
• Francis Bacon: “It is a peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives.”
• As dysfunctional as they may be on occasion, our theories, preconceptions, and biases are what make us smart (relative to other creatures). Humans are association machines, constantly applying and altering model after model to arrive at the course of action that is least calorically depleting.
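To make the regression effect above concrete, here is a minimal sketch in Python (my own illustration, not from the book), assuming a toy model in which every performance is a constant skill level plus independent luck; the skill and noise values are arbitrary:

```python
import random

# Toy model of the regression fallacy: each "performance" is a fixed
# true skill plus independent random luck. No reward or punishment is
# modeled anywhere in this simulation.
random.seed(42)

SKILL = 50.0       # hypothetical constant ability
NOISE = 15.0       # standard deviation of luck on any single attempt
TRIALS = 100_000

def perform():
    """One performance: skill plus random luck."""
    return random.gauss(SKILL, NOISE)

after_good, after_bad = [], []
for _ in range(TRIALS):
    first, second = perform(), perform()
    if first > SKILL + NOISE:        # unusually good game: coach praises
        after_good.append(second - first)
    elif first < SKILL - NOISE:      # unusually bad game: coach punishes
        after_bad.append(second - first)

print(f"average change after a great game: {sum(after_good) / len(after_good):+.1f}")
print(f"average change after an awful game: {sum(after_bad) / len(after_bad):+.1f}")
# Performance drops after "praise" and rises after "punishment" purely
# from regression to the mean.
```

Run it and the printed averages come out clearly negative after the great games and clearly positive after the awful ones, so reward appears to backfire and punishment appears to work, even though neither was ever part of the model.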
Motivational and Social Determinants of Questionable Beliefs
• People, within a limit, believe what they want to believe.
• Be aware of optional stopping: people's preferences influence not only the kind of information they consider but also the amount they examine. (A sketch of how this manufactures evidence appears after these notes.)
• Beliefs as possessions: an idea that was briefly mentioned by Gilovich that I am quite fond of. As a minimalist I accept that the things in my material possession must maximize value—beliefs are similar. As I bring new ideas into my life, I mustn't saturate myself with so many beliefs that they undermine my dedication to the ideals I deem my current subjective truth.
• Second-hand information can be misleading. People sharpen and level information when communicating: exaggerating the details that serve the point while dropping the qualifiers that don't. It's these micro-alterations that should lead us to be cautious of anecdote offered up as proof.
• A common type of story is accepted and spread because of its plausibility, but also, more importantly, because it is entertaining and not taken particularly seriously. We ought to keep this in mind when considering the merits of the stories we are told. Could a story really be “too good to be true”?
• Often there is little reason to believe that a person’s experience is any more informative than one’s own in estimating overall prevalence. We should not allow the depth of our feeling toward any one person to influence our assessment of how many such people there are.
• It seems that the process of interpretation is so reflexive and immediate that we often overlook it. This, combined with the widespread assumption that there is but one objective reality, may lead people to overlook the possibility that others are responding to a very different situation. It's challenging to home in on someone else's subjective experience, even more so when that someone isn't remotely similar to us and lives across the globe.
• “Self-Handicapping”: Our attempts to manage how others perceive us by controlling the attributions they make for our performance.
• Concerning self-handicapping… It speaks to how far perseverance and hard work have fallen in value in the current culture that such strategies of self-presentation are so commonly employed.
• An awareness of how and when to question and a recognition of what it takes to truly know something are among the most important elements of what constitutes an educated person.
• Retrospective prophecies: capitalize on multiple endpoints and are so vague they can be “fulfilled” by almost any outcome. Think Nostradamus. You make broad predictions, the kind that are bound to come true at some point if history goes on long enough, and are subsequently given credit—or even worse, you claim the credit yourself. Think Bill O'Reilly.
• Confirmatory events are in fact much more memorable than non-confirmatory events.
• Much of the scientific enterprise can be construed as the use of formal procedures for determining when to throw out bad ideas, a set of procedures that we might be well advised to adopt in our everyday lives. I've always thought of truth-seeking as much like a science. When a new truth makes itself apparent in my reality, I make quick work of discarding anything that is now only an impediment to embracing that truth. And if that is too costly, I accept my self-deception for what it is and work toward an increasingly truthful version of reality.
• People tend to think that sufficient quantity can compensate for a lack of quality. There are domains in life in which it does, but empirical research is not one of them.
• Greater familiarity with scientific enterprise should help to promote the habits of the mind necessary to think clearly about evidence and steer clear of dubious beliefs.
• One-sided events: events that stand out and are mentally registered as events only when they turn out one way. Example: the phone always rings when I am in the shower. This is clearly a one-sided event, because I would never notice all the showers during which no one happened to call. The saliency is context dependent, relying on a specific outcome to validate itself.
• Two-sided events: events that stand out and register regardless of how they turn out. Example: a person bets on a sports race; both outcomes have emotional significance.
• List of asymmetries that tend to accentuate information consistent with a person's expectations and pre-existing beliefs: hedonic, pattern, definitional, and base-rate departures.
• The absence of explicit disagreement should not automatically be taken as evidence of agreement.
• Because personal experience is not an infallible guide to the truth—well at least under our ordinary state of consciousness—we must augment it with relevant background statistics.
• To truly appreciate the complexities of the world and the intricacies of human experience, it is essential to understand how we can be misled by the apparent evidence of everyday experience. This, in turn, requires that we think clearly about our experience, question our assumptions, and challenge what we think we know.
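On the optional-stopping note above, here is a minimal Python sketch (again my own toy illustration, not the book's): the coin is perfectly fair, yet an observer who peeks at the running tally and stops collecting evidence the moment heads lead by three will "confirm" the effect in most runs. The flip limit and lead threshold are arbitrary choices:

```python
import random

# Toy demonstration of optional stopping: the coin is fair, but we
# check the tally after every flip and stop as soon as it favors
# our belief, declaring the "heads effect" confirmed.
random.seed(0)

MAX_FLIPS = 100     # patience limit before giving up
LEAD_NEEDED = 3     # lead that counts as "confirmation"
RUNS = 10_000

def peek_and_stop():
    """Flip a fair coin, stopping early as soon as the tally looks good."""
    heads = tails = 0
    for _ in range(MAX_FLIPS):
        if random.random() < 0.5:
            heads += 1
        else:
            tails += 1
        if heads - tails >= LEAD_NEEDED:
            return True     # stopped early: belief "confirmed"
    return False            # no confirmation within our patience

confirmed = sum(peek_and_stop() for _ in range(RUNS))
print(f"fair coin 'confirmed' in {100 * confirmed / RUNS:.0f}% of runs")
# A three-flip lead shows up at some point in most long fair sequences,
# so stopping whenever it appears manufactures supporting evidence.
```

The remedy is the one the notes suggest: decide in advance how much evidence you will examine, then look at all of it before drawing a conclusion.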
Reading Recommendations
If you liked what you saw, here are 3 titles that I recommend based on what was discussed in How We Know What Isn't So.
- The Wisest One in the Room: How You Can Benefit from Social Psychology's Most Powerful Insights by Thomas Gilovich
- Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely
- How the Mind Works by Steven Pinker
Find the book on Amazon: Print
Check Out More 52 in 52 Challenge Summaries
Note: This page contains affiliate links. This means that if you decide to buy a product through them, I will receive a small commission. This has no additional cost to you. If you would like to support Forces of Habit, please use these links. If you do use them, thank you for the support.