
Here's my list of cognitive biases, named and sometimes explained, from Chapter 16, Cognitive Biases & Availability Cascades.

When I tried two (Texas Sharpshooter & Availability Cascade) at lunch today, a group of very smart people who argue about religion, politics, Macs & PCs, Androids & iPhones, and antitrust enforcement stopped side conversations long enough to listen. Thanks, Seth Mnookin and Enlightenment advocates for empiricism!

1. A brain that can't feel can't decide. This suggests that limbic activity is integral to our decision-making, doesn't it?

2. Pattern recognition produces more false positives than false negatives. Ancestors who were complacent about rustles (of snakes) or flickers (of predators) ceased to contribute to the evolutionary cascade that led to us.

3. The clustering illusion results because we are driven to connect the dots even when the dots are random. Lorraine found a high number of breast cancer cases on Long Island. Believing the dots weren't random, when they were, gave a sense of more control over fate/destiny/therapeutic outcomes than we, in fact, have.

4. Expectation bias and selection bias (as when SafeMinds members set out to write an academic paper to legitimize the hypothesis they believed was true) produce errors of data manipulation or data misinterpretation.

5. The anchoring effect describes the tendency to give past experiences too much weight when making decisions about the future.

6. When we decide how much more energy and attention to invest based on our past investments while discounting evidence, this is irrational escalation; my own wish to hold onto losing stocks until they return my investment is an excellent example.

7. The Texas Sharpshooter's Fallacy allows us to craft a hypothesis to fit our data, making it untestable. The metaphor is shooting holes in the side of a barn, then painting bull's-eyes around the holes.

8. At the moment when I must be most self-disciplined to see how I was wrong, I want so much to be right that confirmation bias kicks in and I put all the stress on data showing I might be right.

9. Mnookin then takes up the group dynamics that produce even more cognitive biases. Once in a group, we adjust through the process of learning together to align our views. As a group, we become simultaneously more extreme and more similar in our views.

10. Last but not least, the research of Kuran and Sunstein demonstrated the power of an availability cascade. They defined it as a self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception increasing plausibility through its rising availability in public discourse. The all-but-guaranteed effect of spending all my time with like-minded people is more polarization. Red Sox fans do not usually invite Yankees fans to share a relaxed evening at the game, do we? Mnookin mentions the Internet before closing this chapter: on the Net, it's easy to fall down a wormhole of self-referential and mutually reinforcing links. Everyone I find agrees with me. Since I only look at the links that cluster with one another, this becomes an information silo, and no contrary views or troubling data are allowed.

I'd like to make sure that the List of Ten gets included in the next edition of Thinking for Dummies. Thank you to the Book Club and our courageous contributors for a challenging and mind-expanding read.
