In 1969, the psychologist Robert Zajonc published an article about a curious study. He’d posted a silly-sounding word—either kardirga, saricik, biwonjni, nansoma, or iktitaf—on the front page of some student newspapers in Michigan every day for several weeks. Then he sent questionnaires to the papers’ readers, asking them to guess whether each word referred to “something ‘good’” or “something ‘bad.’” Their answers were consistent, if a little strange: Nonsense words that showed up in print many times were judged to be more positive than those that appeared just once or twice. The fact of their repetition, said Zajonc, gave the words an aura of warmth and trustworthiness. He called this the mere exposure effect.
Maybe you’ve heard about this study before. Maybe you know a bit about Zajonc and his work. That’s good. If you’ve already seen the phrase mere exposure effect in print, then you’ll be more likely to believe that it’s true. That’s the whole point.
Psychologists have devised other ways to make a message more persuasive. “You should first maximize legibility,” says Daniel Kahneman, who describes the Zajonc experiment in Thinking, Fast and Slow, a compendium of his thought and work. Faced with two false statements, side by side, he explains, readers are more likely to believe the one that’s typed out in boldface. More advice: “Do not use complex language where simpler language will do,” and “in addition to making your message simple, try to make it memorable.” These factors combine to produce a feeling of “cognitive ease” that lulls our vigilant, more rational selves into a stupor. It’s an old story, and one that’s been told many times before. It even has a name: Psychologists call it the illusion of truth.
See how it works? A simple or repeated phrase, printed in bold or italics, makes us feel good; it just seems right. For Kahneman, that’s exactly what makes it so dangerous. He’s been working on this problem since 1969, when he met his late collaborator, Amos Tversky, at the Hebrew University in Jerusalem. Their famous project, for which Kahneman won a Nobel Prize in 2002, was to illuminate and categorize the pitfalls of intuition, and show that the “rational actor” of economic theory was a fiction. We’re all subject to a set of reliable biases and illusions, they argued; our decisions are consistently inconsistent. For their first major paper, published in Science in 1974 and reprinted in the appendix of Thinking, Fast and Slow, Kahneman and Tversky sorted through the foibles of human judgment and laid out a menu of our most common mistakes. Here was a primer on how perceptions go wrong and a guide for their diagnosis.
The Science paper ticked off some 20 effects and biases, many reduced to simple phrases and set off in italics to make them easier to follow. Thinking, Fast and Slow updates this list with another four decades of work in the field, amounting to a Diagnostic and Statistical Manual of Mental Disorders for the irrational mind. In the course of 418 pages, Kahneman designates no fewer than three biases (confirmation, hindsight, outcome), 12 effects (halo, framing, Florida, Lady Macbeth, etc.), four fallacies (sunk-cost, narrative, planning, conjunction), six illusions (focusing, control, Moses, validity, skill, truth), two neglects (denominator, duration) and three heuristics (mood, affect, availability). A new characterization of how we misjudge the world—and a new catchphrase that we might use to describe it—appears in almost every chapter of the book. That’s Kahneman’s goal: He’s trying to give us “a richer language” for talking about decisions, he says, and “a precise vocabulary” for their analysis.
It’s a promising thought, but to place this book in the rubric of self-help would be to mistake Kahneman—who lived for several years in Nazi-occupied France—for a benighted optimist. Again and again he reminds us that having the means to describe your own bias won’t do much to help you overcome it. If we want to enforce rational behavior in society, he argues, then we all need to cooperate. Since it’s easier to recognize someone else’s errors than our own, we should all be harassing our friends about their poor judgments and making fun of their mistakes. Kahneman thinks we’d be better off in a society of inveterate nags who spout off at the water-cooler like overzealous subscribers to Psychology Today. Each chapter of the book closes with a series of quotes—many suggested by the author’s daughter—that are supposed to help kick off these enriching conversations: You might snipe to a colleague, for example, that “All she is going by is the halo effect”; or maybe you’d interrupt a meeting to cry out, “Nice example of the affect heuristic,” or “Let’s not follow the law of small numbers.”
This imaginary world of psycho-gossip and thought correction sounds like a very annoying place. And while Kahneman’s book offers some clear and engaging examples of how our minds work—or don’t work—it’s never clear whether the propagation of his catchphrases would really improve our lives. Even if organizations and governments can benefit from a rich language of cognitive bias, what would it mean for individuals? Do new ways of talking lead us to make better judgments from one day to the next? (One might as well ask whether the adoption of Freudian terms in the 20th century helped us to manage our ids.)
Whatever its merits, Kahneman’s program—to gift us with a “precise vocabulary” of illusions—plays out according to his own rules and logic. He packages his findings about decision-making into tiny marketing campaigns full of branded notions that worm their way into our heads like viral media. Repeating a slogan makes it seem safer and saner; it elevates his ideas to the level of truthiness. Remember the mere exposure effect? Kahneman wants us to develop a gut feeling about the inadequacy of gut feelings.
It’s a trick that’s become de rigueur in a certain type of science writing, and a fundament in the burgeoning field of “ideas” journalism. Let’s call it the effect effect: Reduce whatever you’re talking about to a single, italicized phrase, so much the better for tapping into a network of TED talks and Radiolab broadcasts, and then repeat, repeat, repeat. I’ve done it in my own writing, and it’s the driving force in a long run of pop-psych best-sellers (most of which are cited in Kahneman’s book). Nassim Taleb’s The Black Swan (2007) brims over with these coinages: He’s got the strengthening effect, the tournament effect, the halo effect, the silent evidence effect, the hedgehog effect, the winner-take-all effect, the butterfly effect, the spandrel effect, the Matthew effect, the nerd effect, and, of course, the black swan effect. James Surowiecki’s The Wisdom of Crowds (2004) describes the Matthew effect, the reputation effect, the cooperator bias, the confirmation bias, and the long-shot bias. Christopher Chabris and Daniel Simons’ The Invisible Gorilla (2010) gives us the blur effect, the Mozart effect, the expectancy effect, the halo effect, and the Hawthorne effect. More effects crop up in Richard Thaler and Cass Sunstein’s Nudge (2008), in Jonah Lehrer’s How We Decide (2009), in Malcolm Gladwell’s books, in pieces about Malcolm Gladwell’s books, in Slate, in Slate, and in Slate. The same catchphrases even recur from one best-seller to the next, emerging in different contexts, slightly altered or not at all, like a thinking man’s LOLcats. If these ideas are good and useful—as many of Kahneman’s seem to be—then everybody wins. But how would you know for sure?
The ubiquity of the effect effect raises a couple of questions of its own. Is there a point at which we’ll have reached a state of overdiagnosis, where these self-help catchphrases have become so plentiful and diverse that we can no longer remember what they mean? Psychiatrists are just now grappling with the same concern: As their standard DSM guide swells to include marginal disorders like Internet addiction, excessive sex, and prolonged bitterness, critics worry that even the best-established forms of disease might get hopelessly diluted. Could the same happen in pop psychology? Eventually we’ll be so inundated with “effects” that the word effect will lose its effect. Maybe that’s already happened.
Another question arises from the fact that so many books repeat the same basic message, and invoke such similar “effects,” to explain how our intuitions can help us and hurt us. All these books are best-sellers; are the same people reading each one? (What about the best-selling New Atheist manifestos—how many readers end up buying The End of Faith, and The God Delusion, and also God Is Not Great?) Kahneman himself offers some insight into why certain types of books succeed in spite of their redundancies or because of them. There’s little in Thinking, Fast and Slow that hasn’t been said before, in books and journals and lots of magazine articles. It doesn’t matter, though. One of the new book’s lessons is that familiarity is easy. It feels good. We have a tendency to like what we’ve seen before.
I might be inclined to tell you that I enjoyed this book, that its shopworn examples are well-chosen and nicely told, but there may be no point. Why should anyone believe me? I may be another victim of the effect effect, and the same goes for you.