
Why is it so hard to be rational? In everyday life, the biggest obstacle to metacognition is what psychologists call the “illusion of fluency.” As we perform increasingly familiar tasks, we monitor our performance less rigorously; this happens when we drive, or fold laundry, and also when we think thoughts we’ve thought many times before. Studying for a test by reviewing your notes, Fleming writes, is a bad idea, because it’s the mental equivalent of driving a familiar route. “Experiments have repeatedly shown that testing ourselves—forcing ourselves to practice exam questions, or writing out what we know—is more effective,” he writes. The trick is to break the illusion of fluency, and to encourage an “awareness of ignorance.”

Fleming notes that metacognition is a skill. Some people are better at it than others. Galef believes that, by “calibrating” our metacognitive minds, we can improve our performance and so become more rational. In a section of her book called “Calibration Practice,” she offers readers a collection of true-or-false statements (“Mammals and dinosaurs coexisted”; “Scurvy is caused by a deficit of Vitamin C”); your job is to weigh in on the veracity of each statement while also indicating whether you are fifty-five, sixty-five, seventy-five, eighty-five, or ninety-five per cent confident in your determination. A perfectly calibrated individual, Galef suggests, will be right seventy-five per cent of the time about the answers in which she is seventy-five per cent confident. With practice, I got fairly close to “perfect calibration”: I still answered some questions wrong, but I was right about how wrong I would be.
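
Galef’s drill lends itself to a simple scoring check. The sketch below is a hypothetical illustration, not anything from her book (the answers in it are invented): it groups responses by the confidence level you reported, fifty-five through ninety-five per cent, and compares each group’s stated confidence with the share you actually got right. Perfect calibration, in her sense, means the two numbers match in every bucket.

```python
from collections import defaultdict

# Each entry: (stated confidence in per cent, whether the answer was correct).
# The answers below are made up purely to show how the scoring works.
answers = [
    (95, True),   # e.g. judged "Scurvy is caused by a deficit of Vitamin C" true, and it is
    (65, False),  # e.g. judged "Mammals and dinosaurs coexisted" false, but they did coexist
    (75, True),
    (75, True),
    (75, False),
    (85, True),
    (55, True),
    (55, False),
]

def calibration_report(answers):
    """Compare stated confidence with actual accuracy in each confidence bucket."""
    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[confidence].append(correct)
    for confidence in sorted(buckets):
        results = buckets[confidence]
        accuracy = 100 * sum(results) / len(results)
        print(f"said {confidence}% sure: right {accuracy:.0f}% of the time ({len(results)} answers)")

calibration_report(answers)
```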

There are many calibration methods. In the “equivalent bet” technique, which Galef attributes to the decision-making expert Douglas Hubbard, you imagine that you’ve been offered two ways of winning ten thousand dollars: you can either bet on the truth of some statement (for instance, that self-driving cars will be on the road within a year) or reach blindly into a box full of balls in the hope of retrieving the one marked ball among them. Suppose the box contains four balls, so the blind draw gives you a one-in-four chance. Would you prefer to answer the question, or reach into the box? (I’d prefer the odds of the box.) Now suppose the box contains twenty-four balls—would your preference change? By imagining boxes with different numbers of balls, you can get a sense of how much you really believe in your assertions. For Galef, the box that’s “equivalent” to her belief in the imminence of self-driving cars contains nine balls, a one-in-nine draw, suggesting that she has eleven-per-cent confidence in that prediction. Such techniques may reveal that our knowledge is more fine-grained than we realize; we just need to look at it more closely. Of course, we could be making out detail that isn’t there.
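
The arithmetic behind the equivalent bet is just one over the number of balls, on the assumption that each box holds exactly one marked ball. The short sketch below is an illustrative helper, not anything drawn from Hubbard or Galef; it converts a box size into the implied confidence and reproduces the figures above: four balls imply twenty-five per cent, nine about eleven, twenty-four about four.

```python
def implied_confidence(num_balls: int) -> float:
    """Confidence (in per cent) implied by indifference to a box with one marked ball.

    If betting on your statement feels no better and no worse than drawing
    blindly from a box of num_balls balls (one of them marked), your
    confidence in the statement is roughly 1 / num_balls.
    """
    if num_balls < 1:
        raise ValueError("the box must contain at least one ball")
    return 100.0 / num_balls

for n in (4, 9, 24):
    print(f"{n:>2} balls -> about {implied_confidence(n):.0f}% confidence")
# Output:
#  4 balls -> about 25% confidence
#  9 balls -> about 11% confidence
# 24 balls -> about 4% confidence
```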