An experimenter walks into a room and gives you three numbers: 2, 4, 6.
She tells you that the numbers follow a simple rule, and your job is to discover the rule by proposing different strings of three numbers. The experimenter will then tell you whether the strings you propose conform to the rule. You get as many tries as you want, and there’s no time limit.
Give it a shot: What do you think the rule is?
For most participants, the experiment went in one of two ways.
Participant A said “4, 6, 8.” The experimenter replied, “Follows the rule.” The participant then said, “6, 8, 10.” “Also follows the rule,” said the experimenter. After several more strings of numbers were met with nods of approval, Participant A declared that the rule is “increasing intervals of two.”
Participant B opened with “3, 6, 9.” The experimenter replied, “Follows the rule.” The participant then said, “4, 8, 12.” “Also follows the rule,” said the experimenter. After Participant B produced several more strings of numbers that conformed to the rule, he declared that the rule is “multiples of the first number.”
Much to their astonishment, both participants had it wrong.
The rule, it turns out, was “numbers in increasing order.” The strings of numbers that both Participant A and B provided conformed to the rule, but the rule was different from the one they had in mind.
If you didn’t get the rule right, you’re in good company: Only about one in five people in the study identified the rule on their first attempt.
What’s the secret to solving the puzzle?
What set apart the successful participants from the unsuccessful ones?
The unsuccessful participants believed they had found the rule early on and proposed strings of numbers that confirmed their belief. If they thought the rule was “increasing intervals of two,” they generated strings like 8-10-12 or 20-22-24. As the experimenter validated each new string, the participants grew ever more confident in their initial hunch and assumed they were on the right track. They were so busy finding numbers that conformed to the rule they had in mind that they never tested the rule itself.
The successful participants took the exact opposite tack. Instead of trying to prove themselves right by generating strings that confirmed their hypothesis, they tried to falsify it. For example, if they thought the rule was “increasing intervals of two,” they would say “3, 2, 1.” That string doesn’t follow the rule. They might then say, “2, 4, 10.” That string follows the experimenter’s rule, but doesn’t follow what most participants assumed was the right rule.
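The two strategies can be sketched in a few lines of code. This is an illustrative simulation, not part of Wason’s experiment: the function names and the choice of test strings are mine, though the rules and the specific strings come from the passage above.

```python
# A minimal sketch of the 2-4-6 task.

def actual_rule(s):
    """The experimenter's rule: numbers in increasing order."""
    return all(a < b for a, b in zip(s, s[1:]))

def hypothesis(s):
    """Participant A's guess: increasing intervals of two."""
    return all(b - a == 2 for a, b in zip(s, s[1:]))

# Confirming tests: every string that fits the hypothesis also
# fits the actual rule, so each "yes" teaches us nothing new.
for s in [(4, 6, 8), (6, 8, 10), (20, 22, 24)]:
    assert hypothesis(s) and actual_rule(s)

# A falsifying test: (2, 4, 10) breaks the hypothesis yet still
# satisfies the experimenter's rule. Hearing "follows the rule"
# for this string is what disproves the guess.
s = (2, 4, 10)
print(hypothesis(s), actual_rule(s))  # False True
```

Because “increasing intervals of two” is a strict subset of “numbers in increasing order,” no amount of confirming strings can separate the two; only a string that violates the hypothesis can.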
The numbers game, as you may have guessed, is a microcosm for life. Our instinct in our personal and professional lives is to prove ourselves right. Every “yes” makes us feel good. Every “yes” makes us stick to what we think we know. Every “yes” gets us a gold star and a hit of dopamine.
But every “no” brings us one step closer to the truth.
The point of proving ourselves wrong isn’t to feel good. It’s to make sure that our business doesn’t fall apart or our health doesn’t break down. Each time we validate what we think we know, we narrow our vision and ignore alternative possibilities, in the same way that each nod of approval from the experimenter led participants to fixate on the wrong hypothesis.
The numbers study is from a real experiment conducted by the cognitive psychologist Peter Cathcart Wason, who coined the term confirmation bias. The bias refers to the human tendency to undervalue evidence that contradicts our beliefs and overvalue evidence that confirms them.
Ideological lock-in happens without our awareness. As a result, we must move beyond platitudes like “I’m open to proving myself wrong.” We must deliberately expose ourselves to the discomfort of being wrong and actively work to prove ourselves wrong. When our focus shifts from proving ourselves right to proving ourselves wrong, we seek different inputs, we combat deeply entrenched biases, and we open ourselves up to competing facts and arguments.
“I don’t like that man,” Abraham Lincoln once said, “I must get to know him better.” The same approach should apply to opposing ideas.
Years after he published his numbers study, Wason was stopped on the street by Imre Lakatos, a philosopher of science at the London School of Economics. “We’ve read everything you have written,” Lakatos told Wason, “and we disagree with all of it.”
He added, “Do come and give us a seminar.”