When a politician changes her mind, we call her a flip-flopper. In politics, consistency trumps accuracy. I can’t remember the last time a politician said, “You know what, that’s a great argument. I stand corrected.” When politicians do change their minds in public, they are castigated by the opposition and dragged through the mud for being inconsistent, indecisive, and generally unfit to be the type of hard, ideological person eligible for elected office.
But a scientist who changes her mind is simply doing her job. All good scientists are flip-floppers. The goal of any self-respecting scientist is to prove herself wrong and to come up with hypotheses that can be falsified.
Of course, not all scientists live up to this ideal: Some continue to preach the status quo simply because they identify with it. But generally speaking, scientists are far more open to changing their minds than most other segments of the population.
We’re wired to remain consistent with our pre-existing beliefs, particularly when we’ve expressed them in public. Once we’ve staked a claim, once we’ve sunk time and energy into defending it, our pride and ego tell us to stand our ground.
Thanks to the well-documented confirmation bias, we tend to undervalue evidence that contradicts our beliefs and overvalue evidence that confirms them. We filter out inconvenient truths and arguments on the opposing side. As a result, our opinions solidify, and it becomes increasingly hard to disrupt established patterns of thinking.
Sometimes, we take this a step further. In addition to shutting off our own minds to change, we disable others from challenging our beliefs by creating what scientists call unfalsifiable hypotheses. We assume (incorrectly) that our arguments are true unless proven false and then make it impossible (or prohibitively difficult) for others to falsify them.
In science, if there’s no way to test a hypothesis and prove it’s wrong, it’s essentially worthless. For example, if I tell you that I’ve got an invisible, floating dragon living in my garage, I’ve told you something that can’t be falsified. There’s nothing you can do to disprove the dragon’s existence and convince me that I’m wrong.
When our minds remain closed, when we keep opposing arguments at bay through unfalsifiable arguments, and when we associate flip-flopping with being weak or two-faced, misinformation and pseudoscience thrive. Our tribal echo chambers get louder and louder with the sounds of the same misleading voices.
The solution is simple: Attempt to prove yourself wrong on a daily basis. When our focus shifts from proving ourselves right to proving ourselves wrong, we seek different inputs, we combat deeply entrenched biases, and we open ourselves up to competing facts and arguments.
Begin by figuring out why you believe what you believe. What are your assumptions? What are the supporting facts? How do you know those facts to be accurate? Remember: Google is not a synonym for research. Just because it’s on the Internet doesn’t mean it’s true. Be skeptical of “one study found” claims, seek multiple references, and filter out low-quality information (yes, by low quality, I mean that “8 surprising superfoods you should eat every day for a long life” article that tops your Google search results).
Also ask: Do you really want this particular belief to be true? If so, be careful. Be very careful. The confirmation bias will be particularly strong for deeply entrenched beliefs.
Once you’ve identified the foundations of your argument, force yourself to poke holes in it. How can I prove myself wrong? What are the arguments on the opposing side? What facts point to a contrary conclusion? Avoid treating this exercise as a variant of the “what’s your biggest weakness” interview question, where you come up with a strength that’s not so subtly couched as a weakness (“I work too hard”).
Avoid generating straw man arguments. These are weak, easily debunked representations of the opposing position. Instead, engage in steelmanning, the opposite of strawmanning. Steelmanning requires you to find and articulate the strongest form of the opposition’s argument. Charlie Munger, vice chairman of Berkshire Hathaway, is a major proponent of this idea: “You’re not entitled to take a view,” he cautions, “unless and until you can argue better against that view than the smartest guy who holds that opposite view.”
If you’re having a hard time coming up with opposing arguments, befriend people who disagree with you. Expose yourself to environments where your opinions can be challenged. Treat “prove it” not as an affront, but as an invitation for engagement.
My latest book, The Democratic Coup d’État, emerged out of a desire to prove myself wrong. I was born in Istanbul, Turkey, at a time when the nation was under military rule. In 1980, the year before my birth, the Turkish military seized power from a civilian government in a brutal, repressive coup. Having personally witnessed these events, I was, for the majority of my life, quick to condemn all military coups. But in 2011, I decided to channel my inner scientist, challenge this deeply held belief, and come up with a falsifiable hypothesis: “All military coups are bad for democracy.” If I could find counterexamples, cases where the military promoted, rather than hampered, democratic development, my hypothesis would be proven wrong. During six years of research, I found a boatload of examples that disputed my hypothesis and showed that an event as undemocratic as a military coup can, in some cases, lead to democracy.
As a law professor, I’ve read thousands of judicial opinions, and one of my all-time favorites is Justice Harlan’s dissent in the 1896 case of Plessy v. Ferguson. In that case, a majority of the U.S. Supreme Court, over Harlan’s sole dissent, upheld the constitutionality of racial segregation (the case was later overturned in Brown v. Board of Education). Harlan’s dissent came as a surprise to many: He was a former slave owner and a staunch opponent of the Reconstruction Amendments to the U.S. Constitution, which prohibited the government from discriminating on the basis of race (among other things). When Harlan’s critics accused him of flip-flopping, his answer was simple, yet profound: “I’d rather be right than consistent.”
We need more Harlans and fewer confident ideologues. We need more empathy and fewer punches to the gut. We need minds that are malleable and curious, not calcified. We need more rational thinkers and fewer people hiding invisible floating dragons in their garages.