January 10, 2018

Challenging Authority Is More Important Now Than Ever. Here’s Why.

The 43-year-old President-elect had big shoes to fill. John F. Kennedy’s predecessor, Dwight D. Eisenhower, was a five-star general who served as Supreme Commander of the Allied Forces during World War II. Lacking Eisenhower’s military prowess and foreign policy experience, and eager to prove himself in the heady young days of his presidency, JFK found himself caught in a crisis. In 1961, he buckled under pressure from the Joint Chiefs and authorized the Bay of Pigs invasion to topple the communist government in Cuba.

The invasion was a fiasco. Kennedy had assumed, incorrectly, that “the military and intelligence people have some secret skill not available to ordinary mortals.” By the following year, when the Soviet Union deployed missiles in Cuba, Kennedy had grown a backbone and learned not to trust the Joint Chiefs blindly. Pressured again by the military to take action, he rejected their plan for a full-scale air campaign to destroy the missile sites and opted instead for a far less intrusive naval blockade of the island. In response, the Soviets agreed to de-escalate the conflict and withdraw the missiles. Kennedy’s willingness to question the Joint Chiefs saved the world from mutually assured destruction.

Even for those of us who aren’t making life-and-death decisions, Kennedy’s skepticism of the Joint Chiefs is instructive. In praising his willingness to stand up to the authorities, I’m not advocating anti-intellectualism or an assault on reason. Ignorance is not a virtue. There are truths–like scientific laws–that aren’t subject to straw polls. We shouldn’t trust our health to faith healers or disbelieve global warming because the winters feel cold.

But alongside undeniable scientific truths, there are assertions of belief, which can be right or wrong. For those assertions, we should channel our inner JFK more often and apply a Goldilocks approach to our confidence in the authorities: not too much, not too little. When our confidence in the authorities runs too high, we don’t bother to evaluate the evidence or arm ourselves with facts and nuances. When it runs too low, we throw out hard-won expertise along with the nonsense.

Here’s the problem: Misinformation and pseudoscience thrive in an environment where our own critical thinking has atrophied. When we outsource our civic responsibility to the authorities, when we leave it to them to draw conclusions, when we fail to exercise our critical thinking muscles, they wear down over time. Without an informed public willing to question confident claims–whatever their source–democracy decays.

Authorities can be wrong–and disastrously so. It was a doctor who, in a prestigious medical journal, peddled the myth of a link between autism and the MMR (measles, mumps, and rubella) vaccine. Although the study turned out to be fraudulent, the myth persisted. Parents refused to vaccinate their children, resulting in a significant uptick in measles cases. Neurologists used to believe that lobotomies would cure mental disorders, astronomers agreed that the universe was static, and psychiatrists classified homosexuality as a “sociopathic personality disturbance.” The experts cautioned that dietary fat causes heart disease even though the evidence wasn’t there. In each of these cases, the authorities were the ones fooling themselves–and by extension, fooling us.

At least in JFK’s time, most experts earned their titles. Since then, expertise has become a largely self-proclaimed qualification. Experts become experts by calling themselves that. Equipped with little more than Wikipedia knowledge, everyone and their cousin can claim expertise on something and begin spreading nonsense. As if the term “expert” were going out of fashion, various other labels have been invented for the same self-appointed status. Thought leader. Influencer. Authority blogger.

What they lack in knowledge, they make up for in drama and assertiveness. News stations feed the 24-hour news monster with loud-mouthed demagogues who dominate the public discourse by evangelizing knowledge they don’t have. Experts at think tanks and government institutions churn out authoritative policy papers on countries they’ve never visited. Some of these opinions are undoubtedly valuable, but with the rapid proliferation of self-proclaimed experts, it’s become increasingly difficult to separate the signal from the noise.

Opinions of the authorities can do lasting damage because we’re wired to trust them. Social science studies show that even simple symbols of authority (such as a title or clothing) are enough to inspire concerning levels of obedience. In an infamous study by Stanley Milgram, participants were willing to deliver what they believed were life-threatening electric shocks to a victim in apparent agony (played by an actor), simply because a lab-coated researcher told them to do so.

Authorities are human and fallible. They’re vulnerable to the same biases and conflicts of interest as the rest of us. Because their expertise is invested in the current view of their field, they can reflexively dismiss unorthodox views and protect their egos at the expense of an objective search for the truth. Experts need maverick thinking to avoid tunnel vision, but they’re often hostile to it. For example, the theory of continental drift, which posited that the continents were once a single landmass and drifted apart over time, was initially declared absurd. For decades, Nicolaus Copernicus was reluctant to publish his finding that the Earth is not at the center of the universe, for fear of rejection and ridicule by the experts. Ignaz Semmelweis, a Hungarian physician, was ostracized by the medical community for suggesting that doctors wash their hands before treating patients to avoid spreading disease. He ended up in an asylum. Each of these instances proved Max Planck right: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die.”

Last year, when I set out to write a book about military coups, the experts had already reached a conclusion: Coups are bad for democracy. No ifs, no buts, no exceptions. During my research, I uncovered numerous coups that didn’t fit this pattern. These coups toppled dictatorships and built democracies rather than destroying them.

When I made this claim at a conference, a senior scholar awoke from his daydream, turned his head deliberately in my direction, and gave me the same type of stare that the Cardinals must have given Galileo for asserting that the Earth revolves around the Sun. Challenging the consensus opinion was academic heresy.

Even after he earned a Nobel Prize, the physicist Richard Feynman didn’t see himself as an authority. Instead, he thought of himself as a “confused ape” and approached everything around him with the same level of curiosity, enabling him to see the nuances that others in his field had dismissed. In a rebuke to self-professed experts with confident claims, “I would rather have questions that can’t be answered,” he said, “than answers which can’t be questioned.”

In the end, the world needs fewer self-assured authorities and more confused apes.
