December 11, 2018

What do rocket scientists, cab drivers, and football players have in common?


I’m not a morning person. To me, sunrises feel as energizing as a root canal. To prepare myself for what felt like a recurring battle each morning, I would set my alarm clock thirty minutes fast.

You know the rest of the story.

Kid, meet snooze button. In economics lingo, I would “consume” those thirty minutes rather than “save” them, hitting snooze over and over like a rat in a Skinner box.

There is a phenomenon that explains my love-hate relationship with the snooze button. The same phenomenon shows why head and neck injuries increased in American football after players started wearing hard-shelled helmets to better protect themselves. It explains why installing anti-lock brakes—a now-ancient technology introduced in cars in the 1980s to avoid skidding—didn’t decrease the number of accidents. It also explains why marking crosswalks didn’t make crossing the street any safer—in some cases, it led to more fatalities and injuries.

The psychologist Gerald Wilde calls this phenomenon risk homeostasis.

The phrase is fancy, but the idea is simple: In some cases, measures intended to decrease risk backfire. Humans compensate for the reduced risk in one area by increasing risk in another.

Marked crosswalks don’t increase safety because pedestrians develop a false sense of security and assume drivers are more likely to stop. They become less vigilant about looking both ways for oncoming traffic before crossing.

Before hard helmets were introduced in American football, players would wear leather helmets that provided little protection to the head. As a result, they would use their shoulders as the initial point of contact. After players started to wear hard helmets, they began to lead tackles with their heads, which led to increased injuries and fatalities.

Over a three-year period, a study was conducted in Munich. One portion of a taxicab fleet was equipped with an anti-lock braking system (“ABS”); the remainder of the cabs had conventional, non-ABS brakes. The cars were identical in all other respects: they were driven at the same times of day, on the same days of the week, and in the same weather conditions. The drivers also knew whether their cab was equipped with ABS.

The study found no tangible difference in accident rates between the ABS-equipped cars and the remainder. But one difference was statistically significant: driving behavior. The drivers of the ABS-equipped cars became far more reckless. They tailgated more often. Their turns were sharper. They drove faster. They switched lanes dangerously. They were involved in more near-misses.

Safety measures also backfired in the 1986 Challenger space shuttle tragedy that claimed the lives of seven astronauts. The explosion resulted from a catastrophic flaw in the O-rings, thin rubber bands that seal the joints of the rocket boosters and prevent hot gases from leaking out. Because the function they serve is critical, each joint had two O-rings: a primary and, for good measure, a secondary.

On the morning of Challenger’s launch, temperatures at Cape Canaveral were uncharacteristically cold. As a result, two engineers at Morton Thiokol, the company that built the rocket boosters, recommended delaying the launch, believing that the cold could compromise the O-rings.

But upper management overruled the engineers’ recommendation. The managers decided that the O-rings had a sufficient safety margin “to enable them to tolerate three times the worst erosion observed up to that time.” What’s more, there was a fail-safe in place: They had faith that the secondary O-ring would hold if anything happened to the primary.

This belief fostered a sense of invincibility and led to catastrophe when both the primary and the secondary O-rings failed during launch. These rocket scientists were like football players tackling with their helmet-protected heads, or German cabbies driving fast and loose in their ABS-equipped cars.

In each case, implementing the safety measure gave us the satisfaction of “doing something” about a problem. But each left unaddressed the human cause lurking behind the technical cause, à la the dream within a dream in the movie Inception.

In other words, the “safe” activity felt safer than it actually was. The corresponding change in behavior wiped out any benefit from the safety measure. In some cases, the pendulum swung in the other direction: The activity became less safe than it was before the safety measure was put in place.

This doesn’t mean we should stop fastening our seat belts, buy ancient cars that lack ABS, or take up jaywalking.

Instead, pretend the crosswalk isn’t marked and walk accordingly. Assume the secondary O-ring or the ABS won’t prevent the accident. Keep your head out of the tackle, even if you’re wearing a helmet. Act as if your clock isn’t thirty minutes fast and you didn’t receive an extension on that project deadline.

The safety net may be there to catch you if you fall, but you’re better off pretending it doesn’t exist.

[Inspirations: Gerald Wilde, Target Risk; Malcolm Gladwell, Blowup; William Starbuck & Frances Milliken, Challenger: Fine-Tuning the Odds Until Something Breaks].