Rare Risk and Overreactions

by Bruce Schneier

Crypto-Gram, June 15, 2007

Everyone had a reaction to the horrific events of the Virginia Tech shootings. Some of those reactions were rational. Others were not.

A high school student was suspended for customizing a first-person shooter game with a map of his school. A contractor was fired from his government job for talking about a gun, and then visited by the police when he created a comic about the incident. A dean at Yale banned realistic stage weapons from the university theaters -- a policy that was reversed within a day. And some teachers terrorized a sixth-grade class by staging a fake gunman attack, without telling them that it was a drill.

These things all happened, even though shootings like this are incredibly rare; even though -- for all the press -- less than one percent of homicides and suicides of children ages 5 to 19 occur in schools. In fact, these overreactions occurred, not despite these facts, but because of them.

The Virginia Tech massacre is precisely the sort of event we humans tend to overreact to. Our brains aren't very good at probability and risk analysis, especially when it comes to rare occurrences. We tend to exaggerate spectacular, strange and rare events, and downplay ordinary, familiar and common ones. There's a lot of research in the psychological community about how the brain responds to risk -- some of which I've already written about -- but the gist is this: Our brains are much better at processing the simple risks we've had to deal with throughout most of our species' existence, and much poorer at evaluating the complex risks society forces us to face today.

Novelty plus dread equals overreaction.

We can see the effects of this all the time. We fear being murdered, kidnapped, raped and assaulted by strangers, when it's far more likely that the perpetrator of such offenses is a relative or a friend. We worry about airplane crashes and rampaging shooters instead of automobile crashes and domestic violence -- both far more common.

In the United States, dogs, snakes, bees and pigs each kill more people per year than sharks. In fact, dogs kill more humans than any animal except for other humans. Sharks are more dangerous than dogs, yes, but we're far more likely to encounter dogs than sharks.

Our greatest recent overreaction to a rare event was our response to the terrorist attacks of 9/11. I remember then-Attorney General John Ashcroft giving a speech in Minnesota -- where I live -- in 2003, and claiming that the fact that there had been no new terrorist attacks since 9/11 was proof that his policies were working. I thought: "There were no terrorist attacks in the two years preceding 9/11, and you didn't have any policies. What does that prove?"

What it proves is that terrorist attacks are very rare, and maybe our reaction wasn't worth the enormous expense, loss of liberty, attacks on our Constitution and damage to our credibility on the world stage. Still, overreacting was the natural thing for us to do. Yes, it's security theater, but it makes us feel safer.

People tend to base risk analysis more on personal story than on data, despite the old joke that "the plural of anecdote is not data." If a friend gets mugged in a foreign country, that story is more likely to affect how safe you feel traveling to that country than abstract crime statistics.

We give storytellers we have a relationship with more credibility than strangers, and stories that are close to us more weight than stories from foreign lands. In other words, proximity of relationship affects our risk assessment. And who is everyone's major storyteller these days? Television. (Nassim Nicholas Taleb's great book, "The Black Swan: The Impact of the Highly Improbable," discusses this.)

Consider the reaction to another event from last month: professional baseball player Josh Hancock got drunk and died in a car crash. As a result, several baseball teams are banning alcohol in their clubhouses after games. Aside from this being a ridiculous reaction to an incredibly rare event (2,430 baseball games per season, two clubhouses per game, roughly 35 people per clubhouse -- and how often has this happened?), it makes no sense as a solution. Hancock didn't get drunk in the clubhouse; he got drunk at a bar. But Major League Baseball needs to be seen as doing something, even if that something doesn't make sense -- even if that something actually increases risk by forcing players to drink at bars instead of at the clubhouse, where there's more control over the practice.
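
A quick back-of-the-envelope calculation makes the base rate concrete. This is only a sketch: the 35-person clubhouse headcount is a rough assumption, and it charges the single Hancock incident entirely to clubhouse drinking, which isn't even where he drank.

    # Rough exposure count for the clubhouse-drinking risk,
    # using the figures above.
    games_per_season = 2430       # MLB regular-season games
    clubhouses_per_game = 2       # home and visiting teams
    people_per_clubhouse = 35     # assumed players-plus-staff estimate

    exposures = games_per_season * clubhouses_per_game * people_per_clubhouse
    print(exposures)              # 170100 person-clubhouse exposures per season

    # Even blaming the one incident entirely on clubhouse drinking
    # yields a per-exposure rate of about 1 in 170,000.
    print(1 / exposures)          # ~5.9e-06

That's one incident against roughly 170,000 person-clubhouse exposures a season -- exactly the kind of number our brains are bad at weighing.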

I tell people that if it's in the news, don't worry about it. The very definition of "news" is "something that hardly ever happens." It's when something isn't in the news, when it's so common that it's no longer news -- car crashes, domestic violence -- that you should start worrying.

But that's not the way we think. Psychologist Scott Plous said it well in "The Psychology of Judgment and Decision Making": "In very general terms: (1) The more available an event is, the more frequent or probable it will seem; (2) the more vivid a piece of information is, the more easily recalled and convincing it will be; and (3) the more salient something is, the more likely it will be to appear causal."

So, when faced with a very available and highly vivid event like 9/11 or the Virginia Tech shootings, we overreact. And when faced with all the salient related events, we assume causality. We pass the Patriot Act. We think if we give guns out to students, or maybe make it harder for students to get guns, we'll have solved the problem. We don't let our children go to playgrounds unsupervised. We stay out of the ocean because we read about a shark attack somewhere.

It's our brains again. We need to "do something," even if that something doesn't make sense; even if it is ineffective. And we need to do something directly related to the details of the actual event. So instead of implementing effective, but more general, security measures to reduce the risk of terrorism, we ban box cutters on airplanes. And we look back on the Virginia Tech massacre with 20-20 hindsight and reproach ourselves for the things we should have done.

Lastly, our brains need to find someone or something to blame. (Jon Stewart has an excellent bit on the Virginia Tech scapegoat search, and media coverage in general.) But sometimes there is no scapegoat to be found; sometimes we did everything right, but just got unlucky. We simply can't prevent a lone nutcase from shooting people at random; there's no security measure that would work.

As circular as it sounds, rare events are rare primarily because they don't occur very often, and not because of any preventive security measures. And implementing security measures to make these rare events even rarer is like the joke about the guy who stomps around his house to keep the elephants away.

"Elephants? There are no elephants in this neighborhood," says a neighbor.

"See how well it works!"

If you want to do something that makes security sense, figure out what's common among a bunch of rare events, and concentrate your countermeasures there. Focus on the general risk of terrorism, and not the specific threat of airplane bombings using liquid explosives. Focus on the general risk of troubled young adults, and not the specific threat of a lone gunman wandering around a college campus. Ignore the movie-plot threats, and concentrate on the real risks.