Risk Management: The Five Rules of Risk

Insaf Ali


Why do you walk outside?

You know it’s risky, right?

Don’t you value your life? Or rather, how much do you value your life?

Could the answer be infinite?

Could your life be worth everything to you?

After all, most people would say that they gain nothing from losing their life, and therefore that its value, to themselves, must be infinite. But that can’t be true: your life cannot be of infinite value to yourself because, if it were, you would never walk outside. Walking outside is risky.

Every time you set foot on the sidewalk, you raise the chance that your life will soon end.

So why do you walk outside?

Risk is the balance of negatives and positives. For you or anyone to consider a risk to be acceptable, the positives must outweigh the negatives.

For example, in a given year, an American has a one in 55,000 chance of dying by being hit by a car while walking down the street.

If you valued your life infinitely, that infinite value would be multiplied by your chance of death, 1 in 55,000 per year, to create the total negative value of walking outside, but of course, infinity times anything is infinity.
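To make that balance concrete, the expected yearly cost of walking outside is just the probability of death multiplied by whatever value you place on your life. A minimal sketch, where V is that value and the $10 million figure is purely an illustrative stand-in (roughly in line with commonly cited "value of a statistical life" estimates, not a number from this article):

$$\mathbb{E}[\text{cost per year}] = p_{\text{death}} \times V = \frac{V}{55{,}000}$$

$$V = \infty \;\Rightarrow\; \mathbb{E}[\text{cost}] = \infty, \qquad V = \$10{,}000{,}000 \;\Rightarrow\; \mathbb{E}[\text{cost}] \approx \$180 \text{ per year}$$

With any finite value, walking outside only needs to be worth a modest amount to you for the positives to outweigh the negatives; with an infinite value, no benefit could ever be enough.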

So, you can’t possibly value your life infinitely, because you do walk outside.

You somehow decide that that 1 in 55,000 chance of death is worth it for whatever you gain from walking down the street.

This is the process of risk evaluation and, for absolutely every decision you make, you subconsciously balance the negatives and positives.

Let’s ask another question: would you, personally, drive a car?

Almost every person would answer yes.

We even know how many people would answer yes because two-thirds of all Americans have a driver’s license. So, a follow-up question: would you, personally, go mountain biking?

Would you consider that to be too risky — to bring you a little too close to death?

Not all, and probably not most, but many more would answer no.

This is something that has come up time and time again as mountain biking has gained popularity and schools have tried to add it to their roster of sports, only for parents and administrators to say no.

They said that it was far too dangerous a sport, that the risk of injury or death was unacceptable. However, in a given year, the odds of death while driving for a given licensed driver in the US are 1 in 600.

Meanwhile, the odds of death while mountain biking for a given mountain biker are about 1 in 30,000.
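Taking the article’s two figures at face value, the comparison works out as follows:

$$\frac{1/600}{1/30{,}000} = \frac{30{,}000}{600} = 50$$

In other words, by these numbers, a given driver is roughly fifty times more likely to die in a year of driving than a given mountain biker is to die in a year of riding.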

Is that what you would have expected?

Before seeing these statistics, would you honestly have thought that this, something you see or possibly do every day, is more dangerous than something you watch people do online in awe?

Most people would say no, so how are people’s risk perceptions so wrong?

What makes driving an acceptable risk to nearly everyone but mountain biking an unacceptable risk to many?

Well, it’s simple: ingrained in our brains are flawed processes for evaluating risk. For various reasons, what we fear will kill us and what will actually kill us are very, very different. These are the rules of our flawed human risk evaluation system.

Rule 1: Voluntary risks are more acceptable than involuntary risks

That is to say, people are far more tolerant of risks that they can choose to engage in than those that they can’t. That’s why most of the population will happily drive a car, with a 1 in 600 annual risk of death, but if a food preservative was introduced that had the same level of risk, there would be a mass outcry.

In fact, studies have shown that people are roughly 1,000 times more tolerant of voluntary risk than involuntary.
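As a rough illustration of what a factor of 1,000 means, using the driving figure from earlier as the voluntary benchmark (an illustrative extrapolation, not a number taken from the studies themselves):

$$R_{\text{involuntary, tolerated}} \approx \frac{R_{\text{voluntary, tolerated}}}{1{,}000} \approx \frac{1/600}{1{,}000} = \frac{1}{600{,}000} \text{ per year}$$

If people will voluntarily accept a 1 in 600 annual risk behind the wheel, the same public would balk at an imposed risk much higher than about 1 in 600,000.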

The perception of choice is powerful. That means that if measures are introduced that make an involuntary risk seem voluntary, the risk is better tolerated, even if those measures have little real effect on the average overall risk. This gets tricky with activities that are somewhere in between voluntary and involuntary.

Say, for example, a company is polluting the air in a town. This seems like an involuntary risk (you can’t, after all, choose not to breathe the polluted air), but consider how communication can reframe it.

The factory owner might say, “you can always move away,” or, “buy an air filter,” or, “stay inside on bad air days.” If you give people ways that seem to control their personal exposure to an involuntary risk, they might start to believe that it is a voluntary risk.

Not only does this potentially increase the tolerance of those exposed to the risk, but it will also increase the tolerance of the bystanders not exposed. They might think, “those people are choosing to stay there, to skip the air filter, and to go outside.”

The blame is slowly shifted onto the individual, rather than onto the originator of the risk. The overall danger from something can stay exactly the same, but give people a semblance of control over it and they will tolerate it.

Rule 2: Acceptance is inversely proportional to prevalence

The more people who are exposed to a given risk, the less the public will accept it. We can accept the huge risk of space travel, with only a tiny number of people partaking, yet we could not accept it if air travel carried the same level of risk.

Both are voluntary activities, but there are two parallel effects taking place here — fewer people will engage in higher-risk activities, and fewer people will tolerate high exposure risks.

This effect is also why the risk tolerance for things like vaccines, which are distributed to hundreds of millions or billions of people, is so dramatically lower than the risk tolerance of experimental medicines, which are distributed to a small and select number of individuals.

Rule 3: Disease is a Yardstick

It’s been shown that average human risk acceptance roughly tracks whether an activity is more or less deadly than one’s chance of dying from disease.

What’s possibly happening here is that humans have accepted the likelihood of their death at the hands of disease, so partaking in anything riskier than disease is unconscionable, because it makes that other activity the more likely cause of death. It changes one’s statistically predetermined end.

For example, the researcher who defined this rule found that the average risk of death per hour for a soldier fighting in the Vietnam War was roughly equal to the average human’s risk of dying per hour from disease.

Therefore, the general public more or less accepted the risk faced by those fighting there. This disregards the fact that the individual soldiers in Vietnam were, at their average age, at dramatically lower risk of dying from disease than the overall population, which is skewed heavily by the elderly, but that didn’t matter, because the public’s yardstick is disease: riskier than disease is not acceptable; less risky is.
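As a back-of-envelope sense of where that yardstick sits, assume an overall mortality rate of roughly 0.9% per year, most of it from disease (an illustrative assumption, not a figure from the original research):

$$\frac{0.009 \text{ deaths per person-year}}{8{,}760 \text{ hours per year}} \approx 1 \times 10^{-6} \text{ deaths per person-hour}$$

By this rule, activities that kill noticeably more than about one participant in a million per hour of exposure tend to be judged unacceptable, while those below that line tend to be tolerated.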

Rule 4: Novelty increases perceived risk

We fear what is new and not yet understood. As understanding goes up, the perceived risk goes down, and as understanding goes down, the perceived risk goes up.

For example, in one now-famous 1970s study, college students were asked to rank thirty activities and technologies from most to least risky — simple as that. In the end, on average, they ranked nuclear power as the riskiest of these thirty, above cars, handguns, smoking, and everything else.

Then, experts, whose risk perception was shown in separate research to closely match real risk, ranked the same list of thirty. To them, nuclear power was the 20th most risky activity or technology on the list.

Meanwhile, non-nuclear electric power was 19th on the college students’ list, but 9th on the experts’ list. The college students overestimated the risk of nuclear power and underestimated the risk of conventional electric power.

So what’s the difference between nuclear and non-nuclear electric power?

Well, normal electric power had been widespread for about a century by the time this research was done.

Nuclear power, meanwhile, was just gaining prevalence.

This risk perception study was done before the well-known meltdowns at Three Mile Island and Chernobyl, when only a single-digit number of people had died directly from nuclear energy incidents, yet people feared it more than cars, which killed roughly 50,000 people in the US in the year they were surveyed.

Today, four decades later, the perceived risk of nuclear power is greatly reduced because it is no longer new. Younger people now learn how nuclear power works in school, and older people have a longer record of evidence about its actual risk level. Understanding leads to acceptance.

Rule 5: Numbers are numbing

The value of saving one human life should be half of the value of saving two human lives, and then the value of saving two human lives should be half of saving four human lives, and so on and so forth.

That’s intuitive: each life is, on average, worth saving as much as the next, meaning that a graph with the number of lives saved on the x-axis and the value of saving them on the y-axis should be a straight line.

There’s even an argument that, going a level deeper, the graph should curve upward, since large-scale loss of life has a more widespread negative effect on society.

Losing a million lives would be more than a million times worse for a society than losing one life. So the graph should either be a straight line or curve upward.

However, if you change this graph from what humans logically agree on to what they psychologically perceive, research shows that it flattens out: the curve rises quickly at first and then levels off. Humans perceive saving the 1,000th life as less valuable than saving the 1st, even though logically they are of equal value.
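One hedged way to write the two curves down, borrowing a standard psychophysical form (the specific functional form here is an assumption for illustration; the research describes the shape, not a single equation):

$$V_{\text{logical}}(n) = k \cdot n \qquad\qquad V_{\text{perceived}}(n) \approx k \cdot n^{\beta}, \quad 0 < \beta < 1$$

With an exponent below one, the perceived value of saving each additional life shrinks as the count grows, which produces exactly the flattening curve described above.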

This can partially explain how people can tolerate mass casualty natural disasters or pandemics or genocides with little more concern than they do small terrorist attacks.

While this phenomenon doesn’t make logical sense, it does make some psychological sense because it lines up with how people perceive other quantifiable collections.

For example, humans value a dollar less the more they earn — your first dollar is worth far more than your millionth, to your mind, even though a dollar is still worth a dollar.

In the context of human life, though, this is more destructive. Humans can’t comprehend mass tragedy through statistics, so statistics are not an effective way to communicate it.

It is for this reason that people might underestimate the risks that cause mass casualty events.

These rules and others all combine to create a simple truth: the way we, as a collective, perceive risk is wrong.

We overestimate the danger of low-risk activities, underestimate the danger of high-risk activities, think things that we like are less risky, and conflate unfair danger with higher danger. The result is that perceived risk swings wildly above and below actual risk.

We just can’t trust ourselves. We can think, earnestly, that risk has increased, but it hasn’t, or that it’s decreased, but it hasn’t.

That we, the general public, cannot reliably judge how risky something is makes life itself risky. Science and statistics might capture the real risk, but we don’t perceive it, so whether we’re actually engaging in acceptable risk is a crapshoot.

Normally, this isn’t something worth stressing about.

Our flawed risk perception will, over time, average out to something close to the real risk. The only time it’s really worth worrying about is when people are making decisions about risks that affect others.

When people, somehow, get to decide what risk others face, perception is dangerous, because it can silently and unknowingly eclipse science, statistics, and fact.
