“Life is inherently risky. There is only one big risk you should avoid at all costs, and that is the risk of doing nothing.” Denis Waitley
The communication of relative risk is integral to environmental issues. Risk is frequently exaggerated, or totally underestimated. Consider air travel. Air travel is the safest form of mass transport, yet how many of us are afraid of flying? I can put my hand up straight away. Moreover, I am afraid of flying in big aeroplanes but not little ones! Once I flew in a small firefighting plane with a drunk pilot who flew down a steep mountain valley and buzzed the tops of the trees, and I was not in the slightest afraid. If I were rational about risk, I surely should have been afraid of the latter, where I was at extremely high risk of dying in an air crash! And how many of us are afraid of driving? I suppose it depends on where you live (I am afraid of driving in my home country of South Africa, but in my adopted country of Australia, I am not in the slightest afraid). But if you look at the degree of risk attached to both activities, driving is far more dangerous than flying. For example, using the metric of deaths per billion journeys, you get the following table, and leaving aside the Space Shuttle, you can see that air travel is pretty safe:
| Mode | Deaths per billion journeys |
| --- | --- |
| Space Shuttle | 14,925,373 |
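The metric itself is just a scaled rate: fatalities divided by journeys, multiplied up to a billion journeys. A minimal sketch, where the mode names and counts are purely hypothetical and are not the figures behind the table above:

```python
# Relative risk expressed as deaths per billion journeys.
# All counts below are hypothetical, chosen only to illustrate the arithmetic.

def deaths_per_billion_journeys(deaths: int, journeys: int) -> float:
    """Scale a raw fatality count to a rate per 1e9 journeys."""
    return deaths / journeys * 1_000_000_000

modes = {
    "car": (300, 50_000_000),        # hypothetical: 300 deaths over 50M journeys
    "airliner": (50, 1_000_000_000), # hypothetical: 50 deaths over 1B journeys
}

for mode, (deaths, journeys) in modes.items():
    rate = deaths_per_billion_journeys(deaths, journeys)
    print(f"{mode}: {rate:,.0f} deaths per billion journeys")
```

Normalizing per journey (rather than per kilometre or per hour) is itself a framing choice; each denominator flatters a different mode of transport, which is part of why relative risk is so easy to miscommunicate.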
As the cars and aeroplanes show, relative risk is frequently miscommunicated. Environmental issues are much the same, and arguably, many campaigns depend on this. Years ago, I had a brief contract working for a very large multinational environmental organization (they do a lot of really good stuff, so I am not going to name them). I was hired to run a campaign to highlight the shipping of nuclear waste (which is not necessarily a good thing, but that is not the point of this article). I had to organize demonstrations, send out media releases, and so on, all based around the fact that the ship might sink and the nuclear waste end up at the bottom of the sea.
I was provided with “fact” sheets, and told to emphasize the huge risk of this ship passing by, particularly that the area was prone to freak, giant waves. Because I am a researcher, I did a little digging into some of these statements, particularly the threat posed to the ship by freak waves. I contacted the world expert on these waves and asked him if this were indeed the case. He answered, most certainly…but why on earth was I wasting my time protesting against the shipping of nuclear waste, one of the most tightly controlled activities on the planet, when there were far more severe environmental risks that I would be better off focusing on?
My curiosity aroused, I asked, what? He said that many companies (and probably countries) had extremely unseaworthy ships which sailed under flags of convenience (registering a ship in another country to avoid regulations or operating costs, such as maintenance requirements). These ships often carried highly toxic products, such as crude oil, other fossil fuels or dangerous chemicals, and because they were not well maintained, tended to hug the shoreline…where they were most vulnerable to freak waves, not to mention hazards such as rocks and reefs. He told me that should one of those ships sink, it might cause vast environmental damage to sensitive shorelines, and that the risk of this happening was an order of magnitude higher than that of the nuclear waste ship sinking, whose cargo was encased in a block of glass and metal and controlled by international regulations.
But nuclear sounds so nasty and dangerous, so why would anyone care about some old rust bucket? And of course, let me be very clear, there are indeed risks with nuclear waste, some of which are extremely serious. But the point is that the perception of the relative risk was completely incorrect. So, why do people not think rationally about risk? There are ten factors related to risk evaluation (the under- and over-estimation of risk), and I will detail these below:
- Dread (and catastrophic potential). We are pretty useless at predicting the future, despite what late night TV psychics might claim. When we attempt to predict what might happen, we more often imagine terrible scenarios (e.g. that a tornado will destroy our home rather than just damaging a couple of cars). We are not rational about such risks, and the media know this. Many more people will buy a newspaper or click on a link if it has a sensationalist headline. This has recently been taken to extremes by some media outlets such as Upworthy and News Limited.
- Control and choice. The majority of us have a deep need for control over our lives, and often assume that we have more control than we do in reality. This is related to choice; given a choice between two activities of equal risk, we downplay the risk of the one we choose, because we feel that having the choice gives us more control. This is one reason why we underestimate risks.
- Natural or man-made risks. We often think of natural disasters as less risky than man-made disasters. This is related to the point about control; we think we have more control over man-made disasters. On the other hand, sometimes the feeling of reduced control over natural disasters can make them seem all the more terrifying. This is related to intentional vs accidental events. For example, we may be terrified of the thought of bio-terrorism (using viruses to kill people) or Ebola but blithe about the risk of influenza. Yet, according to the World Health Organization, influenza kills between 250,000 and 500,000 people annually, and bio-terrorism kills approximately none.
- Children. Anything that is mooted as a risk to children is deemed greater than a risk to adults. This attitude gives rise to some countries being termed “nanny states”, where any risk to children is legislated against. The anti-vaccination brigade use this argument very effectively, arguing that vaccination is a greater risk to children than not being vaccinated. Although this is completely fallacious, the immediate risk of a vaccine injury seems more dangerous than the future risk of a preventable disease.
- Novelty. If we have never encountered a risk before, we might spend more time thinking about it, and thus assume it is more risky than it really is. We drive in cars almost every day, but underestimate their risk in comparison to aeroplanes, in which we might fly only once a year. That is why anomalous things are given so much emphasis in the media. We like to read about the unusual, but then we assume that the unusual is more common than it actually is. Some call this the “red car syndrome”; we rarely notice red cars on the road, until we buy one, then every second car seems to be red. This is also related to the human tendency to find patterns, even when they don’t exist.
- Publicity and media. This is paramount. If something receives a lot of media attention, we might assume the risk is a lot more significant. For example, in 2011, the state in which I live, Queensland, Australia, had a major flood event. Some areas were badly flooded but very few people were killed. However, the media inflated these floods to such an extent that people overseas panicked, thinking their relatives were in extreme danger, when in reality there was a lot of property damage, but very little risk unless someone did something really stupid, like driving on flooded roads.
- Propinquity. This means that a risk to me is seen as greater than a risk to someone else. This is related to the neighbourhood or personified effect, which I talk about later. If something is more immediate, and directly influences me, then I think it is a greater risk. A serious accident in my local suburb is of far greater importance than a thousand people dying in one month in automobile accidents in another country. We may overestimate the risk, say to our own children, without knowledge of the facts (the driver might have been under the influence of drugs, speeding or had a medical condition, all or any of which may have made him or her much more likely to crash).
- Immediacy. We overestimate threats that are immediate rather than those that may occur sometime in the future. This is the most important reason why the risk of climate change is underestimated (and deliberately communicated as such). Those with vested interests overestimate the risks of taking action now (generally economic) and underestimate the risks of doing nothing (vast social, economic and environmental devastation). This is also related to the timing of something, and is sometimes called the frog in hot water syndrome. If something happens slowly and imperceptibly (e.g. a change in climate averages) it is likely to be taken less seriously than something happening in the short term (e.g. a flash flood).
- Risk-benefit tradeoff. If taking a risk can also lead to opportunities (e.g. gambling), then a risky action can often be viewed as less risky than it actually is.
- Trust. If we trust the other people involved, we are more likely to minimize the risk than if we don’t trust them. If I go climbing with my cousin who is a climbing instructor, I might feel more secure than if I went climbing with someone I do not know who “looks a bit dodgy”.