Safety Moment #60: Risk Perception
The material in this article to do with risk perception is taken from the book Process Risk and Reliability Management.
Risk perception is fundamentally a subjective matter; no matter how hard analysts strive to make the topic objective, the fact remains that, as Oscar Wilde (1854-1900) once said, "A truth ceases to be a truth as soon as two people perceive it."
This observation regarding different truths applies to hazards analysis and risk management work. Each person participating in a hazards analysis has his or her own opinions, memories, attitudes and overall world view. Most people are - in the strict sense of the word - prejudiced; that is, they pre-judge situations rather than analyzing the facts rationally and logically. People jump to preconceived conclusions, and those conclusions will often differ from those of other people who are looking at the same information through their own world view. With regard to risk management, even highly trained, seasoned experts - who generally regard themselves as being governed only by the facts - will reach different conclusions when presented with the same data. Hence there is no such thing as "real risk" or "objective risk": if risk cannot be measured objectively, then objective risk does not exist at all.
In his book Bad Science the author Ben Goldacre (Goldacre 2008) has a chapter entitled Why Clever People Believe Stupid Things. Many of his insights can be applied to the risk assessment of process facilities. He arrives at the following conclusions:
- We see patterns where there is only random noise.
- We see causal relationships where there are none.
- We overvalue confirmatory information for any given hypothesis.
- We seek out confirmatory information for any given hypothesis.
- Our assessment of the quality of new evidence is biased by our previous beliefs.
The subjective component of risk becomes even more pronounced when the perceptions of non-specialists, particularly members of the public, are considered. Hence successful risk management involves understanding the opinions, emotions, hopes and fears of many people, including managers, workers and members of the public.
Some of the factors that affect risk perception are discussed below.
Degree of Control
Voluntary risks are accepted more readily than those that are imposed. For example, someone who believes that the presence of a chemical facility in his community poses an unacceptable risk to himself and his family may willingly go rock climbing on weekends because he feels that he has some control over the risk associated with that activity, whereas he has no control at all over the chemical facility or over the mysterious odors it produces. Hence rock climbers will quickly point to evidence that their sport is safer than, say, driving to the mountains. But their response misses the point - they are going to climb rocks anyway; they then assemble the evidence to justify what they are doing (see point #4 above).
Similarly, most people feel safer when driving a car rather than riding as a passenger, even though half of them must be wrong. The feeling of being in control is one of the reasons that people accept highway fatalities more readily than the same number of fatalities in airplane crashes.
The desire for control also means that most people generally resist risks that they feel they are being forced to accept; they will magnify the perceived risk associated with tasks that are forced upon them.
Familiarity with the Hazard
Most people understand and accept the risks associated with day-to-day living, but they do not understand the risks associated with industrial processes, which makes those risks less acceptable. A cabinet full of household cleaning agents, for example, may actually pose more danger to an individual than the emissions from the factory that makes those chemicals. But the perceived risk is lower.
Hazards that are both unfamiliar and mysterious are particularly unacceptable, as can be seen by the deep distrust that the public feels with regard to nuclear power facilities.
Direct Benefit
People are more willing to accept risk if they are direct recipients of the benefits associated with that risk. The reality is that most industrial facilities provide little special benefit to the immediate community apart from offering some job opportunities and an increased local tax base. On the other hand, it is the community that has to bear all of the risk associated with those facilities, thus creating the NIMBY (Not In My Backyard) response.
Personal Impact
The effect of the consequence term will depend to some degree on the persons who are impacted by it. For example, if an office worker suffers a sprained ankle he or she may be able to continue work during the recovery period; an outside operator, however, may not be able to work at his normal job during that time. Or, to take another example, the consequence of a broken finger will be more significant to a concert pianist than to a process engineer.
Natural vs. Man-Made Risks
Natural risks are generally considered to be more acceptable than man-made risks. For example, communities located in areas of high seismic activity understand and accept the risks associated with earthquakes. Similarly people living in hurricane-prone areas regard major storms as being a normal part of life. However, these same people are less likely to understand or accept the risks associated with industrial facilities.
Recency of Events
People tend to attribute a higher level of risk to events that have actually occurred in the recent past. For example, the concerns to do with nuclear power facilities in the 1980s and 90s were very high because the memories of Chernobyl and Three Mile Island were so recent. This concern is easing given that these two events occurred decades ago, and few people have a direct memory of them.
Perception of the Consequence Term
Risk_hazard = Consequence * Likelihood ........................ (1)
The Risk Equation (1) is linear; it gives equal weight to changes in the consequence and likelihood terms, implying a linear trade-off between the two. For example, according to Equation (1), a hazard resulting in one fatality every hundred years has the same risk value as a hazard resulting in ten fatalities every thousand years. In both cases the fatality rate is one in a hundred years, or 0.01 fatalities per year. But the two risks are not perceived to be the same. In general, people feel that high-consequence events that occur only rarely are less acceptable than more frequent, low-consequence accidents. Hence the second of the two alternatives is perceived as being worse than the first.
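As a quick check on this equality, the short Python sketch below - an illustration, not from the source; the function name is invented - evaluates Equation (1) for both hazards and shows that they produce the same annual fatality rate.

```python
# Minimal sketch of Equation (1); the function name is illustrative only.
def linear_risk(consequence_fatalities, likelihood_per_year):
    """Equation (1): risk = consequence * likelihood."""
    return consequence_fatalities * likelihood_per_year

print(linear_risk(1, 1 / 100))    # one fatality every hundred years  -> 0.01
print(linear_risk(10, 1 / 1000))  # ten fatalities every thousand years -> 0.01
```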
The same way of looking at risk can be seen in everyday life. In a typical large American city around 500 people die each year in road accidents. Although many efforts are made to reduce this fatality rate, the loss of life is perceived as a necessary component of modern life, hence there is little outrage on the part of the public. Yet, were an airplane carrying 500 people to crash at that same city's airport every year, there would be an outcry - even though the fatality rate is the same in each case, i.e., 500 deaths per city per year. The difference between the two risks is a perception rooted in feelings and values.
To accommodate the difference in perception regarding risk Equation (1) can be modified so as to take the form of Equation (2).
Risk_hazard = Consequence^n * Likelihood ........................ (2)

where n > 1
Equation (2) shows that the contribution of the consequence term has been raised by the exponent n, where n > 1. In other words, high consequence/low frequency accidents are assigned a higher perceived risk value than low consequence/high frequency accidents.
Since the variable 'n' represents subjective feelings it is impossible to assign it an objective value. However, if a value of, say, 1.5 is given to 'n', then Equation (2) for the two scenarios just discussed - the airplane crash and the highway fatalities - becomes Equations (3) and (4) respectively.
Risk_airplane = 500^1.5 * 1 = 11,180 ........................ (3)

Risk_auto = 1^1.5 * 500 = 500 ........................ (4)
The single airplane crash is thus perceived as being equivalent to more than 11,000 separate automobile fatalities, i.e., the apparent risk to do with the airplane crash is about 22.4 times greater than that of the 500 individual automobile fatalities.
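The arithmetic of Equations (3) and (4) can be reproduced with a few lines of code. The Python sketch below is illustrative only - the function name and structure are assumptions, not from the source - and simply evaluates Equation (2) with the assumed exponent n = 1.5.

```python
# Minimal sketch of Equation (2) with the illustrative exponent n = 1.5.
def perceived_risk(consequence_fatalities, likelihood_per_year, n=1.5):
    """Equation (2): risk = consequence**n * likelihood, with n > 1."""
    return consequence_fatalities ** n * likelihood_per_year

risk_airplane = perceived_risk(500, 1)  # one 500-fatality crash per year
risk_auto = perceived_risk(1, 500)      # 500 single-fatality accidents per year

print(round(risk_airplane))                 # 11180
print(round(risk_auto))                     # 500
print(round(risk_airplane / risk_auto, 1))  # 22.4, the perception multiplier
```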
In the case of hazards that have very high consequences, such as the meltdown of the core of a nuclear power facility, perceived risk rises very fast as a result of the exponent n in Equation (2), thus explaining public fear to do with such facilities. Over the years, managers and engineers in such facilities have reduced the objective risk associated with nuclear power plants to an extremely low value, largely through the extensive use of sophisticated instrumentation systems. However, since the worst-case scenario - core meltdown - remains the same, the public remains nervous and antagonistic. In such cases management would be better advised to address the consequence term rather than the likelihood term. With regard to nuclear power, the route to public acceptance is to make the absolute worst-case scenario one of low consequence.
The subjective and emotional nature of risk is summarized by Brander (1995) with reference to the changes in safety standards that were introduced following the Titanic tragedy.
They [scientists and engineers] tend to argue with facts, formulas, simulations, and other kinds of sweet reason. These don't work well. What does work well are shameless appeals to emotion - like political cartoons. Like baby seals covered in oil. And always, always, casualty lists. Best of all are individual stories of casualties, to make the deaths real. We only learn from blood.
Comprehension Time
When people are informed that a significant new risk has entered their lives it can take time for them to digest that information. For example, when a patient is informed by a doctor that he or she has a serious medical condition, the doctor should not immediately launch into a discussion of possible treatments, but should allow the patient time to absorb the news before moving on to the next step. So it is with industrial risk. If people - particularly members of the public - are informed of a new risk associated with a process facility, then those people need time to grasp and come to terms with what has been said. There is a difference between having an intellectual grasp of a risk and subjectively understanding how things have changed.
Randomness
Human beings tend to create order out of a random series of events; people have to do this in order to make sense of the world in which they live. The catch is that this tendency persists even when events are statistically independent of one another.
For example, psychologists gave test subjects a set of headphones and then played a series of random beeps. The subjects were told to imagine that each beep corresponded to an automobile going by. They were then asked if the beeps were coming in batches, such as would occur when cars were leaving a red traffic light, or whether the cars were spaced randomly, such as would happen on a freeway. The subjects generally said that the beeps were in groups, even though they were in fact occurring at random.
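This effect is easy to reproduce. The Python sketch below - an illustrative assumption, not part of the original experiment - generates statistically independent arrival times and prints the gaps between them; runs of short gaps look like deliberate batches even though nothing is coordinating the arrivals.

```python
# Minimal sketch: independent random "car" arrivals over a 60-second window.
import random

random.seed(42)  # fixed seed so the run is reproducible

arrivals = sorted(random.uniform(0, 60) for _ in range(20))

# Gaps between consecutive arrivals. Clusters of small gaps resemble cars
# leaving a red light, yet no mechanism is grouping the arrivals.
gaps = [round(b - a, 1) for a, b in zip(arrivals, arrivals[1:])]
print(gaps)
```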
Therefore it is important for those working in process risk management not to create patterns and order out of randomly occurring events. For example, if two or three near-miss incidents can be attributed to a failure of the Management of Change (MOC) system, this does not necessarily mean that the MOC system is any more deficient than the other elements of the process safety program.
Regression to the Mean
Related to the above discussion concerning the tendency to create non-existent order out of random events, people will also create causal relationships where there are none, particularly when a system is simply regressing to the mean.
For example, a facility may have suffered from a series of serious incidents. In response, management implements a much more rigorous process safety management (PSM) program than was in place before. The number of incidents then drops. It is natural to explain the improvement as a consequence of the new PSM program. Yet, if the serious events were occurring randomly, it is likely that their frequency would have gone down anyway, because systems generally tend to revert to the mean.
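The effect can be simulated directly. The Python sketch below - an illustrative assumption; the incident numbers are invented - draws yearly incident counts from a trendless random process and shows that unusually bad years tend to be followed by years close to the long-run average, with no intervention at all.

```python
# Minimal sketch of regression to the mean in randomly occurring incidents.
import random

random.seed(1)

LONG_RUN_MEAN = 4.0  # assumed average number of serious incidents per year

# 200 years of incident counts from a trendless, cause-free random process
# (365 independent daily Bernoulli trials per year, a crude Poisson stand-in).
years = [sum(random.random() < LONG_RUN_MEAN / 365 for _ in range(365))
         for _ in range(200)]

# Counts in the year immediately following each "bad" year (>= 7 incidents).
after_bad = [years[i + 1] for i in range(len(years) - 1) if years[i] >= 7]

print(sum(years) / len(years))          # close to 4.0
print(sum(after_bad) / len(after_bad))  # also close to 4.0, not to 7 or more
```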
Bias toward Positive Evidence / Prior Beliefs
People tend to seek out information that confirms their opinions, and they tend to overvalue that information once they find it. It is particularly important to recognize this trait when conducting incident investigations. It is vital that the persons leading an investigation listen to what is being said without interjecting their own opinions or prejudices.
We also tend to expose ourselves to situations and people that confirm our existing beliefs. For example, most people will watch TV channels that reinforce their political opinions. This can lead to shock when it turns out in an election that those beliefs did not constitute a majority opinion.
Availability
People tend to notice items which are outstanding or different in some way. For example, someone entering her own house will not see all of the items of furniture, but she will notice that the television has been stolen or that a saucepan has boiled dry. Similarly, anecdotes and emotional descriptions have a disproportionate impact on people's perceptions (as illustrated in the discussion to do with the Titanic tragedy earlier in this article).
Goldacre notes that, as information about the dangers of cigarette smoking became more available, it was the oncologists and chest surgeons who were the first to quit smoking, because they were the ones who saw the damage caused to human lungs by cigarettes.
You are welcome to use this Safety Moment in your workplace. But please read Use of Safety Moments.
Copyright © Ian Sutton. 2020. All Rights Reserved.