Risk https://iansutton.com/ en Safety Moment #93: The Fundamentals of Process Safety Risk https://iansutton.com/safety-moments/safety-moment-93-fundamentals-process-safety-risk <span class="field field-name-title field-formatter-string field-type-string field-label-hidden">Safety Moment #93: The Fundamentals of Process Safety Risk</span> <span class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Sun, 06/21/2020 - 8:06am</span> <div class="clearfix text-formatted field field-node--body field-formatter-text-default field-name-body field-type-text-with-summary field-label-hidden has-single"><div class="field__items"><div class="field__item"><a href="https://netzero2050.substack.com/p/the-fundamentals-of-risk" title="Fundamentals of process safety risk"><img alt="Risk process safety management" data-entity-type="file" data-entity-uuid="069ad18c-bc13-4ba3-872e-983fd03cc430" height="240" src="/sites/default/files/inline-images/shutterstock_225399994-500.jpg" width="245" class="align-center" /></a> <p>This article has moved to the post <a href="https://netzero2050.substack.com/p/the-fundamentals-of-risk" title="Fundamentals of process safety risk">The Fundamentals of Process Safety Risk</a>.</p> <div style="background-color: PaleTurquoise;color:black;padding: 10px;border:1px solid black;font-size:14px;"> <p class="text-align-center">Copyright © Ian Sutton. 2023. 
All Rights Reserved.</p> </div> </div></div> </div> <div class="field field-node-field-topics field-entity-reference-type-taxonomy-term field-formatter-entity-reference-label field-name-field-topics field-type-entity-reference field-label-hidden"><div class="field__items"><div class="field__item field__item--risk"> <span class="field__item-wrapper"><a href="/topics/risk" hreflang="en">Risk</a></span> </div></div> </div> Sun, 21 Jun 2020 12:06:42 +0000 Ian Sutton 350 at https://iansutton.com Safety Moment #102: ALARP and Acceptable Risk https://iansutton.com/safety-moments/safety-moment-102-alarp-acceptable-risk <span class="field field-name-title field-formatter-string field-type-string field-label-hidden">Safety Moment #102: ALARP and Acceptable Risk</span> <span class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Thu, 06/11/2020 - 2:34pm</span> <div class="clearfix text-formatted field field-node--body field-formatter-text-default field-name-body field-type-text-with-summary field-label-hidden has-single"><div class="field__items"><div class="field__item"><img alt="ALARP acceptable risk" data-entity-type="file" data-entity-uuid="8a478e67-a756-4583-a4e4-dccb48170900" src="/sites/default/files/inline-images/shutterstock_139825753-500.jpg" class="align-center" /> <p>Fundamentally risk is subjective; it is not possible to define what level of risk is acceptable dispassionately and objectively. 
Any two risk scenarios are perceived differently from one another because people bring to each their own understanding and acceptance of different types of risk, along with their emotions, memories, hopes and fears.</p> <p>Factors that affect risk perception include the following:</p> <ul> <li>Degree of control.</li> <li>Familiarity with the hazard.</li> <li>Direct benefit.</li> <li>Personal impact.</li> <li>Natural vs. man-made risks.</li> <li>Recency of events.</li> <li>Effects of the consequence term.</li> <li>Comprehension time.</li> </ul> <p>In spite of these difficulties, it is still necessary to have a clear understanding as to what levels of risk are acceptable. After all, if a facility operates for long enough, it is <em>certain</em> - statistically speaking - that there will be an accident. Therefore, process safety professionals need guidance as to what level of "acceptable safety" is required. This is tricky. Regulatory agencies in particular will never place a numerical value on human life and suffering because any number that they propose would inevitably generate controversy. Yet working targets have to be provided, otherwise the facility personnel do not know what they are shooting for. Nor can a regulatory body, a professional society or the author of a book such as this provide an objective value for risk.</p> <p>Individuals and organizations are constantly gauging the level of risk that they face in their personal and work lives, and then acting on their assessment of that risk. For example, at a personal level, an individual has to make a judgment as to whether it is safe or not to cross a busy road. And there is the further complication of the subjectivity of risk.
Someone who is strongly opposed to having a chemical plant near their home may happily choose to go bungee jumping at weekends.</p> <p>In an industrial context, managers make risk-based decisions regarding issues such as whether to shut down an equipment item for maintenance or to keep it running for another week. Other risk-based decisions made by managers are whether or not an operator needs additional training, whether to install an additional safety shower in a hazardous area, and whether a full Hazard and Operability Analysis (HAZOP) is needed to review a proposed change. Engineering standards, and other professional documents, can provide guidance. But, at the end of the day, the manager has a risk-based decision to make. That decision implies that some estimate of "acceptable risk" has been made.</p> <p>One company provided the criteria shown in the table below for its design personnel.</p> <table> <tbody> <tr> <td><strong> </strong></td> <td><strong>Fatalities per year (employees and contractors)</strong></td> </tr> <tr> <td>Intolerable risk</td> <td>&gt;5 x 10<sup>-4</sup></td> </tr> <tr> <td>High risk</td> <td>&lt;5 x 10<sup>-4</sup> and &gt;1 x 10<sup>-6</sup></td> </tr> <tr> <td>Broadly tolerable risk</td> <td>&lt;1 x 10<sup>-6</sup></td> </tr> </tbody> </table> <p>Their instructions were that risk must never be in the 'intolerable' range. High-risk scenarios are "tolerable", but every effort must be made to reduce the risk level, <em>i.e.,</em> to the "broadly tolerable" level.</p> <p>Please move to <a href="https://netzero2050.substack.com/p/alarp-and-acceptable-risk" title="ALARP and acceptable risk">ALARP and Acceptable Risk</a> to learn more.</p> <div style="background-color: PaleTurquoise;color:black;padding: 10px;border:1px solid black;font-size:14px;"> <p class="text-align-center">Copyright © Ian Sutton. 2023.
All Rights Reserved.</p> </div> </div></div> </div> <div class="field field-node-field-topics field-entity-reference-type-taxonomy-term field-formatter-entity-reference-label field-name-field-topics field-type-entity-reference field-label-hidden"><div class="field__items"><div class="field__item field__item--risk"> <span class="field__item-wrapper"><a href="/topics/risk" hreflang="en">Risk</a></span> </div></div> </div> Thu, 11 Jun 2020 18:34:00 +0000 Ian Sutton 336 at https://iansutton.com Safety Moment #62: From Complicated to Complex https://iansutton.com/safety-moments/safety-moment-62-complicated-complex <span class="field field-name-title field-formatter-string field-type-string field-label-hidden">Safety Moment #62: From Complicated to Complex</span> <span class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Mon, 08/27/2018 - 1:51am</span> <div class="clearfix text-formatted field field-node--body field-formatter-text-default field-name-body field-type-text-with-summary field-label-hidden has-single"><div class="field__items"><div class="field__item"><p>In any performance-based program such as process safety, the work is never finished — there is always room for improvement.</p> <p>In practice, most of the developments in techniques for improving safety analysis are improvements of existing programs or techniques. For example, the hazards analysis technique LOPA (Layers of Protection Analysis) has seen widespread application in recent years. 
Yet it is basically a development of the well-established Fault Tree and Event Tree techniques.</p> <p>In order to understand how to develop more innovative ways of improving safety, it is useful to distinguish between the words <strong><em>complicated</em></strong> and <strong><em>complex</em></strong>. Most safety work has been to do with complicated systems — the opportunities for major improvements lie with understanding complex systems.</p> <figure role="group" class="caption caption-img align-center"> <img alt="A complicated, not complex, system" data-entity-type="file" data-entity-uuid="6c3dea07-3465-4d00-b015-9de71d3622bd" src="/sites/default/files/inline-images/Refinery-18-500.jpg" /> <figcaption>A complicated, but not complex, system. Credit: Shutterstock</figcaption> </figure> <h2>Complicated</h2> <p>Process facilities consist of thousands of items that are connected to one another and that interact with one another. Yet, in spite of their size, they are fundamentally understandable and predictable. They are complicated, not complex.</p> <p>Most process safety work aims to understand and control this complication. The aim is to develop solutions that are both successful and repeatable. For example,</p> <ul> <li>Once a method for writing operating procedures has been developed, then that method can be used throughout the organization for writing procedures for all types of facility and activity.</li> <li>Once a hazards analysis team has identified how a pressure vessel may rupture, they can apply that insight to the operation of all other pressure vessels.</li> <li>Once an effective technique for analyzing incidents has been developed, that technique can be used for all future incident investigations.</li> </ul> <p>Key words here are ‘understandable’ and ‘repeatable’.
</p> <p>The following are features of complicated systems.</p> <ul> <li>A complicated system can be understood by breaking it down into smaller parts and determining how those parts work and how they interact with one another.</li> <li>A complicated situation can be quantified and understood through the use of metrics.</li> <li>A command and control management structure is effective at managing complicated systems.</li> </ul> <p>By and large, process safety professionals aim to reduce the risk associated with complicated systems. And, on the whole, their efforts have been successful. Process facilities are more complicated than they were a generation ago — but the complication is understood and it is successfully managed.</p> <h2>Complex</h2> <p>A complex system is based on relationships, interconnection and evolution. It is fundamentally unpredictable. (Any system which involves human behavior — particularly the behavior of people in groups — will be complex.)</p> <p>Complex systems do not have to be complicated — although most are.</p> <p>Key aspects of a complex situation include the following.</p> <ul> <li>It comprises relationships that cannot be understood just by breaking a system into its component parts.</li> <li>The situation is fluid — surprises happen.</li> <li>Command and control structures will be limited in their effectiveness.</li> <li>It cannot be easily quantified — there are no effective metrics.</li> <li>It will usually involve the unpredictable behavior of human beings, both as individuals and in groups.</li> </ul> <p>Climate change is an excellent example of a complex system. Climate change models are increasingly accurate at forecasting what the climate will look like in coming years. But factors such as the following cannot be effectively modeled by a computer program.</p> <ul> <li>The response by people, both as individuals and as part of larger groups such as nation states.</li> <li>The impact of resource depletion.
For example, if oil supplies start to dwindle, will the amount of CO<sub>2</sub> being pumped into the atmosphere go down? Or will reduced oil supplies lead to increased coal consumption, thus increasing the amount of CO<sub>2</sub> generated?</li> <li>The impact of increased methane emissions from the tundra as permafrost melts.</li> <li>The success or failure of efforts to reduce population growth.</li> </ul> <p><span style="font-size: 15px;">The upshot of this way of thinking is that we cannot “solve” the climate change “problem”. Indeed, we don’t even know what the “problem” is. Instead, complexity creates predicaments or dilemmas. When faced with a predicament, we can respond and adapt, but we cannot make it go away. We cannot return to the status quo ante. Nor can we predict what the future holds. There are surprises in store.</span></p> <h2>Process Safety Management</h2> <p>So where does this discussion take the discipline of process safety management?</p> <p>If we are to manage complex situations effectively, issues such as the following should be considered.</p> <ul> <li><em>Notice new and unexpected emergent directions</em><br /> Not all events are predictable; adapt appropriately to unexpected situations.</li> <li><em>Learn from new experiences</em><br /> Learning, in this context, is quite different from training or from education. It is based on an understanding that unexpected events will happen and the need to figure out why.</li> <li><em>Factor in the vagaries of human behavior</em><br /> Regular readers of these Safety Moments know that, of all the elements of a process safety management system, the one that I regard as being the most important is Employee Participation. The catch is that people are inherently unpredictable. For example, an Asset Integrity program may be able to predict with a high level of confidence when an equipment item may fail.
But no process safety program can predict if and when the workforce will initiate industrial action. Nor can the program anticipate that the maintenance technician will repair it incorrectly because his mind is distracted by a domestic conflict.</li> </ul> <div style="background-color: PaleTurquoise;color:black;padding: 10px;border:1px solid black;font-size:14px;"> <p>You are welcome to use this Safety Moment in your workplace. But please read <a href="https://iansutton.com/use-safety-moments" title="Use of Safety Moments from Sutton Technical Books">Use of Safety Moments</a>.</p> <p class="text-align-center">Copyright © Ian Sutton. 2021. All Rights Reserved.</p> </div> </div></div> </div> <div class="field field-node-field-topics field-entity-reference-type-taxonomy-term field-formatter-entity-reference-label field-name-field-topics field-type-entity-reference field-label-hidden"><div class="field__items"><div class="field__item field__item--process-safety-management"> <span class="field__item-wrapper"><a href="/topics/process-safety-management" hreflang="en">Process Safety Management</a></span> </div><div class="field__item field__item--net-zero"> <span class="field__item-wrapper"><a href="/topics/net-zero" hreflang="en">Net Zero</a></span> </div><div class="field__item field__item--risk"> <span class="field__item-wrapper"><a href="/topics/risk" hreflang="en">Risk</a></span> </div></div> </div> Mon, 27 Aug 2018 05:51:04 +0000 Ian Sutton 289 at https://iansutton.com Safety Moment #60: Risk Perception https://iansutton.com/safety-moments/safety-moment-60-risk-perception <span class="field field-name-title field-formatter-string field-type-string field-label-hidden">Safety Moment #60: Risk Perception</span> <span class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span class="field field-name-created
field-formatter-timestamp field-type-created field-label-hidden">Sat, 08/04/2018 - 4:40pm</span> <div class="clearfix text-formatted field field-node--body field-formatter-text-default field-name-body field-type-text-with-summary field-label-hidden has-single"><div class="field__items"><div class="field__item"><p><strong>The material in this article to do with risk perception is taken from the book <a href="https://iansutton.com/books/process-risk-reliability-management">Process Risk and Reliability Management</a>.</strong></p> <hr /> <p>Risk perception is fundamentally a subjective matter; no matter how hard analysts strive to make the topic objective, the fact remains that, as Oscar Wilde (1854-1900) once said, <em>A truth ceases to be a truth as soon as two people perceive it. </em></p> <p>This observation regarding different truths applies to hazards analysis and risk management work. Each person participating in a hazards analysis has his or her own opinions, memories, attitudes and overall world view. Most people are - in the strict sense of the word - prejudiced; that is, they pre-judge situations rather than trying to analyze the facts rationally and logically. People jump to preconceived conclusions, and those conclusions will often differ from those of other people who are looking at the same information with their own world view. With regard to risk management, even highly-trained, seasoned experts - who generally regard themselves as being governed only by the facts - will reach different conclusions when presented with the same data. Hence, there is no such thing as "real risk" or "objective risk": if risk cannot be measured objectively, then objective risk does not exist at all.</p> <p>In his book <em>Bad Science </em>the author Ben Goldacre (<a href="https://iansutton.com/citations#Goldacre_2008">Goldacre 2008</a>) has a chapter entitled <em>Why Clever People Believe Stupid Things</em>.
Many of his insights can be applied to the risk assessment of process facilities. He arrives at the following conclusions:</p> <ol> <li>We see patterns where there is only random noise.</li> <li>We see causal relationships where there are none.</li> <li>We overvalue confirmatory information for any given hypothesis.</li> <li>We seek out confirmatory information for any given hypothesis.</li> <li>Our assessment of the quality of new evidence is biased by our previous beliefs.</li> </ol> <p>The subjective component of risk becomes even more pronounced when the perceptions of non-specialists, particularly members of the public, are considered. Hence, successful risk management involves understanding the opinions, emotions, hopes and fears of many people, including managers, workers and members of the public.</p> <p>Some of the factors that affect risk perception are discussed below.</p> <h2>Degree of Control</h2> <p>Voluntary risks are accepted more readily than those that are imposed. For example, someone who believes that the presence of a chemical facility in his community poses an unacceptable risk to himself and his family may willingly go rock-climbing on weekends because he feels that he has some control over the risk associated with the latter activity, whereas he has no control at all over the chemical facility, or over the mysterious odors it produces. Hence rock climbers will quickly point to evidence that their sport is safer than, say, driving to the mountains. But their response misses the point - they are going to climb rocks anyway; they then assemble the evidence to justify what they are doing (see point #4 above).</p> <p>Similarly, most people feel safer when driving a car rather than riding as a passenger, even though half of them must be wrong.
The feeling of being in control is one of the reasons that people accept highway fatalities more readily than the same number of fatalities in airplane crashes.</p> <p>The desire for control also means that most people generally resist risks that they feel they are being forced to accept; they will magnify the perceived risk associated with tasks that are forced upon them.</p> <h2>Familiarity with the Hazard</h2> <p>Most people understand and accept the risks associated with day-to-day living, but they do not understand the risk associated with industrial processes, thus making those risks less acceptable. A cabinet full of household cleaning agents, for example, may actually pose more danger to an individual than the emissions from the factory that makes those chemicals. But the perceived risk is less.</p> <p>Hazards that are both unfamiliar and mysterious are particularly unacceptable, as can be seen by the deep distrust that the public feels with regard to nuclear power facilities.</p> <h2>Direct Benefit</h2> <p>People are more willing to accept risk if they are direct recipients of the benefits associated with that risk. The reality is that most industrial facilities provide little special benefit to the immediate community apart from offering some job opportunities and an increased local tax base. On the other hand, it is the community that has to take all of the risk associated with those facilities, thus creating the response of NIMBY (Not in My Backyard).</p> <h2>Personal Impact</h2> <p>The effect of the consequence term will depend to some degree on the persons who are impacted by it. For example, if an office worker suffers a sprained ankle, he or she may be able to continue work during the recovery period; an outside operator, however, may not be able to work at his normal job during that time.
Or, to take another example, the consequence of a broken finger will be more significant to a concert pianist than to a process engineer.</p> <h2>Natural vs. Man-Made Risks</h2> <p>Natural risks are generally considered to be more acceptable than man-made risks. For example, communities located in areas of high seismic activity understand and accept the risks associated with earthquakes. Similarly, people living in hurricane-prone areas regard major storms as being a normal part of life. However, these same people are less likely to understand or accept the risks associated with industrial facilities.</p> <h2>Recency of Events</h2> <p>People tend to attribute a higher level of risk to events that have actually occurred in the recent past. For example, the concerns to do with nuclear power facilities in the 1980s and 90s were very high because the memories of Chernobyl and Three Mile Island were so recent. This concern is easing given that these two events occurred decades ago, and few people have a direct memory of them.</p> <h2>Perception of the Consequence Term</h2> <p>   Risk<sub>Hazard</sub>  =  Consequence  *  Likelihood........................ (1)</p> <p>The Risk Equation (1) is linear; it gives equal value to changes in the consequence and frequency terms, implying a linear trade-off between the two. For example, according to Equation (1), a hazard resulting in one fatality every hundred years has the same risk value as a hazard resulting in ten fatalities every thousand years. In both cases the fatality rate is one in a hundred years, or 0.01 fatalities yr<sup>-1</sup>. But the two risks are not <em>perceived</em> to be the same. In general, people feel that high-consequence events that occur only rarely are less acceptable than more frequent, low-consequence accidents. Hence, the second of the two alternatives shown above is perceived as being worse than the first.</p> <p>The same way of looking at risk can be seen in everyday life.
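The two-scenario comparison above (Equation (1) scores one fatality per hundred years the same as ten fatalities per thousand years) can be checked with a few lines of code, together with the exponent-weighted form developed below as Equation (2). This is only an illustrative sketch: the function names are mine, and n = 1.5 is the purely illustrative exponent the text itself uses, not an objective constant.

```python
import math

def linear_risk(consequence, likelihood):
    # Equation (1): Risk = Consequence * Likelihood
    return consequence * likelihood

def perceived_risk(consequence, likelihood, n=1.5):
    # Equation (2): Risk = Consequence**n * Likelihood, with n > 1
    return consequence ** n * likelihood

# One fatality every hundred years vs. ten fatalities every thousand years:
# Equation (1) scores them identically (0.01 fatalities per year)...
assert math.isclose(linear_risk(1, 1 / 100), linear_risk(10, 1 / 1000))

# ...but with n > 1 the rare, high-consequence scenario scores higher,
# matching the perception that it is the worse of the two.
assert perceived_risk(10, 1 / 1000) > perceived_risk(1, 1 / 100)

# The airplane/automobile example (Equations (3) and (4) below):
print(perceived_risk(500, 1))    # about 11,180, vs. 500 for the auto fatalities
```

Raising n further above 1 widens the gap between the two perceived-risk scores rapidly, which is one way to see why very-high-consequence hazards dominate public perception.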
In a typical large American city, around 500 people die each year in road accidents. Although many efforts are made to reduce this fatality rate, the fact remains that this loss of life is perceived as a necessary component of modern life, hence there is little outrage on the part of the public. Yet, were an airplane carrying 500 people to crash at that same city's airport every year, there would be an outcry. But the fatality rate is the same in each case, <em>i.e.,</em> 500 deaths per city per year. The difference between the two risks is a perception rooted in feelings and values.</p> <p>To accommodate the difference in perception regarding risk, Equation (1) can be modified so as to take the form of Equation (2).</p> <p>   Risk<sub>Hazard</sub>  =  Consequence<sup>n</sup>  *  Likelihood........................ (2)</p> <p>where  n &gt; 1</p> <p>Equation (2) shows that the contribution of the consequence term has been raised by the exponent n, where n &gt; 1. In other words, high-consequence/low-frequency accidents are assigned a higher <em>perceived </em>risk value than low-consequence/high-frequency accidents.</p> <p>Since the variable 'n' represents subjective feelings, it is impossible to assign it an objective value. However, if a value of, say, 1.5 is given to 'n' then Equation (2) for the two scenarios just discussed - the airplane crash and the highway fatalities - becomes Equations (3) and (4) respectively.</p> <p>   Risk<sub>airplane</sub>  =  500 <sup>1.5</sup>  *  1....................................................... (3)</p> <p>   =  11180</p> <p>   Risk<sub>auto</sub>     =    1 <sup>1.5</sup>  *  500.......................................................
(4)</p> <p>   =    500</p> <p>The 500 airplane fatalities are thus perceived as being equivalent to over 11,000 fatalities, <em>i.e., </em>the apparent risk to do with the airplane crash is roughly 22 times greater than for the multiple automobile fatalities.</p> <p>In the case of hazards that have very high consequences, such as the meltdown of the core of a nuclear power facility, perceived risk rises very fast as a result of the exponent in Equation (2), thus explaining public fear to do with such facilities. Over the years, managers and engineers in such facilities have reduced the <em>objective</em> risk associated with nuclear power plants to an extremely low value, largely through the extensive use of sophisticated instrumentation systems. However, since the worst-case scenario - core meltdown - remains the same, the public remains nervous and antagonistic. In such cases management would be better advised to address the consequence term rather than the likelihood term. With regard to nuclear power, the route to public acceptance is to make the absolute worst-case scenario one of low consequence.</p> <p>The subjective and emotional nature of risk is summarized by <a href="https://iansutton.com/citations#Brander_1995">Brander (1995)</a> with reference to the changes in safety standards that were introduced following the Titanic tragedy.</p> <blockquote> <p>They [scientists and engineers] tend to argue with facts, formulas, simulations, and other kinds of sweet reason. These don't work well. What does work well are shameless appeals to emotion - like political cartoons. Like baby seals covered in oil. And always, always, casualty lists. Best of all are individual stories of casualties, to make the deaths real. We only learn from blood.</p> </blockquote> <h2>Comprehension Time</h2> <p>When people are informed that a significant new risk has entered their lives, it can take time for them to digest that information.
For example, when a patient is informed by a doctor that he or she has a serious medical condition, the doctor should not immediately launch into a discussion of possible treatments. The doctor should allow the patient time to absorb the news before moving on to the next step. So it is with industrial risk. If people - particularly members of the public - are informed of a new risk associated with a process facility, then those people need time to grasp and come to terms with what has been said. There is a difference between having an intellectual grasp of risk and subjectively understanding how things have changed.</p> <h2>Randomness</h2> <p>Human beings tend to create order out of a random series of events. People have to do this in order to make sense of the world in which they live. The catch is that there is a tendency to create order, even when events are statistically independent of one another.</p> <p>For example, psychologists gave test subjects a set of headphones and then played a series of random beeps. The subjects were told to imagine that each beep corresponded to an automobile going by. They were then asked if the beeps were coming in batches, such as would occur when cars were leaving a red traffic light, or whether the cars were spaced randomly, such as would happen on a freeway. The subjects generally said that the beeps were in groups, even though they were in fact occurring at random.</p> <p>Therefore it is important for those working in process risk management not to create patterns and order out of randomly occurring events.
For example, if two or three near-miss incidents can be attributed to a failure of the Management of Change (MOC) system, this does not necessarily mean that the MOC system is any more deficient than the other elements of the process safety program.</p> <h2>Regression to the Mean</h2> <p>Related to the above discussion concerning the tendency to create non-existent order out of random events, people will also create causal relationships where there are none, particularly when a system was regressing to the mean anyway.</p> <p>For example, a facility may have suffered from a series of serious incidents. In response to this situation, management implements a much more rigorous process safety management program than they had before. The number of incidents then drops. It is natural to explain the improvement as a consequence of the new PSM program. Yet, if the serious events were occurring randomly then it is likely that their frequency would have gone down anyway because systems generally have a tendency to revert to the mean.</p> <h2>Bias toward Positive Evidence / Prior Beliefs</h2> <p>People tend to seek out information that confirms their opinions and they tend to overvalue information that confirms those opinions. It is particularly important to recognize this trait when conducting incident investigations. It is vital that the persons leading an investigation listen to what is being said without interjecting with their own opinions or prejudices.</p> <p>We also tend to expose ourselves to situations and people that confirm our existing beliefs. For example, most people will watch TV channels that reinforce their political opinions. This can lead to shock when it turns out in an election that those beliefs did not constitute a majority opinion.</p> <h2>Availability</h2> <p>People tend to notice items which are outstanding or different in some way.
For example, someone entering their own house will not see all of the items of furniture, but they will notice that the television has been stolen or that a saucepan has boiled dry. Similarly, anecdotes and emotional descriptions have a disproportionate impact on people's perceptions (as illustrated in the discussion to do with the <em>Titanic </em>tragedy provided earlier in this chapter).</p> <p>Goldacre notes that, as information about the dangers of cigarette smoking became more available, it was the oncologists and chest surgeons who were the first to quit smoking because they were the ones who saw the damage caused to human lungs by cigarettes.</p> <div style="background-color: PaleTurquoise;color:black;padding: 10px;border:1px solid black;font-size:14px;"> <p>You are welcome to use this Safety Moment in your workplace. But please read <a href="https://iansutton.com/use-safety-moments" title="Use of Safety Moments from Sutton Technical Books">Use of Safety Moments</a>.</p> <p class="text-align-center">Copyright © Ian Sutton. 2020.
All Rights Reserved.</p> </div> </div></div> </div> <div class="field field-node-field-topics field-entity-reference-type-taxonomy-term field-formatter-entity-reference-label field-name-field-topics field-type-entity-reference field-label-hidden"><div class="field__items"><div class="field__item field__item--risk"> <span class="field__item-wrapper"><a href="/topics/risk" hreflang="en">Risk</a></span> </div><div class="field__item field__item--communication"> <span class="field__item-wrapper"><a href="/topics/communication" hreflang="en">Communication</a></span> </div></div> </div> Sat, 04 Aug 2018 20:40:38 +0000 Ian Sutton 281 at https://iansutton.com Safety Moment #56: Sinking Standards / Accountants Rule the Waves https://iansutton.com/safety-moment-56-sinking-standards <span class="field field-name-title field-formatter-string field-type-string field-label-hidden">Safety Moment #56: Sinking Standards / Accountants Rule the Waves</span> <span class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Fri, 07/27/2018 - 11:57am</span> <div class="clearfix text-formatted field field-node--body field-formatter-text-default field-name-body field-type-text-with-summary field-label-hidden has-single"><div class="field__items"><div class="field__item"><p>This Safety Moment has been updated. It is now <a href="https://netzero2050.substack.com/p/the-process-safety-professional-part-c7a" title="Titanic PSM">The Process Safety Professional. 
Part 9: That Sinking Feeling</a>.</p> </div></div> </div> <div class="field field-node-field-topics field-entity-reference-type-taxonomy-term field-formatter-entity-reference-label field-name-field-topics field-type-entity-reference field-label-hidden"><div class="field__items"><div class="field__item field__item--regulations-codes-and-engineering-standards"> <span class="field__item-wrapper"><a href="/topics/regulations-codes-engineering-standards" hreflang="en">Regulations, Codes and Engineering Standards</a></span> </div><div class="field__item field__item--sems-safety-and-environmental-management-system"> <span class="field__item-wrapper"><a href="/topics/sems-safety-environmental-management-system" hreflang="en">SEMS (Safety and Environmental Management System)</a></span> </div><div class="field__item field__item--risk"> <span class="field__item-wrapper"><a href="/topics/risk" hreflang="en">Risk</a></span> </div><div class="field__item field__item--offshore-oil-and-gas"> <span class="field__item-wrapper"><a href="/topics/offshore-oil-gas" hreflang="en">Offshore Oil and Gas</a></span> </div><div class="field__item field__item--marine"> <span class="field__item-wrapper"><a href="/topics/marine" hreflang="en">Marine</a></span> </div></div> </div> Fri, 27 Jul 2018 15:57:30 +0000 Ian Sutton 278 at https://iansutton.com The Chemical Safety Board https://iansutton.com/chemical-safety-board <span property="schema:name" class="field field-name-title field-formatter-string field-type-string field-label-hidden">The Chemical Safety Board</span> <span rel="schema:author" class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span property="schema:dateCreated" content="2018-05-08T12:04:59+00:00" class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Tue, 05/08/2018 - 
8:04am</span> Tue, 08 May 2018 12:04:59 +0000 Ian Sutton 233 at https://iansutton.com https://iansutton.com/chemical-safety-board#comments Lowest Level of Risk (BSEE) https://iansutton.com/offshore-oil-gas-sems-safety-environmental-management-system-risk/lowest-level-risk-bsee <span property="schema:name" class="field field-name-title field-formatter-string field-type-string field-label-hidden">Lowest Level of Risk (BSEE)</span> <span rel="schema:author" class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span property="schema:dateCreated" content="2017-09-25T04:25:55+00:00" class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Mon, 09/25/2017 - 12:25am</span> Mon, 25 Sep 2017 04:25:55 +0000 Ian Sutton 124 at https://iansutton.com Event Tree Analysis https://iansutton.com/risk/event-tree-analysis <span property="schema:name" class="field field-name-title field-formatter-string field-type-string field-label-hidden">Event Tree Analysis</span> <span rel="schema:author" class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span property="schema:dateCreated" content="2017-09-25T04:22:13+00:00" class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Mon, 09/25/2017 - 12:22am</span> Mon, 25 Sep 2017 04:22:13 +0000 Ian Sutton 122 at https://iansutton.com https://iansutton.com/risk/event-tree-analysis#comments Fault Tree Analysis https://iansutton.com/process-safety-management-hazards-analysis-risk/fault-tree-analysis <span property="schema:name" class="field field-name-title field-formatter-string field-type-string field-label-hidden">Fault Tree Analysis</span> <span 
rel="schema:author" class="field field-name-uid field-formatter-author field-type-entity-reference field-label-hidden"><span lang="" about="/user/445" typeof="schema:Person" property="schema:name" datatype="">Ian Sutton</span></span> <span property="schema:dateCreated" content="2017-09-23T15:57:35+00:00" class="field field-name-created field-formatter-timestamp field-type-created field-label-hidden">Sat, 09/23/2017 - 11:57am</span> Sat, 23 Sep 2017 15:57:35 +0000 Ian Sutton 87 at https://iansutton.com