
The Development of Risk Analysis: Low Probability, High Consequence Events

A Personal Perspective

Richard Wilson
Jefferson Physical Laboratory 257A

ABSTRACT

This article reflects on my experiences observing and participating in the development of risk analysis for environmental and health hazards since the 1970s, with emphasis on its critical role in informing decisions with potentially high consequences, even for very low probability events that were once ignored or simply viewed as "acts of God." I discuss how modern society wants to protect itself from hazards with limited or no historical precedent, so that prediction and protective action must depend on models of varying reliability. I believe that we must invest in understanding risks and risk models to ensure health in the future and to protect ourselves from large challenges, including climate change (whether anthropogenic or otherwise), terrorism, and perhaps even cosmic change.

Keywords: model, accident, low-dose, regulation, cancer, terrorism

1. INTRODUCTION

As society has developed we have tried to understand major events in the world. Those that happen regularly, such as the rising and setting of the sun, eclipses, and the equinoxes, have been studied and used for prediction by many societies. Weather and climate are less predictable but are now being analyzed with some success. Risk analysis, however, has tended to concentrate on the effects of man's technological actions, so I will start with a discussion of these.

Two hundred years ago the method of understanding the risks and dangers of a technology was to try it out and modify it if any problems arose. An obvious example was the development of railroad travel. About the year 1800, tramways using flanged rails were installed to guide wagons of freight, often in mines. These proved so unstable that, by 1830, the flanges were moved to the wheels instead. Yet only recently have there been systematic studies of "wheel-rail dynamics". There were other problems with the railway. In 1830, the first passenger railway in the world, between Liverpool and Manchester, opened for business. On its opening day an engine ran down a man who failed to get out of the way. Moreover, the man was a Member of Parliament! (Parenthetically, some might regard that as a good thing if the member were from the opposite party.) Further developments in railroads and elsewhere led to accidents with tens and even hundreds of people killed in a single accident. But still only the historical approach was used.

Mining, and particularly coal mining, was always regarded as dangerous, but people engaged in it anyway. Two distinct types of danger became apparent two millennia ago. On the one hand, miners breathed in noxious fumes of various sorts; these included mercury vapor in the open pit mines in Spain and dust particles in underground tin mines in Cornwall. On the other hand, there was the danger of collapse of the mines and entrapment of the miners. Even living in a polluted town was known to be dangerous. In the 19th century, a prayer in various northern English churches became: "from Hell, Hull and Halifax, good Lord deliver us." As late as 1945, Halifax, UK, was often covered with a penetrating fog, and my paternal grandmother used to scrub her front door step twice a day. This air pollution seemed to many people an inevitable sign of prosperity. As a Yorkshire mill owner once said: "Where there is muck there is brass (money)."

Although control of these risks began many years ago, it is useful to take the end of World War II as a defining point in man's desire and ability to control such hazards. The availability of cheap oil from the Middle East and of gas from Algeria, the North Sea, Siberia, and, in the US, from hydraulic fracturing (fracking) of gas-bearing rock, ushered in a period of prosperity together with a fuel less polluting than coal.

2.1. Major Disasters—Historical Approach

Until World War II very few people concerned themselves with major disasters that could kill more than a hundred people at once. Yet they happened. Many large disasters throughout the world have been recorded, and it is likely that many more lie unrecorded. The outbreak of plague in the 14th century killed a third of the people in Europe, including Great Britain, and perhaps as large a fraction elsewhere. No one knows why the pandemic started and no one knows why it stopped. Yet the textbooks in English schools before World War II barely mentioned it. Floods, earthquakes and hurricanes arose from extremes of weather, climate, and geology. They were called "Acts of God". It is only since WWII that society has realized that while we cannot control "Acts of God", we can be prepared for them and can control man's reaction to them. This preparation before World War II already included tornado warnings and the construction of flood control levees. Yet only in the last couple of decades has society introduced definite disincentives, such as the unavailability of insurance, to discourage putting oneself in the path of these natural disasters.

The extent to which mankind has deliberately built his civilization in dangerous ways is not always appreciated. The Nobel Laureate Eugene Wigner commented to me, in 1973, that "whenever there is a lot of energy in one place and a lot of people in the same place, there is a potential for disaster." A look at the geography of the US shows that major cities have grown up around almost every estuary, because rivers seemed the easiest transportation routes. Yet this fact makes the cities liable to flood. A controllable flood can be a good thing, because it leads to fertilization of the land, but the flip side is the uncontrollable flood leading to disaster. In 1963, water in the reservoir behind the Vaiont dam in northern Italy went over the top of the spillway as a landslide slid into it, and about two thousand people downstream were killed. Man has also brought his industries to these same estuaries. The cyclones in Bangladesh in 2007 and Myanmar in 2008 caused enormous casualties in these coastal, estuary areas. In several estuaries there are large 17-million-gallon (about 64,000 cubic meter) tanks of potentially dangerous substances such as liquefied natural gas, each of which, on combustion, would release as much energy as five Hiroshima bombs, or tanks of toxic chlorine which could, on release, kill many people.

One of the earliest recognitions that Wigner's potential hazards could be minimized was the proclamation in 1848 by the citizens of London that no petroleum products could be brought up the Thames River closer than 30 miles east of London Bridge. () Over the next century, this led to the construction of a complex on Canvey Island. () This included one hundred and fifty tanks, each holding 17 million gallons of flammable or toxic liquids.

In the 19th and early 20th centuries, engineers developed practical solutions to disasters. This was, in part, stimulated by the insurance industry, which, for example, developed standards for locating the large tanks as early as 1907. These standards tended to be based on the old historical approach: "make sure that what has been known to happen before will not happen again", particularly if it happened frequently. In the process engineers began to use a formal procedure called "Fault Tree" analysis to analyze a system and discover what contributes to failures, as sketched below. Following the Canvey Report, which was formally requested by and presented to the British Parliament, the deputy director of the Health and Safety Executive, Dunster (), made an appeal in 1980 for widespread application and discussion:

“Our use of risk assessments and our attempts to make quantitative comparisons between risks of alternative decisions such as energy sources together make it clear that we lack a great deal of necessary information. We know little about the real magnitude of many existing risks and still less about society’s attitude towards these ill-defined risks. I suggest that we need to attempt more risk assessment and that we need to publish more of the results. These results will only be one factor in the process of making decisions and indeed the existence of such studies may often make decisions more difficult to reach but eventually we should gain confidence that our decisions are being taken in a consistent and possibly even in a logical way.” (2)
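To make the fault-tree bookkeeping mentioned above concrete, here is a minimal sketch in Python. It is not drawn from the Canvey Report or any other cited study: the top event and every failure probability are invented for illustration, and the gates assume statistically independent contributing events, an assumption a real analysis would have to defend.

```python
# Minimal fault-tree sketch. All events and probabilities are hypothetical.
# Gates assume the contributing events are statistically independent.

def or_gate(*probs):
    """Probability that at least one contributing event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):
    """Probability that all (e.g. redundant) contributing events occur."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical top event: "tank overpressure relief fails on demand".
p_sensor_fault = 5e-4   # pressure sensor gives no signal
p_valve_stuck  = 1e-3   # primary relief valve fails to open
p_backup_stuck = 1e-3   # redundant relief valve also fails

p_top = or_gate(p_sensor_fault, and_gate(p_valve_stuck, p_backup_stuck))
print(f"P(relief fails on demand) ~ {p_top:.2e}")
```

The value of the formalism lies less in the final number than in the enumeration: every path to the top event is written down and can be examined for what dominates it.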

2.2. Models, Uncertainty and Appropriate Caution

It is important to realize that even at this elementary stage any prediction demands a model, even if it is only "next year will be like last year" or "next year will be like last year with a few understood improvements". Once a model is introduced, uncertainty enters, and this uncertainty is not merely due to statistical sampling errors. It has been said that "all models are wrong but some models are useful." The more complex the situation and the smaller the risks one is estimating, the more uncertain the estimate. Critics might argue that risk analysts are therefore not talking about science, but that is a matter of definition.

2.3. Formal Study of Large Accidents in New Technologies

But it was not until the development of nuclear electric power after World War II that the old paradigm, "try it and if it gives trouble, fix it", was deemed inadequate for a new technology. Society now demands evidence, in advance, that a technology is safe. A number of reasons have been suggested for the fact that this fundamental change first occurred for nuclear power, and probably all of them played some part:

(1) The new technology was in the hands of fundamental scientists from the start.

(2) The new technology used new physical principles.

(3) The new technology arose simultaneously with a new deadly form of war.

(4) The new technology posed unprecedented hazards.

The change was particularly apparent in the United States. The US Atomic Energy Commission (AEC) was set up in 1946 as a civilian agency to encourage and control all uses of nuclear fission; military use was subordinate to civilian control. The first nuclear reactors were for military purposes, but as early as 1946 William Webster, later head of the New England Electric System, was asked to report on the potential for nuclear electric power, which he did orally. The first nuclear generation of electricity came about 15 years later. Outstanding scientists were either on the Atomic Energy Commission or consultants to it; the names of Glenn Seaborg, John Von Neumann, Robert Bacher, Edward Teller, Eugene Wigner, and Richard Feynman come to mind as influencing the safety procedures. The commission established an Advisory Committee on Reactor Safeguards (ACRS) to advise on safety, and its advice has always been respected. Right from the beginning the ACRS set up a procedure called by a name borrowed from the military, "Defense in Depth." () One must imagine the worst thing that can reasonably go wrong in the reactor, the "Maximum Credible Accident", and then devise an engineered safeguard to prevent it from happening. Large reactors, particularly the first in a series, were to be sited in unpopulated areas, following Wigner's principle.

There is little doubt that the "Defense in Depth" approach had already led to a dramatic improvement in safety, but by 1970, when many large nuclear reactors were under construction, criticism arose. This was also a period of public questioning of government over the Vietnam War. The next step came from Professor Manson Benedict of the Massachusetts Institute of Technology: to consider what happens if something occurs beyond the Maximum Credible Accident, and how often that might come about. Professor Norman Rasmussen was selected for this task, and within two years he and his study group had produced a landmark study, "The Reactor Safety Study." () In this study he was the first to use an "Event Tree". Starting from an initiating event, he followed the course of a potential accident, taking account of the engineered safety devices mandated by the "Defense in Depth" approach, while his team calculated the availability of those devices using a "fault tree". In principle the initiating events can lead to hundreds of thousands of possible outcomes, but for a nuclear reactor only those that can lead to melting of the reactor core are important.
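The logic of the event tree can be sketched in a few lines. The structure below is schematic and every number is invented; it is emphatically not a reproduction of the Reactor Safety Study, in which each branch probability came from a detailed fault tree for the corresponding safeguard.

```python
# Schematic event tree for one initiating event (all numbers hypothetical).
# In a real PRA each branch probability would come from a fault tree.

initiator_per_year   = 1e-1   # e.g. loss of off-site power
p_onsite_power_fails = 1e-2   # emergency diesels fail to start
p_cooling_fails      = 1e-3   # core cooling fails, given power is available
p_containment_fails  = 1e-2   # containment fails, given a core melt

# Only branches ending in a core melt matter for off-site consequences.
# In this sketch, loss of all power is assumed to lead to a core melt.
p_melt_given_initiator = (p_onsite_power_fails
                          + (1.0 - p_onsite_power_fails) * p_cooling_fails)

core_melt_per_year     = initiator_per_year * p_melt_given_initiator
large_release_per_year = core_melt_per_year * p_containment_fails

print(f"core melt     ~ {core_melt_per_year:.1e} per reactor-year")
print(f"large release ~ {large_release_per_year:.1e} per reactor-year")
```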

Although the technique changed the whole understanding of risks of large facilities, people were slow to grasp its significance. Too much attention was paid to the final number describing the effect of an accident on public health and too little to the calculation procedure itself. Nuclear power operators, looking at the numbers and the comparison with other risks (), merely said that "this proves that reactors are safe." This was an inadequate response. At its best, the event tree analysis procedure helps operators to understand their reactors and to make safety and operational improvements. In this a large fraction of the leaders of the industry failed, although a few understood the power of these methods early on. Anti-nuclear groups criticized the numbers, in many instances correctly, but failed to acknowledge that even if the more pessimistic numbers were correct, nuclear power was one of the safer energy technologies. This was in spite of a positive report by a review committee: Lewis et al. () correctly criticized details but praised the procedure. The US Nuclear Regulatory Commission failed to adopt the Event Tree analysis procedure in its licensing of new reactors, but this began to change after the Three Mile Island accident in 1979.

Rasmussen's study group had studied a GE Boiling Water Reactor and a Westinghouse Pressurized Water Reactor but not a Babcock and Wilcox Pressurized Water Reactor. In an ideal world, Babcock and Wilcox would have followed up with a study of their reactor design using Rasmussen's method; so would the owner and operator of each Babcock and Wilcox reactor, and the Nuclear Regulatory Commission would have requested it. If any one of them had done so, they could not have failed to realize that, due to a peculiarity of design, a core melt in a Babcock and Wilcox reactor was about an order of magnitude more probable than in a Westinghouse one. After Three Mile Island more scientists became convinced of the importance and power of the event tree method. The practitioners developed a jargon, calling the procedure Probabilistic Risk Analysis (PRA), and a society devoted to Probabilistic Safety Assessment and Management (PSAM) was created to discuss these procedures. ()

But engineering covered only a part of the whole issue of understanding risks. If the core melts, and the containment vessel fails to hold, then the effect of the release on people involves a very different type of calculation, which I now describe. Although this had been discussed by the Atomic Energy Commission and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), with well over 100,000 papers and reports, it was not an engineering calculation.

The important feature of the engineering analysis is that, if the design and operation are done correctly, the various steps in an event sequence can be statistically independent, so that the probability of ultimate disaster is the product of a number of (hopefully) small probabilities. It follows that both engineering design and analysis must focus on ensuring that statistical independence. Analysts must also focus on those actions or hazards that destroy the independence, such as fire, flood, earthquake, sabotage or terrorism, and combinations of these. Fukushima made it abundantly clear that earthquakes and floods are not statistically independent. It is harder, however, to envisage coupling between an engineering disaster and the calculated (or measured) effect of a released pollutant on health.

2.4. Risks of Toxic Chemicals and Substances

The effect of pollutants on health is not normally considered by the same people as the engineers who analyze failures of an engineered system. But a systematic approach began to appear about 1970, starting with the effects of radiation and radioactive materials. The major event was the establishment of the US Environmental Protection Agency (EPA), followed soon thereafter by counterparts in other countries. For much of the preceding century the USA had been concerned about the safety of food and drugs, and the Food and Drug Administration had been established to regulate these. There were two concerns: acute toxicity, where a person might be poisoned in short order by an unwanted high dose, and chronic effects such as cancer that arise after prolonged exposure to a lower dose. The first concern was 100 years old. Toxic levels had been established for many chemicals, some by bitter experience with people and some from animal data, mostly on rats and mice. The Environmental Protection Agency was encouraged and authorized to adapt this procedure for the regulation of environmental pollution.

Obviously the most important information we can get is direct information on the effect a pollutant has upon people. In almost all cases this information comes from the unfortunate situations in which a group of people have been heavily exposed, with disastrous results. Over the years a great deal of information has been gained about acute doses, but the chronic effect of lower repeated doses has only been a serious concern for the last 40 or 50 years. As noted earlier, the British Health and Safety Executive was very clear thinking in discussing these matters (2). Also a distinguished physician, Sir Edward Pochin, working at the time for the British Radiological Protection Board, located in the UK Department of Agriculture, wrote a very clear paper covering both chemical pollutants and radiation (). In the following ten years, until his death, he developed further his concept of the "index of harm". Unfortunately, the radiological protection office and laboratory at Lowestoft, England, has since been emasculated.

Just as society has moved beyond the old engineering approach, "Try it. If it causes trouble, fix it," so it has moved beyond the approach "Eat it, and if it makes you sick eat something else," or the slightly more sophisticated "Feed it to grandma, and if she is okay then feed it to the children." Of the over 100,000 chemicals known, and the 2,000 or so to which we are regularly exposed, only about 30 are known, from direct data, to cause cancer or other chronic effects in humans. It is therefore necessary to use all other available information. The most important source has been laboratory animals, rats and mice, which have been used for acute toxicity testing for over a century. If rats or mice die of exposure to a chemical, so goes the argument, so will people. Quantitatively, this has only been studied recently. Looking at hundreds of chemicals, Rhomberg and Wolff () showed that the acute toxic level in rats is approximately proportional to that in people given "comparable" doses. Biological and toxicological theory may well eventually give us a precise understanding of what "comparable" means, but I note here that experimental data suggest that a comparable dose is one that is an equal fraction of body weight.
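A minimal sketch of the scaling just described, with placeholder numbers rather than measured values: if a "comparable" dose means the same dose per unit body weight, then a rodent lethal dose expressed in mg per kg carries over directly, and only the absolute quantity changes with the size of the animal.

```python
# Rough interspecies scaling of an acute toxic dose (placeholder numbers).
# Assumption, following the text: "comparable" dose = equal dose per kg body weight.

rat_ld50_mg_per_kg = 200.0   # hypothetical rodent LD50
rat_weight_kg      = 0.35
human_weight_kg    = 70.0

rat_absolute_dose_mg     = rat_ld50_mg_per_kg * rat_weight_kg
human_equivalent_dose_mg = rat_ld50_mg_per_kg * human_weight_kg

print(f"rat absolute dose      ~ {rat_absolute_dose_mg:.0f} mg")
print(f"human-scale equivalent ~ {human_equivalent_dose_mg/1000:.0f} g (very rough)")
```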

Delayed effects, such as cancer, are harder to study. Yet animals can give us some information about cancer probabilities. Stimulated by a suggestion of Sir Richard Doll, Crouch and I () showed that carcinogenic potencies in mice correlate with those in rats and, to the extent of the limited data available, with those in people. Dr. Tomatis () of IARC first reported, and we confirmed (), that the correlation is better if the site of the cancer is not specified. Although toxicologists and others still discuss the possible biological reasons for this correlation, it has proven to be of considerable utility.

But there is one known exception, and a major one: arsenic. This exception was swept under the rug by most people, with disastrous effects for SE Asia and other developing countries. () In an early paper () Crouch and I noted that arsenic appeared to be an exception. We did note that it appeared to be carcinogenic in humans when inhaled, but failed to emphasize that there were a few indications (such as angiosarcoma) that ingestion also showed an effect. Nor did we note that the animal studies involved ingestion of trivalent or pentavalent arsenic. Only when Professor Jack Ng and colleagues in Brisbane, Australia, showed that laboratory animals that ingested organic arsenic (arsenic as part of an organic compound) got cancer did a partial understanding emerge. () Toxicologists are still groping for a better understanding of how this could have been predicted and consequently avoided. But few ask the crucial question for the future: what other catastrophe of comparable magnitude is lurking out of sight? This remains a primary challenge for risk assessment of chronic effects, and for carcinogenic risk assessment in particular.

2.5. The Low Dose Arguments

The idea that low doses of pollutants might produce adverse effects roughly proportional to the dose, or for chronic effects to the dose integrated over time, arose in the 1920s. The physicist J. A. Crowther () related cancer incidence to the ionization of a cell by ionizing radiation. While only a small number of such ionized cells would actually produce a cancer, Crowther had suggested a possible probabilistic nature for cancer incidence. This idea was followed in 1928 by the non-governmental International Commission on Radiological Protection (ICRP) (), which suggested that no one be exposed to any amount of ionizing radiation without expectation of benefit. () Low dose linearity follows from a general argument enunciated by Guess et al. () and Crump et al. (), which in turn follows from Taylor's theorem. () If a medical outcome caused by a pollutant is indistinguishable from an outcome that can occur naturally, any small increase in dose is likely to lead to a corresponding small increase in effect. ()

While this is usually accepted for radiation exposures, it is not universally accepted for toxic chemicals. Scientists affiliated with the chemical industry argued, particularly during the 1970s and 1980s, that it does not apply to chemicals because, unlike ionizing radiation, which passes through tissue and exposes most internal organs, a chemical need not produce a linear relationship between exposure and the dose to an organ. But there exist compounds called DNA adducts, combinations of DNA and part of the pollutant chemical, which are probably one way the body rejects the chemical; for at least one chemical the concentration of adducts was shown to be related to the applied dose over a range of a factor of 100,000. () Some scientists have argued that low dose linearity applies only to genotoxic chemicals, yet they are silent when it is applied to asbestos, which is not genotoxic. That low dose linearity has wider application is still not generally accepted. A part of the problem is a failure to realize that a chemical may be carcinogenic for one end point (medical outcome) and anti-carcinogenic for another. () Indeed low doses of a pollutant, such as Fowler's solution of potassium arsenite, have been prescribed for a couple of centuries as cures for a variety of ailments.

A maxim attributed to Paracelsus is often quoted. () Loosely translated from the German: "The dose makes the poison." Most toxicologists believed, and some still believe, that there is a threshold dose below which adverse effects will not occur. If so, the threshold could be determined by epidemiology. But epidemiology has its limits: in general, an adverse effect must roughly double in a population before everyone believes that the material is toxic. () This severely limits the precautions that society can take, and it is particularly important for chronic effects such as cancer, which are usually delayed relative to the exposure and harder to analyze. Many people have suggested that in these cases society should be governed by an ill-defined "Precautionary Principle". I would define it here by the demand "Do not do it if you do not understand the consequences." So stated, the controlling word is "you". All too often starry-eyed idealists are not equipped to understand the consequences but will not accept that society should move ahead when there is a technical consensus that the consequences are understood.
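The Taylor's-theorem argument cited above can be written out in one line; this is a sketch of the standard reasoning rather than a quotation from Guess et al. or Crump et al. Let R(d) be the lifetime probability of the outcome at total dose d, and let d_0 be the background dose (or the equivalent background processes producing the same outcome). If R is smooth at d_0, then for a small added dose delta:

```latex
R(d_0+\delta) \;=\; R(d_0) + R'(d_0)\,\delta + O(\delta^{2}),
\qquad\text{so}\qquad
\Delta R \;\approx\; R'(d_0)\,\delta .
```

The excess risk is thus linear in the added dose whenever R'(d_0) is not zero; a genuine threshold would require the first derivative to vanish exactly at the background, which is why the argument is tied to the condition that the outcome be indistinguishable from one occurring naturally.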

2.6. Inconsistency of Regulatory Agencies

Almost from the beginning regulatory agencies have been inconsistent in using risk and in redefining words to suit political ends. In 1979 the EPA regulated some industrial solvents, trichloroethylene and perchloroethylene, at a level that poses a one in a million lifetime risk according to the most sensitive animal study. These regulations have been widely criticized as too stringent, but alas the criticism has been based not on the risk level (which I argue cannot be met consistently for all materials) but on the use of low dose linearity. In this I maintain that industry and toxicologists are scientifically wrong: according to the arguments above, low dose linearity should apply whenever the medical end point is indistinguishable from one occurring naturally, which is the case for most carcinogens and for the pulmonary effects of air pollution. If the 1979 argument were applied to arsenic, EPA would have to regulate 5,000 times more rigorously, an impossible task. Worse still, EPA has never admitted that its early risk assessments were too stringent (and should be revised), although more recent ones are less draconian. The failure of the EPA to formally modify these early regulations in response to better information and understanding sets a very bad precedent for the future. Although EPA sends out proposals for public comment, as required by law, the comments are not acknowledged and seem to go into a black hole. While the central office of EPA is subject to such criticisms, the field offices are worse. To some extent this gap is filled by independent organizations such as the Health Effects Institute (HEI).

Fortunately, the British Health and Safety Executive seems to understand the issues far better, as evidenced by the public statements of Dr. H. Dunster (2) noted earlier. But in a study by the UK Royal Society () a major disagreement arose: the engineers and scientists, particularly physical scientists, appeared to understand each other and what the theoretical calculation of risk means, but psychologists and other social scientists had a different understanding of risk, and there was no convergence in the report. This distinction was echoed in the USA. () It remains of vital importance to understand and describe this distinction, which goes beyond mere definition.
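Returning to the one-in-a-million target discussed at the start of this section, the arithmetic behind the arsenic comparison is just a back-calculation under low dose linearity. The slope factors below are invented placeholders, chosen only to show how a 5,000-fold difference in potency translates directly into a 5,000-fold difference in the allowable dose.

```python
# Back-calculating an "acceptable" dose from a linear cancer slope factor.
# Both slope factors are hypothetical placeholders.

TARGET_LIFETIME_RISK = 1e-6   # the "one in a million" policy target

def allowable_dose(slope_per_mg_kg_day, target=TARGET_LIFETIME_RISK):
    """Dose (mg/kg-day) giving the target lifetime risk under low-dose linearity."""
    return target / slope_per_mg_kg_day

solvent_slope = 1e-2    # hypothetical solvent potency, (mg/kg-day)^-1
arsenic_slope = 50.0    # hypothetical potency, 5,000 times higher

print(f"solvent allowable dose ~ {allowable_dose(solvent_slope):.1e} mg/kg-day")
print(f"arsenic allowable dose ~ {allowable_dose(arsenic_slope):.1e} mg/kg-day")
```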

2.7. Cancer from Fibers

The animal-man comparison has proven extraordinarily useful for acute toxic effects, and society owes a huge debt to the rodents; it has been important but somewhat less useful for carcinogenic effects, where chemical activity has been the primary problem. Other problems loom in the future. Asbestos, a well-known carcinogen, or more precisely a group of carcinogens, has added another dimension to the problem of comparison that even now is incompletely understood. It seems to be the fibrous nature of asbestos that makes it so carcinogenic, and the dimensions of the fibers are all-important. How the response varies between animal species must depend upon some geometric aspect, and it is far from clear that rodent lungs and human lungs are sufficiently similar for anything but a very rough answer. Society has therefore depended almost exclusively on epidemiology at high doses. In 1986 regulatory agencies treated all chemical forms of asbestos as the same, even though there are two distinct structures (serpentine and amphibole) which can be distinguished with an electron microscope. This hypothesis, assumed for regulatory convenience, is now being questioned for mesothelioma. The issue might be deemed of no importance, since the use of asbestos, defined as the fibrous form of the minerals, has been banned or almost eliminated in commerce. Two reasons make me concerned. First, it bodes ill for the future when a sensible public policy is based upon bad science. Secondly, the advent of nanotechnologies, with substances having a variety of new physico-chemical properties, emphasizes the need for precautionary risk assessments that go beyond what is presently understood. So far I have not found a nanotechnologist who is even asking sensible questions!

2.8. Examples Where Paracelsus' Words Must Be Modified

A naïve application of Paracelsus' words is that "more is worse, less is better." Problems already arose when considering animal-man comparisons. The first animal bioassay () of the very toxic chemical 2,3,7,8-tetrachlorodibenzo-p-dioxin (dioxin) found that, although the number of liver tumors increased at high doses, the number of more common cancers decreased. Regulatory attention has been paid to the increase, but we emphasize that scientific attention should also be paid to the decrease, called anti-carcinogenicity, which was found with other chemicals as well. () The fact that the bioassays in the National Toxicology Program were not double blind can lead to outright error and prevents a fully reliable assessment with NTP data. But in people there is one well-known chemical, ethyl alcohol, which at low doses reduces the risk of stroke, at moderate doses increases the risk of lip cancer among smokers and of many cancers in animals, and at very high doses has narcotic effects which can be disastrous. () The implication for me is that this could be a general phenomenon.

But a very major challenge for risk analysis comes from claims that for "endocrine disruptors" low doses are more important than higher ones. If these claims are true, one cannot take the measured effect at high doses and use a pessimistic (usually linear) extrapolation to low ones. () The arguments are theoretical, and so far the calculated risks are not "anchored" by firm epidemiological data on people. This makes it vital for proponents to be sure that the theory is well described to a wide community and is generally accepted.

Another example arises in the proposed regulation of perchlorate. At a major hearing on perchlorate it was pointed out that the risk is limited to pregnant women who have not taken precautionary steps. () () Yet some agencies want to regulate to a level corresponding to a one in a million lifetime chance of an adverse effect in this limited group. This is even worse than my complaint about the overly stringent regulation of solvents mentioned earlier. Such plans cannot be carried out consistently, and their application is arbitrary and capricious, and probably illegal on those grounds.

2.9. The Extension to Terrorism and Sabotage

There has never been any formal agreement on how to use these analyses to study terrorism, but there are some indications. Already in the 1970s it was realized, as noted above, that fire, flood, earthquake or sabotage could couple the steps in the event tree and vastly increase the likelihood of an accident; the overall probability would then no longer be the product of the conditional probabilities at each stage. Indeed, critics argued, incorrectly, that these "common mode failures" were being ignored in the event tree analysis. On the contrary, an understanding of the logic of the process enabled these effects to be isolated and considered separately. But initially there were all too many industry executives who did not understand the strengths and weaknesses and therefore failed to use the analysis in this way. For example, in 1976 one such executive proudly told me that his risk analysis for an LNG terminal in Oxnard, California, showed a very low risk, 10^-35 per year. But in the middle of the 230-page detailed report the crucial words could be found: "we have not considered the possibility of sabotage." For the $1,500,000 he paid for the analysis, he deserved better.

One feature emerges from these studies of sabotage (or terrorism as we would now say). A saboteur must accomplish several things at once, corresponding to the steps in the event tree. For example, a terrorist could be a "sleeper": learn nuclear safety issues and get a job with a power company so that he knows the weak points. Thinking backwards along the event tree, he could wait until the wind blows toward a major city, then set off a small bomb to open up the containment vessel and, at the same time, with another bomb or otherwise, break a reactor coolant pipe and disable the ECCS (Emergency Core Cooling System). It is hard to imagine a single saboteur doing all this, but a group of armed terrorists might.

I have envisaged setting up an event tree for terrorism with 4 basic steps:

(1) Preventing a person becoming a terrorist.

(2) Preventing terrorist access to weapons.

(3) Preventing terrorist access to dangerous facilities.

(4) Making every facility more robust against terrorist attack.

In all uses of an event tree the importance of reducing the number of initiating events is very clear: if we can prevent someone from becoming a terrorist, we need not worry about the other steps. Although there are gradations, studies of terrorist motivations suggest that fewer than one or two thousand people have the motivation to mount an attack like that of 9/11, yet there are many hundreds of thousands of potential targets. This suggests that in such an event tree the same rule applies as in an ordinary engineering analysis: the earlier in an event sequence you can stop the hazard, the easier the task and the more easily it is understood.
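A sketch of the four-step event tree just described, with invented numbers, serves to illustrate the arithmetic point rather than any real estimate: because the steps multiply, halving the probability at step (1) halves the expected number of successful attacks no matter what is done downstream.

```python
# Sketch of the four-step terrorism event tree (all numbers invented).
# Treats the steps as independent, which the text notes is only a first cut.

def expected_attacks(n_potential_terrorists,
                     p_not_prevented_step1,   # person actually becomes a terrorist
                     p_obtains_weapon,        # step (2) fails
                     p_reaches_facility,      # step (3) fails
                     p_facility_not_robust):  # step (4) fails
    return (n_potential_terrorists * p_not_prevented_step1 * p_obtains_weapon
            * p_reaches_facility * p_facility_not_robust)

baseline     = expected_attacks(2000, 0.05, 0.5, 0.1, 0.2)
step1_halved = expected_attacks(2000, 0.025, 0.5, 0.1, 0.2)

print(f"baseline              ~ {baseline:.1f} expected attacks")
print(f"with step (1) halved  ~ {step1_halved:.1f}")
```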

Step (2) in the United States is almost impossible to achieve for small arms and even submachine guns; they are easily available. But it is probably at this step that one could most effectively control nuclear terrorism: it can be made difficult to steal a complete bomb, and hard to steal fissile material. This has been a priority of society for half a century. It may be harder to safeguard chemical and biological materials, but here also the effort is definitely worth it.

In the USA step (3) is where most attention has been paid. We try to prevent potential terrorists from entering the country, and any suspicious action can be met with drastic, even draconian, measures. Unfortunately this step inevitably involves a fundamental conflict with the deeply held belief in individual liberty, and the details have been argued back and forth in the USA for the last 10 years. I am far from convinced that the effort here has had much effect, let alone been worth the reduction in individual liberty that has been its inevitable consequence. Also at step (3) there is a very dangerous tendency to make many plans and actions secret. I personally believe that secrecy is far too often used to cover up incompetence. There are a few places in society where secrecy is helpful, but not many. I believe that the security advantages of an open society far exceed the dangers that a few lapses in security might create.

Most risk assessors concentrate on step (4), not because it is the most important but because it is the easiest to analyze. Many analysts of engineering systems have commented that if one has properly understood the Low Probability High Consequence accidents, one has done most of the required analysis. In a personal discussion of sabotage I had with Professor Rasmussen soon after the Three Mile Island accident, he commented: "there is nothing that a saboteur can do that those clowns (the TMI operators) did not do on their own." To which I replied: "Yes, but a saboteur might do it more often." The low probability can become a high probability. It follows that adjusting the technology, either in construction or operation, to make these events more difficult will be important.

It is less obvious that similar attention should be paid to Low Probability High Consequence events in biological or chemical sequences. Most people would argue that it is unlikely that the 1918-1919 influenza pandemic, with its 80 million deaths worldwide, can recur, but many would argue that the probability could be increased by a terrorist who obtains some undesirable spores. Some experts disagreed, but the waverers were convinced by the Severe Acute Respiratory Syndrome (SARS) epidemic, which began with a previously unknown pathogen. The Chinese, prompted by WHO, acted firmly, using the old technique of quarantine. Again, a good defense against such a terrorist attack is for society to be prepared to recognize an epidemic (which can be localized and often is not spread further by infection) quickly and to act within weeks to prevent it becoming a pandemic that spreads across nations and continents.

A conclusion I draw is that a proper event tree analysis for Low Probability High Consequence events is a major necessity for all technologies, and for public reassurance it should be as open a process as possible. As noted earlier, Probabilistic Risk Analysis has been slow to enter society and it has not yet been accepted by the building industry. That makes me nervous about any new tall building, especially at the World Trade Center site. The attraction of this site for a terrorist is evident, and public attention to the risks must be correspondingly increased. Alas, there is a societal tendency to react belligerently to a challenge posed by the action of another, and deliberately to keep a site that will incite potential terrorists. I hope that the USA can resist this dangerous tendency.

Of course, these four steps in my event tree for terrorism are not independent, and we must consider, even more than for "naturally" occurring sequences, the coupling between the steps. I call attention here to an often-stated maxim: if someone with a sensitive technology takes steps to make it secure and makes this publicly known, a potential terrorist is likely to attack somewhere else, or to abandon the idea of terrorism completely. Another potential coupling is that draconian efforts in step (3), even if they seem to decrease the probability in step (1), may cause enough anger that there is a long-term increase in step (1), with huge consequences.

2.10. Risks of Cosmic Catastrophe

Another specialized risk has become a subject of concern: the risk of cosmic catastrophe. Again, events that could lead to cosmic catastrophe can be either natural or man-made. The first such risk analysis was performed in secret in the summer of 1945. Scientists at Los Alamos, particularly Edward Teller, noted that the temperature in a nuclear explosion would be millions of degrees, higher than any man-made temperature theretofore. The question was posed: could the high temperature ignite the atmosphere, causing the nitrogen to react, or the oxygen and nitrogen to combine, not merely in the local region but in the whole atmosphere surrounding the globe? Several of the world's brightest scientists, including Fermi, Bethe, and Weisskopf, pondered this issue. They decided that it could not happen. The actual words are unknown and the subject of legend: "the probability is less than one in a million", presumably meaning, somewhat loosely, that one in a million such decisions might be wrong. () These scientists clearly felt that "one in a million" was as close to impossible as they wanted to state, not being aware that "one in a million" would later, foolishly, be attempted as a regulatory standard. ()

Recently two other cosmic risks have been publicly considered. In 1999 the Relativistic Heavy Ion Collider (RHIC) was commissioned at Brookhaven National Laboratory (BNL). This was a true atom smasher, in which ions of heavy atoms were induced to collide; the compound nucleus was larger and highly excited. Many imaginative scientists speculated about what the effect might be. Could the high excitation lead the particles to create a "mini black hole" that might eventually absorb the whole earth and solar system? BNL set up an expert committee, which studied the matter and decided that it could not happen. RHIC started operation and there is no sign of untoward cosmic events. In 2008 another accelerator was commissioned, the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, in Geneva. The CERN management also set up a committee of experts to study the issue and, like the BNL committee, decided that there was no problem. But two scientists, Luis Sancho and Walter Wagner, disagreed and took the matter to a court in Hawaii, which dismissed it for lack of jurisdiction. ()

The "lack of jurisdiction" was a legal question, but it is common sense that everyone on earth had an interest in the contemplated cosmic hazard. In all three of these situations the scientific community acted ahead of the rest of the body politic in raising the issue and, by openly setting up a review committee, encouraged imaginative thought. In the first case wartime secrecy prevented the general public from having any input. It is unclear who should make the final decisions in such cases. One of the concerns expressed by the plaintiffs in the Hawaii case was that the CERN reviewing committee, while composed of experts, was composed of experts interested in seeing the project go ahead. A concern in the opposite direction is that the plaintiffs were merely publicity seekers with no substance to their claim. Those who follow US toxic tort cases are used to such contrasting concerns. Society's failure to come to grips with them in that context may result in a "random redistribution of wealth or random individual tragedies", but when cosmic risks are concerned there is no such room for complacency.

The above discussions are not meant to provide an encyclopedic history of risk analysis. Rather, they describe how I, as a physical scientist, have come to understand the important issues. For this reason I have deliberately kept the number of references small. Other analysts have had other experiences and have noted other problems where logical scientific principles can be helpful. I look forward to reading them.

3. REFERENCES

Alderslade R. The Public Health Act of 1848: The act's qualities of imagination and determination are still needed today. British Medical Journal, 1998; 317(7158):549-550.

Health and Safety Executive (HSE). Canvey: An investigation of the potential hazards from operations in the Canvey Island/Thurrock area. 49 High Holborn, London, UK: Her Majesty's Stationery Office (HMSO), 1978.

Dunster HJ. The Approach of a Regulatory Authority to the Concept of Risk. International Atomic Energy Agency (IAEA) Bulletin, 1980; 22(5/6).

Wikipedia. Defense in Depth. Available at: http://en.wikipedia.org/wiki/Defence_in_depth, Accessed on October 4, 2011.

Rasmussen N et al. Reactor Safety Study: An assessment of accident risks in U.S. commercial nuclear power plants. Atomic Energy Commission Report, WASH-1400; Nuclear Regulatory Commission Reports, RM-30-1 and NUREG-75/014, 1975.

Wilson R. Kilowatt Deaths. Physics Today, 25(73), 1972.

Lewis HW, Budnitz RJ, Kouts HJC, Loewenstein WB, Rowe WD, von Hippel F, Zachariasen F. Risk Assessment Review Group Report to the U.S. Nuclear Regulatory Commission. U.S. Nuclear Regulatory Commission Report, NUREG/CR-0400, 1978.

Nuclear Regulatory Commission (NRC). Probabilistic Risk Assessment—Fact Sheet. NRC Office of Public Affairs, October 2007.

Pochin, Sir E. Problems Involved in Developing an Index of Harm. ICRP Publication No. 27. Oxford: Pergamon Press for the International Commission on Radiological Protection, 1977.

Rhomberg LR, Wolff SK. Empirical Scaling of Single Oral Lethal Doses across Mammalian Species based on a Large Database. Risk Analysis, 1998; 18(6):741-753.

Crouch E, Wilson R. Interspecies Comparison of Carcinogenic Potency. Journal of Toxicology and Environmental Health, 1979; 5:1095-1118.

Tomatis L, Agthe C, Bartsch H, Huff J, Montesano R, Saracci R, Walker E, Wilbourn J. Evaluation of the Carcinogenicity of Chemicals: A Review of the Monograph Program of the International Agency for Research on Cancer (1971 to 1977). Cancer Research, 1978; 38(4):877-885.

Zeise L, Crouch EAC, Wilson R. A Possible Relationship Between Toxicity and Carcinogenicity. International Journal of Toxicology, 1986; 5(2):137-151. doi: 10.3109/10915818609141018

Sambu S, Wilson R. Arsenic in food and water: a brief history. Toxicology and Industrial Health, 2008; 24:217-226.

Crouch E, Wilson R. Interspecies Comparison of Carcinogenic Potency. Journal of Toxicology and Environmental Health, 1979; 5:1095-1118.

Krishnamohan M, Qi L, Lam PKS, Moore MR, Ng JC. Urinary arsenic and porphyrin profile in C57BL/6J mice chronically exposed to monomethylarsonous acid (MMAIII) for two years. Toxicology and Applied Pharmacology, 2007; 224:89-97.

Crowther J. Some considerations relative to the action of x-rays on tissue cells. Proceedings of the Royal Society B: Biological Sciences, 1924; 96:207-211.

International Commission on Radiological Protection (ICRP). International Recommendations for X-ray and Radium Protection: 1928 Recommendations. July, 1928. (Superseded by later ICRP publications, most recently ICRP Publication 103.)

Zeise L, Crouch EAC, Wilson R. The Dose Response Relationships for Carcinogens: A Review. Environmental Health Perspectives, 1987; 73:259.

Guess H, Crump KS, Peto R. Uncertainty estimates for low-dose-rate extrapolation of animal carcinogenicity data. Cancer Research, 1977; 37:3475-3483.

Crump KS, Hoel DG, Langley CH, Peto R. Fundamental carcinogenic processes and their implications for low dose risk assessment. Cancer Research, 1976; 36:2973-2979.

Hazewinkel M (ed). Taylor's theorem. In: Encyclopaedia of Mathematics. Springer, 2001. (ISBN 978-1556080104)

Crawford M, Wilson R. Low-Dose Linearity: The Rule or the Exception? Human and Ecological Risk Assessment, 1996; 2:305-330.

Zeise L, Crouch EAC, Wilson R. The Dose Response Relationships for Carcinogens: A Review. Environmental Health Perspectives, 1987; 73:259.

Linkov I, Wilson R, Gray GM. Anticarcinogenic Responses in Rodent Cancer Bioassays are not Explained by a Random Effect. Toxicological Sciences, 1998; 43:1-9.

Madea F, Mußhoff G, Berghaus K. Verkehrsmedizin: Fahreignung, Fahrsicherheit, Unfallrekonstruktion. Deutscher Ärzte-Verlag, 2007; 435.

Taubes G, Mann CC. Epidemiology Faces its Limits. Science, 1995; 269(5221):164.

Royal Society Study Group (RSSG). Risk: Analysis, Perception and Management. London: The Royal Society, 1992.

Slovic P. Perceived Risk, Trust and Democracy: A Systems Perspective. Risk Analysis, 1993; 13(6):679-682.

Kociba RJ, Keyes DG, Beyer JE, Carreon RM, Wade CE, DiHenberger DA, Kalnins RP, Franson LE, Park CN, Barnard SD, Hummel RA, Humiston CG. Results of a two-year chronic toxicity and oncogenicity study of 2,3,7,8-tetrachlorodibenzo-p-dioxin in rats. Toxicology and Applied Pharmacology, 1978; 46:279-303.

Linkov I, Wilson R, Gray GM. Anticarcinogenic Responses in Rodent Cancer Bioassays are not Explained by a Random Effect. Toxicological Sciences, 1998; 43:1-9.

Doll R. The Benefit of Alcohol in Moderation. Drug and Alcohol Review, 1998; 17:353-363.

Environmental Protection Agency (EPA). What Are Endocrine Disruptors? Endocrine Disruptor Screening Program (EDSP). Available at: http://www.epa.gov/endo/pubs/edspoverview/whatare.htm, Accessed on October 5, 2011.

Tellez RT, Chacon PM, Abarca CR, Blount BC, Landingham CB, Crump KS. Long-term environmental exposure to perchlorate through drinking water and thyroid function during pregnancy and the neonatal period. Thyroid, 2005; 15(9):963-975.

Centers for Disease Control and Prevention (CDC). Perchlorate. National Report on Human Exposure to Environmental Chemicals, CAS No. 7601-90-3. Available at: http://www.cdc.gov/exposurereport/data_tables/Perchlorate_ChemicalInformation.html, Accessed on October 5, 2011.

Konopinski EJ, Marvin C, Teller E. Ignition of the Atmosphere with Nuclear Bombs. 1946 (declassified February 1973). Retrieved November 23, 2008.

Kammen DM, Shlyakhter AI, Wilson R. What is the Risk of the Impossible? Technology: Journal of the Franklin Institute, 1994; 331A:97-116.

LHC Safety Review Organization. Background Documents for an Independent Assessment of the LHC's Safety. Compiled by LHCSafetyReview.org, April 4, 2011.

ACKNOWLEDGEMENTS

The author has had the benefit of discussions with many collaborators and hundreds of students; I thank them for preventing me from becoming too dismissive of alternative views. In particular, I would like to thank my wife, who puts up with the constant discussion of risks and dangers rather than opportunities and joys. I also thank my friend and collaborator of 30 years, Dr. Edmund Crouch, and my father, who taught me to question authority and to discern when it was lacking. The paper was completed with the help of Ms Jostine Ho, who helped with various references and criticized the text.

Author: Richard Wilson*

*Mallinckrodt Professor of Physics (emeritus), Harvard University, Cambridge, MA 02138.

For correspondence contact: Richard Wilson, Department of Physics, Jefferson Physical Laboratory Rm. 257, Harvard University, Cambridge, MA 02138.

Email: [email protected], Fax: +1 617 495 0416, Office: +1 617 495 3387.

Based on a presentation to the 2nd World Congress on Risk (Society for Risk Analysis),

Guadalajara, Mexico, June 8th 2008.
