Absence of Certainty Is Not Synonymous with Absence of Risk

John Cairns, Jr. / Editorial / Environmental Health Perspectives, Volume 107, Number 2, February 1999

Once harm has been done, even a fool understands it.
Homer, The Iliad, Book XVII, l. 32

Bad things happen. Although human society has been remarkably successful in developing techniques to preempt some bad events, the successes have raised expectations. As we have devised ways of reducing some risks, smaller risks seem more significant. Simultaneously, technological solutions to some problems have created new risks of their own. The environmental risks that have resulted, from growth in both human population and technology, fall into two main categories: 1) the possible consequences of misplaced wastes and 2) the wholesale shift of land from natural systems that produce many different ecosystem services to human-managed systems that produce, at most, one or two. The significance of these changes to human society is the subject of much debate. The debate has, at times, become so polarized that it is difficult to distinguish what is known from what is supposed. The basic concepts of risk and uncertainty are keys to following this debate.

Every possible change or action has risks and benefits. Risks are those consequences that human society finds undesirable, and benefits are the desirable consequences. Both risks and benefits have three components: 1) how likely is it (probability)? 2) how good or bad is it, and how large a spatial and temporal area does it affect (magnitude)? and 3) what is the quality of the information on which estimates of probability and magnitude are based (uncertainty)? Uncertainty is thus a component of both risks and benefits, resulting from imperfect knowledge about the probability or magnitude of the consequences of a change.
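
As a rough formalization (an illustration added here, not part of the original editorial), the first two components combine into an expected value, and the third describes how well that value is known:

\[
\text{expected harm} \;=\; p \times m
\]

where p is the probability of the adverse consequence and m is its magnitude. Uncertainty means that p and m are themselves only estimates, so the decision maker faces a range of plausible values of p times m rather than a single number.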

While high uncertainty may obscure both the probability of a risk and the magnitude of harm, uncertainty does not eliminate risk. Unrecognized risks are still risks; uncertain risks are still risks; and denied risks are still risks. For example, before the research of Pasteur, Lister, and Koch in the late 1800s, the concept that germs could cause disease was unrecognized. Surgeons routinely operated without washing their hands. The risk of dying from surgery was recognized and weighed against possible benefits. However, many people died, and others gave up the possible benefits of surgery, because the risk of infection went unabated. Similarly, the British Navy observed that lime juice reduced the incidence of scurvy among the crews of sailing ships on long voyages (hence "limeys"). Even so, decades passed before lime juice was supplied to every ship.

As we deal with potential risks to larger, more complex systems, the uncertainty associated with estimates of risk becomes irreducibly larger. How should we, as a society, respond to the increasing need to make decisions about risks that carry high uncertainty? Choices that are relatively easy when dealing with well-characterized linear systems become more difficult as the problems grow larger and knowledge of the present state, likely future behavior, and interconnections among the many elements of complex systems is limited. Two kinds of mistakes are possible: false positive errors and false negative errors. If we act to ameliorate a risk that later proves invalid, we have made a false positive error and have expended resources without any resulting benefit. If we choose not to act to ameliorate a risk that later proves valid, we have made a false negative error and have forfeited the simpler, more parsimonious, and less wrenching solutions that timely action would have allowed. If the response is delayed even longer, expensive remediation may become necessary, or resources may be lost for which there is no possible replacement, including, but not limited to, human life. As the systems whose future is being predicted become larger and more complex, these errors grow more costly.
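
The trade-off between the two errors can be made concrete with a small decision-theoretic sketch. The Python below is purely illustrative: the function and parameter names (expected_cost, cost_act, cost_ignore) and every number in it are hypothetical values invented for this example, not figures from the editorial.

# Hypothetical illustration of the false positive / false negative trade-off.
# None of these numbers come from the editorial; they only show the logic.

def expected_cost(p_risk_real, cost_act, cost_ignore):
    """Expected cost of acting versus ignoring, given a believed
    probability that the risk is real.

    - Acting costs cost_act whether or not the risk is real
      (a false positive wastes the amelioration spending).
    - Ignoring costs cost_ignore only if the risk is real
      (a false negative forfeits the cheaper early remedy).
    """
    return cost_act, p_risk_real * cost_ignore

# A risk of large magnitude: even a modest probability can justify acting.
for p in (0.02, 0.1, 0.5):
    act, ignore = expected_cost(p, cost_act=10, cost_ignore=200)
    choice = "act" if act < ignore else "ignore"
    print(f"p={p:.2f}  E[cost|act]={act:6.1f}  E[cost|ignore]={ignore:6.1f}  -> {choice}")

With a sufficiently large magnitude of harm (cost_ignore), the expected cost of ignoring exceeds the fixed cost of acting even at small probabilities, which is the arithmetic behind taking small but high-magnitude risks seriously.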

Both science and ethics have roles in environmental decision making. In most risk assessment schemes, science and ethics come into play sequentially. Ethical considerations define what society considers a problem requiring action. Science contributes probabilistic statements about the nature of the world, the connections between events, and the likelihood of future happenings, but science does not tell us what we should do. Ethics, or societal consensus, comes into play again in the risk management phase, as the relative importance of uncertainties, risks, and benefits is weighed and a course of action is chosen. The decision maker can make an educated guess about the future but must weigh the uncertainty of the prediction against the unpleasant consequences should the decision be wrong. The uncertainty of a risk estimate may matter little if the decision is to avoid the risk because the magnitude of harm is large and the magnitude of benefits small. Conversely, a large probability of a small adverse effect, in combination with widespread and large benefits, might be judged acceptable.

The appropriate degrees of tolerance for both risk and uncertainty are important matters of public debate, and the two concepts are closely interlinked. The uncertainties of benefits need to be scrutinized as carefully as the uncertainties of risks. Intolerance of uncertainty pushes choices toward a higher tolerance of risk: as action is delayed, the options for abatement narrow while the impact progresses from early to late stages. The absence of certainty is not synonymous with the absence of risk.

In areas of great uncertainty, it is more difficult to distinguish between ethical claims and scientific ones. The response to similar risks and similar uncertainties varies depending on factors such as whether the risk is consistent with our ethics, whether the risk is freely chosen or imposed, and whether the associated benefit comes directly to us or to society at large or to people other than those bearing the risk. Some substantial risks, such as the risk of injury in a car accident, are accepted willingly because the benefit is large and very direct. A similar risk without such direct benefits may be begrudgingly accepted or rejected. If an estimate of risk has high uncertainty and is inconsistent with our belief system, we may well deny it, declaring the true picture to be lost in the uncertainty. Thus, a decision that seems to be based on empirical evidence to one person may, at the same time, appear to be based on ethics to another because the first person judges the level of uncertainty associated with the scientific evidence to be acceptable, whereas the second person judges it too high.


Table 1. Examples of the varying qualities of information used to estimate risk.

Scientific Observations
	Single
		Results of one or more controlled and replicated studies
		Observations of one or more natural systems
	Multiple
		Meta-analysis of multiple designed studies
		Synoptic surveys of natural systems
	Consilient
		Observations showing that multiple lines of evidence are consistent 
		  and can be connected with other phenomena
Educated Guesses
	Models used to link several well-studied processes together
	Models used to extrapolate from studied conditions to unstudied ones
	Models used to extrapolate from one spatial, temporal, or hierarchical scale to another
	While most predictive models cannot be directly compared to observed effect because they 
	  predict future behavior, often at scales that are untestable, the most convincing models 
	  will be consensual (the state of the art as determined in peer review) and will be based 
	  on observations of relevant processes at the highest possible level (9)

Speculation
	Untested unifying principles
	Models in which the component processes are not based, at the highest possible level, on observation

Hyperbole
	Overstatement or catastrophizing of information
	Biased, selective citation of information; anecdotal evidence when meta-analyses are possible
	Redefinition of key terms
	Hiding or misrepresenting the uncertainty associated with information

Uncertainty is a function of the extent and quality of knowledge about a problem. The evidence used to estimate a risk can vary widely in quality (Table 1). Shrader-Frechette (1) suggests that part of the antiscience backlash recently described by several authors [see Ehrlich and Ehrlich (2)] might be attributed to a failure of scientists and decision makers to clearly distinguish observed effects from educated guesses or from speculation. While society must learn some tolerance for the irreducible uncertainty resulting from complexity, it should be intolerant of uncertainty resulting from a simple failure to collect information or from sloppy scientific technique or interpretation. The costs of collecting this information should be borne by those who will benefit from the action, not by those supporting a precautionary stance. In our own self-interest, even small and uncertain probabilities of risks of large magnitude or irreparable harm must be taken seriously. The scientific method is not capable of proving an action is "safe."

The best that can be offered is repeatedly finding no evidence of adverse effects. And, in fact, when common statistical methods are chosen that emphasize the control of Type I errors, the benefit of the doubt falls toward the conclusion of no adverse effect (3). There will be more false negatives (concluding no adverse effect when in fact there is one) than false positives (concluding an adverse effect when in fact there is none).
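
This asymmetry can be seen in a small simulation. The sketch below is an illustration added here, not an analysis from reference 3: the effect size, sample size, and significance level are arbitrary choices. A real adverse effect is present in every simulated study, yet a conventional test that caps the Type I (false positive) rate at 5% misses the effect most of the time.

# Illustrative simulation: capping the Type I error rate at alpha = 0.05
# while statistical power is low produces many false negatives.
# Effect size, sample size, and trial count are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 0.05
n = 20            # samples per group (small, as in many field studies)
effect = 0.5      # a real adverse effect, in standard-deviation units
trials = 10_000

false_negatives = 0
for _ in range(trials):
    control = rng.normal(0.0, 1.0, n)
    exposed = rng.normal(effect, 1.0, n)   # the adverse effect is real
    _, p_value = stats.ttest_ind(control, exposed)
    if p_value >= alpha:                   # conclusion: "no adverse effect"
        false_negatives += 1

print(f"False negative rate: {false_negatives / trials:.2f}")

With these settings the test misses the real effect roughly two times in three, while the false positive rate is held at 5% by construction; the benefit of the doubt falls toward concluding no adverse effect, exactly as described above.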

Quality and credibility of scientific evidence can be judged. Wilson (4) lists five attributes of robust scientific evidence: 1) repeatability, 2) economy (yielding the largest amount of information with the least amount of effort), 3) mensuration, 4) heuristics, and 5) consilience (the explanations of different phenomena most likely to survive are those that can be connected and proved consistent with one another). Shrader-Frechette (5) described a risk spectrum: at one end are cultural relativists who believe that risks are only cultural constructs, and at the other end are naive positivists who believe that risk assessment is completely objective, neutral, and value free. The cultural relativists underestimate or dismiss the scientific component of risk assessment. The naive positivists underestimate or dismiss the ethical components. However, those who simply deny the existence of risks may be an important third group.

Denial (6) is still a common reaction to many potential environmental risks such as global climate change, environmental endocrine disruptors, and increased human population size and affluence. Regrettably, denial of risk blocks the accumulation of data that would improve estimates of risk and consequence.

Denial also polarizes public debate into "them" and "us" categories. This polarization serves as news entertainment but does not advance the understanding of complex issues. This obstacle is particularly unfortunate because complex, multidimensional problems are not as amenable to rapid resolution through reductionist scientific approaches as previous problems have been. Instead, these problems must be approached by using integrative scientific efforts on a scale larger than in the past. Deniers can be subdivided into two overlapping groups: those who believe there are no problems and those who believe that any problems arising can be solved by human ingenuity. Exemptualists [e.g., see Myers and Simon (7)] believe human creativity and technology exempt human society from risks that result from the biophysical laws of nature. The related, but less sweeping, view is that a technological solution can be found for every problem created by technology (8). These views diminish the importance of risk assessment in their belief that any and all risks can be abated successfully as soon as they become bothersome enough.

In conclusion, some of the risks that are most vigorously denied (e.g., global warming, damage to the ozone layer, overpopulation) would, if realized either singly or in combination, markedly reduce the prospects of leaving a habitable planet. The tendency to discount risks that are temporally or spatially distant or those on larger temporal or spatial scales runs counter to aspirations for sustainable use of the planet.

As stated previously, unrecognized risks are still risks; uncertain risks are still risks; denied risks are still risks! The precautionary principle embodies the belief that it is prudent to attempt to diminish risks with particularly severe consequences, even if the probability of occurrence is moderate or the uncertainty high. Excessive confidence in our own ability to solve problems after harm has been done could result in awkwardness or tragedy.

John Cairns, Jr.
Department of Biology
Virginia Polytechnic Institute and State University
Blacksburg, Virginia

References and Notes

1. Shrader-Frechette KS. Science versus educated guessing--risk assessment, nuclear waste, and public policy. Bioscience 46(7):488-490 (1996).

2. Ehrlich PR, Ehrlich AH. Betrayal of Science and Reason: How Anti-environmental Rhetoric Threatens Our Future. Covelo, CA:Island Press/Shearwater Books, 1996.

3. Shrader-Frechette KS. Science, environmental risk assessment, and the frame problem. Bioscience 44(8):548-552 (1994).

4. Wilson EO. Scientists, scholars, knaves and fools. Am Sci 86:6-7 (1998).

5. Shrader-Frechette KS. Risk and Rationality: Philosophical Foundations for Populist Reforms. Berkeley, CA:University of California Press, 1994.

6. Orr DW, Ehrenfeld D. None so blind: the problem of ecological denial. Conserv Biol 9:985-987 (1995).

7. Myers N, Simon JL. Scarcity or Abundance? New York:W.W. Norton & Company, 1994.

8. National Academy of Engineering. Technological Trajectories and the Human Environment. Washington, DC:National Academy Press, 1997.

9. Barnthouse LW. The role of models in ecological risk assessment: a 1990s perspective. Environ Toxicol Chem 11:1751-1760 (1992).
