By: Hans Plugge, Mark Grossman
When it comes to risks, particularly those associated with chemicals, many people may be uncomfortable with an ambiguous answer to a key question: How risky is an exposure?
Scientists (and especially engineers) often state that a certain level of exposure to a substance is "safe." But scientists also frequently struggle to effectively communicate the uncertainties in their statements. And even when they express uncertainty, those concerned about the risks don't want to hear the scientists say: "It depends."
The truth is that even when risk is communicated in the simplest of terms, the underlying uncertainty may be obscured. For instance, the American Conference of Governmental Industrial Hygienists (ACGIH®)—an organization whose stated objective is to improve occupational and environmental health—might establish a Threshold Limit Value (TLV®) of 5 parts per million (ppm) for occupational exposure to a given chemical. For the general public, this could be interpreted to mean that at 4.9 ppm, the exposure is safe and at 5.1, it's dangerous. Simple, right? Not so fast. A look at the definition provided by the ACGIH® indicates that TLVs® are not "a fine line" between safe and dangerous because an individual's response to exposure may vary based on individual, workplace, and other factors.1 2
In practice, the steps undertaken to generate what are considered safe limits of exposure to a given chemical involve some degree of uncertainty, which compounds throughout the limit setting process. This often results in a seemingly definitive hard number, which can be interpreted differently by various parties. The key question is how to relay this uncertainty in understandable terms, especially when, unlike the ACGIH® TLVs®, the precautions that need to be applied when using the limits are not flagged. For most limits that are expressed as hard numbers, this uncertainty may not be appropriately quantified and conveyed, thus potentially impacting the vital risk communication process. In addition, compensating for this uncertainty through various uncertainty safety or assessment factors often leads to overprotection, incurring a societal cost that generally goes unacknowledged, let alone communicated. Societal costs can include the cost of overcompliance (e.g. cost of engineering controls), cost of undercompliance (e.g. cost of additional health effects, regulatory penalties), as well as the cost of testing (e.g. the ethics of using test animals).
The sources of uncertainty
Regulatory standards have typically used hard numbers to communicate risk levels under the assumption that assessment factorsi have been built in. Assessment factors, often spanning multiple orders of magnitude, convey an unnuanced perception of (un)certainty. Whether it is occupational exposure,ii environmental exposure, drinking water contaminant levels, or speed limits, the public is used to seeing hard numbers,iii which are generally perceived simply as hard cut-offs.
Thus, one of the most critical tasks for risk managers is to tell a story without bias and judgment about such numbers while addressing the uncertainty incorporated (sometimes by design) into such "hard" numbers.
Risk uncertainty can be attributed to various sources, including:
Exposure measurement error: This is the compounding of sampling error associated with using instruments to collect samples and analytical error from laboratory methods used to analyze samples. While this is a frequently acknowledged error, it rarely exceeds the 10-30 percent range.
Variability in experimental subjects: When testing on living organisms, even something as simple as the variation in fish size in an acute toxicity test can produce errors in the 10-30 percent range.
Variability in sensitivity: Test subjects can respond differently to similar exposure levels. Even good duplicate studies on similarly sized rats often incur an order of magnitude variability.3 Then comes extrapolation to humans, another major, and often unknown, source of uncertainty.
Add to these the differential response from various ages or sensitive populations and one can see that uncertainty plays a big role in standard-setting. Depending on the types of errors considered, it may range from less than 10 percent up to several orders of magnitude.iv 4 5
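As a rough illustration of how the sampling and analytical errors above compound, independent coefficients of variation combine in quadrature. The percentages below are assumed values within the 10-30 percent ranges mentioned, not measured figures:

```python
import math

def combined_cv(*cvs: float) -> float:
    """Combine independent coefficients of variation in quadrature."""
    return math.sqrt(sum(cv ** 2 for cv in cvs))

# Illustrative values only: a 15% sampling CV and a 10% analytical CV.
sampling_cv = 0.15
analytical_cv = 0.10
total = combined_cv(sampling_cv, analytical_cv)
print(f"Combined measurement CV: {total:.1%}")  # ~18%, still within the 10-30% range
```

Note that the combined error grows more slowly than a simple sum, which is why measurement error alone stays modest; the order-of-magnitude uncertainty comes from the biological sources above.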
How safety factors can skew the perception of risk
How have these uncertainties been accounted for during standard setting? One approach is to employ sophisticated statistical techniques or data analytics. In many cases, however, so-called safety/uncertainty/assessment factors6 have been used as a faster but less precise alternative. These factors do not actually measure uncertainty (or safety); rather, they are ballpark numbers on the (half) order of magnitude scale, accounting, for instance, for extrapolation from short- to long-term exposures.v The various uncertainty factors are then multiplied, often resulting in an overall assessment factor on the order of 100-1,000, or even higher. The hope is that these assessment factors are protective, but they often result in overly conservative (very low) exposure limits such as PNECs or DNELs.vi 7 8 Given this, usage of such assessment factors has recently been questioned: data analytics across a broad range of studies suggested that individual assessment factors may need to be much lower than 10.9 10
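The multiplication of assessment factors can be sketched in a few lines. The NOAEL starting point and the three factors of 10 below are illustrative assumptions in the spirit of the compound factors described above, not values from any actual assessment:

```python
# Sketch of how multiplied assessment factors drive a derived limit downward.
# All numbers here are illustrative assumptions, not regulatory values.
noael_mg_per_kg = 50.0  # hypothetical animal no-observed-adverse-effect level

assessment_factors = {
    "interspecies (animal to human)": 10,
    "intraspecies (human variability)": 10,
    "subchronic to chronic extrapolation": 10,
}

compound_factor = 1
for name, factor in assessment_factors.items():
    compound_factor *= factor

derived_limit = noael_mg_per_kg / compound_factor
print(f"Compound assessment factor: {compound_factor}")  # 1000
print(f"Derived limit: {derived_limit} mg/kg/day")       # 0.05
```

Three ballpark factors of 10 each turn a 50 mg/kg starting point into a limit a thousandfold lower, which is how hard numbers end up carrying hidden orders of magnitude of conservatism.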
Uncertainty increases when one starts discussing levels of risk. After all, risk is the product of hazard and exposure, thereby propagating the error/uncertainty in both estimates.11 One needs to be careful here, since a discussion of risk is often actually a discussion of hazard. Determining the likelihood that chemical X will cause a health effect is a hazard estimate, not a risk estimate, because exposure to chemical X isn't considered. Exposure to a low dose of chemical X may not carry the risk that exposure to higher doses does.
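Because risk is the product of hazard and exposure, multiplicative uncertainties in the two estimates compound on the log scale. A minimal sketch, assuming (illustratively) that each estimate is known only to within a factor of three:

```python
import math

# Sketch: for a product such as risk = hazard x exposure, independent
# multiplicative uncertainties combine in quadrature on the log10 scale.
# Both factors below are illustrative assumptions.
hazard_uncertainty_factor = 3.0    # hazard estimate known to within ~3x
exposure_uncertainty_factor = 3.0  # exposure estimate known to within ~3x

combined_log10 = math.sqrt(
    math.log10(hazard_uncertainty_factor) ** 2
    + math.log10(exposure_uncertainty_factor) ** 2
)
combined_factor = 10 ** combined_log10
print(f"Combined uncertainty factor on the risk estimate: ~{combined_factor:.1f}x")
```

Two modest factor-of-three uncertainties already yield nearly a factor-of-five spread on the risk estimate, which is why hard-number risk figures deserve skepticism.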
Vitamin A and selenium are good examples of the famous phrase, "the dose makes the poison." Too little is bad (they are essential nutrients, after all) and too much is bad, with the probability of negative in utero developmental effects in both cases.12
Managing risk when hazard data are not available
While uncertainty poses issues even when hazard data are available, there are scenarios in which risk managers must compensate for a dearth of this information. How do safety and health professionals get a handle on acceptable chemical exposures when there is little or no toxicity data to establish a single limit? Contemporary scientists typically look at structural similarities between chemicals for which data are available and those for which data are lacking.
One example is read-across – assigning health effects from a similar chemical to the chemical of concern. For example, one of the earliest standards for carbon nanotubes was based on the standards created for asbestos fibers, due to the similar shapes of the structures.13 Another is computational toxicology – computerized modeling of a chemical to identify structures of concern and calculate effects based on these parameters. The U.S. Environmental Protection Agency (U.S. EPA), for example, is heading in that direction, partially based on the regulatory need to eliminate or reduce animal testing.14 15
Other approaches include banding, exposure minimization, and Thresholds of Toxicological Concern (TTC).
Generally speaking, banding involves replacing the "hard limit" with an acceptable range of exposures, communicating the inherent uncertainty in an exposure limit. For example, the U.S. National Institute for Occupational Safety and Health (NIOSH)16 has developed an exposure banding process that includes five bands, labeled A through E: The A band has the highest target concentration range (lowest health hazard potential) and the E band has the lowest target concentration range (highest health hazard potential). This exposure banding approach has been extensively used by pharmaceutical companies. One of its major benefits is that once additional information becomes available, one can easily refine the band without the laborious standard-setting process.
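A minimal sketch of band assignment follows. The concentration cutoffs mirror the general shape of the NIOSH particulate band ranges (in mg/m3) but should be treated as illustrative rather than authoritative:

```python
# Illustrative sketch of assigning an occupational exposure band (A-E)
# from a target airborne concentration. Cutoffs are assumptions modeled
# loosely on published band ranges, not official values.
def exposure_band(target_mg_m3: float) -> str:
    if target_mg_m3 > 10:
        return "A"   # highest target concentration, lowest hazard potential
    if target_mg_m3 > 1:
        return "B"
    if target_mg_m3 > 0.1:
        return "C"
    if target_mg_m3 > 0.01:
        return "D"
    return "E"       # lowest target concentration, highest hazard potential

print(exposure_band(5))      # B
print(exposure_band(0.005))  # E
```

The half-order-of-magnitude-wide bands make the uncertainty explicit: a chemical is placed in a range, not pinned to a single hard number.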
The practice of exposure minimization has been used for years by various radiation health and safety professionals and is sometimes referred to as ALARA17 (as low as reasonably achievable). In a similar vein, laboratory professionals often handle research chemicals with very little health hazard information, so they employ "prudent practices"18 aimed at minimizing chemical exposures through the use of engineering controls, such as the use of chemical fume hoods, and good work practices, relying on personal protective equipment when exposures cannot be eliminated or minimized by other means. Similar approaches have been suggested for the handling of nanomaterials, which could involve unknown hazards.19
Threshold of Toxicological Concern (TTC), the level below which one would not expect an effect, is another form of assigning chemicals a hazard estimate.20 Similar to banding, one can "bin" chemicals into classes based on their structural properties and assign an acceptable exposure limit, based on previously researched chemicals. One can also calculate a very conservative TTC by looking at the entire universe of chemicals, which results in a TTC that holds for all chemicals. TTCs are especially helpful when deciding whether to test a chemical: If the expected exposure is below the TTC, there would be no expected effect and hence no testing required.
Risk communication: How can uncertainty be expressed to different parties?
Risk communication may be the most overlooked aspect of managing risk. When risk is inadequately explained or oversimplified, it can erode public trust in science. Risk estimates are by definition uncertain with regard to actual risk, and hard-number limits often fail to communicate this.
For example, when the result of a single air sample taken in the workplace is 4.9 and the limit is 5, do the parties understand that there is a significant probability that another sample taken the next day, perhaps by an OSHA inspector,vii might be over 5 and result in a citation? Simply put, considering sampling and analytical error, 5 is essentially no different from 4.9 (or 5.1); indeed, depending on how the sample was collected and analyzed, even a result of 4 may not be meaningfully different from 5. Compounded uncertainties in other areas are generally much higher, often in the (half) order of magnitude range: e.g., 10 is no different from 3 or 30.21
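The 4.9-versus-5 argument can be checked with back-of-envelope arithmetic, assuming (illustratively) a 15 percent combined sampling-and-analytical coefficient of variation:

```python
# Back-of-envelope check on whether a single result of 4.9 ppm is
# distinguishable from a limit of 5 ppm. The 15% combined CV is an
# assumed, illustrative value within the commonly quoted 10-30% range.
result_ppm = 4.9
limit_ppm = 5.0
cv = 0.15  # assumed combined sampling-and-analytical CV

# Approximate a 95% interval as +/- 2 standard deviations around the result.
sd = result_ppm * cv
low, high = result_ppm - 2 * sd, result_ppm + 2 * sd
print(f"Plausible range for the true value: {low:.2f} to {high:.2f} ppm")
print(f"Limit falls inside that range: {limit_ppm <= high}")
```

With these assumptions the plausible range spans roughly 3.4 to 6.4 ppm, so a 4.9 reading cannot honestly be declared "below the limit" with any confidence.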
Risk perception is also a key tenet of risk communication. Human psychology can lead individuals to be more concerned about risks that are, in practice, considerably less threatening than an alternative.22 For example, people may be unwilling to fly but have no problem driving 65 miles per hour down the interstate. One of these is certainly a riskier activity than the other, yet the much lower probability event is what an individual may be more inclined to avoid. The dynamic that could drive this discrepancy is that some people are more attuned and sensitive to risks imposed by factors out of their control (i.e., involuntary risk). Chemical exposures are mostly involuntary; thus, they tend to attract a heightened risk perception.
How can uncertainty ultimately be accounted for? One method could be to conduct a detailed risk assessment. For example, if Z individuals out of a population of N will be affected by effect X at an exposure level of Y, one limit of uncertainty (e.g., what is the highest level of risk possible?) could be addressed. Risk management techniques then focus on that aspect: minimizing exposure, which thus reduces risk.viii However, this worst-case scenario planning on the high end of the risk spectrum could lead to overly restrictive guidelines that can be costly when amplified across a supply chain.
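One crude way to bound "the highest level of risk possible" from such Z-out-of-N data is the classical rule of three: with zero observed cases among N exposed individuals, an approximate 95 percent upper bound on the risk is 3/N. A sketch, with the normal-approximation branch included purely for illustration:

```python
# Sketch of an upper bound on risk from Z observed cases in a population of N.
# The zero-case branch uses the classical "rule of three"; the other branch is
# a crude normal approximation, shown for illustration, not as a formal method.
def approx_upper_risk_bound(cases: int, population: int) -> float:
    if cases == 0:
        return 3.0 / population  # rule-of-three ~95% upper bound
    p = cases / population
    return p + 2 * (p * (1 - p) / population) ** 0.5

print(approx_upper_risk_bound(0, 1000))  # 0.003: up to ~3 in 1,000, despite no cases
```

The striking part is the zero-case result: observing no effects at all still leaves a non-trivial upper bound on risk, which is precisely the uncertainty a hard number hides.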
Is there a silver bullet?
Of course, none of this discussion would be necessary if all we had were "green" chemicals, that is, chemicals with very limited health effects.23 Although progress has been made in minimizing hazard, for instance through the U.S. EPA's Safer Chemical Ingredients List24 (mostly used for consumer products), this remains unattainable for many industrial chemicals for now. Given that we do not have much data on many of these new, greener chemicals, industry may seek to avoid so-called "regrettable substitutions" before plunging ahead. For now, greener substitutes for traditional chemicals can offer an intermediate solution.
So, with a perfect solution not exactly on the horizon, how can uncertainty be effectively identified? Understanding the source of data and its uncertainties would be a great starting point. Increased use of data analytics to determine actual uncertainty would be very beneficial. The new norm of alternative approaches to deriving risk data in the absence of "Real Data"ix comes with its own inherent uncertainty, again on the order of magnitude scale. And when there still are no data, minimizing or eliminating the exposure may be preferable.
Communicating uncertainty in risk estimates may be a laborious process, sometimes requiring a deep dive into arcane reports or the expertise of safety and health professionals. Nevertheless, if the end result is a clearer communication of a final risk story to customers, end-users, and the public, it's worth it.
Authors
Hans Plugge is senior toxicologist and manager, safer chemical analytics, at Verisk 3E. Hans can be reached at hplugge@verisk3e.com.
Mark Grossman, CIH, CSP, is manager of occupational safety and health, ISO Engineering and Safety Service, at Verisk. Mark can be reached at Mark.Grossman@verisk.com.
i Uncertainty/safety factors are most often ballpark numbers, i.e., orders of magnitude; assessment factor is a better term and could be calculated via data analytics.
ii There are some exceptions. For example, ACGIH® acknowledges that their standards are not completely protective.
iii Let alone the discussion of significant digits – a standard of 5 units inherently acknowledges that you cannot tell the difference between 4.5 and 5.5 – when it gets to the hundreds and thousands this needs to be acknowledged using scientific/engineering notation: unfortunately not exactly the best vehicle for risk communication.
iv Even “perfectly” controlled experiments show a 30% variability in actual measurements, with biological variability adding another 70% (Pham et al 2020). So, our level of confidence in the uncertainty of these measurements is about plus or minus an order of magnitude. Selection of the appropriate effect level can be derived using various statistical methods, including MUST, which picks an “average” study using a preconstructed level of professional bias/expertise (Kostal et al 2020).
v An example would be an assessment factor of ten added in the absence of chronic long-term study data and another factor of 10 added for the absence of/extrapolation to human data, resulting in a compound assessment factor of 100 before adding even more assessment factors.
vi PNEC: Predicted No Effect Concentration; DNEL: Derived No Effect Level.
vii Called Compliance Safety and Health Officers (CSHOs) by OSHA.
viii Until recently, chemical substitution with greener alternatives was rarely implemented.
ix Very shortly, real data will only be available for traditional chemicals; all new data for new chemicals will be based on alternative approaches, mostly computational toxicology, which is benchmarked on traditional toxicology data.
1. American Conference of Governmental Industrial Hygienists. 2020 TLVs® and BEIs® Based on the Documentation of the Threshold Limit Values for Chemical Substances and Physical Agents & Biological Exposure Indices. Cincinnati, OH: ACGIH®, 2020, 3-4.
2. “TLV Chemical Substances Introduction,” American Conference of Governmental Industrial Hygienists, < https://www.acgih.org/tlv-bei-guidelines/tlv-chemical-substances-introduction >, accessed on November 5, 2020.
3. Ly Ly Pham et al., “Variability in in vivo studies: Defining the upper limit of performance for predictions of systemic effect levels,” Computational Toxicology, August 2020, < https://doi.org/10.1016/j.comtox.2020.100126 >, accessed on November 20, 2020.
4. Jakub Kostal, Hans Plugge and Will Raderman, “Qualifying Uncertainty in Ecotoxicological Risk Assessment: MUST, A Modular Uncertainty Scoring Tool,” Environmental Science and Technology, 2020, < https://doi.org/10.1021/acs.est.0c02224 >, accessed on November 5, 2020.
5. Hans Plugge, Nihar Das and Jakub Kostal, “Meta-Analysis of Acute Fish Toxicological Data,” Integrated Environmental Assessment and Management, 2020, in review.
6. “Reference dose: Description and Use in Health Risk Assessments,” United States Environmental Protection Agency,” < https://www.epa.gov/iris/reference-dose-rfd-description-and-use-health-risk-assessments >, accessed on November 5, 2020.
7. “EnviroTox Platform,” HESI, 2018, < https://envirotoxdatabase.org/assets/EnviroToxUserGuide_November2018.pdf >, accessed on November 5, 2020.
8. “Guidance on information requirements and chemical assessment; Chapter R.8: Characterisation of dose [concentration]-response for human health,” The European Chemicals Agency, November 2012, < https://echa.europa.eu/documents/10162/13632/information_requirements_r8_en.pdf >, accessed on November 5, 2020.
9. Escher, SE, Mangelsdorf, I, Hoffmann-Doer, S, Partosch, F, Karwath, A, Schroeder, K, Zapf, A, Batke, M., “Time extrapolation in regulatory risk assessment: The impact of study differences on the extrapolation factors,” Regulatory Toxicology and Pharmacology, 2020, 112:104584.
10. “Meta-Analysis of Acute Fish Toxicological Data”
11. “Understanding Risk and Hazard,” American Chemistry Council, < https://www.americanchemistry.com/Understanding-Risk-and-Hazard/ >, accessed on November 5, 2020.
12. “DRI Dietary Reference Intakes for Vitamin C, Vitamin E, Selenium and Carotenoids,” The National Academies of Sciences, 2000, < https://www.nap.edu/read/9810/chapter/1#xi >, accessed on November 5, 2020.
13. “Nanotechnologies–Part 2: Guide to Safe Handling and Disposal of Manufactured Nanomaterials,” British Standards Institute, 2007, < https://shop.bsigroup.com/Sandpit/OLD-forms/Nano/PD-6699-2/Confirmation/ >, accessed on November 5, 2020.
14. “Alternative Test Methods and Strategies to Reduce Vertebrate Animal Testing,” United States Environmental Protection Agency, < https://www.epa.gov/assessing-and-managing-chemicals-under-tsca/alternative-test-methods-and-strategies-reduce >, accessed on November 5, 2020.
15. “First annual conference on the state of the science on development and use of new approach methods for chemical safety testing,” United States Environmental Protection Agency, December 17, 2019, < https://www.epa.gov/chemical-research/first-annual-conference-state-science-development-and-use-new-approach-methods >, accessed on November 5, 2020.
16. “Technical Report: The NIOSH Occupational Exposure Banding Process for Chemical Risk Management,” National Institute for Occupational Safety and Health, July 2019, < https://www.cdc.gov/niosh/docs/2019-132/pdfs/2019-132.pdf?id=10.26616/NIOSHPUB2019132 >, accessed on November 5, 2020.
17. C. Jones, J. DeCicco, and S. Sherbini. "Answer to Question #8375 Submitted to 'Ask the Experts,'" Health Physics Society, September 15, 2009, < https://hps.org/publicinformation/ate/q8375.html >, accessed on November 5, 2020.
18. “Prudent Practices in the Laboratory. Handling and Management of Chemical Hazards,” National Research Council, 2011, < https://www.nap.edu/catalog/12654/prudent-practices-in-the-laboratory-handling-and-management-of-chemical >, accessed on November 5, 2020.
19. “Standard Guide for Handling Unbound Engineered Nanoscale Particles in Occupational Settings,” ASTM International, October 2018, < https://www.astm.org/Standards/E2535.htm >, accessed on November 5, 2020.
20. Hartung, T., “Thresholds of Toxicological Concern – Setting a threshold for testing below which there is little concern,” ALTEX, 34(3): 331-351.
21. “Variability in in vivo studies: Defining the upper limit of performance for predictions of systemic effect levels.”
22. Baruch Fischhoff, et al., “Risk Perception and Communication,” Annual Review of Public Health, 1993, < https://www.academia.edu/14960851/Risk_Perception_and_Communication >, accessed on November 6, 2020.
23. Paul Anastas, et al., “12 Principles of Green Chemistry,” American Chemical Society, https://www.acs.org/content/acs/en/greenchemistry/principles/12-principles-of-green-chemistry.html , accessed on November 6, 2020.
24. “Safer Chemical Ingredients List,” United States Environmental Protection Agency, < https://www.epa.gov/saferchoice/safer-ingredients >, accessed on November 5, 2020.