

Title: To Test or Not to Test: Tools, Rules, and Corporate Data in US Chemicals Regulation
When the Toxic Substances Control Act (TSCA) was passed by the US Congress in 1976, its advocates pointed to a new generation of genotoxicity tests as a way to systematically screen chemicals for carcinogenicity. However, in the end, TSCA did not require any new testing of commercial chemicals, including these rapid laboratory screens. In addition, although the Environmental Protection Agency was to make public data about the health effects of industrial chemicals, companies routinely used the agency's obligation to protect confidential business information to prevent such disclosures. This paper traces the contested history of TSCA and its provisions for testing, from the circulation of the first draft bill in the Nixon administration through the debates over its implementation, which stretched into the Reagan administration. The paucity of publicly available health and environmental data concerning chemicals, I argue, was a by-product of the law and its execution, leading to a situation of institutionalized ignorance, the underside of regulatory knowledge.
Award ID(s):
1827951
NSF-PAR ID:
10397738
Author(s) / Creator(s):
Date Published:
Journal Name:
Science, Technology, & Human Values
Volume:
46
Issue:
5
ISSN:
0162-2439
Page Range / eLocation ID:
975 to 997
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Quaternary ammonium compounds (QACs), a large class of chemicals that includes high production volume substances, have been used for decades as antimicrobials, preservatives, and antistatic agents, and for other functions in cleaning, disinfecting, personal care products, and durable consumer goods. QAC use has accelerated in response to the COVID-19 pandemic and the banning of 19 antimicrobials from several personal care products by the US Food and Drug Administration in 2016. Studies conducted before and after the onset of the pandemic indicate increased human exposure to QACs. Environmental releases of these chemicals have also increased. Emerging information on adverse environmental and human health impacts of QACs is motivating a reconsideration of the risks and benefits across the life cycle of their production, use, and disposal. This paper presents a critical review of the literature and a scientific perspective developed by a multidisciplinary, multi-institutional team of authors from academic, governmental, and nonprofit organizations. The review evaluates currently available information on the ecological and human health profile of QACs and identifies multiple areas of potential concern. Adverse ecological effects include acute and chronic toxicity to susceptible aquatic organisms, with concentrations of some QACs approaching levels of concern. Suspected or known adverse health outcomes include dermal and respiratory effects, developmental and reproductive toxicity, disruption of metabolic function such as lipid homeostasis, and impairment of mitochondrial function. QACs' role in antimicrobial resistance has also been demonstrated. In the US regulatory system, how a QAC is managed depends on how it is used, for example, in pesticides or personal care products. This can result in the same QAC receiving different degrees of scrutiny depending on its use and the agency regulating it. Further, the EPA's current method of grouping QACs based on structure, first proposed in 1988, is insufficient to address the wide range of QAC chemistries, potential toxicities, and exposure scenarios. Consequently, exposures to common mixtures of QACs and from multiple sources remain largely unassessed. Some restrictions on the use of QACs have been implemented in the US and elsewhere, primarily focused on personal care products. Assessing the risks posed by QACs is hampered by their vast structural diversity and a lack of quantitative data on exposure and toxicity for the majority of these compounds. This review identifies important data gaps and provides research and policy recommendations for preserving the utility of QAC chemistries while also seeking to limit adverse environmental and human health effects.
  2. Abstract

    When the observed data are contaminated with errors, standard two-sample testing approaches that ignore measurement error can produce misleading results, including a type-I error rate higher than the nominal level. To address this inconsistency, a nonparametric test is proposed for testing equality of two distributions when the observed contaminated data follow the classical additive measurement error model. The proposed test accounts for the presence of errors in the observed data, and the test statistic is defined in terms of the (deconvoluted) characteristic functions of the latent variables. The method is applicable to a wide range of scenarios, as no parametric restrictions are imposed on either the distribution of the underlying latent variables or the distribution of the measurement errors. The asymptotic null distribution of the test statistic is derived and is given by an integral of a squared Gaussian process with a complicated covariance structure. For data-based calibration of the test, a new nonparametric bootstrap method is developed under the two-sample measurement error framework and its validity is established. The finite sample performance of the proposed test is investigated through simulation studies, which show that the proposed method outperforms standard tests that exhibit inconsistent behavior. Finally, the proposed method was applied to real data sets from the National Health and Nutrition Examination Survey. An R package, MEtest, is available through CRAN.
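
    To make the construction concrete, here is a minimal sketch of the core idea, not the paper's exact statistic or its bootstrap: it assumes Laplace measurement errors with a known scale, uses an arbitrary Gaussian weight to keep the integral finite, and calibrates with a plain permutation scheme rather than the specialized bootstrap the paper develops. All function names and parameters are illustrative.

```python
# Sketch of a deconvolution-based two-sample statistic (illustrative only).
# Assumptions not taken from the paper: Laplace(0, b) errors with known
# scale b, a Gaussian weight over the grid, and permutation calibration.
import numpy as np

def ecf(sample, t):
    """Empirical characteristic function of `sample` evaluated on grid `t`."""
    return np.exp(1j * np.outer(t, sample)).mean(axis=1)

def laplace_cf(t, b):
    """Characteristic function of a Laplace(0, b) measurement error."""
    return 1.0 / (1.0 + (b * t) ** 2)

def deconv_stat(x_obs, y_obs, b, t=np.linspace(-3, 3, 201)):
    """Weighted integrated squared distance between deconvoluted ECFs."""
    diff = (ecf(x_obs, t) - ecf(y_obs, t)) / laplace_cf(t, b)
    weight = np.exp(-t ** 2)  # damps the unstable tails of the deconvolution
    dt = t[1] - t[0]
    return np.sum(np.abs(diff) ** 2 * weight) * dt

def perm_pvalue(x_obs, y_obs, b, n_perm=499, seed=0):
    """Permutation p-value; a simple stand-in for the paper's bootstrap."""
    rng = np.random.default_rng(seed)
    obs = deconv_stat(x_obs, y_obs, b)
    pooled, n = np.concatenate([x_obs, y_obs]), len(x_obs)
    exceed = 0
    for _ in range(n_perm):
        p = rng.permutation(pooled)
        exceed += deconv_stat(p[:n], p[n:], b) >= obs
    return (1 + exceed) / (1 + n_perm)

# Toy check: identical latent distributions contaminated with Laplace noise.
rng = np.random.default_rng(1)
b = 0.5
x = rng.normal(0, 1, 200) + rng.laplace(0, b, 200)
y = rng.normal(0, 1, 200) + rng.laplace(0, b, 200)
print(perm_pvalue(x, y, b))  # should typically exceed 0.05 under the null
```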

     
  3.
    In many applications of zero-inflated models, score tests are often used to evaluate whether the population heterogeneity implied by these models is consistent with the data. The most frequently cited justification for using score tests is that they require estimation only under the null hypothesis. Because this estimation involves specifying a plausible model consistent with the null hypothesis, the testing procedure can lead to unreliable inferences under model misspecification. In this paper, we propose a score test of homogeneity for zero-inflated models that is robust against certain model misspecifications. Because the true model is unknown in practical settings, our proposal is developed under a general framework of mixture models in which a layer of randomness is imposed on the model to account for uncertainty in the model specification. We exemplify this approach on the class of zero-inflated Poisson models, where a random term is imposed on the Poisson mean to adjust for relevant covariates missing from the mean model or a misspecified functional form. For this example, we show through simulations that the resulting score test of zero inflation maintains its empirical size at all levels, albeit with some loss of power for the well-specified non-random mean model under the null. Frequencies of health promotion activities among young Girl Scouts and dental caries indices among inner-city children are used to illustrate the robustness of the proposed testing procedure.
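
    For context, the classical (non-robust) score test of zero inflation that this line of work builds on, commonly attributed to van den Broek (1995), is short enough to state in code. The sketch below is that baseline test, not the paper's randomized-mean robust version; the example data are hypothetical.

```python
# Classical score test for zero inflation in a Poisson model
# (van den Broek, 1995); the paper's robust test extends this baseline.
import numpy as np
from scipy.stats import chi2

def zip_score_test(y):
    """Score test of H0: no zero inflation. Returns (statistic, p-value)."""
    y = np.asarray(y)
    n, lam = len(y), y.mean()        # MLE of the Poisson mean under H0
    p0 = np.exp(-lam)                # model-implied probability of a zero
    n0 = np.sum(y == 0)              # observed number of zeros
    score = n0 / p0 - n              # score for the inflation parameter
    var = n * ((1 - p0) / p0 - lam)  # variance, adjusted for estimating lam
    stat = score ** 2 / var
    return stat, chi2.sf(stat, df=1)

# Hypothetical data for illustration: Poisson counts vs. 20% inflated zeros.
rng = np.random.default_rng(0)
y_pois = rng.poisson(2.0, 500)
y_zip = np.where(rng.random(500) < 0.2, 0, rng.poisson(2.0, 500))
print(zip_score_test(y_pois))  # large p-value expected: no inflation
print(zip_score_test(y_zip))   # small p-value expected: inflated zeros
```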
    more » « less
  4. Abstract

    Tuberculosis (TB) is a life-threatening infectious disease. The standard treatment is up to 90% effective; however, it requires the administration of four antibiotics (isoniazid, rifampicin, pyrazinamide, and ethambutol [HRZE]) over a long period. This harsh treatment process causes adherence problems for patients because of the long treatment times and a myriad of adverse effects. The World Health Organization has therefore made shortening standard TB treatment regimens a focus of its End TB Strategy, which aims to reduce TB-related deaths by 95% by 2035. To this end, many promising, recently discovered combination antibiotics are being explored, such as the bedaquiline, pretomanid, and linezolid (BPaL) regimen. As a result, the number of possible combinations of novel regimens to test is beyond the limit of experimental resources. In this study, we present a framework that uses a primate granuloma modeling approach to screen many combination regimens currently under clinical and experimental exploration and assesses their efficacies to inform future studies. We tested well-studied regimens such as HRZE and BPaL to evaluate the validity and accuracy of our framework. We also simulated additional promising combination regimens that have not been sufficiently studied clinically or experimentally, and we provide a pipeline for ranking regimens based on their efficacies in granulomas. Furthermore, we showed a correlation between the simulation rankings and rankings from new marmoset data, providing evidence for the credibility of our framework. This framework can be adapted to any TB regimen and can rank any number of single or combination regimens.
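
    The reported agreement between simulated and experimental rankings is the kind of comparison a rank correlation quantifies. Below is a minimal sketch with hypothetical regimen names and made-up efficacy scores, purely to illustrate the comparison, not the paper's data or model.

```python
# Rank-correlation comparison of simulated vs. experimental regimen
# efficacies. All names and numbers below are hypothetical placeholders.
from scipy.stats import spearmanr

regimens = ["HRZE", "BPaL", "regimen_C", "regimen_D"]  # last two invented
sim_eff = [0.62, 0.81, 0.74, 0.55]       # simulated granuloma efficacy
marmoset_eff = [0.58, 0.85, 0.70, 0.50]  # experimental counterpart

rho, p = spearmanr(sim_eff, marmoset_eff)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

# Rank regimens by simulated efficacy, best first.
for name, eff in sorted(zip(regimens, sim_eff), key=lambda r: -r[1]):
    print(name, eff)
```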

     
  5.
    The launch of the National Oceanic and Atmospheric Administration (NOAA)/National Aeronautics and Space Administration (NASA) Suomi National Polar-orbiting Partnership (S-NPP) satellite and its follow-on NOAA Joint Polar Satellite System (JPSS) satellites marks the beginning of a new era of operational satellite observations of the Earth and atmosphere for environmental applications, with high spatial resolution and sampling rate. The S-NPP and JPSS satellites carry five instruments, each with an advanced Earth-sampling design: the Advanced Technology Microwave Sounder (ATMS), the Cross-track Infrared Sounder (CrIS), the Ozone Mapping and Profiler Suite (OMPS), the Visible Infrared Imaging Radiometer Suite (VIIRS), and the Clouds and the Earth's Radiant Energy System (CERES). Among them, ATMS is a new-generation microwave sounder that measures temperature profiles from the surface to the upper stratosphere and moisture profiles from the surface to the upper troposphere, while CrIS is the first of a series of advanced operational hyperspectral sounders providing more accurate atmospheric temperature and moisture soundings with higher vertical resolution for weather and climate applications. The OMPS instrument measures solar backscattered ultraviolet radiation to provide information on the concentration of ozone in the Earth's atmosphere, and VIIRS provides global visible and infrared imagery of a variety of essential environmental variables over the land, atmosphere, cryosphere, and ocean. The CERES instrument measures the solar energy reflected by the Earth, the Earth's longwave radiative emission, and the role of cloud processes in the Earth's energy balance. Presently, observations from several instruments on S-NPP and JPSS-1 (renamed NOAA-20 after launch) provide near-real-time monitoring of environmental changes and improve weather forecasting through assimilation into numerical weather prediction models. Envisioning the need for consistency in satellite retrievals, improved climate reanalyses, the development of climate data records, and improved numerical weather forecasting, the NOAA Center for Satellite Applications and Research (STAR) has been reprocessing the S-NPP observations from ATMS, CrIS, OMPS, and VIIRS over their life cycles. This article summarizes the instruments' observing principles, data characteristics, reprocessing approaches, calibration algorithms, and validation results for the reprocessed sensor data records. The reprocessing generated consistent Level-1 sensor data records using unified and consistent calibration algorithms for each instrument, removing artificial jumps in the data caused by operational changes, instrument anomalies, contamination by anomalous views of the environment or spacecraft, and other factors. The reprocessed sensor data records were compared with and validated against other observations for consistency checks whenever such data were available. The reprocessed data will be archived in the NOAA data center in the same format as the operational data, with technical support for data requests. This reprocessing is expected to improve the efficiency of use of S-NPP and JPSS satellite data and the accuracy of the observed essential environmental variables, through either consistent satellite retrievals or the use of the reprocessed data in numerical data assimilation.