The US Department of Defense (DOD) is faced with an overwhelming task in evaluating chemicals that could potentially pose a threat to its deployed personnel. There are over 84,000 registered chemicals, and testing them with traditional toxicity-testing methods is not feasible in terms of time or money. In recent years, there has been a concerted effort to develop new approaches to toxicity testing that incorporate advances in systems biology, toxicogenomics, bioinformatics, and computational toxicology. Given the advances, DOD asked the National Research Council to determine how DOD could use modern approaches for predicting chemical toxicity in its efforts to prevent debilitating, acute exposures to deployed personnel. This report provides an overall conceptual approach that DOD could use to develop a predictive toxicology system. Application of Modern Toxicology Approaches for Predicting Acute Toxicity for Chemical Defense reviews the current state of computational and high-throughput approaches for predicting acute toxicity and suggests methods for integrating data and predictions. This report concludes with lessons learned from current high-throughput screening programs and suggests some initial steps for DOD investment.
In the 1970s, flame retardants began to be added to synthetic materials to meet strict flammability standards. Over the years, diverse flame retardants have been manufactured and used in various products. Some flame retardants have migrated out of the products, and this has led to widespread human exposure and environmental contamination. There also is mounting evidence that many flame retardants are associated with adverse human health effects. As a result, some flame retardants have been banned, restricted, or voluntarily phased out of production and use. This publication develops a scientifically based scoping plan to assess additive, nonpolymeric organohalogen flame retardants as a class for potential chronic health hazards under the Federal Hazardous Substances Act, including cancer, birth defects, and gene mutations.
To safeguard public health, the US Environmental Protection Agency (EPA) must keep abreast of new scientific information and emerging technologies so that it can apply them to regulatory decision-making. For decades the agency has dealt with questions about what animal-testing data to use to make predictions about human health hazards, how to perform dose-response extrapolations, how to identify and protect susceptible subpopulations, and how to address uncertainties. As alternatives to traditional toxicity testing have emerged, the agency has been faced with additional questions about how to incorporate data from such tests into its chemical assessments and whether such tests can replace some traditional testing methods. Endocrine active chemicals (EACs) have raised concerns that traditional toxicity-testing protocols might be inadequate to identify all potential hazards to human health because they have the ability to modulate normal hormone function, and small alterations in hormone concentrations, particularly during sensitive life stages, can have lasting and significant effects. To address concerns about potential human health effects from EACs at low doses, this report develops a strategy to evaluate the evidence for such low-dose effects.
Over the past several years, the US Environmental Protection Agency (EPA) has been transforming the procedures of its Integrated Risk Information System (IRIS), a program that produces hazard and dose-response assessments of environmental chemicals and derives toxicity values that can be used to estimate risks posed by exposures to them. The transformation was initiated after suggestions for program reforms were provided in a 2011 report from the National Academies of Sciences, Engineering, and Medicine that reviewed a draft IRIS assessment of formaldehyde. In 2014, the National Academies released a report that reviewed the IRIS program and evaluated the changes implemented in it since the 2011 report. Since 2014, new leadership of EPA's National Center for Environmental Assessment (NCEA) and IRIS program has instituted even more substantive changes in the IRIS program in response to the recommendations in the 2014 report. Progress Toward Transforming the Integrated Risk Information System Program: A 2018 Evaluation reviews the EPA's progress toward addressing the past recommendations from the National Academies.
Since 1938 and 1941, nutrient intake recommendations have been issued to the public in Canada and the United States, respectively. Currently defined as the Dietary Reference Intakes (DRIs), these values are a set of standards established by consensus committees under the National Academies of Sciences, Engineering, and Medicine and used for planning and assessing diets of apparently healthy individuals and groups. In 2015, a multidisciplinary working group sponsored by the Canadian and U.S. government DRI steering committees convened to identify key scientific challenges encountered in the use of chronic disease endpoints to establish DRI values. Their report, Options for Basing Dietary Reference Intakes (DRIs) on Chronic Disease: Report from a Joint US-/Canadian-Sponsored Working Group, outlined and proposed ways to address conceptual and methodological challenges related to the work of future DRI Committees. This report assesses the options presented in the previous report and determines guiding principles for including chronic disease endpoints for food substances that will be used by future National Academies committees in establishing DRIs.