The new field of toxicogenomics presents a potentially powerful set of tools to better understand the health effects of exposures to toxicants in the environment. At the request of the National Institute of Environmental Health Sciences, the National Research Council assembled a committee to identify the benefits of toxicogenomics, the challenges to achieving them, and potential approaches to overcoming such challenges. The report concludes that realizing the potential of toxicogenomics to improve public health decisions will require a concerted effort to generate data, make use of existing data, and study data in new ways; such an effort will demand funding, interagency coordination, and data management strategies.
Some of what we know about the health effects of exposure to chemicals from food, drugs, and the environment comes from studies of occupational, inadvertent, or accident-related exposures. When human data are insufficient, scientists rely on animal data to assess risk from chemical exposure and to make health and safety decisions. However, humans and animals can respond differently to chemicals, both in the types of adverse effects experienced and in the doses at which they occur. Scientists in the field of toxicogenomics are using new technologies to study the effects of chemicals. For example, in response to a particular chemical exposure, they can study gene expression ("transcriptomics"), proteins ("proteomics"), and metabolites ("metabolomics"), and they can also examine how individual and species differences in the underlying DNA sequence itself can result in different responses to the environment. Based on a workshop held in August 2004, this report explores how toxicogenomics could enhance scientists' ability to make connections between data from experimental animal studies and human health.
In 2007, the National Research Council envisioned a new paradigm in which biologically important perturbations in key toxicity pathways would be evaluated with new methods in molecular biology, bioinformatics, computational toxicology, and a comprehensive array of in vitro tests based primarily on human biology. Although some considered the vision too optimistic with respect to the promise of the new science, no one can deny that a revolution in toxicity testing is under way. New approaches are being developed, and data are being generated. As a result, the U.S. Environmental Protection Agency (EPA) expects a large influx of data that will need to be evaluated. EPA also faces tens of thousands of chemicals on which toxicity information is incomplete, as well as emerging chemicals and substances that will need risk assessment and possible regulation. Therefore, the agency asked the National Research Council to convene a symposium to stimulate discussion on the application of the new approaches and data in risk assessment. The symposium was held on May 11-13, 2009, in Washington, DC, and included presentations and discussion sessions on pathway-based approaches for hazard identification, applications of new approaches to mode-of-action analyses, the challenges to and opportunities for risk assessment in the changing paradigm, and future directions.
The US Department of Defense (DOD) faces an overwhelming task in evaluating chemicals that could potentially pose a threat to its deployed personnel. There are over 84,000 registered chemicals, and testing them with traditional toxicity-testing methods is not feasible in terms of time or money. In recent years, there has been a concerted effort to develop new approaches to toxicity testing that incorporate advances in systems biology, toxicogenomics, bioinformatics, and computational toxicology. Given these advances, DOD asked the National Research Council to determine how it could use modern approaches for predicting chemical toxicity in its efforts to protect deployed personnel from debilitating acute exposures. This report provides an overall conceptual approach that DOD could use to develop a predictive toxicology system. Application of Modern Toxicology Approaches for Predicting Acute Toxicity for Chemical Defense reviews the current state of computational and high-throughput approaches for predicting acute toxicity and suggests methods for integrating data and predictions. The report concludes with lessons learned from current high-throughput screening programs and suggests some initial steps for DOD investment.
Toxicogenomics, the study of how genomes respond to exposure to toxicants, may ultimately hold the promise of detecting changes in the expression of a person's genes if he or she is exposed to these toxicants. As the technology rapidly develops, it is critical that scientists and the public communicate about the promises and limitations of this new field. Communicating technical information to the public about a developing science can be challenging, particularly when the applications of that science are not yet well understood. Communicating Toxicogenomics Information to Nonexperts is the summary of a workshop convened to consider strategies for communicating toxicogenomic information to the public and other nonexpert audiences. The workshop specifically addressed the communication of key social, ethical, and legal issues related to toxicogenomics and how information related to the social implications of toxicogenomics might be perceived by nonexperts.
Contents: Introduction: the Government Performance and Results Act, the Program Assessment Rating Tool, and the Environmental Protection Agency; Efficiency metrics used by the Environmental Protection Agency and other federal research and development programs; Are the efficiency metrics used by federal research and development programs sufficient and outcome-based?; A model for evaluating research and development programs; Findings, principles, and recommendations; Appendices; Boxes, figures, and tables.