42nd Winter Meeting Program Overview

The scientific program of The Toxicology Forum’s 42nd Annual Winter Meeting features diverse sessions that bring together differing viewpoints from leading government regulators, industry scientists, and academic researchers for dynamic presentations and discussions of emerging topics in toxicology. The Forum’s session format allows for sustained dialogue between speakers and audience members, with opportunities for exchange not found at other meetings.

Meeting Sessions:

After 50+ Years, Is There a More Effective and Efficient Alternative to the Chronic Rodent Cancer Bioassay?

Moderated by Suzanne Fitzpatrick, US FDA-CFSAN and Vicki Dellarco, US EPA (Retired)

The chronic rodent cancer bioassay has long been the subject of controversy. Numerous workshops and articles have identified issues surrounding the bioassay, and many have questioned the predictive value, utility, and even the necessity of this assay when more appropriate means of obtaining information to determine cancer potential in humans and inform risk management decisions are available today. While opinions on the bioassay remain strongly held, the 21st-century trend in toxicology is to promote hypothesis-driven, tier-based testing that draws on a broad array of information from several sources, including predictions from computational and molecular methods. This session will bring key thought leaders together to find common ground on scientifically credible alternatives for assessing potential human cancer risk from chemical exposures.

Central Issues to be Addressed:

  • Present key positions on alternative approaches that can efficiently and effectively identify substances that have the potential to be carcinogenic to humans, and that can deliver a more predictive assessment of the risk to exposed humans than the traditional rodent cancer bioassay.
  • Exchange views and identify common ground on scientifically credible alternatives to the rodent cancer bioassay to assess human cancer risk from chemical exposures using available methods and technologies.

Key Characteristics of Carcinogens: Refining Approaches to Systematic Evaluations of Mechanistic Data

Moderated by Brian Hughes, NSF International and Susan Borghoff, ToxStrategies, Inc.

IARC, NTP, and US EPA have recognized the need for consistency in providing mechanistic data that are comprehensive and clear, particularly when large and complex datasets are to be used in regulatory decision-making. The recently published manuscript “Key Characteristics as a Basis for Organizing Data on Mechanism of Carcinogenesis” (Smith et al. 2016) provides a framework for systematically organizing mechanistic data for regulatory agency review. This framework presents both challenges and opportunities for proposing, evaluating, and validating mechanistic studies. This workshop proposes enhancements to the framework through sub-frameworks that include:

  • Reviewing and ranking mechanistic studies for reliability (illustrated in the sketch following this list)
  • Incorporating alternative methods for assessing data gaps, such as read-across, high-throughput testing, novel non-animal tests, and in silico approaches to address the key characteristics
  • Providing a case study for integrating data
  • Delivering peer review, consensus building, and communication within the scientific community
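
To make the first sub-framework concrete, the sketch below shows one toy way a reliability-weighted roll-up of mechanistic evidence by key characteristic might be computed. The key characteristic labels echo Smith et al. (2016), but the study records, reliability ranks, and weights are invented for illustration and are not drawn from any published scheme.

```python
# Toy reliability-weighted roll-up of mechanistic studies by key
# characteristic (KC). The KC labels echo Smith et al. (2016); the study
# records, reliability ranks, and weights are hypothetical placeholders
# for whatever metric a sub-framework ultimately adopts.
from collections import defaultdict

# Hypothetical records: (key characteristic, reliability rank 1=high..3=low,
# direction: +1 supports the characteristic, -1 argues against it)
studies = [
    ("is genotoxic",             1, +1),
    ("is genotoxic",             2, +1),
    ("is genotoxic",             3, -1),
    ("induces oxidative stress", 2, +1),
    ("induces oxidative stress", 2, -1),
]

# Assumed mapping of reliability rank to weight (more reliable -> more weight).
RELIABILITY_WEIGHT = {1: 1.0, 2: 0.5, 3: 0.25}

def score_by_characteristic(records):
    """Sum direction * reliability weight for each key characteristic."""
    totals = defaultdict(float)
    for kc, reliability, direction in records:
        totals[kc] += direction * RELIABILITY_WEIGHT[reliability]
    return dict(totals)

for kc, score in sorted(score_by_characteristic(studies).items()):
    print(f"{kc}: {score:+.2f}")
# "is genotoxic: +1.25" indicates net reliable support; "induces oxidative
# stress: +0.00" flags conflicting evidence of equal weight.
```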


Central Issues to be Addressed:

  • Promote strength and reliability metrics within the systematic framework for evaluating key mechanistic data
  • Illustrate research needs for the framework based on case studies of chemicals with diverse data sets, including a proof-of-concept case study for integrating computational data and high throughput testing into the framework
  • Illustrate an electronic process of engaging the scientific community in peer review and consensus building for the mechanisms proposed

 

Sufficient Similarity of Complex Substances and Mixtures: From Case Studies to Application

Moderated by Cynthia Rider, NIEHS-NTP

Complex mixtures such as diesel exhaust or disinfection by-products in drinking water are encountered in the real world. In addition, many industrial and consumer products are themselves complex substances, referred to as UVCBs (substances of unknown or variable composition, complex reaction products, or biological materials), whose health risks are difficult to evaluate. Although risk assessors strongly prefer toxicity data on the whole mixture/substance of interest, or on a related mixture/substance for read-across, such data are seldom available or are difficult to compare because of unknown or variable composition. Instead, single-chemical data and component-based approaches that rely on simple models of additivity and multiple assumptions are used to estimate potential health effects. A critical obstacle to generating and using toxicity data on mixtures or UVCBs has been the lack of accepted methods for relating findings from a tested substance to other related substances.

US EPA (2000) proposed the general concept of sufficient similarity for evaluating complex mixtures when toxicity data are not available for the mixture of interest but are available for a related mixture. Two mixtures are sufficiently similar when the toxicological consequences of exposure to the two are indistinguishable, despite differences in their composition and component proportions. For example, are toxicity data generated from Texas crude oil relevant to weathered Saudi Arabian oil? Judgments of sufficient similarity can be extremely challenging for complex mixtures, and while there have been limited efforts to develop methods for determining sufficient similarity of complex mixtures and substances, these methods have seen neither widespread application nor acceptance.

In this session, case studies comparing across complex mixtures to determine sufficient similarity will be discussed, covering three types of complex mixtures/substances: drinking water disinfection by-products, botanical dietary supplements, and petroleum substances. Speakers will detail the chemical and bioactivity profiling approaches used to assess sufficient similarity and the key challenges in generating and using these data. Following the case study presentations, a panel discussion will focus on comparing methods and determining the steps required to apply them more broadly.
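
For context on the component-based approaches mentioned above, the snippet below sketches the simplest dose-additivity screen, a hazard index that sums exposure-to-reference-dose ratios across components. The component names are common drinking water disinfection by-products, but the exposure and reference-dose values are illustrative assumptions, not assessment values.

```python
# Minimal sketch of a component-based dose-additivity screen (hazard index).
# Exposure and reference dose (RfD) values below are invented for
# illustration; both are in the same units (mg/kg-day).
components = {
    # name: (estimated exposure, reference dose)
    "chloroform":           (0.002,  0.01),
    "bromodichloromethane": (0.001,  0.02),
    "dibromochloromethane": (0.0005, 0.02),
}

# Under dose additivity, HI = sum(exposure_i / RfD_i); an HI above 1
# flags the mixture for closer evaluation.
hazard_index = sum(exposure / rfd for exposure, rfd in components.values())
print(f"Hazard index: {hazard_index:.3f}")  # 0.200 + 0.050 + 0.025 = 0.275
```

Sufficient-similarity judgments ask a different question: whether such component-level shortcuts can be bypassed by reading toxicity data across from a related, already-tested mixture.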

Central Issue to be Addressed:

Currently, there are no accepted methods or criteria for determining sufficient similarity of complex mixtures. This session will discuss several candidate approaches and the challenges in implementing them.

Putting Models to Work: When Can We Actually Use High-Throughput Exposure Estimates?

Moderated by Caroline Ring, ToxStrategies, Inc. and John Wambaugh, US EPA-NCCT

Recent interest has focused on high-throughput (HT) exposure assessment for screening and prioritizing potential risk across large numbers of chemicals. For example, the Frank R. Lautenberg Chemical Safety for the 21st Century Act requires the US EPA to establish a risk-based process to determine which chemicals it will prioritize for more detailed assessment. HT exposure models have been developed to support exposure estimation for large numbers of chemicals, through various pathways, both on a per-chemical basis and in aggregate. Yet use of these models requires care: the domain of applicability must be considered, along with the (sometimes large) uncertainty in HT exposure model estimates. HT exposure estimates are often characterized as “screening-level estimates” that may not represent actual exposures. Given these caveats, how can HT exposure models be used to inform real-world decision-making? This session will present timely information on recent advances in the development and application of HT exposure models, with a focus on their applicability to chemical assessments.
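
As a rough illustration of how screening-level HT exposure estimates might feed a risk-based prioritization, the sketch below ranks chemicals by the margin between a bioactivity-derived threshold and an upper-percentile exposure estimate, using the upper bound to respect the (sometimes large) model uncertainty. All chemical names and values are invented, and the ranking rule is an assumption for illustration only.

```python
# Hypothetical risk-based screening with high-throughput exposure estimates.
# Each record: (name, median exposure, 95th-percentile exposure,
# bioactivity-derived threshold), all in mg/kg-day. Values are invented.
chemicals = [
    ("chem_A", 1e-6, 1e-4, 1e-2),
    ("chem_B", 1e-5, 1e-3, 1e-3),
    ("chem_C", 1e-7, 1e-6, 1e-1),
]

def screening_margin(upper_exposure, threshold):
    """Ratio of threshold to upper-bound exposure; smaller -> higher priority."""
    return threshold / upper_exposure

# Rank by margin, ascending, so the tightest margins surface first.
for name, median, upper, threshold in sorted(
        chemicals, key=lambda c: screening_margin(c[2], c[3])):
    print(f"{name}: margin = {screening_margin(upper, threshold):,.0f}")
# chem_B (margin 1) would be flagged first for more detailed assessment,
# even though its median exposure estimate is unremarkable.
```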

Central Issue to be Addressed:

Use of high-throughput (HT) exposure models requires careful consideration of issues including domain of applicability and uncertainty in model estimates. Given these caveats, how are HT exposure models applied in real-world decision-making?


Evaluation of Therapeutics for the Treatment of Severely Debilitating or Life-Threatening Diseases

Moderated by Judith Prescott, Merck & Co., Inc.

The US Food and Drug Administration (FDA) defines “life-threatening” conditions as “(1) Diseases or conditions where the likelihood of death is high unless the course of the disease is interrupted; and (2) Diseases or conditions with potentially fatal outcomes, where the end point of clinical trial analysis is survival”. The US FDA additionally defines “severely debilitating” as “diseases or conditions that cause major irreversible morbidity”. Because severely debilitating or life-threatening (SDLT) diseases are conditions in which life expectancy is short or quality of life greatly diminished, the medical context is comparable to that of advanced cancer, and a similar therapeutic development paradigm should apply (Prescott et al. 2017). The primary benefit of a well-constructed, modified early and late development approach would be to allow patients earlier and continued access to new, potentially effective therapies for SDLT diseases while maintaining standards for safety and effectiveness, and to increase the speed of progression through development. In addition, enabling facile development of SDLT disease therapeutics should avoid unnecessary use of animals and other drug development resources and lessen the direct economic burden and indirect societal costs associated with late-stage and end-of-life conditions.

While there are regulatory guidelines for advanced cancer therapies and for therapies for rare diseases (which include some, but importantly not all, SDLT conditions), as well as regulatory programs primarily employed to expedite late development programs for serious conditions, there is minimal, and no global, guidance to facilitate early development and availability of non-oncology SDLT disease therapeutics for patients with limited therapeutic options. For SDLT diseases that occur in small populations, it is often necessary to employ global development plans to achieve timely clinical evaluation given recruitment limitations. Thus, development of international regulatory guidance to ensure consistent approaches across health authorities and international boundaries is essential. Without such guidance, investment in development of therapeutics for SDLT diseases will be sporadic, limited, and insufficient to address the unmet medical need.

This session will address whether SDLT diseases warrant a drug development approach similar to that provided for advanced cancer, and whether new international regulatory guidance is needed to expedite patient access to SDLT therapeutics.


Use of Alternative Embryo-Fetal Development Assays for Potential Regulatory Acceptance

Moderated by Kerry Blanchard, Merck & Co., Inc.

Tremendous advances have been made in the development of in vitro, ex vivo, and non-mammalian in vivo assays (alternative assays), which are now routinely used in drug discovery screens to detect potential embryo-fetal developmental toxicity. This session is important and timely because the ICH S5(R3) – Step 1 guideline (Detection of Toxicity to Reproduction for Human Pharmaceuticals) will be published in mid-2017 for public review. The session will therefore provide a forum for understanding the current state of these alternative assays, for discussing Regulatory Agencies’ perspectives on assay performance criteria for their possible regulatory acceptance as outlined in ICH S5(R3) – Step 1, and for gathering input for future revision of the draft guidance. The approach proposed for acceptance of new assays for regulatory use could become a model for more streamlined acceptance of alternative assays.

The recently released ICH S5(R3) – Step 1 guideline (Detection of Toxicity to Reproduction for Human Pharmaceuticals) would allow in vitro, ex vivo, and non-mammalian in vivo embryo-fetal development alternative assays to replace or eliminate in vivo studies in certain circumstances. This session will review the Pharmaceutical Industry’s current experience with these assays and provide Regulatory Agencies’ perspective on the proposed approach of developing performance criteria for regulatory acceptance of the assays as outlined in the ICH S5(R3) – Step 1 guideline.