Sunday, September 18, 2022, 5:00 PM – 7:00 PM
Environmental Health Science, Maastricht University, NL
Around the millennium, the advent of the so-called ‘omics technologies was widely expected among toxicologists to cause a “paradigm shift” towards a more data-driven, hypothesis-generating approach: by studying global gene–gene interactions induced by adverse chemical exposures, it would deepen our understanding of toxic mechanisms indicative of disease risks and create more reliable prediction models for human safety. Two lines of toxicogenomics research have been explored over the years: one investigating ‘omics patterns in blood cells in association with markers of toxic exposures in exposed human populations, the other considering toxicant class-specific ‘omics profiles, in particular in cell models treated in vitro, as potential novel test systems for evaluating the safety of new and existing chemical entities. Initially, ‘omics technologies focused on whole-genome gene expression, while in more recent years the proteomic, metabolomic and epigenetic effects of toxic exposures have also been assessed. This opened the avenue to so-called “multi-omics” or “cross-omics” studies, posing tremendous challenges for complex data analysis. Whereas ‘omics technologies per se have been more or less standardized, this is certainly not the case for the complex bioinformatics procedures needed to extract relevant, tangible knowledge on toxic mechanisms of action, implying that multiple data analysis approaches and methods are still being considered, even today.
In the past two decades, we have extensively explored both research lines, frequently within the context of international (EU) projects. Examples of the results obtained will be presented and critically discussed as to whether the desired “paradigm shift” has actually been achieved and whether the output from toxicogenomics studies has made an impact on chemical safety evaluations and environmental health policies.
EUROTOX Award Lecture
Monday, September 19, 2022, 8:30 AM – 9:30 AM
Institute of Experimental Pharmacology & Toxicology, Slovak Academy of Science, SK
The technologies underlying reconstructed human three-dimensional tissue models (3D RHTM) have been known for more than 40 years. The first tissue models served mainly research interests, being used to study cell–cell interactions, tissue morphology and physiology, and later also clinical purposes such as the treatment of burn patients (reconstructed epidermal sheets). However, as tissue reconstruction technology improved, 3D RHTM models started to be used for broader purposes. These included safety and efficacy testing of cosmetic ingredients and products, as well as hazard assessment of chemicals and pesticides. The 3D RHTM models also found their use in the pre-clinical testing of drug candidates. Following the standardisation of tissue model production by commercial developers, and thanks to extensive international validation studies, reconstructed human skin and cornea-like models are nowadays used globally by industrial and academic research laboratories to assess the local effects of topically applied chemicals and formulations in vitro.
Four OECD Test Guidelines (TGs) refer explicitly to the use of 3D RHTMs: OECD TG 431 for skin corrosion testing, OECD TG 439 for skin irritation testing, OECD TG 492 for eye irritation testing and OECD TG 498 for assessment of phototoxicity. In addition, the use of 3D RHTM has been implemented in the biocompatibility testing of medical devices (ISO 10993-23) and in the preclinical testing of drugs (ICH S10). Validation studies with skin models have been completed for genotoxicity testing using the Comet and Micronucleus assays. 3D RHTMs have found use in both regulatory and non-regulatory testing areas, including:
- safety and efficacy assessments of raw cosmetic/pharmaceutical materials and formulations,
- hazard and risk assessment of chemicals or formulations that may accidentally contact human eyes or skin (regulated, e.g., by REACH and other chemical legislation), helping to address occupational safety,
- mechanistic information that can be utilised e.g. to determine whether a molecule or compound can be altered to reduce toxicity without loss of efficacy,
- assessment of the prevention or enhancement of the penetration of a substance via target tissue(s).
In medicine, pharmacology-oriented research and drug development, normal (i.e. obtained from healthy donors), as well as disease tissue models, are considered beneficial for the modelling and understanding of physiological and pathological conditions and even as tools for personalised therapies.
The disruptive technologies of the twenty-first century, such as automation and bioprinting, together with advances in materials engineering, will in the near future enable large-scale industrial production of at least partly immunocompetent models for use in microfluidic chips. This step will allow for further regulatory acceptance of in vitro methods for systemic toxicity testing.
Acknowledgement of funding: APVV-19-0591, VEGA 2/0153/20, DS-FR-19-0048
EUROTOX – SOT Debate
Monday, September 19, 2022, 2:00 PM – 3:00 PM
Each year the SOT and EUROTOX Annual Meetings include a debate that continues a tradition that originated in the early 1990s in which leading toxicologists advocate opposing sides of an issue of significant toxicological importance.
This year, our debaters will address the proposition: “Is there a role for Artificial Intelligence (AI) and Machine Learning (ML) in risk decisions?”.
The debaters will discuss the principles and limitations of these tools for decision-making. Specific questions to be addressed include:
- Can recent developments in AI and ML be applied in toxicology as they are being applied in precision medicine and other fields? Are current algorithms inherent to AI and ML sufficiently refined for the design of safer products?
- Are these applications sufficiently robust to identify toxic “signatures”, providing information for safety and risk assessment?
- Are they sufficiently reliable to predict toxicity and can they account for genetic and other toxicodynamic variability?
- Are they able to predict risk associated with exposure to mixtures? Can they predict the potential toxicity of new compounds or relate chemical structure or activity to risk?
- How can we tell if the results of AI and ML are accurate and defensible?
In addition to inclusion as a Featured Session at this meeting, this debate took place (with the debaters having taken the reverse positions) in San Diego, US, during the 61st SOT Annual Meeting, March 27-31, 2022.
Johns Hopkins Bloomberg School of Public Health, Baltimore, US
UL LLC, Northbrook, US
Round Table Discussion
Tuesday, September 20, 2022, 8:30 AM – 9:30 AM
The European Commission published a chemicals strategy for sustainability (CSS) on 14 October 2020. It is part of the EU’s zero pollution ambition, which is a key commitment of the European Green Deal. In this session, the impact of the CSS on regulatory toxicology will be discussed, following a common thread that starts from the assessment of the impact of contaminants and pesticides on the environment, moves to their appearance in the food chain and ends with the impact on human exposure and human health. The purpose of the session is to identify how the CSS will change the current landscape, including how new technologies will help us achieve the goals of the CSS. The fishbowl concept will be used to elicit maximum participation from all speakers. The speakers have been chosen based on their expertise in the different areas (environment, exposure science, chemical risk assessment, new methodologies) to help generate an interactive and productive discussion.
University of Aberdeen
Joop de Knecht
HESI CITE Lecture
Tuesday, September 20, 2022, 2:00 PM – 3:00 PM
Centre for Molecular Science Informatics, University of Cambridge, UK