Note for contributors: This document is a living research effort. We welcome feedback and discussion!
Reinvent toxicology as a discovery-driven science focused on identifying molecular targets using advanced tools.
Key Insight: Classical toxicology has generated valuable mechanistic discoveries, but remains focused on regulatory endpoints rather than therapeutic targets.
Classical toxicology is hyperfocused on the concentration and duration of exposure that yields an effect at any point in the life stage of human or animal subjects. Studies of occupationally exposed workers and other chronically exposed groups gave rise to observational studies measuring specific, reproducible damage. For example, benzene-exposed refinery workers showed chromosomal breaks and bone-marrow toxicity [1,2], evidence that supported OSHA's revised Benzene Standard in 1987. Likewise, workers exposed to respirable crystalline silica developed macrophage apoptosis, oxidative DNA damage, and fibroblast activation—mechanisms identified in toxicology research throughout the 1990s and 2000s [3,4]—which informed OSHA's 2016 Respirable Crystalline Silica Standard. At the Libby, Montana Superfund site, asbestos-exposed residents exhibited oxidative lung-cell injury and DNA strand breaks [5,6], contributing to stronger EPA asbestos actions and the 2009 Public Health Emergency Declaration.
These findings then fueled regulatory changes, such as more stringent exposure limits, mandatory monitoring, and expanded medical surveillance programs. What we overlook, however, are the additional major gains to be had from studying the mechanisms of response to environmental perturbations.
There is an underappreciation of the role that toxicology has played in molecular biology, with environmental exposures acting as unintentional probes of cellular systems. Examining the interactions between pollutants and cellular machinery has uncovered critical new molecular pathways. For example, the discovery that dioxins activate the aryl hydrocarbon receptor (AhR) revealed its pivotal role in immune regulation and epithelial barrier homeostasis [7]. Organophosphates revealed acetylcholinesterase as their molecular target, laying the foundation for antidotes to poisoning [8]. Aromatic amines exposed the cancer risk conferred by NAT2 polymorphisms, marking one of the first gene–environment links in carcinogenesis [9,10]. By mapping the biological vulnerabilities exploited by toxicants, we open an overlooked gateway to new therapeutic landscapes. How can we study these effects at scale across all cell-chemical combinations?
Given the enormity of the problem space of exposure–disease relationships, environmental health research has been largely labeled intractable or misguided, creating a growing gap between the need and the research and funding landscape. Unlike the simpler scenarios of the past, such as single-outcome studies in occupational or high-exposure settings, concern has increasingly shifted to multiple, cumulative exposures that alter cell homeostasis slowly and in less overtly observable ways. Dr. Linda Birnbaum, former NIEHS director, has described this as pollutants acting as uncontrolled medicines, with effects that are nonspecific and unpredictable [11].
Given the multitude of challenges in measuring these molecular fingerprints, the field has stalled in drawing meaningful inferences linking environmental exposures to adverse outcomes.
Key Insight: Biotech's high-throughput screening infrastructure and AI foundation models are ready to be deployed for environmental chemical profiling.
Biology is currently undergoing a renaissance that is fundamentally changing how we understand cellular mechanisms. The era of studying single proteins is sunsetting, overtaken by a systems-level understanding of inter- and intracellular relationships.
Interestingly, the biotech industry's hardware has already migrated into chemical screening labs, which could usher in the next generation of toxicology at a formative moment in biotech. Over a decade ago, the NIH National Center for Advancing Translational Sciences' (NCATS) industry-scale robotic quantitative high-throughput system was borrowed by the U.S. Environmental Protection Agency (EPA) to evaluate thousands of environmental chemicals [12,13]. Moreover, twenty commercial assay providers underpin the EPA's modern toxicology testing methods [14], including human primary cell profiling systems [15,16], multiplex reporter assays [17], and real-time cell analysis (RTCA) systems [18]. Yet the exorbitant expense of running these analyses has slowed progress in chemical screening.
Outside of the government sector, there is already biotech infrastructure that could accelerate the mission of screening environmental exposures for actionable biological insights. The focused research organization EvE Bio has systematically run over 385,000 drug–target interaction tests on FDA-approved small-molecule drugs against hundreds of human receptors, including GPCRs, kinases, and nuclear receptors, and open-sourced the resulting drug–target activity maps via a web app that integrates with the DrugBank database [19]. EvE Bio's unprecedented mapping of the pharmome is set to expedite drug safety testing and repurposing. However, the team has also expressed interest in adapting its high-throughput method to expand upon the Tox21 screens of industrial chemicals and identify targets that lead to adverse phenotypes.
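To make the idea of a queryable drug–target activity map concrete, here is a minimal sketch of the kind of two-way lookup such a resource enables. The record schema, compound and target names, and potency cutoff are all hypothetical illustrations, not EvE Bio's actual data model:

```python
from collections import defaultdict

# Hypothetical records mimicking an open drug-target activity map.
# Names, assay types, and AC50 values are invented for illustration.
records = [
    {"compound": "drug_A", "target": "GPCR_X", "assay": "binding", "ac50_uM": 0.12},
    {"compound": "drug_A", "target": "KINASE_Y", "assay": "inhibition", "ac50_uM": 8.5},
    {"compound": "drug_B", "target": "GPCR_X", "assay": "binding", "ac50_uM": 2.0},
]

def targets_hit(records, compound, potency_cutoff_uM=1.0):
    """Targets a compound engages below a chosen potency cutoff."""
    return sorted(
        r["target"] for r in records
        if r["compound"] == compound and r["ac50_uM"] <= potency_cutoff_uM
    )

def compounds_sharing_target(records, target):
    """Invert the map: which compounds engage a given target?"""
    by_target = defaultdict(set)
    for r in records:
        by_target[r["target"]].add(r["compound"])
    return sorted(by_target.get(target, set()))

print(targets_hit(records, "drug_A"))               # ['GPCR_X']
print(compounds_sharing_target(records, "GPCR_X"))  # ['drug_A', 'drug_B']
```

The second lookup is the one most relevant to the proposal above: once environmental chemicals are screened on the same panel, the inverted map would surface which toxicants and which approved drugs converge on the same target.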
Other organizations are taking steps to build foundation models for drug testing and toxicity testing as the cost of compute for large models dramatically decreases. Foundation models like Tahoe-1x are trained on over 100 million single-cell perturbation profiles (50 cancer cell lines crossed with 1,100 drugs), one of the largest known single-cell RNA transcriptomics (scRNA) datasets in the space [20].
The new scRNA or higher-dimensional metabolic screens that will become the backbone of biotech could be expanded to incorporate environmental toxicants, enabling researchers to uncover how specific chemicals shift cellular stress responses prior to full-blown symptomatology. To our knowledge, scRNA profiling of chemical exposures has not yet been attempted. One of the largest transcriptomic tox screens completed to date is by the EPA, testing three cell lines against approximately 1,000 chemicals at eight doses each, using a low-cost 20,000-probe assay for RNA expression [21]. Already, the results of this work indicate that these affordable transcriptomic studies could substitute for the more expensive and laborious Tox21 suite of tier 1 assays.
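The eight-point dose design mentioned above exists to support curve fitting: each chemical's readout is modeled against concentration to extract a potency estimate such as an EC50. A minimal sketch of that analysis, using a four-parameter Hill model on synthetic data (not EPA data) and a crude grid search in place of proper nonlinear least squares:

```python
import numpy as np

def hill(conc, bottom, top, ec50, n):
    """Four-parameter Hill (log-logistic) dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n)

# Simulated 8-point dose series (uM); values are synthetic illustrations.
doses = np.logspace(-3, 2, 8)                      # 1 nM to 100 uM
true = hill(doses, 0.0, 1.0, ec50=0.5, n=1.2)
rng = np.random.default_rng(0)
obs = true + rng.normal(0.0, 0.02, size=doses.size)  # add assay noise

# Grid search for EC50 with the other parameters held fixed -- a minimal
# stand-in for a real fitter such as scipy.optimize.curve_fit.
candidates = np.logspace(-3, 2, 400)
sse = [np.sum((obs - hill(doses, 0.0, 1.0, ec50=c, n=1.2)) ** 2)
       for c in candidates]
ec50_hat = candidates[int(np.argmin(sse))]
print(f"estimated EC50 ~ {ec50_hat:.2f} uM")
```

Ranking chemicals by fitted potency across many such curves is one way a transcriptomic screen can prioritize compounds for the deeper mechanistic follow-up this section argues for.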
There may be substantial mutual benefits for both industry and government stakeholders in moving toward single-cell sequencing analysis of environmental toxicants. For example, incorporating additional pollutant-exposure data into foundation models like the publicly available Tahoe-1x, or into private federated industry-level models, would improve generalization across unknown exposures and performance on future predictions. This high-resolution data could not only assist EPA scientists with prioritizing future directions for chemical testing but also serve as a building block in foundation models of mechanisms of action that may rival the historically underpowered predictive models in toxicology.
Lastly, this type of data could be a starting point for entry into the partnership between the Human Cell Atlas and NEXUS Exposome, called the "Human Exposome Atlas" [22]. It is conceptualized as cataloging environmental exposures across heterogeneous cell types in different tissues, constructing a new layer on the HCA's decade of groundwork in mapping the expression of all human cell types by scRNA. Biotech could capture the value of identifying and treating these subclinical states as new architecture and infrastructure is built to understand earlier markers of irreversible cell damage.
Key Insight: Environmental toxicology research benefits bioremediation, cross-species studies, and broader biological barrier research.
With the advent of this hybrid bench/computational approach, next-generation toxicology screens create a new pipeline for identifying molecular targets and designing treatments. Work at the bench is rapidly changing to incorporate more iterative in silico hypothesis testing, reducing lab time and cost per experiment. Biotech has lowered the cost of scaling experiments and generating data, and AI co-scientists lower the barrier to entry: with findings validatable in vitro, it becomes easier and cheaper to test hypotheses against existing data and identify new targets.
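The iterative loop described above can be sketched as a toy active-learning cycle: an in silico model nominates the chemicals it is least certain about, the bench assays that batch, and the results feed back into the model. Everything here (the chemical names, the uncertainty scores, the assay) is a synthetic stand-in for illustration:

```python
import random

random.seed(1)
chemicals = [f"chem_{i}" for i in range(20)]
tested, results = set(), {}

def model_uncertainty(chem):
    """Stand-in for a predictive model's uncertainty estimate."""
    return random.random()

def run_assay(batch):
    """Stand-in for a bench assay returning an activity readout."""
    return {c: random.random() for c in batch}

for round_num in range(3):
    untested = [c for c in chemicals if c not in tested]
    # In silico step: prioritize chemicals the model is least sure about.
    batch = sorted(untested, key=model_uncertainty, reverse=True)[:5]
    results.update(run_assay(batch))   # bench step
    tested.update(batch)               # the model would be retrained here

print(f"{len(tested)} chemicals assayed over 3 rounds")
```

The point of the sketch is the division of labor: compute chooses where bench time is spent, so each assay round buys the most information per experiment.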
Additional second-order effects include:
- Conducting in vitro studies with implications for bioremediation, because the studied enzymes/targets are conserved across species
- Conducting studies across biological barriers
Roadmap: A seven-year plan to transform toxicology from regulatory science to discovery science.