Classical toxicology is hyperfocused on characterizing toxicants: identifying harm, quantifying dose-response relationships, and setting safe reference doses.[1] Success is measured by convincing regulators that reducing exposure justifies policy action: the phase-out of leaded gasoline,[2] restrictions on PCBs under TSCA,[3] and air pollution mitigation under the Clean Air Act.[4]
The EPA's biotechnology initiatives developed high-throughput screening to assess thousands of chemicals at scale,[5] aiming to shift toxicology from observation to prediction across targets, pathways, and mechanisms. Yet mechanistic work remained supporting evidence for regulatory decisions,[6][7] and treating these data as starting material for therapeutic intervention lay far outside the EPA's mandate.
Toxicology innovation is now moving toward advanced computational and multimodal methodologies,[8] addressing concerns about multiple, cumulative exposures while shifting away from animal testing toward new approach methodologies (NAMs). However, next-generation toxicology could also serve biopharma by identifying addressable molecular targets that signal biological damage and elevated disease risk.