
Ann. N.Y. Acad. Sci. ISSN 0077-8923
doi: 10.1111/nyas.12086

ANNALS OF THE NEW YORK ACADEMY OF SCIENCES
Issue: Annals Meeting Reports

The new revolution in toxicology: The good, the bad, and the ugly

Myrtle Davis,1 Kim Boekelheide,2 Darrell R. Boverhof,3 Gary Eichenbaum,4 Thomas Hartung,5 Michael P. Holsapple,6 Thomas W. Jones,7 Ann M. Richard,8 and Paul B. Watkins9

1Toxicology and Pharmacology Branch, Developmental Therapeutics Program, Division of Cancer Treatment and Diagnosis, The National Cancer Institute, National Institutes of Health, Bethesda, Maryland. 2Department of Pathology and Laboratory Medicine, Brown University, Providence, Rhode Island. 3Toxicology and Environmental Research and Consulting, The Dow Chemical Company, Midland, Michigan. 4Department of Drug Safety Science, Johnson & Johnson Pharmaceutical R&D, LLC, Raritan, New Jersey. 5Department of Environmental Health Sciences, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland. 6Battelle Memorial Institute, Columbus, Ohio. 7Department of Toxicology and Pathology, Eli Lilly and Company, Indianapolis, Indiana. 8National Center for Computational Toxicology, Environmental Protection Agency, Research Triangle Park, North Carolina. 9Institute for Drug Safety Sciences, The Hamner–University of North Carolina, Research Triangle Park, North Carolina

Address for correspondence: Myrtle Davis, D.V.M., Ph.D., Toxicology and Pharmacology Branch, Developmental Therapeutics Program, Division of Cancer Treatment and Diagnosis, The National Cancer Institute, NIH, Bethesda, MD 20852. [email protected]

In 2007, the United States National Academy of Sciences issued a report entitled Toxicity Testing in the 21st Century: A Vision and a Strategy. The report reviewed the state of the science and outlined a strategy for the future of toxicity testing. One of the more significant components of the vision established by the report was an emphasis on toxicity testing in human rather than animal systems. In the context of drug development, it is critical that the tools used to accomplish this strategy are maximally capable of evaluating human risk. Since 2007, many advances toward implementation of this vision have been achieved, particularly with regard to safety assessment of new chemical entities intended for pharmaceutical use.

Keywords: toxicology; pharmaceuticals; testing

Introduction

Protection of human safety is a primary objective of toxicology research and risk management. In June 2007, the U.S. National Academy of Sciences released a report, Toxicity Testing in the 21st Century: A Vision and a Strategy.1 This report (Tox21C) included four main components: (1) chemical characterization, (2) toxicity pathways and targeted testing, (3) dose response and extrapolation modeling, and (4) human exposure data. The report outlined a new vision and strategy for toxicity testing that would be based primarily on human rather than general animal biology and would require substantially fewer or virtually no animals. There are clear promises and challenges associated with the vision, including the recognition that components of this vision are natural extensions of the evolution of toxicology science, and the challenging and long-standing debate associated with a total reliance on cell-based systems and in vitro methods. While there are many topics and issues of interest to toxicologists, there are only a few that have the potential to have as great an impact on the science of toxicology as this vision. As such, it can be debated whether the report was the initial skirmish in what is now being called the new revolution in toxicology.

The realization of this vision will depend upon defining a series of toxicity pathways (e.g., cytotoxicity, cell proliferation, apoptosis) that can be monitored using medium- to high-throughput in vitro test systems—preferably based on human cells, cell lines, or tissues—and that are expected to provide a sufficiently comprehensive characterization of human risk and to reduce or eliminate the use of the apical endpoints currently collected through in vivo animal testing. It is acknowledged that an extraordinary amount of effort will be needed to (1) determine the most informative set of toxicity pathways; (2) develop, validate, and implement the appropriate test systems; (3) create the necessary data management and computational tools; and (4) define how regulatory decision making will be adjusted to utilize these new data. While the report, sponsored by the U.S. Environmental Protection Agency, primarily focuses on the challenges of identifying, assessing, and managing the risks associated with human exposure to chemical agents found in the environment, there is passing reference to the potential of applying this revolutionary approach to toxicity testing in other applications, including pharmaceutical research and development. It is important to note that the report stops short of recommending expanding the testing requirements for pharmaceuticals, but interest in extending the Tox21C testing principles in that direction has clearly grown over the last several years. However, there has been very limited discussion regarding the inherent differences between how toxicity testing is applied to enable human pharmaceutical development and how it is used to support environmental decision making.

In October 2011, the New York Academy of Sciences hosted a conference entitled "The New Revolution in Toxicology: The Good, Bad and Ugly," sponsored by the Academy and Agilent Technologies, Cephalon, and Cyprotex, and promoted by the American College of Toxicology and the Society of Toxicology. This one-day conference attracted approximately 200 attendees, including experienced and new investigators; clinicians; toxicologists; and policy makers with multidisciplinary expertise from the fields of pharmacology, genetic and molecular toxicology, animal study design, drug discovery and development, computational chemistry, environmental law, cell and molecular biology, and pathology. The conference was particularly timely and exciting given the recent explosion of broadly applicable new models and technologies for assessing drug efficacy and toxicity. A primary goal of this symposium was to advance the discussion by focusing on how these differences might affect choices of appropriate test systems and how those systems would be applied in practice.

Implementation of the vision: beyond 2007

The symposium began with a session intended to capture reflections on the status of implementation of the 2007 NAS report. The initial discussion began with a talk from Daniel Krewski (University of Ottawa). Krewski discussed data and provided a rationale to support a broader, population-based approach to risk assessment. Relevant to this topic is a recent report from the Institute of Medicine (IOM), "Breast Cancer and the Environment: A Life Course Approach."2 In this report, among the environmental factors reviewed, those most clearly associated with increased breast cancer risk in epidemiological studies were use of combination hormone therapy products, current use of oral contraceptives, exposure to ionizing radiation, overweight and obesity among postmenopausal women, and alcohol consumption. Krewski stressed that interactions between these factors must be integrated into any risk assessment strategy and tailored to inform the strategy. Along these lines, Christopher I. Li and colleagues (Fred Hutchinson Cancer Research Center) conducted an observational study of a subset of patients in the Women's Health Initiative (WHI) study, between 1993 and 1998, which included 87,724 postmenopausal women aged 50–79 years.3 They reported that alcohol use is more strongly related to the risk of lobular carcinoma than to ductal carcinoma, and more strongly related to hormone receptor-positive breast cancer than to hormone receptor-negative breast cancer. Their results supported the previously identified association of alcohol consumption with hormone-positive breast cancer risk, as well as three previous case-control studies that identified a stronger association of alcohol with lobular carcinoma.

Krewski also discussed a new approach to dose-response analysis that uses what is termed a signal-to-noise crossover dose (SNCD). The SNCD is defined as the dose where the additional risk is equal to the background noise (the difference between the upper and lower bounds of the two-sided 90% confidence interval on absolute risk) or a specified fraction thereof. In his published study, the National Toxicology Program (NTP) database was used as the basis for these analyses, which were performed using the Hill model. The analysis identified the SNCD as a promising approach that warrants further development for human health risk assessment.4 Finally, Krewski concluded with a vision for the incorporation of systems biology that was offered a number of years ago in a publication authored by Stephen W. Edwards and R. Julian Preston.5 As described in the publication, it was proposed that systems approaches may provide a method for generating the type of quantitative mechanism-of-action data required for risk assessment.
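Since the SNCD is defined operationally above, a minimal sketch may help make it concrete. The Hill form, the width of the 90% confidence interval, and every numeric value below are illustrative assumptions, not values from the published NTP analyses:

```python
# A minimal sketch of the signal-to-noise crossover dose (SNCD) idea, assuming
# a Hill dose-response model. All parameter values here are hypothetical.

def hill_risk(dose, background=0.05, max_risk=0.60, k=30.0, n=2.0):
    """Absolute risk under a Hill model: background plus an increment that saturates at max_risk."""
    return background + (max_risk - background) * dose**n / (k**n + dose**n)

def sncd(noise, fraction=1.0, lo=1e-6, hi=1e4, tol=1e-9):
    """Find the dose where additional risk over background equals fraction * noise,
    where `noise` is the width of the two-sided 90% CI on absolute background risk.
    Bisection works because the Hill increment is monotone in dose."""
    background = hill_risk(0.0)
    target = fraction * noise
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if hill_risk(mid) - background < target:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# Example: if the 90% CI on background risk spans 0.04-0.06, the "noise" is
# 0.02, and the SNCD is the dose adding 0.02 absolute risk over background.
print(f"SNCD = {sncd(noise=0.02):.3g} (arbitrary dose units)")
```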

Thomas Hartung (Johns Hopkins Bloomberg School of Public Health) offered a compelling opinion on this topic in his talk "Implementation of the NAS Vision." Hartung reflected on the perceived atmosphere created by the Tox21C report. Overall, his opinion was that the report indirectly suggests moving away from traditional (animal) testing toward modern technologies based on pathways of toxicity. The concept presented was rather simple in his view: there might be only a couple of hundred ways to harm a cell, and these pathways of toxicity could be modeled in relatively simple cell tests, which can be run by robots. The goal is to develop a public database for such pathways, the human toxome, to enable scientific collaboration and exchange. Hartung also mentioned that there is a continuously growing awareness—not necessarily always described as excitement—about Tox21C in stakeholder groups. It was first embraced by scientists in the United States—it was difficult to find a U.S. toxicology meeting over the last few years that did not have it as a topic. The U.S. Society of Toxicology instantly started a series of comments in its journal. Most importantly, the U.S. agencies followed fast on the 2007 NAS/NRC report: the Tox21C alliance formed in 2008 (a paper in Science first-authored by Francis Collins6); the EPA made it their chemical testing paradigm in 2009;7 and the Food and Drug Administration (FDA) followed most evidently with the Science article by Margaret Hamburg in 2011.8 The chemical and consumer product industry got engaged, e.g., the Human Toxicology Project Consortium,9 as did the pharmaceutical industry, somewhat more reluctantly (in his opinion). In Europe, the reaction to the paradigm has been delayed, with some adaptation of the vocabulary but not necessarily an embrace of the new approach. He did not view the current strategies as alternative methods under a new name. However, Hartung felt that interest is strongly increasing in Europe.

Hartung was also clear that Tox21C suggests more than just movement toward databases of pathways of toxicity. One big problem is that the respective science is still emerging. The call for mechanism-based approaches has been around for a while. The new concept that may be articulated going forward is a change in resolution to molecularly defined pathways. The new technologies (especially omics) may allow this. He followed by stating that what is needed is the human toxome, viewable as a comprehensive pathway list, annotation of cell types, references to species differences (or source of elucidation), toxicant classes and hazards to these pathways, an integration of information in systems toxicology approaches, in vitro–in vivo extrapolation by reversed dosimetry, and finally, a means to make sense of the data, most likely in a probabilistic way. Hartung presented a list of the most notable activities:

• The EPA launched its ToxCast program based on available high-throughput tests; the EPA made Tox21C its official toxicity testing paradigm for chemicals in 2009.

• The Tox21C alliance of the EPA, the National Institute of Environmental Health Sciences (NIEHS), the NIH Chemical Genomics Center (NCGC), and the FDA extended this to more chemicals.

• Case study approaches at the Hamner Institute, which was originally created by the chemical industry.

• The Human Toxicology Project Consortium (seven global companies and three stakeholders, including the Center for Alternatives to Animal Testing (CAAT)).

• The Human Toxome Project led by CAAT and financed by an NIH Transformative Research Award (this very competitive and prestigious grant is given for projects that have the potential to drive change). The project involves ToxCast, the Hamner Institute, Agilent, and several members of the Tox21C panel.

• The Organisation for Economic Co-operation and Development (OECD) has embraced this in its new adverse outcome pathway concept.

Hartung discussed what he considered to be the most advanced regulatory application: a proposal to use some of the high-throughput assays from ToxCast to prioritize endocrine disruptor screening (the EDSP21 program). However, he emphasized that this proposal is a work in progress and has already generated some resistance. He also mentioned that the OECD has embraced some of the concepts under the label of adverse outcome pathways. Lastly, Hartung surmised that, early on, there is clearly a need for a process to qualify the new approaches as a critical component of their development and implementation. Formal validation as developed for the first generation of alternative methods can only partially serve this purpose. For this reason, the Evidence-based Toxicology Collaboration (EBTC) was created in the United States and Europe in 2011 and 2012, respectively (www.ebtox.com). This collaboration of representatives from agencies, industry, academia, and stakeholder groups aims to apply the tools developed by evidence-based medicine to toxicology. The EBTC secretariat is run by CAAT, and the first conference was held in early 2012 and hosted by the U.S. EPA. Working groups have started to address pertinent issues and methodologies. Taken together, Tox21C and its implementation activities, including the human toxome and the EBTC, promise a credible approach to revamping regulatory toxicology.

The vision for toxicity testing in the 21st century

Michael P. Holsapple's (ATS, Battelle Memorial Institute) talk, "Vision for Toxicity Testing in the 21st Century (Tox21C): Promises, Challenges, and Progress," began with a few excerpts from the report that he thought represented the stimulus for the Tox21C vision and strategy: "... transformative paradigm shift and ... new methods in computational biology and a comprehensive array of in vitro tests based on human biology." He echoed some of Hartung's comments about the importance of collaborative activities, such as the EBTC (Holsapple noted that he serves as a member of the steering team) and the Human Toxicology Project Consortium (he indicated that he was a co-author of the 2010 HTPC workshop report, which is in press).10

Holsapple indicated that he would focus his remaining time on describing two other activities with which he has had some personal experience, and which could serve as additional perspectives on approaches to advance the Tox21C vision and strategy. He then presented a brief overview of the ILSI Health and Environmental Sciences Institute (HESI) Risk Assessment for the 21st Century (RISK21) program. Holsapple indicated that the impetus for RISK21 included the NAS report "Science and Decisions: Advancing Risk Assessment"11 and the 2007 NAS Tox21C report, and that the stimulus for RISK21 included the ever-increasing development and use of new technologies that would impact risk assessment, such as the following: (1) high-content technologies (e.g., genomics, proteomics, metabonomics); (2) high-density approaches, such as high-throughput toxicity assays; (3) sensitive new analytical chemistry techniques; and (4) increasingly detailed knowledge of cellular exposure (PBPK modeling methods).

The leaders of the HESI RISK21 program recognized that there has been a lack of consensus on how best to use and incorporate the information from these new methods into quantitative risk assessments, and that there was an opportunity to provide broad scientific leadership to develop credible approaches and to suggest changes in policies. The vision for RISK21 was to initiate and stimulate a proactive and constructive dialogue among experts from industry, academia, government, and other stakeholders to identify the key advancements in 21st century risk assessment. The RISK21 program was structured around four working groups: (1) dose-response—establish a unified approach to dose-response assessment that builds on the existing mode of action and key events dose-response framework (KEDRF) to quantitatively incorporate dose-response information, and to address technical issues regarding in vitro to in vivo extrapolation; (2) cumulative risk—define and develop critical elements of a transparent, consistent, pragmatic, scientific approach for assessing health risks of combined exposures to multiple chemicals in the context of other stressors; (3) integrated evaluation strategies—establish a process that provides flexible, rapid, efficient, transparent, and cost-effective scientifically sound approaches for creating and interpreting data relevant to decision making that protects health; and (4) exposure science—propose approaches for using new technologies to improve characterization of real-world exposures and provide the data-driven evidence base for 21st century exposure modeling and risk assessment.

As a segue into his final topic, Holsapple again referred to the previously described paper in Science by FDA commissioner Margaret Hamburg, where she noted, "We need better predictive models to identify concerns earlier in the (drug) development process to reduce time and costs. We also need to modernize the tools used to assess emerging concerns about potential risks from food and other exposures," and to the 2011 FDA Strategic Plan, which listed the top priority as "Modernize toxicology to enhance product safety." Holsapple then described the Battelle Multi-Scale Toxicology Initiative (MSTI; Fig. 1). He noted that the vision for MSTI was based on the belief that mechanism-based approaches could speed testing and improve accuracy in picking winners. The stated goal of the MSTI concept is to improve predictivity, insight, and translational value via skillful integration of data at all scales. The concept was to move away from what he described as current menu-driven approaches into an approach that has (1) an initial triage step to quantify dosimetry and identify highest priority concerns using multi-scale imaging and modeling; and (2) subsequent, targeted steps to interrogate and assess priority concerns using integrated in vivo and in vitro methods. Holsapple concluded by emphasizing that toxicology, especially in the context of toxicity testing during the drug development process, needs to be updated, and by noting that the NAS Tox21C vision and strategy has the promise to address that need. However, he cautioned that many challenges remain to realize that promise, and that in spite of government buy-in from the EPA, NIH, NIEHS/NTP, and FDA, the need for 21st century validation tools and proof-of-concept projects—especially for risk assessment and regulatory science—must be recognized.

Figure 1. Menu- and mechanism-based approaches in toxicity testing. Transitioning to mechanism-based approaches could improve predictivity and insight, combining an initial triage step, using multi-scale imaging and modeling, to quantify dosimetry and identify highest priority concerns, with subsequent targeted steps using integrated in vitro and in vivo methods to assess priority concerns.

Legal acceptance of the Tox21C approach

E. Donald Elliott, J.D., rounded out the session by presenting an intriguing review of some of the legal consequences of the changes in the approach to safety assessment in his presentation "Mapping the Pathway to Legal Acceptance of Pathway-based Toxicological Data: How to Climb Mt. Krewski." Elliott started out by providing a description of our legal system and some important realizations. Our system, as he described, is one based on judicial precedent set by generalist judges, and tends to be conservative in allowing new scientific paradigms and techniques to be accepted in court. There is no single legal standard that pathway-based (non-apical) data must satisfy; legal hurdles differ in different contexts. Standards for legal acceptance of emerging scientific information by courts of general jurisdiction with lay juries are high, whereas administrative agencies have more discretion. Judges act as gatekeepers to keep new scientific information from reaching juries until it has been shown to be reliable and applicable (the Daubert rule, named for the Supreme Court case that created it). Elliott went on to contrast the judicial system with administrative agencies. He made the important distinction that administrative agencies are considered experts and that they may consider emerging scientific information along with other information, provided that their final decisions are supported by "substantial evidence on the record as a whole." Non-expert judges in reviewing courts do not rule on the admissibility of individual pieces of evidence before administrative agencies. Therefore, agencies have broad discretion to consider pathway-based toxicological information along with other evidence in making weight-of-evidence determinations, and have already begun to do so. Elliott provided the approval of chemical dispersants for use against the Deepwater Horizon oil spill as a recent example in which the EPA relied in part on pathway-based testing. In general, the legal system is more lenient in accepting new information where the costs of a false positive or false negative are relatively low. He left the audience with an important conclusion to ponder: broader acceptance of pathway-based toxicology in contexts where the stakes are higher (such as approving a new drug or banning a pesticide) will depend on developing side-by-side comparisons with the predictive value of traditional data.

Key differences and challenges between safety and risk assessment

The intent of Session II was to discuss the differences and challenges between safety and risk assessment, and the use of in vitro assays. The session began with an insightful talk entitled "Predictive Toxicology at Abbott in Early Discovery: A Critical Review of Successes and Failures over an 8-Year Period," provided by Eric Blomme (Abbott Laboratories). Blomme reviewed experience at Abbott over the last eight years, using examples to illustrate the strengths, weaknesses, limitations, and optimal application of these technologies during lead optimization and candidate selection. Finally, Blomme discussed several recent promising additions to the toxicologist's toolbox and some key challenges facing toxicologists in the pharmaceutical industry in the near future.

Thomas W. Jones (Eli Lilly and Company) provided an inspiring contrast of ideals. Jones started by characterizing Tox21C as a response rather than an answer and pointed out that the committee itself describes the report as a "long-range vision and strategic plan." He pointed out the salient features of the report and focused on other important factors that frame the discussion. He reiterated that the expectation that the products of the revolutions in biology and biotechnology can be used to transform toxicity testing from a system based on whole animal studies to one founded primarily on in vitro methods using cells of human origin is, in fact, a hypothesis to be tested, and presents an open question regarding applications in other areas of toxicology supporting human risk assessment (Fig. 3). Jones made several points about how we might best apply a human cell-based nonclinical testing scheme, similar to the one envisioned in the Tox21C report, in human pharmaceutical discovery and development to inform and improve decision making. He argued that it is critical to be sure that we have a common understanding of how the current testing paradigm is applied and how it performs, and made these key distinctions: (1) if risk is realized early in a drug development program, then less clinical data will be available to provide context and balance to the decision makers regarding benefit; (2) there needs to be a balance in the revelation of risk and benefit; (3) the science being done to realize the potential of Tox21C could benefit the pharmaceutical industry—however, the experience of the pharmaceutical industry, where human clinical testing is a routine part of the development process, will inform how to apply the approaches outlined in Tox21C to other settings where human risk-based decisions are being made; and (4) the tools needed to support the safe testing of human pharmaceuticals are likely to be different from those needed for environmental decision making.

Jones addressed several popular perceptions and myths. One perception he discussed was that the pharmaceutical industry needs better ways to recognize safety risks in order to make decisions earlier. Along these lines, he mentioned the current systematic use of animal models as a sentinel measure of safety risk. In Figure 2, Jones outlined the utility and concordance of the current animal-based approach. For example, the sensitivity of fecal occult blood for detection of colon cancer is about 66%, and the sensitivity of prostate-specific antigen (PSA) screening for detection of prostate cancer is about 72%. Although animal-to-human translation varies by toxicity and target tissue, pre–first-in-human toxicology studies reveal nearly all toxicities that are relevant to humans.
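To make the screening-performance vocabulary concrete, here is a small sketch computing sensitivity, specificity, and concordance from a 2x2 confusion table. The counts below are hypothetical; only the 66% and 72% sensitivity figures above come from the talk:

```python
# A small helper for the screening-performance terms used above. The counts in
# the example are hypothetical and chosen only to mirror a 66%-sensitive test.

def screen_performance(tp, fp, fn, tn):
    """Return sensitivity, specificity, and concordance (overall agreement)."""
    sensitivity = tp / (tp + fn)                    # fraction of true findings detected
    specificity = tn / (tn + fp)                    # fraction of true negatives cleared
    concordance = (tp + tn) / (tp + fp + fn + tn)   # overall agreement with truth
    return sensitivity, specificity, concordance

# Hypothetical example: a sentinel assay flags 66 of 100 real findings
# (66% sensitivity, comparable to fecal occult blood for colon cancer)
# and clears 180 of 200 clean compounds.
sens, spec, conc = screen_performance(tp=66, fp=20, fn=34, tn=180)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} concordance={conc:.0%}")
```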

Figure 2. An outline of the utility and concordance of the current systematic use of animal models as sentinel measures of safety risk. Pre–first-in-human toxicology studies reveal nearly all toxicities that are relevant to humans, although translation varies by target tissue and toxicity.

Tox21 and ToxCast chemical landscapes

Ann M. Richard (U.S. Environmental Protection Agency) reviewed the U.S. EPA's ToxCast project and the related multi-agency Tox21C projects. These projects are employing high-throughput technologies to screen hundreds to thousands of chemicals in hundreds of assays, probing a wide diversity of biological targets, pathways, and mechanisms for use in predicting in vivo toxicity. The ToxCast chemical library consists of 960 unique chemicals (including Phase I and II), with 100 recently added to this total, and was constructed to span a diverse range of chemical structures and use categories. This library is fully incorporated into the EPA's contribution of approximately 4,000 chemicals to the larger, more diverse Tox21 chemical library (totaling roughly 10,000). These chemical libraries represent central pillars of the ToxCast and Tox21 projects and are unprecedented in their scope, structural diversity, multiple use scenarios (pesticides, industrial, food-use, drugs, etc.), and chemical feature characteristics in relation to toxicology. Chemical databases built to support these efforts consist of high-quality DSSTox chemical structures and generic substance descriptions linked to curated test sample information (supplier, lot, batch, water content, analytical QC). Cheminformatics, feature and property profiling, and a priori and interactive categorization of these libraries in relation to biological activity will serve as essential components of toxicity prediction strategies. Finally, Richard mentioned the DSSTox project, which provides primary chemical library design and information management for the ToxCast and Tox21 projects.
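As an illustration of the structure-to-sample linkage described above, the sketch below defines a hypothetical record layout; the field names are invented for illustration and do not reproduce the actual DSSTox schema:

```python
# An illustrative record layout for the kind of curation described above: a
# high-quality structure linked to test-sample metadata. Field names are
# hypothetical and do not reproduce the real DSSTox schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class LibrarySample:
    substance_name: str            # generic substance description
    casrn: str                     # CAS registry number
    smiles: str                    # curated structure representation
    supplier: str                  # test-sample provenance
    lot: str
    batch: str
    water_content_pct: Optional[float] = None   # from analytical characterization
    analytical_qc_pass: Optional[bool] = None   # QC outcome for the tested sample
    use_categories: tuple = ()     # e.g., ("pesticide", "food-use", "drug")

sample = LibrarySample(
    substance_name="bisphenol A", casrn="80-05-7",
    smiles="CC(C)(c1ccc(O)cc1)c1ccc(O)cc1",
    supplier="VendorX", lot="L123", batch="B7",
    water_content_pct=0.2, analytical_qc_pass=True,
    use_categories=("industrial",),
)
print(sample.substance_name, sample.casrn)
```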

Figure 3. Different levels of toxicity analysis and current techniques for engaging each level. Despite biotechnical advances in many areas supporting human risk assessment, it remains to be definitively determined whether in vitro methods using cells of human origin can effectively supplant systems based on whole animal studies.

The future of toxicology in drug development

David Jacobson-Kram (Food and Drug Administration) began with a reminder that toxicology testing serves several important needs in drug development. In the earliest stages, toxicology data are used to determine the maximum recommended doses for first-in-man Phase I studies. This information can help to identify potential toxicities and specify maximum stopping doses. Toxicology studies can also identify certain risks that cannot be studied in clinical trials: potential for genetic damage (genotoxicity), carcinogenicity, teratogenicity, and risks from long-term exposures. In vitro studies have been useful in identifying hazards; for example, the Ames assay provides information on whether a drug candidate can induce gene mutations, and the hERG assay is useful in determining if a candidate pharmaceutical has potential for QT-interval prolongation. While hazard assessment is an important aspect in drug development, ultimately hazards must be linked to exposures in order to quantify risks. Use of in vitro assays in risk assessment is challenging primarily because of the difficulty in modeling absorption, distribution, metabolism, and excretion (ADME). Although not insurmountable, use of in vitro assays for all aspects of drug development will require new methods and better understanding of ADME processes.
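As a sketch of what "linking hazards to exposures" can mean quantitatively, the example below pairs a textbook one-compartment oral PK model with an in vitro active concentration to compute a margin; the model choice and all parameter values are assumptions for illustration, not an FDA or Tox21C method:

```python
# A minimal sketch of linking an in vitro hazard concentration to an exposure
# estimate: a one-compartment oral PK model gives a plasma Cmax, and the margin
# is the in vitro active concentration divided by Cmax. All values hypothetical.

import math

def cmax_one_compartment(dose_mg, vd_l, ka, ke, f=1.0):
    """Peak plasma concentration (mg/L) for first-order absorption and
    elimination (requires ka != ke)."""
    tmax = math.log(ka / ke) / (ka - ke)            # time of the concentration peak
    return (f * dose_mg * ka / (vd_l * (ka - ke))) * (
        math.exp(-ke * tmax) - math.exp(-ka * tmax)
    )

def exposure_margin(ac50_mg_per_l, cmax_mg_per_l):
    """How far the in vitro active concentration sits above the predicted Cmax."""
    return ac50_mg_per_l / cmax_mg_per_l

# Hypothetical candidate: 100 mg oral dose, 50 L volume of distribution,
# absorption 1.0/h, elimination 0.1/h, and an in vitro AC50 of 5 mg/L.
cmax = cmax_one_compartment(dose_mg=100, vd_l=50, ka=1.0, ke=0.1)
print(f"Cmax ~ {cmax:.2f} mg/L, margin ~ {exposure_margin(5.0, cmax):.1f}x")
```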

Application of the strategy and development of the tools

Kim Boekelheide (Brown University) delivered an insightful review entitled "Toxicity Testing in the 21st Century: A Toxicologic Pathology Perspective Using Testis Toxicity as an Example." Boekelheide stated that the European Union has been a leader in advancing alternative testing strategies through legislative action and funding the development of new testing paradigms. In 2004, a group from Unilever in the United Kingdom conceptualized a new testing approach entitled Assuring Safety without Animal Testing (ASAT). The ASAT initiative sought to tie how human disease processes are identified in the clinic together with the development of human-relevant in vitro mechanistic data, bypassing the current need for toxicity testing in animals. The focus and outcome of an ASAT workshop held June 15–17, 2011, on "Recent Developments and Insights into the Role of Xenobiotics in Male Reproductive Toxicity" were reviewed by Boekelheide. The key questions for this initiative were clear: (1) Does xenobiotic exposure cause testicular toxicity in men? (2) If not well understood, what are the data gaps? (3) Can testicular toxicity be studied in animals, and are the results sufficiently predictive for humans? and (4) Can testicular toxicity be studied in vitro with sufficient reliability to identify the relevant mechanisms of action for humans?

To address the first question, known examples of xenobiotic-induced testicular toxicity were reviewed along with the limitations of current diagnostic approaches. The diagnosis of testicular toxicity is currently made using semen parameters and serum hormones, like inhibin B and FSH. These biomarkers have significant limitations, including a delay between exposure and biomarker alteration, a highly variable measure of effect, and reliance on epidemiological associations. He concluded that current approaches to understanding the etiology of human male toxicant-induced testicular injury are insensitive, highly variable, and lack diagnostic specificity. The need for new tools was identified as a data gap (question 2), and the potential for using modern molecular approaches, such as serum assessment of testis-specific microRNAs and monitoring sperm molecular biomarkers, including sperm mRNA transcripts and DNA methylation marks, may address this data gap.

The potential and limitations of animal experiments to inform regarding the human response (question 3) were highlighted using phthalate-induced effects in the fetal testis as an example. Characteristics of fetal testicular responses to phthalates were compared between human, mouse, and rat. The fetal rat testis responds to phthalate exposure by suppressing steroidogenesis in the Leydig cells, resulting in lower fetal testosterone levels. On the other hand, recent studies have shown that both human and mouse fetal Leydig cells are resistant to these phthalate-induced anti-androgenic effects.12,13 Therefore rats, but not humans or mice, would be expected to show effects of lowered fetal testicular testosterone production, including reduced steroidogenic gene expression, shortened anogenital distance, nipple/areola retention, hypospadias, cryptorchidism, and Leydig cell hyperplasia. Interestingly, all three species share phthalate-induced alterations in fetal seminiferous cords, including the induction of multinucleated germ cells. This example makes the point that animal models of interactions of xenobiotics with testicular function may be limited by uncertainties about their relevance to humans.14

Regarding the last question, in vitro approaches to evaluating testicular function have so far been limited by the inability to recapitulate spermatogenesis in vitro, a complex process unique to the testis. The ASAT workshop focused on revolutionizing testis toxicity testing approaches, including the main recommendation to design a functioning "testis in a petri dish" capable of spermatogenesis. Limitations to developing this novel approach were cited, including the lack of relevant human cell lines, difficulties in assessing the efficiency of spermatogenesis, difficulties in incorporating the needed paracrine interactions, and finally, a reliance on cell-specific apical endpoints rather than developing insight into toxicant-associated modes of action. The workshop proposed developing a human stem cell differentiation model of spermatogenesis, with stem cells progressing through meiosis and giving rise to haploid spermatids, combined with bioengineering approaches to build three-dimensional scaffolds for the assembly of the multiple cell types needed for successful spermatogenesis. Ultimately, the goal is to identify toxicants that interrupt this process using high-throughput in vitro tools.

Figure 4. The development of new in vitro approaches to toxicity testing. The most thorough scope of toxicity response may be best evaluated through a balanced combination of unbiased testing and pathway-specific expert-driven testing.

The presentation ended with a broad view of approaches to the development of the in vitro tests of the future. Major concerns about the validity of in vitro models were mentioned, including the requirement for key tissue-specific pathways, the ability of the in vitro system to recapitulate in vivo biology, and the complexity of interacting cell types. Boekelheide argued that the development of the new in vitro approaches might be best served by a balanced combination of unbiased testing and pathway-specific expert-driven testing, so that the most complete landscape of toxicity response could be evaluated (Fig. 4).

Evaluating injection site tolerability prior to testing in humans

Gary Eichenbaum (Janssen Research and Development) discussed the importance of evaluating and minimizing the infusion site irritation potential of intravenous formulations prior to testing new molecular entities in humans. Several promising advances that he discussed include the use of nonclinical in silico and in vitro models to screen candidate formulations and thereby enable a reduced use of in vivo nonclinical models. If an intravenous formulation causes severe injection site reactions, these adverse effects may pose significant challenges for the nonclinical safety assessment of systemic toxicity as well as the clinical development and use of the compound. The extent to which infusion site reactions may limit further development depends in part on the severity and root cause of the adverse effects, as well as the dose/concentration response and the margin of safety. The talk focused on in vitro and in vivo nonclinical strategies and models for developing, evaluating, and optimizing intravenous formulations of compounds that have the potential to cause local toxicity following injection.

Eichenbaum reviewed the common types of injection site reactions (chemical and mechanical) and the key factors for chemically mediated injection site reactions: formulation concentration, solubility, and rate of injection. The common types listed include:

• Inflammation/irritation of the vessel at the injection site.

• Endothelial hyperplasia/intimal edema at the injection site.

• Necrosis of the venous wall with mixed inflammatory cell infiltration/abscessation at the injection site.

• Thrombosis at the injection site.

• Necrosis with abscessation and adjacent inflammation at the entry point.

• Interstitial pneumonitis.

• Area of necrosis and inflammation in the lungs (probable infarct).

• Thromboembolus in the lungs.

He then reviewed targeted safety testing and discussed several nonclinical models to evaluate potential for infusion site reactions, such as a static kinetic solubility model; a dynamic solubility model; plasma solubility; in vitro hemolysis; cell-based models; the hen egg chorioallantoic membrane model; and nonclinical in vivo infusion models.

Lastly, Eichenbaum provided a case example of a compound with limited solubility at neutral pH and the potential to cause infusion site reactions at therapeutic concentrations. He presented a multifactorial approach to optimize and evaluate key parameters: test article concentration, pH, buffers, excipient types, excipient weight percentage, and dosing volume and rate. In this example, an in silico precipitation index model was developed to support infusion site safety assessment and the translation of nonclinical data to human situations. The model predicts that there may be some subtle differences in species sensitivity (dog > rat > rabbit), but that the preclinical models should be fairly predictive despite the physiological differences.

A schema was presented for setting doses/concentrations that are more likely to be well tolerated in the nonclinical species and in humans. The in silico model results predict that if the plasma solubility is ≤0.05 mg/mL, then the formulation concentration should be ≤0.5 mg/mL to reduce the probability of an infusion site reaction. If the plasma solubility is between 0.05 and 0.1 mg/mL, then formulation concentrations up to 1.5 mg/mL are not likely to cause precipitation-mediated irritation. If the plasma solubility is >1 mg/mL, the model predicts minimal likelihood of precipitation or irritation up to concentrations of 10 mg/mL in the dosing solution. These predictions were in agreement with in vivo results for the example compound that was presented and other low-solubility compounds that have been evaluated. The intravenous formulation that was selected from these assessments as part of the case example was well tolerated in humans and achieved target exposures for efficacy.
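The stated cut-points lend themselves to a simple decision rule. The sketch below encodes them directly; the handling of the unstated 0.1–1 mg/mL range is an assumption (returned as unspecified):

```python
# A sketch of the dose-setting schema described above, encoding the stated
# plasma-solubility cut-points. The gap between 0.1 and 1 mg/mL was not
# specified in the presentation, so it is returned as None here by assumption.

def max_formulation_conc(plasma_solubility_mg_ml):
    """Map plasma solubility to the highest formulation concentration predicted
    to avoid precipitation-mediated infusion site irritation."""
    if plasma_solubility_mg_ml <= 0.05:
        return 0.5          # keep formulation <= 0.5 mg/mL
    if plasma_solubility_mg_ml <= 0.1:
        return 1.5          # up to 1.5 mg/mL unlikely to precipitate
    if plasma_solubility_mg_ml > 1.0:
        return 10.0         # minimal precipitation/irritation risk up to 10 mg/mL
    return None             # 0.1-1 mg/mL: not specified in the presented schema

for s in (0.03, 0.08, 0.5, 2.0):
    print(f"plasma solubility {s} mg/mL -> max conc {max_formulation_conc(s)} mg/mL")
```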

The conclusions of the presentation were that (1) in silico and in vitro models can help to screen and identify candidate formulations with reduced/limited infusion site irritation potential for low-solubility compounds; (2) in vivo models provide additional and important information about risk for infusion site irritation of candidate formulations, but species differences must be considered; and (3) staged application of these models can support the successful optimization of intravenous formulations that have reduced risk for injection site irritation in humans, and at the same time reduce the number of in vivo nonclinical studies required.

Integrated cell signaling in toxicology

Many human diseases are a consequence of aberrant development of tissues in which the transition of adult stem cells into their appropriate differentiated cell type lineages and designated niches within a tissue has been interrupted by either genetic or epigenetic events. These molecular events often result in dysfunctional tissues or, as in the case of cancer, the development of tumors that bypass all normal cybernetic control mechanisms. These control mechanisms require homeostatically regulated gene expression through highly coordinated networks of extracellular, intercellular, and intracellular signaling events within and between the cells of a tissue. Brad L. Upham (Michigan State University) proposed that gap junction intercellular channels are critical in modulating the levels of low molecular weight second messengers needed for the transduction of an external signal to the nucleus in the expression of genes essential to the normal maintenance of a tissue. Thus, any comprehensive systems biology approach to understanding the role of signaling in toxicology must also include gap junctions, as aberrant gap junctions have been clearly implicated in many human diseases.

Idiosyncratic hepatotoxicity: from human to mouse to computer

Paul B. Watkins (The Hamner–University of North Carolina Institute for Drug Safety Sciences (IDSS)) discussed drug-induced liver injury (DILI), a major adverse drug event that leads to termination of clinical development programs and regulatory actions including failure to approve for marketing, restricted indications, and withdrawal from the marketplace. The type of DILI that is most problematic is idiosyncratic, meaning that only a very small fraction of treated patients are susceptible. Current preclinical models, even humanized ones, do not reliably identify molecules that have this liability and, conversely, predict liabilities in molecules that are in fact quite safe for the liver. Reliable preclinical testing will probably not be developed until there is greater understanding of the mechanisms underlying idiosyncratic DILI. Based on the belief that the best models for studying DILI are the people who have actually experienced it, there are two major efforts underway to create registries and tissue banks from these rare individuals: the Serious Adverse Events Consortium, supported by industry, and the Drug-Induced Liver Injury Network, supported by the National Institutes of Health. Genome-wide association analyses and whole-exome/whole-genome sequencing of certain DILI cases are well underway and appear promising. However, it has become clear that preclinical experimental approaches are also needed, both to provide biological plausibility for observed associations and to generate specific hypotheses that can be tested with the genetic data. Ongoing approaches at the IDSS include chemoinformatic analysis of implicated drugs, use of panels of inbred and genetically defined mice, and organotypic liver cultures, including systems derived from induced pluripotent stem cells obtained from patients who have experienced DILI. The IDSS also leads DILIsim, a public–private partnership that involves scientists from 11 major pharmaceutical companies and the FDA, with the goal of integrating multiple streams of data into an in silico model that would explain and ultimately predict the hepatotoxic potential of new drug candidates in humans.

Application of the strategy and development of the tools

Kyle Kolaja (Roche) proposed that one possible means to bridge the gap between late-stage assessments of safety-related organ toxicities and early discovery is through the use of human pluripotent stem cell–derived tissues, which afford improved cellular systems that replicate the critical functional aspects of intact tissues. The combination of stem cell–derived tissues with small-scale assays can ensure these models are amenable to high-throughput, low-compound-usage assays, and thus have utility in drug discovery toxicology. He presented published work that focused on stem cell–derived cardiomyocytes, characterizing these cells molecularly and functionally as a novel model for pro-arrhythmia prediction.

Darrell R. Boverhof (The Dow Chemical Company) discussed the promises and challenges of applying toxicogenomics and in vitro technologies to the assessment of chemical sensitization potential. He highlighted how advances in molecular and cellular biology are providing tools to modernize our approaches to chemical hazard assessment, including that for skin sensitization. Boverhof has been researching the application of toxicogenomics and in vitro assays for assessing the sensitization potential of chemicals. Determination of the skin sensitization potential of industrial chemicals, agrochemicals, and cosmetics is crucial for defining their safe handling and use. The mouse local lymph node assay (LLNA) has emerged as the preferred in vivo assay for this evaluation; however, the assay has certain limitations, including the use of radioactivity, poor specificity with certain chemistries (false positives), and the inability to distinguish between different classes of sensitizers, namely skin and respiratory sensitizers. To address these limitations, researchers have been exploring the application of toxicogenomics to the LLNA to provide enhanced endpoints for the assessment of chemical sensitization potential. Data generated to date indicate that toxicogenomic responses are providing increased insight into the cellular and molecular mechanisms of skin sensitization, which may increase the specificity and extend the utility of the LLNA.15–18 In parallel with these efforts, research is being conducted on the development and application of in vitro assays for predicting skin sensitization potential. Recent regulations (e.g., the EU Cosmetics Directive), as well as responsible stewardship, have pushed the development of non-animal approaches that can effectively predict skin sensitization potential for new chemical entities. These assays have built upon our current understanding of the molecular and cellular events involved in the acquisition of skin sensitization, and are showing promise for providing non-animal alternatives for the characterization of skin-sensitizing chemicals.16

Conclusions: Are we getting there?

The broad aims of this conference were (1) to provide a forum to discuss the recent advances in toxicity testing, their application to pharmaceutical discovery and development, and their relevance to safety assessment; (2) to bring together leading scientists from different disciplines to encourage interdisciplinary thinking; (3) to encourage outstanding junior scientists, students, and postdocs to pursue research in this promising field; (4) to provide networking opportunities among scientists and guests; and (5) to encourage collaborations to advance science. Conference speakers and audience participants achieved these aims through presentations and engaging discussion. At the end of the conference, participants were encouraged to critically explore questions such as:

• What was the initial response to the 2007 National Academy of Sciences report Toxicity Testing in the 21st Century: A Vision and a Strategy, and how did this vary across sectors?

• What has been achieved since the publication of the report?

• How has implementation of the recommendations in the report been achieved across sectors?

• What have been the strengths, weaknesses, limitations, and optimal applications of these technologies during lead optimization and candidate selection?

• What recent promising additions to the toxicologist's toolbox have emerged?

• What are key challenges facing toxicologists in the pharmaceutical industry in the near future?

• What lessons can we draw from the success of the report, and from the criticisms made against it?

• What recommendations can we adapt across the pharmaceutical industry?

Some of these questions have been the subject of follow-on initiatives and will be a constant source of ongoing discussions.

Conflicts of interest

The authors declare no conflicts of interest.

References

1. Committee on Toxicity Testing and Assessment of Environmental Agents, National Research Council. 2007. Toxicity Testing in the 21st Century: A Vision and a Strategy. National Academies Press. Washington, D.C.

2. Committee on Breast Cancer and the Environment: The Scientific Evidence, Research Methodology, and Future Directions; Institute of Medicine. 2012. Breast Cancer and the Environment: A Life Course Approach. National Academies Press. Washington, D.C.

3. Li, C.I. et al. 2010. Alcohol consumption and risk of postmenopausal breast cancer by subtype: the Women's Health Initiative Observational Study. J. Natl. Cancer Inst. 102: 1422–1431.

4. Sand, S., C.J. Portier & D. Krewski. 2011. A signal-to-noise crossover dose as the point of departure for health risk assessment. Environ. Health Perspect. 119: 1766–1774.

5. Edwards, S.W. & R.J. Preston. 2008. Systems biology and mode of action based risk assessment. Toxicol. Sci. 106: 312–318.

6. Collins, F.S., G.M. Gray & J.R. Bucher. 2008. Toxicology. Transforming environmental health protection. Science 319: 906–907.

7. Firestone, M.R. et al. 2010. The U.S. Environmental Protection Agency strategic plan for evaluating the toxicity of chemicals. J. Toxicol. Environ. Health B Crit. Rev. 13: 139–162.

8. Hamburg, M.A. 2011. Advancing regulatory science. Science 331: 987.

9. Seidle, T. & M.L. Stephens. 2009. Bringing toxicology into the 21st century: a global call to action. Toxicol. In Vitro 23: 1576–1579.

10. Stephens, M.L. et al. 2012. Accelerating the development of 21st-century toxicology: outcome of a Human Toxicology Project Consortium workshop. Toxicol. Sci. 125: 327–334.


11. Committee on Improving Risk Analysis Approaches Used by the U.S. EPA, National Research Council. 2009. Science and Decisions: Advancing Risk Assessment. National Academies Press. Washington, D.C.

12. Mitchell, R.T. et al. 2012. Do phthalates affect steroidogenesis by the human fetal testis? Exposure of human fetal testis xenografts to di-n-butyl phthalate. J. Clin. Endocrinol. Metab. 97: E341–E348.

13. Heger, N.E. et al. 2012. Human fetal testis xenografts are resistant to phthalate-induced endocrine disruption. Environ. Health Perspect. 120: 1137–1143.

14. Anson, B.D., K.L. Kolaja & T.J. Kamp. 2011. Opportunities for use of human iPS cells in predictive toxicology. Clin. Pharmacol. Ther. 89: 754–758.

15. Boverhof, D.R. 2009. Evaluation of a toxicogenomic approach to the local lymph node assay (LLNA). Toxicol. Sci. 107: 427–439.

16. Ku, H.O. et al. 2011. Pathway analysis of gene expression in local lymph nodes draining skin exposed to three different sensitizers. J. Appl. Toxicol. 31: 455–462.

17. Adenuga, D. et al. 2012. Differential gene expression responses distinguish contact and respiratory sensitizers and nonsensitizing irritants in the local lymph node assay. Toxicol. Sci. 126: 413–425.

18. Aeby, P. et al. 2010. Identifying and characterizing chemical skin sensitizers without animal testing: Colipa's research and method development program. Toxicol. In Vitro 24: 1465–1473.

Additional reading

Abassi, Y.A., B. Xi, et al. 2012. Dynamic monitoring of beating periodicity of stem cell-derived cardiomyocytes as a predictive tool for preclinical safety assessment. Br. J. Pharmacol. 165: 1424–1441.

Bolt, H.M. & J.G. Hengstler. 2008. Most cited articles in the Archives of Toxicology: the debate about possibilities and limitations of in vitro toxicity tests and replacement of in vivo studies. Arch. Toxicol. 82: 881–883.

Brendler-Schwaab, S.Y., P. Schmezer, et al. 1994. Cells of different tissues for in vitro and in vivo studies in toxicology: compilation of isolation methods. Toxicol. In Vitro 8: 1285–1302.

Charlton, J.A. & N.L. Simmons. 1993. Established human renal cell lines: phenotypic characteristics define suitability for use in in vitro models for predictive toxicology. Toxicol. In Vitro 7: 129–136.

Cheng, H. & T. Force. 2010. Molecular mechanisms of cardiovascular toxicity of targeted cancer therapeutics. Circ. Res. 106: 21–34.

Clark, D.L., P.A. Andrews, et al. 1999. Predictive value of preclinical toxicology studies for platinum anticancer drugs. Clin. Cancer Res. 5: 1161–1167.

Davila, J.C., R.J. Rodriguez, et al. 1998. Predictive value of in vitro model systems in toxicology. Annu. Rev. Pharmacol. Toxicol. 38: 63–96.

Dickens, H., A. Ullrich, et al. 2008. Anticancer drug cis-4-hydroxy-L-proline: correlation of preclinical toxicology with clinical parameters of liver function. Mol. Med. Report 1: 459–464.

Ehrich, M. 2003. Bridging the gap between in vitro and in vivo toxicology testing. Altern. Lab. Anim. 31: 267–271.

Eschenhagen, T., T. Force, et al. 2011. Cardiovascular side effects of cancer therapies: a position statement from the Heart Failure Association of the European Society of Cardiology. Eur. J. Heart Fail. 13: 1–10.

Fagerland, J.A., H.G. Wall, et al. 2012. Ultrastructural analysis in preclinical safety evaluation. Toxicol. Pathol. 40: 391–402.

Fielden, M.R., B.P. Eynon, et al. 2005. A gene expression signature that predicts the future onset of drug-induced renal tubular toxicity. Toxicol. Pathol. 33: 675–683.

Fielden, M.R. & K.L. Kolaja. 2008. The role of early in vivo toxicity testing in drug discovery toxicology. Expert Opin. Drug Saf. 7: 107–110.

Feldman, A.M., W.J. Koch, et al. 2007. Developing strategies to link basic cardiovascular sciences with clinical drug development: another opportunity for translational sciences. Clin. Pharmacol. Ther. 81: 887–892.

Force, T., K. Kuida, et al. 2004. Inhibitors of protein kinase signaling pathways: emerging therapies for cardiovascular disease. Circulation 109: 1196–1205.

Force, T., C.M. Pombo, et al. 1996. Stress-activated protein kinases in cardiovascular disease. Circ. Res. 78: 947–953.

Force, T. & J.R. Woodgett. 2009. Unique and overlapping functions of GSK-3 isoforms in cell differentiation and proliferation and cardiovascular development. J. Biol. Chem. 284: 9643–9647.

Fukumoto, J. & N. Kolliputi. 2012. Human lung on a chip: innovative approach for understanding disease processes and effective drug testing. Front. Pharmacol. 3: 205.

Ganter, B., S. Tugendreich, et al. 2005. Development of a large-scale chemogenomics database to improve drug candidate selection and to understand mechanisms of chemical toxicity and action. J. Biotechnol. 119: 219–244.

Geenen, S., P.N. Taylor, et al. 2012. Systems biology tools for toxicology. Arch. Toxicol. 86: 1251–1271.

Higgins, J., M.E. Cartwright, et al. 2012. Progressing preclinical drug candidates: strategies on preclinical safety studies and the quest for adequate exposure. Drug Discov. Today 17: 828–836.

Holsapple, M.P., C.A. Afshari, et al. 2009. Forum series: the "vision" for toxicity testing in the 21st century: promises and conundrums. Toxicol. Sci. 107: 307–308.

Huh, D., D.C. Leslie, et al. 2012. A human disease model of drug toxicity-induced pulmonary edema in a lung-on-a-chip microdevice. Sci. Transl. Med. 4: 159ra147.

Kim, H.J., D. Huh, et al. 2012. Human gut-on-a-chip inhabited by microbial flora that experiences intestinal peristalsis-like motions and flow. Lab Chip 12: 2165–2174.

Kluwe, W.M. 1995. The complementary roles of in vitro and in vivo tests in genetic toxicology assessment. Regul. Toxicol. Pharmacol. 22: 268–272.

MacDonald, J.S. & R.T. Robertson. 2009. Toxicity testing in the 21st century: a view from the pharmaceutical industry. Toxicol. Sci. 110: 40–46.

Marchan, R., H.M. Bolt, et al. 2012. Systems biology meets toxicology. Arch. Toxicol. 86: 1157–1158.


Marin, J.J., O. Briz, et al. 2009. Hepatobiliary transporters in the pharmacology and toxicology of anticancer drugs. Front. Biosci. 14: 4257–4280.

Marks, L., S. Borland, et al. 2012. The role of the anaesthetised guinea-pig in the preclinical cardiac safety evaluation of drug candidate compounds. Toxicol. Appl. Pharmacol. 263: 171–183.

Marx, U., H. Walles, et al. 2012. "Human-on-a-chip" developments: a translational cutting-edge alternative to systemic safety assessment and efficiency evaluation of substances in laboratory animals and man? Altern. Lab. Anim. 40: 235–257.

Meek, B. & J. Doull. 2009. Pragmatic challenges for the vision of toxicity testing in the 21st century in a regulatory context: another Ames test? ... or a new edition of "the Red Book"? Toxicol. Sci. 108: 19–21.

Mikaelian, I., M. Scicchitano, et al. 2013. Frontiers in preclinical safety biomarkers: microRNAs and messenger RNAs. Toxicol. Pathol. 41: 18–31.

Olaharski, A.J., H. Uppal, et al. 2009. In vitro to in vivo concordance of a high throughput assay of bone marrow toxicity across a diverse set of drug candidates. Toxicol. Lett. 188: 98–103.

Olson, H., G. Betton, et al. 2000. Concordance of the toxicity of pharmaceuticals in humans and in animals. Regul. Toxicol. Pharmacol. 32: 56–67.

Pleil, J.D., M.A. Williams, et al. 2012. Chemical Safety for Sustainability (CSS): human in vivo biomonitoring data for complementing results from in vitro toxicology – a commentary. Toxicol. Lett. 215: 201–207.

Polson, A.G. & R.N. Fuji. 2012. The successes and limitations of preclinical studies in predicting the pharmacodynamics and safety of cell-surface-targeted biological agents in patients. Br. J. Pharmacol. 166: 1600–1602.

Rennard, S.I., D.M. Daughton, et al. 1990. In vivo and in vitro methods for evaluating airways inflammation: implications for respiratory toxicology. Toxicology 60: 5–14.

Robinson, J.F., P.T. Theunissen, et al. 2011. Comparison of MeHg-induced toxicogenomic responses across in vivo and in vitro models used in developmental toxicology. Reprod. Toxicol. 32: 180–188.

Sai, K. & Y. Saito. 2011. Ethnic differences in the metabolism, toxicology and efficacy of three anticancer drugs. Expert Opin. Drug Metab. Toxicol. 7: 967–988.

Warner, C.M., K.A. Gust, et al. 2012. A systems toxicology approach to elucidate the mechanisms involved in RDX species-specific sensitivity. Environ. Sci. Technol. 46: 7790–7798.

Yang, B. & T. Papoian. 2012. Tyrosine kinase inhibitor (TKI)-induced cardiotoxicity: approaches to narrow the gaps between preclinical safety evaluation and clinical outcome. J. Appl. Toxicol. 32: 945–951.

