OpenTox Blog
 

Integrating Predictive Toxicology Resources and Applications

Posted by Barry Hardy at Apr 25, 2010 01:55 PM |

This workshop will provide a forum for discussing the resource- and application-integration needs of today's predictive toxicology researchers. We will discuss issues and solutions for the more effective support, management and exploitation of predictive toxicology resources created by international research projects and solution developers. Discussions will include requirements and collaboration opportunities in the areas of interoperable resources for data, models and applications supported by public standards, interfaces, vocabularies and ontologies, as well as best-practice approaches in computer science and engineering. Stakeholder perspectives from industry, regulatory, and academic research viewpoints will be shared. Participants will engage in Knowledge Café discussions of user requirements in interdisciplinary science and integrated testing strategies, and will work to determine promising future strategies, directions and actions.

Pre-conference OpenTox workshop
of the AXLR8 2010 meeting

30th May 2010
to be held at the Avendi Hotel, Griebnitzsee, Potsdam, Germany

Brandenburg / Potsdam / Berlin Room

 

Link to Summary of Workshop Discussion

Download PDF slides for:

- the OpenTox Workshop Presentation

- Robert Kavlock, Director, National Center for Computational Toxicology (US EPA)

- Carl Westmoreland, Director, Science & Technology, Unilever Safety & Environmental Assurance Centre (Unilever)

- Michael Schwarz, Professor and Coordinator of ReProTect (University of Tuebingen)

- Emilio Benfenati, Professor and CAESAR Project Coordinator (Istituto di Ricerche Farmacologiche Mario Negri)

- Egon Willighagen, Project Leader, Bioclipse and the CDK (University of Uppsala)

- Jeffrey S. Wiseman, CEO (Pharmatrope)

Short Overview of Program:

Sunday 30 May 2010

12.00 Lunch (in hotel restaurant)

12.45 Introduction and Overview of Workshop Goals, Barry Hardy, Director, Community of Practice & Research Activities and OpenTox Project Coordinator (Douglas Connect)

12.50 Stakeholder Perspectives – Requirements and Solutions for Predictive Toxicology Resource Management and Applications
Perspectives on solutions for predictive toxicology resource management and applications will be presented by representatives of regulatory, industry, and research stakeholders, including the requirements for the support of interdisciplinary science, integrated testing strategies, safety assessment and the acceptance of alternative methods for purposes such as satisfying European REACH legislative requirements and US EPA prioritization. Perspectives will include the views and experiences of

Robert Kavlock, Director, National Center for Computational Toxicology (US EPA);

Carl Westmoreland, Director, Science & Technology, Unilever Safety & Environmental Assurance Centre (Unilever);

Michael Schwarz, Professor and Coordinator of ReProTect (University of Tuebingen);

Emilio Benfenati, Professor and CAESAR Project Coordinator (Istituto di Ricerche Farmacologiche Mario Negri);

Egon Willighagen, Project Leader, Bioclipse and the CDK (University of Uppsala);

Jeffrey S. Wiseman, CEO (Pharmatrope).

Please see detailed program below for more information.


15.00 Coffee Break

15.15 Satisfying Requirements for Predictive Toxicology Infrastructure, Barry Hardy (Douglas Connect)
User Requirements; Design Principles; Interoperability; Toxicology Vocabularies and Ontologies.

15.30 The OpenTox Framework Design, Stefan Kramer, Professor of Bioinformatics (Technical University of Munich)
Architecture; Components; Interfaces; Web Services; Extensibility.

15.50 Integrating Predictive Toxicology Application Demonstrations, Nina Jeliazkova, Technical Manager and Co-Owner (Ideaconsult), Christoph Helma, Owner (In Silico Toxicology) and Andreas Karwath, Research Assistant (Albert-Ludwigs University of Freiburg)
Use Cases, Demonstrations and Results for applications integrating multiple resources will be presented for:
(a) ToxCreate – creation of a predictive toxicology model and validation
(b) ToxPredict – prediction of toxicities for a chemical compound

16.20 Collaboration, Sustainability & Future Directions, Barry Hardy (Douglas Connect)
Solutions for supporting collaboration and innovation in predictive toxicology. Sustainability & Future Directions.

16.30 Knowledge Café Discussions
Groups will engage in small group discussions on issues, ideas, innovative strategies and ways forward.
17.30 Concluding Discussion and Actions
18.00 Close
18.10 Boat Trip from Hotel on Lakes
20.30 Dinner

Detailed Program

12.00 Lunch
A buffet lunch for workshop participants will take place in the restaurant at the Avendi Hotel.

12.45 Introduction and Overview of Workshop Goals, Barry Hardy, Director, Community of Practice & Research Activities and OpenTox Project Coordinator (Douglas Connect)
In this workshop we will discuss issues and solutions for the more effective support, management and exploitation of resources in predictive toxicology created by major international and national programs and projects. The critical importance of linked resources and their incorporation into applications will provide a guiding theme. Discussions will include collaboration opportunities in the areas of interoperable infrastructure and resources, public standards, interfaces, vocabularies and ontologies. Stakeholder perspectives from industry, regulatory, and chemical, biological and engineering research viewpoints will be shared. Participants will engage in Knowledge Café discussions of user requirements and will work to determine promising future strategies, directions and actions.

12.50 Stakeholder Perspectives - Solutions for Predictive Toxicology Resource Management
1. Learning By Doing – Phase I of the ToxCast Research Program, Robert Kavlock, Director, National Center for Computational Toxicology (US EPA)

In 2007, the US EPA embarked on a multi-year, multi-million dollar research program to develop and evaluate a new approach to prioritizing the toxicity testing of environmental chemicals. ToxCast was divided into three main phases of effort – a proof of concept, an expansion and confirmation, and finally, a reduction to practice. Phase I of the effort is now largely complete and results are being analyzed. This presentation will summarize a few key lessons learned along the way, all of them magnified by the scope of the assays being used and the numbers of chemicals being profiled. The first lesson was about data management. Because the data were coming from a number of very diverse sources, and were of a very diverse nature, a well-designed and implemented workflow strategy and database system was required. Accurate chemical information management and chemical structure annotation were deemed essential, involving the documentation of sample handling, and confirmation of the purity, salt form and stability of the chemicals. Standardized dictionaries, or ontologies, of terms were a critical need to ensure comparability of the animal bioassay data that were generated over the span of decades from multiple laboratories. The second lesson is that transparency is paramount. Data, both from the newer high-throughput screening methods and the traditional assays against which they are being compared, must be made available to the scientific community for independent analysis and interpretation. The ToxCast Data Analysis Summit of May 2009 was a major step in that direction, along with the posting of datasets on a public web site. Although the amount of data can be overwhelming, it is important to critically evaluate the individual results to detect nuances that could be missed with global analyses.
A third lesson is that researchers need to take a careful look at unexpected results and attempt to refute or confirm those using alternate technologies in order to avoid unsupported conclusions and discern artifacts.  Finally, it takes a true team effort to ensure that these various elements are appropriately addressed.  Paying attention to lessons learned helps ensure that the research will progress on schedule, will deliver interpretable results, and will convince the broader scientific and stakeholder communities of the value of the findings. This is an abstract of a proposed presentation and does not necessarily represent EPA policy.

2. New Perspectives on Safety: Insights from EPAA's Platform on Science, Carl Westmoreland, Director, Science & Technology, Unilever Safety & Environmental Assurance Centre (Unilever)
The European Partnership on Alternative Approaches to Animal Testing (EPAA) is a partnership between the European Commission and Industry that was created in November 2005 to promote and reinvigorate the development of novel approaches to toxicity testing. One of the platforms within EPAA (Platform on Science) focuses on the identification of the research needed for the 3Rs. One action within this group is to follow up the recommendations made at an EPAA workshop entitled ‘New Perspectives on Safety’ that was held in Brussels in 2008. At this workshop, the question was posed ‘Against the current landscape of expertise in biology and chemistry, and drawing upon recent developments in technology, what opportunities now exist to design alternative approaches to toxicity testing? – the goals being to improve our ability to characterise the potential of chemicals and drugs to cause adverse health effects while providing animal welfare benefits’. The Workshop was populated not only by experienced toxicologists, but also by eminent investigators from other disciplines who would not normally engage with toxicology issues.

Some of the principles discussed in the 2008 workshop and in subsequent follow-up activities have been:

o The grand challenge is how we can predict human toxicity without the use of experimental animals
o It is not about 'alternatives to animal tests' but alternative ways of doing safety assessment
o We are not looking for a 1:1 replacement of the current animal tests but for new ways of doing science
o How can new tests and approaches actually be used in making decisions about safety?

The key aspects to the EPAA’s approach to exploring novel approaches to safety assessment are:

o Involving scientists from disciplines previously unconnected with 'alternatives to animal testing' and exploring new ways of doing science
o A collaborative approach between academia (test developers), industry (test appliers), ECVAM (test validation) and regulators (users of test data)
o Exploring in more depth the potential of computational chemistry and systems biology on the one hand, and of stem cell technology on the other, to facilitate the development of new ways to characterise risks to human safety.

3. Michael Schwarz, Professor and Coordinator of ReProTect (University of Tuebingen)
Reproductive toxicity testing will have the highest impact on animal use and costs for regulatory safety testing under REACH. Integrated testing strategies using in vitro tests for the prediction of reproductive toxicity are urgently required. An integrated project called ReProTect (www.reprotect.eu) was started in 2004, funded by the European Commission within the 6th Framework Program for research, assembling 33 European partners from academia, SMEs, governmental institutions and industry. The project ended in December 2009. The aim of ReProTect was to develop or improve in vitro assays to provide detailed information on the hazard of compounds to the mammalian reproductive cycle. In ReProTect, in vitro systems able to assess chemicals' hazards in the areas of male/female fertility, implantation and prenatal development were developed. Furthermore, innovative methods including proteome analysis, QSARs, and microarray technologies have been implemented for specific endpoints. More than 20 different tests were newly developed or optimized. More than 150 peer-reviewed reproductive toxicants have been tested. Scientific information on the assumed mechanism of the various test chemicals has been collected, and SOPs for each of the tests have been produced. An independent statistical analysis for evaluating the intra/inter-laboratory variability of the tests has been conducted. A huge amount of data has been produced, comprising information on the experimental results of all assays along with information on the evaluation of test results (e.g. EC50 values and other relevant parameters). In addition, information on the toxicological in vivo profiles of the ~150 test chemicals has been recorded. We are currently working on designing and implementing a database in order to make the ReProTect datasets publicly available. At this early stage, we seek harmonization with other European projects and standardization of database formats.

4. Integrating Infrastructure Requirements for Chemistry Research in Predictive Toxicology, Emilio Benfenati, Professor and CAESAR Project Coordinator (Istituto di Ricerche Farmacologiche Mario Negri)
In the development of predictive toxicology models within the CAESAR project, our approach emphasized consideration of model reliability, ease of use and the REACH regulatory context. Not all chemical and toxicity data were accepted for model building; rather, data were integrated in a way that guarantees the quality of the model. A chemical perspective was quite important in order to define the structure format of the datasets.

Resource integration needs to be based on the overall purpose of the prediction and on the components available. The integration of different models should be addressed within this perspective and should properly take into account the available chemical information. In one approach, models are run in parallel and their outputs are then integrated, ideally in an intelligent way. A second possibility is to run tools in cascade; in this case too, the chemical information is taken into account to reach a proper decision.

Within CAESAR we integrated different tools at different levels, depending on the situation. For instance, the BCF model runs several models in parallel and then combines their results, while the mutagenicity model has several steps in a sequence, which evaluate the chemical features driving the process. A wise integration of data mining tools and chemical information can also provide an advantage in assessing model reliability. Six tools implemented within CAESAR, which may guide the evaluation of the reliability of the model for specific chemicals, including addressing the applicability domain, will be described.
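The parallel and cascade integration schemes described above can be sketched generically. The sketch below is illustrative only: the two "models", their descriptor inputs, the averaging rule and the 0.5 threshold are placeholder assumptions, not CAESAR's actual algorithms.

```python
from statistics import mean

# Placeholder "models": each maps a compound (here just a dict of
# descriptors) to a numeric prediction score.
def model_a(compound):
    return 0.4 * compound["logP"]

def model_b(compound):
    return 0.1 * compound["mw"] / 10

def parallel_integration(compound, models):
    """Run all models independently, then combine their outputs
    (here by simple averaging; a real scheme could weight models
    by reliability or applicability domain)."""
    return mean(m(compound) for m in models)

def cascade_integration(compound):
    """Run tools in sequence: an early chemical check drives whether
    the statistical models are consulted at all."""
    # Step 1: an illustrative structural-alert check
    if compound.get("has_alert_substructure"):
        return "predicted positive (structural alert)"
    # Step 2: alert-free compounds proceed to the combined models
    score = parallel_integration(compound, [model_a, model_b])
    return "positive" if score > 0.5 else "negative"

compound = {"logP": 2.0, "mw": 30.0, "has_alert_substructure": False}
print(cascade_integration(compound))
```

The design choice mirrors the text: parallel integration pools independent evidence, while the cascade lets cheap chemical knowledge short-circuit the statistical step.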

Chemistry requirements in predictive toxicology also include the need for carefully selected model compounds for the endpoints under study, supported by easily accessible, high-quality, publicly available curated data. Model compounds require the documented availability of well-characterised chemical samples accompanied by analytical quality control and standard operating procedures for handling.

5. Bioclipse - Life Science Application and Ontology Development in Cheminformatics and Bioinformatics, Egon Willighagen, Project Leader, Bioclipse and the CDK (University of Uppsala)
Bioclipse is a life sciences computation platform with knowledge about scientific data built in from the ground up. By integrating a wide range of Open Source software, a number of applications have recently been developed. The first uses an ontology to extract chemical data sets from the ChEMBL database. The second provides an environment to set up molecular structure data sets, assign activities, and calculate molecular descriptors using local and remote web services. The third addresses decision support, where molecular structures are compared against heuristic rules. Interpretation has been a particular focus: relevant substructures are highlighted in the molecular structure, and atoms are colored by their contribution to the predictions. Reflecting the distributed nature of Bioclipse, the decision support module makes use of both local and remote predictive models.

Recently, initial OpenTox API support was added, allowing, for example, the querying and extraction of data sets provided by OpenTox-capable databases. Interacting with OpenTox services from within Bioclipse embeds the OpenTox functionality in the wide range of other functionality implemented in Bioclipse, and the scripting functionality allows analyses using the OpenTox services to be easily shared with other scientists.

6. Adverse Events Data Mining Tools Designed for Integration with OpenTox, Jeffrey S. Wiseman, CEO (Pharmatrope)
There has been a concerted effort over the last decade to systematize the recording of adverse event data and enhance its utility for data mining and drug design. One result has been the US FDA's AERS database, which currently describes 8,000,000 drug-adverse event relationships that span 1,800 marketed drugs and 100,000 drug-adverse event associations. While the AERS is one of the most significant single sources of data on human toxicity that is currently available, extracting meaningful information from the AERS is problematic because of the inherently subjective nature of the reporting. Not all significant adverse events are reported, and the association of a drug and an adverse event reflects only co-occurrence, not necessarily cause and effect. Pharmatrope has undertaken to extract the statistically meaningful data from this dataset as a first step to establishing causal links between chemical structure and human toxicity, and ultimately prediction of toxicity for new drugs or other chemicals with high human exposure. Statistical analysis is applied not only to drug-event relationships, but also to event-event relationships and to drug substructure-event relationships.

In designing the commercial Pharmatrope product we have chosen to produce an infrastructure for data mining rather than a complete data mining software interface.  It is our philosophy that flexibility is a key factor in creative data mining and it is important to provide interoperability of data sources with data visualization and analysis tools.  This design is highly consistent with the OpenTox architecture, which can be regarded as a prototype for the next generation of data mining systems.  We will discuss how our adverse events models may be integrated with OpenTox and a project plan that utilizes the integrated systems to demonstrate a cause-and-effect link between chemical structure, biological mechanism, and human toxicity by working “backwards” from adverse event to biological mechanism. 
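Screening of drug-event associations of this kind is commonly done with disproportionality measures computed over report counts. The sketch below shows one widely used measure, the proportional reporting ratio (PRR); the function name, the counts and the 2x2 layout are illustrative assumptions, not Pharmatrope's actual method.

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR for one drug-event pair from a 2x2 table of report counts.

    a: reports mentioning the drug AND the event
    b: reports mentioning the drug, without the event
    c: reports without the drug, with the event
    d: reports with neither
    """
    rate_drug = a / (a + b)    # event rate among reports mentioning the drug
    rate_other = c / (c + d)   # event rate among all other reports
    return rate_drug / rate_other

# Illustrative counts: the event appears in 20 of 120 reports for the drug,
# but in only 50 of 10050 reports for all other drugs.
prr = proportional_reporting_ratio(20, 100, 50, 10000)
```

A PRR well above 1 (a threshold of 2 is conventional in pharmacovigilance screening) flags a drug-event pair for follow-up; as the abstract stresses, co-occurrence alone does not establish cause and effect.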

15.00 Coffee Break

15.15 Satisfying Requirements for Predictive Toxicology Infrastructure, Barry Hardy (Douglas Connect)
Progress on a well-engineered modernization of predictive toxicology information technology and interoperability between toxicology systems and resources is urgently required. Progress on interoperability requires a more coordinated approach on standards, progress on the development of public vocabularies and ontologies, and standardized computing interfaces. To address these needs the FP7-funded OpenTox Framework has been developed as an interoperable, extensible modern computing platform supporting international researcher needs in data management and integration, ontology, model building, validation and reporting. The design of the Framework has been guided by user requirements, OECD validation principles, REACH regulatory requirements, and best practices in computer science and engineering standards. Insights on user requirements in interdisciplinary science, industry application needs in product development and safety assessment, and the effective support of interdisciplinary R&D should guide further developments of sustainable approaches to infrastructure development and management.

15.30 The OpenTox Framework Design, Stefan Kramer, Professor of Bioinformatics (Technical University of Munich)
OpenTox has been designed as a platform-independent collection of components that interact via well-defined interfaces. The preferred form of communication between components is through web services. Initial research defined the essential components of the architecture, the approach to data access, schema and management, the use of controlled vocabularies and ontologies, web service and communications protocols, and the selection and integration of algorithms for predictive modelling. OpenTox provides applications to non-computational specialists, risk assessors, and toxicological experts, in addition to Application Programming Interfaces (APIs) for developers to build new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals and to maximize involvement of different stakeholder groups in developments in as timely a manner as possible. The OpenTox Framework currently includes, with its initial APIs, services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting, which may be combined into multiple applications satisfying a variety of different user needs. Extensions supporting new R&D activity and requirements in chemistry, in vitro assay development and computational and systems biology are currently under evaluation and development. OpenTox applications are based on a set of distributed, interoperable, extensible OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of similar and/or complementary data coming from different datasets into a unifying structure having a shared terminology and meaning.
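Because the services above follow REST conventions, every resource is addressable by a URI. A minimal sketch of how a client might compose such URIs is shown below; the base address, service names and identifiers are hypothetical placeholders, and the published OpenTox API should be consulted for the actual endpoints.

```python
from urllib.parse import urljoin

class OpenToxClient:
    """Compose resource URIs for an OpenTox-style REST service.

    The service root used here is a placeholder, not a real deployment.
    """
    SERVICES = {"compound", "dataset", "feature", "algorithm",
                "model", "task", "validation"}

    def __init__(self, base="https://opentox.example.org/"):
        self.base = base

    def uri(self, service, resource_id=None):
        """Return the URI of a service collection or a single resource."""
        if service not in self.SERVICES:
            raise ValueError(f"unknown service: {service}")
        path = service if resource_id is None else f"{service}/{resource_id}"
        return urljoin(self.base, path)

client = OpenToxClient()
# e.g. the URI of dataset 42 on this (hypothetical) service:
print(client.uri("dataset", 42))  # https://opentox.example.org/dataset/42
```

Keeping every dataset, model and task behind a plain URI is what lets independently deployed services be combined into applications such as ToxPredict and ToxCreate.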

15.50 Integrating Predictive Toxicology Application Demonstrations, Nina Jeliazkova, Technical Manager and Co-Owner (Ideaconsult), Christoph Helma, Owner (In Silico Toxicology) and Andreas Karwath, Research Assistant (Albert-Ludwigs University of Freiburg)
Two OpenTox-based applications, ToxCreate for model building and validation, and ToxPredict for model prediction of toxicity endpoints for chemical structures, will be presented. The applications involve the combination of distributed multiple predictive toxicology resources.

16.20 Collaboration, Sustainability & Future Directions, Barry Hardy (Douglas Connect)
Solutions for supporting collaboration and innovation in predictive toxicology. Sustainability & Future Directions.

16.30 Knowledge Café Discussions
Groups will engage in small group discussions on issues, ideas, innovative strategies and ways forward.  The following topics will be discussed:

1) Requirements for Biological R&D

Moderated by Juergen Hescheler (University of Cologne)

What are the key infrastructure requirements for supporting biological research in predictive toxicology?  What are the current gaps and how could they be best filled?
2) Requirements for Chemical R&D

Moderated by Emilio Benfenati (Mario Negri Institute of Pharmacological Research)

What are the key infrastructure requirements for supporting chemical research in predictive toxicology?  What are the current gaps and how could they be best filled?

3) Requirements for Computational R&D

Moderated by Jeff Wiseman (Pharmatrope)

What are the key infrastructure requirements for supporting computational research in predictive toxicology?  What are the current gaps and how could they be best filled?

4) Collaboration & Interoperability

Moderated by Barry Hardy (Douglas Connect)

What are the key collaboration opportunities for achieving improved interoperability between predictive toxicology resources?  What are the key requirements for a common vocabulary and ontology? What concrete steps could be taken?

5) Integrated Testing Strategies

Moderated by Erwin Roggen (Novozymes)

What are the key requirements for infrastructure supporting integrated testing strategies?  What are the critical current gaps?  How do we best go about closing those gaps? 

6) Validation & Acceptance

Moderated by Horst Spielmann (Free University of Berlin)

What are the key requirements for infrastructure supporting advancing methods to achieve validation and acceptance success?  What are the critical current gaps?  How do we best go about closing those gaps? 

17.30 Concluding Discussion and Actions
The workshop session will be completed with a concluding discussion and summary of main recommendations and actions moving forward.

18.00 Close

18.10 Boat Trip from Hotel on Lakes
A boat trip around the surrounding lakes will leave from the Avendi Hotel.

20.30 Dinner
Evening dinner will take place at the Avendi Hotel after return from the boat trip.

Participants: 


Alexey Lagunin, OpenTox Vocabulary Resource Manager, IBMC, Russian Acad. of Med Sciences
André Guillouzo, Professor and Director of INSERM U991 Unit, Université de Rennes
Andreas Bender, Lecturer for Molecular Informatics, University of Cambridge
Andreas Karwath, OpenTox Validation & Reporting Service Manager, Albert-Ludwigs University of Freiburg
Barry Hardy, Director, Community of Practice & Research Activities & Coordinator of OpenTox, Douglas Connect
Bart van der Burg, Coordinator of ChemScreen, BioDetection Systems
Carl-Fredrik Mandenius, Professor and Coordinator of Invitroheart, Linköpings University
Carl Westmoreland, Director, Science & Technology, Unilever Safety & Environmental Assurance Centre, Unilever
Christoph Helma, Founder & Manager, In Silico Toxicology
Clemens Wittwehr, Competence Group Leader in JRC/IHCP/Systems Toxicology, EC DG Joint Research Center
Egon Willighagen, Postdoctoral Research Associate, University of Uppsala
Emilio Benfenati, Professor and Coordinator of CAESAR, Mario Negri Inst. of Pharm. Research
Emily McIvor, EU Policy Advisor, Humane Society International
Erica Toft, Coordinator of ForInViTox, InViToPharma, Expertrådet
Erwin L Roggen, Science Manager and Coordinator of Sens-it-iv, Novozymes A/S
Flavio Maria Zucco, Senior Scientist, Instit. of Neurobiology & Mol. Med, CNR
Hajime Kojima, Director, JaCVAM
Horst Spielmann, Professor for Regulatory Toxicology, Free University of Berlin
Jeffrey Wiseman, CEO, Pharmatrope
Jos Kleinjans, Professor and Coordinator of carcinoGENOMICS, Maastricht University
Juergen Buesing, Scientific Project Officer, Alternative Testing Methods, European Commission
Juergen Hescheler, Professor and Coordinator of ESNATS, University of Cologne
Julia Fentem, Head, Unilever's Safety & Environmental Assurance Centre
Kim Boekelheide, Professor of Pathology & Lab Medicine, Brown University
Lothar Terfloth, Scientist, Molecular Networks
Luchesar Iliev, Technical Developer, Ideaconsult
Marc Weimer, Biostatistician, German Cancer Research Center
Michael Schwarz, Professor and Coordinator of ReProTect, University of Tuebingen
Monika Schaefer-Korting, Vice-President, Free University of Berlin
Nathalie Alépée, Scientific Coordinator, Alternative Methods, L'Oréal
Nina Jeliazkova, Founder and Co-Director, Ideaconsult
Peter Hansen, Professor, Technical University of Berlin
Robert Kavlock, Director, National Center for Computational Toxicology, US EPA
Roland Grafstrom, Professor, Karolinska Institute
Roman Affentranger, Research Activity Coordinator, Douglas Connect
Stefan Kramer, Professor of Bioinformatics, Technical University of Munich
Stephen Bryant, Senior Investigator and Coordinator of PubChem, NCBI
Sylvia Escher, Group Manager Structure Activity Relationships, Fraunhofer Institute
Tobias Girschick, OpenTox Algorithm Service Developer, Technical University of Munich
Troy Seidle, Director, Research & Toxicology Department, Humane Society International
Tuula Heinonen, Research Director & Coordinator of FICAM, University of Tampere
Vedrin Jeliazkov, Founder and Co-Director, Ideaconsult
Vivian Kral, AXLR8 Administration Manager, Free University of Berlin

Contact: 

Please contact Workshop Chair and OpenTox Coordinator Barry Hardy (Barry.Hardy -(at)- douglasconnect.com) with any questions related to the workshop or its outcomes.
