News

EFSA reaffirms commitment to robustness, transparency and openness

2 Apr 2015

EFSA has issued a statement outlining its commitment to increasing the robustness, transparency and openness of scientific assessments.

Scientific assessments are evidence-based and demand rigorous methodologies to collect, evaluate and integrate scientific evidence, together with transparent and open communication of the processes and results of the assessment, EFSA said, noting that a structured and clearly documented approach is essential if the outcome of a scientific assessment is to be communicated unambiguously to decision makers, the wider scientific community and stakeholders. This helps to focus attention on key issues and allows assessments to be reproduced across expert groups and organisations.

Scientific advisory bodies recognise a need to improve the transparency and openness of scientific assessments in line with today’s normative and societal expectations, according to the organisation. Open scientific assessment can be defined as a decision-support process in which there is not only full transparency (showing what has been done and how it has been done) but also interaction with the wider scientific community and the outside world on the data, the methodologies used and the outcome. In this context, the framing of the scientific question posed by the requester (in most cases a decision maker or stakeholder) is important to ensure that the question represents the problem to be addressed and that it is agreed and clearly expressed before the assessment starts. Assessment methodologies, including methods to retrieve and analyse pertinent information, should ensure the assessment is fit for purpose and appropriately tailored to answer the question posed. This implies that different degrees of complexity and completeness, both of the data examined and of the methodologies applied, may be employed depending on the knowledge available on the topic, the timeframe and the resources prioritised for responding to the framed question. Consequently, applying the most extensive approaches in all cases may not be fit for purpose or appropriate, because of a lack of data or knowledge of the subject and/or limited time and resources. In other words, a comprehensive response that is given too late may be of no use to decision makers or risk managers.

The data and evidence needed for scientific assessments generally come from a multitude of sources and are rarely available in a straightforward manner. Like a jigsaw puzzle, EFSA said, different pieces of evidence need to be integrated to build an overall view: the so-called weight of evidence. A major challenge in this regard is how to apportion and weigh each piece within the assortment of evidence, as data may come from experimental studies in animals or cell-based assays, surveillance/epidemiological studies, computational models, or molecular tools (e.g. omics).
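
To make the weighing idea concrete, one very simple quantitative scheme assigns each line of evidence a judged reliability weight and a score for the direction and strength of what it shows, then combines them as a weighted average. This is only a minimal illustrative sketch: the function, the weights and the scores below are invented for illustration and are not EFSA’s actual weight-of-evidence methodology.

```python
def weigh_evidence(lines):
    """Combine lines of evidence as a weighted average score in [-1, 1]:
    +1 = strongly supports the effect, -1 = strongly contradicts it.
    Each element of `lines` is a (weight, score) pair, where the weight
    reflects judged reliability/relevance (all values illustrative)."""
    total_weight = sum(w for w, _ in lines)
    if total_weight == 0:
        raise ValueError("no usable evidence")
    return sum(w * s for w, s in lines) / total_weight

# Hypothetical lines of evidence (weight, score) - purely illustrative:
evidence = [
    (0.9, 0.8),   # high-quality animal study, clear effect observed
    (0.5, 0.3),   # in vitro assay, weak effect
    (0.7, -0.2),  # epidemiological study, no clear association
]
score = weigh_evidence(evidence)
```

Real weight-of-evidence frameworks would also document how each weight was justified; the point of the sketch is only that disparate pieces of evidence can be combined transparently once their weights are made explicit.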

When evaluating substances, scientists most often rely on tests in experimental animals (e.g. rats or mice), in vitro (cell-based) systems or computer-generated (‘in silico’) data, and extrapolate to the target species (e.g. humans). Since the tested and the target species may not share the same physiology or metabolism, the biological relevance of such data for a risk or benefit assessment in the target species should be carefully considered and discussed, EFSA believes. Furthermore, most substances may cause a variety of biological responses, ranging from homeostatic or adaptive to adverse or beneficial. The relevance of observed responses therefore needs expert interpretation and judgement as to whether they are important and meaningful enough for human, animal, plant or environmental health (EFSA, 2011). For example, in the assessment of substances that may interfere with endocrine systems, it is crucial to differentiate between responses considered to be within a ‘normal’ biological state and those of an adverse nature (EFSA, 2013).

It is important that decision makers understand the degree of confidence and uncertainty in a scientific assessment and the strength of the data on which it is based. Uncertainty has been defined by the World Health Organization as ‘imperfect knowledge concerning the present or future state of an organism, system, or subpopulation under consideration’ (WHO, 2004). Uncertainty analysis is a prerequisite for producing a transparent scientific assessment and involves identifying, describing and evaluating individual uncertainties on their own scale, as well as their combined effect on the assessment outcome. Developing a harmonised methodology for uncertainty analysis is a major challenge, since the characterisation of uncertainties may be qualitative or quantitative depending on the knowledge available and the purpose of the assessment.
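
A common quantitative approach to propagating input uncertainty through an assessment is Monte Carlo simulation. The sketch below assumes a toy dietary-exposure model (exposure = concentration × consumption / body weight); the distributions and parameters are invented for illustration and are not taken from any EFSA assessment.

```python
import random

random.seed(42)  # fixed seed so the illustrative run is reproducible

def simulate_exposure(n=10_000):
    """Propagate input uncertainty through a toy exposure model by
    Monte Carlo sampling. All distributions are illustrative."""
    exposures = []
    for _ in range(n):
        concentration = random.lognormvariate(0.0, 0.5)  # mg/kg food
        consumption = random.uniform(0.1, 0.5)           # kg food/day
        body_weight = random.gauss(70.0, 10.0)           # kg
        exposures.append(concentration * consumption / body_weight)
    exposures.sort()
    # Report the median and a 95% uncertainty interval (mg/kg bw/day)
    return (exposures[n // 2],
            exposures[int(0.025 * n)],
            exposures[int(0.975 * n)])

median, lo, hi = simulate_exposure()
```

Reporting an interval rather than a single number is one way an assessment can show decision makers how confident it is in its outcome, which is the kind of systematic, harmonised treatment of uncertainty the guidance aims at.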

EFSA notes that it has recently launched a number of activities to further contribute to producing more robust, transparent and open scientific assessments. In line with the Authority’s intention to further open its activities to wider scrutiny and participation, EFSA has recently published a discussion paper on Transformation to an ‘Open EFSA’ as a means of consulting on how it will achieve two strategic goals within the next five years. These are (i) to improve the overall quality of available information and data used for its outputs and (ii) to comply with normative and societal expectations of openness (EFSA, 2014).

In this context, EFSA has started the PROMETHEUS project (PROmoting METHods for Evidence Use in Scientific assessments) (2014-2016), which aims to further define the process and guiding principles for evidence use in scientific assessments and to critically evaluate the available methods for fulfilling these principles (e.g. collecting, validating and integrating evidence, ensuring transparency, data accessibility, etc.). This framework may identify the need for EFSA to define or refine specific methodologies. The PROMETHEUS project will lead to the publication of two scientific reports: (i) principles and process for dealing with data and evidence in scientific assessments (May 2015), and (ii) an analysis of methodologies applied by EFSA for evidence use in scientific assessments (October 2016). This second report will contribute to the coordination of scientific projects and activities aimed at developing, refining and implementing methods for supporting evidence use in scientific assessments.

Overall, the project will deliver a methodological umbrella for dealing with evidence across the whole scientific assessment landscape, under which principles, processes and tools are clearly defined. One of the key challenges with such a framework is to ensure that rigorous methodologies are used and further refined while keeping scientific assessments fit for purpose. Within this framework, EFSA will produce guidance documents (GDs) to develop harmonised methodologies and tools on weight of evidence, biological relevance and uncertainty analysis. The first two activities are just starting (March 2015), whereas the third, on uncertainty analysis, started in 2013; its draft GD will be subject to public consultation between May and June 2015.

The GD on weight of evidence aims to produce a methodological tool that will enable different strands of evidence to be combined in a consistent and transparent manner within scientific assessments. The GD should be finalised in summer 2017 and, as for all of EFSA’s horizontal GDs, will take into account any relevant work in this area within and outside Europe to ensure consistency and harmonisation of methods and terminologies. To encompass all areas of scientific assessment undertaken by EFSA, weight-of-evidence methodologies will be developed for chemical hazards (human and animal health, environmental risk assessment), biological hazards, GMOs, plant health, animal health and welfare, and human nutrition, using both qualitative and quantitative methods. The working group on weight of evidence will comprise expertise in areas including statistical analysis, systematic review and meta-analysis, frequentist and non-frequentist methods, conventional and new methods for hazard characterisation, and exposure modelling.

The GD on biological relevance will focus on scientific criteria to decide on the biological relevance of observed adverse or positive health effects (e.g. nature and size of the biological changes or differences) for the considered target species. These criteria will cover the various sectors of EFSA’s scientific work. Case studies covering the remit of all relevant EFSA Scientific Panels and the Scientific Committee will be developed to illustrate the application of these criteria. The finalisation of this activity is expected by spring 2017.

Finally, the development of a GD on uncertainty analysis is already underway and covers the identification, characterisation and documentation of uncertainties in scientific assessments using qualitative, semi-quantitative and quantitative (deterministic and probabilistic) approaches. It is intended that the GD will provide a toolbox of different methods to help EFSA’s Scientific Panels address uncertainty in a systematic and harmonised way. It will focus on uncertainties related to the various steps of the risk assessment, i.e. hazard identification, hazard characterisation, exposure assessment and risk characterisation, as well as the overall outcome of the assessment. The finalisation of this activity is expected by the end of 2016 and will include a testing phase to allow EFSA’s Panels to identify the methods that best suit their needs and to implement them in their scientific assessments, followed by an evaluation of their experiences.

Given the horizontal nature of these topics, and to build a common understanding and agreement, an interview with the chair of the Scientific Committee addressing them is co-published with this editorial, together with a technical report presenting the outcome of a targeted consultation of national and international scientific advisory bodies on these activities (EFSA, 2015). In addition, workshops will be organised throughout the timeline of these activities to enable scientific consultations with national and international scientific advisory bodies, including EFSA’s sister agencies, the European Commission non-food committees (the Scientific Committee on Consumer Safety (SCCS), the Scientific Committee on Emerging and Newly Identified Health Risks (SCENIHR) and the Scientific Committee on Health and Environmental Risks (SCHER)), national agencies and international bodies. Strong links between the related activities mentioned in this editorial will be established to ensure the consistency of the various scientific outputs with the overall objectives of this work and to avoid duplication, while keeping them fit for purpose. Additionally, the draft GDs resulting from each of these EFSA activities will be subject to a public consultation prior to their finalisation. Finally, an international workshop will be organised to facilitate the dissemination and understanding of the concepts and outcomes of these activities among interested parties in European and international organisations and stakeholders.