In March 2018, Informatics Europe published a report [1] on the main principles and criteria that should be followed when individual researchers in informatics (computer science) are evaluated for their research activity, addressing the specific characteristics of this area. This subsumes the evaluation of a specific piece of research, and it can often be generalised to university departments or research centres, since their research performance is largely determined by that of their individual researchers.
This report confirms the findings of the 2008 Informatics Europe report on the subject (Research Evaluation for Computer Science, Informatics Europe Report, eds. Bertrand Meyer, Christine Choppy, Jan van Leeuwen and Jørgen Staunstrup), and at the same time incorporates a number of new observations concerning the growing emphasis on collaborative, transparent, reproducible, and accessible research.
The conclusions of the report are summarised in nine recommendations addressed to members of evaluation committees and funding agencies:
1. Informatics is a discipline in its own right, combining mathematics, science, and engineering. Researcher evaluation must adapt to its specific characteristics.
2. A distinctive feature of publication in informatics is the importance of highly selective conferences. Journals have complementary advantages but do not necessarily carry more prestige. Publication models that couple conferences and journals, where the papers of a conference are published directly in a journal, are a growing trend that may bridge the current gap between these two forms of publishing.
3. Open archives and overlay journals are recent innovations in the informatics publication culture that make publications easier to track during evaluation.
4. To assess impact, artifacts such as software can be as important as publications. The evaluation of such artifacts, which is now performed by many conferences (often in the form of software competitions), should be encouraged and accepted as a standard component of research assessment. Advances that lead to commercial exploitation or adoption by industry or standard bodies also represent an important indicator of impact.
5. Open science and its research evaluation practices are highly relevant to informatics. Informatics has played a key enabling role in the open science revolution and should remain at its forefront.
6. Numerical measurements (such as citation and publication counts) must never be used as the sole evaluation instrument. They must be filtered through human interpretation, notably to avoid errors, and complemented by peer review and by the assessment of outputs other than publications. In particular, numerical measurements must not be used to compare researchers across scientific disciplines, or even across subfields of informatics.
7. The order in which a publication in informatics lists its authors is generally not significant and varies across subfields. In the absence of specific indications, it should not serve as a factor in the evaluation of researchers.
8. In assessing publications and citations, the use of public archives should be favoured. When ranking and benchmarking services provided by for-profit companies are used, compliance with open-access criteria is mandatory. Journal-based or journal-biased ranking services are inadequate for most of informatics and must not be used.
9. Any evaluation, especially a quantitative one, must be based on clear, published criteria. Furthermore, the assessment criteria must themselves undergo assessment and revision.
These recommendations are consistent with a recent joint statement by three national academies (Académie des Sciences, Leopoldina, and the Royal Society) on good practice in the evaluation of researchers and research programmes, which also provides recommendations on the selection, overload, and training of evaluators.
Links:
The final version of the Informatics Europe report is available on the Informatics Europe web site:
http://www.informatics-europe.org/publications.html
The direct download link is:
http://www.informatics-europe.org/component/phocadownload/category/10-reports.html?download=76:research-evaluation-2018
Reference:
[1] Floriana Esposito, Carlo Ghezzi, Manuel Hermenegildo, Hélène Kirchner and Luke Ong (eds.): "Informatics Research Evaluation", Informatics Europe Report, 2018.