
NPL signs San Francisco Declaration on Research Assessment (DORA)

Bajram Zeqiri FREng, NPL Head of Science, discusses NPL's decision to sign DORA.

8 minute read

NPL recently signed the San Francisco Declaration on Research Assessment, better known globally as DORA. I first came across DORA just prior to lockdown. Its principles on publication metrics, and on how we employ them to judge the quality of research and, by extension, the contribution of scientists and engineers, struck a chord with me. I made it a priority for my first year as NPL’s Head of Science to ensure NPL signed up. A paper describing the original rationale for establishing DORA, and the significant issues it was set up to address, was prepared and gained the appropriate internal approvals, and on 13 February 2024 our CEO Dr Peter Thompson signed DORA on behalf of NPL. This blog describes why we have taken this important step.

Over the last two decades, bibliometric or scientometric data using metrics such as the Journal Impact Factor (JIF) [1] and the h-index [2] have been employed in isolation to assess the quality of research articles and the contribution of individual scientists. There is a wealth of studies indicating that applying metrics in this way establishes a distorted incentive system that can have damaging consequences for the development of science. The JIF is seen and used as a proxy for research quality but is open to manipulation, whilst the drive to publish in journals with a high JIF has strongly influenced practices and behaviours. Journal editors can impose selection criteria designed to increase the number of citations, favouring, for example, submissions in hot topic areas over areas which might be slightly peripheral yet scientifically laudable, with the potential to break new ground. The view is that this can drive conservative, copy-cat research that stifles genuinely ground-breaking innovation. The system therefore encourages scientists to work in more populated areas, where the likelihood of garnering citations through referencing each other’s work is increased.

There are many ways the system can be “gamed”: hyper-authorship (the increasing trend for publications to have very many authors), self-citation, and publishing progress in a technical area as a series of small incremental developments rather than one definitive reference publication. There are a number of other significant downsides to the way these metrics have been applied within the academic world to guide decisions on funding, recruitment, tenure and promotion. There are also well-documented issues regarding their effect on research integrity, the increasing strain the metrics place on the scientific publication system, and their impact on diversity and inclusion. Their use has become so ingrained because it is undoubtedly much easier simply to compare numbers.
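
To make the JIF concrete: under its standard two-year definition, a journal’s JIF for year Y is the number of citations received in year Y by items the journal published in years Y-1 and Y-2, divided by the number of citable items published in those two years. The short Python sketch below, using entirely invented citation counts for a hypothetical journal, illustrates how a couple of highly cited articles can dominate that average:

    # Two-year Journal Impact Factor (standard definition):
    #   JIF(Y) = citations received in year Y by items published in
    #            years Y-1 and Y-2, divided by the number of citable
    #            items published in those two years.
    # All citation counts below are invented for illustration.
    from statistics import median

    citations_per_item = [120, 45, 6, 3, 2, 1, 1, 0, 0, 0]

    jif = sum(citations_per_item) / len(citations_per_item)
    print(f"JIF (a mean): {jif:.1f}")                                  # 17.8
    print(f"Median citations per item: {median(citations_per_item)}")  # 1.5

    # Two outliers pull the mean to 17.8 while the typical article
    # attracts one or two citations, so the JIF says little about
    # any individual paper in the journal.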

It is ironic that the authors of the original publications that introduced the JIF and the h-index warned of their misuse, particularly with regard to evaluating individuals. In terms of the JIF, Garfield stated that “there is a wide variation of citations from article to article within a single journal” [1], so that publication in a high-JIF journal is per se no guarantee that an article will have scientific impact or attract citations. Similarly, relating to the h-index, Hirsch importantly emphasised that “obviously, a single number can never give more than a rough approximation to an individual’s multi-faceted profile, and many other factors should be considered in combination in evaluating an individual” [2]. His original analysis concentrated on the field of physics, but Hirsch also noted large differences in the h-index within the biological sciences, i.e. a strong dependence on the technical field being considered. NPL’s core mission as a National Metrology Institute is metrology: the science of measurement, an area that traditionally does not attract a high level of citations, despite its critical importance in today’s world.
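
As an aside for readers unfamiliar with the calculation: per Hirsch’s definition [2], a researcher has index h if h of their papers have each been cited at least h times. A minimal Python sketch, using invented citation records, makes Hirsch’s caveat concrete, since one number compresses very different publication profiles:

    def h_index(citations):
        # Hirsch's definition [2]: the largest h such that the
        # author has h papers with at least h citations each.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation records: one landmark paper barely moves
    # the index, while steady mid-range output raises it.
    print(h_index([500, 4, 3, 2, 1]))       # -> 3
    print(h_index([10, 9, 8, 7, 6, 5, 4]))  # -> 5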

Research is a highly competitive endeavour. Evaluating the quality of research outputs, primarily peer-reviewed publications, plays a key role in defining the relevance and likely impact of the research, as well as the performance of scientists, in ways which are career-defining. Inappropriate application of assessment criteria can be damaging to the research profession, to the careers of young scholars, and to those who do not take traditional career paths. As a result, there has been significant global activity attempting to re-shape the research assessment landscape, moving it away from reliance on metrics alone towards a broader and more considered application of both quantitative and qualitative indicators. Arguably, the single most significant development was the establishment of the San Francisco Declaration on Research Assessment (DORA) (https://sfdora.org/), which focused on moving away from the use of the JIF. Since its inception a little over 10 years ago, DORA has been signed by over 21,500 individuals and 3,133 organisations in 166 countries. DORA recommends that the JIF is not used as a surrogate measure of the value of a research article or to make career-defining decisions. UKRI is classified on the DORA website as a Supporting Organisation. Research funding organisations such as Wellcome and CRUK increasingly expect organisations to adhere to the DORA principles in the way they assess research quality.

A number of themes run through the DORA recommendations. These are the need to:

  • eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations; 

  • assess research on its own merits rather than on the basis of the journal in which the research is published;  

  • capitalise on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles, and exploring new indicators of significance and impact). 

The DORA recommendations aimed at the “Institution” level, and therefore relevant to NPL, state that organisations need to:

  • be explicit about the criteria used to reach hiring, tenure, and promotion decisions, clearly highlighting, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published; 

  • for the purposes of research assessment, consider the value and impact of all research outputs (including datasets and software) in addition to research publications, and consider a broad range of impact measures, including qualitative indicators of research impact, such as influence on policy and practice. 

The final point is particularly important to NPL. We are an impact-driven organisation and must ensure research outputs and their influence are properly assessed in order to fully capture the valuable work we do. Since the introduction of the NPL Impact from Science initiative more than six years ago, we have already established an excellent framework for approaching research assessment that focuses on impact. This recognises that not all of our work will create impact through publications in peer-reviewed scientific journals, and that there are important alternative routes: good practice guides, training courses, industry newsletters, documentary standards and outreach are a few examples of the different types of output that need to be considered. In terms of what will change now that NPL is a signatory to DORA, it is my view that we start from a very good position, although guidance to managers, recruiters and promotion interviewers needs to be explicit: consider all types of output, and do not rely on bibliometrics alone.

So, what do my vital statistics look like? According to Web of Science (WoS), they are: an h-index of 25 and around 2,400 citations. WoS lists 146 published articles, although it includes in that number many other forms of written output, such as patents, editorials, conference proceedings and NPL Reports. I know I have had around 85 peer-reviewed papers published, and writing these papers has been something I have enjoyed throughout my career. Given these numbers, do metrics maketh the metrologist? I would undoubtedly have to say no, as I know my impact, and the contribution I have made as a scientist/engineer/metrologist, has additionally come in many other forms. Some of this has originated from the innovations and metrology reported in my publications that have fed into international standards underpinning the safe use of medical ultrasound worldwide. Part of the impact I have had relates to novel metrology-related products developed and licensed through UK SMEs, contributing to their growth. For the last two decades, this has enabled users worldwide to undertake better ultrasonic measurements, drawing on the advances in metrology that I and my team have generated. Have I had a paper published in Science or Nature? No. Would I like one? Well, I wouldn’t say no, but maybe I’ve been conditioned to think in this way? To be honest, I would rather see my science and engineering continue to turn into real-world impact, although I do recognise that this clearly constitutes a higher bar. Translating invention into innovation represents a much more difficult challenge: it can generally only be done over long timescales, and eventual impact realisation involves many factors, some of which may be out of my control.

Peer-reviewed publications represent an important metric for any organisation, and NPL is no different. Annually, we publish close to 400 peer-reviewed journal articles, a figure which has seen consistent growth over the last decade or so. As DORA acknowledges, the peer-reviewed journal paper will rightly remain the central output informing research assessment, but judging the quality, impact and relevance of research is inevitably nuanced. Across NPL, there are many instances of research that has absolutely changed the world by forming the basis of new metrology, yet where the number of citations gathered by the pioneering paper has been very modest and could never be taken as a measure of the quality and significance of the original manuscript. The number of NPL publications in journals such as Nature and Science has increased significantly over the last five years. The “bar” for publication in these journals is unquestionably high, and rightly so given the potentially much broader impact of the published research: each such publication should be lauded. However, achieving a publication in such a journal, or indeed any journal, can only ever be part of the story regarding the quality, significance and impact of the work described.

References 

[1] Garfield, E., 1955, “Citation indexes to science: a new dimension in documentation through association of ideas”, Science, 122, 108-111.

[2] Hirsch, J. E., 2005, “An index to quantify an individual’s scientific research output”, PNAS, 102(46), 16569-16572.

 

20 Mar 2024