Metric analysis

REF 2021: The metric tide is rising again

REF is ripe for radical change, say Stephen Curry, Elizabeth Gadd and James Wilsdon

As we continue to digest last week’s headlines from the Research Excellence Framework (REF) – and with more to come as detailed panel reports and impact case studies emerge over coming weeks – attention is already turning to the scope and design of the next evaluation cycle.

As in 2008 and 2014, the possibility of a simpler and cheaper process that relies on readily available metrics is being offered as an alternative to an exercise widely recognized as having become too cumbersome.

It is worth remembering that anyone under 60 working in UK universities is part of a system shaped by successive waves of national research assessment, dating back to the first research selectivity exercise in 1986. Over eight cycles, it has grown into a highly complex 'evaluation machine', to use a term coined by the political scientist Peter Dahler-Larsen.

This machine is both admired – and seen by some as a model to emulate – as a fair and responsible basis on which to determine the annual allocation of around £2bn of quality-related (QR) funding, and contested as a source of bureaucracy, competition and conformity.

It is therefore right that the designers and users of the REF remain attentive to the potential of new technologies and other innovations to improve, revive or rationalize its operations. When Tim Berners-Lee invented the World Wide Web, the UK was already completing its second assessment cycle. Since then, advances in ICT, data science, scientometrics and related fields have transformed the possibilities and practices of measurement and management, and research assessment has evolved alongside them.

Many see machine learning and artificial intelligence as the latest general-purpose technologies capable of boosting productivity and transforming work practices in many industries, including research. There have been calls to integrate these technologies into the REF.

Catch the wave

Over the decades, the culture and management of UK academic research have become so deeply melded with assessment mechanisms that reform is difficult. Seen from a distance, unpicking it all may seem simple; up close, all you see is a spaghetti of interdependencies and connections.

That said, various factors are now lining up to support a more radical overhaul of the exercise than at any time in recent years.

Public R&D spending is expected to increase through to 2025. There is potential for more strategic integration between QR and other funding streams through UK Research and Innovation (UKRI) structures, combined with heightened urgency around research culture, impact, and diversity and inclusion. And there is already a strong push to reduce bureaucracy, through Adam Tickell’s ongoing review and UKRI’s drive for ‘simpler and better’ funding.

Now is the time to look openly and creatively at how we can simplify and improve the REF. The Future Research Assessment Programme (FRAP), which the UK’s research funding bodies launched in 2020, is admirable in its scope and intent to do just that. Several strands of evaluation and analysis are already underway.

As the latest addition to this mix, Research England is announcing today that it has asked the three of us to lead an updated review of the potential roles of metrics in the UK research assessment system.

Short and sharp

The Metric Tide Revisited will take a short, sharp and well-informed look at current and potential uses of metrics, with four well-defined objectives:

  • To revisit the conclusions and recommendations of the last review of these issues – The Metric Tide, which two of us co-authored in 2015 – and assess progress against them;
  • To consider whether recent developments in the infrastructure, methodologies and uses of research metrics negate or modify any of the 2015 findings, or suggest additional priorities;
  • To re-examine the role of metrics in any future REF, and to determine whether the design changes being considered by FRAP suggest similar or different conclusions to those reached in 2015;
  • To offer up-to-date advice to UKRI and the higher education funding bodies on the most effective ways to support and encourage responsible research assessment and the use of metrics.

This will be a quick review, concluding in September 2022. The original Metric Tide was underpinned by extensive evidence-gathering and consultation, and there is no need to repeat all of that from scratch.

We have also seen welcome progress on these agendas since 2015, led by the Declaration on Research Assessment (DORA); by institutions adopting their own policies for responsible metrics and evaluation; and through further advice at the international level from bodies such as the International Network of Research Management Societies, Science Europe, Unesco and the Global Research Council.

We will organize roundtables in June and July to invite formal input from experts and stakeholder groups. These will include researchers across disciplines and career stages; scientometricians; metrics providers; university leaders and research managers; publishers; librarians; learned societies; research funders; and infrastructure providers. We will also work with the Forum for Responsible Research Metrics – itself created on a recommendation of The Metric Tide – as a source of informal advice.

More than anything, as a team, we care passionately about improving research cultures and providing the evidence and answers that FRAP and the wider community need. We know how vital it is to get assessment systems right; how the REF’s objectives and priorities must be weighed in any choice of technologies, methods and applications; and how any proposal for REF reform must take account of the experiences and ideas of users and the expectations of stakeholders.

The various strands of FRAP, including ours, will be brought together in the autumn. It will then be up to ministers to decide how radical to be. We are quietly optimistic about the prospects for positive change.

Stephen Curry is Professor of Structural Biology and Deputy Provost for Equality, Diversity and Inclusion at Imperial College London, and Chair of the Steering Committee of the Declaration on Research Assessment. He was a co-author of The Metric Tide.

Elizabeth Gadd is Head of Research Policy at Loughborough University and Chair of the Research Evaluation Group of the International Network of Research Management Societies.

James Wilsdon is Digital Science Professor of Research Policy at the University of Sheffield and Director of the Research on Research Institute (RoRI). He chaired The Metric Tide review and is a founding member of the Forum for Responsible Research Metrics.

A version of this article also appeared in Research Fortnight.