
‘Avoid all-metric approach to REF’, says review

Bibliometric-led evaluation of research outputs unable to replicate ‘richness’ of peer-review model

December 12, 2022

A review commissioned by the UK’s main research funders has cautioned against moving to a fully metricised system for the next Research Excellence Framework (REF) but says additional “value-led indicators” on culture and people should be included in future assessments.

Recommending that funders should “avoid an all-metric approach” to the next REF, the review suggests it is “unlikely” that a bibliometric-led evaluation of the sector’s research outputs “will deliver what the research community, government and stakeholders need from the exercise”.

Dispensing with peer review – on which the national audit of UK research, held every seven years, has largely relied since its inception in the 1980s – would be particularly unwise for the assessment of research impact because the “available indicators or infrastructure cannot approximate the richness of the current case study format”, concludes the review, which was designed to provide a “short, sharp, evidence-informed look” at the current and potential uses of metrics in research assessment.




The study, titled “Harnessing the Metric Tide” and published on 12 December, is a follow-up to the last major look at research metrics, 2015’s The Metric Tide, which concluded that it was “not currently feasible to assess the quality of research outputs using quantitative indicators alone”.

At present, the REF’s use of bibliometrics is limited; quantitative data, typically citation counts, can be requested by sub-panels, while statistics can also be included in impact or environment statements to illustrate the comparative strength of research outputs. However, concerns over the cost and bureaucratic burden of the REF, which is used to guide the allocation of about £2 billion in annual research grants, have led to calls for the greater use of citations, which can often reflect the quality and influence of research outputs.

The new report – led by Metric Tide authors James Wilsdon, Digital Science professor of research policy at the University of Sheffield, and Stephen Curry, assistant provost for equality, diversity and inclusion at Imperial College London, as well as Elizabeth Gadd, research policy manager at Loughborough University – acknowledges that “there may be more opportunities to use metrics” given analytical advances since 2015, but is sceptical about how far citations and other bibliometric data can be used, at least in isolation.

“It is rare, particularly in higher education institutions, for the limitations of bibliometrics to be considered so flawed as to lead to the cessation of bibliometric analysis altogether, despite this being the more rigorous course of action in some circumstances,” reflects the study, which draws attention to the debate over the “appropriate place of bibliometrics in research evaluation practices … including the question as to whether they have any place at all”.

Instead of increasing the use of bibliometrics – which advocates believe would ease the burden on REF reviewers and cut costs while leading to virtually identical league tables and allocations in many subjects – the review recommends enhancing environment statements, which “should be given greater weight” than the 15 per cent of a university’s score in the 2021 REF, principally by including new so-called “data for good” indicators around people and culture.

These “value-led indicators” might include statistics on gender and ethnicity pay gaps among research staff, the percentage of research staff on short-term contracts, open research indicators, information on staff’s peer-review activities and the volume of research leave taken.

“These ‘data for good’ could then enhance existing Hesa [Higher Education Statistics Agency] data and be used as an input to the people and culture aspects of future assessments,” suggests the report.

The report’s conclusions will feed into the broader review of the REF, the Future Research Assessment Programme (FRAP), being chaired by Sir Peter Gluckman, New Zealand’s former chief science adviser, which was expected to report back in late spring 2023.

More broadly, the report also calls for the REF to be renamed the Research Qualities Framework as part of efforts to reduce competition among researchers and encourage more collaboration.

The new moniker would “replace the contested and ill-defined term ‘excellence’ with an alternative that reflects plural dimensions of research quality and impact, and encompasses processes, cultures and behaviours”, says the report, which also calls for more awareness and discussion about responsible research assessment within UK universities.

Welcoming the findings of the new study, Research England’s executive chair, Dame Jessica Corner, said it was “imperative that any changes [to the REF] are built on sound evidence and close consultation with the sector”.

jack.grove@timeshighereducation.com


Reader's comments (2)

Whatever you do, game players will game play with their 20 per cent or part-time hired hacks to give them their publications and grants to submit for the REF. Of course, who cares how much these people are paid for this game; it is the public that finally foots the bill. These academic mercenaries are the shame of academia. No academic should be returned to the REF unless they have had a full REF-cycle relationship with the institution and on contracts of not less than 80 per cent FTE (if not 100 per cent FTE). If the institution they are being returned from is not their primary institution, then further details of active engagement should be sought by the REF panel before their outputs are considered. The REF panel might also want to disclose the names of the researchers who are returned, so their home institutions know that they are doing this.
"renamed the Research Qualities Framework as part of efforts to reduce competition among researchers and encourage more collaboration." One word: LOL