A proposed additional assessment of research quality between research excellence frameworks, based on metrics such as citations rather than peer review, would not be seen as credible, according to one of the authors of a major government-commissioned report on the subject.
Despite that report concluding in July that it was “not currently feasible to assess the quality of research outputs using quantitative indicators alone”, the idea of a “mini REF” that uses metrics has nonetheless made it into the government’s Green Paper on higher education.
The paper suggests “making greater use of metrics and other measures to ‘refresh’ the REF results and capture emerging pockets of research excellence in between full peer review”.
Stephen Curry, a professor of structural biology at Imperial College London and one of the authors of the metrics report, said that he did not think an intermediate assessment based on metrics “would have the credibility and support of the community”.
The “real problem” with metrics was that “on their own they can’t be reliable because we can’t have enough data across all the disciplines”, he said, citing arts and humanities as an area where “the information just isn’t there” in terms of citation coverage.
The Green Paper, released on 6 November, is only a consultation document, and a spokesman for the Department for Business, Innovation and Skills said that it would await responses on the future shape of the REF. The Green Paper says that it will “consider” the findings of The Metric Tide.
Debate will now move towards how exactly metrics will be used in a new REF system. The 2014 REF did use metrics, but only in a small way. Fewer than a third of the subject panels requested citation statistics, and these data were generally used only where there was disagreement among the reviewers over quality.
In the natural sciences, citation metrics are more abundant, Professor Curry said, but running an intermediate assessment involving only some subjects would lead excluded disciplines to be seen as “second best”. And the more metrics were used, he said, the more universities would attempt to game them.
He added: “I would question whether you would need a mini REF. Does the research landscape really change in two to three years?”
James Wilsdon, professor of science and democracy at the University of Sussex and chair of the group that wrote The Metric Tide, shared Professor Curry’s caution.
“Having looked at the question of metrics in exhaustive detail…I for one, and my committee, are not persuaded that there’s an easy solution here in moving overall from a peer-review process to a metrics process,” he said.
But publishers, which sell a variety of metrics tools, have pushed for their inclusion in the assessment process.
Earlier this year, Nick Fowler, managing director of Elsevier’s research management division, argued in a presentation to the Higher Education Policy Institute that greater use of metrics could drive down costs, and that multiple metrics made gaming the system “very hard”.
POSTSCRIPT:
Print headline: Metrics-based mini REF ‘won’t be seen as credible’