Is it better to be rich or well thought of? That, in some sense, is the question facing heads of research when contemplating their strategies for the research excellence framework.
According to Giles Carden, director of strategic planning and analytics at the University of Warwick, research-intensive institutions in particular are faced with a choice between seeking to maximise their grade point averages – often seen as a measure of quality – and maximising the volume of researchers they submit.
“I visualise this as a see-saw where you need to achieve the right balance,” he says.
Maximising volume also maximises a university’s share of the research funding distributed on the basis of the REF (known in England as quality-related funding). Of course, the formulas used by the UK funding councils to determine the distribution also take quality into account, and analysis by Times Higher Education reveals that English universities’ grade point averages in the REF correlate reasonably closely with their post-REF QR allocations, announced in March.
Statistically, the correlation coefficient is 0.5, where 1 represents a perfect correlation. However, changes in GPA between the 2008 research assessment exercise and the 2014 REF correlate only very weakly with funding changes pre- and post-REF.
Meanwhile, the correlation between funding levels and submission volume is a very high 0.96, and the correlation between volume changes and funding changes is also high. This reflects the fact that universities vary far more in size than they do in GPA.
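A coefficient like the 0.96 cited above is a Pearson correlation. The sketch below shows how such a figure is computed; the submission volumes and funding amounts are hypothetical, purely for illustration, and are not THE's data.

```python
# Illustrative only: Pearson correlation between hypothetical FTE
# submission volumes and QR funding allocations. A coefficient near 1
# means the two series rise and fall almost in lockstep.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical figures: FTE staff submitted, QR funding (£m)
volumes = [120, 450, 800, 1500, 2200]
funding = [3.1, 11.8, 20.5, 41.0, 57.3]

print(round(pearson(volumes, funding), 2))
```

Because QR allocations scale roughly with volume, a toy dataset like this one yields a coefficient close to 1, mirroring the pattern THE found in the real figures.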
But a less selective submission is likely to suppress a university’s GPA, resulting in reputational damage that could have indirect effects on other funding sources. For this reason, one observer, who did not want to be named, says that his institution chose to pursue a GPA maximisation strategy for the 2014 REF.
“Our GPA did improve, but our league table position dropped because others had improved more,” he says. “If we’d taken a more inclusive approach, we’d not have got a lot, if any, more funding, and we’d certainly have done worse on GPA. So for us it was effectively a defensive tactic.”
GPA maximisation tactics appear to have had mild success. There is a correlation of 0.27 between submitting a lower proportion of staff in 2014 and a rise in GPA. However, the observer notes that “the credibility factor” is also important to reputation.
“Results that are obviously the result of extreme selectivity will often simply be dismissed, including in public fora,” he says.
In general, universities do not appear to have been significantly more selective in 2014 than in 2008. No definitive figures were published in 2008 for eligible staff numbers, but analysis of indicative figures suggests that 59 per cent of eligible staff were submitted in 2008, compared with 55 per cent in 2014. It is also worth noting that the correlation between the percentage of staff submitted by universities in 2008 and 2014 is a very high 0.86, indicating that, for all their strategic agonies, institutions took largely the same approach in both years.
That research-intensive universities typically not only score highly on GPA but also submit high volumes and proportions of researchers is reflected in relatively strong correlations between each institution’s proportion of eligible staff submitted in 2014 and its GPA, its submission volume and its QR funding in 2015-16 (see graph).
Measured steps
According to Carden, for a research-intensive institution, “cutting deep is never going to achieve the right outcome and is rather divisive”. The effectiveness of Warwick’s relatively inclusive strategy (it submitted 83 per cent of eligible staff), he says, is reflected in its 5 per cent increase in QR funding in 2015-16 and its rise of one place up the GPA rankings to joint eighth, in “a steep part of the competitive gradient”.
Universities that constrain their volume expect to take a funding hit. However, there is only a very low correlation between differences in the proportions of eligible staff submitted in 2008 and 2014 and changes in funding levels. This also suggests that attempts to increase income by submitting higher proportions of staff largely failed.
One way to measure REF success is to look at QR income per researcher submitted to the REF. However, the prominence of biomedical institutions in that ranking reflects the fact that biomedicine both scored highly in the REF and attracts a lot of funding.
Interestingly, the University of Manchester, which will experience the biggest reduction by far in its QR income for 2015-16, has also netted a rise of just over £900 in its per-FTE staff income.
Top 10 in QR funding per staff member submitted to REF

| Institution | QR funding (£) per FTE staff submitted 2015-16 | QR funding (£) per FTE staff eligible 2015-16 | % of eligible FTE staff submitted 2015-16 |
|---|---|---|---|
| Institute of Cancer Research | 154,555 | 147,400 | 95 |
| Liverpool School of Tropical Medicine | 142,162 | 45,233 | 32 |
| Imperial College London | 83,520 | 76,743 | 92 |
| University of Cambridge | 59,425 | 56,502 | 95 |
| London School of Hygiene and Tropical Medicine | 58,866 | 48,771 | 83 |
| University of Oxford | 58,596 | 50,868 | 87 |
| St George’s, University of London | 55,178 | 28,611 | 52 |
| Cranfield University | 54,041 | 20,009 | 37 |
| University College London | 52,966 | 48,367 | 91 |
| Courtauld Institute of Art | 50,456 | 48,972 | 96 |
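The two funding columns in the table are linked by simple arithmetic: funding per eligible FTE equals funding per submitted FTE multiplied by the share of eligible staff submitted. The sketch below checks that relationship against three rows of the table; because the percentages are rounded to whole numbers, a small tolerance is allowed.

```python
# Check the arithmetic behind the table: per-eligible funding should
# equal per-submitted funding times the fraction of eligible staff
# submitted. Figures are from the table above; percentages are rounded,
# so estimates agree only to within about 1 per cent.
rows = {
    # name: (£ per FTE submitted, £ per FTE eligible, % submitted)
    "Institute of Cancer Research": (154_555, 147_400, 95),
    "Liverpool School of Tropical Medicine": (142_162, 45_233, 32),
    "Imperial College London": (83_520, 76_743, 92),
}

for name, (per_submitted, per_eligible, pct) in rows.items():
    estimate = per_submitted * pct / 100
    relative_error = abs(estimate - per_eligible) / per_eligible
    assert relative_error < 0.02, name
    print(f"{name}: estimated £{estimate:,.0f} vs reported £{per_eligible:,}")
```

This also explains why highly selective submitters such as the Liverpool School of Tropical Medicine (32 per cent submitted) show a large gap between the two columns, while near-universal submitters such as the Courtauld (96 per cent) show almost none.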