
REF 2014: results by subject

Were new research stars born? Who was helped or hurt by impact? Times Higher Education dissects the results by discipline

December 18, 2014

Source: Miles Cole

Download full REF 2014 results by subject


It has taken years of toil and turmoil, but a new landscape of research excellence emerged this week following the grading of thousands of research outputs and impact case studies at an estimated cost of tens of millions of pounds.

After more than a year of nail-biting since last November’s submission deadline, the results of the 2014 research excellence framework are finally published today.

During that interlude, 36 subpanels have pored over more than 190,000 outputs and nearly 7,000 impact case studies written by more than 52,000 full-time equivalent staff at 154 higher education institutions.

Their verdicts will inform the allocation of quality-related research funding – currently about £1.6 billion a year in England alone – for the next six years, according to a formula to be announced in March. Arguably just as importantly, the results will also determine the institutional and departmental pecking orders – at least in research – for that period.


Times Higher Education’s analysis of the REF results by subject reveals that the University of Oxford is likely to see the liveliest departmental Christmas parties this year: its submissions have the highest grade point average in 10 of the REF’s 36 units of assessment (see methodology box, below, for an explanation of how GPA is calculated). This compares with the University of Cambridge’s four and the London School of Economics’ three. Imperial College London, the top-ranked multidisciplinary institution by overall GPA, is not ranked first in any subject, but scores consistently highly across all units. By contrast, Oxford ranks relatively low in some units, such as theology and religious studies (12th), English language and literature (13th) and, most strikingly, modern languages and linguistics (32nd).

In 2008, Cambridge departments appeared most frequently at the top of the rankings, heading or co-heading 19 of the 67 units of assessment. Oxford headed or co-headed 11.


When judged solely on outputs – the papers, chapters and monographs submitted for assessment in the REF – Oxford comes top in six units, Cambridge and the University of Warwick in four, the LSE in three and Imperial in two.

The range of institutions that can claim to be top for impact – the new element of the 2014 exercise, counting for 20 per cent of total scores – is much greater. It is not uncommon for several institutions in a unit to score a maximum GPA of 4, indicating that all of their submission is deemed “outstanding”. And while predictions that impact could disproportionately favour post-92 institutions have not been borne out, the subject tables reveal that the life sciences have scored very highly on impact, as predicted. Ranked by GPA for impact, the six units of assessment covering the life sciences – headed by clinical medicine – all cluster at the top, followed by most of the physical sciences, interspersed with some social sciences. The arts and humanities mostly achieve lower scores.

Arguably this confirms those disciplines’ fears about their ability to demonstrate impact. However, two of the most nervous disciplines – mathematics and philosophy – avoid the bottom (they are ranked 24th and 32nd respectively), while another, Classics, is the top non-science subject for impact, ranking eighth.

Based on quality of outputs alone, which accounts for 65 per cent of overall scores, the ranking of units of assessment is headed by four physical sciences: chemistry; physics; electrical and electronic engineering, metallurgy and materials; and mathematical sciences. Clinical medicine is only joint eighth, alongside the highest ranked humanities subject, Classics. The highest ranked social science, economics and econometrics, is placed sixth, but most social sciences languish towards the bottom of the ranking. The lowest output GPAs are recorded by anthropology and development studies, and by art and design.

Ranked according to their GPA scores in the “environment” category of assessment, which counts for 15 per cent of overall scores, the social sciences and humanities sit near the bottom, with the life sciences clustering towards the top. Four of the six units of assessment that have seen their overall GPA rise the most in 2014 are social sciences, but many units of assessment – including most in the life sciences – are not comparable to 2008 units.

All this means that the life and physical sciences typically score higher than the humanities and social sciences on overall GPA (see national profiles). On this all-round measure, chemistry and clinical medicine are tied at the top, followed by biological sciences; public health, health services and primary care; and physics. The highest ranked humanities subject, in 10th, is Classics, and the top-ranked social science is economics and econometrics, in 16th. This is four places lower than the lowest ranked life science: agriculture, veterinary and food science. The unit for sport and exercise sciences, leisure and tourism has the lowest overall GPA, 2.83.

The lowest ranked science, meanwhile, is computer science and informatics, in joint 29th place. It is joint 19th on outputs but is pulled down by rock-bottom scores in both environment and impact. However, Steve Furber, chair of the subpanel and ICL professor of computer engineering at the University of Manchester, was impressed by the quality of impact case studies he saw, which “really conveyed a strong message that there is work with impact going on right across the sector – not just where you would expect to find it in the high-end institutions”.

He is also sceptical that units of assessment can be meaningfully compared: “There was a calibration process for ensuring reasonable comparability between subjects, but it is a bit like calibrating chalk against cheese. Each subpanel evolved its own internal culture, then applied it fairly consistently, but ensuring consistency between subpanels is much harder.”


Dame Ann Dowling, chair of main panel B, which covers engineering and the physical sciences, and president of the Royal Academy of Engineering, believes there is “good comparability” across subjects under a single main panel, but efforts to ensure comparability across main panels were more limited and should be beefed up next time – especially around impact, where judgements are likely to be “less subject-specific” than they are regarding outputs.

Bruce Brown, chair of main panel D, which covers the arts and humanities, and pro vice-chancellor for research at the University of Brighton, also urges caution when comparing national profiles of different units of assessment. He notes that some receive the majority of their submissions from post-92 institutions, while others are dominated by research-intensives.

“However, I am satisfied that if there are differences in the quality profile [of different subjects] in any element of assessment they are a direct result of quality of research that has been assessed, rather than any differences in practices or working methods,” he adds. “The REF has been a very thorough and rigorous exercise that stuck to the criteria and did exactly what it said on the tin.”


Methodology

In the overall table of excellence, institutions are ranked according to the grade point average (GPA) of their overall quality profiles. This is made up of profiles for output (worth 65 per cent of the total), impact (20 per cent) and environment (15 per cent).
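
For readers who want to reproduce the arithmetic, the following minimal Python sketch shows how the three sub-profiles combine under that 65/20/15 weighting. All percentages below are invented, purely for illustration.

```python
# Minimal sketch of the 65/20/15 weighting of the three sub-profiles.
# All figures are hypothetical, purely for illustration.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}
GRADES = ("4*", "3*", "2*", "1*", "unclassified")

# Percentage of the submission falling into each quality category.
sub_profiles = {
    "outputs":     {"4*": 30, "3*": 45, "2*": 20, "1*": 5, "unclassified": 0},
    "impact":      {"4*": 40, "3*": 40, "2*": 20, "1*": 0, "unclassified": 0},
    "environment": {"4*": 50, "3*": 50, "2*": 0,  "1*": 0, "unclassified": 0},
}

overall_profile = {
    grade: sum(WEIGHTS[element] * sub_profiles[element][grade]
               for element in WEIGHTS)
    for grade in GRADES
}
print(overall_profile)  # e.g. 4* share = 0.65*30 + 0.20*40 + 0.15*50 = 35.0
```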

The data published today by the Higher Education Funding Council for England present the proportion of each institution’s submission, in each unit of assessment, that falls into each of five quality categories. For outputs, these are 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised) and unclassified. For impact, they are 4* (outstanding), 3* (very considerable), 2* (considerable), 1* (recognised but modest) and unclassified. Times Higher Education then aggregates these profiles into a single institutional quality profile, weighted by the number of full-time equivalent staff submitted to each unit of assessment. This reflects the view that larger departments should count for more in calculating an institution’s overall quality.
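
A sketch of that FTE weighting, again using invented unit sizes and profiles, might look like this:

```python
# Minimal sketch of aggregating unit-level profiles into a single
# institutional profile, weighted by FTE staff submitted to each unit.
# Both units and all figures below are hypothetical.

units = [
    # (FTE submitted, quality profile as percentages)
    (50.0, {"4*": 40, "3*": 40, "2*": 15, "1*": 5}),
    (10.0, {"4*": 20, "3*": 50, "2*": 25, "1*": 5}),
]

total_fte = sum(fte for fte, _ in units)
institutional_profile = {
    grade: sum(fte * profile[grade] for fte, profile in units) / total_fte
    for grade in ("4*", "3*", "2*", "1*")
}
print(institutional_profile)  # the 50-FTE unit carries five times the weight
```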

Each institution’s overall quality profile is then converted into a grade point average by multiplying its percentage of 4* research by 4, its percentage of 3* research by 3, its percentage of 2* research by 2 and its percentage of 1* research by 1; the results are added together and divided by 100 to give a score between 0 and 4. Note that owing to the rounding of decimals, there may appear to be a small discrepancy between the overall quality profiles and the stated GPA.
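
In code, the conversion amounts to a weighted sum divided by 100; a minimal sketch, with a hypothetical profile:

```python
def gpa(profile):
    """Convert a quality profile (percentage shares) into a GPA between 0 and 4."""
    return (4 * profile["4*"] + 3 * profile["3*"]
            + 2 * profile["2*"] + 1 * profile["1*"]) / 100

# Hypothetical submission: 30% 4*, 45% 3*, 20% 2*, 5% 1*.
print(gpa({"4*": 30, "3*": 45, "2*": 20, "1*": 5}))  # 3.0
```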

Where universities have the same GPA, they are ranked according to their research power. This is calculated by multiplying the institution’s overall rounded GPA by the exact total number of full-time equivalent staff it submitted to the REF. This is an attempt to combine volume and quality to produce a ranking that gives a more accurate indication than GPA of the relative amount of quality-related research funding each institution is likely to receive. Further analysis of results based on research power and market share is available here.
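
A minimal sketch of the tie-break, with invented institutions and figures (borrowing the fictional University of Poppleton from the note on multiple submissions below):

```python
# Minimal sketch of ranking by GPA, breaking ties on research power
# (rounded GPA multiplied by exact FTE submitted). All data invented.

institutions = [
    {"name": "University of Poppleton", "gpa": 3.12, "fte": 310.5},
    {"name": "Uptown University",       "gpa": 3.12, "fte": 980.2},
    {"name": "Oldtown College",         "gpa": 2.95, "fte": 120.0},
]

for inst in institutions:
    inst["power"] = round(inst["gpa"], 2) * inst["fte"]

# Sort by GPA first, then by research power, both descending.
ranked = sorted(institutions, key=lambda i: (i["gpa"], i["power"]), reverse=True)
for rank, inst in enumerate(ranked, start=1):
    print(rank, inst["name"], f"research power = {inst['power']:.1f}")
# Uptown University ranks above Poppleton despite the identical GPA.
```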

The 2008 rank order is taken from Times Higher Education’s published rankings based on the 2008 research assessment exercise, which was the forerunner of the REF.

The impact table is constructed in a similar way but takes account solely of each institution’s quality profiles for impact. A similar table based solely on quality profiles for outputs is available online. The figure for number of UoAs entered counts multiple submissions separately.

The subject ratings rank each institutional submission to each of the 36 units of assessment based on the GPA of the institution’s overall quality profile in that unit of assessment. Tables based solely on quality profiles for outputs and for impact are also produced alongside, sorted by GPA score. Note that the rank order figures in the left-most column may not relate exactly to output and impact rankings. Figures for research power, as before, are calculated by multiplying GPAs by the number of academics submitted to the unit of assessment. Where a university submitted fewer than four people to a unit of assessment, Hefce has suppressed its quality profiles for impact and outputs, so it is not possible to calculate a GPA. This is indicated in the table by a dash.

As before, 2008 GPAs are taken from Times Higher Education’s rankings based on the 2008 RAE. However, since many units of assessment have changed since then, figures are not always available. Where they are not, the column is marked “n/a”. The same marker is used to indicate where an institution did not submit to the relevant unit of assessment in 2008.

Where units of assessment have changed their names since 2008 but are still broadly comparable, we have included 2008’s score for comparison. These are indicated with an asterisk in the name of the UoA.

For each unit of assessment we also show the “national profile” provided by Hefce, which we use to calculate GPA and research power figures for the entire discipline. As above, the GPA calculation is weighted according to the number of people submitted by each university.

Where the same institution made multiple submissions to the same unit of assessment, the various submissions are marked with different letters (eg, “University of Poppleton A”). Where universities have made joint submissions, they are listed on separate lines. The one with the higher research power is listed first.

The Higher Education Statistics Agency’s data on the number of staff eligible to be submitted, published today, were not available in time for Times Higher Education’s press deadline. Hence, this information is not taken into account in these tables.


Data analysis was undertaken by TES Global Information. Special thanks to Almut Sprigade and Ilan Lazarus. The official REF data are available online.


