
Leader: Some very necessary measures

Rankings aren't perfect, but they give users, especially students, a broad view of the global sector that's impossible to get otherwise

July 8, 2010

Caveat emptor. That's the very sensible advice on world university rankings from Ellen Hazelkorn, author of the forthcoming book Rankings and the Battle for World-Class Excellence: How Rankings Are Reshaping Higher Education.

The problem with league tables, she says in our cover story, is that there is "no objective set of criteria or weightings. They do not elucidate a basic truth; rather, the choice of indicators reflects the ranker's view of what is important." She is absolutely right. Because few sound, internationally comparable data exist, indicators and proxies are used instead. Thus, rankings measure only what can be measured, not what should be measured. But surely that is better than no measure at all?

Another issue is frequency. Kris Olds poses an interesting question on his blog GlobalHigherEd: why are rankings published annually? Why not, say, every three to four years? He has a point. Universities are like oil tankers: they cannot change course quickly, so no short-term measure will immediately and dramatically improve an institution's position. Most large year-on-year fluctuations are due to errors or tweaks in the methodology.

The biggest criticism, however, is that rankings have begun to change behaviour as they have grown in influence. This is not all bad: they have stimulated interest in universities and higher education. But governments now use them to formulate policies, and universities use them to benchmark themselves. This, according to Malcolm Grant, provost of University College London, forces institutions to act against their own interests. Using league table positions as a key performance indicator for universities, as is done in some countries, is a "false god", he warns.


This situation has led to rankings overhauls (ie, this magazine's) and to the development of intricate multidimensional approaches that compare universities on a range of criteria according to size and mission so as to better reflect the complex modern institution.

Amid all the effort directed at making rankings more accurately reflect universities' output and serve institutional ends, is there perhaps a danger that the needs of those for whom rankings were first devised are being swept aside? The weakness of rankings is also their strength: simplicity. Students considering a university outside their home country need to compare institutions in different countries across different continents. It would be foolish to advise a student to choose a university on the basis of rankings alone, but rankings do give them a starting point.


It is not enough to dismiss all rankings by saying there are just too many differences between institutions and nations. Arguing that rankings compare apples and oranges is not an excuse to give students a lemon. Until universities themselves can come up with a feasible alternative that does not do a disservice to those whom they aspire to teach, rankings are here to stay.

Potential undergraduates need simple facts and figures, and universities should focus on providing them in a way that allows for easy comparisons, instead of seemingly devoting their energies to defensive and increasingly complex ways of obscuring what they do.

Scientia potestas est, so universities profess. Or is that what they are frightened of?

ann.mroz@tsleducation.com
