Ranking Higher Education Systems

As you may know, a new ranking of national systems of higher education was released a couple of weeks ago. The ranking was put together by a team of University of Melbourne professors led by Ross Williams, and published by the Melbourne Institute and Universitas 21, a global alliance of 24 research-intensive universities. It’s not the first attempt to rank national systems – we here at HESA have done two iterations of a global ranking on affordability and accessibility – but it is probably the most ambitious.

The ranking covers twenty indicators over four themes – Resources, Environment, Connectivity and Output. For those keeping score, Canada came third overall behind the U.S. and Sweden. We were top in the resource category (there goes the underfunding argument!), third in the output category and mid-table in both environment and connectivity. Not bad, huh?
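
For readers curious about the mechanics, here is a minimal sketch of how a composite ranking of this kind is typically built: each theme gets a normalized score and a weight, and countries are ordered by the weighted sum. The weights, countries and scores below are invented for illustration – they are not the actual U21 figures or methodology.

```python
# Hypothetical sketch of a theme-weighted composite ranking.
# All weights and scores are made up for illustration.

theme_weights = {
    "resources": 0.25,     # hypothetical weight
    "environment": 0.25,   # hypothetical weight
    "connectivity": 0.25,  # hypothetical weight
    "output": 0.25,        # hypothetical weight
}

# Theme scores assumed normalized so the best country gets 100.
country_scores = {
    "Country A": {"resources": 100, "environment": 80, "connectivity": 70, "output": 90},
    "Country B": {"resources": 85, "environment": 95, "connectivity": 75, "output": 88},
}

def composite(scores: dict, weights: dict) -> float:
    """Weighted sum of normalized theme scores."""
    return sum(weights[theme] * scores[theme] for theme in weights)

ranking = sorted(
    country_scores,
    key=lambda c: composite(country_scores[c], theme_weights),
    reverse=True,
)
for rank, country in enumerate(ranking, start=1):
    print(rank, country, round(composite(country_scores[country], theme_weights), 2))
```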

But you need to take these results with a big grain of salt, for a couple of reasons. The first is that the “environment” indicators are mostly qualitative, and there’s no published source available to check the data on two of them. Even without a source to check against, though, the scoring here is simply bizarre. I know there’s no such thing as a validity check in rankings, but surely to God the fall-down-laughing test should have given the authors pause before ranking Canada behind Ukraine on this one.

Similarly, the choice of measures and data sources for outputs tends to benefit the U.S. disproportionately. Awarding points for the percentage of the population over 24 with a degree privileges countries that massified a long time ago (i.e., the U.S.). Using crude gross enrolment ratios as a measure of access privileges countries where students stay enrolled for a long time (there’s also a nerdy comparability problem in the numerator of this statistic, in that the U.S. counts its community college students and we don’t – see the toy calculation below). And using researchers per head of population and unemployment rates as measures of university outputs, when institutions have at best partial control over these numbers, is just silly.
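
To make the numerator problem concrete, here is a toy calculation with invented figures: a gross enrolment ratio divides total tertiary enrolment (of any age) by the population in the official tertiary age band, so two systems with identical university enrolment can report very different ratios depending on whether short-cycle (community college) students are counted.

```python
# Toy illustration of the gross enrolment ratio (GER) comparability point.
# GER = total tertiary enrolment (any age) / population in the official age band.
# All figures below are invented for illustration only.

def gross_enrolment_ratio(enrolled: int, age_band_population: int) -> float:
    """GER as a percentage; it can exceed 100 because the numerator isn't age-restricted."""
    return 100 * enrolled / age_band_population

age_band_population = 2_000_000          # hypothetical 18-22 population

university_enrolment = 1_100_000         # hypothetical
community_college_enrolment = 500_000    # hypothetical

# System that counts only university students in the numerator:
print(gross_enrolment_ratio(university_enrolment, age_band_population))          # 55.0

# Same system, but counting community college students as well:
print(gross_enrolment_ratio(university_enrolment + community_college_enrolment,
                            age_band_population))                                # 80.0
```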

One thing that is disappointing about these rankings is that in many ways, they just measure how close national systems are to a Northern European ideal. If you’re a rich country that drank the Humboldt kool-aid a long time ago, you tend to do well. If you’re poor and have largely teaching-oriented universities, you don’t. I suppose since the ideal of “world-class” research universities has become a global standard of sorts, this makes some sense. But a measure which was more sensitive to levels of national income would have been nice.

Overall: it’s a good first attempt, but there’s still some room for improvement.
