Higher Education Strategy Associates

November 15

Ten Years of Global University Rankings

Last week, I had the honour of chairing a session at the Conference on World-Class Universities in Shanghai.  The conference was held on the 10th anniversary of the release of the first global rankings (the Shanghai rankings first appeared in 2003; the Times Higher Education rankings – then run by QS – followed in 2004).  And so it was a time for reflection: what have we learned over the past decade?

The usual well-worn criticisms were aired: international rankings privilege the measurable (research) over the meaningful (teaching); they exalt the 1% over the 99%; they are a function of money, not quality; they distort national priorities… you’ve heard the litany.  And these criticisms are no less true just because they’re old.  But there’s another side to the story.

In North America, the reaction to the global rankings phenomenon was muted – that’s because, fundamentally, these rankings measure how closely institutions come to aping Harvard and Stanford.  We all had a reasonably good idea of our pecking order.  What shocked Asian and European universities, and higher education ministries, to the core was discovering just how far behind America they were.  The first reactions, predictably, were anger and denial.  But once everyone had worked through these stages, the policy reaction was astonishingly strong.

It’s hard to find many governments in Europe or Asia that didn’t adopt policy initiatives in response to rankings.  Sure, some – like the empty exhortations to get X institutions into the top 20/100/500/whatever – were shallow and jejune.  Others – like institutional mergers in France and Scandinavia, or Kazakhstan setting up its own rankings to spur its institutions to greater heights – might have been of questionable value.

However, as a Dutch colleague of mine pointed out, rankings have pushed higher education to the front of the policy agenda in a way that nothing else – not even the vaunted Bologna Process – has done.  Country after country – Russia, Germany, Japan, Korea, Malaysia, and France, to name but a few – has poured money into excellence initiatives as a result of rankings.  We can quibble about whether the money could have been better spent, of course, but realistically, if that money hadn’t been spent on research, it would have gone to health or defence – not higher education.

But just as important, perhaps, is the fact that higher education quality is now a global discussion.  Prior to rankings, it was possible for universities to claim any kind of nonsense about their relative global pre-eminence (“no, really, Uzbekistan National U is just like Harvard”).  Now, it’s harder to hide.  Everybody has had to focus more on outputs.  Not always the right ones, obviously, but outputs nonetheless.  And that’s worth celebrating.  The sector as a whole, and on the whole, is better for it.

November 01

More Shanghai Needed

I’m in Shanghai this week, a guest of the Center for World-Class Universities at Shanghai Jiao Tong University for their biennial conference. It’s probably the best spot on the international conference circuit to watch how governments and institutions are adapting to a world in which their performance is being measured, compared and ranked on a global scale.

In discussions like this, the subject of rankings is never far away – all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, better known as the Shanghai rankings. This is one of the three main competing world university rankings, the others being the Times Higher Education Supplement (THES) rankings and the QS World University Rankings.

The THES and QS rankings are both commercially driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).

In terms of reliability, there’s really no comparison. If you look at the correlations between the indicators used in each of the rankings, those of THES and QS are very weak (meaning the final results are highly sensitive to the weightings chosen), while those of the Shanghai rankings are very strong (meaning the results are robust to changes in weighting). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of concentrations of scientific talent around the world, the QS and THES rankings in many respects are simply measuring reputation.

(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)
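To make the weighting point concrete, here is a minimal sketch in Python – my own illustration with simulated scores for hypothetical institutions, not any ranker’s actual data or methodology. It shows why a composite built from strongly correlated indicators barely moves when the weights change, while one built from weakly correlated indicators gets reshuffled by the same change.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100  # hypothetical institutions

    # Strongly correlated indicators (Shanghai-style): both driven by
    # the same underlying research strength, plus a little noise.
    strength = rng.normal(size=n)
    strong = np.column_stack([strength + 0.1 * rng.normal(size=n),
                              strength + 0.1 * rng.normal(size=n)])

    # Weakly correlated indicators (THES/QS-style): largely independent
    # of one another.
    weak = np.column_stack([rng.normal(size=n), rng.normal(size=n)])

    def rank_positions(indicators, weights):
        """Rank position of each institution under a weighted composite."""
        scores = indicators @ np.asarray(weights)
        order = np.argsort(-scores)        # institution indices, best first
        return np.argsort(order)           # rank position of each institution

    def mean_rank_shift(indicators):
        """Average places moved when weights shift from 50/50 to 80/20."""
        r1 = rank_positions(indicators, [0.5, 0.5])
        r2 = rank_positions(indicators, [0.8, 0.2])
        return np.abs(r1 - r2).mean()

    print("strongly correlated indicators:", mean_rank_shift(strong))
    print("weakly correlated indicators:  ", mean_rank_shift(weak))

Run it and the average rank shift under the weakly correlated indicators comes out several times larger than under the strongly correlated ones – which is all “highly sensitive to the weightings” really means.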

Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia, or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with their more catholic inclusion policy (500 institutions ranked, compared to the THES’s 200), they allow more institutions to compare themselves to the best in the world – at least as far as research is concerned.