I’ve been remiss the last month or so in not keeping you up-to-date with some of the big international rankings releases, namely the Leiden Rankings, the Times Top 100 Under 50 rankings, and the U21 Ranking of National Higher Education Systems.
Let’s start with Leiden (previous articles on Leiden can be found here, and here), a multidimensional bibliometric ranking that looks at various types of publication and impact metrics. Because of the nature of the data it uses, and the way it displays results, the rankings are both stable and hard to summarize. I encourage everyone interested in bibliometrics to take a look and play around with the data themselves to see how the rankings work. In terms of Canadian institutions, our Big Three (Toronto, UBC, McGill) do reasonably well, as usual (though the sheer volume of publications from Toronto is a bit of a stunner). Perhaps more surprising is how Victoria outperforms most of the U-15 on some of these measures.
Next, there’s the U21 National Systems Rankings (which, again, I have previously profiled, back here and here). This is an attempt to rank not individual institutions, but rather whole national higher education systems, based on Resources, Environments, Connectivity, and Outputs. The US comes tops, Sweden 2nd, and Canada 3rd overall – we climb a place from last year. We do this mostly on the basis of being second in the world in terms of resources (that’s right, folks: complain as we all do about funding, and about how nasty governments here merely maintain budgets in real dollars, only Denmark has a better-resourced system than our own), and third in terms of “outputs” (mostly research-based).
We do less well, though, in other areas, notably “Environment”, where we come 33rd (behind Bulgaria, Thailand, and Serbia, among others). That’s mostly because of the way the ranking effectively penalizes us for: a) being a federation without certain types of top-level national organizations (Germany suffers on this score as well); b) having a system that is too public (yes, really); and c) Statscan data on higher education being either unavailable or totally impenetrable to outsiders. If you were to ignore some of this weirder stuff, we’d be ranked second.
The innovation in this year’s U21 rankings is the normalization of national scores by per capita GDP. Canada falls to seventh on this measure (though the Americans fall further, from first to fifteenth). The Scandinavians end up looking even better than they usually do, but so – interestingly enough – does Serbia, which ranks fourth overall in this version of the ranking.
Finally, there’s the Times Higher Top 100 Institutions Under 50, a fun ranking despite some obvious methodological limitations (which I pointed out back here, and won’t rehash again). This ranking always changes significantly from year to year, because the institutions at the top tend to be close to the 50-year cutoff, and as such get rotated out as new ones take their place. Asian universities took four of the top five spots globally (Postech and KAIST in Korea, HKUST in Hong Kong, and Nanyang in Singapore). Calgary, in 19th place, was the best Canadian performer, while Simon Fraser came 24th, and three other Canadian universities made the list for the first time: Guelph (73rd), UQAM (84th), and Concordia (96th).
Even if you don’t take rankings overly seriously, all three rankings provide ample amounts of thought-provoking data. Poke around and you’re sure to find at least a few surprises.
At the following website you can find a side-by-side comparison of eight world university rankings, including two of the three discussed above.
http://listedtech.com/content/8-world-university-rankings-side-side-2013