Universitas 21 is one of the higher-prestige university alliances out there (McGill, Melbourne and the National University of Singapore are among its members). Now like a lot of university alliances it doesn’t actually do much. The Presidents or their alternates meet every year or so, they have some moderately useful inter-institution mobility schemes, that kind of thing. But the one thing it does which gets a lot of press is that it issues a ranking every year. Not of universities, of course (membership organizations which try to rank their own members tend not to last long), but rather of higher education systems. The latest one is available here.
I have written about the U21 rankings before, but I think it’s worth another look this year because there have been some methodological changes and also because Canada has fallen quite a ways in the rankings. So let’s delve into this a bit.
The U21 rankings are built around four broad concepts: Resources (which makes up 20% of the final score), Environment (20%), Connectivity (20%) and Output (40%), each of which is measured through a handful of variables (25 in all). The simplest category is Resources, because all the data is available through OECD documentation. Denmark comes top of this list – this is before any of the cuts I talked about back here kick in, so we can expect it to fall in coming years. Then in a tight bunch come Singapore, the US, Canada and Sweden.
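The arithmetic behind the composite is straightforward; here is a minimal sketch of how the four category weights combine into an overall score. The category values below are invented for illustration — only the 20/20/20/40 weights come from the rankings themselves.

```python
# Sketch of the U21 weighting scheme. The weights are real (20/20/20/40);
# the example category scores are hypothetical.
WEIGHTS = {"resources": 0.20, "environment": 0.20,
           "connectivity": 0.20, "output": 0.40}

def overall_score(scores: dict) -> float:
    """Weighted composite of the four category scores (each 0-100)."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Hypothetical category scores for a single country:
example = {"resources": 85.0, "environment": 70.0,
           "connectivity": 60.0, "output": 90.0}
print(overall_score(example))  # 0.2*85 + 0.2*70 + 0.2*60 + 0.4*90 = 79.0
```

Note how the double weight on Output means a research-heavy system can place well overall despite middling scores elsewhere — which is part of why the US dominates that category.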
Next comes “Environment”, which is a weird hodge-podge of indicators around regulatory issues, institutional financial autonomy, percentages of students and academic staff who are female, a survey of businesses’ views of higher education quality and – my favourite – how good their education data is. Now I’m all for giving Canada negative points for Statscan’s uselessness, but there’s something deeply wrong with any indicator of university quality which ranks Canada (34th) and Denmark (31st) behind Indonesia (29th) and Thailand (21st). Since most of these scores come from survey responses, I think it would be instructive to publish the survey responses themselves, because they flat-out do not meet the fall-down-laughing test.
The Connectivity element is pretty heavily weighted to things like percentage of foreign students and staff and what percentage of articles are co-authored with foreign scholars. For structural and geographical reasons, European countries (especially the titchy ones) tend to do very well on this measure and so they take the top nine spots. New Zealand comes tenth, Canada eleventh. The Output measure combines research outputs and measures of access, plus an interesting new one on employability. However, because not all of these measures are normalized for system size, the US always runs away with this category (though, thanks to some methodological tweaks, less decisively than it used to). Canada comes seventh on this measure.
Over the last three years, Canada has dropped from third to ninth overall. The table below shows why this is the case.
Canada’s U21 Ranking Scores by Category, 2012-2016
In 2015, when Canada dropped from 3rd to 6th, it was because we lost points on “environment” and “connectivity”. It’s not entirely clear to me why we lost points on the latter, but it is notable that the former was changed that year to include the dodgy survey data I mentioned earlier, so the drop there may simply reflect the new methodology. This year, we lost points on resources, which frankly isn’t surprising given controls on tuition and real declines in government funding in Canada. But it’s important to note the way this is scored: what matters is not whether resources (or resources per student) are going up or down, it’s whether they are going up or down relative to the category leader – i.e. Denmark. So even with no change in our funding levels, we could expect our scores to rise over the next few years as Denmark’s cuts take effect.
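The leader-relative scoring described above can be sketched in a few lines. All the numbers here are invented — the point is only the mechanic: a country’s score can rise with no new money at all, so long as the category leader’s figure falls.

```python
# Hedged sketch of leader-relative scoring: each country's raw resources
# figure is scaled against the category leader's. Numbers are hypothetical.
def relative_scores(raw: dict) -> dict:
    """Score each country as a percentage of the category leader."""
    leader = max(raw.values())
    return {country: 100.0 * value / leader for country, value in raw.items()}

before = {"Denmark": 50.0, "Canada": 40.0}  # hypothetical resource levels
after = {"Denmark": 44.0, "Canada": 40.0}   # leader cuts; Canada unchanged

print(relative_scores(before)["Canada"])  # 80.0
print(relative_scores(after)["Canada"])   # ~90.9 -- rises with no new money
```

This is the quirk that makes year-over-year movement in the table hard to interpret: a score change can reflect the leader’s behaviour as much as your own.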