Canada’s Rankings Run-up

Canada did quite well out of a couple of university rankings that have come out in the last month or so: the Times Higher Education’s “Most International Universities” ranking, and the QS “Best Student Cities” ranking.  But there’s actually less to this success than meets the eye.  Let me explain.

Let’s start with the THE’s “Most International” ranking.  I have written about this before, saying it does not pass the “fall-down-laughing” test, which is really the only method of testing a ranking’s external validity.  In previous years, the ranking was entirely about which institutions had the most international students, faculty, and research collaborations.  These kinds of criteria inevitably favour institutions in small countries with big neighbours and disfavour big countries with few neighbours, so it was no surprise that places like the University of Luxembourg and Qatar University topped the list, while the United States struggled to put an institution in the top 100.  In other words, the chosen indicators generated a really superficial standard of “internationalism” that lacked credibility (Times readers were pretty scathing about the “Qatar #1” result).

Now, as a result of this, the Times changed its methodology.  Drastically.  They didn’t make a big deal of doing so (presumably not wishing to draw more attention to the ranking’s earlier superficiality), but basically: i) they added a fourth set of indicators (worth 25% of the total) for international reputation, based on THE’s annual survey of academics; and ii) they excluded any institution which didn’t receive at least 100 votes in said survey.  (Check out Angel Calderon’s critique of the new rules here for more details, if that sort of thing interests you.)  That last one is a big one: in practice, it means the universe for this ranking is only about 200 institutions.
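Mechanically, those two changes amount to something like the sketch below: a reputation component worth 25% of the total (presumably leaving the original three indicator sets to share the remaining 75%), plus an eligibility cut-off of 100 votes.  All names and numbers here are invented for illustration.

```python
# Illustrative sketch of the two rule changes: a 25% reputation
# component and a 100-vote eligibility cut-off. Data is made up.
institutions = [
    {"name": "A", "old_score": 72.0, "reputation": 88.0, "votes": 540},
    {"name": "B", "old_score": 91.0, "reputation": 35.0, "votes": 60},   # too few votes
    {"name": "C", "old_score": 65.0, "reputation": 79.0, "votes": 210},
]

ranked = sorted(
    (
        {"name": i["name"],
         # old indicators now share 75% of the weight; reputation gets 25%
         "score": 0.75 * i["old_score"] + 0.25 * i["reputation"]}
        for i in institutions
        if i["votes"] >= 100  # the new exclusion rule
    ),
    key=lambda i: i["score"],
    reverse=True,
)
print(ranked)  # B vanishes entirely; A and C are re-scored
```

Note how institution B, despite the highest score under the old rules, simply disappears: that is exactly how three-quarters of the 2016 universe could vanish overnight.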

On the whole, I think the result is a better ranking, one which conforms more closely to what your average academic on the street thinks of as an “international” university.  Not surprisingly, places like Qatar and Luxembourg suddenly vanished from the rankings.  Indeed, as a result of those changes, fully three-quarters of the institutions that were ranked in 2016 disappeared from the rankings in 2017.  Canadian universities, meanwhile, suddenly shot up: UBC jumped from 40th to 12th, McGill went from 76th to 23rd, Alberta from 110th to 31st, Toronto from 128th to 32nd, and so on.

Cue much horn-tooting on social media from those respective universities for these huge jumps in “internationality”.  But guys, chill.  It’s a methodology change.  You didn’t do that: the THE’s methodologists did.

Now, over to the second set of rankings, the QS “Best Student Cities”, the methodology for which is here.  The ranking comprises 22 indicators spread over six areas: “university quality” (i.e. how highly ranked, according to QS, the institutions in that city are); “student mix”, a composite of total student numbers, international student numbers, and some kind of national tolerance index; “desirability”, a mix of data about pollution, safety, livability (some index made up by the Economist), corruption (again, a piece of national-level data), and students’ own ratings of the city (QS surveys students on various things); “employer activity”, mostly based on an international survey of employers about institutional quality; “affordability”; and “student view” (again, from QS’s own proprietary data).
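A composite like this is just a weighted sum of category scores, which is why adding or removing a single category (as QS did this year with “student view”, more on that below) can reorder cities.  A minimal sketch, with category weights and scores invented purely for illustration (QS’s real weights are in the linked methodology):

```python
# Toy composite-score calculation of the kind QS describes.
# The six weights below are invented for illustration only.
WEIGHTS = {
    "university_quality": 0.20,
    "student_mix": 0.15,
    "desirability": 0.15,
    "employer_activity": 0.15,
    "affordability": 0.15,
    "student_view": 0.20,
}

def city_score(scores: dict) -> float:
    """Weighted sum of per-category scores, each normalized to 0-100."""
    return sum(WEIGHTS[cat] * val for cat, val in scores.items())

# Hypothetical city profile: strong on affordability and student view,
# weaker on employer activity.
print(city_score({
    "university_quality": 78, "student_mix": 84, "desirability": 88,
    "employer_activity": 65, "affordability": 92, "student_view": 95,
}))
```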

Again, Montreal coming #1 is partly the result of a methodology change.  This is the first year QS added student views to the mix, and Montreal does quite well on that front: eliminate those scores and Montreal comes third.  And while the inclusion of student views in any ranking is to be applauded, you have to wonder about the sample size.  QS says it gets 18,000 responses globally.  Canada represents about 1% of the world’s students, and Montreal institutions represent 10-15% of Canadian students, so if responses were evenly distributed, there might be only 20 or so responses from Montreal in the sample (there are probably more than that, because responses won’t be evenly distributed, but my point is we’re talking small numbers).  So I have my doubts about the stability of that score.  Ditto the employer ratings, where Montreal somehow comes top among Canadian cities, which I am sure is news to most Canadians.  After all, where Montreal really wins big is on things like “livability” and “affordability”, which is another way of saying the city’s not in especially great shape economically.
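For anyone who wants the arithmetic behind that estimate spelled out (the 1% and 10-15% shares are the rough assumptions stated above):

```python
# Back-of-the-envelope estimate of Montreal's likely share of
# QS's student survey sample, using the shares from the text.
total_responses = 18_000        # QS's reported global sample
canada_share = 0.01             # Canada's share of the world's students
mtl_low, mtl_high = 0.10, 0.15  # Montreal's share of Canadian students

low = total_responses * canada_share * mtl_low    # 18
high = total_responses * canada_share * mtl_high  # 27
print(f"Expected Montreal responses: {low:.0f} to {high:.0f}")
```

Eighteen to twenty-seven responses is a thin base on which to rank a city of Montreal’s size, even if the true number runs somewhat higher.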

So, yeah, some good rankings headlines for Canada: but let’s understand that nearly all of it stems from methodology changes.  And what methodologists give, they can take away.
