Higher Education Strategy Associates

Research Rankings: Burning Questions

We understand that some results from our research rankings are causing head-scratching. We thought we’d give you some insight into the key puzzles.

Q: Why isn’t U of T first? U of T is always first.

The fact that we didn’t include medical research is a big reason; had we done so, the results might have been quite different. But part of it is also that Toronto’s best subjects tend to be ones with high research costs and high publication/citation rates. Once you control for that, UBC surpasses Toronto on all measures.

Q: Why does UBC appear to be so much better than everyone else in SSHRC-related disciplines?

A variety of reasons, but much of it is down to the fact that the Sauder School is really good.

Q: Looking at the data, which schools stand out as being under-rated?

Simon Fraser makes the top ten in both SSHRC and NSERC disciplines, which most of the U-15 can’t say. UQ Rimouski came seventh in science and engineering – they aren’t very big, but their strength in marine sciences puts them close to the top overall. In SSHRC-related disciplines, the answer is Guelph, which does extremely well in this area despite a reputation that is more science-based. York and Trent over-perform in both science and arts. York might not be such a surprise – it’s a big school with lots of resources even if it isn’t super in any of the “money” disciplines. But Trent was a revelation – by far the best publication record of any small-ish school in the country across all disciplines.

Q: And over-rated?

Despite being U-15 members, Western, Dalhousie and Laval all had relatively modest performances. At these schools more than the others, a lot of their research prestige seems to hang on their medical faculties.

Q: Any anomalies?

Apart from l’Université de Montréal, none of the francophone schools do very well in the social sciences and humanities rankings, and the culprit is on the bibliometric side rather than the funding side. The practice of publishing in French tends to shrink the size of the potential audience. This reduces potential citations and hence reduces H-index scores. In the sciences and engineering, where publication tends to happen in English, francophone schools actually punch above their weight.
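To see why a smaller audience drags down H-index scores, here is a minimal sketch of how the metric is computed: the h-index is the largest h such that a researcher (or institution) has at least h papers with at least h citations each. The citation counts below are invented purely for illustration, not drawn from the ranking data.

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ordered = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ordered, start=1):
        # The paper at position `rank` must have at least `rank` citations.
        if count >= rank:
            h = rank
        else:
            break
    return h

# Two hypothetical authors with the same number of papers: one reaching a
# wide (e.g. English-language) readership, one a narrower one.
wide_audience = [25, 18, 12, 9, 6, 4, 2]
narrow_audience = [8, 6, 4, 3, 2, 1, 1]

print(h_index(wide_audience))    # 5
print(h_index(narrow_audience))  # 3
```

Same publication count, but the author whose papers draw fewer citations ends up with a markedly lower h-index – which is exactly the mechanism penalizing French-language publication here.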

Q: Any trends of note?

UBC aside, it’s the Ontario institutions who really steal the show. Sure, they’re funded abysmally, but they perform substantially better on publication measures than anyone else in the country. We can’t say why, for sure, but maybe those high salaries really work. They’re tough on undergrad class sizes, though…


5 Responses to Research Rankings: Burning Questions

  1. Thomas Carey says:

    I get the value of field-normalization, but shouldn’t we also be doing some demographic normalization to account for the seniority of researchers? Senior academics are likely to have a higher citation count for a number of reasons, so an institution will suffer a drop in these rankings if it has just completed an early retirement program and filled slots with early career researchers.

    Could another aspect of the research culture or ‘research intensity’ of an institution be illuminated by an different apples-to-apples normalization, which looked at career length as a factor in research impact? This would likely be an onerous task across the board, but perhaps a small sample to test the impact of career length would be worthwhile.

  2. Ravi Menon says:

No question that the more senior you are, the higher your h-index gets, although you’d be amazed at how many people start to plateau. In a comparison of large schools, the demographics of the professors may or may not be all that different, given that we are dealing with large numbers of faculty. However, smaller schools (or different provinces) with very cyclical hiring patterns may well have a different demographic, and that could skew things one way or the other – who knows? The CAUT annual report on universities usually has the demographics, at least by career rank (lecturer, assistant, associate, full professor), so that’s at least a good place to start in terms of demographic normalization.

My school, Western, has hired a lot of people in Neuroscience in the past decade, so they are on the young side. Yet it was the top-ranked Neuroscience program in the survey. So one shouldn’t just magically expect that rankings will change when normalized by career rank. Excellence is excellence. It doesn’t matter how old you are.

Using the h-index, or any normalization thereof, does bias the results toward a snapshot based on productivity five or more years ago. Recent papers rarely enter into anyone’s calculation of h-index. So this is really a snapshot of research productivity from 5-10 years ago at a minimum. On the other hand, university reputations take 50-100 years to build, so regardless of how good recent faculty hires may be, it will take several generations to really establish a public reputation outside of academic circles and their dry rankings. The beauty of agnostic approaches like this is that places that normally get overlooked can feel good about the steps they are taking to establish a “new Canadian order” in a generation or two.

    • Alex Usher says:

      Hi Ravi. Thanks for reading our stuff.

      Some excellent points here. The CAUT data you reference isn’t quite fine enough for us to use for normalization purposes, but we can use the UCASS data on which it is based to do so (it just costs a lot).

There’s no question that h-index is age-related. It’s one reason that I’m not a big fan of using this as a way to compare individuals. However, as soon as you get groups of researchers together, the age differences start to wash out – it’s rare (not unheard of, but rare) for one university or faculty to have a significantly different age profile from its peers.

      You’re also absolutely right about h-index being a backward-looking indicator. That’s exactly why we balanced it out with granting council data, which is a bit better at looking at current research strengths.

  3. Ravi Menon says:

Grant funding also tends to correlate with h-index, though. The higher your h (and generally the more senior you are), the more grant funding you have (although I in no way equate grant funding with innovation). Nonetheless, there are correlations between the h-index data and the current funding data indicating that they both produce similar rankings.

Western ranks #10 in total research funding according to the last Research InfoSource numbers (numbers that include medical research funding), but only #16 in per capita funding (despite using only the unionized faculty as the denominator and excluding clinical faculty), commensurate with its general indicators in your survey. Thus, while one can argue the subtleties of the ranking criteria and the weightings of those criteria, enormously disparate methodologies and ranking approaches do seem to produce the same rank ordering of Canadian universities, as far as research is concerned.

And while undergraduate education is at least as important as research, there is no question that a university’s international reputation is based on its research credentials. Harvard does not do very well on “best student experience” and Western does exceptionally well. But I’m pretty sure if you ask parents where they want their child to go, the answer will be very clear. International students with the means and marks to go anywhere will be drawn to the research powerhouses. Research prowess is the best advertising.

  4. Pingback: The Listening Tour | HESA
