Research Rankings Reloaded

You’ll recall that a couple of months ago we released a set of research rankings; you may also remember that complaints were raised about a couple of issues in our methodology. Specifically, critics argued that by including all permanent faculty we had drawn the net too wide, and that we should have excluded part-timers.

Well, we’ve now re-done the analysis, and we’re releasing the results today as an annex to our original publication for all to see. Two key changes are worth highlighting: (i) the effect of excluding part-time professors is more significant in SSHRC disciplines than in NSERC ones, and (ii) the use and function of part-time professors appear to differ systematically along linguistic lines. Part-time professors at francophone universities are more numerous than those at anglophone ones, and their profiles resemble those of adjuncts at anglophone institutions; part-timers at anglophone institutions, by contrast, look reasonably similar to the full-time population.

At the top of the table, not a great deal changes – it’s still the same top six in both SSHRC and NSERC totals (though ordinal positions do change slightly – McGill slips past UBC into top spot in the SSHRC rankings on the strength of much better performance on research income). Neither is there much change in the bottom quartile or so. In the middle ranks, though, where institutions are more tightly bunched, small changes in scores can cause some pretty big changes in rankings: the University of Manitoba goes from 27th to 17th in the NSERC rankings, and UQAM (which has by far the country’s largest contingent of part-time faculty) jumps from 43rd to 17th in the SSHRC ones. In fact, francophone institutions generally did a lot better under the revised frame. What we had initially assumed was a “language effect” – poor results driven by the fact that publishing in French limits readership and hence citations – may in fact have been driven by employment patterns.

But for most institutions, contrary to the expectations of some of our critics, not much changed. Even where it did, gains in one field were sometimes offset by losses in the other (Ottawa, for instance, rose five places in Arts but fell two in Science). Which makes sense: since scores are normalized by faculty numbers, excluding part-timers shifts everyone’s denominator, and the only way it changes relative outcomes is if your institution uses part-timers in a significantly different way than others do.
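To see the arithmetic behind that, here’s a toy sketch in Python. The figures are invented and a simple publications-per-professor measure stands in for our actual scoring model, but it shows why an institution with an unusually large part-time contingent moves sharply while everyone else barely budges.

```python
# Toy illustration of per-professor scoring before and after excluding
# part-timers. All figures are invented; the real rankings use a more
# elaborate scoring model.

institutions = {
    # name: (publications, full-time faculty, part-time faculty)
    "Typical U": (2000, 900, 100),    # part-timers are ~10% of faculty
    "Heavy-PT U": (2000, 700, 300),   # part-timers are ~30% of faculty
}

for name, (pubs, ft, pt) in institutions.items():
    per_all = pubs / (ft + pt)   # original frame: everyone in the denominator
    per_ft = pubs / ft           # revised frame: part-timers excluded
    print(f"{name}: {per_all:.2f} per head with part-timers, {per_ft:.2f} without")

# Both institutions improve when part-timers are dropped, but Heavy-PT U
# improves far more – which is why a place like UQAM jumps 26 spots while
# most institutions barely move at all.
```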

Now, you may still dislike our rankings because you don’t like field-normalization, or the specific metrics used, or whatever. That’s cool: debating metrics is important and we’d love to engage constructively on that. But let’s bury the canard that somehow an improper frame significantly skewed our results. It didn’t. The results are the results. Deal with ‘em.
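P.S. For anyone wondering what field-normalization involves in the first place: the basic idea is to judge a paper’s citation count against the average for its own discipline, since raw counts in, say, medicine and history sit on completely different scales. A toy sketch (invented data, and not our actual pipeline):

```python
# Toy sketch of field-normalization: score each paper's citations against
# the mean for its own field, so that high-citation fields don't swamp
# low-citation ones. Invented data; not the pipeline behind our rankings.
from collections import defaultdict

papers = [
    # (field, citations)
    ("medicine", 40), ("medicine", 60),   # high-citation field
    ("history", 4), ("history", 6),       # low-citation field
]

cites_by_field = defaultdict(list)
for field, cites in papers:
    cites_by_field[field].append(cites)

field_means = {f: sum(c) / len(c) for f, c in cites_by_field.items()}

for field, cites in papers:
    normalized = cites / field_means[field]   # 1.0 = average for the field
    print(f"{field}: {cites} raw citations -> {normalized:.1f} normalized")

# A 6-citation history paper (1.2) now outscores a 40-citation medicine
# paper (0.8), because each is judged against its own discipline's norms.
```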
