One of the interesting things about our new research rankings – which, unlike previous attempts at such things, are fully field-normalized – is that they shine a very different light on who the “leaders” in research actually are.
Back in the day, the ten “leading” research institutions in the country (Laval, McGill, Montreal, Queen’s, Toronto, McMaster, Waterloo, Western, Alberta and UBC) created the “G-10.” It was a talking-shop, mostly: a forum where big universities could exchange data quietly amongst themselves. Around the turn of the century, three more institutions (Ottawa, Calgary and Dalhousie) were added, and more recently Manitoba and Saskatchewan were included as well.
Waterloo apart, the U-15 is basically a list of the universities housing the country’s established medical schools. But the idea that simply having a medical school makes you research-intensive is questionable. If you were looking for “research leaders,” you’d probably start by looking at bibliometric measures, such as the H-index. There are 16 schools whose average H-index score is above one in NSERC disciplines (i.e., where the average professor at the school has an H-index above the national average), and 22 whose average score is above one in SSHRC disciplines.
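For readers curious about the arithmetic, the normalization is straightforward: take a school’s average H-index per professor in a discipline group and divide it by the national average for that group, so a score above one means the school’s average professor sits above the national average. Here is a minimal sketch of that calculation; the school names and figures are invented for illustration, and the unweighted national average is a simplification rather than HESA’s actual method or data.

```python
# Minimal sketch of a field-normalized H-index score.
# All figures below are invented for illustration; they are not HESA's data.

from statistics import mean

# Hypothetical average H-index per professor, by school and discipline group.
school_h_index = {
    "School A": {"NSERC": 9.0, "SSHRC": 4.2},
    "School B": {"NSERC": 6.5, "SSHRC": 3.1},
    "School C": {"NSERC": 7.8, "SSHRC": 2.4},
}

def national_average(discipline: str) -> float:
    """Unweighted national average of school-level averages (a simplification)."""
    return mean(s[discipline] for s in school_h_index.values())

def normalized_score(school: str, discipline: str) -> float:
    """A score above 1.0 means the school's average professor is above the national average."""
    return school_h_index[school][discipline] / national_average(discipline)

for school in school_h_index:
    print(school,
          round(normalized_score(school, "NSERC"), 2),
          round(normalized_score(school, "SSHRC"), 2))
```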
So how does the U-15 membership fare in these? UBC, Toronto, McGill and Montreal are in the top five in both SSHRC and NSERC disciplines, so they’re indisputably “tops.” Queen’s, McMaster, Alberta, Waterloo and Manitoba all have above-average scores in both areas. After that, it gets trickier: Ottawa has a well-above-average score in NSERC disciplines but a below-average one in SSHRC disciplines; Saskatchewan and Calgary are above average in SSHRC disciplines but not in NSERC ones. Laval, Dalhousie and Western are below average in both.
But what about schools outside the U-15? Well, Simon Fraser makes the top ten in both fields, a claim most of the U-15 can’t make. York, Concordia and Trent (yeah, we did a double-take, too) all have above-average scores in both fields; from a purely bibliometric perspective, they are at least in the same class as Manitoba. Trent and Concordia don’t look so good when funding measures are taken into account, but the other two (Simon Fraser and York) do OK and seem at least the equal of a few of the U-15. One could also make a decent case for Guelph, which is well above average in SSHRC disciplines, and only a shade below it in NSERC ones.
So why aren’t these schools in the U-15’s big research club, even though they clearly outperform some of the weaker U-15 members? Unfortunately, the answer is prestige. If York and Concordia were allowed into the club, Toronto and McGill would probably want to find another sandbox to play in. In academia, exclusivity matters.
Excluding medical research from the mix and then turning around and arguing, as you do, that other schools deserve to be considered as research-intensive as (or more so than) some U15 members seems a bit ridiculous. It’s comparing “full schools” with “half schools” by stripping them of their medical schools (or like comparing the US Olympic swimming team without Phelps to other countries’ complete teams). There must not have been anything interesting to comment on today, I guess…
Hi AMD. Thanks for reading our stuff.
That’s a reasonable point. But I think you’re then coming close to saying that “research intensive” necessarily equals presence of a medical school. If the U-15 want to describe themselves as a club of universities with medical schools, that’s their prerogative (though it would be tough on Waterloo). But they should be up front about it.
There are exceptional research-intensive universities that do not have medical schools and yet most people would give up their first born to attend. Looking at just the top 5 in the US, MIT and Caltech come to mind, and in fact they trounce every single Canadian school in the Times Higher Education and Shanghai global rankings. So the presence of a medical school does not ensure excellence. In fact, if a university is mediocre in its NSERC- and SSHRC-funded research, it’s probably going to be poor in the CIHR category. So leaving out med school funding data is not likely to change the relative rankings in the HESA survey if any proper normalization is used, as was done here for discipline-specific averages.
This is also true for funding. On a per capita research funding basis, the Nova Scotia School of Agriculture ranks above several Canadian universities with medical schools. So tell them that not having a medical school is hurting their individual faculty members’ productivity and output.