Bibliometrics III: The Leiden Rankings

One of my favourite bibliometric analysis tools is the criminally underused (at least in Canada) Leiden Rankings. The nice thing about Leiden – apart from its global scope – is its web-based, interactive nature. Users can choose comparators by region or country, decide whether or not to include non-English-language papers, and choose how to normalize for institutional size. Unlike with most rankings (e.g., the Times Higher), the user is in control.

Most importantly, users choose the indicators for comparison. One can choose from a number of “impact” indicators, which examine various measures of publications and citations, both on a “raw” basis and adjusted for field of study. Or one can use “collaboration” indicators, which look at how often an institution’s scholars co-author papers with colleagues from other institutions, at home and abroad. I really do recommend taking a few minutes to play with the Leiden indicators yourself – putzing around with the data is both fun and instructive!
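To make the “size-normalized, field-adjusted” idea a bit more concrete, here is a rough sketch of how an indicator in the spirit of Leiden’s “proportion of papers in the top 10% most cited” could be computed. The data and the crude percentile cut-off below are invented purely for illustration; Leiden’s actual methodology is considerably more refined (normalizing by publication year and offering fractional counting, among other things).

```python
# Hypothetical sketch of a size-independent, field-adjusted impact indicator,
# loosely in the spirit of "proportion of papers in the top 10% most cited".
# All data here are invented for illustration.

from collections import defaultdict

# Each paper: (institution, field, citation count)
papers = [
    ("Univ A", "chemistry", 54), ("Univ A", "chemistry", 3),
    ("Univ A", "economics", 12), ("Univ B", "chemistry", 40),
    ("Univ B", "economics", 30), ("Univ B", "economics", 1),
]

def top10_threshold(values):
    """Citation count needed to sit in the top 10% of a list (crude percentile)."""
    ranked = sorted(values, reverse=True)
    cutoff_index = max(0, int(len(ranked) * 0.1) - 1)
    return ranked[cutoff_index]

# Field-specific thresholds, so a chemistry paper is only compared with other
# chemistry papers (the "adjusted for field of study" idea).
by_field = defaultdict(list)
for _, field, cites in papers:
    by_field[field].append(cites)
thresholds = {field: top10_threshold(vals) for field, vals in by_field.items()}

# Size-independent indicator: the share of each institution's output above its
# field threshold, so large and small institutions can be compared directly.
counts = defaultdict(lambda: [0, 0])   # institution -> [top-decile papers, total papers]
for inst, field, cites in papers:
    counts[inst][1] += 1
    if cites >= thresholds[field]:
        counts[inst][0] += 1

for inst, (top, total) in counts.items():
    print(f"{inst}: {top}/{total} = {top / total:.0%} of papers in the field top decile")
```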

How do Canadian institutions fare? In terms of impact, U of T is far and away the leader in every category. In terms of publications catalogued in the Thomson Reuters Web of Science over the period 2006–2010, Toronto actually ranked second in the entire world, behind only Harvard. Unfortunately, its performance in placing papers in the top decile of cited papers each year was much weaker – only 87th globally (no other Canadian institution made the top 100). UBC and McGill place second and third in most categories, with everyone else a long way back (though McMaster, Simon Fraser and the University of Victoria punch well above their weight in certain individual categories).

In terms of collaboration, Canadian academics are somewhat less likely than those elsewhere to co-publish with academics from other institutions, be they foreign or domestic. On these measures, the University of Victoria usually comes out on top among Canadian institutions, though McGill and UBC also do well on the proportion of total articles co-authored with foreign academics (Toronto, surprisingly, comes eighth among Canadian institutions on this measure).

In short, if you’re looking for multi-dimensional analyses of pan-institutional research performance, Leiden is really hard to beat. But if you’re looking for comparisons in specific fields of study, it’s not useful at all. And while its multi-dimensional approach allows fine-tuned comparisons, sometimes it’s heuristically convenient to boil things down to a single number. Again, for that you need to look elsewhere.

If you saw yesterday’s Globe and Mail Report on Research, you’ll know that we at HESA have been working for the last little while on a project to deliver exactly these kinds of research comparisons. In our next installment of this series, we’ll be showing you some of our results. Stay tuned!
