Moneyball

I was at a conference last week in Italy, much of which focused on the use of data in institutional decision-making (technically it was a conference on rankings, but increasingly rankings are being seen as a data source for institutional benchmarking and strategizing rather than as a consumer tool, so there was a lot of overlap). One of the most interesting presentations involved a lot of discussion of the sheer amount of data now available on institutional performance (which, depending on how you treat the bibliometric data underpinning it, can reach into the billions of data points) and speculation about institutions using all this data to play “Moneyball” in order to improve their performance.

Now before you get ideas about your President, principal or Vice-Chancellor being replaced by Brad Pitt, it’s probably worth defining some terms here. “Moneyball”, to use the original definition in Michael Lewis’s book of the same name, is really about using performance data to uncover hidden sources of advantage. Properly conceived, it is a tool for poorer clubs to use against richer ones, finding hidden value to counteract the massive inbuilt advantage that the big rich clubs have.

A distinction probably needs to be made here between “using data to work out how to get better at rankings” and “Moneyball”, because they aren’t the same thing. It’s really not that difficult to work out where small investments can significantly improve rankings: there are a half-dozen well-known high-return tricks around enrolling international students, hiring international staff and publishing research which will work for everyone. Moneyball is about torturing the data to tell you something that isn’t obvious, to tease out secrets that others don’t yet have.
It seems to me there are two basic areas where one can do this, and both have to do with research publications. First, one could imagine a program of data analysis which could determine, with some degree of accuracy, which emerging areas of science are likely to see major bursts in publications over the next five to ten years. It wouldn’t be flawless, of course – bets would still need to be placed and some would miss – but knowing where the scientific action is likely to take place over the next decade or so is half the battle. The second half would be to work out, based on early-career data, which scientists are likely to be superstars, and hire them before they hit the big time.
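To make the first idea a little more concrete, here is a toy sketch of what the simplest version of such an analysis might look like: flag fields whose annual publication counts show sustained year-over-year growth. Everything here – the field names, the counts, the growth threshold – is invented for illustration; a real effort would need bibliometric data and far more sophisticated modelling.

```python
# Toy burst-detection sketch: flag research fields whose publication
# counts are accelerating. All data below is invented for illustration.

def growth_rate(counts):
    """Average year-over-year growth across a series of annual counts."""
    rates = [(b - a) / a for a, b in zip(counts, counts[1:]) if a > 0]
    return sum(rates) / len(rates) if rates else 0.0

def flag_emerging(fields, threshold=0.25):
    """Return field names whose average annual growth exceeds the
    threshold, fastest-growing first."""
    return sorted(
        (name for name, counts in fields.items()
         if growth_rate(counts) > threshold),
        key=lambda name: -growth_rate(fields[name]),
    )

# Hypothetical annual publication counts over five years.
fields = {
    "health statistics": [120, 160, 230, 340, 520],
    "classical philology": [80, 82, 79, 81, 80],
    "machine learning": [400, 700, 1300, 2500, 4800],
}

print(flag_emerging(fields))
# → ['machine learning', 'health statistics']
```

The point is not that this particular heuristic works – it almost certainly doesn’t on its own – but that the raw ingredients (publication counts by field and year) are now cheap to assemble, which is what makes the speculation above plausible at all.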

Can either of these things actually be done? Hard to tell. It certainly looks as though McMaster did something like this when it attracted a cluster of scientists in health statistics who suddenly hit the big time, emerging on the Thomson Reuters “most highly cited” list at the same time and thereby propelling the university past McGill and briefly into the top 70 of the Shanghai Rankings – though there’s no way to tell whether this is what they actually intended, and it seems difficult to imagine. But an institution that genuinely wanted to strive for excellence could put together some kind of data skunkworks for a million dollars or so (perhaps from a dedicated philanthropic gift?) to try to put some of these theories into action. No guarantees it would work, but if I were an ambitious university president I just might want to try.

An institution that really wanted to play this game might not just hire people for its own purposes; it might also try to sabotage competitor institutions: hire a couple of key people from the university next door, not because you desire strength in their research area, but because their departure will have ripple effects and disproportionately hurt their home institution. I am less convinced about this one; I suspect it is not an easy trick to pull off with academics, though it is completely do-able with administrators – big universities frequently pinch promising VPs from less-prestigious institutions, which in the long run helps keep the local pecking order intact. But again, this is less Moneyball than it is simply flexing financial muscle.
Anyways, the Moneyball thing is at least potentially doable. The question is whether anyone thinks it is worth doing and whether any institution’s internal structures would allow an ambitious central administration to play these kinds of games. And I think the answer to both questions is yes – but not in Canada. Australian and UK universities are just rankings-obsessed enough to do this, and may believe that the payoff will come through higher international enrolment and fees. My guess is that some top-40-but-not-top-20 research universities in the United States may feel the same way (though their payoffs would likely come through more sponsored research). I don’t think the calculus runs the same way anywhere in Canada (though, hey, McMaster, you never know).

But just because universities in Canada don’t partake in this doesn’t mean they won’t be affected by it. They could very well become targets of poaching by those who do (again, this is not unknown within Canada, albeit minus the data analytics: during the boom years, the University of Calgary in particular was known for plundering other Prairie schools for talent). As a result, institutions need to at least keep up with changes in the field in order to be prepared.


2 responses to “Moneyball”

  1. The UK Research Excellence Framework certainly attuned universities to comparative advantage via selective snagging of researchers from other institutions (an exercise in deck-chair rearrangement). The consequences were predictable: smaller universities took the risk and saw their most successful people drawn away by offers they couldn’t match. In Canada, this has likely been exacerbated by some of the high-profile chairs – though most of these recruitments have been from outside Canada (still, I’ve seen the reciprocal effect of top Canadian scientists drawn away by similarly deep-pocketed offers). The problem is, with research funding so precarious, there are no guarantees, and it is unclear whether generous packages do much except provide some insulation – and likely resentment among colleagues. Predicting the future is always a fool’s game, and this is especially so in science. Instead, some universities are hedging their futures by topping up their junior-scientist bets with a few established-scientist side-wagers. It’s also true that in Canada, given the insufficiency of indirects, research is very much a loss leader. It has many other benefits, but fiscal stability ain’t one. We’ll see what happens when the international student boom goes bust.

  2. Maybe Canada could benefit in a roundabout way: Canadian institutions could provide a refuge from the high-data/high-competition/you’re-only-as-good-as-your-last-article world being born.

    It worked for the University of Basel, a minuscule institution, in the nineteenth century, when it found itself with Jakob Burckhardt (the historian), Franz Overbeck (a pioneering theologian) and Friedrich Nietzsche (of whom you may have heard), by offering them all refuge from the pressures of the ascendant Prussian research institutions.
