A few weeks ago, the Times Higher Education published a ranking of “top attractors of industry funds”. It’s actually just a re-packaging of data from its major fall rankings exercise: “industry dollars per professor” is one of its thirteen indicators, and this is simply that indicator published as a standalone ranking. What’s fascinating is how at odds the results are with published data available from the institutions themselves.
Take Ludwig-Maximilians University in Munich, the top university for industry income according to THE. According to the ranking, the university collects a stonking $392,800 in industry income per academic. But a quick look at the university’s own facts and figures page reveals a different story. The institution says it receives €148.4 million in “outside funding”. But over 80% of that is from the EU, the German government, or a German government agency. Only €26.7 million comes from “other sources”. This is at a university which has 1,492 professors. I make that out to be 17,895 euros per prof. Unless the THE gets a much different $/€ rate than I do, that’s a long way from $392,800 per professor. In fact, the only way the THE number makes sense is if you count the entire university budget as “external funding” (1,492 profs times $392,800 equals roughly $600 million, which is pretty close to the €579 million figure which the university claims as its entire budget).
Or take Duke, second on the THE list. According to the rankings, the university collects $287,100 in industry income per faculty member. Duke’s Facts and Figures page says Duke has 3,428 academic staff. Multiply that out and you get a shade over $984 million. But Duke’s financial statements indicate that the total amount of “grants, contracting and similar agreements” from non-government sources is just under $540 million, which would come to $157,000 per prof, or only about 55% of what the Times says it is.
The third-place school, the Korea Advanced Institute of Science and Technology (KAIST), is difficult to examine because it seems not to publish financial statements or have a “facts & figures” page in English. However, assuming Wikipedia’s estimate of 1,140 academic staff is correct, and if we generously interpret the graph on the university’s research statistics page as telling us that 50 of the 279 billion won in total research expenditures comes from industry, then at current exchange rates that comes to a shade over $42 million, or $37,000 per academic. Or, one-seventh of what the THE says it is.
I can’t examine the fourth-placed institution, because Johns Hopkins’ financial statements don’t break out its grant funding by public and private sources. But tied for fifth place is my absolute favourite, Anadolu University in Turkey, which allegedly has $242,500 in income per professor. This is difficult to check because Turkish universities appear not to publish their financial documents. But I can tell you right now that this is simply not true. On its facts and figures page, the university claims to have 2,537 academic staff (if you think that’s a lot, keep in mind Anadolu’s claim to fame is as a distance-ed university. It has 2.7 million registered students, roughly half of whom are “active”, in addition to the 30,000 or so it has on its physical campus). For both numbers to be true, Anadolu would have to be pulling in $615 million per year in private funding, and that simply strains credulity. Certainly, Anadolu does do quite a bit of business – a University World News article from 2008 suggests that it was pulling in $176 million per year in private income (impressive, but less than a third of what is implied by the THE numbers) – but much of that seems to come from what we would typically call “ancillary enterprises” – that is, businesses owned by the university – rather than external investment from the private sector.
I could go through the rest of the top ten, but you get the picture. If only a couple of hours of googling on my part can throw up questions like this, then you have to wonder how bad the rest of the data is. In fact, the only university in the top ten where the THE number might be anything close to legit is Wageningen University in the Netherlands. This university lists €101.7 million in “contract research”, and has 587 professors. That comes out to a shade over €173,000 (or about $195,000) per professor, which is at least within spitting distance of the $242,000 claimed by THE. The problem is, it’s not clear from any Wageningen documentation I’ve been able to find how much of that contract research is actually private sector. So it may be close to accurate, or it may be completely off.
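The back-of-envelope checks above are easy to reproduce. Here is a minimal sketch, using the publicly reported figures cited in this post; the euro-to-dollar rate is an assumption of mine, not the (unpublished) conversion THE actually uses:

```python
# Sanity-check THE's "industry income per academic" claims against
# figures the institutions themselves publish. All inputs are the
# numbers quoted in the post; EUR_TO_USD is an assumed rate.

EUR_TO_USD = 1.10  # assumption; THE's actual conversion is not published

# (institution, reported non-government income in USD,
#  reported academic staff, THE's per-academic claim in USD)
checks = [
    ("LMU Munich", 26.7e6 * EUR_TO_USD, 1492, 392_800),
    ("Duke", 540e6, 3428, 287_100),
    ("Wageningen", 101.7e6 * EUR_TO_USD, 587, 242_000),
]

for name, income_usd, staff, the_claim in checks:
    per_prof = income_usd / staff
    print(f"{name}: ${per_prof:,.0f} per academic "
          f"({per_prof / the_claim:.0%} of THE's ${the_claim:,})")
```

Only Wageningen comes out in the same neighbourhood as THE’s figure; the others fall short by factors of two to twenty, which is the whole point of the exercise.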
The problem here is one common to many rankings systems. It’s not that the Times Higher is making up data, and it’s not that institutions are (necessarily) telling fibs. It’s that if you hand out a questionnaire to a couple of thousand institutions who, for reasons of local administrative practice, define and measure data in many different ways, and ask for data on indicators which do not have a single obvious response (think “number of professors”: do you include clinicians? Part-time profs? Emeritus professors?), you’re likely to get data which isn’t really comparable. And if you don’t take the time to verify and check these things – which the THE doesn’t; it just gets the university to sign a piece of paper “verifying that all data submitted are true” – you’re going to end up printing nonsense.
Because THE publishes this data as a ratio of two indicators (industry income and academic staff) but does not publish the indicators themselves, it’s impossible for anyone to work out where the mistakes might be. Are universities overstating certain types of income, or understating the number of professors? We don’t know. There might be innocent explanations for these things – differences of interpretation that could be corrected over time. Maybe LMU misunderstood what was meant by “outside revenue”. Maybe Duke excluded medical faculty when calculating its number of academics. Maybe Anadolu excluded its distance ed teachers and included ancillary income. Who knows?
The problem is that the Times Higher knows that these are potential problems but does nothing to rectify them. It could be more transparent and publish the source data so that errors could be caught and corrected more easily, but it won’t do that because it wants to sell the data back to institutions. It could spend more time verifying this data, but it has chosen to hide instead behind sworn statements from universities. To do more would be to reduce profitability.
The only way this is ever going to be solved is if institutions themselves start making their THE submissions public, and create a fully open database of institutional characteristics. That’s unlikely to happen because institutions appear to be at least as fearful of full transparency as the THE. As a result, we’re likely to be stuck with fantasy numbers in rankings for quite some time yet.