HESA

Higher Education Strategy Associates

Measuring Innovation

Yesterday, I described how the key sources of institutional prestige were beginning to shift away from pure research & publication towards research & collaboration with industry.  Or, to put it another way, the kudos now come not solely from doing research, but from participating in the process of turning discoveries into meaningful and commercially viable products.  Innovation, in other words (though that term is not unproblematic).  But while we all have a pretty good grasp on the various ways to measure research output, figuring out how to measure an institution's performance in terms of innovation is a bit trickier.  So today I want to look at a couple of emerging attempts to do just that.

First out of the gate in this area is Reuters, which has already published two editions of a “top 100 innovative universities” list.  The top three won’t surprise anyone (Stanford, MIT, Harvard) but the next three – Texas, Washington and the Korea Advanced Institute of Science and Technology – might:  it’s a sign at least that some non-traditional indicators are being put in the mix. (Obligatory CanCon section: UBC 50th, Toronto 57th and that’s all she wrote.)

So what is Reuters actually measuring?  Mostly, it's patents.  Patents filed, success rates of patents filed, percentage of patents for which coverage was sought in all three of the main patent offices (US, Europe, Japan), patent citations, patent citation impact…you get the idea.  It's a pretty one-dimensional view of innovation.  The bibliometric bits are slightly more interesting – percent of articles co-written with industry partners, citations in articles originating in industry – but that maybe gets you to one and a half dimensions, tops.

Meanwhile, the THE may be inching towards an innovation ranking.  Last year, it released a set of four “innovation indicators”, but only published the top 15 in each indicator (and included some institutions not usually thought of as universities in the list, such as “Wright-Patterson Air Force Base”, the “Scripps Research Institute” and the “Danish Cancer Society”), which suggests this was a pretty quick rip-and-grab from the Scopus database rather than a long, thoughtful, detailed inquiry into the subject.  Two of the four indicators, “resources from industry” and “industry contribution” (i.e. resources from industry as a percentage of total research budget), are based on data from the THE’s annual survey of institutions, and while they may be reasonable indicators of innovation, for reasons I pointed out back here, you should intensely distrust the data.  The other two indicators are both bibliometric:  “patent citations” and “industry collaboration” (i.e. co-authorships).  On the whole, THE’s effort is slightly better than Reuters’, but is still quite narrow.

The problem is that the ways in which universities support innovation in an economic sense are really tough to measure.  One might think that counting spin-offs would be possible, but the definition of a spin-off might vary quite a bit from place to place (and it’s tough to know if you’ve caught 100% of said activity).  Co-working space (that is, space where firms and institutions interact) would be another way to measure things, but it’s also very difficult to capture.  Economic activity in university tech parks is another, but not all economic activity in tech parks is necessarily university- or even science-based (this is an issue in China and many developing countries as well).  The number of students engaged in firm-based work-integrated learning (WIL) activities would be great, but a) there is no common international definition of WIL, and b) almost no one measures this anyway.  Income from patent licensing is easily obtainable in some countries but not others.

What you’d really want, frankly, is a summary of unvarnished opinions from the businesses themselves about the quality of their partnerships with institutions, perhaps weighted by the size of the businesses involved (an 8 out of 10 at Yale probably means more than a 9 out of 10 at Bowling Green State).  We can get these at a national level through the World Economic Forum’s annual competitiveness survey, but not at an institutional level, which is presumably more important.  And that’s to say nothing of the value of finding ways to measure how institutions support innovation other than through industry collaboration.

Anyway, these problems are not insoluble.  They just take imagination and work.  If I were in charge of metrics in Ontario, say, I could think of many ways – some quantitative, some qualitative – that we might use to evaluate this.  Not many of them would translate easily into international comparisons.  For that to happen, a genuine international common data set would need to emerge.  That’s unlikely to happen any time soon, but it’s no reason to throw up our hands.  It would be unimaginably bad if, at the outset of an era in which institutions are judged on their ability to be economic collaborators, we allowed patent counts to become the standard way of measuring success.  It’s vitally important that thoughtful people in higher education turn their attention to this topic.


2 Responses to Measuring Innovation

  1. Luc Lalande says:

    Innovation metrics are indeed laudable but in practice a tough nut to crack. I agree that the reliance on patent-related metrics as a proxy for innovation performance is much too narrow to get the job done. Counting the number of university spin-offs and startups poses its own set of problems. There is no neutral party to verify self-reporting by post-secondary institutions. I know of one university claiming well over 150 startups in the past few years while offering no source for external groups to see the supporting data. At least with patent data, you can always check the USPTO et al. databases. I would like to see policy professionals in both government and think tanks come up with some (innovative) ideas in this regard. That would at least get the ball rolling.

  2. Alon Eisenstein says:

    Thanks for the analysis, Alex. Patent counting is indeed problematic, since just because you paid for the patent doesn’t mean it was actually used to create commercial applications – in which case I would argue against calling this “innovation” at all.

    As for “counting start-ups”, we must ask whether you can actually track their longevity. Lots of students are encouraged to be entrepreneurs and start their own businesses. But how many actually nurture them to the point where the company generates benefit to society through job creation and taxes?

    As for measuring WIL… As already cautioned by HEQCO, quality matters more than quantity. And tracking quality will be really hard as well.
