A Shift in Rhetoric on Innovation

Could a shift in thinking about innovation lead to a radical reduction in university research budgets?

Time was, universities could tell a pretty simple story about innovation. Give money to talented people in universities (preferably “world-class” ones), and let them work on interesting projects. Through the magic of peer-reviewed publication, knowledge will be transferred, entrepreneurs will get cool ideas for products, and massive innovation and productivity growth will ensue. But while universities keep arguing for better funding on the grounds that technological booms based on university-developed technologies (e.g., computers and the internet) are regular occurrences, fewer and fewer people seem to be buying that story.

While it’s undeniable that long-run productivity is related to levels of investment in R&D, the idea that university research specifically has this effect isn’t a well-tested empirical proposition. Actually, much of the case for it comes down to university presidents pointing at Stanford and Silicon Valley and saying, “we could do that, too, if we had enough money.”

So far, governments have bought this shtick. But there’s been a notable shift in tone among innovation wonks over the last year or so. Instead of talk about spillovers from public research, what’s “in” these days is talk about the inevitable entrepreneurial explosion that will happen as a couple of billion new consumers start interacting with the global marketplace. For instance, check out Vijay Vaitheeswaran’s Need, Speed and Greed, Philip Auerswald’s The Coming Prosperity, and Erik Brynjolfsson and Andrew McAfee’s Race Against the Machine.

Here’s part of what’s going on: ICTs are what’s known as “enabling technologies” – that is, broad technologies whose effects extend over a huge swath of the economy. There haven’t been many of these in history: writing, the steam engine, electricity, etc. When one comes along, it takes a long time for people to work out how to use it in ways that raise productivity. Electricity was first deployed in the economy for things like the telegraph; its economic potential as a power source wasn’t really exploited for another 50 years or so, when factories were gradually re-organized to take advantage of electrification. After a big burst of “pure” research at the start, gains began to depend on more “applied” types of research.

We may be at a similar point with computers; that is, the gains to productivity for the next decade or two are likely to come more from development than from research. If so, that radically changes the outlook for university research funding. There’s likely to be less of it, and a greater proportion of what’s left will go to more applied projects.

Something to ponder, anyway.
