Higher Education Strategy Associates

June 06

Making “Applied” Research Great Again

One of the rallying cries of part of the scientific community over the past few years has been that under the Harper government there was too much focus on “applied” research and not enough on “pure”/”basic”/”fundamental” research.  This call reached a fever pitch following the publication of the Naylor Report (which, to its credit, did not get into a basic/applied debate and focussed instead on whether or not the research was “investigator-driven”, which is a different and better distinction).  The problem is that the line between “pure/basic/fundamental” research and applied research isn’t nearly as clear-cut as people believe, and the rush away from applied research risks throwing out some rather important babies along with the bathwater.

As long-time readers will know, I’m not a huge fan of a binary divide between basic and applied research.  The idea of “Basic Science” is a convenient distinction created by natural scientists in the aftermath of WWII as a way to convince the government to give them money the way it did during the war, but without having soldiers looking over their shoulders.  In some fields (medicine, engineering), nearly all research is “applied” in the sense that there are always considerations of the end-use for the research.

This is probably a good time for a refresher on Pasteur’s Quadrant.  This concept was developed by Donald Stokes, a political scientist at Princeton, just before his death in 1997.  He too thought the basic/applied dichotomy was pretty dumb, so like all good social scientists he came up with a 2×2 instead.  One consideration in classifying science is whether or not it involves a quest for fundamental understanding; the other is whether or not the researcher has any consideration of end-use.   And so what you get is the following:

Table 1: Pasteur’s Quadrant

                                     Consideration of end-use?
                                     No                        Yes
  Quest for fundamental      Yes     Pure basic research       Use-inspired basic
  understanding?                     (Bohr)                    research (Pasteur)
                             No                                Pure applied research
                                                               (Edison)

(I’d argue that to some extent you could replace “Bohr” with “Physics” and “Pasteur” with “Medicine” because it’s the nature of the fields of research and not individual researchers’ proclivities, per se, but let’s not quibble).

Now what was mostly annoying about the Harper years – and to some extent the Martin and late Chretien years – was not so much that the federal government was moving money from the “fundamental understanding” row to the “no fundamental understanding” row (although the way some people go on, you’d be forgiven for thinking that), but rather that it was trying to make research fit into more than one quadrant at once.  Sure, they’d say, we’d love to fund all your (top-left quadrant) Drosophila research, but can you make sure to include something about its eventual (bottom-right quadrant) commercial applications?  This attempt to make research more “applied” is and was nonsense, and Naylor was right to (mostly) call for an end to it.

But that is not the same thing as saying we shouldn’t fund anything in the bottom-right corner – that is, “applied research”.

And this is where the taxonomy of “applied research” gets tricky.  Some people – including apparently the entire Innovation Ministry, if the last budget is any indication – think that the way to bolster that quadrant is to leave everything to the private sector, preferably in sexy areas like ICT, Clean Tech and whatnot.  And there’s a case to be made for that: business is close to the customer, let them do the pure applied research.

But there’s also a case to be made that in a country where the commercial sector has few big champions and a lot of SMEs, the private sector is always likely to have some structural difficulties doing the pure applied research on its own.  It’s not simply a question of subsidies: it’s a question of scale and talent.  And that’s where applied research as conducted in Canada’s colleges and polytechnics comes in.  They help keep smaller Canadian companies – the kinds that aren’t going to get included in any “supercluster” initiative – competitive.  You’d think this kind of research should be of interest to a self-proclaimed innovation government.  Yet whether by design or indifference we’ve heard nary a word about this kind of research in the last 20 months (apart perhaps from a renewal of the Community and College Social Innovation Fund).

There’s no reason for this.  There is – if rumours of a cabinet submission to respond to the Naylor report are true – no shortage of money for “fundamental”, or “investigator-driven” research.  Why not pure applied research too?  Other than the fact that “applied research” – a completely different type of “applied research”, mind you – has become a dirty word?

This is a policy failure unfolding in slow motion.  There’s still time to stop it, if we can all distinguish between different types of “applied research”.

May 17

Diversity in Canada Research Chairs

One of the hot topics in Ottawa over the past couple of months is the issue of increasing diversity among researchers.   Top posts in academia are still disproportionately occupied by white dudes, and the federal minister of Science, Kirsty Duncan, would like to change that by threatening institutions with a loss of research funding.

There’s no doubt about the nature of the problem.  As in other countries, women and minorities have trouble making it up the career ladder in academia at the same rate as white males.  The reasons for this are well-enough known that I probably needn’t recount them here (though if you really want a good summary try here and here).  There was a point when one might reasonably have suspected that time would take care of the problem.  Once PhD completion rates equalized (until the 1990s they still favoured men) and female scientists began making their way up the career ladder, it might have been argued, the problem of representation at the highest levels would take care of itself.  But it quite plainly hasn’t worked out that way and more systemic solutions need to be found.  As for Indigenous scholars and scholars with disabilities, it’s pretty clear we still have a lot of pipeline issues to worry about, and equalizing PhD completion rates, in addition to solving problems related to career progression, is a big challenge.

Part of what Ottawa is trying to do is to get institutions to take their responsibilities on career progression seriously by getting them each to commit to equity plans.  Last October, the government announced that institutions without equity plans will become ineligible for new CERC awards; earlier this month, Kirsty Duncan attached the same condition to the Canada Research Chairs (CRC) program.

(A quick reminder here about how the Chairs program works.  There are two types of awards: Tier 1 awards for top researchers, worth $200,000/year for seven years, and Tier 2 awards for emerging researchers, worth $100,000/year for five years.  There are 2000 awards in total, with roughly equal numbers of Tier 1 and Tier 2 awards.  Each university gets an allocation of chairs based – more or less – on the share of tri-council funding its staff received, with a boost for smaller institutions.  So, University of Toronto gets 256 chairs, Université Ste. Anne gets one, etc.  Within that envelope institutions are free to distribute awards more or less as they see fit.)

The problem is, as the Minister well knows, that all institutions already have equity plans and they’re not working.  So she has attached a new condition: institutions must also fix the demographic distribution of chair holders so as to “ensure the demographics of those given the awards reflect the demographics of those academics eligible to receive them” by 2019.  It’s not 100% clear to me what this formulation means. I don’t believe it means that women must occupy 50% of all chairs; I am fairly sure that the qualifier “of those eligible to receive” means something along the lines of “women must occupy a percentage of Tier 1 chairs equal to their share of full professors, and of Tier 2 chairs equal to their share of associate and assistant professors”.

Even with those kinds of caveats, reaching the necessary benchmarks in the space of 18-24 months will require an enormous adjustment.  The figure I’ve seen for major universities is that only 28% of CRCs are women.  Given that only about 15-18% of chairs turn over in any given year, getting that up to the 40-45% range the benchmark implies by 2019 means that between 65% and 79% of all CRC appointments for the next two years will need to be female, and probably higher than that for the Tier 1s.  That’s certainly achievable, but it’s almost certain to be accompanied by a lot of general bitchiness among passed-over male candidates.  Brace yourselves.
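The arithmetic behind that range can be sketched with a back-of-envelope model (my own sketch, not the Minister’s formula; it assumes departing chairs mirror the current gender mix, and takes the 28% current share, the 15-18%/year turnover, and the 40-45% target from the figures above):

```python
def required_female_share(current=0.28, target=0.42, turnover=0.33):
    """Share of new CRC appointments that must go to women to reach
    `target` after a period in which `turnover` of all chairs turn over,
    assuming the chairs that turn over mirror the current gender mix."""
    # target = current * (1 - turnover) + x * turnover  =>  solve for x
    return (target - current * (1 - turnover)) / turnover

# Two years at ~16.5%/yr turnover, aiming at the middle of the 40-45% band:
print(f"{required_female_share():.0%}")  # → 70% of appointments
```

Varying the turnover and target assumptions within the ranges quoted above moves the answer around the 65-79% span the text cites; the point is simply that the required share of new appointments is far above parity.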

But while program rules allow Ottawa to use this policy tool to take this major step for gender equality, it will be harder to use it for other equity categories.  Institutions don’t even really have a measure of how many of their faculty have disabilities, so setting benchmarks would be tricky.  Indigenous scholars pose an even trickier problem. According to the formula used for female scholars, Indigenous scholars’  “share” of CRCs might be 1%, or about 20 nationally.  The problem is that only five institutions (Alberta, British Columbia, McGill, Montreal, Toronto) have 100 or more CRCs and would thus be required to reserve a spot for an Indigenous scholar.  An institution like (say) St. FX, which has only five chairs, would have a harder time.  It can achieve gender equity simply by having two or three female chairs.  But how would it achieve parity for Indigenous scholars?  It’s unlikely it could be required to reserve one of its five (20%) spots for an Indigenous scholar.

Many institutions would obviously hire Indigenous faculty anyway, it’s just that the institutional allocations which form the base of this program’s structure make it difficult to achieve some of what Ottawa wants to achieve on equity and diversity.

 

May 08

Naylor Report, Part II

Morning all.  Sorry about the service interruption.  Nice to be back.

So, I promised you some more thoughts about the Fundamental Science Review.  Now that I’ve had some time to think about it, I’m actually surprised by what it says, what it doesn’t say, and how many questions remain open.

What’s best about the report?  The history and most of the analysis are pretty good.  I think a few specific recommendations (if adopted) might actually be a pretty big deal – in particular the one saying that the granting councils should stop any programs forcing researchers to come up with matching funding, mainly because it’s a waste of everyone’s time.

What’s so-so about it?  The money stuff, for a start.  As I noted in my last blog post, I don’t really think you can justify a claim to more money based on the “proportion of higher-ed research investment coming from the federal government”.  I’m more sympathetic to the argument that there needs to be more funds, especially for early-career researchers, but as noted back here it’s hard to argue simultaneously that institutions should have unfettered rights to hire researchers but that the federal government should pick up responsibility for their career progression.

The report doesn’t even bother, really, to make the case that more money on basic research means more innovation and economic growth.  Rather, it simply states it, as if it were a fact (it’s not).  This is the research community trying to annex the term “innovation” rather than co-exist with it.  Maybe that works in today’s political environment; I’m not sure it improves overall policy-making.  In some ways, I think it would have been preferable to just say: we need so many millions because that’s what it takes to do the kind of first-class science we’re capable of.  It might not have been politic, but it would have had the advantage of clarity.

…and the Governance stuff?  The report backs two big changes in governance.  One is a Four Agency Co-ordinating Board for the three councils plus the Canada Foundation for Innovation (which we might as well now call the fourth council, provided it gets an annual budget as recommended here), to ensure greater cross-council coherence in policy and programs.  The second is the creation of a National Advisory Committee on Research and Innovation (NACRI) to replace the current Science, Technology and Innovation Council and do a great deal else besides.

The Co-ordinating committee idea makes sense: there are some areas where there would be clear benefits to greater policy coherence.  But setting up a forum to reconcile interests is not the same thing as actually bridging differences.  There are reasons – not very good ones, perhaps, but reasons nonetheless – why councils don’t spontaneously co-ordinate their actions; setting up a committee is a step towards getting them to do so, but success in this endeavour requires sustained good will which will not necessarily be forthcoming.

NACRI is a different story.  Two points here.  The first is that it is pretty clear that NACRI is designed to try to insulate the councils and the investigator-driven research they fund from politicians’ bright ideas about how to run scientific research.  Inshallah, but if politicians want to meddle – and the last two decades seem to show they want to do it a lot – then they’re going to meddle, NACRI or no.  Second, the NACRI as designed here is somewhat heavier on the “R” than on the “I”.  My impression is that as with some of the funding arguments, this is an attempt to hijack the Innovation agenda in Research’s favour.  I think a lot of people are OK with this because they’d prefer the emphasis to be on science and research rather than innovation but I’m not sure we’re doing long-term policy-making in the area any favours by not being explicit about this rationale.

What’s missing?  The report somewhat surprisingly punted what I expected to be a major issue: namely, the government’s increasing tendency over time to fund science outside the framework of the councils in such programs as the Canada Excellence Research Chairs (CERC) and the Canada First Research Excellence Fund (CFREF).  While the text of the report makes clear the authors have some reservations about these programs, the recommendations are limited to a “you should review that, sometime soon”.  This is too bad, because phasing out these kinds of programs would be an obvious way to pay for increased investigator-driven funding (though as Nassif Ghoussoub points out here, it’s not necessarily a quick solution because funds are already committed for several years in advance).  The report thus seems to suggest that though it deplores past trends away from investigator-driven funding, it doesn’t want to see these recent initiatives defunded, which might be seen in government as “having your cake and eating it too”.

What will the long-term impact of the report be? Hard to say: much depends on how much of this the government actually takes up, and it will be some months before we know that.  But I think the way the report was commissioned may have some unintended adverse consequences.  Specifically, I think the fact that this review was set up in such a way as to exclude consideration of applied research – while perfectly understandable – is going to contribute to the latter being something of a political orphan for the foreseeable future.  Similarly, while the fact that the report was done in isolation from the broader development of Innovation policy might seem like a blessing given the general ham-fistedness surrounding the Innovation file, in the end I wonder whether the result won’t be an effective division of policy, with research being something the feds pay universities to do and innovation something they pay firms to do.  That’s basically the right division, of course, but what goes missing are vital questions about how to make the two mutually reinforcing.

Bottom line: it’s a good report.  But even if the government fully embraces the recommendations, there are still years of messy but important work ahead.

April 18

Naylor Report, Take 1

People are asking why I haven’t talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is: i) I’m swamped; ii) there’s a lot to talk about in there; and iii) I want to have some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it’s the latter, I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways the panel argues this point is that while Canada has among the world’s highest levels of Research and Development in the higher education sector – known as HERD if you’re in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:

[Chart from the Naylor Report: higher education R&D spending (HERD) by source of funds, across countries]

Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries, for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I’ll come back to it in a moment), plus ii) an imputation for unrecovered research overhead.  In the US, it’s just the latter.  Why?  Because of the way the US collects data on HERD: the only faculty costs it captures are the chunks taken out of federal research grants.  Remember, in the US profs are only paid nine months per year, and at least in the R&D accounts that’s *all* teaching.  Only the pieces of their research grants they take out as summer salary get recorded as R&D expenditure (and hence as a government-sponsored cost rather than a higher education-sponsored one).

But there’s a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion, then it’s worth remembering what’s going on in the denominator.  Aggregate salaries are the first component.  The second is research intensity, as measured through surveys, and this appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30-35% of their time on research; a more recent survey shows that this has risen to 42%.  I am not sure whether this latest coefficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education “spending” (or “investment”, if you prefer) on research, despite nothing really having changed at all (possibly it already has been, and that is what explains the bump in expenditures in 2012-13).
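The mechanics can be seen in a toy version of the Canadian-style calculation (illustrative only: the function and the dollar figures are invented, not StatCan’s actual method; only the roughly 33% → 42% coefficient change comes from the surveys discussed above):

```python
def higher_ed_herd(salary_mass, research_share, overhead_imputation):
    """Toy higher-ed-sponsored HERD: aggregate faculty salaries times the
    surveyed share of time spent on research, plus an imputation for
    unrecovered research overhead."""
    return salary_mass * research_share + overhead_imputation

# Hypothetical inputs: a $10B salary mass, a $1B overhead imputation.
before = higher_ed_herd(10e9, 0.33, 1e9)  # old survey coefficient (~33%)
after = higher_ed_herd(10e9, 0.42, 1e9)   # newer coefficient (42%)
print(f"{(after - before) / before:.1%}")  # ~21% jump, nothing real changed
```

On these made-up numbers, the denominator jumps by about a fifth purely because the survey coefficient moved, which is the point about the fragility of university-funded HERD as a benchmark for federal spending.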

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding on inquiry-driven research shouldn’t rise.  Just that using data on university-funded HERD might not be a super-solid base from which to argue that point.

June 08

Are NSERC decisions “skewed” to bigger institutions?

That’s the conclusion reached by a group of professors from – wait for it – smaller Canadian universities, as published recently in PLOS One. I urge you to read the article, if only to understand how technically rigorous research without an ounce of common sense can make it through the peer-review process.

Basically, what the paper does is rigorously prove that “both funding success and the amount awarded varied with the size of the applicant’s institution. Overall, funding success was 20% and 42% lower for established researchers from medium and small institutions, compared to their counterpart’s at large institutions.” 

They go on to hypothesize that:

“…applicants from medium and small institutions may receive lower scores simply because they have weaker research records, perhaps as a result of higher teaching or administrative commitments compared to individuals from larger schools. Indeed, establishment of successful research programs is closely linked to the availability of time to conduct research, which may be more limited at smaller institutions. Researchers at small schools may also have fewer local collaborators and research-related resources than their counterparts at larger schools. Given these disparities, observed funding skew may be a consequence of the context in which applicants find themselves rather than emerging from a systemic bias during grant proposal evaluation.”

Oh my God – they have lower success rates because they have weaker research records?  You mean the system is working exactly as intended?

Fundamentally, this allegedly scientific article is making a very weird political argument.  The reason profs at smaller universities don’t get grants, according to these folks, is because they got hired by worse universities –  which means they don’t get the teaching release time, the equipment and whatnot that would allow them to compete on an even footing with the girls and boys at bigger schools.  To put it another way, their argument is that all profs have inherently equal ability and are equally deserving of research grants, it’s just that some by sheer random chance got allocated to weaker universities, which have put a downer on their career, and if NSERC doesn’t actively ignore actual outputs and perform some sort of research grant affirmative action, then it is guilty of “skewing” funding.

Here’s another possible explanation: yes, faculty hired by bigger, richer, more research-intensive institutions (big and research-intensive are not necessarily synonymous, but they are in Canada) have all kinds of advantages over faculty hired by smaller, less research-intensive universities.  But maybe, just maybe, faculty research quality is not randomly distributed.  Maybe big rich universities use their resources mainly to attract faculty deemed to have greater research potential.  Maybe they don’t always guess quite right about who has that potential and who doesn’t but on the whole it seems likelier than not that the system works more or less as advertised.

And so, yes, there is a Matthew effect (“for unto every one that hath shall be given, and he shall have abundance”) at work in science: the very top of the profession gets more money than the strata below them, and that tends to increase the gap in outcomes (salary, prestige, etc.).  But that’s the way the system was designed.  If you want to argue against that, go ahead. But at least do it honestly and forthrightly: don’t use questionable social science methods to accuse NSERC of “bias” when it is simply doing what it has always been asked to do.

April 11

Those New Infrastructure Funds

I have been meaning to write about the new $2 billion “Strategic Investment Fund” (SIF), the 3-year infrastructure money-dump the Liberals announced in the budget.  However I waited a bit too long and Paul Wells beat me to it in an excellent little article called How to Spend $2 Billion on Research Really Quickly (available here).

Do read Wells’ piece in its entirety, but the Coles Notes version is:

  1. The deadline for submission is quite soon (May 9), which is kind of a crazy goal for slow-moving organizations like universities to hit
  2. The money is not a straight-out grant: matching funding is required, which could be a bit of a challenge
  3. The amount of work required to get a shot at that money (getting engineering and regulatory approvals, environmental assessments, providing evidence of “additionality”, “sustainability”, “meeting industry needs”, “benefiting aboriginal populations” and of course getting approval from one’s Board of Governors) is stonkingly huge.

Those are all good points.  Let me add a couple of more.

First of all, yes these things are challenging but hardly unprecedented. The timeline and process are almost exactly those seen in the Knowledge Infrastructure Program (KIP) the Tories created for the 2009 budget.  In both cases, the programs were announced with unbelievably tight timeline criteria (about two months from budget time to deadline) and the same matching funding requirement.  In both cases, the program was announced with eligibility criteria but no selection criteria.  That means we don’t really know what the government is looking for, what kinds of things it wants to see in submissions and how it will go about choosing from among the many submitted projects.  There is a margin for shenanigans here, but it’s the same margin the Tories had in ’09 and everyone seems to think that process went OK.

Second of all, the key thing to understand here is that although the rhetoric around infrastructure is always about “new” infrastructure, the fact of the matter is that, given the timelines and the rules, this program will be almost entirely about renovations and re-fits (and occasionally some expansions).  The tight timelines make it impossible to submit any build project that isn’t already in the pipeline, and the rule that the federal money shouldn’t displace already-committed money means pretty much anything in the pipeline is ineligible.  In my (admittedly non-random) quick scan of projects completed under KIP, I could only find one example of a project which was 100% new build: the ART Lab studios at the University of Manitoba.

(Also – apparently U of M managed to get KIP to fund seven different projects.  Kudos to one or both of their planning shop and government relations shop).

Third, between twenty years of CFI funding plus now two rounds of KIP/SIF (let’s be honest, it’s the same program), one does start to wonder at what point we enter a moral hazard situation where the provinces essentially opt out of the infrastructure game because they know the feds will pony up – or indeed whether we haven’t actually reached that point in several provinces already.   True, the feds might respond by saying “but they can play a role by choosing the projects for which they want to provide matching funds”.  To which, if I were a provincial government, I might calmly explain that the feds should use this explanation for a rather protracted rectal examination, because in effect what they are doing is blackmailing the provinces into spending on things they didn’t really intend to spend money on, during a period when most provinces are trying to control spending, not increase it.  (I might also explain to the federal government that when it says it wants to consult with provinces, it’s generally more effective to do so before announcing the program rather than after.)

I’m sure there are many in Ottawa (including some higher education membership organizations) who think the idea of adding infrastructure to student aid and research as areas of shared jurisdiction in higher education would be just swell.  But it’s not entirely obvious to me that divorcing capital investment policy from system-level strategy is a recipe for good outcomes.  I suspect this is going to be part of a debate on “fiscal imbalances” between federal and provincial governments sometime quite soon.  Watch this space.

January 29

Asleep at the Switch…

… is the name of a new(ish) book by Bruce Smardon of York University, which looks at the history of federal research & development policies over the last half-century.  It is a book in equal measures fascinating and infuriating, but given that our recent change of government seems to be a time for re-thinking innovation policies, it’s a timely read if nothing else.

Let’s start with the irritating.  It’s fairly clear that Smardon is an unreconstructed Marxist (I suppose structuralist is the preferred term nowadays, but this is York, so anything’s possible), which means he has an annoying habit of dropping words like “Taylorism” and “Fordism” like crazy, until you frankly want to hurl the book through a window.  And it also means that there are certain aspects of Canadian history that don’t get questioned.  In Smardon’s telling, Canada is a branch-plant economy, always was a branch-plant economy, and ever shall be one until the moment where the state (and I’m paraphrasing a bit here) has the cojones to stand up to international capital and throw its weight around, after which it can intervene to decisively and effectively restructure the economy, making it more amenable to being knowledge-intensive and export-oriented.

To put it mildly, this thesis suffers from the lack of a serious counterfactual.  How exactly could the state decisively rearrange the economy so as to make us all more high-tech?  The best examples he gives are the United States (which achieved this feat through massive defense spending) and Korea (which achieved it by handing over effective control of the economy to a half-dozen chaebol).  Since Canada is not going to become a military superpower and is extremely unlikely to warm to the notion of chaebol, even if something like that could be transplanted here (it can’t), it’s not entirely clear to me how Smardon expects something like this to happen, in practice.  Occasionally, you get a glimpse of other solutions (why didn’t we subsidize the bejesus out of the A.V. Roe corporation back in the 1960s?  Surely we’d be an avionics superpower by now if we had!), but most of these seem to rely on some deeply unrealistic notions about the efficiency of government funding and procurement as a way to stimulate growth.  Anyone remember Fast Ferries?  Or Bricklin?

Also – just from the perspective of a higher education guy – Smardon’s near-exclusive focus on industrial research and development is puzzling.  In a 50-year discussion of R&D, Smardon essentially ignores universities until the mid-1990s, which seems to miss quite a bit of relevant policy.  Minor point.  I digress.

But now on to the fascinating bit: whatever you think of Smardon’s views about economic restructuring, his recounting of what successive Canadian governments have done over the past 50 years to make the Canadian economy more innovative and knowledge-intensive is really quite astounding.  Starting with the Glassco commission in the early 1960s, literally every government drive to make the country more “knowledge-intensive” or “innovative” (the buzzwords change every decade or two) has taken the same view: if only publicly-funded researchers (originally this meant the NRC, now it means researchers in universities) could get their act together and talk to industry and see what their problems are, we’d be in high-tech heaven in no time.  But the fact of the matter is, apart from a few years in the 1990s when Nortel was rampant, Canadian industry has never seemed particularly interested in becoming more innovative, which is why we perennially lag the entire G7 with respect to our record on business investment in R&D.

You don’t need to buy Smardon’s views about the potentially transformative role of the state to recognize that he’s on to something pretty big here.  One is reminded of the dictum about how the definition of insanity is doing the same thing over and over, and expecting a different result.  Clearly, even if better co-ordination of public and private research efforts is a necessary condition for swifter economic growth, it’s not a sufficient one.  Maybe there are other things we need to be doing that don’t fit into the Glassco framework.

At the very least, it seems to me that if we’re going to re-cast our R&D policies any time soon, this is a point worth examining quite thoroughly, and Smardon has done us all a favour by pointing this out.

Bon weekend.

December 07

H > A > H

I am a big fan of the economist Paul Romer, who is most famous for putting knowledge and the generation thereof at the centre of discussions on growth.  Recently, on (roughly) the 25th anniversary of the publication of his paper on Endogenous Technological Change, he wrote a series of blog posts looking back on some of the issues related to this theory.  The most interesting of these was one called “Human Capital and Knowledge”.

The post is long-ish, and I recommend you read it all, but the upshot is this: human capital (H) is something stored within our neurons, which is perfectly excludable.  Knowledge (A) – that is, human capital codified in some way, such as writing – is nonexcludable.  And people can use knowledge to generate more human capital (once I read a book or watch a video about how to use SQL, I too can use SQL).  In Romer’s words:

Speech. Printing. Digital communications. There is a lot of human history tied up in our successful efforts at scaling up the H -> A -> H round trip.

And this is absolutely right.  The way we turn a pattern of thought in one person’s head into thoughts in many people’s heads is the single most important question in growth and innovation, which in turn is the single most important question in human development.  It’s the whole ballgame.

It also happens to be what higher education is about.  The teaching function of universities is partially about getting certain facts to go H > A > H (that is, subject matter mastery), and partially about getting certain modes of thought to go H > A > H (that is, ways of pattern-seeking, sense-making, meta-cognition, call it what you will). The entire fight about MOOCs, for instance, is a question of whether they are a more efficient method of making H > A > H happen than traditional lectures (to which I think the emerging answer is that they are competitive if the H you are talking about is “fact-based”, and not so much if you are looking at the meta-cognitive stuff).  But generally, “getting better” at H > A > H in this way is about getting more efficient at the transfer of knowledge and skills, which means we can do more of it for the same price, which means that economy-wide we will have a more educated and productive society.

But with a slight amendment it’s also about the research function of universities.  Imagine now that we are not talking H > A > H, but rather H > A > H1.  That is, I have a certain thought pattern, I put it into symbols of some sort (words, equations, musical notation, whatever) and when it is absorbed by others, it generates new ideas (H1). This is a little bit different than what we were talking about before.  The first is about whether we can pass information or modes of thought quickly and efficiently; this one is about whether we can generate new ideas faster.

I find it helpful to think of new ideas as waves: they emanate outwards from the source and lose intensity as they move further from it.  But the speed of a wave is not constant: it depends on the density of the medium through which the ideas move (sound travels faster through solids than through water, and faster through water than through air, for instance).

And this is the central truth of innovation policy: for H > A > H1 to work, there has to be a certain density of receptor capacity for the initial “A”.  A welder who makes a big leap forward in marine welding will see her ideas spread more quickly if she is in Saint John or Esquimalt than if she is in Regina.  To borrow Matt Ridley’s metaphor of innovation being about “ideas having sex”, ideas will multiply more if they have more potential mates.

This is how tech clusters work: they create denser mediums through which idea-waves can pass; hence, they speed up the propagation of new ideas, and hence, under the right circumstances, they speed up the propagation of new products as well.

This has major consequences for innovation policy and the funding of research in universities.  I’ll explain that tomorrow.

November 18

The Radical Implications of David Turpin’s Installation Speech

David Turpin was installed as President at the University of Alberta earlier this week.  His inaugural speech was good.  Very good.  Read a shortened version of it here.

(Full disclosure: I spoke at a leadership function at the University of Alberta in August, for which I received a fee.  The University has also recently purchased two of our syndicated research products.  Make of that what you wish.)

The speech starts out with what I would call some standard defences of the university, which any president would give: we seek truth and knowledge, we innovate, and we create jobs, yadda yadda.  Where it gets interesting is where he starts his appeal to the provincial government.  Let me quote what I think are the key bits:

“Our task continues to be to ask unexpected questions, seek truth and knowledge, and help society define, understand and frame its challenges. Our goal for the future is to find new and innovative ways to mobilize our excellence in research and teaching to help municipal, provincial, national and international communities address these challenges.”

Note: the truth/knowledge tasks “continue”, but now we’re adding a “goal” of mobilizing the university’s talents to address “challenges”.  And these are not just abstract challenges.  Turpin gets very, very specific here:

To our municipal partners: We will work with you to address your major goals on poverty reduction, homelessness, downtown revitalization, infrastructure renewal and transportation.

To our provincial partners: We will work with you to strengthen a post-secondary education system that serves the needs of all Alberta’s learners. We will provide our students the educational experience they need to seed, fuel and drive social, cultural and economic diversification. We will advance social justice, leading reconciliation with our First Nations and protection for minorities. We will conduct research to sustainably develop Alberta’s wealth of natural resources and improve Albertans’ health and wellness.

These are really specific promises.  If I’m a municipal or provincial official, what I hear from this is “Cool! U of A is going to be my think tank!  It’s going to put expertise at my disposal in areas like poverty reduction and economic diversification”.  That may or may not be Turpin’s intent, but it’s what they will hear.  And that’s well beyond the traditional role of a university in Canada, and in some ways beyond even some of the “state service” commitments that exist in US Land Grant institutions.  Sure, ever since von Humboldt, universities have been there to serve and strengthen the state, but I think the way Turpin is articulating this is genuinely new.

Now, no doubt the University has enormous resources to help achieve all of these things.  But those resources are mostly faculty members and grad students.  And while the university can ask them nicely to help folks at city hall/the legislature when they come calling, the question is: what’s in it for the profs and grad students to drop what they’re doing and go help the city/province (especially if they feel they have better things to do)?  Is the expectation that staff will do this out of a collective desire to contribute to their communities, or will incentives be put in place?

This goes deep to the heart of a university’s research mission.  At research universities like U of A, tenure and promotion are based mostly on publication records, and time is supposed to be split 40-40-20 between teaching, research, and service.  But if your provost walks down the hall and says “hey, I just met with a couple of MLAs, and they’re hoping they can borrow your expertise for a couple of weeks”, do those expectations now change?  Will tenure/promotion committees actually count work done for government as equivalent to work published in an academic journal?

(For those of you not native to academe, it may seem amazing that research done for public policy, something that changes the way government makes decisions in a certain area, is not rated as highly for tenure/promotion as publishing things in journals that on average are read by a handful of people.  It is amazing, yes.  But true more often than not.)

If the answer to those questions is no, then I don’t think this initiative will go far.  But if the answer is yes, then Turpin is literally talking about a new kind of university, one that is prepared to sacrifice at least some of the prestige associated with being a “world-class university” with a laser-like focus on publication outputs, in order to contribute to its community in very concrete ways.  It’s not a reduction in research intensity, but it is a different type of research intensity.

The risk, of course, is that this new type of intensity won’t come with as many dollars attached.  I hope that’s not the case.  But in any event, this could be quite an exciting experiment.  One definitely worth keeping an eye on.

November 11

Times You Wish There Was a Word Other Than Research

There is something about research in modern languages (or English, as we used to call it) that sets many people’s teeth on edge, but usually for the wrong reasons.

Let’s go back a few months to Congress, specifically to an article Margaret Wente wrote where she teed off on a paper called “Sexed-up Paratext: The Moral Function of Breasts in 1940s Canadian Pulp Science Fiction”.   Her point was mostly “whatever happened to the great texts?”  Which, you know: who cares?  The canon is overrated, and the transversal skills that matter can be taught through many different types of materials.

But she hit a nerve by articulating a point about research in the humanities, and why the public feels uneasy about funding it.  Part of it is optics, and what looks to outsiders like childish delight in mildly titillating or “transgressive” titles.  But mostly, it just doesn’t “look like” what most people think of as research.  It’s not advancing our understanding of the universe, and it’s not making people healthier, so what’s it doing other than helping fuel career progression within academia?  And that’s not a judgement at all on what’s in the paper itself (I haven’t read the paper, and I can’t imagine Wente did either); even if it were the best paper at Congress, people who defend the humanities wouldn’t likely point to a paper whose title contains the words “the Moral Function of Breasts” as a way to showcase the value of humanities research.  The title just screams self-indulgence.

And yet – as a twitter colleague pointed out at the time – whoever wrote this piece probably is a great teacher.  With this kind of work, they can show the historical roots of things like sexuality in comics, which is highly relevant to modern issues like Gamergate.  If we want teachers to focus on material that is relevant and can engage students, and if you really want scholarly activity to inform teaching, surely this is exactly the kind of thing that should be encouraged.  As scholarly activity, this is in a completely different – and much, much better – category than, say, the colonoscopic post-modernist theorizing that was so memorably skewered during the Sokal Affair because you can clearly see the benefits for teaching and learning.

But is it “research”?

The academy doesn’t like to talk about this much because, you know, you stick to your discipline and I’ll stick to mine.   You can push the point if you want and claim that all research is similar because, regardless of discipline, research is an exercise in pattern recognition.  There are, however, some fundamental differences between what the sciences call research and what the humanities call research.   In the sciences, people work to uncover laws of nature; in the social sciences, people (on a good day) are working on laws (or at least patterns) of human behaviour and interaction.  In the humanities, especially English/Modern Languages, what’s essentially going on is narrative-building.  That’s not to say that narratives are unimportant, nor that the construction of good narrative is easier than other forms of scholarly work.  But it is not “discovery” in the way that research is in other disciplines.

And here’s the thing: when the public pays for research, it thinks it’s paying for discovery, not narrative-building.  In this sense, Wente taps into something genuine in the zeitgeist; namely, the public claim that “we’re being duped into paying for something to which we didn’t agree”.  And as a result, all research comes under suspicion.  This is unfortunate: we’re judging two separate concepts of scholarly work by a single standard, and both end up being found suspect because one of them is mislabeled.

To be clear: I am not at all suggesting that one of these activities is superior to the other.  I am suggesting that they are different in nature and impact.  For one thing, the most advanced scientific research is mostly unintelligible to lower-year undergraduates, whereas some of the best narrative work is actually – much like Sexed-up Paratext – intended precisely to render some key academic concepts more accessible to a broader audience.

It is precisely for this reason that we really ought to have two separate words to describe the two sets of activities.  The problem is finding one that doesn’t create an implicit hierarchy between the two.  I think we might be stuck with the status quo.  But I wish we weren’t.
