Higher Education Strategy Associates

Category Archives: Research

September 20

How Families Make PSE Choices

Over the last few months at HESA Towers we’ve been doing a lot of interviews of parents of grade 12 students, to help understand what it is that shapes and shifts their perceptions of higher education institutions.  I can’t give away much of the content here (that’s for paying customers), but one issue I do think is worth a mention is what we’re finding about how families make decisions about post-secondary education.

Researchers tend to conceive of decision-making in post-secondary as a pretty linear process, at least where traditional-aged students are concerned.  Parents and students, separately or in tandem, research possible career avenues and try to match them with educational pathways and students’ own interests.  They research programs and institutions, and try to judge quality.  They examine their finances – preferably jointly – and discuss what is affordable.  And on the basis of these various pieces of information, they winnow down the number of potential programs/institutions from a large number to a fairly small set of places to which one might apply, and finally down to a single institution.  Conceptually, it’s like a funnel, wide at the top and then narrowing gradually as students and parents seek and process information.

As a conceptual model, this suffers from just one problem: it’s mostly wrong.

Here’s what we’ve found instead.  The first is that the notion of parents and students “discussing” post-secondary options is valid only if you think of “discussions” as being asynchronous snatches of conversation that stretch over months or even years.  Parents do not really see their role as one of getting students to decide on choices.  In fact, most assume that the more direct they are about discussing or suggesting options, the more their kids will disengage or oppose them.  Instead, parents see their role as almost horticultural.  They “plant seeds” with their kids by suggesting ideas here and there, but more or less allow them to come to their own conclusions.

Another key assumption about family decision-making is that money is a central part of the discussion and plays an important role in the eventual choice of institution.  And here the answer is basically “yes and no”.  Parents do talk to their kids about money in general terms.  A few don’t – some refuse to talk about it so as “not to distract them”, others save money but don’t tell their kids about it to “keep them motivated” – but for the most part parents let their kids know, at least in general terms, how much money is available.

But when it comes to choosing an institution, money plays an ambiguous role.  It’s pretty clear that most parents would prefer it if their kid stayed home, for financial reasons.  For the most part, kids are more than happy to study close to home, too, so for them the issue of money simply doesn’t impinge on choice.  Money (or lack of it) really only comes into play once a student starts coming close to making a decision that involves going away to school.

Not surprisingly, parents are reluctant to spend money on kids who they think are unlikely to benefit much by going away to school.  This isn’t just a preference for spending less money rather than more: many parents of grade 12 students simply don’t think their kids are mature enough or organized enough to go away.  But – and here’s where it gets interesting – parents don’t necessarily express this opinion by talking to their kids about money.  Another way they do it is to talk up local schools – or at least avoid talking up more distant ones – during their “planting seeds” discussions and hope the kid comes to the preferred conclusion on his or her own (though to some extent this also reflects greater parental familiarity with local as opposed to more distant institutions).

On the other hand, if the kid is perceived as actually having their act together – partially an issue of grades but also one of having goals and a sense of purpose – money becomes less of an issue for parents.  It’s not that the issue disappears or that they’ll let their kids do whatever they want, but if parents think their kid has their act together, they are more open to allowing the student to drive the decision about where to go to school.

So in other words, much of the “discussion” occurs by way of kids spending hundreds of hours cracking the books (or not cracking them, as the case may be) and thus sending signals to parents about their talents and capacities.  Based on the presence/absence of said capacities, parents gradually, over a number of years, drop hints about institutional and program preferences, sometimes to kids who have a hopelessly short attention span about such things.  Sometime between mid-grade 11 and early grade 12, the students themselves become serious about searching for institutions.  When they start this process, they are doing so after having experienced years of subtle (or maybe not-so-subtle) hints from their parents about what kinds of programs and institutions are acceptable.  From that, they make a choice.  Then, and pretty much only then, do discussions about money explicitly come into the open.  But in many cases that discussion does not need to happen, because the student has already made the “correct” (from the parents’ point of view) choice, which will unlock a contribution sufficient to get the student into school.

As noted above, this is quite different from how most college choice theories describe the decision-making process. And I think this has some serious consequences for the way we communicate issues of price and affordability to families and students.  There need to be some very simple, general messages about how aid makes education affordable, wherever one chooses to undertake it, that can be hammered home over and over.   Communicating the specific details of aid programs is almost a total waste of time until very late in the final year of high school, after the institutional choice has already been made. It’s almost as if two different information products need to be created for two different audiences: a simple and general one for use during the choice process and a detailed one for afterwards.

There are also quite a lot of implications here for how institutions should sell themselves.  But those we keep for our institutional clients.  Drop us a line (info@higheredstrategy.com) if you’re interested in becoming one.

August 29

Fundamental Choices on Fundamental Science

The federal government has been somewhat quiet on the subject of science funding since the release of the Fundamental Science Review (see previous blogs here, here and here) back in April.  Within much of the scientific community, which for the most part fell head over heels in love with the Report, this has given cause for concern; personally, I think this is pretty much par for the course, and we aren’t likely to see much in the way of hints about the size of any possible investment until October or so.

The major good piece of news is that, for the first time in a long time, economic growth is running way ahead of expectations, and the likelihood is there’s going to be about $10 billion more in the fiscal framework than originally expected.  Now, odds are they’ll blow some of that on projects designed to keep Kathleen Wynne in power, some on daycare, and maybe a bit on deficit reduction just to show they haven’t totally forgotten their pledges around fiscal restraint, but there should be enough left in the till to put a decent amount of money towards science.

But the question is: will they?  And how fundamentally will they re-shape the system in the process?

To give you the research situation in a nutshell:  Apart from a brief blip in the 2016 budget, the overall granting council budget has been falling gently in real dollars since 2010.  The overall science budget has actually stayed steady or even increased, but a lot of that extra money is going to programs like the new Canada 150 Research Chairs, the Canada First Research Excellence Fund, etc.  And within the granting council budgets, an increasing amount of money has been diverted away from fundamental research.  Some of it has gone to more graduate scholarships, but there has also been an increasing focus on making research more focussed on end-use, creating partnerships with industry (which has a similar effect), etc.  Add to that the fact that some granting councils (notably CIHR, whose management decisions over the last decade appear to be the result of sustained cane toad licking) have started substantially reducing the number of awards they give out each year in order to increase the average size of their awards.

This has varied outcomes from a political point of view.  A large number of individual researchers in basic sciences, particularly biology and medicine, are livid.  A smaller number of researchers with more strength in translational and applied research are doing just fine, thank you very much.  And the universities, which are still by and large getting the money they want, recognize that many of their employees are unhappy campers; however, since they continue to receive money either way, the status quo isn’t intolerable even if it isn’t ideal.

Now, into this steps David Naylor and his fellow commissioners with a report on how to fix it.  They ask for a whole lot of money: $1.3 billion in funding for fundamental research phased in over four years.   But – and here’s the tricky bit – how to pay for it?  Do you ask for completely 100% new money?  Because that’s a lot.  It’s something like a 30% increase, which not many programs get these days.  Or do you say: hey, let’s undo all those bad decisions of the past decade and dismantle CFREF, the Excellence Chairs and whatnot, rejig the council funding so less of their money goes to translational research, etc. (Nassif Ghoussoub outlines one possible approach along these lines here). Basically, spend the money we have better before asking for more dollars.

If it were me, I’d take option two.  But that would create winners and losers, and governments hate that, even if the winners in this case would be very loud and happy.  So Naylor and co. went with option one: ask for all the money to be new.  Well, they actually did kind of say all that other money (CFREF, CERCs) was bunk, because there were a lot of “this program should be reviewed but it’s out of our scope” comments (not sure it actually was out of scope, but leave that aside), but they very specifically avoided saying “let’s repurpose some money.”  It’s a higher-risk strategy, I think, because you need to ask for a larger sum of money, but on the plus side: no losers.

What will the outcome be?  If I had to guess, it’s that Naylor will mostly get his wish on funding because, fortuitously, money is available and they can probably get by without much re-purposing. But if that hadn’t been the case (and it still may not be – there’s still plenty of time for a Black Swan event between now and budget day), who knows what would have happened?  Because just as turkeys don’t vote for Christmas, you know there is literally no one in Ottawa willing to brief the politicians on the re-purposing option.

Which is too bad, because even with all the research money in the world, it’s still important to spend it properly.

August 28

Welcome Back

Morning all.  Hope you had a good summer.  To welcome you back, let’s take a quick look at the state of play in the sector as we start the academic year.

In Canadian PSE, I don’t think there’s a whole lot of doubt about where things are headed this year.  Post-Naylor, we’re going to be talking research, research, research.  If you doubt this, take a look at Universities Canada’s recent budget submission.   As always, there are three “asks”; for the first time I can remember, all three are about research.  It’s clear that scientists – particularly those in health-related fields, who have been jerked around the most in recent years – have been making their voices heard, and that University Presidents at least are responding to that pressure by making this issue central to higher education lobbying for the next twelve months.

(I think this is poor form, actually.  Less than two years on from the Truth and Reconciliation Commission, with not an enormous amount of progress immediately evident, I’m not sure how appropriate it is not to have something on indigenous education this year.  But I’m not in charge.)

“Superclusters” will probably get a lot of mileage as they get announced in the run-up to the budget next winter.  Apparently the competition – which is supposed to have five winners – received 50 applications, each of which was supposed to have at least one post-secondary education partner.  However, we’ve also been told that only 20 PSE institutions’ names were attached to these proposals.  From this we can deduce that i) it’s likely that a handful of institutions – no prizes for guessing which ones – are on three or more proposals (rumour has it one is on no less than 18), and either ii) almost none of these proposals have more than one participating PSE institution or iii) the same institutions appear over and over again.  If it’s the latter, then the program has basically abandoned the idea of clusters being geographic in nature, and is essentially back to the Networks of Centres of Excellence model but with some private enterprise attached.  Which defeats the purpose of this stuff, in my view.  No self-sustaining cluster gets by on research alone. It gets by more than anything on having lots of trained workers of various kinds.  And that means colleges and polytechnics *have* to be part of the mix.  If they’re not, then this whole thing is a conceptual failure from the get-go.

(But hey, this is Ottawa.  No one’s ever going to measure the results.  And even if by some miracle the policy was found officially wanting, presumably they can always claim that it’s because they didn’t spend enough money.)

While much of the attention will be focussed on Ottawa, remember we live in a federal country.  For most institutions, the real game this year will be in provincial capitals.  2018 is going to see elections in Ontario (June) and Quebec (October).  Combined with a minority legislature in British Columbia, what we have is a situation where the country’s three largest provinces – all of whom have budgets which are more or less in balance – are going to be in spending mode for the next twelve months.  Not everyone is going to share in this bounty, of course. My guess would be that Manitoba, Newfoundland and Saskatchewan are going to see continued or intensified restraint and from what I hear Alberta is about to find out exactly how miserable a tuition freeze combined with zero funding growth can be.  But still, for the sector as a whole, what we have right now is possibly the best alignment of the constellations we’ve seen in about a decade.

Outside Canada, I think the big stories are going to be in Brazil and Russia, where the lingering effects of the commodity price collapse have left state budgets in very weak shape to fund higher education.  In England, the chaotic combination of Brexit and a historically incompetent/cowardly government will surely provide some entertainment, while in the US the twin topics of free speech on campus and the dismantling of many Obama-era improvements in student policy – particularly in the area of oversight of private colleges – will get top billing, even if President Trump’s own ideas about student aid are surprisingly generous.

Here on this blog, I’m hoping to shift topic areas slightly this year.  Often last year I felt I wasn’t adding much to discussions beyond what I had already contributed in the previous five years, and I do worry sometimes about the blog feeling stale.  I hope this year to be able to focus a little bit more on areas I’ve dealt with less fulsomely in the past: particularly, colleges & polytechnics and on international PSE (the latter with a bit of a data focus).  Also, at some point this fall we will be moving to accept advertisements. We’ll see how all that goes.

And with that: have a good year, everyone.  Let’s get to work.

June 06

Making “Applied” Research Great Again

One of the rallying calls of part of the scientific community over the past few years is that under the Harper government there was too much of a focus on “applied” research and not enough of a focus on “pure”/“basic”/“fundamental” research.  This call is reaching a fever pitch following the publication of the Naylor Report (which, to its credit, did not get into a basic/applied debate and focussed instead on whether or not the research was “investigator-driven”, which is a different and better distinction).  The problem is that the line between “pure/basic/fundamental” research and applied research isn’t nearly as clear-cut as people believe, and the rush away from applied research risks throwing out some rather important babies along with the bathwater.

As long-time readers will know, I’m not a huge fan of a binary divide between basic and applied research.  The idea of “Basic Science” is a convenient distinction created by natural scientists in the aftermath of WWII as a way to convince the government to give them money the way it did during the war, but without having soldiers looking over their shoulders.  In some fields (medicine, engineering), nearly all research is “applied” in the sense that there are always considerations of the end-use for the research.

This is probably a good time for a refresher on Pasteur’s Quadrant.  This concept was developed by Donald Stokes, a political scientist at Princeton, just before his death in 1997.  He too thought the basic/applied dichotomy was pretty dumb, so like all good social scientists he came up with a 2×2 instead.  One consideration in classifying science is whether or not it involves a quest for fundamental understanding; the other is whether or not the researcher has any consideration for end-use.   And so what you get is the following:

                                           No consideration of use        Consideration of use
  Quest for fundamental understanding      Pure basic research (Bohr)     Use-inspired basic research (Pasteur)
  No quest for fundamental understanding   –                              Pure applied research (Edison)

(I’d argue that to some extent you could replace “Bohr” with “Physics” and “Pasteur” with “Medicine” because it’s the nature of the fields of research and not individual researchers’ proclivities, per se, but let’s not quibble).

Now what was mostly annoying about the Harper years – and to some extent the Martin and late Chretien years – was not so much that the federal government was moving money out of the “fundamental understanding” row and into the “no fundamental understanding” row (although the way some people go on you’d be forgiven for thinking that), but rather that it was trying to make research fit into more than one quadrant at once.  Sure, they’d say, we’d love to fund all your (top-left quadrant) drosophila research, but can you make sure to include something about its eventual (bottom-right quadrant) commercial applications?  This attempt to make research more “applied” is and was nonsense, and Naylor was right to (mostly) call for an end to it.

But that is not the same thing as saying we shouldn’t fund anything in the bottom-right corner – that is, “applied research”.

And this is where the taxonomy of “applied research” gets tricky.  Some people – including apparently the entire Innovation Ministry, if the last budget is any indication – think that the way to bolster that quadrant is to leave everything to the private sector, preferably in sexy areas like ICT, Clean Tech and whatnot.  And there’s a case to be made for that: business is close to the customer, let them do the pure applied research.

But there’s also a case to be made that in a country where the commercial sector has few big champions and a lot of SMEs, the private sector is always likely to have some structural difficulties doing the pure applied research on its own.  It’s not simply a question of subsidies: it’s a question of scale and talent.  And that’s where applied research as conducted in Canada’s colleges and polytechnics comes in.  They help keep smaller Canadian companies – the kinds that aren’t going to get included in any “supercluster” initiative – competitive.  You’d think this kind of research should be of interest to a self-proclaimed innovation government.  Yet whether by design or indifference we’ve heard nary a word about this kind of research in the last 20 months (apart perhaps from a renewal of the Community and College Social Innovation Fund).

There’s no reason for this.  There is – if rumours of a cabinet submission to respond to the Naylor report are true – no shortage of money for “fundamental”, or “investigator-driven” research.  Why not pure applied research too?  Other than the fact that “applied research” – a completely different type of “applied research”, mind you – has become a dirty word?

This is a policy failure unfolding in slow motion.  There’s still time to stop it, if we can all distinguish between different types of “applied research”.

May 17

Diversity in Canada Research Chairs

One of the hot topics in Ottawa over the past couple of months is the issue of increasing diversity among researchers.   Top posts in academia are still disproportionately occupied by white dudes, and the federal minister of Science, Kirsty Duncan, would like to change that by threatening institutions with a loss of research funding.

There’s no doubt about the nature of the problem.  As in other countries, women and minorities have trouble making it up the career ladder in academia at the same rate as white males.  The reasons for this are well-enough known that I probably needn’t recount them here (though if you really want a good summary try here and here).  There was a point when one might reasonably have suspected that time would take care of the problem.  Once PhD completion rates equalized (until the 1990s they still favored men) and female scientists began making their way up the career ladder, it might have been argued, the problem of representation at the highest levels would take care of itself.  But it quite plainly hasn’t worked out that way and more systemic solutions need to be found.  As for Indigenous scholars and scholars with disabilities, it’s pretty clear we still have a lot of pipeline issues to worry about and equalizing PhD completion rates, in addition to solving problems related to career progression, is a big challenge.

Part of what Ottawa is trying to do is to get institutions to take their responsibilities on career progression seriously by getting them each to commit to equity plans.  Last October, the government announced that institutions without equity plans will become ineligible for new CERC awards; earlier this month, Kirsty Duncan attached the same condition to the Canada Research Chairs (CRC) program.

(A quick reminder here about how the Chairs program works.  There are two types of awards: Tier 1 awards for top researchers, worth $200,000/year for seven years, and Tier 2 awards for emerging researchers, worth $100,000/year for five years.  There are 2000 awards in total, with roughly equal numbers of Tier 1 and Tier 2 awards.  Each university gets an allocation of chairs based – more or less – on the share of tri-council funding its staff received, with a boost for smaller institutions.  So, University of Toronto gets 256 chairs, Université Ste. Anne gets one, etc.  Within that envelope institutions are free to distribute awards more or less as they see fit.)

The problem is, as the Minister well knows, all institutions already have equity plans and they’re not working.  So she has attached a new condition: that institutions also fix the demographic distribution of chair holders so as to “ensure the demographics of those given the awards reflect the demographics of those academics eligible to receive them” by 2019.  It’s not 100% clear to me what this formulation means. I don’t believe it means that women must occupy 50% of all chairs; I am fairly sure that the qualifier “of those eligible to receive” means something along the lines of “women must occupy a percentage of Tier 1 chairs equal to their share of full professors, and of Tier 2 chairs equal to their share of associate and assistant professors”.

Even with those kinds of caveats, reaching the necessary benchmarks in the space of 18-24 months will require an enormous adjustment.  The figure I’ve seen for major universities is that only 28% of CRCs are women.  Given that only about 15-18% of chairs turn over in any given year, getting that up to the 40-45% range the benchmark implies by 2019 means that between 65 and 79% of all CRC appointments for the next two years will need to be female, and probably higher than that for the Tier 1s.  That’s certainly achievable, but it’s almost certain to be accompanied by a lot of general bitchiness among passed-over male candidates.  Brace yourselves.
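For the curious, here’s a minimal back-of-envelope sketch of that arithmetic in Python.  The model (a fixed fraction of chairs turning over each year, a fixed share of new awards going to women) and the figures plugged in are just the assumptions described above, not any official formula, but it lands in the same ballpark:

    # Toy model: each year a fraction `turnover` of chairs is re-awarded,
    # and a fraction `new_female` of those new awards goes to women.
    def required_new_female(start_share, turnover, target, years=2):
        # Weight of the incumbent population after `years` rounds of turnover...
        kept = (1 - turnover) ** years
        # ...plus the weight of each year's new appointments, discounted for
        # later turnover.  Solve start*kept + f*churn = target for f.
        churn = sum(turnover * (1 - turnover) ** k for k in range(years))
        return (target - start_share * kept) / churn

    for turnover in (0.15, 0.18):
        for target in (0.40, 0.45):
            f = required_new_female(0.28, turnover, target)
            print(f"turnover {turnover:.0%}, target {target:.0%}: "
                  f"{f:.0%} of new appointments would have to be women")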

But while program rules allow Ottawa to use this policy tool to take this major step for gender equality, it will be harder to use it for other equity categories.  Institutions don’t even really have a measure of how many of their faculty have disabilities, so setting benchmarks would be tricky.  Indigenous scholars pose an even trickier problem. According to the formula used for female scholars, Indigenous scholars’  “share” of CRCs might be 1%, or about 20 nationally.  The problem is that only five institutions (Alberta, British Columbia, McGill, Montreal, Toronto) have 100 or more CRCs and would thus be required to reserve a spot for an Indigenous scholar.  An institution like (say) St. FX, which has only five chairs, would have a harder time.  It can achieve gender equity simply by having two or three female chairs.  But how would it achieve parity for Indigenous scholars?  It’s unlikely it could be required to reserve one of its five spots (20%) for an Indigenous scholar.
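The granularity problem is easy to see in a couple of lines; the chair counts are the ones mentioned above, and the 1% figure is the hypothetical share just described:

    # With per-institution allocations, the smallest non-zero share an
    # institution can devote to any group is one chair out of its allocation.
    target = 0.01  # ~1% of ~2,000 chairs, i.e. roughly 20 chairs nationally
    for name, chairs in [("Toronto", 256), ("a 100-chair institution", 100),
                         ("St. FX", 5)]:
        print(f"{name}: one chair = {1 / chairs:.1%} of its allocation "
              f"(target share {target:.0%})")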

Many institutions would obviously hire Indigenous faculty anyway; it’s just that the institutional allocations which form the base of this program’s structure make it difficult to achieve some of what Ottawa wants to achieve on equity and diversity.

 

May 08

Naylor Report, Part II

Morning all.  Sorry about the service interruption.  Nice to be back.

So, I promised you some more thoughts about the Fundamental Science Review.  Now that I’ve had a lot of time to think about it, I’m actually surprised by what it doesn’t say, what it does say, and how many questions remain open.

What’s best about the report?  The history and most of the analysis are pretty good.  I think a few specific recommendations (if adopted) might actually be a pretty big deal – in particular the one saying that the granting councils should stop any programs forcing researchers to come up with matching funding, mainly because it’s a waste of everyone’s time.

What’s so-so about it?  The money stuff, for a start.  As I noted in my last blog post, I don’t really think you can justify a claim to more money based on the “proportion of higher education research investment coming from the federal government”.  I’m more sympathetic to the argument that there need to be more funds, especially for early career researchers, but as noted back here it’s hard to argue simultaneously that institutions should have unfettered rights to hire researchers but that the federal government should pick up responsibility for their career progression.

The report doesn’t even bother, really, to make the case that more money on basic research means more innovation and economic growth.  Rather, it simply states it, as if it were a fact (it’s not).  This is the research community trying to annex the term “innovation” rather than co-exist with it.  Maybe that works in today’s political environment; I’m not sure it improves overall policy-making.  In some ways, I think it would have been preferable to just say: we need so many millions because that’s what it takes to do the kind of first-class science we’re capable of.  It might not have been politic, but it would have had the advantage of clarity.

…and the Governance stuff?  The report backs two big changes in governance.  One is a Four Agency Co-ordinating Board for the three councils plus the Canada Foundation for Innovation (which we might as well now call the fourth council, provided it gets an annual budget as recommended here), to ensure greater cross-council coherence in policy and programs.  The second is the creation of a National Advisory Committee on Research and Innovation (NACRI) to replace the current Science, Technology and Innovation Council and do a great deal else besides.

The Co-ordinating committee idea makes sense: there are some areas where there would be clear benefits to greater policy coherence.  But setting up a forum to reconcile interests is not the same thing as actually bridging differences.  There are reasons – not very good ones, perhaps, but reasons nonetheless – why councils don’t spontaneously co-ordinate their actions; setting up a committee is a step towards getting them to do so, but success in this endeavour requires sustained good will which will not necessarily be forthcoming.

NACRI is a different story.  Two points here.  The first is that it is pretty clear that NACRI is designed to try to insulate the councils and the investigator-driven research they fund from politicians’ bright ideas about how to run scientific research.  Inshallah, but if politicians want to meddle – and the last two decades seem to show they want to do it a lot – then they’re going to meddle, NACRI or no.  Second, the NACRI as designed here is somewhat heavier on the “R” than on the “I”.  My impression is that as with some of the funding arguments, this is an attempt to hijack the Innovation agenda in Research’s favour.  I think a lot of people are OK with this because they’d prefer the emphasis to be on science and research rather than innovation but I’m not sure we’re doing long-term policy-making in the area any favours by not being explicit about this rationale.

What’s missing?  The report somewhat surprisingly punted what I expected to be a major issue: namely, the government’s increasing tendency over time to fund science outside the framework of the councils, in such programs as the Canada Excellence Research Chairs (CERC) and the Canada First Research Excellence Fund (CFREF).  While the text of the report makes clear the authors have some reservations about these programs, the recommendations are limited to a “you should review that, sometime soon”.  This is too bad, because phasing out these kinds of programs would be an obvious way to pay for increased investigator-driven funding (though as Nassif Ghoussoub points out here, it’s not necessarily a quick solution because funds are already committed for several years in advance).  The report therefore seems to suggest that though it deplores past trends away from investigator-driven funding, it doesn’t want to see these recent initiatives defunded, which might be seen in government as “having your cake and eating it too”.

What will the long-term impact of the report be? Hard to say: much depends on how much of this the government actually takes up, and it will be some months before we know that.  But I think the way the report was commissioned may have some unintended adverse consequences.  Specifically, I think the fact that this review was set up in such a way as to exclude consideration of applied research – while perfectly understandable – is going to contribute to the latter being something of a political orphan for the foreseeable future.  Similarly, while the fact that the report was done in isolation from the broader development of Innovation policy might seem like a blessing given the general ham-fistedness surrounding the Innovation file, in the end I wonder if the result won’t be an effective division of policy, with research being something the feds pay universities to do and innovation something they pay firms to do.  That’s basically the right division, of course, but what goes missing are vital questions about how to make the two mutually reinforcing.

Bottom line: it’s a good report.  But even if the government fully embraces the recommendations, there are still years of messy but important work ahead.

April 18

Naylor Report, Take 1

People are asking why I haven’t talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is: i) I’m swamped; ii) there’s a lot to talk about in there; and iii) I want to have some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it’s the latter I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways that the panel argues this point is that while Canada has among the world’s highest levels of Research and Development in the higher education sector – known as HERD if you’re in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:

[Chart from the Naylor Report: higher education R&D (HERD) by funding source, across countries]

Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education portion of the contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I’ll come back to it in a moment) plus ii) some imputation about unrecovered research overhead.  In the US, it’s just the latter.  Why?  Because of the way the US collects data on HERD, the only faculty costs they capture are the chunks taken out of federal research grants.  Remember, in the US, profs are only paid 9 months per year and, at least in the R&D accounts, that’s *all* teaching.  Only the pieces of research grants they take out as summer salary get recorded as R&D expenditure (and hence also as a government-sponsored cost rather than a higher education-sponsored one).

But there’s a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion of HERD, then it’s worth remembering what’s going on in the denominator.  Aggregate salaries are the first component.  The second component is research intensity, as measured through surveys.  This appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30-35% of their time on research. A more recent survey shows that this has risen to 42%.  I am not sure whether this latest coefficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education “spending” (or “investment”, if you prefer) on research, despite nothing really having changed at all (possibly it already has been, and that is what explains the bump in expenditures seen in 2012-13).
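To see how mechanical that jump would be, here’s a toy calculation.  The dollar figures are invented purely for illustration; only the two survey coefficients come from the paragraph above:

    # Toy version of the higher-ed contribution to Canadian HERD:
    # aggregate faculty salaries x research-time coefficient,
    # plus an imputation for unrecovered research overhead.
    salaries = 10_000_000_000   # made-up aggregate faculty salaries ($)
    overhead = 2_000_000_000    # made-up unrecovered-overhead imputation ($)

    def he_contribution(research_share):
        return salaries * research_share + overhead

    old = he_contribution(0.33)   # circa-2000 survey: ~30-35% of time on research
    new = he_contribution(0.42)   # more recent survey: 42%
    print(f"measured higher-ed 'investment' rises {(new - old) / old:.0%} "
          "without anyone doing anything different")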

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding of inquiry-driven research shouldn’t rise.  Just to say that using data on university-funded HERD might not be a super-solid base from which to argue that point.

December 08

Cluster Theory

Unless you’ve been under a rock for the last twelve months, you’ll have noted that the Government of Canada has become enamoured of “innovation clusters” as a means of raising national productivity levels.  What should we make of this?

For some annoying reason, the Liberals act as if cluster theory is something new rather than something which dates back to the mid-1980s (Michael Porter’s The Competitive Advantage of Nations gave the idea its first mass-market outing in 1990; four years later, AnnaLee Saxenian gave us what is probably still the most engaging short book-length description of cluster formation in Regional Advantage).  In fact what is actually new – in Canada at least – is the idea that the federal government should encourage cluster formation/densification with great huge wads of cash.  $800 million over four years, in fact, according to the Liberal manifesto and the 2016 Budget.

There are three reasons to be skeptical about this set of developments.  One is political, the second administrative, and the third is empirical.

The political problem is this: we live in Canada.  There is no way on God’s green earth that doling out money for what amounts to economic development (or, say it softly, “industrial policy”) isn’t going to get 100% enmeshed in regional pork-barrelling.  Initially, the Government’s plan was for five clusters (I’ve heard it may now be for as many as eight).  Well, isn’t that convenient – five clusters, five regions.  I mean, put away all your crystal balls about what’s going to get funded: it’ll be something Ocean-y in the Atlantic, something aerospace-y in Quebec, ICT-y in Ontario, Energy-y in the Prairies and (probably) life sciences-y in BC.  Whether each of these clusters is equally deserving of, or has the capacity to absorb, public dollars is irrelevant once regional politics comes into play.   The inevitable result is sub-optimal investments.

The second issue is an administrative one.  Say you want to spend $150 million (or so) on “a cluster” in a variety of ways that increase research productivity, corporate partnerships, etc.  It’s not just a question of deciding among hundreds of worthy micro-projects within a $150 million budget.  Who actually manages the project?  It’s not like giving money to a university or a hospital – a cluster has no corporate entity.  Occasionally, you get a trade organization that might conceivably act as a co-ordinator of a cluster, like say Communitech in Waterloo, and you could use them to distribute money in a way that made sense regionally.  But i) not every cluster has one of those and ii) even if they do, they’re going to tend to be biased towards established players rather than new ones.  The only alternative is to manage it all from Ottawa, but that’s a frightening prospect for a project that’s meant to improve industry flexibility.

Which brings us to the third, empirical, problem.  I’ve said this before but it bears repeating: a lot of the research on innovation is American, and assumes things like having DARPA around, and being at the technological frontier and having access to lots of venture capital and all that good stuff.  Most countries in the world don’t have that.  In fact, when most countries in the world (including us) think about “clusters” they are thinking about something fundamentally different than what Americans think of when they use that term, because our cluster thinking is designed as much around attracting established foreign companies as it is around developing native entrepreneurial talent.

And here’s a little secret: there are almost no good examples anywhere of clusters having been built on government money.  In fact, to the extent that anyone can work out what it is that makes a great cluster, it’s the presence of one or two industry-leading companies plus one heck of a lot of spin-offs founded by disgruntled former employees who want to do their own thing (see especially Steven Klepper’s recent posthumously-published book Experimental Capitalism).  This is actually something most Canadian clusters are really bad at: the OECD cluster rankings, although now a bit dated, show Canadian clusters generally in the bottom half across the OECD for new company formation.  Government can do something about this, but it’s not by spending money; it’s by using law and regulation to make sure non-competes are unenforceable.  Surprisingly, given that this is supposed to be a government devoted to evidence-based policy, that issue doesn’t appear to show up at all in our government’s thinking on clusters.

So what are we spending money on, exactly?  And why?  To what end?  Although the government’s had over a year to work on this, it’s really hard to get a sense of what the plan is.  I suspect that a lot of this money will end up in the hands of universities because they know the “apply for government money” game really well and can play to the Minister’s predilection to be photographed in front of a lot of shiny hi-tech gadgetry.

But will any of it have the slightest effect on national productivity?  I have my doubts.

June 08

Are NSERC decisions “skewed” to bigger institutions?

That’s the conclusion reached by a group of professors from – wait for it – smaller Canadian universities, as published recently in PLOS One. I urge you to read the article, if only to understand how technically rigorous research without an ounce of common sense can make it through the peer-review process.

Basically, what the paper does is rigorously prove that “both funding success and the amount awarded varied with the size of the applicant’s institution. Overall, funding success was 20% and 42% lower for established researchers from medium and small institutions, compared to their counterpart’s at large institutions.” 

They go on to hypothesize that:

“…applicants from medium and small institutions may receive lower scores simply because they have weaker research records, perhaps as a result of higher teaching or administrative commitments compared to individuals from larger schools. Indeed, establishment of successful research programs is closely linked to the availability of time to conduct research, which may be more limited at smaller institutions. Researchers at small schools may also have fewer local collaborators and research-related resources than their counterparts at larger schools. Given these disparities, observed funding skew may be a consequence of the context in which applicants find themselves rather than emerging from a systemic bias during grant proposal evaluation.”

Oh my God – they have lower success rates because they have weaker research records?  You mean the system is working exactly as intended?

Fundamentally, this allegedly scientific article is making a very weird political argument.  The reason profs at smaller universities don’t get grants, according to these folks, is because they got hired by worse universities –  which means they don’t get the teaching release time, the equipment and whatnot that would allow them to compete on an even footing with the girls and boys at bigger schools.  To put it another way, their argument is that all profs have inherently equal ability and are equally deserving of research grants, it’s just that some by sheer random chance got allocated to weaker universities, which have put a downer on their career, and if NSERC doesn’t actively ignore actual outputs and perform some sort of research grant affirmative action, then it is guilty of “skewing” funding.

Here’s another possible explanation: yes, faculty hired by bigger, richer, more research-intensive institutions (big and research-intensive are not necessarily synonymous, but they are in Canada) have all kinds of advantages over faculty hired by smaller, less research-intensive universities.  But maybe, just maybe, faculty research quality is not randomly distributed.  Maybe big rich universities use their resources mainly to attract faculty deemed to have greater research potential.  Maybe they don’t always guess quite right about who has that potential and who doesn’t but on the whole it seems likelier than not that the system works more or less as advertised.

And so, yes, there is a Matthew effect (“for unto every one that hath shall be given, and he shall have abundance”) at work in Science: the very top of the profession gets more money than the strata below them, and that tends to increase the gap in outcomes (salary, prestige, etc.).  But that’s the way the system was designed.  If you want to argue against that, go ahead. But at least do it honestly and forthrightly: don’t use questionable social science methods to accuse NSERC of “bias” when it is simply doing what it has always been asked to do.

April 11

Those New Infrastructure Funds

I have been meaning to write about the new $2 billion “Strategic Investment Fund” (SIF), the 3-year infrastructure money-dump the Liberals announced in the budget.  However, I waited a bit too long and Paul Wells beat me to it in an excellent little article called How to Spend $2 Billion on Research Really Quickly (available here).

Do read Wells’ piece in its entirety, but the Coles Notes version is:

  1. The deadline for submission is quite soon (May 9), which is kind of a crazy goal for slow-moving organizations like universities to hit
  2. The money is not a straight-out grant: matching funding is required, which could be a bit of a challenge
  3. The amount of work required to get a shot at that money, in terms of getting engineering and regulatory approvals, environmental assessments, providing evidence of “additionality”, “sustainability”, “meeting industry needs”, “benefiting aboriginal populations” and of course getting approval from one’s Board of Governors, is stonkingly huge.

Those are all good points.  Let me add a couple more.

First of all, yes these things are challenging but hardly unprecedented. The timeline and process are almost exactly those seen in the Knowledge Infrastructure Program (KIP) the Tories created for the 2009 budget.  In both cases, the programs were announced with unbelievably tight timeline criteria (about two months from budget time to deadline) and the same matching funding requirement.  In both cases, the program was announced with eligibility criteria but no selection criteria.  That means we don’t really know what the government is looking for, what kinds of things it wants to see in submissions and how it will go about choosing from among the many submitted projects.  There is a margin for shenanigans here, but it’s the same margin the Tories had in ’09 and everyone seems to think that process went OK.

Second of all, the key thing to understand here is that although the rhetoric around infrastructure is always about “new” infrastructure, the fact of the matter is that, given the timelines and the rules, this program will be almost entirely about renovations and re-fits (and occasionally some expansions).  The tight timelines make it impossible to submit any build project that isn’t already in the pipeline, and the rule that the federal money shouldn’t displace already-committed money means pretty much anything in the pipeline is ineligible.  In my (admittedly non-random) quick scan of projects completed under KIP, I could only find one example of a project which was 100% new build, namely, the ART Lab studios at the University of Manitoba.

(Also – apparently U of M managed to get KIP to fund seven different projects.  Kudos to one or both of their planning shop and government relations shop).

Third, between twenty years of CFI funding plus now two rounds of KIP/SIF (let’s be honest, it’s the same program), one does start to wonder at what point we enter into a moral hazard position where the provinces essentially opt out of the infrastructure game because they know the feds will pony up – or indeed whether we haven’t actually reached that point in several provinces already.   True, the feds might respond by saying “but they can play a role by choosing the projects for which they want to provide matching funds”.  To which, if I were a provincial government, I might calmly explain that the feds should use this explanation for a rather protracted rectal examination, because in effect what they are doing is blackmailing the provinces into spending on things they didn’t really intend to spend money on, during a period when most provinces are trying to control spending, not increase it.  (I might also explain to the federal government that when it says it wants to consult with provinces, it’s generally more effective to do so before announcing the program rather than after.)

I’m sure there are many in Ottawa (including some higher education membership organizations) who think the idea of adding infrastructure to student aid and research as areas of shared jurisdiction in higher education would be just swell.  But it’s not entirely obvious to me that divorcing capital investment policy from system-level strategy is a recipe for good outcomes.  I suspect this is going to be part of a debate on “fiscal imbalances” between federal and provincial governments sometime quite soon.  Watch this space.
