Higher Education Strategy Associates

Tag Archives: funding

May 15

Provincial Budgets 2017

Springtime brings with it two certainties: 1) massive, irritating weekend traffic jams in Toronto as the city grants permits to close down Yonge Street for a parade for virtually any group of yahoos, thus making it impossible to get from the city’s east end to its west end; and 2) provincial budgets.  And with that, it’s time for my annual roundup of provincial budgets (click on the year for previous analyses – 2016, 2015, 2014, 2013).  It’s not as bad as last year, but it’s still kind of depressing.

Before we jump in, I need to remind everyone about some caveats on this data.  What is being compared here is announced spending in provincial budgets from year-to-year.  But what gets allocated and what gets spent are two different things. Quebec in particular has a habit of delivering mid-year cuts to institutions; on the flip side, Nova Scotia somehow spent 15% more than budgeted on its universities.  Also, not all money goes to institutions as operating funding:  this year, Newfoundland cut operating budgets slightly but threw in a big whack of cash for capital spending at College of the North Atlantic, so technically government post-secondary spending is up there this year.

One small difference this year from previous years: the figures for Ontario exclude capital expenditures.  Anyone who has a problem with that, tell the provincial government to publish its detailed spending estimates at the same time it delivers the budget like every other damn province.

This year’s budgets are a pretty mixed bunch.  Overall, provincial allocations after inflation fell by $13 million nationally – or just about 0.06%.  But in individual provinces the spread ran from +4% (Nova Scotia) to -7% (Saskatchewan).  Amazing but true: two of the three provinces with the biggest gains were ones in which an election was/is being held this spring.
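The "after inflation" comparisons throughout this post work by deflating the prior year's nominal allocation into constant $2017 before computing the change.  Here is a minimal sketch of that arithmetic; the CPI index values and allocations below are illustrative, not taken from any budget document:

```python
# Convert a nominal allocation to constant (base-year) dollars and compute
# the year-over-year real change. All figures below are hypothetical.

def to_constant_dollars(nominal, cpi_year, cpi_base):
    """Deflate/inflate a nominal amount into base-year dollars."""
    return nominal * cpi_base / cpi_year

# Illustrative CPI index values for the two fiscal years (~1.6% inflation).
cpi_2016, cpi_2017 = 128.4, 130.4

prev_real = to_constant_dollars(500.0, cpi_2016, cpi_2017)  # 2016-17 allocation, in $2017
curr_real = 510.0                                           # 2017-18 allocation, already $2017

pct_change = (curr_real - prev_real) / prev_real * 100
```

The point of the exercise: a nominal increase smaller than inflation shows up as a real-dollar cut, which is how several provinces in Figure 1 end up in negative territory despite flat or rising headline numbers.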

Figure 1: 1-Year change in Provincial Transfers to Post-Secondary Institutions, 2016-17 to 2017-18, in constant $2017



Now, this probably wouldn’t be such a big deal if it hadn’t come on the heels of a string of weak budgets for post-secondary education.  One year is neither here nor there: it’s the cumulative effect which matters.  Here’s the cumulative change over the past six years:

Figure 2: 6-year Change in Provincial Transfers to Post-Secondary Institutions, 2011-12 to 2017-18, in constant $2017



Nationally, provinces are collectively providing 1% less to universities in inflation-adjusted dollars in 2017-18 than they were in 2011-12.  Apart from the NDP governments in Manitoba and Alberta, it’s really only Quebec which has bothered to keep its post-secondary funding ahead of inflation.  Out east, it’s mostly been a disaster – New Brunswick universities are down 9% over the last six years (not the end of the world, because of concomitant enrolment declines), and Newfoundland’s are down a whopping 21%.

The story is different on the student aid front, because a few provinces have made some big moves this year.  Ontario and New Brunswick have introduced their “free tuition” guarantees, resulting in some significant increases in SFA funding, while Quebec is spending its alternative payment bonanza from the Canada Student Loans Program changes (long story short: under the 1964 opt-out agreement which permitted the creation of the Canada Student Loans Program, every time CSLP spends more, it has to send a larger cheque to Quebec).  On the other side, there’s Newfoundland, which has cut its student aid budget by a whopping 78%.  This appears to be because the province is now flouting federal student aid rules and making students max out their federal loans before accessing provincial aid, rather than splitting the load 60-40 as other provinces do.
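The difference between the two allocation rules is easy to see with a toy calculation.  This sketch assumes a hypothetical assessed need and federal loan ceiling (the real CSLP limits differ); it is meant only to show why the “max out federal first” rule slashes the provincial bill:

```python
# Two ways of dividing a student's assessed need between federal and
# provincial aid. The need and federal ceiling below are hypothetical.

def split_60_40(need):
    """Conventional practice: federal covers 60% of need, province 40%."""
    return 0.60 * need, 0.40 * need

def federal_first(need, federal_max):
    """Max out federal loans before any provincial aid kicks in."""
    federal = min(need, federal_max)
    return federal, need - federal

need = 10_000  # hypothetical assessed need

fed_a, prov_a = split_60_40(need)           # federal 6,000 / provincial 4,000
fed_b, prov_b = federal_first(need, 7_200)  # federal 7,200 / provincial 2,800
```

Under the second rule the province pays only the residual above the federal ceiling, which is how a provincial aid budget can collapse even though students’ total borrowing stays roughly the same.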

Figure 3: 1-Year change in Provincial Student Financial Aid Expenditures, 2016-17 to 2017-18, in constant $2017



And here’s the multi-year picture, which shows a 46% increase in student aid over the past six years, from $1.9 billion to just under $2.8 billion.  But there are huge variations across provinces.  In Ontario, aid is up 83% over six years (and OSAP now constitutes over half of all provincial student aid spending), while Saskatchewan is down by half and Newfoundland by 86%, mostly in the present year.  The one province where there is an asterisk here is Alberta, where there was a change in reporting in 2013-2014; the actual growth is probably substantially closer to zero than to the 73% shown here.

Figure 4: 6-Year change in Provincial Student Financial Aid Expenditures, 2011-12 to 2017-18, in constant $2017


So the overall narrative is still more or less the same as it’s been for the past few years.  On the whole, provincial governments seem a whole lot happier spending money on students than on institutions.  Over the long run that’s not healthy, and it needs to change.

May 08

Naylor Report, Part II

Morning all.  Sorry about the service interruption.  Nice to be back.

So, I promised you some more thoughts about the Fundamental Science Review.  Now that I’ve had some time to think about it, I’m actually surprised by what it does and doesn’t say, and by how many questions remain open.

What’s best about the report?  The history and most of the analysis are pretty good.  I think a few specific recommendations (if adopted) might actually be a pretty big deal – in particular the one saying that the granting councils should stop any programs forcing researchers to come up with matching funding, mainly because it’s a waste of everyone’s time.

What’s so-so about it?  The money stuff, for a start.  As I noted in my last blog post, I don’t really think you can justify a claim to more money based on the “proportion of higher ed research investment coming from the federal government”.  I’m more sympathetic to the argument that there need to be more funds, especially for early career researchers, but as noted back here, it’s hard to argue simultaneously that institutions should have unfettered rights to hire researchers but that the federal government should pick up responsibility for their career progression.

The report doesn’t even bother, really, to make the case that more money on basic research means more innovation and economic growth.  Rather, it simply states it, as if it were a fact (it’s not).  This is the research community trying to annex the term “innovation” rather than co-exist with it.  Maybe that works in today’s political environment; I’m not sure it improves overall policy-making.  In some ways, I think it would have been preferable to just say: we need so many millions because that’s what it takes to do the kind of first-class science we’re capable of.  It might not have been politic, but it would have had the advantage of clarity.

…and the Governance stuff?  The report backs two big changes in governance.  One is a Four Agency Co-ordinating Board for the three councils plus the Canada Foundation for Innovation (which we might as well now call the fourth council, provided it gets an annual budget as recommended here), to ensure greater cross-council coherence in policy and programs.  The second is the creation of a National Advisory Committee on Research and Innovation (NACRI) to replace the current Science, Technology and Innovation Council and do a great deal else besides.

The Co-ordinating committee idea makes sense: there are some areas where there would be clear benefits to greater policy coherence.  But setting up a forum to reconcile interests is not the same thing as actually bridging differences.  There are reasons – not very good ones, perhaps, but reasons nonetheless – why councils don’t spontaneously co-ordinate their actions; setting up a committee is a step towards getting them to do so, but success in this endeavour requires sustained good will which will not necessarily be forthcoming.

NACRI is a different story.  Two points here.  The first is that it is pretty clear that NACRI is designed to try to insulate the councils and the investigator-driven research they fund from politicians’ bright ideas about how to run scientific research.  Inshallah, but if politicians want to meddle – and the last two decades seem to show they want to do it a lot – then they’re going to meddle, NACRI or no.  Second, the NACRI as designed here is somewhat heavier on the “R” than on the “I”.  My impression is that as with some of the funding arguments, this is an attempt to hijack the Innovation agenda in Research’s favour.  I think a lot of people are OK with this because they’d prefer the emphasis to be on science and research rather than innovation but I’m not sure we’re doing long-term policy-making in the area any favours by not being explicit about this rationale.

What’s missing?  The report somewhat surprisingly punted on what I expected to be a major issue: namely, the government’s increasing tendency over time to fund science outside the framework of the councils, in programs such as the Canada Excellence Research Chairs (CERC) and the Canada First Research Excellence Fund (CFREF).  While the text of the report makes clear the authors have some reservations about these programs, the recommendations are limited to a “you should review that, sometime soon”.  This is too bad, because phasing out these kinds of programs would be an obvious way to pay for increased investigator-driven funding (though as Nassif Ghoussoub points out here, it’s not necessarily a quick solution, because funds are already committed for several years in advance).  The report thus seems to suggest that though it deplores past trends away from investigator-driven funding, it doesn’t want to see these recent initiatives defunded – a position that might be seen in government as wanting to have your cake and eat it too.

What will the long-term impact of the report be?  Hard to say: much depends on how much of this the government actually takes up, and it will be some months before we know that.  But I think the way the report was commissioned may have some unintended adverse consequences.  Specifically, the fact that this review was set up in such a way as to exclude consideration of applied research – while perfectly understandable – is going to contribute to the latter being something of a political orphan for the foreseeable future.  Similarly, while the fact that the report was done in isolation from the broader development of innovation policy might seem like a blessing, given the general ham-fistedness surrounding the Innovation file, I wonder whether the end result won’t be an effective division of policy, with research being something the feds pay universities to do and innovation something they pay firms to do.  That’s basically the right division, of course, but what goes missing are vital questions about how to make the two mutually reinforcing.

Bottom line: it’s a good report.  But even if the government fully embraces the recommendations, there are still years of messy but important work ahead.

April 18

Naylor Report, Take 1

People are asking why I haven’t talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is i) I’m swamped ii) there’s a lot to talk about in there and iii) I want to have some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it’s the latter I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways the panel argues this point is that while Canada has among the world’s highest levels of Research and Development in the higher education sector – known as HERD if you’re in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:

Naylor Report

Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education portion of the contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I’ll come back to it in a moment), plus ii) some imputation for unrecovered research overhead.  In the US, it’s just the latter.  Why?  Because of the way the US collects data on HERD, the only faculty costs it captures are the chunks taken out of federal research grants.  Remember, in the US, profs are only paid 9 months per year and, at least in the R&D accounts, that’s *all* teaching.  Only the pieces of research grants they take out as summer salary get recorded as R&D expenditure (and hence as a government-sponsored cost rather than a higher education-sponsored one).

But there’s a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion of HERD, then it’s worth remembering what’s going on in the denominator.  Aggregate salaries are the first component.  The second component is research intensity, as measured through surveys.  This appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30-35% of their time on research.  A more recent survey shows that this has risen to 42%.  I am not sure whether this latest coefficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education “spending” (or “investment”, if you prefer) on research, despite nothing really having changed at all (possibly it already has been, and that is what explains the bump in expenditures seen in 2012-13).
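The mechanics are worth spelling out, because the coefficient change alone moves the number a long way.  This is a sketch of the Canadian HERD formula as described above, with entirely hypothetical dollar figures:

```python
# Higher-ed component of HERD, Canadian method, per the description above:
# aggregate faculty salaries x share of time spent on research, plus an
# imputation for unrecovered overhead. All dollar figures are hypothetical.

def higher_ed_herd(aggregate_salaries, research_share, overhead_imputation):
    return aggregate_salaries * research_share + overhead_imputation

salaries = 10_000.0  # $M, hypothetical aggregate faculty salary bill
overhead = 1_500.0   # $M, hypothetical unrecovered-overhead imputation

old = higher_ed_herd(salaries, 0.33, overhead)  # old ~30-35% survey coefficient
new = higher_ed_herd(salaries, 0.42, overhead)  # newer 42% coefficient

jump_pct = (new - old) / old * 100  # ~19% more "spending", nothing real changed
```

With these (made-up) inputs, moving the survey coefficient from 33% to 42% inflates measured higher-ed research "investment" by almost a fifth, even though not a single additional dollar was spent.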

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding of inquiry-driven research shouldn’t rise.  Just that using data on university-funded HERD might not be a super-solid base from which to argue that point.

March 29

Conflicting Views on Research Funding

Every year on budget night, we at HESA Towers publish a graph tracking granting council expenditures in real dollars.  This year it looks like this:

Tri-council Funding Envelopes


Some people really like the graph and pass it around and re-tweet it because it shows that whatever governments say about their love for science and innovation, it’s not showing up in budgets.  Others (hi Nassif!) dislike it because it doesn’t do justice to how badly researchers are faring under the current environment.  Now, these critics have a point, but I think some of the criticism misunderstands why government funds research in the first place.

The critique of that graph usually makes some combination of the following points:

  1. Enrolments have gone way up over the past fifteen years, so there are more profs and hence more people needing research grants.
  2. At some councils, at least, the average grant size is increasing, sometimes quite significantly.  That’s good for those who get grants, but it means the actual number of awards is decreasing at the same time as the number of people applying is increasing.
  3. In addition to an increasing number of applicants, the number of applications per applicant also seems to be increasing, presumably as a rational response to increasing competition (two lottery tickets are better than one!).

Now, from the point of view of researchers, what all this means is that “steady funding in real dollars” is irrelevant.  On the ground, faculty are having to spend more time on grant proposals, for fear of not receiving one.  The proportion receiving awards is falling, which has an effect on scientific progression, particularly when it happens to younger faculty.  So it’s easy to see why the situation has academic scientists in a panic, and why they’d prefer a graph that somehow shows how applicants’ prospects of receiving grants are nosediving.  And that graph would be just as undeniably true as the one we publish.
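The arithmetic behind the researchers’ complaint is simple: with a flat award budget, per-application success is awards divided by (applicants × applications per applicant), so growth in either factor of the denominator sinks the odds.  A sketch with hypothetical numbers:

```python
# Per-application success under a flat award budget. All numbers are
# hypothetical, purely to illustrate the denominator effect.

def success_rate(awards, applicants, apps_per_applicant):
    return awards / (applicants * apps_per_applicant)

awards = 3_000  # grants funded per year; held flat, like the real-dollar graph

then = success_rate(awards, 10_000, 1.2)  # fewer profs, ~one application each
now  = success_rate(awards, 14_000, 1.8)  # more profs, more applications each
# Per-application odds fall from 25% to ~12% with no change in total funding.
```

Both graphs describe the same budget; they just divide it by different things, which is the whole dispute in a nutshell.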

But, from the perspective of Ottawa, I think the answer might well be: “not our problem”.

Here’s why.  The main reason governments get into the research game is to solve a market failure.  The private sector can’t capture all the benefits of basic research because of spillovers, so they underinvest in it.  Therefore, governments invest to fill the gap.  This has been standard economic theory for over 50 years now.

So, to be blunt, government is there to buy a particular amount of science that is in the public interest given corporate underinvestment.  It is not there to provide funds so that the academic career ladder works smoothly.

Provinces and universities decided to hire more science profs to deal with a big increase in access?  Great!  But did anyone ask the feds if they’d be prepared to backstop those decisions with more granting council funds?  Nope. They just assumed the taps would keep flowing.  Academia decided to change the rules of pay and promotion in such a way that emphasized research, thus creating huge new demand for more research dollars.  Fantastic!  But did anyone ask the feds to see how they’d cope with the extra demand?  Nope.  Just hope for the best.

There’s a case, of course, to say that the federal government, via the granting councils, should be more concerned than it is with the national pipeline for scientific talent.  What’s happening right now could really cause a lot of good young scientists to either flee their careers or their country (or both), and that’s simply a waste of expensively-produced talent.  But for the feds to thoroughly get into the business of national science planning requires provinces and institutions to give the councils a more direct role in institutional hiring decisions and the setting of tenure standards.  I bet I can guess how most people would feel about that idea.

So could the government put more money into granting councils?  Sure.  Could some councils make things better by reversing their Harper-era decisions to go with larger average grant sizes?  Yes, obviously.  But let’s remember that at least part of the problem is that institutions and academics have taken a lot of decisions over the past twenty years about what research and scientific careers should look like with very little thought to the macro fiscal implications, under the assumption that the feds and the councils would be there to bail them out.

That needs to change, too.

September 30

Athletics Scholarships in Canada

Time was, about twenty years ago, Canadian universities didn’t spend money on athletic scholarships.  Then things changed and universities turned on the taps.  Today we ask the question: how’s that going for everyone?

Well, it’s not going too badly, if you’re an athlete.  Just under 5,830 students received athletic scholarships totalling $15,981,189 in 2013-14 – that’s a little over $3,000 a pop.  CIS officially recognizes twenty-one sports, nine of which have teams for both genders (eighteen total), plus football which is male-only and rugby and field-hockey which are female-only.  However, roughly 85% of the scholarship dollars are concentrated in just nine sports, as shown below in Figure 1.  Some have almost no scholarships at all: inter-collegiate curling, for instance, has only 16 scholarships nationally for both sexes.

Figure 1: Top Sports by Scholarship Expenditure, 2013-14


What’s interesting here is that over time, the amount of money spent on athletics scholarships has been rising quickly and steadily.  Even after accounting for inflation, Canadian universities spent nearly three times as much on athletics scholarships ($16 million vs. $5.8 million) in 2013-14 as they did ten years earlier.  It’s an interesting choice of expenditure by allegedly cash-strapped institutions.

Figure 2: Total Athletics Scholarships by Gender, 2003-4 vs 2013-4, in constant $2014


I suspect most institutions would probably defend it as a kind of strategic enrolment investment, much the way they defend other kinds of tuition discounting.  I mean, does it really matter if you give someone a $5,000 academic entrance scholarship or a $5,000 athletic scholarship?  They’re both forms of tuition discounting.  And of course, the absolute amounts are trivial.  $16 million is only 1% of the total amount of funding given by universities to students (if you include funding to graduate students).  And if you want to get into truly ludicrous comparisons, it’s less than what the University of Michigan spends on salaries for its football coaching staff.

A final point to make here is around gender equity.  Male and female athletes receive awards at roughly the same rate (45% of athletes of each gender receive an award), which is good.  However, imbalances remain in the number of athletic team spots for men versus women (53% of all athletic team spots are male, compared to about 41% of undergraduates as a whole), and in the size of the average award ($3,286 vs. $2,737).  Those results are better than they were a decade ago, and they appear to be slightly better than in the US, where actual legislation exists in the form of Title IX to enforce equity in sports, but they are still some ways from equal.

June 08

Are NSERC decisions “skewed” to bigger institutions?

That’s the conclusion reached by a group of professors from – wait for it – smaller Canadian universities, as published recently in PLOS One. I urge you to read the article, if only to understand how technically rigorous research without an ounce of common sense can make it through the peer-review process.

Basically, what the paper does is rigorously prove that “both funding success and the amount awarded varied with the size of the applicant’s institution. Overall, funding success was 20% and 42% lower for established researchers from medium and small institutions, compared to their counterpart’s at large institutions.” 

They go on to hypothesize that:

“…applicants from medium and small institutions may receive lower scores simply because they have weaker research records, perhaps as a result of higher teaching or administrative commitments compared to individuals from larger schools. Indeed, establishment of successful research programs is closely linked to the availability of time to conduct research, which may be more limited at smaller institutions. Researchers at small schools may also have fewer local collaborators and research-related resources than their counterparts at larger schools. Given these disparities, observed funding skew may be a consequence of the context in which applicants find themselves rather than emerging from a systemic bias during grant proposal evaluation.”

Oh my God – they have lower success rates because they have weaker research records?  You mean the system is working exactly as intended?

Fundamentally, this allegedly scientific article is making a very weird political argument.  The reason profs at smaller universities don’t get grants, according to these folks, is that they got hired by worse universities – which means they don’t get the teaching release time, the equipment and whatnot that would allow them to compete on an even footing with the girls and boys at bigger schools.  To put it another way, their argument is that all profs have inherently equal ability and are equally deserving of research grants; it’s just that some, by sheer random chance, got allocated to weaker universities, which have put a downer on their careers – and if NSERC doesn’t actively ignore actual outputs and perform some sort of research-grant affirmative action, then it is guilty of “skewing” funding.

Here’s another possible explanation: yes, faculty hired by bigger, richer, more research-intensive institutions (big and research-intensive are not necessarily synonymous, but they are in Canada) have all kinds of advantages over faculty hired by smaller, less research-intensive universities.  But maybe, just maybe, faculty research quality is not randomly distributed.  Maybe big rich universities use their resources mainly to attract faculty deemed to have greater research potential.  Maybe they don’t always guess quite right about who has that potential and who doesn’t but on the whole it seems likelier than not that the system works more or less as advertised.

And so, yes, there is a Matthew effect (“for unto every one that hath shall be given, and he shall have abundance”) at work in science: the very top of the profession gets more money than the strata below them, and that tends to increase the gap in outcomes (salary, prestige, etc.).  But that’s the way the system was designed.  If you want to argue against that, go ahead.  But at least do it honestly and forthrightly: don’t use questionable social science methods to accuse NSERC of “bias” when it is simply doing what it has always been asked to do.