
Higher Education Strategy Associates

Category Archives: Canada

May 16

Jobs: Hot and Not-So-Hot

Remember when everyone was freaking out because there were too many sociology graduates and not enough welders?  When otherwise serious people like Ken Coates complained about the labour market being distorted by the uninformed choices of 17-19 year-olds?  2015 seems like a long time ago.

Just for fun the other day, I decided to look at which occupations have fared best and worst in Canada over the past ten years (ok, I grant you my definition of fun may not be universal).  Using public data, the most granular data I can look at are two-digit National Occupation Codes, so some of these categories are kind of broad.  But anyway, here are the results:

Table 1: Fastest-growing occupations in Canada, 2007-2017


See any trades in there?  No, me neither.  Four out of the top ten fastest-growing occupations are health-related in one way or another.  There are two sets of professional jobs – law/social/community/government services (which includes educational consultants, btw) and natural/applied sciences – which pretty clearly require bachelor's if not master's degrees.  There are three other categories (Admin/financial supervisors, Technical occupations in art, and paraprofessional occupations in legal, social, etc.) which have a hodgepodge of educational requirements but on balance probably have more college than university graduates.  And then there is the category of retail sales supervisors and specialized sales occupations, which takes in everything from head cashiers to real estate agents and aircraft sales representatives.  Hard to know what to make of that one.  But the other nine all seem to require training which is pretty squarely in traditional post-secondary education specialties.

Now, what about the ten worst-performing occupations?

Table 2: Fastest-shrinking occupations in Canada, 2007-2017

This is an interesting grab bag.  I’m fairly sure, given the amount of whining about managerialism one hears these days, that it will be a surprise to most people that the single worst-performing job sector in Canada is “senior management occupations”.  It’s probably less of a surprise that four of the bottom ten occupations are manufacturing-related, and that two others – Distribution, Tracking and Scheduling and Office Support Occupations – which are highly susceptible to automation are there, too.  But interestingly, almost none of these occupations, bar senior managers, have significant numbers of university graduates in them. Many wouldn’t even necessarily have a lot of college graduates either, at least outside the manufacturing and resources sectors.

Allow me to hammer this point home a bit, for anyone who is inclined to ever again take Ken Coates or his ilk seriously on the subject of young people’s career choices.  Trades are really important in Canada.  But the industries they serve are cyclical.  If we counsel people to go into these areas, we need to be honest that people in these areas are going to have fat years and lean years – sometimes lasting as long as a decade at a time.  On the other hand, professional occupations (nearly all requiring university study) and health occupations (a mix of university and college study) are long-term winners.

Maybe students knew that all along, and behaved accordingly.  When it comes to their own futures, they’re pretty smart, you know.

 

May 15

Provincial Budgets 2017

Springtime brings with it two certainties: 1) massive, irritating weekend traffic jams in Toronto as the city grants permits to close down Yonge Street for a parade to virtually any group of yahoos, thus making it impossible to go from the city's east to west ends, and 2) provincial budgets.  And with that, it's time for my annual roundup of provincial budgets (click on the year for previous analyses: 2016, 2015, 2014, 2013).  It's not as bad as last year but it's still kind of depressing.

Before we jump in, I need to remind everyone about some caveats on this data.  What is being compared here is announced spending in provincial budgets from year-to-year.  But what gets allocated and what gets spent are two different things. Quebec in particular has a habit of delivering mid-year cuts to institutions; on the flip side, Nova Scotia somehow spent 15% more than budgeted on its universities.  Also, not all money goes to institutions as operating funding:  this year, Newfoundland cut operating budgets slightly but threw in a big whack of cash for capital spending at College of the North Atlantic, so technically government post-secondary spending is up there this year.

One small difference this year from previous years: the figures for Ontario exclude capital expenditures.  Anyone who has a problem with that, tell the provincial government to publish its detailed spending estimates at the same time it delivers the budget like every other damn province.

This year's budgets are a pretty mixed bunch.  Overall, provincial allocations after inflation fell by $13 million nationally – or just about 0.06%.  But in individual provinces the spread was between +4% (Nova Scotia) and -7% (Saskatchewan).  Amazing but true: two of the three provinces with the biggest gains were ones in which an election was/is being held this spring.
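For anyone curious how the "constant $2017" comparison works mechanically, here is a minimal sketch; the allocation figures and the inflation rate below are invented for illustration and are not drawn from any actual budget.

inflation_2016_to_2017 = 0.016          # assumed CPI growth, purely illustrative
alloc_2016_17_nominal = 10_000e6        # hypothetical provincial allocation, $
alloc_2017_18_nominal = 10_100e6        # hypothetical allocation the following year, $

# Express last year's allocation in 2017 dollars, then compare.
alloc_2016_17_constant_2017 = alloc_2016_17_nominal * (1 + inflation_2016_to_2017)
real_change = (alloc_2017_18_nominal - alloc_2016_17_constant_2017) / alloc_2016_17_constant_2017

print(f"{real_change:.2%}")             # -0.59%: a nominal increase can still be a real-terms cut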

Figure 1: 1-Year change in Provincial Transfers to Post-Secondary Institutions, 2016-17 to 2017-18, in constant $2017


 

Now, this probably wouldn’t be such a big deal if it hadn’t come on the heels of a string of weak budgets for post-secondary education.  One year is neither here nor there: it’s the cumulative effect which matters.  Here’s the cumulative change over the past six years:

Figure 2: 6-year Change in Provincial Transfers to Post-Secondary Institutions, 2011-12 to 2017-18, in constant $2017


 

Nationally, provinces are collectively providing 1% less to universities in inflation-adjusted dollars in 2017-18 than they were in 2011-12.  Apart from the NDP governments in Manitoba and Alberta, it’s really only Quebec which has bothered to keep its post-secondary funding ahead of inflation.  Out east, it’s mostly been a disaster – New Brunswick universities are down 9% over the last six years (not the end of the world because of concomitant enrolment declines), and a whopping 21% in Newfoundland.

The story is different on the student aid front, because a few provinces have made some big moves this year.  Ontario and New Brunswick have introduced their "free tuition" guarantees, thus resulting in some significant increases in SFA funding, while Quebec is spending its alternative payment bonanza from the Canada Student Loans Program changes (long story short: under the 1964 opt-out agreement which permitted the creation of the Canada Student Loans Program, every time CSLP spends more, it has to send a larger cheque to Quebec).  On the other side, there's Newfoundland, which has cut its student aid budget by a whopping 78%.  This appears to be because the province is now flouting federal student aid rules and making students max out their federal loans before accessing provincial aid, rather than splitting the load 60-40 as other provinces do.

Figure 3: 1-Year change in Provincial Student Financial Aid Expenditures, 2016-17 to 2017-18, in constant $2017


 

And here’s the multi-year picture, which shows a 46% increase in student aid over the past six years, from $1.9 billion to just under $2.8 billion.  But there are huge variations across provinces.  In Ontario, aid is up 83% over six years (and OSAP now constitutes over half of all provincial student aid spending), while Saskatchewan is down by half and Newfoundland by 86%, mostly in the present year.  The one province where there is an asterisk here is Alberta, where there was a change in reporting in 2013-2014; the actual growth is probably substantially closer to zero than to the 73% shown here.

Figure 4: 6-Year Change in Provincial Student Financial Aid Expenditures, 2011-12 to 2017-18, in constant $2017


So the overall narrative is still more or less the same as it's been for the past few years.  On the whole, provincial governments seem a whole lot happier spending money on students than they do on institutions.  Over the long run that's not healthy, and it needs to change.

May 12

Statistical Deceptions on Student Debt

Every couple of years, the Canadian Federation of Students (CFS) produces a “research paper” to provide a new “evidence-based” spin to back up its eternal demand for free tuition. Last month, they put out a new version, this one entitled The Political Economy of Student Debt in Canada. The theme this time is lightly-recycled Piketty: Canada’s main problems are inequality and rising indebtedness; if we eliminate tuition, that’ll strike a blow against both so wa-hey! The word “neoliberal” appears frequently.

This is all fine. It’s Lobbying 101 to link your own issues to those of the ruling government’s agenda in order to increase the likelihood that they’ll get picked up. Inequality is certainly a theme of this decade, as is the constant media drumbeat of ever-rising household debt (though for reasons that pass understanding they never match up statistics about rising debt with equivalent statistics about rising assets).

But there is a problem here. To make the analogy stick you'd have to be able to prove that student debt, like household debt, is rising rapidly, when in fact it's not. Data from the National Graduates Survey (NGS) suggest that student indebtedness has been more or less stable since 2000; the more recent/timely (but less accurate) Canadian Undergraduate Survey Consortium data (see here and here) actually suggest it has decreased a bit since 2000. And it is certainly the case that student loan burdens – that is, the percentage of after-tax income devoted to paying student debt – have decreased substantially over the last decade and a half, due mainly to falling taxes and lower interest rates. Average student loan debt – that is, the amount of debt owed by students at the time of graduation – may in fact be the one type of personal debt which isn't increasing.

So imagine my surprise when I saw this graph in the middle of the research paper, purporting to show that student debt has increased 40% in real terms since 1999:


Where on earth does this data come from? Well, it’s not the NGS and it’s not any survey of graduating students. Rather, it’s from the once-triennial, now quadrennial Survey of Financial Security (SFS), which measures student debt in an entirely different way.

Both NGS and the CUSC try to measure the average debt at the point of graduation. NGS does it by asking graduates two years after graduation how much debt they left school with; CUSC asks students a couple of months before they graduate how much debt they have. SFS is not a survey of graduates; it's a survey of 20,000 or so Canadian households. And when it reports debt, it does so i) by measuring outstanding debt, not debt at the time of graduation, and ii) by measuring household debt, not individual debt. So if your household contains multiple individuals with student debt (whether as roommates or in a family relationship), SFS will combine the debt of all individuals. The second factor will definitely tend to inflate the amount of debt reported; the first is more ambiguous, because on the one hand it includes both borrowers who graduated recently and those who graduated many years ago (which one would think would lower the average figure, because the latter have been in repayment for many years), but on the other hand it will tend to exclude those who graduated with lower debt, because they will often have paid it off and hence be excluded from the statistic (thus raising average debt somewhat).
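To see how much the framing matters, here is a toy contrast between the two concepts; all dollar figures are invented and are not drawn from either survey.

# Toy contrast between "average individual debt at graduation" (the NGS/CUSC-style
# concept) and "outstanding household student debt" (the SFS-style concept).
# All figures are invented for illustration.

# Two graduates sharing one household, a few years after graduation.
debt_at_graduation = [22_000, 28_000]    # what graduate surveys try to measure
outstanding_today = [15_000, 26_000]     # balances remaining after partial repayment

avg_individual_debt_at_graduation = sum(debt_at_graduation) / len(debt_at_graduation)
household_outstanding_debt = sum(outstanding_today)   # a household survey adds co-residents together

print(avg_individual_debt_at_graduation)   # 25000.0 -> "student debt" as usually understood
print(household_outstanding_debt)          # 41000   -> the household-level figure a survey like the SFS reports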

Also, because it measures outstanding debt rather than debt at graduation, it will tend to lag trends in student aid. That is, even after student debt at graduation stops rising, outstanding student debt will continue to rise as earlier cohorts of (less indebted) graduates repay their loans and later cohorts of (more indebted) graduates take their place in the ranks of "those with outstanding student debt". So it's not really a big surprise that outstanding household student debt rose in the 2000s, because that's the natural corollary of rising student debt at graduation in the 1990s (which, unlike rising student debt in the 2000s, was actually a thing).

The point here is not that the data used is “fake”: the data itself is real. But to make their point about “rising student debt” the CFS’ report writers have used a quite different definition of student debt than that used by literally every other PSE stakeholder, indeed different to any definition of student debt CFS has ever used. And they have done so without mentioning that they have used an alternative definition. This is not an innocent oversight. The person or persons who authored this document clearly know their way around Statistics Canada data; anyone with that level of knowledge also understands that if you say “student debt has risen 40% since 1999”, people will understand that to mean “individual debt at graduation”, not “outstanding household debt amongst the entire population”. It’s a deliberate deception to further a politically convenient narrative.

Student debt, as that term is commonly understood, has not risen by 40% in real dollars since 1999. On the contrary, student debt levels are broadly stable and repayment burdens are much reduced over the past decade and a half. Using torqued, cherry-picked statistics to try to convince the public that the reverse is happening is pretty poor form.

 

May 11

Trade-offs in Apprenticeships

I haven’t worked on apprenticeship projects much in the last few years, but one of my current gigs has got me thinking about the area again.  And one thing that I apparently missed completely was a new (well, new to me anyway) effort to harmonize apprenticeship program sequencing nationally (details here).

Wait a minute, you say – weren’t apprenticeships always harmonized?  Isn’t that what Red Seal is all about?

Well, sort of.  Red Seal was about harmonizing outcomes.  Basically, Red Seal was an exam that journeypersons could take after completing their (provincially-governed) training which would certify them as being qualified to ply their trade right across the country.  It was optional – if you had no intention of leaving your home province there wasn’t a whole lot of point in taking the exam because completion of the program was itself sufficient to allow one to practice there.  Red Seal was therefore basically a mobility tool for people who had completed apprenticeships.

Now, that was fine when most apprentices started and completed their training in one province.  But during the resource boom, there was an explosion of apprentices who began training in one province and then moved and wanted to complete training in another.  This created problems because although Red Seal had long since harmonized apprenticeship training outcomes, each province got to those outcomes in quite different ways.  Within the same trade, the number of required hours/weeks of training varied from one province to another, and the sequencing was different.  Something an electrician learned at level 1 in Alberta might not be taught until level 3 in Ontario, which made things complicated if, for instance, a level 2 apprentice electrician got laid off in Windsor and wanted to try his/her luck in Alberta.

As I say, I've been out of this file a while, but what seems to have happened is that the provincial directors of apprenticeship got together and actually co-ordinated things like training sequencing, number of weeks of in-class training, etc., and this is what they refer to as "harmonization".  According to that federal website, this harmonization initiative is about halfway done – i.e. about half the Red Seal trades were harmonized in 2016 and 2017, and the rest will be rolled out in stages over the next couple of years.

So, a triumph for the Canadian apprenticeship system?  Well, not so fast.

Not all trades programs are apprenticeship programs, but the curriculum still has to line up, because everyone wants graduates of pre-employment trades programs to be able to become apprentices in that area.  So what that means is that national harmonization of apprenticeship programs in effect means nationalization of the entire trades curriculum.  And what that means is that all those local industry committees that every community college program has have suddenly become a lot less effective, because significant curriculum changes now have to be negotiated among ten provincial directors of apprenticeship.

Traditionally, those committees have been a point of pride in Canada because they have given trades programs the ability to respond quickly to business needs.  Now, their effectiveness has been traded away in the name not of journeyperson mobility but of apprentice mobility, which was a thing in the resource boom but maybe not so much in the bust.  Is that a smart trade-off?  I suspect the answer varies quite a bit by trade, and yet this solution is being applied uniformly across Red Seal trades.

We are told "industry" asked for this change, but I really wonder who was part of the consultation.  I can certainly believe that big industry with training efforts in many different provinces asked for it.  I can believe that extractive industries asked for it.  I have a harder time believing that small and medium enterprises asked for it, because it substantially lowers their ability to affect curriculum and to some degree lowers the value of apprentices to them.

Silver linings have clouds, basically.  And centralized curricula have trade-offs.

May 09

Conservative Leadership Platform Analysis

So, I just read through all thirteen leadership candidates' websites, looking for their thoughts on all the stuff this blog cares about: post-secondary education, skills, science, innovation, youth, etc.

The things I do for you people.

Actually, it was a pretty quick exercise, because it turns out almost no one in the Tory leadership race places much importance on post-secondary education, skills, innovation, or youth.  They seem to care a lot about taxes and immigration (and, to a lesser extent, guns), but for a party that was in government less than two years ago, the Conservative candidates seem to have remarkably little appreciation for the things that actually drive a modern economy.  Anyways, briefly, here is what the candidates say about the issues this blog cares about.

Chris Alexander (Former Minister of Citizenship & Immigration, ex-MP Ajax-Pickering): No specific platform on higher education, but the topic does come up frequently in his policies.  Expanding educational exports to Asia is a priority.  He says he wants 400,000 new international students/year by 2020 and 500,000 per year by 2023 (I'm pretty sure he does not actually mean "new" as in new visa applications every year; I think that's the total in the country at any one time).  He also wants to spend money on new National Centres of Excellence and Centres of Excellence for Commercialization and Research for the digital economy, as well as invest more in research related to art and design (I assume OCAD's Robert Luke has something to do with that one).  He also has a general pledge to incentivize PSE institutions to collaborate more with "incubators, accelerators and companies of all sizes", whatever that means.

Maxime Bernier (Former Minister of Industry, Foreign Affairs, and Minister of State for Small Business, MP for Beauce): The main point of interest in the Bernier platform is the rise in the personal tax exemption to $15,000 per year, which will have favourable impacts for many students.  Under his health platform, Bernier indicates he wants the federal government to vacate the health field and transfer tax points to the provinces; though he does not say so explicitly, it's a fairly safe assumption that the same would apply to the transfer of funds to provinces for post-secondary education under the Canada Social Transfer.

Steven Blaney (Former Minister of Public Safety, MP Bellechasse—Les Etchemins—Lévis): Nothing at all.

Michael Chong (Former Minister of Intergovernmental Affairs and Sport, MP Wellington-Halton Hills): Nothing at all.

Kellie Leitch (Former Minister of Labour and the Status of Women, MP Simcoe-Grey): Nothing at all.

Pierre Lemieux (Former MP Glengarry-Prescott-Russell): Nothing at all.  Are you seeing a pattern yet?

Deepak Obhrai (MP Calgary Forest Lawn): Nothing at all.

Erin O'Toole (Former Minister of Veterans Affairs, MP Durham): O'Toole is the only candidate with anything even vaguely resembling plans for science and innovation, in the form of a scheme to extend the notion of "flow-through shares" – a tax gimmick heavily used in resource industries to defray development expenses – to new life-sciences and tech companies as well.  More intriguing is O'Toole's "Generation Kick-Start" platform, which promises everyone who completes a degree, diploma or apprenticeship an extra $100,000 of personal exemptions (i.e. $15K in reduced taxes) to be used before they turn 30.  That goes up to $300,000 if their credential is in an area where skills are in "short supply" (definition vague, but it seems to include engineers, coders and "skilled tradespeople", even though 3 years into the oil slump the latter wouldn't really qualify as "in demand").  The latter half of the proposal is goofy, but the basic idea has a lot of merit.

Rick Peterson (A BC Investment Advisor of Some Sort): Nothing at all.

Lisa Raitt (Former Minister of Natural Resources, Labour, and Transportation, MP Milton): Like Bernier, Raitt proposes to raise the basic tax exemption to $15,000.  She also wants to increase the (totally useless) apprenticeship completion grant to up to $4,000.

Andrew Saxton (ex-MP, North Vancouver): Saxton's policy pages are – to put it mildly – light on detail.  However, he says he does want to invest in "skills training to ensure Canadian skills are matched with Canadian jobs" (whatever that means).  Also, having lived in Switzerland for some time, he advocates a Swiss-style apprenticeship program which extends into industries like banking, pharmaceuticals, etc.

Andrew Scheer (Former Speaker of the House of Commons, MP Regina-Qu'Appelle): Scheer's money proposals in education are limited to a pledge to give parents of students attending independent schools a tax deduction of up to $4,000 in tuition annually per child, and a tax credit of $1,000 (i.e. a $150 reduction in taxes) to parents who choose to homeschool their child.  In addition, Scheer pledges that "public universities or colleges that do not foster a culture of free speech and inquiry on campus" will "not have support from the federal government".  He then lists the tri-councils and CRCs as specific funding mechanisms for which institutions would not be eligible: it is unclear if the ban would include CFI and – more importantly – CSLP.  Note that the ban would only cover public institutions; private (i.e. religious) institutions would be able to limit free inquiry – as indeed faith-based institutions do, for obvious reasons – and still be eligible for council funding.

Brad Trost (ex-MP Saskatoon-University): Nothing apart from a pledge for tax support to private education and homeschooling identical to Scheer’s.

And that’s the lot.  I think it’s fair to say that the field’s appreciation for the role of knowledge and skills in the modern economy is pretty weak.   Maybe dangerously so.  Still, if you are voting in this election and you think PSE and skills are important, your best bet is probably Chris Alexander; if you want to raise youth living standards, vote for O’Toole followed perhaps by Maxime Bernier or Lisa Raitt.

(And yes, I know the percentage of Conservative voters motivated by those two sets of issues is vanishingly small, but I only have this one shtick, so cut me some slack).

 

May 08

Naylor Report, Part II

Morning all.  Sorry about the service interruption.  Nice to be back.

So, I promised you some more thoughts about the Fundamental Science Review.  Now that I've had some time to think about it, I think I'm actually surprised by what it doesn't say, what it does say, and how many questions remain open.

What’s best about the report?  The history and most of the analysis are pretty good.  I think a few specific recommendations (if adopted) might actually be a pretty big deal – in particular the one saying that the granting councils should stop any programs forcing researchers to come up with matching funding, mainly because it’s a waste of everyone’s time.

What's so-so about it?  The money stuff, for a start.  As I noted in my last blog post, I don't really think you can justify a claim to more money based on the "proportion of higher ed research investment coming from the federal government".  I'm more sympathetic to the argument that there needs to be more funding, especially for early career researchers, but as noted back here it's hard to argue simultaneously that institutions should have unfettered rights to hire researchers but that the federal government should pick up responsibility for their career progression.

The report doesn’t even bother, really, to make the case that more money on basic research means more innovation and economic growth.  Rather, it simply states it, as if it were a fact (it’s not).  This is the research community trying to annex the term “innovation” rather than co-exist with it.  Maybe that works in today’s political environment; I’m not sure it improves overall policy-making.  In some ways, I think it would have been preferable to just say: we need so many millions because that’s what it takes to do the kind of first-class science we’re capable of.  It might not have been politic, but it would have had the advantage of clarity.

…and the Governance stuff?  The report backs two big changes in governance.  One is a Four Agency Co-ordinating Board for the three councils plus the Canada Foundation for Innovation (which we might as well now call the fourth council, provided it gets an annual budget as recommended here), to ensure greater cross-council coherence in policy and programs.  The second is the creation of a National Advisory Committee on Research and Innovation (NACRI) to replace the current Science, Technology and Innovation Council and do a great deal else besides.

The Co-ordinating committee idea makes sense: there are some areas where there would be clear benefits to greater policy coherence.  But setting up a forum to reconcile interests is not the same thing as actually bridging differences.  There are reasons – not very good ones, perhaps, but reasons nonetheless – why councils don’t spontaneously co-ordinate their actions; setting up a committee is a step towards getting them to do so, but success in this endeavour requires sustained good will which will not necessarily be forthcoming.

NACRI is a different story.  Two points here.  The first is that it is pretty clear that NACRI is designed to try to insulate the councils and the investigator-driven research they fund from politicians’ bright ideas about how to run scientific research.  Inshallah, but if politicians want to meddle – and the last two decades seem to show they want to do it a lot – then they’re going to meddle, NACRI or no.  Second, the NACRI as designed here is somewhat heavier on the “R” than on the “I”.  My impression is that as with some of the funding arguments, this is an attempt to hijack the Innovation agenda in Research’s favour.  I think a lot of people are OK with this because they’d prefer the emphasis to be on science and research rather than innovation but I’m not sure we’re doing long-term policy-making in the area any favours by not being explicit about this rationale.

What's missing?  The report somewhat surprisingly punted on what I expected to be a major issue: namely, the government's increasing tendency over time to fund science outside the framework of the councils in such programs as the Canada Excellence Research Chairs (CERC) and the Canada First Research Excellence Fund (CFREF).  While the text of the report makes clear the authors have some reservations about these programs, the recommendations are limited to a "you should review that, sometime soon".  This is too bad, because phasing out these kinds of programs would be an obvious way to pay for increased investigator-driven funding (though as Nassif Ghoussoub points out here, it's not necessarily a quick solution because funds are already committed for several years in advance).  The report therefore seems to suggest that though it deplores past trends away from investigator-driven funding, it doesn't want to see these recent initiatives defunded, which might be seen in government as "having your cake and eating it too".

What will the long-term impact of the report be? Hard to say: much depends on how much of this the government actually takes up, and it will be some months before we know that.  But I think the way the report was commissioned may have some unintended adverse consequences.  Specifically, I think the fact that this review was set up in such a way as to exclude consideration of applied research – while perfectly understandable – is going to contribute to the latter being something of a political orphan for the foreseeable future.  Similarly, while the fact that the report was done in isolation from the broader development of innovation policy might seem like a blessing given the general ham-fistedness surrounding the innovation file, in the end I wonder if the result won't be an effective division of policy, with research being something the feds pay universities to do and innovation something they pay firms to do.  That's basically the right division, of course, but what goes missing are vital questions about how to make the two mutually reinforcing.

Bottom line: it’s a good report.  But even if the government fully embraces the recommendations, there are still years of messy but important work ahead.

April 18

Naylor Report, Take 1

People are asking why I haven't talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is i) I'm swamped, ii) there's a lot to talk about in there, and iii) I want to have some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it's the latter, I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways that the panel argues this point is that while Canada has among the world's highest levels of Research and Development in the higher education sector – known as HERD if you're in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:


Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education portion of the contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I'll come back to it in a moment) plus ii) some imputation about unrecovered research overhead.  In the US, it's just the latter.  Why?  Because the way the US collects data on HERD, the only faculty costs they capture are the chunks taken out of federal research grants.  Remember, in the US, profs are only paid 9 months per year and, at least in the R&D accounts, that's *all* teaching.  Only the pieces of research grants they take out as summer salary get recorded as R&D expenditure (and also hence as a government-sponsored cost rather than a higher education-sponsored one).

But there's a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion of HERD, then it's worth remembering what's going on in the denominator.  Aggregate salaries are the first component.  The second component is research intensity, as measured through surveys.  This appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30-35% of their time on research.  A more recent survey shows that this has risen to 42%.  I am not sure if this latest coefficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education "spending" (or "investment", if you prefer) on research, despite nothing really having changed at all (possibly it has been, and that is what explains the bump seen in expenditures in 2012-13).
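To make the mechanics concrete, here is a back-of-envelope sketch; the salary and overhead figures are invented, and only the two research-time coefficients echo the surveys described above.

# Rough sketch of the higher-education-sponsored slice of HERD in Canada:
# aggregate faculty salaries times the share of time spent on research, plus an
# imputation for unrecovered overhead. Dollar figures are invented.

aggregate_faculty_salaries = 12.0e9    # hypothetical national faculty salary bill, $
unrecovered_overhead = 2.0e9           # hypothetical overhead imputation, $

def he_sponsored_herd(research_time_share):
    return aggregate_faculty_salaries * research_time_share + unrecovered_overhead

old_estimate = he_sponsored_herd(0.33)   # circa-2000 research-time coefficient
new_estimate = he_sponsored_herd(0.42)   # more recent coefficient

print(old_estimate, new_estimate)                      # ~5.96e9 vs ~7.04e9
print((new_estimate - old_estimate) / old_estimate)    # ~0.18: an 18% jump with nothing real having changed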

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding on inquiry-driven research shouldn't rise.  Just to say that using data on university-funded HERD might not be a super-solid base from which to argue that point.

April 17

British Columbia: Provincial Manifesto Analysis

On May 9th, our left-coasters go to the polls.  What are their options as far as post-secondary education is concerned?

Let's start with the governing Liberals.  As is often the case with ruling parties, some of their promises are things that are both baked into the fiscal framework and will take longer than one term to complete (e.g. "complete re-alignment of $3 billion in training funds by 2024"), or are simply re-announcements of previous commitments (pages 85-6 of the manifesto appear to simply be a list of all the SIF projects the province already agreed to co-fund), or take credit for things that will almost certainly happen anyway ("create 1000 new STEM places"… in a province which already has 55,000 STEM seats and where STEM spots have been growing at a rate of about 1,700/year anyway… interestingly, the Liberals didn't even bother to cost that one).

When you throw those kinds of promises away, what you are left with is a boatload of micro-promises, including: i) making permanent the current BC Training Tax Credit for employers, ii) creating a new Truck Logger training credit (yes, really), iii) spending $10M on open textbooks over the next 4 years, iv) reducing interest rates on BC student loans to prime, v) making minor improvements to student aid need assessment, vi) providing a 50% tuition rebate to Armed Forces Veterans, vii) creating a centralized province-wide admission system, and viii) allowing institutions to build more student housing (currently they are restricted from doing so because any institutional debt is considered provincial debt and provincial debt is more or less verboten… so this is a $0 promise just to relax some rules).  There's nothing wrong with any of those, of course, but only the last one is going to make any kind of impact, and as a whole it certainly doesn't add up to a vision.  And not all of this appears to be new money: neither the student loan changes nor the centralized application system promises are costed, which suggests funds for these will be cannibalized from elsewhere within the system.  The incremental cost of the remaining promises?  $6.5 million/year.  Whoop-de-do.  Oh, and they're leaving the 2% cap on tuition rises untouched.

What about the New Democrats?  Well, they make two main batches of promises.  One is about affordability, and consists of matching the Liberal pledge on a tuition cap, slightly outdoing them on provincial student loan interest (eliminating it on future and past loans, which is pretty much the textbook definition of “windfall gains”), and getting rid of fees for Adult Basic Education and English as a Second Language Program (which, you know, GOOD).  There’s also an oddly-worded pledge to provide a $1,000 completion grant “for graduates of university, college and skilled trades programs to help pay down their debt when their program finishes”: based on the costing and wording, I think that means the grant is restricted to those who have provincial student loans.

The NDP also has a second batch of policies around research – $50M over two years to create a graduate scholarship fund and $100M (over an unspecified period, but based on the costing, it’s more than two years) to fund expansion of technology-related programs in BC PSE institutions.  There is also an unspecified (and apparently uncosted) promise to expand tech-sector co-op programs.  Finally, they are also promising to match the Liberals on the issue of allowing universities to build student housing outside of provincial controls on capital spending.

Finally, there are the Greens, presently running at over 20% in the polls and with a real shot at achieving a significant presence in the legislature for the first time.  They have essentially two money promises: one, “to create a need-based grant system” (no further details) and two, an ungodly bad idea to create in BC the same graduate tax credit rebate that New Brunswick, Nova Scotia and now Manitoba all have had a shot at (at least those provinces had the excuse that they were trying to combat out-migration; what problem are the BC Greens trying to solve?).

Hilariously, the Greens' price-tag for these two items together is… $10 million.  Over three years.  Just to get a sense of how ludicrous that is, the Manitoba tax credit program cost $55 million/year in a province a quarter the size.  And within BC, the feds already give out about $75M/year in up-front grants.  So I think we need to credit the Greens with being more realistic than their federal cousins (remember the federal Green manifesto?  Oy.), but they have a ways to go on realistic budgeting.
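A quick scaling of the numbers already cited shows the size of the gap; this assumes the cost of a Manitoba-style rebate scales roughly with population, which is admittedly crude.

# Crude sanity check on the Greens' costing, using only the figures cited above
# and assuming a graduate tax rebate's cost scales roughly with population.

manitoba_rebate_cost_per_year = 55e6   # $/year, the Manitoba program cited above
bc_population_multiple = 4             # BC is roughly four times Manitoba's size

implied_bc_cost_per_year = manitoba_rebate_cost_per_year * bc_population_multiple
green_budget_per_year = 10e6 / 3       # $10M over three years, for both promises combined

print(implied_bc_cost_per_year)   # 220,000,000.0 per year for the rebate alone
print(green_budget_per_year)      # ~3,333,333 per year actually budgeted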

(I am not doing a manifesto analysis for the BC Conservatives because a) they haven’t got one and b) I’ve been advised that if they do release one it will probably be printed in comic sans.)

What to make of all this?  Under Gordon Campbell, the Liberals were a party that “got” post-secondary education and did reasonably well by it; under Christy Clark it’s pretty clear PSE can at best expect benign neglect.  The Greens’ policies focus on price rather than quality, one of their two signature policies is inane and regressive, and their costing is off by miles.

That leaves the NDP.  I wouldn’t say this is a great manifesto, but it beats the other two.  Yeah, their student aid policies are sub-optimally targeted (they’re all for people who’ve already finished their programs, so not much access potential), but to their credit they’ve avoided going into a “tuition freezes are magic!” pose.  Alone among the parties, they are putting money into expansion and graduate studies and even if you don’t like the tech focus, that’s still something.

But on the whole, this is a weak set of manifestos.  I used to say that if I was going to run a university anywhere, I'd want it to be in British Columbia.  It's the least-indebted jurisdiction in Canada, has mostly favourable demographics, has easy access from both Asia (and its students) and from the well-off American northwest.  And it's got a diversified set of institutions which are mostly pretty good at what they do.  Why any province would want to neglect a set of institutions like that is baffling; but based on these manifestos it seems clear that BC's PSE sector isn't getting a whole lot of love from any of the parties.  And that's worrying for the province's long-term future.

April 12

Access: A Canadian Success Story

Statscan put out a very important little paper on access to post-secondary education on Monday.  It got almost zero coverage despite conclusively putting to bed a number of myths about fees and participation, so I’m going to rectify that by explaining it to y’all in minute detail.

To understand this piece, you need to know something about a neat little Statscan tool called the Longitudinal Administrative Database (LAD).  Every time someone files an income tax form for the first time, LAD randomly selects one in five of them and follows them for their entire lifetime.  If at the time someone first files a tax return they have the same address as someone who is already in the LAD (and who is the right age to have a kid submitting a tax form for the first time), one can make a link between a parent and child.  In other words, for roughly 4% of the population, LAD has data on both the individual and the parent, which allows some intergenerational analysis.  Now, because we have tax credits for post-secondary education (PSE), tax data allows us to know who went to post-secondary education and who did not (it can’t tell us what type of institution they attended, but we know that they did attend PSE).  And with LAD’s backward link to parents, it means we can measure attendance by parental income.
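That "roughly 4%" falls straight out of the sampling rate: both the first-time filer and the parent have to be among the one-in-five selected. A minimal sketch, treating the two draws as independent (a simplification):

# Why intergenerational links cover roughly 4% of the population: the first-time
# filer must be in the 1-in-5 LAD sample, and so must the parent at the same
# address. Treats the two selections as independent, which is a simplification.

p_new_filer_sampled = 1 / 5
p_parent_already_in_lad = 1 / 5

p_parent_child_link = p_new_filer_sampled * p_parent_already_in_lad
print(p_parent_child_link)   # 0.04 -> about 4% of the population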

Got that?  Good.  Let’s begin.

The paper starts by looking at national trends in PSE participation (i.e. university and college combined) amongst 19 year-olds since 2001, by family income quintile.  Nationally, participation rates rose by just over 20%, from 52.6% to 63.8%.  They also rose for every quintile.  Even for youth in the lowest income quintile, participation is now very close to 50%.

 Figure 1: PSE enrolment rates by Income Quintile, Canada 2001-2014


This positive national story about rates by income quintile is somewhat offset by a more complex set of results for participation rates by region.  In the six eastern provinces, participation rates rose on average by 13.6 percentage points; in the four western provinces, they rose by just 2.8 percentage points (and in Saskatchewan they actually fell slightly).  The easy answer here is that it's about the resource boom, but if that were the case, you'd expect to see a similar pattern in Newfoundland, and a difference within the west between Manitoba and the others.  In fact, neither is true: Manitoba is slightly below the western average and Newfoundland had the country's highest PSE participation growth rate.

 Figure 2: PSE Participation rates by region, 2002-2014


(Actually, my favourite part of Figure 2 is the data showing that 19 year-old Quebecers – who mostly attend free CEGEPs – have a lower participation rate than 19 year-old Ontarians, who pay significant fees, albeit with the benefit of a good student aid system.)

But maybe the most interesting data here is with respect to the closing of the gap between the top and bottom income quintile.  Figure 3 shows the ratio of participation rates of students from the bottom quintile (Q1) to those from the top quintile (Q5), indexed to the ratio as it existed in 2001, for Canada and selected provinces.  So a larger number means Q1 students are becoming more likely to attend PSE relative to Q5s and a smaller number means they are becoming less likely.  Nationally, the gap has narrowed by about 15%, but the interesting story is actually at the provincial level.
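For clarity, here is how an indexed ratio of that kind is constructed; the participation rates in the example are invented, not Statscan's.

# How Figure 3's indexed ratio is built: the Q1/Q5 participation-rate ratio in a
# given year, divided by the same ratio in 2001. Rates below are invented.

def indexed_q1_q5_ratio(q1_rate, q5_rate, q1_rate_2001, q5_rate_2001):
    return (q1_rate / q5_rate) / (q1_rate_2001 / q5_rate_2001)

# Hypothetical example: Q1 rises from 35% to 48% while Q5 rises from 73% to 80%.
print(indexed_q1_q5_ratio(0.48, 0.80, 0.35, 0.73))   # ~1.25 -> the Q1-Q5 gap has narrowed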

Figure 3: Ratio of Q1 participation rates to Q5 participation rates, Canada and selected provinces, 2001-2014


At the top end, what we find is that Newfoundland and Ontario are the provinces where the gap between rich and poor has narrowed the most.  Given that one of these provinces has the country’s highest tuition and the other the lowest, I think we can safely rule out tuition, on its own, as a plausible independent variable (especially as Quebec, the country’s other low-tuition province, posted no change over the period in question).  At the bottom end, we have the very puzzling case of Saskatchewan, where inequality appears to have got drastically worse over the past decade or so.  And again, though it’s tempting to reach for a resource boom explanation, nothing similar happened in Alberta so that’s not an obvious culprit.

Anyways, here's why this work is important.  For decades, the usual suspects (the Canadian Federation of Students, the Canadian Centre for Policy Alternatives) have blazed with self-righteousness about the effects of higher tuition and higher debts (debt actually hasn't increased that much in real terms since 2000, but whatever).  But it turns out there are no such effects.  After more than a decade of tuition continuing to increase slowly, and with average debts among those who borrow of over $25,000, it turns out that not only did participation rates increase, but the participation rate of the poorest quintile rose fastest of all.

And – here's the kicker – different provincial strategies on tuition appear to have had diddly-squat to do with it.  So the entire argument the so-called progressives make in favour of lower tuition is simply out the window.  That doesn't mean they will change their position, of course.  They will continue to talk about the need to eliminate student debt because it is creating inequality (it's actually the reverse, but whatever).  But of course, this makes the free-tuition position even sillier.  If the problem is simply student debt, then why advocate a policy in which over half your dollars go to people who have no debt?

It’s the Ontario result in particular that matters: it proves that a high-tuition/high-aid policy is compatible with a substantial widening of access.  And that’s good news for anyone who wants smart funding policies in higher education.

April 03

Data on Race/Ethnicity

A couple of weeks ago, CBC decided to make a big deal about how terrible Canadian universities were for not collecting data on race (see Why so many Canadian universities Know so little about their own racial diversity). As you all know, I'm a big proponent of better data in higher education. But the effort involved in getting new data has to be in some way proportional to the benefit derived from that data. And I'm pretty sure this doesn't meet that test.

In higher education, there are only two points where it is easy to collect data from students: at the point of application, and at the point of enrolment. But here’s what the Ontario Human Rights Code has to say about collecting data on race/ethnicity in application forms:

Section 23(2) of the Code prohibits the use of any application form or written or oral inquiry that directly or indirectly classifies an applicant as being a member of a group that is protected from discrimination. Application forms should not have questions that ask directly or indirectly about race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability.

In other words, it's 100% verboten. Somehow, CBC seems to have missed this bit. Similar provisions apply to data collected at the time of enrolment – a school still needs to prove that there is a bona fide reason related to one's schooling in order to require a student to answer the question. So generally speaking, no one asks a question at that point either.

Now, if institutions can't collect relevant data via administrative means, what they have to do to get data on race/ethnicity is move to a voluntary survey. Which in fact they do, regularly. Some do a voluntary follow-up survey of applicants through Academica, others attach race/ethnicity questions to the Canadian Undergraduate Survey Consortium (CUSC) surveys, others attach them to NSSE. Response rates on these surveys are not great: NSSE sometimes gets 50%, but that's the highest rate available. And, broadly speaking, they get high-level data about their student body. The data isn't great quality because the response rate isn't fabulous and the small numbers mean that you can't really subdivide ethnicity very much (don't expect good numbers on Sikhs vs. Tamils), but one can know at a rough order of magnitude what percentage of the student body is visible minority, what percentage self-identifies as Aboriginal, etc. I showed this data at a national level back here.

Is it possible to get better data? It’s hard to imagine, frankly. On the whole, students aren’t crazy about being surveyed all the time. NSSE has the highest response rate of any survey out there, and CUSC isn’t terrible either (though it tends to work on a smaller sample size). Maybe we could ask slightly better questions about ethnicities, maybe we could harmonize the questions across the two surveys. That could get you data at institutions which cover 90% of institutions in English Canada (at least).

Why would we want more than that? We already put so much effort into these surveys: why go to all kinds of trouble to do a separate data collection activity which in all likelihood would have worse response rates than what we already have?

It would be one thing, I think, if we thought Canadian universities had a real problem in not admitting minority students. But the evidence at the moment suggests the opposite: visible minority students in fact attend at a rate substantially higher than their share of the population. It's possible of course that some sub-sections of the population are not doing as well (the last time I looked at this data closely was a decade ago, but youth from the Caribbean were not doing well at the time). But spending untold dollars and effort to get at that problem in institutions across the country when really the Caribbean community in Canada is clustered in just two cities (three, if you count the African Nova Scotians in Halifax)? I can't see it.

Basically, this is one of those cases where people are playing data “gotcha”. We actually do know (more or less) where we are doing well or poorly at a national level. On the whole, where visible minorities are concerned, we are doing well. Indigenous students? Caribbean students? That’s a different story. But we probably don’t need detailed institutional data collection to tell us that. If that’s really what the issue is, let’s just deal with it. Whinging about data collection is just a distraction.
