HESA

Higher Education Strategy Associates


May 15

Provincial Budgets 2017

Springtime brings with it two certainties: 1) massive, irritating weekend traffic jams in Toronto as the city grants permits to close down Yonge Street for a parade to virtually any group of yahoos, thus making it impossible to get from the city’s east end to its west end, and 2) provincial budgets.  And with that, it’s time for my annual roundup of provincial budgets (click on the year for previous analyses: 2016, 2015, 2014, 2013).  It’s not as bad as last year, but it’s still kind of depressing.

Before we jump in, I need to remind everyone about some caveats on this data.  What is being compared here is announced spending in provincial budgets from year-to-year.  But what gets allocated and what gets spent are two different things. Quebec in particular has a habit of delivering mid-year cuts to institutions; on the flip side, Nova Scotia somehow spent 15% more than budgeted on its universities.  Also, not all money goes to institutions as operating funding:  this year, Newfoundland cut operating budgets slightly but threw in a big whack of cash for capital spending at College of the North Atlantic, so technically government post-secondary spending is up there this year.

One small difference this year from previous years: the figures for Ontario exclude capital expenditures.  Anyone who has a problem with that, tell the provincial government to publish its detailed spending estimates at the same time it delivers the budget like every other damn province.

This year’s budgets are a pretty mixed bunch.  Overall, provincial allocations after inflation fell by $13 million nationally – or just about 0.06%.  But in individual provinces the spread ran from +4% (Nova Scotia) to -7% (Saskatchewan).  Amazing but true: two of the three provinces with the biggest gains were ones in which an election was/is being held this spring.

Figure 1: 1-Year change in Provincial Transfers to Post-Secondary Institutions, 2016-17 to 2017-18, in constant $2017


Now, this probably wouldn’t be such a big deal if it hadn’t come on the heels of a string of weak budgets for post-secondary education.  One year is neither here nor there: it’s the cumulative effect which matters.  Here’s the cumulative change over the past six years:

Figure 2: 6-year Change in Provincial Transfers to Post-Secondary Institutions, 2011-12 to 2017-18, in constant $2017


Nationally, provinces are collectively providing 1% less to universities in inflation-adjusted dollars in 2017-18 than they were in 2011-12.  Apart from the NDP governments in Manitoba and Alberta, it’s really only Quebec which has bothered to keep its post-secondary funding ahead of inflation.  Out east, it’s mostly been a disaster – New Brunswick universities are down 9% over the last six years (not the end of the world because of concomitant enrolment declines), and a whopping 21% in Newfoundland.

The story is different on the student aid front, because a few provinces have made some big moves this year.  Ontario and New Brunswick have introduced their “free tuition” guarantees, thus resulting in some significant increases in SFA funding, while Quebec is spending its alternative payment bonanza from the Canada Student Loans Program changes (long story short: under the 1964 opt-out agreement which permitted the creation of the Canada Student Loans Program, every time CSLP spends more, it has to send a larger cheque to Quebec).  On the other side, there’s Newfoundland, which has cut its student aid budget by a whopping 78%.  This appears to be because the province is now flouting federal student aid rules and making students max out their federal loans before accessing provincial aid, rather than splitting the load 60-40 as other provinces do.

Figure 3: 1-Year change in Provincial Student Financial Aid Expenditures, 2016-17 to 2017-18, in constant $2017


And here’s the multi-year picture, which shows a 46% increase in student aid over the past six years, from $1.9 billion to just under $2.8 billion.  But there are huge variations across provinces.  In Ontario, aid is up 83% over six years (and OSAP now constitutes over half of all provincial student aid spending), while Saskatchewan is down by half and Newfoundland by 86%, mostly in the present year.  The one province where there is an asterisk here is Alberta, where there was a change in reporting in 2013-2014; the actual growth is probably substantially closer to zero than to the 73% shown here.

Figure 4: 6-Year change in Provincial Student Financial Aid Expenditures, 2011-12 to 2017-18, in constant $2017


So the overall narrative is still more or less the same as it’s been for the past few years.  On the whole, provincial governments seem a whole lot happier spending money on students than on institutions.  Over the long run that’s not healthy, and it needs to change.

May 11

Trade-offs in Apprenticeships

I haven’t worked on apprenticeship projects much in the last few years, but one of my current gigs has got me thinking about the area again.  And one thing that I apparently missed completely was a new (well, new to me anyway) effort to harmonize apprenticeship program sequencing nationally (details here).

Wait a minute, you say – weren’t apprenticeships always harmonized?  Isn’t that what Red Seal is all about?

Well, sort of.  Red Seal was about harmonizing outcomes.  Basically, Red Seal was an exam that journeypersons could take after completing their (provincially-governed) training which would certify them as being qualified to ply their trade right across the country.  It was optional – if you had no intention of leaving your home province there wasn’t a whole lot of point in taking the exam because completion of the program was itself sufficient to allow one to practice there.  Red Seal was therefore basically a mobility tool for people who had completed apprenticeships.

Now, that was fine when most apprentices started and completed their training in one province.  But during the resource boom, there was an explosion of apprentices who began training in one province and then moved and wanted to complete it in another.  This created problems because although Red Seal had long since harmonized apprenticeship training outcomes, each province got to those outcomes in quite different ways.  Within the same trade, the number of required hours/weeks of training varied from one province to another, and so did the sequencing.  Something an electrician learned at level 1 in Alberta wasn’t taught until level 3 in Ontario, which made things complicated if, for instance, a level 2 apprentice electrician got laid off in Windsor and wanted to try his/her luck in Alberta.

As I say, I’ve been out of this file for a while, but what seems to have happened is that the provincial directors of apprenticeship got together and actually co-ordinated things like training sequencing, number of weeks of in-class training, etc., and this is what they refer to as “harmonization”.  According to that federal website, this harmonization initiative is about halfway done – i.e. about half the Red Seal trades were harmonized in 2016 and 2017, and the rest will be rolled out in stages over the next couple of years.

So, a triumph for the Canadian apprenticeship system?  Well, not so fast.

Not all trades programs are apprenticeship programs, but the curriculum still has to line up, because everyone wants graduates of pre-employment trades programs to be able to become apprentices in that area.  So national harmonization of apprenticeship programs in effect means nationalization of the entire trades curriculum.  And that means all those local industry committees that every community college program has have suddenly become a lot less effective, because significant curriculum changes now have to be negotiated among ten provincial directors of apprenticeship.

Traditionally, those committees have been a point of pride in Canada because they have given trades programs the ability to respond quickly to business needs.  Now, their effectiveness has been traded away in the name not of journeyperson mobility but of apprentice mobility, which was a thing in the resource boom but maybe not so much in the bust.  Is that a smart trade-off?  I suspect the answer varies quite a bit by trade, and yet this solution is being applied uniformly across Red Seal trades.

We are told “industry” asked for this change, but I really wonder who was part of the consultation.  I can certainly believe that big industry with training efforts in many different provinces asked for it.  I can believe that extractive industries asked for it.  I have a harder time believing that small and medium-sized enterprises asked for it, because it substantially lowers their ability to affect curriculum and to some degree lowers the value of apprentices to them.

Silver linings have clouds, basically.  And centralized curricula have trade-offs.

May 10

Why Education in IT Fields is Different

A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth.  (It’s brilliant.  Read it).  It’s an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry.  And the answer, it turns out, is that despite all the investment in capital which permitted vast jumps in labour productivity, in fact wages didn’t rise that much at all.  Like, for about fifty years.

Sound familiar?

What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution.  And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills.  It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers.  Human capital was highly firm-specific.

The upshot of all this: in fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through “learning by doing”.  There’s simply no way to learn the skills in advance.  That meant that workers had lower levels of bargaining power, because they couldn’t necessarily use the skills acquired at one job at another.  It also meant, not to put too fine a point on it, that formal education became much less important compared to “learning by doing”.

The equivalent industry today is Information Technology.  Changes in the industry happen so quickly that it’s difficult for institutions to provide relevant training; it’s still to a large extent a “learning by doing” field.  Yet, oddly, the preoccupation among governments and universities is: “how do we make more tech graduates?”

The thing is, it’s not 100% clear the industry even wants more graduates.  It just wants more skills.  If you look at how community colleges and polytechnics interact with the IT industry, it’s often through the creation of single courses which are designed in response to very specific skill needs.  And what’s interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field.  That means these college IT courses are true “microcredentials” in the sense that they are short, potentially stackable, and have recognized labour market value.  Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding “bootcamps” attempt to replicate this with varying degrees of success, though since they are usually starting with people from outside the industry, it’s not as clear that the credentials they offer are viewed the same way by industry).

Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT.  The problem is most people are not in IT.  Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries.  You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages.  There is simply no labour-market value to disaggregating a nursing credential, so why bother?

And so the lesson here is this: IT work is a pretty specific type of work in which much store is put in learning-by-doing and formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials.  But most of the world of work doesn’t work that way.  And as a result, it’s important not to over-generalize future trends in education based on what happens to work in IT.  It’s sui generis.

Let tech be tech.  And let everything else be everything else.  Applying tech “solutions” to non-tech “problems” isn’t likely to end well.

May 09

Conservative Leadership Platform Analysis

So, I just read through all thirteen leadership candidates’ websites, looking for their thoughts on all the stuff this blog cares about: post-secondary education, skills, science, innovation, youth, etc.

The things I do for you people.

Actually, it was a pretty quick exercise, because it turns out almost no one in the Tory leadership race places much importance on post-secondary education, skills, innovation, or youth.  They seem to care a lot about taxes and immigration (and to a lesser extent guns), but for a party that was in government less than two years ago, the Conservative candidates seem to have remarkably little appreciation for the things that actually drive a modern economy.  Anyways, briefly, here is what the candidates say about the issues this blog cares about.

Chris Alexander (Former Minister of Citizenship & Immigration, ex-MP Ajax-Pickering): No specific platform on higher education, but the topic does come up frequently in his policies.  Expanding educational exports to Asia is a priority.  He says he wants 400,000 new international students/year by 2020 and 500,000 per year by 2023 (I’m pretty sure he does not actually mean “new” as in new visa applications every year; I think that’s the total in the country at any one time).  He also wants to spend money on new National Centres of Excellence and Centres of Excellence for Commercialization and Research for the digital economy, as well as invest more in research related to art and design (I assume OCAD’s Robert Luke has something to do with that one).  He also has a general pledge to incentivize PSE institutions to collaborate more with “incubators accelerators and companies of all sizes”, whatever that means.

Maxime Bernier (Former Minister of Industry, Foreign Affairs, and Minister of State for Small Business, MP for Beauce): The main point of interest in the Bernier platform is the rise in the personal tax exemption to $15,000 per year, which will have favourable impacts for many students.  Under his health platform, Bernier indicates he wants the federal government to vacate the health field and transfer tax points to the provinces; though he does not say so explicitly, it’s a fairly safe assumption that the same would apply to the transfer of funds to provinces for post-secondary education under the Canada Social Transfer.

Steven Blaney (Former Minister of Public Safety, MP Bellechasse—Les Etchemins—Lévis): Nothing at all.

Michael Chong (Former Minister of Intergovernmental Affairs and Sport, MP Wellington-Halton Hills): Nothing at all.

Kellie Leitch (Former Minister of Labour and the Status of Women, MP Simcoe-Grey): Nothing at all.

Pierre Lemieux (Former MP Glengarry-Prescott-Russell): Nothing at all.  Are you seeing a pattern yet?

Deepak Obhrai (MP Calgary Forest Lawn): Nothing at all.

Erin O’Toole (Former Minister of Veterans Affairs, MP Durham): O’Toole is the only candidate with anything even vaguely resembling plans for science and innovation, in the form of a scheme to extend the notion of “flow-through shares” – a tax gimmick heavily used in resource industries to defray development expenses – to new life-sciences and tech companies as well.  More intriguing is O’Toole’s “Generation Kick-Start” platform, which promises everyone who completes a degree, diploma or apprenticeship an extra $100,000 of personal exemptions (i.e. $15K in reduced taxes) to be used before they turn 30.  That goes up to $300,000 if their credential is in an area where skills are in “short supply” (the definition is vague, but it seems to include engineers, coders and “skilled tradespeople”, even though three years into the oil slump the latter wouldn’t really qualify as “in demand”).  The latter half of the proposal is goofy, but the basic idea has a lot of merit.

Rick Peterson (A BC Investment Advisor of Some Sort): Nothing at all.

Lisa Raitt (Former Minister of Natural Resources, Labour, and Transportation, MP Milton): Like Maxime Bernier, Raitt proposes to raise the basic tax exemption to $15K.  She also wants to increase the (totally useless) apprenticeship completion grant to up to $4,000.

Andrew Saxton (ex-MP, North Vancouver): Saxton’s policy pages are – to put it mildly – light on detail.  However, he says he does want to invest in “skills training to ensure Canadian skills are matched with Canadian jobs” (whatever that means).  Also, having lived in Switzerland for some time, he advocates a Swiss-style apprenticeship program which extends into industries like banking, pharmaceuticals, etc.

Andrew Scheer (Former Speaker of the House of Commons, MP Regina-Qu’appelle): Scheer’s money proposals in education are limited to a pledge to give parents of students attending independent schools a tax deduction of up to $4,000 in tuition annually per child, and a tax credit of $1,000 (i.e. a $150 reduction in taxes) to parents who choose to homeschool their children.  In addition, Scheer pledges that “public universities or colleges that do not foster a culture of free speech and inquiry on campus” will “not have support from the federal government”.  He then lists the tri-councils and CRCs as specific funding mechanisms for which institutions would not be eligible; it is unclear if the ban would include CFI and – more importantly – CSLP.  Note that the ban would only cover public institutions; private (i.e. religious) institutions would be able to limit free inquiry – as indeed faith-based institutions do, for obvious reasons – and still be eligible for council funding.

Brad Trost (ex-MP Saskatoon-University): Nothing apart from a pledge for tax support to private education and homeschooling identical to Scheer’s.

And that’s the lot.  I think it’s fair to say that the field’s appreciation for the role of knowledge and skills in the modern economy is pretty weak.   Maybe dangerously so.  Still, if you are voting in this election and you think PSE and skills are important, your best bet is probably Chris Alexander; if you want to raise youth living standards, vote for O’Toole followed perhaps by Maxime Bernier or Lisa Raitt.

(And yes, I know the percentage of Conservative voters motivated by those two sets of issues is vanishingly small, but I only have this one shtick, so cut me some slack).

 

May 08

Naylor Report, Part II

Morning all.  Sorry about the service interruption.  Nice to be back.

So, I promised you some more thoughts about the Fundamental Science Review.  Now that I’ve had a lot of time to think about it, I think I’m actually surprised by what it doesn’t say, what it does say, and how many questions remain open.

What’s best about the report?  The history and most of the analysis are pretty good.  I think a few specific recommendations (if adopted) might actually be a pretty big deal – in particular the one saying that the granting councils should stop any programs forcing researchers to come up with matching funding, mainly because it’s a waste of everyone’s time.

What’s so-so about it?  The money stuff, for a start.  As I noted in my last blog post, I don’t really think you can justify a claim to more money based on the “proportion of higher ed research investment coming from the federal government”.  I’m more sympathetic to the argument that there need to be more funds, especially for early career researchers, but as noted back here, it’s hard to argue simultaneously that institutions should have unfettered rights to hire researchers but that the federal government should pick up responsibility for their career progression.

The report doesn’t even bother, really, to make the case that more money on basic research means more innovation and economic growth.  Rather, it simply states it, as if it were a fact (it’s not).  This is the research community trying to annex the term “innovation” rather than co-exist with it.  Maybe that works in today’s political environment; I’m not sure it improves overall policy-making.  In some ways, I think it would have been preferable to just say: we need so many millions because that’s what it takes to do the kind of first-class science we’re capable of.  It might not have been politic, but it would have had the advantage of clarity.

…and the Governance stuff?  The report backs two big changes in governance.  One is a Four Agency Co-ordinating Board for the three councils plus the Canada Foundation for Innovation (which we might as well now call the fourth council, provided it gets an annual budget as recommended here), to ensure greater cross-council coherence in policy and programs.  The second is the creation of a National Advisory Committee on Research and Innovation (NACRI) to replace the current Science, Technology and Innovation Council and do a great deal else besides.

The Co-ordinating committee idea makes sense: there are some areas where there would be clear benefits to greater policy coherence.  But setting up a forum to reconcile interests is not the same thing as actually bridging differences.  There are reasons – not very good ones, perhaps, but reasons nonetheless – why councils don’t spontaneously co-ordinate their actions; setting up a committee is a step towards getting them to do so, but success in this endeavour requires sustained good will which will not necessarily be forthcoming.

NACRI is a different story.  Two points here.  The first is that it is pretty clear that NACRI is designed to try to insulate the councils and the investigator-driven research they fund from politicians’ bright ideas about how to run scientific research.  Inshallah, but if politicians want to meddle – and the last two decades seem to show they want to do it a lot – then they’re going to meddle, NACRI or no.  Second, the NACRI as designed here is somewhat heavier on the “R” than on the “I”.  My impression is that as with some of the funding arguments, this is an attempt to hijack the Innovation agenda in Research’s favour.  I think a lot of people are OK with this because they’d prefer the emphasis to be on science and research rather than innovation but I’m not sure we’re doing long-term policy-making in the area any favours by not being explicit about this rationale.

What’s missing?  The report somewhat surprisingly punted on what I expected to be a major issue: namely, the government’s increasing tendency over time to fund science outside the framework of the councils, in such programs as the Canada Excellence Research Chairs (CERC) and the Canada First Research Excellence Fund (CFREF).  While the text of the report makes clear the authors have some reservations about these programs, the recommendations are limited to a “you should review that, sometime soon”.  This is too bad, because phasing out these kinds of programs would be an obvious way to pay for increased investigator-driven funding (though as Nassif Ghoussoub points out here, it’s not necessarily a quick solution, because funds are already committed for several years in advance).  The report therefore seems to suggest that though it deplores past trends away from investigator-driven funding, it doesn’t want to see these recent initiatives defunded, which might be seen in government as “having your cake and eating it too”.

What will the long-term impact of the report be?  Hard to say: much depends on how much of this the government actually takes up, and it will be some months before we know that.  But I think the way the report was commissioned may have some unintended adverse consequences.  Specifically, the fact that this review was set up in such a way as to exclude consideration of applied research – while perfectly understandable – is going to contribute to the latter being something of a political orphan for the foreseeable future.  Similarly, while the fact that the report was done in isolation from the broader development of innovation policy might seem like a blessing given the general ham-fistedness surrounding the Innovation file, in the end I wonder if the result won’t be an effective division of policy, with research being something the feds pay universities to do and innovation something they pay firms to do.  That’s basically the right division, of course, but what goes missing are vital questions about how to make the two mutually reinforcing.

Bottom line: it’s a good report.  But even if the government fully embraces the recommendations, there are still years of messy but important work ahead.

April 18

Naylor Report, Take 1

People are asking why I haven’t talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is: i) I’m swamped, ii) there’s a lot to talk about in there, and iii) I want to have some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it’s the latter, I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways the panel argues this point is that while Canada has among the world’s highest levels of Research and Development in the higher education sector – known as HERD if you’re in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous, and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:


Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education portion of the contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I’ll come back to it in a moment) plus ii) some imputation about unrecovered research overhead.  In the US, it’s just the latter.  Why?  Because the way the US collects data on HERD, the only faculty costs they capture are the chunks taken out of federal research grants.  Remember, in the US, profs are only paid 9 months per year and at least in the R&D accounts, that’s *all* teaching.  Only the pieces of research grant they take out as summer salary gets recorded as R&D expenditure (and also hence as a government-sponsored cost rather than a higher education-sponsored one).

But there’s a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion of HERD, then it’s worth remembering what’s going on in the denominator.  Aggregate salaries are the first component.  The second component is research intensity, as measured through surveys.  This appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30-35% of their time on research.  A more recent survey shows that this has risen to 42%.  I am not sure if this latest co-efficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education “spending” (or “investment”, if you prefer) on research, despite nothing really having changed at all (possibly it has been, and that is what explains the bump seen in expenditures in 2012-13).
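If you want to see how mechanically sensitive that number is, here is a quick back-of-the-envelope sketch.  The dollar figures are invented for illustration – they are not Statscan data – but the formula follows the two-component description above:

```python
# A minimal sketch (hypothetical numbers) of how the Canadian higher-ed-sponsored
# HERD component responds to the research-time coefficient alone.
aggregate_faculty_salaries = 10_000_000_000   # assumed figure, $ per year
imputed_unrecovered_overhead = 2_000_000_000  # assumed figure, $ per year

def he_sponsored_herd(research_time_share: float) -> float:
    """HE-sponsored HERD = aggregate salaries x share of time spent on research,
    plus an imputation for unrecovered research overhead."""
    return aggregate_faculty_salaries * research_time_share + imputed_unrecovered_overhead

old = he_sponsored_herd(0.35)  # coefficient in the range of the 2000 survey
new = he_sponsored_herd(0.42)  # coefficient from the more recent survey
print(f"Reported 'investment' rises {100 * (new - old) / old:.1f}% with no change in activity")
```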

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding on inquiry-driven research shouldn’t rise.  Just that using data on university-funded HERD might not be a super-solid base from which to argue that point.

April 17

British Columbia: Provincial Manifesto Analysis

On May 9th, our left-coasters go to the polls.  What are their options as far as post-secondary education is concerned?

Let’s start with the governing Liberals.  As is often the case with ruling parties, some of their promises are things that are both baked into the fiscal framework and will take longer than one term to complete (e.g. “complete re-alignment of $3 billion in training funds by 2024”), or are simply re-announcements of previous commitments (pages 85-86 of the manifesto appear to simply be a list of all the SIF projects the province already agreed to co-fund), or take credit for things that will almost certainly happen anyway (“create 1000 new STEM places”… in a province which already has 55,000 STEM seats and where STEM spots have been growing at a rate of about 1,700/year anyway… interestingly, the Liberals didn’t even bother to cost that one).

When you throw those kinds of promises away, what you are left with is a boatload of micro-promises, including: i) making permanent the current BC Training Tax Credit for employers, ii) creating a new Truck Logger training credit (yes, really), iii) spending $10M on open textbooks over the next 4 years, iv) reducing interest rates on BC student loans to prime, v) making minor improvements to student aid need assessment, vi) providing a 50% tuition rebate to Armed Forces Veterans, vii) creating a centralized province-wide admission system, and viii) allowing institutions to build more student housing (currently they are restricted from doing so because any institutional debt is considered provincial debt, and provincial debt is more or less verboten… so this is a $0 promise just to relax some rules).  There’s nothing wrong with any of those, of course, but only the last one is going to make any kind of impact, and as a whole it certainly doesn’t add up to a vision.  And not all of this appears to be new money: neither the student loan changes nor the centralized application system promise is costed, which suggests funds for these will be cannibalized from elsewhere within the system.  The incremental cost of the remaining promises?  $6.5 million/year.  Whoop-de-do.  Oh, and they’re leaving the 2% cap on tuition rises untouched.

What about the New Democrats?  Well, they make two main batches of promises.  One is about affordability, and consists of matching the Liberal pledge on a tuition cap, slightly outdoing them on provincial student loan interest (eliminating it on future and past loans, which is pretty much the textbook definition of “windfall gains”), and getting rid of fees for Adult Basic Education and English as a Second Language Program (which, you know, GOOD).  There’s also an oddly-worded pledge to provide a $1,000 completion grant “for graduates of university, college and skilled trades programs to help pay down their debt when their program finishes”: based on the costing and wording, I think that means the grant is restricted to those who have provincial student loans.

The NDP also has a second batch of policies around research – $50M over two years to create a graduate scholarship fund and $100M (over an unspecified period, but based on the costing, it’s more than two years) to fund expansion of technology-related programs in BC PSE institutions.  There is also an unspecified (and apparently uncosted) promise to expand tech-sector co-op programs.  Finally, they are also promising to match the Liberals on the issue of allowing universities to build student housing outside of provincial controls on capital spending.

Finally, there are the Greens, presently running at over 20% in the polls and with a real shot at achieving a significant presence in the legislature for the first time.  They have essentially two money promises: one, “to create a need-based grant system” (no further details) and two, an ungodly bad idea to create in BC the same graduate tax credit rebate that New Brunswick, Nova Scotia and now Manitoba all have had a shot at (at least those provinces had the excuse that they were trying to combat out-migration; what problem are the BC Greens trying to solve?).

Hilariously, the Greens’ price-tag for these two items together is… $10 million.  Over three years.  Just to get a sense of how ludicrous that is, the Manitoba tax credit program cost $55 million/year in a province a quarter the size.  And within BC, the feds already give out about $75M/year in up-front grants.  So I think we need to credit the Greens with being more realistic than their federal cousins (remember the federal Green manifesto?  Oy.), but they have a ways to go on realistic budgeting.

(I am not doing a manifesto analysis for the BC Conservatives because a) they haven’t got one and b) I’ve been advised that if they do release one it will probably be printed in comic sans.)

What to make of all this?  Under Gordon Campbell, the Liberals were a party that “got” post-secondary education and did reasonably well by it; under Christy Clark it’s pretty clear PSE can at best expect benign neglect.  The Greens’ policies focus on price rather than quality, one of their two signature policies is inane and regressive, and their costing is off by miles.

That leaves the NDP.  I wouldn’t say this is a great manifesto, but it beats the other two.  Yeah, their student aid policies are sub-optimally targeted (they’re all for people who’ve already finished their programs, so not much access potential), but to their credit they’ve avoided going into a “tuition freezes are magic!” pose.  Alone among the parties, they are putting money into expansion and graduate studies and even if you don’t like the tech focus, that’s still something.

But on the whole, this is a weak set of manifestos.  I used to say that if I was going to run a university anywhere, I’d want it to be in British Columbia.  It’s the least-indebted jurisdiction in Canada, has mostly favourable demographics, and has easy access from both Asia (and its students) and the well-off American northwest.  And it’s got a diversified set of institutions which are mostly pretty good at what they do.  Why any province would want to neglect a set of institutions like that is baffling; but based on these manifestos it seems clear that BC’s PSE sector isn’t getting a whole lot of love from any of the parties.  And that’s worrying for the province’s long-term future.

April 12

Access: A Canadian Success Story

Statscan put out a very important little paper on access to post-secondary education on Monday.  It got almost zero coverage despite conclusively putting to bed a number of myths about fees and participation, so I’m going to rectify that by explaining it to y’all in minute detail.

To understand this piece, you need to know something about a neat little Statscan tool called the Longitudinal Administrative Database (LAD).  Every time someone files an income tax form for the first time, LAD randomly selects one in five of them and follows them for their entire lifetime.  If at the time someone first files a tax return they have the same address as someone who is already in the LAD (and who is the right age to have a kid submitting a tax form for the first time), one can make a link between a parent and child.  In other words, for roughly 4% of the population, LAD has data on both the individual and the parent, which allows some intergenerational analysis.  Now, because we have tax credits for post-secondary education (PSE), tax data allows us to know who went to post-secondary education and who did not (it can’t tell us what type of institution they attended, but we know that they did attend PSE).  And with LAD’s backward link to parents, it means we can measure attendance by parental income.
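For those who find this easier to follow in code, here is a purely conceptual sketch of that linkage logic.  To be clear, the column names, age bounds and toy records below are my own illustrative assumptions, not Statscan’s actual implementation:

```python
# A purely conceptual sketch (not Statscan's implementation) of the LAD-style
# parent-child linkage described above.
import pandas as pd

# Hypothetical existing LAD panel members (potential parents).
lad = pd.DataFrame({"parent_id": [1, 2],
                    "address": ["12 Oak St", "40 Elm Ave"],
                    "birth_year": [1962, 1990]})

# Hypothetical first-time tax filers this year.
first_filers = pd.DataFrame({"child_id": [101, 102],
                             "address": ["12 Oak St", "9 Pine Rd"],
                             "birth_year": [1998, 1997]})

# In the real LAD, one in five first-time filers is randomly added to the panel
# (e.g. first_filers.sample(frac=0.2)); here we link all of them for illustration.
links = first_filers.merge(lad, on="address", suffixes=("_child", "_parent"))

# Keep only pairs where the co-resident panel member is plausibly a parent
# (assumed here to be 15-50 years older than the new filer).
gap = links["birth_year_child"] - links["birth_year_parent"]
links = links[gap.between(15, 50)]

print(links[["child_id", "parent_id"]])  # child 101 linked to parent 1
```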

Got that?  Good.  Let’s begin.

The paper starts by looking at national trends in PSE participation (i.e. university and college combined) amongst 19 year-olds since 2001, by family income quintile.  Nationally, participation rates rose by just over 20%, from 52.6% to 63.8%.  They also rose for every quintile.  Even for youth in the lowest income quintile, participation is now very close to 50%.

 Figure 1: PSE enrolment rates by Income Quintile, Canada 2001-2014


This positive national story about rates by income quintile is somewhat offset by a more complex set of results for participation rates by region.  In the six eastern provinces, participation rates rose on average by 13.6 percentage points; in the four western provinces, they rose by just 2.8 percentage points (and in Saskatchewan they actually fell slightly).  The easy answer here is that it’s about the resource boom, but if that were the case, you’d expect to see a similar pattern in Newfoundland, and a difference within the west between Manitoba and the others.  In fact, neither is true: Manitoba is slightly below the western average, and Newfoundland had the country’s highest PSE participation growth rate.

 Figure 2: PSE Participation rates by region, 2002-2014


(Actually, my favourite part of Figure 2 is the data showing that 19 year-old Quebecers – who mostly attend free CEGEPs – have a lower participation rate than 19 year-old Ontarians, who pay significant fees, albeit with the benefit of a good student aid system.)

But maybe the most interesting data here is with respect to the closing of the gap between the top and bottom income quintile.  Figure 3 shows the ratio of participation rates of students from the bottom quintile (Q1) to those from the top quintile (Q5), indexed to the ratio as it existed in 2001, for Canada and selected provinces.  So a larger number means Q1 students are becoming more likely to attend PSE relative to Q5s and a smaller number means they are becoming less likely.  Nationally, the gap has narrowed by about 15%, but the interesting story is actually at the provincial level.
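In case the indexing is confusing, here is the arithmetic spelled out with made-up participation rates (purely illustrative, not figures from the paper):

```python
# Figure 3's metric: the Q1/Q5 participation ratio, indexed to its 2001 value.
q1 = {"2001": 0.30, "2014": 0.48}  # assumed bottom-quintile participation rates
q5 = {"2001": 0.70, "2014": 0.80}  # assumed top-quintile participation rates

index_2014 = (q1["2014"] / q5["2014"]) / (q1["2001"] / q5["2001"])
print(round(index_2014, 2))  # 1.4 – a value above 1 means the Q1-Q5 gap has narrowed
```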

Figure 3: Ratio of Q1 participation rates to Q5 participation rates, Canada and selected provinces, 2001-2014


At the top end, what we find is that Newfoundland and Ontario are the provinces where the gap between rich and poor has narrowed the most.  Given that one of these provinces has the country’s highest tuition and the other the lowest, I think we can safely rule out tuition, on its own, as a plausible independent variable (especially as Quebec, the country’s other low-tuition province, posted no change over the period in question).  At the bottom end, we have the very puzzling case of Saskatchewan, where inequality appears to have got drastically worse over the past decade or so.  And again, though it’s tempting to reach for a resource boom explanation, nothing similar happened in Alberta so that’s not an obvious culprit.

Anyways, here’s why this work is important.  For decades, the usual suspects (the Canadian Federation of Students, the Canadian Centre for Policy Alternatives) have blazed with self-righteousness about the effects of higher tuition and higher debts (debt actually hasn’t increased that much in real terms since 2000, but whatever).  But it turns out there are no such effects.  After more than a decade in which tuition continued to increase slowly and average debt among those who borrow exceeded $25,000, it turns out that not only did participation rates increase, but the participation rate of the poorest quintile rose fastest of all.

And – here’s the kicker – different provincial strategies on tuition appear to have had diddly-squat to do with it.  So the entire argument the so-called progressives make in favour of lower tuition is simply out the window.  That doesn’t mean they will change their position, of course.  They will continue to talk about the need to eliminate student debt because it is creating inequality (it’s actually the reverse, but whatever).  But of course, this makes the free-tuition position even sillier.  If the problem is simply student debt, then why advocate a policy in which over half your dollars go to people who have no debt?

It’s the Ontario result in particular that matters: it proves that a high-tuition/high-aid policy is compatible with a substantial widening of access.  And that’s good news for anyone who wants smart funding policies in higher education.

April 11

Populists and Universities, Round Two

There is a lot of talk these days about populists and universities.  There are all kinds of thinkpieces about “universities and Trump”, “universities and Brexit”, etc.  Just the other day, Sir Peter Scott delivered a lecture on “Populism and the Academy” at OISE, saying that over the past twelve months it has sometimes felt like universities were “on the wrong side of history”.

Speaking of history, one of the things I find a bit odd about this whole discussion is how little it is informed by the last time this happened – namely, the populist wave of the 1890s in the United States.  Though the populists never took power nationally, they did capture statehouses in many southern and western states, most of which had relatively recently taken advantage of the Morrill Act to establish important state universities.  And so we do have at least some historical record to work from – one that was very ably summarized by Scott Gelber in his book The University and the People.

The turn-of-the-20th-century populists wanted three things from universities. First, they wanted them to be accessible to farmers’ children – by which they meant both laxer admissions standards and “cheap”.  That didn’t necessarily mean they wanted to increase expenditures on university budgets substantially (though in practice universities did OK under populist governors and legislators); what it meant was they wanted tuition to remain low and if that entailed universities having to tighten their belts, so be it.  And the legacy of the populists lives on today: average state tuition in the US still has a remarkable correlation to William Jennings Bryan’s share of the vote in the 1896 Presidential election.

 

Fig 1: 2014-15 In-State Tuition Versus William Jennings Bryan’s Vote Share in 1896


The second thing populists wanted was more “practical” education.  They were not into learning for the sake of learning, they were into learning for the sake of material progress and making life easier for workers and farmers; in many ways, one could argue that their attitude about the purpose of higher education was pretty close to that of Deng/Jiang-era China.  And to some extent they were pushing on an open door because the land-grant universities – particularly the A&Ms – were already supposed to have that mandate.

But there was a tension in the populists’ views on curriculum.  They weren’t crazy about law and humanities programs at state universities (too much useless high culture that divided the masses from the classes), but they did grasp that an awful lot of people who were successful in politics had gone through law and humanities programs and – so to speak – learned the tricks of the trade there (recall that rhetoric was one of the seven liberal arts, which still played a role in 19th-century curricula).  And so there was also concern that if public higher education were made too vocational, its beneficiaries would still be at a disadvantage politically.  There were various solutions to this problem, not all of which were to the benefit of humanities subjects, but the key point was this: universities should remain places where leaders are made.  If that meant reading some Marcus Aurelius, so be it: universities were a ladder into the ruling class, and the populists wanted to make sure their kids were on it.

And here, I think is where times have really changed. The new populists are, in a sense, more Gramscian than their predecessors.  They get that universities are ladders to power for individuals, but they also understand that the cultural function of universities goes well beyond that.  Universities are – perhaps even more so than the entertainment industry – arbiters of acceptable political discourse.  They are where the hegemonic culture is made.  And however much they may want their own kids to get a good education, today’s populists really want to smash those sources of cultural hegemony.

This is, obviously, not good for universities.  We can – as Peter Scott suggested – spend more time trying to make universities “relevant” to the communities that surround them.  Nothing wrong with that.  We can keep plugging away at access: that’s a given no matter who is in power.  But on the core issue of the culture of universities, there is no compromise.  Truth and open debate matter.  A commitment to the scientific method and free inquiry matter.  Sure, universities can exist without these things: see China, or Saudi Arabia.  But not here.  That’s what makes our universities different and, frankly, better.

No compromise, no pasarán.

April 10

Evaluating Teaching

The Ontario Confederation of University Faculty Associations (OCUFA) put out an interesting little piece the week before last summarizing the problems with student evaluations of teaching.  It contains a reasonable summary of the literature, and I thought some of it would be worth looking at here.

We’ve known for a while now that the results of student evaluations are statistically biased in various ways.  Perhaps the most important is that professors who mark more leniently get higher rankings from their students.  There is also the issue of what appears to be discrimination: female professors and visible-minority professors tend to get lower ratings than white men.  And then there’s the point OCUFA makes about the comments sections of these evaluations being a hotbed of statements which amount to harassment.  These points are all well worth making.

One might well ask: given that we all know about the problems with teaching evaluations, why in God’s name do institutions still use them?  Fair question.  Three hypotheses:

  1. Despite flaws in the statistical measurement of teaching, the comments actually do provide helpful feedback, which professors use to improve their teaching.
  2. When it comes to pay and promotion, research is weighted far more highly than teaching, so unless someone completely tanks their teaching evals – and by tanking I mean doing so much below par that it can’t reasonably be attributed to one of the biases listed above – they don’t really matter all that much (note: while this probably holds for tenured and tenure-track profs, I suspect the stakes are higher for sessionals).
  3. No matter how bad a measurement instrument they are, the idea that one wouldn’t treat student opinions seriously is totally untenable, politically.

In other words, there are benefits despite the flaws, the consequences of flaws might not be as great as you think, and to put it bluntly, it’s not clear what the alternative is.  At least with student evaluations you can maintain the pretense that teaching matters to pay and promotion.  Kill those, and what have you got?  People already think professors don’t care enough about teaching.  Removing the one piece of measurement and accountability for teaching that exists in the system – no matter how flawed – is simply not on.

That’s not to say there aren’t alternatives to measuring teaching.  One could imagine a system of peer evaluation, where professors rate one another.  Or one could imagine a system where the act of teaching and the act of marking are separated – and teachers are rated on how well their students perform.  It’s not obvious to me that professors would prefer such a system.

Besides, it’s not as though the current system can’t be redeemed.  Solutions exist.  If we know that easy markers get systematically better ratings, then normalize ratings based on the class average mark.  Same thing for gender and race: if you know what the systematic bias looks like, you can correct for it.  And as for ugly stuff in the comments section, it’s hardly rocket science to have someone edit the material for demeaning comments prior to handing it to the prof in question.
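To make that concrete, here is a rough sketch of what normalizing ratings against the class average mark could look like.  The data, column names and the simple regression adjustment are my own assumptions, not an established institutional method:

```python
# A minimal sketch of adjusting instructor ratings for marking leniency:
# strip out the component of the rating predicted by the class's average mark,
# then re-centre the residuals on the overall mean rating.
import pandas as pd
import statsmodels.formula.api as smf

evals = pd.DataFrame({
    "instructor_rating": [4.5, 3.9, 4.2, 3.4, 4.8, 3.7],  # hypothetical course-level means
    "class_avg_mark":    [82,  74,  78,  70,  85,  72],   # hypothetical class average marks
})

# Fit the systematic relationship between average mark and rating...
model = smf.ols("instructor_rating ~ class_avg_mark", data=evals).fit()

# ...and report each instructor's rating relative to what their marking
# leniency alone would predict.
evals["adjusted_rating"] = model.resid + evals["instructor_rating"].mean()
print(evals)
```

In practice one would fit something like this at the course-section level, with many more observations and covariates; the toy version above is just meant to show the shape of the correction.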

There’s one area where the OCUFA commentary goes beyond the evidence, however, and that’s in trying to extend findings about student teaching evaluations (i.e. how did Professor X do in Class Y) to surveys of institutional satisfaction.  The argument they make here is that because the one is known to have certain biases, the other should never be used to make funding decisions.  Now, without necessarily endorsing the idea of using student satisfaction as a funding metric, this is terrible logic.  The two types of questionnaires are entirely different, ask different questions, and simply are not subject to the same kinds of biases.  It is deeply misleading to imply otherwise.

Still, all that said, it’s good that this topic is being brought into the spotlight.   Teaching is the most important thing universities do.  We should have better ways of measuring its impact.  If OCUFA can get us moving along that path, more power to them.
