Higher Education Strategy Associates


October 03

Peas in a Pod

A few weeks ago, there was an absolutely hysterical story on CBC about a Fraser Institute report on carbon taxes.  You can read the article for yourself, but the argument was basically this: carbon taxes are bad because they would have a disproportionate effect on people in lower income brackets.

Assuming you believe the Fraser Institute actually gives a rat’s hairy behind about people in lower income brackets, this is not an entirely stupid point; multiple studies in the US have come to this conclusion. But it depends quite a bit on the design of the tax: if you use part of the revenue to fund lump-sum transfers to poorer families to offset the effects of the tax, you can actually develop a tax which is relatively progressive (see this paper from Resources for the Future for some simulations on the incidence of different types of carbon taxes).  So yes, if you design a bad carbon tax, it probably will have regressive effects.  But design a good one and you’re off to the races.

You may at this point be asking yourself: “why is Alex droning on about carbon taxes in what is ostensibly a higher education blog?”  Fair question. And the answer is: because the Fraser Institute’s argument about carbon taxes is EXACTLY the same as the argument the CFS, CAUT and the usual suspects on the left make against tuition.  Fees are seen as regressive because they represent a higher proportion of family income for the poor than for the rich (see here for an example).

Now, if we believe that CFS and the usual suspects on one side and the Fraser Institute on the other both actually believe their own argument, then we have a possibility of some radical political re-alignment in Canada.  The hard left should oppose carbon taxes, the hard right should oppose tuition fees – after all, who would want to hurt the poor?

But, as you may suspect, that isn’t the whole story.  In precisely the same way that the Fraser Institute assumes away any sensible attempt to hold the poor harmless for a carbon tax through rebates or transfers, the usual suspects on the left completely ignore grants and scholarships as an offset to tuition fees, and so exaggerate – and occasionally entirely misrepresent – the actual distributional impact of net tuition.  One of the reasons I was so pleased last year about the Government of Ontario’s decision to make net tuition “free” for low income students was not so much because it improved students’ welfare (net tuition was already less than zero for many thousands of students), but precisely because it makes this rhetorical BS harder to maintain.

Anyway, even if student grants or energy tax rebates didn’t exist, objecting to putting a price on something because any non-zero price “impacts the poor more than the rich” is insane.  You could object to every product in a market economy that way: beer, popcorn, baby formula, pistachios – they all “impact the poor more than the rich”.  The point is to raise incomes at the bottom to help people purchase more goods at less of a burden, not get rid of the price mechanism.  You’d think that a right-wing pro-market think-tank might actually grasp that.

But then of course, said right-wing think tank does understand this.  Their argument is an argument of convenience and not conviction.  In the service of defeating carbon taxes, no argument is too stupid to make.  As is the case for the usual suspects and their hatred of tuition.  Peas in a pod.

September 30

Athletics Scholarships in Canada

Time was, about twenty years ago, Canadian universities didn’t spend money on university athletic scholarships.  Then things changed and universities turned on the taps.  Today we ask the question: “how’s that going for everyone”?

Well, it’s not going too badly, if you’re an athlete.  Some 5,830 students received athletic scholarships totalling $15,981,189 in 2013-14 – a little under $3,000 a pop.  CIS officially recognizes twenty-one sports, nine of which have teams for both genders (eighteen teams total), plus football, which is male-only, and rugby and field hockey, which are female-only.  However, roughly 85% of the scholarship dollars are concentrated in just nine sports, as shown below in Figure 1.  Some sports have almost no scholarships at all: inter-collegiate curling, for instance, has only 16 scholarships nationally across both sexes.

Figure 1: Top Sports by Scholarship Expenditure, 2013-14


What’s interesting here is that over time, the amount of money spent on athletics scholarships has been rising quickly and steadily.  Even after accounting for inflation, Canadian universities spent nearly three times as much on athletics scholarships ($16 million vs. $5.8 million) in 2013-14 as they did ten years earlier.  It’s an interesting choice of expenditure by allegedly cash-strapped institutions.
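If you want to check that claim, the arithmetic is trivial (the figures are the ones quoted above, both already in constant 2014 dollars):

```python
# Sanity check on the real-dollar comparison in the post
# (both figures in constant $2014, as per Figure 2).
spend_2013 = 16.0   # $ millions, 2013-14
spend_2003 = 5.8    # $ millions, 2003-04

growth_factor = spend_2013 / spend_2003
print(f"{growth_factor:.2f}x")  # roughly 2.76x, i.e. "nearly three times"
```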

Figure 2: Total Athletics Scholarships by Gender, 2003-4 vs 2013-4, in constant $2014


I suspect most institutions would probably defend it as a kind of strategic enrolment investment, much the way they defend other kinds of tuition discounting.  I mean, does it really matter if you give someone a $5,000 academic entrance scholarship or a $5,000 athletic scholarship?  They’re both forms of tuition discounting.  And of course, the absolute amounts are trivial.  $16 million is only 1% of the total amount of funding given by universities to students (if you include funding to graduate students).  And if you want to get into truly ludicrous comparisons, it’s less than what the University of Michigan spends on salaries for its football coaching staff.

A final point to make here is around gender equity.  Male and female athletes receive awards at roughly the same rate (about 45% of athletes of each gender receive an award), which is good.  However, imbalances remain, both in the number of athletic team spots for men versus women (53% of all team spots are male, compared to about 41% of undergraduates as a whole) and in the size of the average award ($3,286 vs. $2,737).  Those results are better than they were a decade ago, and they appear to be slightly better than in the US, where actual legislation exists in the form of Title IX to enforce equity in sports, but they are still some ways from equal.

September 29

The Ontario NDP’s Bad Student Loan Math

The Ontario NDP have started down the road to madness on student aid.  Someone needs to stop them.

Here’s the issue: the NDP have decided to promise to make all Ontario student loans interest-free.  As a policy, this is pretty meh.  It’s not the kind of policy that increases participation, because students don’t really pay attention to loan interest, and it’s not going to make loans a whole lot more affordable, because Ontario forgives most loans anyway (as a consequence, something like 90% of all loans in repayment in Ontario are federal loans which wouldn’t be subject to this policy).   My back-of-the-envelope calculation is that this policy might save a typical borrower in repayment something like $5/month, which isn’t a big deal as far as affordability is concerned.  One could argue that affordability of loan repayments shouldn’t be a big priority, since loan payments as a fraction of average graduate income have gone down by about a third in the past fifteen years; but on the other hand, this isn’t likely to cost very much either, so really, who cares?

No, the problem isn’t so much the proposed program as it is the tagline that’s gone along with it. To wit: “The government shouldn’t be making a profit from student debt”.


I mean, where to begin with this stonking bit of nonsense?

The worst-case interpretation of this is that the NDP actually believes that “interest” equals “profit”, or, to put it another way, that money has no time-value.  Read literally, it suggests that all interest is usury.  The NDP is sometimes accused of being stuck in the 70s as far as economic policy is concerned; this particular slogan suggests it might be more 1370s than 1970s.

More likely, though, this is the NDP aping Massachusetts Senator Elizabeth Warren, who has been saying these kinds of things about US student loans for a few years now.  The essence of the critique is this: governments borrow money cheaply and lend to students at a higher rate (in the US, the rate on Stafford undergraduate subsidized loans is the 10-year Treasury rate plus 250 basis points, and somewhat higher for other types of public loans).  The gap between the two rates is needed because of course the government loses money on loans through loan defaults (it also loses money by assuming the loan interest while a student is in school, but that’s a separate issue).  For reasons beyond comprehension, the US government does not base its financial calculations for student loans on actuarial reports which are linked to actual student behaviour, but rather on “standard conventions”, one of which essentially assumes no loan losses at all.  It is by using this convention – i.e. basically ignoring all actual costs – that Warren came to the conclusion that student loans “make money”. For a more complete description of why this is total nonsense, check out Jason Delisle’s work on the subject here, as well as articles from the Atlantic, the Washington Post and the Brookings Institution.

But even to the limited extent the Warren critique makes sense in the US, it doesn’t work in Ontario.  OSAP loses money.  A lot of it.  It doesn’t publish numbers directly on this, but it’s easy enough to work it out.  Ontario 10-year bonds go for about 2.5% these days, and OSAP lends to students at prime + 1%, or about 3.7%.  So Ontario’s spread is only 120 basis points, or half the American spread (CSLP loans are different: the feds borrow at 1% and lend at prime plus 250 basis points, for a total spread of 420 basis points).  120 basis points per year is not much when you consider that simply covering the cost of borrowing while students are in school is twice that.  Basically, it means that for someone who borrows for four years, the government loses money every time they pay back the loan in less than eight years.  And that’s not counting the cost of defaults, which are in the tens of millions of dollars each year.
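To see roughly where the eight-year figure comes from, here’s a deliberately crude sketch of the loan economics.  The draw amounts, straight-line repayment and zero in-school interest are my simplifying assumptions, not OSAP’s actual accounting:

```python
# Toy model of Ontario's student loan spread. Assumptions (mine):
# a student draws $2,500 at the start of each of four in-school years,
# pays no interest until graduation, then repays on a straight line.
PROV_COST = 0.025      # Ontario 10-year bond yield (government's cost of funds)
STUDENT_RATE = 0.037   # prime (~2.7%) + 1%
SPREAD = STUDENT_RATE - PROV_COST   # 120 basis points

def government_net(repay_years, annual_draw=2500, school_years=4):
    """Spread earned in repayment, minus the cost of carrying the
    loan interest-free while the student is in school."""
    principal = annual_draw * school_years
    # Each draw is carried interest-free until graduation
    in_school_cost = sum(annual_draw * PROV_COST * (school_years - i)
                         for i in range(school_years))
    # In repayment, the spread is earned on a declining balance
    gain, balance = 0.0, principal
    for _ in range(repay_years):
        gain += balance * SPREAD
        balance -= principal / repay_years
    return gain - in_school_cost

print(government_net(8))   # still negative
print(government_net(10))  # barely positive
```

Under these assumptions the province is still underwater after eight years of repayment and only barely above water at ten – which is the point: the 120-basis-point spread is a partial cost offset, not a profit.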

Put simply: Ontario students get to borrow at zero interest while in school, and positive-but-below-market rates after graduation despite default rates which are astronomical by the standards of any other personal loan product.  That costs the government money.  If it defrays some of that cost through an interest rate spread, so be it – that does not constitute “making a profit”.  It is simply stupid of any political party which wishes to be entrusted with public finances to suggest otherwise.

September 28

International Rankings Round-Up

So, the international rankings season is now more or less at an end.  What should everyone take away from it?  Well, here’s how Canadian Universities did in the three main rankings (the Shanghai Academic Ranking of World Universities, the QS Rankings and the Times Higher Rankings).


Basically, you can paint any picture you want out of that.  Two rankings say UBC is better than last year and one says it is worse.  At McGill and Toronto, it’s 2-1 the other way.  Universities in the top 200?  One says we dropped from 8 to 7, another says we grew from 8 to 9 and a third says we stayed stable at 6.  All three agree we have fewer universities in the top 500, but they disagree as to which ones are out (ARWU figures it’s Carleton, QS says it’s UQ and Guelph, and for the Times Higher it’s Concordia).

Do any of these changes mean anything?  No.  Not a damn thing.  Most year-to-year changes in these rankings are statistical noise: but this year, with all three rankings making small methodological changes to their bibliometric measures, the year-to-year comparisons are especially fraught.

I know rankings sometimes get accused of tinkering with methodology in order to get new results and hence generate new headlines, but in all cases, this year’s changes made the rankings better, either making them more difficult to game, more reflective of the breadth of academia, or better at handling outlier publications and genuine challenges in bibliometrics.  Yes, the THE rankings threw up some pretty big year-to-year changes and the odd goofy result (do read my colleague Richard Holmes’ comments on the subject here), but I think on the whole the enterprise is moving in the right direction.

The basic picture is the same across all of them.  Canada has three serious world-class universities (Toronto, UBC, McGill), and another handful which are pretty good (McMaster, Alberta, Montreal and then possibly Waterloo and Calgary).  16 institutions make everyone’s top 500 (the U-15 plus Victoria and Simon Fraser but minus Manitoba, which doesn’t quite make the grade on QS), and then there’s another half-dozen on the bubble, making it into some rankings’ top 500 but not others (York, Concordia, Quebec, Guelph, Manitoba).  In other words, pretty much exactly what you’d expect in a global ranking.  It’s also almost exactly what we here at HESA Towers found when doing our domestic research rankings four years ago. So: no surprises, no blown calls.

Which is as it should be: universities are gargantuan, slow-moving, predictable organizations.  Relative levels of research output and prestige change very slowly; the most obvious sign of a bad university ranking is rapid changing of positions from year to year.   Paradoxically, of course, this makes better rankings less newsworthy.

More globally, most of the rankings are showing rises for Chinese universities, which is not surprising given the extent to which their research budgets have expanded in the past decade.  The Times threw up two big surprises: first by declaring Oxford the top university in the world when no other ranker, international or domestic, has them in first place in the UK; and second by excluding Trinity College Dublin from the rankings altogether because it had submitted some dodgy data.

The next big date on the rankings calendar is the Times Higher Education’s attempt to break into the US market.  It’s partnering with the Wall Street Journal to create an alternative to the US News and World Report rankings.  The secret sauce of these rankings appears to be a national student survey, which has never been used in the US before.  However, in order to get a statistically significant sample (say, the 210-students per institution minimum we used to use in the annual Globe and Mail Canadian University Report) at every institution currently covered by USNWR would imply an astronomically large sample size – likely north of a million students.  I can pretty much guarantee THE does not have this kind of sample.  So I doubt that we’re going to see students reviewing their own institution; rather, I suspect the survey is simply going to ask students which institutions they think are “the best”, which amounts to an enormous pooling of ignorance.  But I’ll be back with a more detailed review once this one is released.
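For what it’s worth, here’s the back-of-the-envelope behind “north of a million”.  The institution count and response rate are my guesses, not anything THE has published:

```python
# Rough sizing of the survey a THE/WSJ US ranking would need.
# Assumptions (mine): roughly 1,000 ranked institutions, 210 completed
# responses per institution (the old Globe and Mail threshold), and an
# optimistic 20% response rate to the invitation.
institutions = 1000
completes_per_institution = 210
response_rate = 0.20

completes_needed = institutions * completes_per_institution   # 210,000
invitations_needed = completes_needed / response_rate
print(f"{invitations_needed:,.0f} students invited")  # north of a million
```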

September 27

Lying With Statistics, BC Edition

A couple of weeks ago I came across a story in the Vancouver Sun quoting a Federation of Post-Secondary Educators of BC (FPSE) “report” (actually more of a backgrounder) which contained two eye-catching claims:

  1.  “per-student operating grants have declined by 20 per cent since 2001 when adjusted for inflation.”
  2.  “government revenues from tuition fees have increased by almost 400 per cent since 2001”

The subtext here is clear.  20% down vs. 400% up?  How heinous!  How awful can the Government of British Columbia be?

Well now.  What to make of this dog’s breakfast?

Let’s start with the second point.  First of all, it’s institutional income, and not government income.  But leaving that aside, there was indeed a very big rise in tuition fees back in 2001-2 and 2002-3 (presumably why the authors chose 2001 as a base…if one used 2003 as a base, it would be a very different and much less dramatic story).   But if you simply look at average university tuition (college tuition is untracked) the increase since 2001 is only 110% (in nominal dollars).  Assume the increase for colleges was a bit higher because they were working from a lower base and perhaps we can nudge that up to 125%.  Still: how does one get from there to 400%?

First, remember that the authors (whoever they may be) are talking about aggregate tuition, not average tuition.  So some of this simply reflects an increase in enrolments.  In 2001-02, there were 196,000 students in BC.  In 2013-14, the last year for which we currently have data, there were 277,515 – an increase of 41%.  Back of the envelope, multiply that by the 110% nominal tuition increase and you get to about 176%.  Still a ways to go to 400%, though.

Second, a lot of students are moving from lower-cost programs to higher-cost programs.  Some of that is happening within universities (e.g., from Arts to Engineering), but in BC it’s mostly a function of colleges turning themselves into universities and charging more in tuition.  University enrolment rose from 80,388 to 179,917, while college enrolment went from 116,007 to 97,598.  That’s a lot of extra fees.

Third, BC has a lot more international students than it used to, and they pay more in fees on an individual basis than domestic students do.  Add those two factors together and you get another 19% or so increase in aggregate fees, which brings us to a 210% total increase.
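Here’s the back-of-the-envelope redone strictly as multiplicative factors (my framing – the rounding differs a little from the figures above, but the destination is the same):

```python
# Back-of-the-envelope on BC aggregate tuition revenue, 2001-02 to 2013-14,
# done as multiplicative factors (my framing; the post rounds differently).
enrol_2001, enrol_2013 = 196_000, 277_515
enrolment_factor = enrol_2013 / enrol_2001        # about 1.42 (a 41% rise)
fee_factor = 2.10    # the ~110% nominal increase in average tuition

aggregate_factor = enrolment_factor * fee_factor
print(f"roughly a {(aggregate_factor - 1) * 100:.0f}% increase")
```

Even before layering on the program-mix and international-fee effects, simple compounding puts the increase somewhere around 200% – in the neighbourhood of what the source data shows, and nowhere near 400%.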

That’s still nowhere near 400%.  So, I went and checked the source data – Statistics Canada’s Financial Information of Universities and Colleges (FIUC) for Universities (cansim 477-0058 if you’re a nerd) and the Financial Information of Community Colleges and Vocational Schools (cansim 477-0060) to try to find an answer.  Here’s what I found:


Yeah, so actually not 400%, more like 207% – reasonably close to the 210% from our back-of-the-envelope exercise.  The best excuse I can come up with for FPSE’s number is that if you extend the universities figure out another year (to 2014-15), you get to $1.258B, which is almost four times (actually 3.74x) the 2001-02 figure – and even that is still only a 274% increase.  But you have to a) torque the living daylights out of the numbers, and b) actively confuse percentage increases with multiples, to get there.

But now let’s move over to the other side of the ledger, where the Federation notes a 20% drop in government support per student, adjusted for inflation.  Let’s note straight off the first inconsistency: they’re adjusting the government grants for inflation and not doing the same for tuition.  Second inconsistency: they’re adjusting the government grants for the size of the student population and not doing the same for tuition.

It’s easy to see why FPSE does this.  As we’ve already noted, student numbers were up by 41% between 2001-02 and 2013-14.  Just do the math: a 20% per-student cut while student numbers rise by 41% actually means that government support has risen by 13%.  In real dollars.  (I went back to the source data myself and came up with 14% – close enough.)  Chew on that for a second: FPSE is ragging on a government which has increased funding for post-secondary education by – on average – 1% over and above inflation every year since 2001-02.
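The arithmetic, for anyone who wants to check it:

```python
# The same arithmetic FPSE should have shown its readers:
# a 20% real per-student cut alongside 41% more students.
per_student_factor = 0.80   # 20% cut, inflation-adjusted
enrolment_factor = 1.41     # 41% more students

total_grant_factor = per_student_factor * enrolment_factor
print(f"{(total_grant_factor - 1) * 100:.0f}%")  # about +13% in real dollars
```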

So quite apart from any problems with the 400% number, FPSE is making a deliberate apples-to-oranges comparison by adjusting only one set of figures for student growth and inflation.  Here’s how those numbers compare on a number of different apples-to-apples bases (and I’m being nice to FPSE here and allowing different end dates for fees and grants based on different data availability):


Now, it seems to me there’s enough in the Clark government’s record to mount a decent attack without resorting to this kind of nonsense.  It certainly under-invests relative to what it could be doing given the province’s growing population.  It carries a faux-populist, pro-extraction-industry line to the detriment of investing in expanding knowledge industries.  It has stayed out of step with most of the rest of the country in the last ten years by not improving student assistance.  And a fair, non-torqued comparison between student fees and government grants still shows students are bearing an increasing share of the cost.

So why stoop to using transparently false figures?  One might expect that kind of chicanery from the Canadian Federation of Students, which has form in this area.  But this is from an organization which represents professors: people who actually use statistics in the search for the truth.  So why is the organization which represents them using statistics in a way that wouldn’t pass muster in an undergraduate course?

I’m quite sure most profs wouldn’t be OK with this.  So why do FPSE’s member locals tolerate it?

September 26

Reforming Funding for First Nations Students

I see from this article by John Ivison of the National Post that the issue of funding for post-secondary education for First Nations is becoming a bit of a hot potato.  Time for us to take a look at the situation.

I think most people now get that First Nations students don’t receive “free education”.  They pay tuition fees like everyone else.  What they do have (if they have “status”) is a parallel student aid system, called the Post-Secondary Student Support Program, or PSSSP. If you are unclear on the difference between “status” and “non-status” Indians, have a peek at this primer from the âpihtawikosisân blog.  PSSSP is a $322 million/year program, under which Aboriginal Affairs and Northern Development Canada (AANDC) distributes money according to a somewhat obscure formula to the 600-odd bands across the country.  They in turn hand out that money to their own members who wish to take higher education courses.

In theory, PSSSP is a need-based program, and bands are supposed to allocate money according to need “up to” maximums in various categories (tuition, living expenses, books, child care, etc).  But individual bands aren’t blessed with a whole lot of need-assessment capability, and in practice pretty much everyone who is admitted to the system gets the maximum.  And so given a relatively stable overall budget (the program’s growth has been capped at 2% per year since – if I recall correctly – 1990), increasing costs and increasing numbers of people wishing to use the program, what happens is that many are not able to access the program at all.  About 20,000 students receive money each year, with each receiving on average about $15,000.  Each band has a “waiting list” in which people are prioritized according to various criteria.  The total number of people on waiting lists is not an easy number to pin down, but it’s generally estimated at just north of 10,000 students.

Lifting the growth cap on PSSSP should be a relatively high priority, and the Liberals did promise an extra $50 million infusion in their manifesto.  One of the biggest disappointments in last year’s federal budget was the Liberal government’s failure to follow through on this promise.  The story here is complicated, but basically goes like this: in their costing document, the Liberals misunderstood how much was actually in the AANDC budget and so had a hole in their projections when it came to paying for their promises on Aboriginal K-12 education.  In order to meet Aboriginal groups halfway (actually somewhat less than that) on K-12 spending, they killed the PSSSP increase and threw the money into the K-12 pot.

Unwilling to spend any political capital going after a government which was sky-high in the polls, aboriginal groups were very low-key in their criticism of the budget.  After all, a majority government is in power for four years – and there’s some chance the Liberals will make good on their PSSSP promise in their second budget.  Perhaps to that end, towards the end of the summer criticism grew from both First Nations and student groups (a hat-tip to the Canadian Federation of Students on this one); expect a renewed push on this front over the fall.

But there’s another element to this story not getting much play.  A few years ago, in a paper I wrote for AANDC, I noted that in fact almost everyone who qualifies for PSSSP would also qualify for the Canada Student Grant.  That would be another $3,000 per student per year.  Multiply that out across the 20,000 or so students currently getting PSSSP and that’s $60 million a year.  So if you could get everyone who currently gets PSSSP to also sign up for the Canada Student Grant, and then have bands deduct that amount from their PSSSP award, bands would save enough to fund another 4,000 students at current PSSSP levels of funding.
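The arithmetic here is simple enough to show in full (all figures are the approximations quoted above):

```python
# The PSSSP/Canada Student Grant offset math from the paragraph above.
psssp_students = 20_000    # current PSSSP recipients (approx.)
csg_amount = 3_000         # Canada Student Grant, $ per student per year
avg_psssp_award = 15_000   # average PSSSP support, $ per student per year

freed_up = psssp_students * csg_amount          # dollars freed up each year
extra_students = freed_up // avg_psssp_award    # more students fundable
print(freed_up, extra_students)  # $60 million, 4,000 students
```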

Sound straightforward?  It’s not.  There are a whole bunch of barriers to getting bands to behave this way, not least the view that First Nations should receive their money for PSE on a nation-to-nation basis (i.e. like PSSSP), rather than having to go around clawing it back $3,000 at a time from other government programs.  And yet, according to that Ivison article, it does seem like the Government of Canada is trying to push the idea of getting more status Indians with PSSSP funding onto student grants.

Is this an alternative to an extra $50 million in PSSSP, or in addition to it?  Who knows?  Is the plan to have bands claw back the new grant money from current students, and then distribute it to alleviate wait lists, or to give existing students more money?  Who knows?  I suspect in practice the answer may vary from band to band.  Either way, the next few months may promise a new era in First Nations post-secondary funding.

September 22

MOOCs at Five

It was five years ago last month that Stanford set up the first MOOC.  MOOCs were supposed to change the world: Udacity, Coursera and EdX were going to utterly transform education, putting many universities out of business.  Time to see how that’s going.

(OK, OK: the term MOOC was actually first applied to a 2008 University of Manitoba course led by George Siemens and Stephen Downes.  Technically, using Downes’ taxonomy, the 2008 MOOC was a “cMOOC” – the “c” standing for connectivist, if I am not mistaken – while the versions that became popular through Coursera, Udacity, EdX, etc. are “xMOOCs”, the difference being essentially that learning in the former is more participative and collaborative, while the latter has more in common with textbooks, only with video.  But the Stanford MOOC is what usually gets the attention, so I’m going to date it from there.)

In the interests of schadenfreude if nothing else, allow me to take you back to 2012/3 to look at some of the ludicrous things people said about the likely effects of MOOCs.

  • “In 50 years there will only be 10 institutions in the world delivering higher education” (Sebastian Thrun, former CEO of Udacity)
  • “Higher Education is now being disrupted; our MP3 is the massive open online course (or mooc), and our Napster is Udacity” – Clay Shirky.
  • “Higher education is just on the edge of a crevisse (sic)…five years from now these enterprises (i.e. universities) are going to be in real trouble” – Clayton Christensen

And of course who can forget that breathless cliché-ridden gem of an op-ed from Don Tapscott, about the week that higher education changed forever in January 2013 (i.e. the week he sat in on a couple of seminars on the subject in Davos).  Classic.

So, five years on, where are the MOOC pioneers now?  Well, Sebastian Thrun of Udacity got out of the disrupting-higher-education business early after coming to the realization that his company “didn’t have a good product”; the company pivoted to providing corporate training.  Over at Coursera, the most hyped of the early pioneers, founders Andrew Ng and Daphne Koller have both left the company (Ng left two years ago for Baidu; Koller left last month for one of Alphabet’s biotech enterprises).  Shortly after Koller’s departure, Coursera released this announcement, which was widely interpreted as the company throwing in the towel on the higher education market and following Udacity down the corporate training route.

EdX, the platform owned jointly by MIT and Harvard, thus seems to be the last MOOC provider standing.  Perhaps not coincidentally, it is also the one which has (arguably) been most successful in helping students translate MOOC work into actual credits.  It has partnered with Arizona State University in its “Global Freshman Academy”, and even allows conversion of some credits towards a specific MIT MBA (conditional on actually spending a semester on campus and paying normal fees to finish the program).   These “micro-MBAs” seem to be catching on, but precisely because they are “micro”, they haven’t made a big impact on EdX’s overall numbers: their user base is still less than half Coursera’s.

So what’s gone wrong?  It isn’t a lack of sign-ups.  The number of people taking MOOCs continues to grow at a healthy clip, with global enrolments to date now over 35 million.  The problem is there’s no revenue model here.  Depending on whose numbers you’re using, the share of users paying for some kind of certification (a fee which is usually priced in double digits) is at best around 3%.  So, work that out: 35 million users, with a 3% conversion rate, at $50 per user, and you’ve got a grand total of $52.5 million in total revenue.  Over five years.  And that’s while using content existing institutions essentially give them for free – content which costs anywhere between $50,000 and $250,000 per course to produce.
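Work that out explicitly and the problem is obvious (the $50 fee and 3% conversion are the generous end of the assumptions above):

```python
# The MOOC revenue model math from the paragraph above.
users = 35_000_000        # cumulative MOOC enrolments over five years
conversion_rate = 0.03    # share of users paying for certification
certificate_fee = 50      # $ per certificate (a typical double-digit price)

total_revenue = users * conversion_rate * certificate_fee
print(f"${total_revenue:,.0f} over five years")  # about $52.5 million
```

Set against production costs of $50,000 to $250,000 per course, that is nobody’s idea of a business.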

This is not sustainable and never was.  Whatever valid points MOOC boosters had about the current state of education (and I think they had more than a few), the proposed solution wasn’t one that met the market test.  The basic problem is (and always has been) that higher education is fundamentally a prestige market.  Distance education is low prestige; distance education which doesn’t give out actual course credit doubly so.  You can disguise this by making delivery the domain of top universities, as EdX and to a lesser extent Coursera did – but top institutions don’t want to diminish their prestige by handing out actual credit to the hoi polloi over the internet.   So what you get is this unsatisfying compromise which in the end not enough people want to pay for.

Some of us said this five years ago (here, here and here) when MOOC-mania was in full flow and critical faculties were widely suspended.  Which just goes to show: higher education is the world’s most conservative industry and the rate of successful innovation is tiny.  Your best bet for imagining what higher education looks like in the future is what it looks like today, only more expensive.


September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it’s pretty innocuous (“we need better/clearer data”), but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole “national approach” thing. Coates and Auld don’t make the argument directly, but they manage to slip a federal role in there. “Canada lacks a commitment to truly high-level educational accomplishment”, needs a “national strategy for higher education improvement” and so “the Government of Canada and its provincial and territorial partners should identify some useful outcomes”. To be blunt: no, they shouldn’t. I know there is a species of anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates & Auld’s point about the need for better data applies to provinces individually as well as collectively. They all need to get in the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here’s where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It’s an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronies united by grievances about parking and teaching loads they really are). But when you look at things like scholarly output, graduation rates, and career outcomes, the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has medical programs, and large graduate programs in hard sciences, it will have greater scholarly output. It’s the palette of program offerings rather than their quality which makes the difference when making inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we’d find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find the Alberta schools and the Maritime schools looking rather similar. Does that mean there’s been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada at least) location matters a lot more than educational quality when you’re dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his EPRI showed big inter-institutional gaps in graduates’ incomes (I will get round to covering Ross’ excellent work on the blog soon, I promise). “Ah, interesting!” people said. “Look At The Inter-Institutional Differences Now We Can Talk Quality”. Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (and I know this because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).

Now, maybe Coates and Auld have thought all this through and I’m getting nervous for no reason, but their article’s focus on institutional performance when most relevant outcomes are driven by geography, program and selectivity suggests to me that there’s a desire here to impose some simple rough justice over some pretty complicated cause-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but “grading” institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.

September 20

Sessionals: Equal Pay for Equal Work?

Following up on yesterday’s piece about counting sessionals, I thought it would be a useful time to address how sessionals get paid.  Every so often, the Ontario Confederation of University Faculty Associations (OCUFA) issues a press release asking that contract faculty get “equal pay for work of equal value”.  And who could be against that?  But what they don’t say, because no one wants to say this out loud, is that, in Canada, adjuncts and sessionals are far from being underpaid: for the most part they actually are compensated fairly.  At least according to the standards of the academy itself.

I know that’s an unpopular opinion, but hear me out.  Think about what the correct comparator to a sessional academic is: it is a junior-rank academic, one who has been given assistant professor status but is not yet tenured.  These days in Canada, average pay for such folks is in the $80,000 range (your mileage may vary based on an institution’s location and prestige).

How much of that $80,000 is specifically for teaching?  Well, within the Canadian academy, there is a rule of thumb that a professor’s time should be spent 40% on teaching, 40% on research and 20% on some hazily-defined notion of service.  So, multiply that out and what you find is that only $32,000 of a new tenure-track prof’s salary is devoted to teaching.

Now break that down per class.   Depending on the institution, a professor is (in theory at least) teaching either four or five semester-long classes per academic year (2/2 or 3/2, in the academic vernacular).  Divide that $32,000 payment for teaching by four and you get $8,000 per one-semester class; divide it by five and you get $6,400.  An “equal work for equal pay” number therefore needs to be somewhere in that range.
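The arithmetic above can be sketched in a few lines (a minimal illustration using the rough figures cited in the text, not actual salary data):

```python
# Rough figures from the text: average junior tenure-track salary and the
# 40-40-20 (teaching/research/service) rule of thumb.
AVG_JUNIOR_SALARY = 80_000
TEACHING_SHARE = 0.40

teaching_pay = AVG_JUNIOR_SALARY * TEACHING_SHARE  # portion of salary for teaching

# Per-course rate at a 2/2 load (four courses/year) vs a 3/2 load (five)
for courses_per_year in (4, 5):
    rate = teaching_pay / courses_per_year
    print(f"{courses_per_year} courses/year: ${rate:,.0f} per course")
```

Running this gives the $8,000 and $6,400 per-course figures that bracket the “equal work for equal pay” range.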

Here’s what we know about adjuncts’ salaries: in 2014, the Higher Education Quality Council of Ontario published a study on salaries of “non-full-time instructors” in the province.  It showed that sessional instructors’ salaries in 2012/13 ranged from a little under $6,000 per course to a little over $8,000 per course (with inflation, it is likely slightly higher now), with most of the big universities clustered in the low-mid $7,000 range.  At a majority of institutions, sessionals also get health benefits and may participate in a pension plan.  In 2013, University Affairs, the in-house publication of Universities Canada, published results of a nine-institution survey of sessional lecturer compensation (see here).  This showed a slightly wider range of compensation rates: at Quebec schools they were comparable to or slightly higher than Ontario rates, while elsewhere they were somewhat below.

To re-cap: if you buy the 40-40-20 logic of professorial pay, most universities in Canada – at least in central Canada – are in fact paying sessionals roughly the same as they are paying early-career tenure-track academics.  In some cases the benefits are not the same, and there may be a case for boosting pay a bit to compensate for that.  But the complaint that sessionals are vastly underpaid for the work they are contracted for?  Hard to sustain.

Sessionals themselves would naturally argue that they do far more than what they are contracted for: they too are staying academically active, doing research, etc.  To which the university response is: fine, but that’s not what we’re paying you for – you’re doing that on your own time.  The fight thus isn’t really about “equal pay”, it’s a fight about the right to be paid for doing research.

And of course OCUFA knows all this.  The math involved is pretty elementary.  It can’t really think these staff are underpaid unless it believes a) that the 40-40-20 thing is wrong and teaching should be a higher % of time and salary (good luck getting that one past the membership), or b) that sessionals need to be paid not on the same scale as assistant profs but on the scale of associates or full profs (again, I would question the likelihood of OCUFA’s membership thinking this is a good idea).

But if neither of those things is true, why does OCUFA use such language?  It’s a mystery worth pondering.

September 19

Counting Sessionals

Much rejoicing last Thursday when Science Minister Kirsty Duncan announced that the federal government was re-instating the funding for the Universities and Colleges Academic Staff System (UCASS), which was last run in 2011.     But what caught most people’s attention was the coda to the announcement, which said that Statistics Canada was going to “test the feasibility” of expanding the survey to include “part-time and public college staff” (the “C” in UCASS stands for colleges in the Trinity College sense, not the community college sense, so despite the name public colleges have never been in the survey).

What to make of this?  It seems that by “part-time” Statscan meant sessionals/adjuncts/contract faculty.  That’s a bit off because every university I know of makes a very sharp distinction between “part-time” (many of whom are tenured) and “sessionals”.  It makes one worry that Statistics Canada doesn’t understand universities well enough to use the correct terminology, which in turn bodes ill for their future negotiations with universities around definitions.

Because let’s be clear about this: universities will do almost anything they can to throw sand in the gears on this.  They do not want data on sessionals in the public eye, period.  Oh sure, in public the Presidents will welcome transparency, evidence-based decision-making, etc.  But institutional research shops – the ones who will actually be dealing with Statscan on this file – are Olympic champions in shutting down government attempts to liberate information.  In fact, that’s arguably their main purpose.  They won’t actually say no to anything – they’ll just argue relentlessly about definitions until Statscan agrees to a reduced program of data collection.  Statscan knows this is coming – they have apparently allocated four years (!!!) for negotiations with institutions, but the safest guess is that this simply isn’t going to happen.

And to be fair to universities, the kind of data UCASS would provide about sessionals would be pretty useless – a lot of work for almost nothing.  UCASS can count individuals, and track their average salaries.  But average salary data would be useless: it would conflate people teaching one course with people teaching five.  And since UCASS had no way to track workload (you’d actually need to blow up the survey and start again if you wanted to get at workload, and as interesting as that might be, good luck getting universities to green-light it), the data is meaningless.  Knowing the number of sessionals tells you nothing about what proportion of undergraduates are being taught by sessionals.  Are 200 sessionals teaching one course each worse than 100 teaching two courses apiece?  Of course not.  But if raw numbers are the only thing on offer then we’ll ascribe meanings to them where they arguably shouldn’t exist.

You see, “sessionals” are not really a single phenomenon.  Many are professionals who have full-time jobs and like teaching a class on the side, and they’re usually a huge boon to a department (especially in professional faculties like law, nursing and business) because they help expose students to a life beyond academia.  Others are PhD students teaching a class while another professor is away – and thus learning valuable skills.  The “bad” sessionals – the ones people claim to want to stamp out – are the ones who have a PhD and are teaching multiple classes the way professors do.  I suspect this is a pretty small percentage of total sessionals, but we don’t know for sure.  And adding sessionals to UCASS won’t get us any closer to finding out because even if they wanted to, universities couldn’t collect data on which of their employees have other full-time jobs outside the institution.

Despite all the kumbayahs last Thursday about how this UCASS expansion is going to promote “evidence-based decision-making”, I’m genuinely having trouble imagining a single policy problem where data from UCASS would make a difference.  Universities already know how many sessionals they employ and whether numbers are going up or down; UCASS might let them know how many sessionals other universities employ, but frankly, who cares?  It’s not going to make a difference to policy at an institutional level.

If you really wanted to know something about sessionals, you’d probably start with requiring institutions simply to provide contact information for every individual with teaching responsibilities who is not tenure-track, along with the amount paid to them in the previous academic year (note: Statscan couldn’t do this because it would never use the Stats Act to compel data this way.  Provincial governments could do so, however).  Then you’d do a survey of the instructors themselves – number of classes taught, other jobs they have, income from other jobs, etc.  Now I know some of you are going to say: didn’t some folks at OISE do that just recently?  Well, almost.  Yes, they administered almost this kind of survey, but because they weren’t drawing their sample from an administrative database, there’s no way to tell how representative their sample is and hence how accurate their results are.  Which is kind of important.

So, anyways, two cheers for the return of UCASS.  More data is better than less data.  But the effort to get data on sessionals seems like a lot of work for very little practical return even if universities can be brought round to co-operate.
