
Higher Education Strategy Associates

Author Archives: Alex Usher

September 29

The Ontario NDP’s Bad Student Loan Math

The Ontario NDP have started down the road to madness on student aid.  Someone needs to stop them.

Here’s the issue: the NDP have decided to promise to make all Ontario student loans interest-free.  As a policy, this is pretty meh.  It’s not the kind of policy that increases participation, because students don’t really pay attention to loan interest, and it’s not going to make loans a whole lot more affordable, because Ontario forgives most loans anyway (as a consequence, something like 90% of all loans in repayment in Ontario are federal loans, which wouldn’t be subject to this policy).  My back-of-the-envelope calculation is that this policy might save a typical borrower in repayment something like $5/month, which isn’t a big deal as far as affordability is concerned.  One could argue that affordability of loan repayments shouldn’t be a big priority, since loan payments as a fraction of average graduate income have gone down by about a third in the past fifteen years; but on the other hand, this isn’t likely to cost very much either, so really, who cares?
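For anyone who wants to check that $5/month figure, here is the arithmetic as a short sketch.  The $1,600 balance is my own assumption about a typical Ontario (provincial) portion of a loan in repayment, not an OSAP or NDP number; the 3.7% rate is OSAP’s prime + 1%, discussed further below.

```python
# Back-of-envelope check on the ~$5/month figure (my assumptions, not the NDP's).
ontario_balance = 1600   # assumed provincial portion of a loan in repayment ($)
osap_rate = 0.037        # OSAP rate: prime (~2.7%) + 1%

monthly_interest_saved = ontario_balance * osap_rate / 12
print(f"Monthly interest saved: ${monthly_interest_saved:.2f}")  # ~ $4.93
```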

No, the problem isn’t so much the proposed program as it is the tagline that’s gone along with it. To wit: “The government shouldn’t be making a profit from student debt”.


I mean, where to begin with this stonking bit of nonsense?

The worst-case interpretation of this is that the NDP actually believes that “interest” equals “profit”, or, to put it another way, that money has no time-value.  Read literally, it suggests that all interest is usury.  The NDP is sometimes accused of being stuck in the 70s as far as economic policy is concerned; this particular slogan suggests it might be more 1370s than 1970s.

More likely, though, this is the NDP aping Massachusetts Senator Elizabeth Warren, who has been saying these kinds of things about US student loans for a few years now.  The essence of the critique is this: governments borrow money cheaply and lend to students at a higher rate (in the US, the rate on Stafford undergraduate subsidized loans is the 10-year Treasury rate plus 250 basis points, and somewhat higher for other types of public loans).  The gap between the two rates is needed because of course the government loses money on loans through loan defaults (it also loses money by covering the loan interest while a student is in school, but that’s a separate issue).  For reasons beyond comprehension, the US government does not base its financial calculations for student loans on actuarial reports linked to actual student behaviour, but rather on “standard conventions”, one of which essentially assumes no loan losses at all.  It is by using this convention – i.e. basically ignoring all actual costs – that Warren came to the conclusion that student loans “make money”. For a more complete description of why this is total nonsense, check out Jason Delisle’s work on the subject here as well as articles from the Atlantic, the Washington Post and the Brookings Institution.

But even to the limited extent the Warren critique makes sense in the US, it doesn’t work in Ontario.  OSAP loses money.  A lot of it.  The province doesn’t publish numbers directly on this, but it’s easy enough to work it out.  Ontario 10-year bonds go for about 2.5% these days, and OSAP lends to students at prime + 1%, or about 3.7%.  So Ontario’s spread is only 120 basis points, or half the American spread (CSLP loans are different: the feds borrow at about 1% and lend at prime plus 250 basis points, for a total spread of roughly 420 basis points).  120 basis points per year is not much when you consider that simply covering the cost of borrowing while students are in school costs twice that annually.  Basically, it means that for someone who borrows for four years, the government loses money whenever the loan is paid back in less than about eight years.  And that’s not counting the cost of defaults, which runs to tens of millions of dollars each year.
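Here is that break-even arithmetic as a minimal sketch, using the same simplifications as the paragraph above: it ignores how the balance builds up in school and runs down in repayment, and it ignores default losses entirely.

```python
# Rough break-even calculation for Ontario's interest-rate spread on OSAP loans.
ontario_borrowing_cost = 0.025   # ~10-year Ontario bond rate
osap_lending_rate = 0.037        # prime + 1%
spread = osap_lending_rate - ontario_borrowing_cost   # ~1.2 percentage points per year

years_in_school = 4
# Interest the province eats while the student is in school and paying nothing
in_school_subsidy = ontario_borrowing_cost * years_in_school   # ~10 points of principal

breakeven_years = in_school_subsidy / spread
print(f"Years of repayment needed just to cover in-school interest: {breakeven_years:.1f}")  # ~8.3
```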

Put simply: Ontario students get to borrow at zero interest while in school, and positive-but-below-market rates after graduation despite default rates which are astronomical by the standards of any other personal loan product.  That costs the government money.  If it defrays some of that cost through an interest rate spread, so be it – that does not constitute “making a profit”.  It is simply stupid of any political party which wishes to be entrusted with public finances to suggest otherwise.

September 27

Lying With Statistics, BC Edition

A couple of weeks ago I came across a story in the Vancouver Sun quoting a Federation of Post-Secondary Educators of BC (FPSE) “report” (actually more of a backgrounder) which contained two eye-catching claims:

  1.  “per-student operating grants have declined by 20 per cent since 2001 when adjusted for inflation.”
  2.  “government revenues from tuition fees have increased by almost 400 per cent since 2001”

The subtext here is clear.  20% down vs. 400% up?  How heinous!  How awful can the Government of British Columbia be?

Well now.  What to make of this dog’s breakfast?

Let’s start with the second point.  First of all, it’s institutional income, not government income.  But leaving that aside, there was indeed a very big rise in tuition fees back in 2001-2 and 2002-3 (presumably why the authors chose 2001 as a base…if one used 2003 as a base, it would be a very different and much less dramatic story).  Yet if you simply look at average university tuition (college tuition is untracked), the increase since 2001 is only 110% in nominal dollars.  Assume the increase for colleges was a bit higher because they were working from a lower base, and perhaps we can nudge that up to 125%.  Still: how does one get from there to 400%?

First, remember that the authors (whoever they may be) are talking about aggregate tuition income, not average tuition.  So some of this simply reflects an increase in enrolments.  In 2001-2, there were 196,000 students in BC.  In 2013-14, the last year for which we currently have data, there were 277,515 – an increase of 41%.  Back of the envelope, multiply that by the 110% nominal tuition increase and that gets you to roughly a 176% increase.  Still a ways to go to 400%, though.

Second, a lot of students are moving from lower-cost programs to higher-cost programs.  Some of that is happening within universities (e.g., from Arts to Engineering), but in BC it’s mostly a function of colleges turning themselves into universities and charging more tuition.  University enrolment rose from 80,388 to 179,917, while college enrolment fell from 116,007 to 97,698.  That’s a lot of extra fees.

Third, BC has a lot more international students than it used to, and they pay more in fees on an individual basis than domestic students do.  Add those two factors together and you get another 19% or so increase in aggregate fees, which brings us to a 210% total increase.

That’s still nowhere near 400%.  So, I went and checked the source data – Statistics Canada’s Financial Information of Universities and Colleges (FIUC) for Universities (cansim 477-0058 if you’re a nerd) and the Financial Information of Community Colleges and Vocational Schools (cansim 477-0060) to try to find an answer.  Here’s what I found:

[Table: aggregate tuition fee income at BC universities and colleges, 2001-02 vs. 2013-14, from the FIUC and FINCOL data]

Yeah, so actually not 400%, more like 207% – reasonably close to the 210% from our back-of-the-envelope exercise.  The best excuse I can come up with for FPSE’s number is that if you extend the universities figure out another year (to 2014-15), you get to $1.258 billion, which is almost four times (actually 3.74x) the 2001-02 figure – but that is still only a 274% increase.  And you have to a) torque the living daylights out of the numbers and b) actively confuse percentage increases and multiples to get there.

But now let’s move over to the other side of the ledger, where the Federation notes a 20% drop in government support per student, adjusted for inflation.  Let’s note straight off the first inconsistency: they’re adjusting the government grants for inflation and not doing the same for tuition.  Second inconsistency: they’re adjusting the government grants for the size of the student population and not doing the same for tuition.

It’s easy to see why FPSE does this.  As we’ve already noted, student numbers were up by 41% between 2001-2 and 2013-14.  Just do the math: a 20% per-student cut while student numbers are rising by 41% actually means that government support has risen by 13%.  In real dollars.  (I went back to the source data myself and came up with 14% – close enough.)  Chew on that for a second: FPSE is ragging on a government which has increased funding for post-secondary education by – on average – 1% over and above inflation every year since 2001-2.
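Spelled out, for anyone who wants to check the arithmetic:

```python
# A 20% real per-student cut combined with 41% enrolment growth is still an
# increase in total real funding.
per_student_change = -0.20   # FPSE's figure, inflation-adjusted
enrolment_growth = 0.41      # 196,000 -> 277,515 students

total_real_change = (1 + per_student_change) * (1 + enrolment_growth) - 1
print(f"Change in total real government support: {total_real_change:.1%}")  # ~ +12.8%
```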

So quite apart from any problems with the 400% number, FPSE is making a deliberate apples-to-oranges comparison by adjusting only one set of figures for student growth and inflation.  Here’s how those numbers compare on a number of different apples-to-apples bases (and I’m being nice to FPSE here and allowing different end dates for fees and grants based on different data availability):

[Table: changes in tuition fee income and government grants compared on consistent bases]

Now, it seems to me there’s enough in the Clark government’s record to mount a decent attack without resorting to this kind of nonsense.  It certainly under-invests relative to what it could be doing given the province’s growing population.  It carries a faux-populist, pro-extraction-industry line to the detriment of investing in expanding knowledge industries.  It has stayed out of step with most of the rest of the country over the last ten years by not improving student assistance.  And a fair, non-torqued comparison between student fees and government grants still shows students are bearing an increasing share of the cost.

So why stoop to using transparently false figures?  One might expect that kind of chicanery from the Canadian Federation of Students, which has form in this area.  But this comes from an organization that represents professors: people who actually use statistics in the search for truth.  So why is their federation using statistics in a way that wouldn’t pass muster in an undergraduate course?

I’m quite sure most profs wouldn’t be OK with this.  So why do FPSE’s member locals tolerate it?

September 26

Reforming Funding for First Nations Students

I see from this article by John Ivison of the National Post that the issue of funding for post-secondary education for First Nations is becoming a bit of a hot potato.  Time for us to take a look at the situation.

I think most people now get that First Nations students don’t receive “free education”.  They pay tuition fees like everyone else.  What they do have (if they have “status”) is a parallel student aid system, called the Post-Secondary Student Support Program, or PSSSP.  If you are unclear on the difference between “status” and “non-status” Indians, have a peek at this primer from the âpihtawikosisân blog.  PSSSP is a $322 million/year program under which Aboriginal Affairs and Northern Development Canada (AANDC) distributes money, according to a somewhat obscure formula, to the 600-odd bands across the country.  They in turn hand out that money to their own members who wish to take higher education courses.

In theory, PSSSP is a need-based program, and bands are supposed to allocate money according to need, “up to” maximums in various categories (tuition, living expenses, books, child care, etc.).  But individual bands aren’t blessed with a whole lot of need-assessment capability, and in practice pretty much everyone who is admitted to the system gets the maximum.  And so, given a relatively stable overall budget (the program’s growth has been capped at 2% per year since – if I recall correctly – 1990), increasing costs, and increasing numbers of people wishing to use the program, the result is that many are not able to access the program at all.  About 20,000 students receive money each year, each receiving on average about $15,000.  Each band has a “waiting list” in which people are prioritized according to various criteria.  The total number of people on waiting lists is not an easy number to pin down, but it’s generally estimated at just north of 10,000 students.

Lifting the growth cap on PSSSP should be a relatively high priority, and the Liberals did promise an extra $50 million infusion in their manifesto.  One of the biggest disappointments in last year’s federal budget was the Liberal government’s failure to follow through on this promise.  The story here is complicated, but basically goes like this: in their costing document, the Liberals misunderstood how much was actually in the AANDC budget and so had a hole in their projections when it came to paying for their promises on Aboriginal K-12 education.  In order to meet Aboriginal groups halfway (actually somewhat less than that) on K-12 spending, they killed the PSSSP increase and threw the money into the K-12 pot.

Unwilling to spend any political capital going after a government which was sky-high in the polls, Aboriginal groups were very low-key in their criticism of the budget.  After all, a majority government is in power for four years – and there’s some chance the Liberals will make good on their PSSSP promise in their second budget.  Perhaps to that end, towards the end of the summer criticism grew from both First Nations and student groups (a hat-tip to the Canadian Federation of Students on this one); expect a renewed push on this front over the fall.

But there’s another element to this story not getting much play.  A few years ago, in a paper I wrote for AANDC (available here), I noted that in fact almost everyone who qualifies for PSSSP would also qualify for the Canada Student Grant.  That would be another $3,000 per student per year.  Multiply that out across the 20,000 or so students currently getting PSSSP, and that’s $60 million a year.  So if you could get everyone who currently gets PSSSP to also sign up for the Canada Student Grant, and then have bands deduct that amount from their PSSSP awards, bands would save enough to fund another 4,000 students at current PSSSP levels of funding.
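The arithmetic, spelled out (the figures are the approximate program numbers quoted above):

```python
# How switching PSSSP recipients onto the Canada Student Grant frees up funds.
psssp_recipients = 20_000   # current PSSSP recipients per year
csg_amount = 3_000          # Canada Student Grant, per student per year
avg_psssp_award = 15_000    # average PSSSP award per student

freed_up = psssp_recipients * csg_amount           # $60 million per year
extra_students = freed_up / avg_psssp_award        # ~4,000 additional students
print(f"Freed up: ${freed_up:,}; extra students fundable: {extra_students:,.0f}")
```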

Sound straightforward?  It’s not.  There are a whole bunch of barriers to getting bands to behave this way, not least of all the view that First Nations should receive their money for PSE on a nation-to-nation basis (i.e. like PSSSP), rather than having to go around clawing back $3,000 at a time from other government programs.  And yet, according to that Ivison article, it does seem like the Government of Canada is trying to push the idea of getting more status Indians with PSSSP funding onto student grants.

Is this an alternative to an extra $50 million in PSSSP, or in addition to it?  Who knows?  Is the plan to have bands claw back the new grant money from current students and then distribute it to alleviate wait lists, or to give existing students more money?  Who knows.  I suspect in practice the answer may vary from band to band.  Either way, the next few months may mark the start of a new era in First Nations post-secondary funding.

September 23

Social License and Tuition Fees

So, to Johannesburg, where South African Higher Education Minister (and Communist Party chief) Blade Nzimande finally announced the government’s decision on tuition for next year. He was in a tricky place: students are still demanding free tuition (see my previous story on the Fees Must Fall movement here) and will not accept a hike in fees. Meanwhile, universities are quite rightly feeling very stretched (it’s tough trying to maintain developed-world-calibre institutions on a tax base only part of which is developed-world): with inflation running at around 6.5%, a fee freeze would amount to a substantial cut in real income.

So what did the minister do? He pulled an Ontario (or a Chile, or a Clinton, if you prefer). Tuition to rise, but students from families with income of R600,000 or less (roughly C$56,000, or US$43,000) would be exempt from paying the higher tuition. Who exactly was going to verify students’ income is a bit of a mystery since the cut-off for student financial aid in South Africa is considerably below R600,000 (a justified cause of further student complaint), but no matter. The basic idea was clear: the well-off will pay, the needy will not. The exact amount extra they would pay? That would be up to individual universities. They could set their own tuition but were strongly advised not to try increasing fees by more than 8%.

It took student unions less than five seconds to find this inadequate and to denounce the government. Several unions have threatened to boycott classes if their institutions raise fees.

This raises an interesting question. Why, if students in Chile and Ontario are claiming victory (or at least partial victory) over similar fee regimes, do South African students reject this one? Well, context is everything. The key here is government legitimacy, or the lack thereof.

Let’s take the Charest government in the Spring of 2012. The tuition fee increase that the government proposed was not excessive, and poorer students in fact might have been better off once tax credits were factored in. But absolutely no one paid the slightest bit of attention to the policy details. This was a government that had outstayed its welcome, and was badly tarred by corruption scandals (my favourite joke from that spring: what’s the difference between a student leader and a Montreal mafia boss? Only one of them has to forswear violence in order to get a meeting with the Minister of Education). It had a good, saleable plan, but literally no political capital on which to draw. The plan, as we all know, failed.

(By the by, this is why, if the Couillard government is going to move on tuition fees, it’s going to have to do it this year. Their window is closing.)

I could go down the list here. The big anti-tuition fee protests that got the President of South Korea to promise to reduce tuition in the spring of 2011? That was at the tail end of a profoundly unpopular Presidency (though to be fair in Korea it’s the rare presidency that doesn’t end in profound unpopularity). The Chilean tuition protests of 2011-2? Also at the end of an unpopular presidency. By contrast, the largest tuition fee increase in the history of the world – the increase announced for England in the fall of 2010 – was essentially met with only a single rally, in part because the measure was introduced by a brand-new government which led in the polls. Basically, you need “social license” in order to do something unpopular on tuition fees. Some governments have it, others don’t.

The South African government is in precisely this kind of legitimacy crisis right now. It is not a simple matter of President Zuma’s unpopularity, though his increasingly kleptocratic regime is profoundly unhelpful. It’s a bigger crisis of post-apartheid society. Formal racial equality exists, but equality in economic opportunity, equality in educational opportunity: those are still very far away and in many ways are not much better than they were 20 years ago. Today’s youth, born after Nelson Mandela’s release from prison, no longer feel much loyalty to the ANC as the leader of “the struggle”. They simply see the party as being incompetent, corrupt, and incapable of delivering a better and more equal society.

And it’s that anger, that rage, which is driving the #feesmustfall movement. I think there’s a real chance this won’t end well; there has already been a serious uptick in violence on South African campuses. South Africa’s universities, unfortunately, may end up as collateral damage in a larger fight for the country’s future.

 

September 22

MOOCs at Five

It was five years ago last month that Stanford set up the first MOOC.  MOOCs were supposed to change the world: Udacity, Coursera and EdX were going to utterly transform education, putting many universities out of business.  Time to see how that’s going.

(Ok, ok: the term MOOC was actually first applied to a 2008 University of Manitoba course led by George Siemens and Stephen Downes.  Technically, using Downes’ taxonomy, the 2008 MOOC was a “cMOOC” – the “c” standing for connectivist, if I am not mistaken – while the versions that became popular through Coursera, Udacity, EdX, etc. are “xMOOCs”, the difference being essentially that learning in the former is more participative and collaborative, while the latter has more in common with textbooks, only with video.  But the Stanford MOOC is what usually gets the attention, so I’m going to date things from there.)

In the interests of schadenfreude if nothing else, allow me to take you back to 2012/3 to look at some of the ludicrous things people said about the likely effects of MOOCs.

  • “In 50 years there will only be 10 institutions in the world delivering higher education” (Sebastian Thrun, former CEO of Udacity)
  • “Higher Education is now being disrupted; our MP3 is the massive open online course (or mooc), and our Napster is Udacity” (Clay Shirky)
  • “Higher education is just on the edge of a crevisse (sic)…five years from now these enterprises (i.e. universities) are going to be in real trouble” (Clayton Christensen)

And of course who can forget that breathless cliché-ridden gem of an op-ed from Don Tapscott, about the week that higher education changed forever in January 2013 (i.e. the week he sat in on a couple of seminars on the subject in Davos).  Classic.

So, five years on, where are the MOOC pioneers now?  Well, Sebastian Thrun of Udacity got out of the disrupting-higher-education business early after coming to the realization that his company “didn’t have a good product”; the company pivoted to providing corporate training.  Over at Coursera, the most hyped of the early pioneers, founders Andrew Ng and Daphne Koller have both left the company (Ng left two years ago for Baidu, Koller left last month for one of Alphabet’s biotech enterprises).  Shortly after Koller’s departure, Coursera released this announcement  which was widely interpreted as the company throwing in the towel on the higher education market and following Udacity down the corporate training route.

EdX, the platform owned jointly by MIT and Harvard, thus seems to be the last MOOC provider standing.  Perhaps not coincidentally, it is also the one which has (arguably) been most successful in helping students translate MOOC work into actual credits.  It has partnered with Arizona State University in its “Global Freshman Academy”, and even allows conversion of some credits towards a specific MIT master’s degree (conditional on actually spending a semester on campus and paying normal fees to finish the program).  These “MicroMasters” seem to be catching on, but precisely because they are “micro”, they haven’t made a big impact on EdX’s overall numbers: its user base is still less than half Coursera’s.

So what’s gone wrong?  It isn’t a lack of sign-ups.  The number of people taking MOOCs continues to grow at a healthy clip, with global enrolments to date now over 35 million.  The problem is that there’s no revenue model here.  Depending on whose numbers you’re using, the share of users paying for some kind of certification (a fee which is usually priced in double digits) is at best around 3%.  So, work that out: 35 million users, a 3% conversion rate, $50 per paying user, and you’ve got a grand total of $52.5 million in revenue.  Over five years.  Using content that existing institutions, which spend anywhere between $50,000 and $250,000 to produce a course, are essentially giving them for free.
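Here is that revenue arithmetic as a quick sketch (the $50 fee and 3% conversion rate are the rough figures used above, not published company numbers):

```python
# Back-of-envelope cumulative MOOC certificate revenue.
total_enrolments = 35_000_000   # cumulative MOOC enrolments to date
conversion_rate = 0.03          # share paying for certification (optimistic end)
certificate_fee = 50            # assumed typical fee, in US dollars

cumulative_revenue = total_enrolments * conversion_rate * certificate_fee
print(f"Cumulative certificate revenue: ${cumulative_revenue:,.0f}")  # $52,500,000 over five years
```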

This is not sustainable and never was.  Whatever valid points MOOC boosters had about the current state of education (and I think they had more than a few), the proposed solution wasn’t one that met the market test.  The basic problem is (and always has been) that higher education is fundamentally a prestige market.  Distance education is low prestige; distance education which doesn’t give out actual course credit doubly so.  You can disguise this by making delivery the domain of top universities, as EdX and to a lesser extent Coursera did – but top institutions don’t want to diminish their prestige by handing out actual credit to the hoi polloi over the internet.   So what you get is this unsatisfying compromise which in the end not enough people want to pay for.

Some of us said this five years ago (here, here and here) when MOOC-mania was in full flow and critical faculties were widely suspended.  Which just goes to show: higher education is the world’s most conservative industry and the rate of successful innovation is tiny.  Your best bet for imagining what higher education looks like in the future is what it looks like today, only more expensive.

 

September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it’s pretty innocuous (“we need better/clearer data”) but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole “national approach” thing. Coates and Auld don’t make the argument directly, but they manage to slip a federal role in there: “Canada lacks a commitment to truly high-level educational accomplishment”, needs a “national strategy for higher education improvement”, and so “the Government of Canada and its provincial and territorial partners should identify some useful outcomes”. To be blunt: no, they shouldn’t. I know there is a species of Anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates & Auld’s point about the need for better data applies to provinces individually as well as collectively. They all need to get in the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here’s where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It’s an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronies, united by grievances about parking and teaching loads, that they really are). But when you look at things like scholarly output, graduation rates, and career outcomes, the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has medical programs, and large graduate programs in the hard sciences, it will have greater scholarly output. It’s the palette of program offerings, rather than their quality, which makes the difference in inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we’d find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find them all looking rather similar. Does that mean there’s been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada at least) location matters a lot more than educational quality when you’re dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his EPRI showed big inter-institutional gaps in graduates’ incomes (I will get round to covering Ross’ excellent work on the blog soon, I promise). “Ah, interesting!” people said. “Look At The Inter-Institutional Differences Now We Can Talk Quality”. Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (and I know this because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).

Now, maybe Coates and Auld have thought all this through and I’m getting nervous for no reason, but their article’s focus on institutional performance when most relevant outcomes are driven by geography, program and selectivity suggests to me that there’s a desire here to impose some simple rough justice over some pretty complicated cause-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but  “grading” institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.

September 20

Sessionals: Equal Pay for Equal Work?

Following up on yesterday’s piece about counting sessionals, I thought it would be a useful time to address how sessionals get paid.  Every so often, the Ontario Confederation of University Faculty Associations (OCUFA) issues a press release asking that contract faculty get “equal pay for work of equal value”.  And who could be against that?  But what they don’t say, because no one wants to say it out loud, is that in Canada, adjuncts and sessionals are far from being underpaid: for the most part they are actually compensated fairly.  At least by the standards of the academy itself.

I know that’s an unpopular opinion, but hear me out.  Think about what the correct comparator to a sessional academic is: it is a junior-rank academic, one who has been given assistant professor status but is not yet tenured.  These days in Canada, average pay for such folks is in the $80,000 range (your mileage may vary based on an institution’s location and prestige).

How much of that $80,000 is specifically for teaching?  Well, within the Canadian academy, there is a rule of thumb that a professor’s time should be spent 40% on teaching, 40% on research and 20% on some hazily-defined notion of service.  So, multiply that out and what you find is that only $32,000 of a new tenure-track prof’s salary is devoted to teaching.

Now break that down per class.   Depending on the institution, a professor is (in theory at least) teaching either four or five semester-long classes per academic year (2/2 or 3/2, in the academic vernacular).  Divide that $32,000 payment for teaching by four and you get $8,000 per one-semester class; divide it by five and you get $6,400.  An “equal work for equal pay” number therefore needs to be somewhere in that range.
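Spelled out as a quick calculation (the $80,000 salary and the 40-40-20 split are the rough conventions described above, not any particular collective agreement):

```python
# The "equal pay for equal work" comparator arithmetic sketched above.
assistant_prof_salary = 80_000   # rough average for a pre-tenure assistant professor
teaching_share = 0.40            # the 40-40-20 workload convention

teaching_pay = assistant_prof_salary * teaching_share   # $32,000 of salary for teaching
for courses_per_year in (4, 5):                         # 2/2 or 3/2 teaching loads
    print(f"{courses_per_year} courses: ${teaching_pay / courses_per_year:,.0f} per course")
# 4 courses: $8,000 per course; 5 courses: $6,400 per course
```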

Here’s what we know about adjuncts’ pay: in 2014, the Higher Education Quality Council of Ontario published a study on the salaries of “non-full-time instructors” in the province.  It showed that sessional instructors’ pay in 2012/13 ranged from about $6,000 per course to a little over $8,000 per course (with inflation, it is likely slightly higher now), with most of the big universities clustered in the low-to-mid $7,000 range.  At a majority of institutions, sessionals also get health benefits and may participate in a pension plan.  In 2013, University Affairs, the in-house publication of Universities Canada, published the results of a nine-institution survey of sessional lecturer compensation (see here).  This showed a slightly wider range of compensation rates: at Quebec schools they were comparable to or slightly higher than Ontario rates, while elsewhere they were somewhat below.

To recap: if you buy the 40-40-20 logic of professorial pay, most universities in Canada – at least in central Canada – are in fact paying sessionals roughly the same per course as they are paying early-career tenure-track academics.  In some cases the benefits are not the same, and there may be a case for boosting pay a bit to compensate for that.  But the complaint that sessionals are vastly underpaid for the work they are contracted to do?  Hard to sustain.

Sessionals themselves would naturally argue that they do far more than what they are contracted for: they too are staying academically active, doing research, etc.  To which the university response is: fine, but that’s not what we’re paying you for – you’re doing that on your own time.  The fight thus isn’t really about “equal pay”, it’s a fight about the right to be paid for doing research.

And of course OCUFA knows all this.  The math involved is pretty elementary.  It can’t really think these staff are underpaid unless it believes a) that the 40-40-20 thing is wrong and teaching should be a higher share of time and salary (good luck getting that one past the membership), or b) that sessionals need to be paid not on the same scale as assistant profs but on the scale of associate or full profs (again, I would question the likelihood of OCUFA’s membership thinking this is a good idea).

But if neither of those things is true, why does OCUFA use such language?  It’s a mystery worth pondering.

September 19

Counting Sessionals

Much rejoicing last Thursday when Science Minister Kirsty Duncan announced that the federal government was reinstating funding for the University and College Academic Staff System (UCASS), which was last run in 2011.  But what caught most people’s attention was the coda to the announcement, which said that Statistics Canada was going to “test the feasibility” of expanding the survey to include “part-time and public college staff” (the “C” in UCASS stands for colleges in the Trinity College sense, not the community college sense, so despite the name public colleges have never been in the survey).

What to make of this?  It seems that by “part-time” Statscan means sessionals/adjuncts/contract faculty.  That’s a bit off, because every university I know of makes a very sharp distinction between “part-time” faculty (many of whom are tenured) and “sessionals”.  It makes one worry that Statistics Canada doesn’t understand universities well enough to use the correct terminology, which in turn bodes ill for its future negotiations with universities around definitions.

Because let’s be clear about this: universities will do almost anything they can to throw sand in the gears on this.  They do not want data on sessionals in the public eye, period.  Oh sure, in public the Presidents will welcome transparency, evidence-based decision-making, etc.  But institutional research shops – the ones who will actually be dealing with Statscan on this file – are Olympic champions in shutting down government attempts to liberate information.  In fact, that’s arguably their main purpose.  They won’t actually say no to anything – they’ll just argue relentlessly about definitions until Statscan agrees to a reduced program of data collection.  Statscan knows this is coming – they have apparently allocated four years (!!!) for negotiations with institutions, but the safest guess is that this simply isn’t going to happen.

And to be fair to universities, the kind of data UCASS would provide about sessionals would be pretty useless – a lot of work for almost nothing.  UCASS can count individuals and track their average salaries.  But average salary data would be useless: it would conflate people teaching one course with people teaching five.  And since UCASS has no way to track workload (you’d actually need to blow up the survey and start again if you wanted to get at workload, and as interesting as that might be, good luck getting universities to green-light it), the data is meaningless.  Knowing the number of sessionals tells you nothing about what proportion of undergraduates are being taught by sessionals.  Are 200 sessionals teaching one course each worse than 100 teaching two courses apiece?  Of course not.  But if raw numbers are the only thing on offer, then we’ll ascribe meanings to them that arguably shouldn’t exist.

You see, “sessionals” are not really a single phenomenon.  Many are professionals who have full-time jobs and like teaching a class on the side, and they’re usually a huge boon to a department (especially in professional faculties like law, nursing and business) because they help expose students to a life beyond academia.  Others are PhD students teaching a class while another professor is away – and thus learning valuable skills.  The “bad” sessionals – the ones people claim to want to stamp out – are the ones who have a PhD and are teaching multiple classes the way professors do.  I suspect this is a pretty small percentage of total sessionals, but we don’t know for sure.  And adding sessionals to UCASS won’t get us any closer to finding out, because even if they wanted to, universities couldn’t collect data on which of their employees have other full-time jobs outside the institution.

Despite all the kumbayahs last week about how this UCASS expansion is going to promote “evidence-based decision-making”, I’m genuinely having trouble imagining a single policy problem where data from UCASS would make a difference.  Universities already know how many sessionals they employ and whether numbers are going up or down; UCASS might let them know how many sessionals other universities employ, but frankly, who cares?  It’s not going to make a difference to policy at an institutional level.

If you really wanted to know something about sessionals, you’d probably start by requiring institutions simply to provide contact information for every individual with teaching responsibilities who is not tenure-track, along with the amount paid to them in the previous academic year (note: Statscan couldn’t do this, because it would never use the Stats Act to compel data this way; provincial governments could do so, however).  Then you’d do a survey of the instructors themselves – number of classes taught, other jobs they have, income from other jobs, etc.  Now, I know some of you are going to say: didn’t some folks at OISE do that just recently?  Well, almost.  Yes, they administered almost exactly this kind of survey, but because they weren’t drawing their sample from an administrative database, there’s no way to tell how representative their sample is and hence how accurate their results are.  Which is kind of important.

So, anyways, two cheers for the return of UCASS.  More data is better than less data.  But the effort to get data on sessionals seems like a lot of work for very little practical return even if universities can be brought round to co-operate.

September 16

OECD Data Says Still No Underfunding

The OECD’s annual datapalooza-tastic publication Education at a Glance was released yesterday.  The pdf is available for free here.  Let me take you through a couple of the highlights around Higher Education.

For the following comparisons, I show Canada against the rest of the G7 (minus Italy because honestly, economically, who cares?), plus Australia because it’s practically our twin, Korea because it’s cool, Sweden because someone always asks about Scandinavia, and the OECD average because hey, that just makes sense.  First off, let’s look at attainment rates among 25-34 year olds.  This is a standard measure of how countries have performed in the recent past in terms of providing access to education.

Figure 1: Attainment Rates, 25-34 Year Olds, Selected OECD Countries


*Data for Master’s & above not provided separately for Korea and Japan, and is included in Bachelor’s

Education-fevered Korea is light-years ahead of everyone else on this measure, with 69% of its 25-34 yr old population attaining some kind of credential, but Canada is still close to the top at 59%.  In fact we’re right at the top if you look just at short-cycle (i.e. sub-baccalaureate) PSE (see previous comments here about Canada’s world-leading strengths in College education); in terms of university attainment alone, our 34% is slightly below the OECD average of 36%.

Now let’s turn to finances.  Figure 2 shows total public and private expenditure on Tertiary educational institutions.

Figure 2: Public and Private Expenditures on Tertiary Institutions, as a Percentage of GDP, Selected OECD Countries


Canada spends 2.5% of GDP on tertiary institutions, just below the US but ahead of pretty much everybody else, and more than 50% higher than the OECD average.  For those of you who have spent the last couple of years arguing about how great Germany is because of free tuition, and asking why Canadian governments can’t spend money the way Germany does, the answer is clearly that they can.  All they would need to do is cut spending by about 30%.

(If you’re wondering how the UK can be shown with 58% of all money in higher education coming from government when the latest data from Universities UK shows it to be 25%, the answer, I think, is that this is 2013 data, when only a third of the shift from a mainly state-based university funding system to a mainly student-based one had been completed.)

Turning now to the issue of how that money is split between different parts of the tertiary sector, here we see Canada’s college sector standing out again: by some distance, it receives more funding than any other comparable sector in the OECD (with 0.9% of GDP in funding).  The university sector, by contrast,  gets only 1.6% of GDP, which is closer to the OECD average of 1.4%.

Figure 3: Expenditure on Tertiary Institutions, by Sector, as a Percentage of GDP, Selected OECD Countries


*US data not available for short-course, 2.6% is combined total

Now this is the point where some of you will jump up and say “see, Usher?  We’re only barely above the OECD average! Canadian universities aren’t as well-funded as you usually make out.”  But hold on.  We’re talking % of GDP here.  And Canada, within the OECD, is a relatively rich country.  And, recall from Figure 1 that our university attainment rate is below the OECD average, which means those dollars are being spread over fewer students.  So when you look just at expenditures per student in degree-level programs, you get the following:

Figure 4: Annual Expenditures per Student in $US at PPP, Degree-level Programs only, Selected OECD Countries


Again, Canada is very close to the top of the OECD charts here: at just over $25,000 US per student we spend over 50% more per student than the OECD average (and Germany, incidentally – just sayin’).

So, yeah, I’m going to give you my little sermon again: Canada’s is not an underfunded university system by any metric that makes the remotest bit of sense.  If we’re underfunded, everyone’s underfunded, which kind of robs the term of meaning.

That doesn’t mean cuts are easy: our system is rigid and brittle, and even slowing down the rate of increase of funds causes problems.  But perhaps if we directed even a fraction of the attention we pay to “underfunding” to the problem of our universities’ brittleness, we might be on our way to a better system.

I won’t hold my breath.

September 15

Innovation Policy: Are Universities Part of the Problem?

We’re talking a lot about Innovation in Canada these days. Especially in universities, where innovation policy is seen as a new cash funnel. I would like to suggest that this attitude on the part of universities is precisely part of Canada’s problem when it comes to Innovation.

Here’s the basic issue: innovation – the kind that expands the economy – is something that firms do. They take ideas from here and there and put them together to create new processes or services that fill a market need in a way that creates value (there’s public sector innovation too but the “creating value” thing is a bit trickier, so we’ll leave that aside for now while acknowledging it exists and matters a lot).

Among the many places the ideas come from are higher education institutions (HEIs). Not necessarily local HEIs: ideas travel, so Toronto firms can grab ideas from universities in Texas, Tromso or Tianjin as well as from U of T. The extent to which they will focus on ideas generated locally has to do not only with the quality of the local ideas, but also with the way the ideas get propagated locally. Institutions whose faculty are active and involved in local innovation networks will tend to see their ideas picked up more often than those whose faculty are not, partly because contact with local firms generates “better” scientific questions and partly because they will have more people paying attention to their work.

But ideas are really only a part of what matters in innovation. Does the business climate encourage firms to innovate? What’s the structure of business taxation? What kind of management and worker skill base exists? What regulations impede or encourage innovation? What barriers to competition and new entrants exist? What kind of venture capital is available? Does government procurement work in favour of or against new products or services? All of this matters in terms of helping to set firms’ priorities and set it on a more-innovative or less-innovative path.

The problem is, all this stuff is boring to politicians and, in some cases, requires directly attacking entrenched interests (in Canada, this specifically has to do with protectionism in agriculture, telecoms and banking). It requires years of discipline and trade-offs, and politicians hate discipline and trade-offs. If only there were some other way of talking about innovation that didn’t require such sacrifice.

And here’s where universities step in to enable bad policies. They write about how innovation is “really” about the scientific process. How it’s “really” about high tech industries of the future and hey, look at all these shiny labs we have in Canada, wouldn’t it be great if we had more? And then all of a sudden “innovation” isn’t about “innovation” anymore, it’s about spending money on STEM research at universities and writing cheques to tech companies (or possibly to real estate companies to mediate a lot of co-working spaces for startups). Which as far as I can tell seems to be how Innovation Minister Navdeep Bains genuinely approaches his file.

Think I’m exaggerating? Check out this article from Universities Canada’s Paul Davidson about innovation, in which the role of firms is not mentioned at all except insofar as they are not handing enough money to universities. Now, I get it: Paul’s a lobbyist and he’s arguing his members’ case for public support, which is what he is paid to do. But what comes across from that article is a sense that, for universities, l’Innovation, c’est nous. Which, as statements of innovation policy go, is almost Nickelbackian in its levels of wrongness.

I don’t think this is a universal view among universities, by the way. I note SFU President Andrew Petter’s recent article in the same issue of Policy magazine, which I think is much clearer in noting that universities are only part of the solution, and that even then, universities have to get better at integrating with local innovation networks. And of course colleges, by putting themselves at the more applied end of the spectrum, are inherently aware that their role is as an adjunct to firms.

Universities are a part – a crucial part, even – of innovation systems. But they are a small crucial part. Innovation Policy is not (or should not be, anyway) code for “industrial policy in sci/tech things universities are good at”. It is (or should be) about firms, not universities. And we all need to remember that.
