Higher Education Strategy Associates

Tag Archives: Ontario

August 30

A Francophone University for Ontario?

On Monday, the Government of Ontario released a proposal for a francophone university in Ontario, saying, effectively, “it’s about time we had one”.  This came as a surprise to many, who wondered “well, what about University of Ottawa, Laurentian University and Glendon College?”

But of course, none of these are truly francophone. Well, U of O is in theory but it was swamped by anglophones long ago and now does a majority of its teaching in English.  Laurentian was from its founding a bilingual university rather than a francophone one, but in practice it has not always lived up to the ideal, much to the irritation of some of its francophone staff.  And Glendon – well, Glendon’s a francophone college, but it’s part of York University, which is about as anglo as it gets.

Where this new institution is supposed to be different is that it will teach only in French, and it will be governed entirely by francophones.  Which, to the francophone community, makes quite a difference.  And with over half a million francophones in the province, it's not difficult to argue that such an institution should exist.  But the question is: will students actually attend?  Whatever the rationale for such an institution, can it compete with Ottawa/Laurentian/Glendon – let alone anglophone institutions?

Well, here's where it gets tricky.  The report recommends that the new institution be set up in Toronto, which I think strikes many people as odd because the city is not exactly known as a francophone hub.  Supporters of the idea can turn around and note that over a third of the province's francophone population lives in central and southern Ontario.  That said, there aren't many employers in the region that would put much of a premium on a French-language education, which may limit its attractiveness to students in the area.

Perhaps more to the point: if there were significant demand for French-language education in the city, you'd think that either Laurentian or Ottawa would have met it by delivering programs there.  The fact that they haven't may suggest that predictions of thousands of students flocking to a new institution with no track record are based more in hope than reality.

(The report itself suggests 1,000 FTE students by 2023-2024 and 2,220 by 2030.  This is pretty much a fantasy, and I suspect it owes at least something to a piece of market research conducted on the idea about four years ago, which was – and I am not exaggerating here – the actual worst piece of social science I have ever seen.  Among many other data atrocities – bar graphs adding up to over 100%, that kind of thing – it calculated potential attendance at a new university by asking students in francophone high schools in south-central Ontario if they wanted to go to university in French, but never probed about alternatives to a new university such as Laurentian, Ottawa and Glendon.  SMH, as the kids say.)

Back in the early 1990s, there was an attempt to provide French-language college-level programming in Toronto, through a new institution called the Collège des Grands Lacs.  It failed through lack of enrolments within about five years, with Collège Boréal eventually coming in to pick up the pieces.  That's not to say this institution will necessarily suffer the same fate; but it's not a great precedent, and more consideration probably should have been given to it in the report itself.

Now, low enrolments aren't necessarily a barrier to creating and maintaining a minority-language institution.  It's really a question of how much you want to pay and what kind of programs you expect to support.  Could Toronto support something like Nova Scotia's Université Sainte-Anne or Manitoba's Université de Saint-Boniface?  Almost certainly, though getting up to the latter's status might take more time than the report suggests (getting students to go to new universities is hard – no one wants to be a guinea pig).  And if that's the ambition, then it's probably doable.

But if the ambition is something more Moncton than Manitoba, then that probably won't fly.  Like it or not, Laurentian and Ottawa will be competing for these same students, and that's a lot of fish in a not-terribly-large pond.  Bottom line: this is a manageable project if ambitions are small, but the greater the ambition, the riskier this idea becomes.

April 10

Evaluating Teaching

The Ontario Confederation of University Faculty Associations (OCUFA) put out an interesting little piece the week before last summarizing the problems with student evaluations of teaching.  It contains a reasonable summary of the literature, and I thought some of it would be worth looking at here.

We've known for a while now that the results of student evaluations are statistically biased in various ways.  Perhaps the most important is that professors who mark more leniently get higher ratings from their students.  There is also the issue of what appears to be discrimination: female professors and visible-minority professors tend to get lower ratings than white men.  And then there's the point OCUFA makes about the comments section of these evaluations being a hotbed of statements which amount to harassment.  These points are all well worth making.

One might well ask: given that we all know about the problems with teaching evaluations, why in God’s name do institutions still use them?  Fair question.  Three hypotheses:

  1. Despite flaws in the statistical measurement of teaching, the comments actually do provide helpful feedback, which professors use to improve their teaching.
  2. When it comes to pay and promotion, research is weighted far more highly than teaching, so unless someone completely tanks their teaching evals – and by tanking I mean scoring so far below par that it can't reasonably be attributed to one of the biases listed above – they don't really matter all that much (note: while this probably holds for tenured and tenure-track profs, I suspect the stakes are higher for sessionals).
  3. No matter how bad a measurement instrument they are, the idea that one wouldn’t treat student opinions seriously is totally untenable, politically.

In other words, there are benefits despite the flaws, the consequences of flaws might not be as great as you think, and to put it bluntly, it’s not clear what the alternative is.  At least with student evaluations you can maintain the pretense that teaching matters to pay and promotion.  Kill those, and what have you got?  People already think professors don’t care enough about teaching.  Removing the one piece of measurement and accountability for teaching that exists in the system – no matter how flawed – is simply not on.

That's not to say there aren't alternative ways to measure teaching.  One could imagine a system of peer evaluation, where professors rate one another.  Or one could imagine a system where the act of teaching and the act of marking are separated – and teachers are rated on how well their students perform.  It's not obvious to me that professors would prefer such a system.

Besides, it’s not as though the current system can’t be redeemed.  Solutions exist.  If we know that easy markers get systematically better ratings, then normalize ratings based on the class average mark.  Same thing for gender and race: if you know what the systematic bias looks like, you can correct for it.  And as for ugly stuff in the comments section, it’s hardly rocket science to have someone edit the material for demeaning comments prior to handing it to the prof in question.
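For the statistically minded, here's roughly what that kind of correction could look like in practice.  To be clear, this is a toy sketch of my own – the data, the function name and the simple linear adjustment are all assumptions, not anything OCUFA or any institution actually uses:

```python
# Toy sketch only: normalize evaluation scores for grading leniency by
# removing the component predicted by class average marks. The linear
# model is an assumption; a real implementation would need validation
# against institutional data.
import numpy as np

def normalize_ratings(ratings, class_avg_marks):
    """Adjust section-level evaluation scores for marking leniency.

    ratings: mean evaluation score per course section
    class_avg_marks: class average mark per section
    """
    ratings = np.asarray(ratings, dtype=float)
    marks = np.asarray(class_avg_marks, dtype=float)
    # Estimate the leniency effect as a simple OLS slope across sections...
    slope = np.polyfit(marks, ratings, 1)[0]
    # ...then subtract it out, re-centred on the mean mark, so easy and
    # hard markers are rated on a common footing.
    return ratings - slope * (marks - marks.mean())

# Two sections with the same underlying teaching but different marking:
print(normalize_ratings([4.2, 3.8], [78, 68]))  # both come out at 4.0
```

The same logic extends to gender and race: estimate the systematic gap, then correct for it.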

There's one area where the OCUFA commentary goes beyond the evidence, however, and that's in trying to translate the findings about student teaching evaluations (i.e. how did Professor X do in Class Y) to surveys of institutional satisfaction.  The argument they make here is that because the one is known to have certain biases, the other should never be used to make funding decisions.  Now, without necessarily endorsing the idea of using student satisfaction as a funding metric, this is terrible logic.  The two types of questionnaires are entirely different, ask different questions, and simply are not subject to the same kinds of biases.  It is deeply misleading to imply otherwise.

Still, all that said, it’s good that this topic is being brought into the spotlight.   Teaching is the most important thing universities do.  We should have better ways of measuring its impact.  If OCUFA can get us moving along that path, more power to them.

February 23

Garbage Data on Sexual Assaults

I am going to do something today which I expect will not put me in good stead with one of my biggest clients.  But the Government of Ontario is considering something unwise and I feel it best to speak up.

As many of you know, the current Liberal government is very concerned about sexual harassment and sexual assault on campus, and has devoted no small amount of time and political capital to getting institutions to adopt new rules and regulations around said issues.  One can doubt the likely effectiveness of such policies, but not the sincerity of the motive behind them.

One of the tools the Government of Ontario wishes to use in this fight is more public disclosure about sexual assault.  I imagine they have been influenced by how the US federal government collects and publishes statistics on campus crime, including statistics on sexual assaults.  If you want to hold institutions accountable for making campuses safer, you want to be able to measure incidents and show change over time, right?

Well, sort of.  This is tricky stuff.

Let's assume you had perfect data on sexual assaults by campus.  What would that show?  It would depend in part on the definitions used.  Are we counting sexual assaults/harassment which occur on campus?  Or are we talking about sexual assaults/harassment experienced by students?  Those are two completely different figures.  If the purpose of these figures is accountability and giving prospective students the "right to know" (personal safety is, after all, a significant concern for prospective students), how useful is that first number?  To what extent does it make sense for institutions to be held accountable for things which do not occur on their property?

And that's assuming perfect data, which really doesn't exist.  The problems multiply exponentially when you decide to rely on sub-standard data.  And according to a recent Request for Proposals placed on the government tenders website MERX, the Government of Ontario is planning to rely on some truly awful data for its future work on this file.

Here's the scoop: the Ministry of Advanced Education and Skills Development is planning to do two surveys: one in 2018 and one in 2024.  They plan on getting contact lists of emails of every single student in the system – at all 20 public universities, 24 colleges and 417 private institutions – and handing them over to a contractor so they can do a survey.  (This is insane from a privacy perspective – the much safer way to do this is to get institutions to send out an email to students with a link to a survey, so the contractor never sees the names without students' consent.)  Then they are going to send out an email to all those students – close to 700,000 in total – offering $5 per head to answer a survey.

It's not clear what Ontario plans to do with this data.  But the fact that they are insistent that *every* student at *every* institution be sent the survey suggests to me that they want the option to analyze and perhaps publish the data from this anonymous voluntary survey on a campus-by-campus basis.

Yes, really.

Now, one might argue: so what?  Pretty much every student survey works this way.  You send out a message to as many students as you can, offer an inducement and hope for the best in terms of response rate.  Absent institutional follow-up emails, this approach probably gets you a response rate between 10 and 15% (a $5 incentive won't move that many students).  Serious methodologists grind their teeth over those kinds of low numbers, but increasingly this is the way of the world.  Phone polls don't get much better than this.  The surveys we used to do for the Globe and Mail's Canadian University Report were in that range.  The Canadian University Survey Consortium does a bit better than that because of multiple follow-ups and strong institutional engagement.  But hell, even StatsCan is down to a 50% response rate on the National Graduates Survey.

Is there non-response bias?  Sure.  And we have no idea what it is.  No one's ever checked.  But these surveys are super-reliable even if they're not completely valid.  Year after year we see stable patterns of responses, and there's no reason to suspect that the non-response bias differs across institutions.  So if we see differences in satisfaction of ten or fifteen percent from one institution to another, most of us in the field are content to accept that finding.

So why is the Ministry's approach so crazy when it's just using the same one as everyone else?  First of all, the stakes are completely different.  It's one thing to be named an institution with low levels of student satisfaction.  It's something completely different to be called the sexual assault capital of Ontario.  So accuracy matters a lot more.

Second, the differences between institutions are likely to be tiny.  We have no reason to believe a priori that rates differ much by institutions.  Therefore small biases in response patterns might alter the league table (and let’s be honest, even if Ontario doesn’t publish this as a league table, it will take the Star and the Globe about 30 seconds to turn it into one).  But we have no idea what the response biases might be and the government’s methodology makes no attempt to work that out.

Might people who have been assaulted be more likely to answer than those who have not?  If so, you're going to get inflated numbers.  Might people have reasons to distort the results?  Might a Men's Rights group encourage all its members to indicate they'd been assaulted, to show that assault isn't really a women's issue?  With low response rates, it wouldn't take many respondents to get that tactic to work.
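To see how little it would take, here's a toy simulation – every number in it is my own assumption; none of it comes from the RFP – of what a few hundred coordinated respondents do to a voluntary survey with a 10% response rate:

```python
# Toy simulation (all parameters assumed): brigading a voluntary survey.
import numpy as np

rng = np.random.default_rng(42)

students = 20_000       # assumed size of one campus
true_rate = 0.05        # assumed true prevalence
response_rate = 0.10    # roughly what an emailed $5 incentive might get you
brigade = 300           # assumed coordinated respondents who all answer "yes"

affected = rng.random(students) < true_rate
responds = rng.random(students) < response_rate

honest = affected[responds].mean()  # the estimate without interference
n = responds.sum()
# Add 300 coordinated false "yes" responses on top of the honest sample:
distorted = (affected[responds].sum() + brigade) / (n + brigade)

print(f"true: {true_rate:.1%}  honest sample: {honest:.1%}  brigaded: {distorted:.1%}")
# With ~2,000 genuine responses, 300 extra "yes" votes more than triples
# the measured rate - more than enough to scramble any league table.
```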

The Government is never going to get accurate overall numbers from this approach.  They might, after repeated tries, start to see patterns in the data: sexual assault is more prevalent at institutions in large communities than in small ones, maybe; or it might happen more often to students in certain fields of study than others.  That might be valuable.  But if, the first time the data is published, all that makes the papers is a rank order of places where students are assaulted, we will have absolutely no way to contextualize the data, no way to assess its reliability or validity.

At best, if the data is reported system-wide, the data will be weak.  A better alternative would be to go with a smaller random sample and better incentives, so as to obtain higher response rates.  But if it remains a voluntary survey *and* there is some intention to publish on a campus-by-campus basis, then it will be garbage.  And garbage data is a terrible way to support good policy objectives.

Someone – preferably with a better understanding of survey methodology – needs to put a stop to this idea.  Now.

February 16

How to Fund (3)

You all may remember that in early 2015, the province of Ontario announced it was going to review its university funding formula.  There was no particular urgency to do so, and many were puzzled as to "why now?"  The answer, we were told, was that the Liberal government thought it could make improvements in the system by changing the funding structure.  Specifically, they said in their consultation document that they thought they could use a new formula to i) improve quality/student experience, ii) support differentiation, iii) enhance sustainability, and iv) increase transparency and accountability.

Within the group of maybe 100 people who genuinely understand this stuff, I think the scoffing over points iii) and iv) was audible as far away as the Maritimes.  Transparency and accountability are nice, but you don't need a new funding formula to get them.  The Government of Ontario could compel institutions to provide data any time it wants to (and often does).  If institutions are "insufficiently transparent", it means government isn't asking for the right data.

As for enhancing sustainability?  HA!  At a system level, sustainability means keeping costs and income in some kind of balance.  Once it became clear that there was no extra government money on the table for this exercise, that tuition fees were off the table, and that the government would not use the formula to rein in staff salaries or pensions in any way (as I suggested back here), everybody said "OK, guess nothing's happening on that front" (we were wrong, as it turned out, as we'll see in a second).  But the bit about quality, student experience and differentiation got people's attention.  That sounded like incentivizing certain things.  Output-like things, which would need to be measured and quantified.  So the government was clearly entertaining the idea of some output-based measures, even as late as December 2015 when the report on the consultation went out (see that report here).  Indeed, the number one recommendation was, essentially, "the ministry should apply an outcomes-based lens to all of its investments".

One year later, the Deputy Minister for Advanced Education sent out a note to all institutions which included the following passage:

 The funding formulas are meant to support stability in funding at a time when the sector is facing demographic challenges while strengthening government’s stewardship role in the sector. The formulas also look to create accountable outcomes, beyond enrollment, that reflect the Strategic Mandate Agreements (SMAs) of each institution.

 As you know, our goal is to will focus our sector on high-quality student outcomes and away from a focus on growth. As such, the funding formula models are corridors which give protection on the downside and do not automatically commit funds for growth on the upside.

Some of that may require translation, but the key point does not: all of a sudden, funding formulas were not about applying an outcomes-based lens to investment; they were about "stability".  Outcomes, yes, but only as they apply to each institution's SMA, and no one I know in the sector thinks the funding envelope devoted to promoting SMAs will be over five percent.  Which, given that tuition is over 50% of income, means that at best about 2% of total funding might be outcome-based.  As I've said before, this is not even vaguely enough to affect institutional behaviour.
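For anyone who wants the arithmetic spelled out, it's a single multiplication (the grant share of income below is my assumption, sized to the "tuition is over 50%" point):

```python
# Back-of-envelope reconstruction of the "about 2%" figure.
sma_envelope = 0.05           # "not going to be over five percent" of the grant
grant_share_of_income = 0.45  # assumed: the grant is a bit under half of income,
                              # since tuition alone is over 50%
outcome_based = sma_envelope * grant_share_of_income
print(f"{outcome_based:.1%} of total institutional income")  # about 2%
```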

What happened?  My guess is it's a mix of four things.  First, there was a change of both Minister and Deputy Minister, and that's always a crap shoot.  Priorities change, sometimes radically.  Second, the university sector made its displeasure known.  They didn't do it very publicly, and I have no insider knowledge of what kind of lobbying occurred, but clearly, a number of people argued very strenuously that this was a Bad Idea.  One that gored oxen.  Very Bad.  Third, it finally dawned on some people at the top of the food chain that a funding formula change, in the absence of any new revenue tools, meant some institutions would win, and others would lose.  And as the provincial government's recent 180 on Toronto toll roads has shown, this low-in-the-polls government is prepared to go a long way to avoid making any new "losers".

Finally, that "sustainability" thing came back in a new form.  But now it was no longer about making the system sustainable; it was about finding ways to make sure that a few specific small institutions with precarious finances (mostly but not exclusively in northern Ontario) didn't lose out as adverse demographics and falling student numbers began to eat into their incomes.  Hence the language about corridors "giving protection on the downside".  It's ridiculous for three reasons.  One, it's a half-solution, because institutions vulnerable to demographic decline lose at least as much from lost tuition revenue as they do in lost government grant.  Two, it's a case of closing the barn door after the horse has departed: the bulk of the demographic shift has already happened, and so to some extent previous losses are just going to be locked in.  Three – and this is most important – the vulnerable institutions make up maybe 8% of total enrolments.  Building an entire new funding system just to solve a problem that affects 8% of students is…I don't know.  I'm kind of lost for words.  But I bet if you looked it up in the dictionary it would be under "ass backwards".

And that, my friends, is how Ontario blew a perfectly good chance to introduce a sensible, modern performance-based funding system.  A real shame.  Perhaps others can learn from it.

February 01

Loving It

Back in the summer you may have heard a bit of a brouhaha about a deal signed between Colleges Ontario and McDonald's, allowing McDonald's management trainees to receive advanced standing in business programs at Ontario colleges.  If you read the papers, what you probably saw was a he-said/she-said story in which someone from Colleges Ontario said something like "Ontario colleges are providing advanced credit for people who have been through a McDonald's management training program and that's a good thing for access" and someone from the Ontario Public Service Employees Union (OPSEU) said something like "Corporate education McDonald's bad!"

This should have been an unequivocally good news story.  It is a travesty that it was not.  Here’s the real story.

McDonald's, a company which directly employs around 400,000 people and whose franchisees employ another 1.5 million, runs one of the largest internal corporate training programs in the world.  That's not just the famous training center known as Hamburger University in Illinois, which is mainly for mid-management and executive development: the company also has training centers in various locations around the world providing training programs for restaurant managers and crews.  While not many young employees stay at McDonald's very long (turnover is something like 150% per year), a small fraction do stick with it to become managers.  And those that do receive a substantial education through the company in how to run a business.

Now, if you believe in the principles of prior learning recognition, you'll recognize that this situation is a slam-dunk case for creating a standardized system of assessment to award credit.  Assessing prior knowledge can be a right mess; assessing knowledge gained through work experience (paid or unpaid) or in other forms of informal or non-formal learning, in a way that maps onto some kind of credit or credential system, is time-consuming and inexact.  But this situation is different.  With McDonald's, there's an actual written-down curriculum that can be used to do the curriculum mapping.  This is – comparatively – easy-peasy.

So what happened prior to last summer was that McDonald's approached Colleges Ontario to try to work out such an arrangement.  Both sides had previous experience in doing something similar: McDonald's had worked out a similar agreement in British Columbia with BCIT, and Fanshawe College had led a national process to do an analogous type of curriculum mapping with the Canadian military to allow its soldiers and veterans to count various parts of its training programs towards college credentials.  Faculty and admin representatives from all 24 colleges agreed on the parameters of the deal, then allowed a smaller technical group to work on mapping all the elements of McDonald's coursework, up to the Second Assistant Manager level of training, onto the common (Ontario) college standard outcomes for the Business Administration diploma.  At the end of it, it was decided that one level was more or less equivalent to another, and so individuals who had reached Second Assistant Manager could automatically get a year's worth of credit (there's no partial credit for having completed some McDonald's training: this is an all-or-nothing deal).

So what are the criticisms?  Basically, they amount to:

  1. College-level courses need to be taught by college teachers in a college atmosphere
  2. McDonald's is a big evil corporation. Why deal with McDonald's?  Why not others?
  3. Why isn't the mapping available publicly?

The first argument, taken to its logical conclusion, essentially says that PLAR (prior learning assessment and recognition) is illegitimate because no knowledge derived from outside the classroom can possibly count.  Presumably people who believe this also think the mapping arrangements for Armed Forces training are a complete scandal.

The second…well, if that's your belief, I suppose there is no shaking it.  As for why McDonald's – it's because they asked.  And they had a hell of a well-documented curriculum to present to Colleges Ontario.  Presumably similar deals are open to other businesses, but no one (to my knowledge) has asked.  As for the third, it's clear why it's not public: McDonald's treats the curriculum of its courses as corporate intelligence – as it has every right to do – and doesn't want it published for the world to see.  One could make the argument that a decision involving credits at public institutions needs to be fully in the public domain.  But, one, that would mean that virtually every program at an Ontario university is suspect (just try finding curriculum maps or un-redacted program evaluations online and see how many are publicly available); and two, the faculty co-ordinators responsible for Business Administration at all 24 institutions (all of whom are OPSEU members, incidentally) saw the detailed curriculum in confidence and signed off on the deal, which seems like a reasonable saw-off.

In short, this is a good deal.  If we want to promote life-long learning and increase prior learning recognition, we need more of these, not fewer.  Bravo to everyone involved.

January 17

Another Lens on Bleak Graduate Income Data

So, yesterday we looked at Ontario university graduate employment data (link to: previous).  Today I want to zero in a little bit on what’s happening by field of study.

(I can hear two objections popping up already.  First: "why just Ontario?"  Answer: while Quebec, Alberta, British Columbia and the Maritimes – via MPHEC – all publish similar data, they all publish it in slightly different ways, making it irritating (and in some cases impossible) to come up with a composite national figure.  The National Graduate Survey (NGS) in theory does this, but only every five years, and as I explained last week it has made itself irrelevant by changing the survey period.  So, in short, I can't do national, and Ontario a) is nearly half the country in terms of university enrolments and b) publishes slightly more detailed data than most.  Second: "why just universities?"  Answer: "fair point, I'll be publishing that data soon".

Everyone clear? OK, let’s keep going).

Let’s look first at employment rates 6 months after graduation by field of study (I include only the six largest – Business/Commerce, Education, Engineering, Humanities, Physical Sciences and Social Sciences – because otherwise these graphs would be an utter mess), shown below in Figure 1.  As was the case yesterday, the dates along the x-axis are the cohort graduation year.

[Figure 1: Employment rates six months after graduation, by field of study]

Two take-aways here, I think.  The first is that the post-08 recession really affected graduates of all fields more or less equally, with employment rates falling by between 6 and 8 percentage points (the exception is humanities, where current rates are only four percentage points below where they were in 2007).  The second is that pretty much since 2001, it’s graduates in the physical sciences who have had the weakest results.

OK, but as many in the academy say: 6 months isn’t enough to judge anything.  What about employment rates after, say, 2 years?  These are shown below in Figure 2.

[Figure 2: Employment rates two years after graduation, by field of study]

This graph is smoother than the previous one, which suggests the market for graduates with two years in the labour market is a lot more stable than that for graduates with just six months.  If you compare the class of 2013 with the class of 2005 (the last one to completely miss the 2008-9 recession), business and commerce students' employment rates have fallen by only one percentage point, while those in social sciences have dropped by six percentage points, with the others falling somewhere in between.  One definite point to note for all those STEM enthusiasts out there: there's no evidence here that students in STEM programs have fared much better than everyone else.

But employment is one thing; income is another.  I’ll spare you the graph of income at six months because really, who cares?  I’ll just go straight to what’s happening at two years.

[Figure 3: Average salaries two years after graduation, by field of study, in real dollars]

To be clear, what figure 3 shows is average graduate salaries two years after graduation in real dollars – that is, controlling for inflation.  And what we see here is that in all fields of study, income bops along fairly steadily until 2007 (i.e. class of 2005) at which point things change and incomes start to decline in all six subject areas.  Engineering was down, albeit only by three percent.  But income for business students was down 10%, physical sciences down 16%, and humanities, social sciences and education were down 19%, 20% and 21%, respectively.

This, I shouldn't need to emphasize, is freaking terrible.  Actual employment rates (link to: previous) may not be down that much, but this drop in early graduate earnings is pretty disastrous for the majority of students.  Until a year or two ago I wasn't inclined to put a lot of weight on this: average graduate earnings have always popped back after recessions.  This time seems to be different.

Now, as I said yesterday, we shouldn't be too quick to blame this on huge changes in the economy to which institutions are not responding; it's likely that part of the fall in averages comes from allowing more students to access education in the first place.  As university graduates take up increasing space on the right-hand side of an imaginary bell curve representing all youth, "average earnings" will naturally decline even if there's no overall change in the average or distribution of earnings as a whole.  And the story might not be as negative if we were to take a five- or ten-year perspective on earnings.  Ross Finnie has done some excellent work showing that in the long term nearly all university graduates make a decent return (though, equally, there is evidence that students with weak starts in the labour force have lower long-term earnings as well, through a process known as "labour market scarring").

Whatever the cause, universities (and Arts faculties in particular) have to start addressing this issue honestly.  People know in their gut that university graduates' futures in general (and Arts graduates' in particular) are not as rosy as they used to be.  So when the Council of Ontario Universities puts out a media release, as it did last month, patting universities on the back for a job well done with respect to graduate outcomes, it rings decidedly false.

Universities can acknowledge challenges in graduate outcomes without admitting that they are somehow at fault.  What they cannot do is pretend there isn't a problem, or shirk taking significant steps to improve employment outcomes.

January 16

Ever-bleaker Graduate Employment Data?

So just before I quit blogging in December, the Council of Ontario Universities released its annual survey of graduate outcomes, this time of the class of 2013.  The release contained the usual platitudes: “future is bright”, “vast majority getting well-paying jobs”, etc etc.   And I suppose if one looks at a single year’s results in isolation, one can make that case.  But a look at longer-term trends suggests cause for concern.

These surveys began at the behest of the provincial government seventeen years ago.  Every graduating cohort is surveyed twice: once six months after graduation, and again two years after graduation.  Students are asked questions about their employment status, their income, and the relationship between their job and their education.  COU publishes only high-level aggregate data, so we don't know about things like response rates, but the ministry seems pleased enough with data quality, so I assume it's within industry standards.

Figure 1 shows employment rates of graduates six months and two years out.  At the two-year checkpoint, employment rates fell by four points in the immediate wake of the 2008-9 recession (be careful in reading the chart: the x-axis is the graduating class, not the year of the survey, so the line turns down in 2006 because that's the group that was surveyed in 2008).  Since then, they have recovered by a little more than a point and a half, though further recovery seems stalled.  At the six-month point, things are much worse.  Though employment rates at this point are no longer falling, they remain stubbornly seven percentage points below where they were pre-recession.

Figure 1: Employment Rates, Ontario University Graduates, 6 Months and 2 Years Out, by Graduating Class, 1996-2013


If you want to paint a good story here, it's that employment rates two years out are still within three percentage points of their all-time peak, which isn't terrible.  But there doesn't seem much doubt that students are on average taking a bit longer to "launch" than they used to; employment rates six months out seem to have hit a new, and permanently lower, floor.

Now, take a look at what's happening to starting salaries.  As with the previous graph, I show results for both the six-month and the two-year mark.

Figure 2: Average Salaries, Ontario University Graduates, 6 Months and 2 Years Out, by Graduating Class, 1996-2013, in $2016


What we see in Figure 2 is the following:  holding inflation constant, during the late 1990s, recent graduates saw their incomes grow at a reasonably rapid clip.  For most of the 2000s, income was pretty steady for graduates two years out (less so six months out).  But since the 2008 recession, incomes have been falling steadily for several years; unlike the situation with employment rates, we have yet to see a floor, let alone a bounceback.  Real average incomes of the class of 2013 six months after graduation were 11% lower than those of the class of 2005 (the last fully pre-recession graduating class); at 2 years out the gap was 13%.  Somehow these points did not make it into the COU release.

That, frankly, is not good.  But it seems to me that we need to hold on a little bit before hitting the panic button about universities being a bad deal, not being relevant to a shifting labour market, etc., etc.  Sure, the drop-off in both employment rates and incomes started around the time of the recession, and so it's easy to create a narrative around a changed economy/new normal, etc.  But there's something else that's probably playing a role, and that's an increase in the supply of graduates.

Figure 3: Number of Undergraduate Degrees Awarded, Ontario, 1999-2013


The other big event we need to control for here is the massive expansion of access to higher education.  In 2003, the "double cohort" arrived on campus, and that forced government to expand institutional capacity, which did not subsequently shrink.  Compared to the year 2000, the number of graduates has increased by over 50%.  Such an expansion of supply must have had some effect on average outcomes.  It's not simply that there are more students competing for jobs – something one would naturally assume would place downward pressure on wages – but also that the average quality of graduates has probably dropped somewhat.  Where once graduates represented the top 20% of a cohort in terms of academic ability, now they probably represent the top 30% or so.  Assuming one's marginal product in the labour market is at least loosely tied to academic ability, that would also predict a drop in average post-graduation incomes.  To really get a sense of what, if anything, has changed in terms of how higher education affects individuals' fortunes in the labour market, you'd want to measure not average income vs. average income, but the 66th percentile of income now vs. the 50th percentile of income fifteen years ago.  Over to you, COU, since you could make the microdata public if you wanted to.
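If you want to see the composition effect in action, here's a toy simulation – the earnings distribution and its parameters are entirely my own assumptions, not COU data – showing how expanding graduates from the top 20% to the top 30% of a fixed bell curve mechanically drags down the "graduate average":

```python
# Toy composition-effect simulation (all parameters assumed).
import numpy as np

rng = np.random.default_rng(0)
# Assumed earnings-related "ability" distribution for an entire youth cohort:
cohort = rng.normal(loc=50_000, scale=10_000, size=1_000_000)

for top_share in (0.20, 0.30):
    cutoff = np.quantile(cohort, 1 - top_share)
    grads = cohort[cohort >= cutoff]
    print(f"graduates = top {top_share:.0%}: mean = {grads.mean():,.0f}")
# The graduate mean falls by roughly 4% between the two scenarios even
# though the underlying distribution never changed - no institution did
# anything wrong; it is pure composition.
```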

In short, don’t let institutions off the hook on this, but recognize that some of this was bound to happen anyway because of access trends.

More graduate income data fun tomorrow.

September 29

The Ontario NDP’s Bad Student Loan Math

The Ontario NDP have started down the road to madness on student aid.  Someone needs to stop them.

Here's the issue: the NDP have decided to promise to make all Ontario student loans interest-free.  As a policy, this is pretty meh.  It's not the kind of policy that increases participation, because students don't really pay attention to loan interest, and it's not going to make loans a whole lot more affordable, because Ontario forgives most loans anyway (as a consequence, something like 90% of all loans in repayment in Ontario are federal loans, which wouldn't be subject to this policy).  My back-of-the-envelope calculation is that this policy might save a typical borrower in repayment something like $5/month, which isn't a big deal as far as affordability is concerned.  One could argue that affordability of loan repayments shouldn't be a big priority, since loan payments as a fraction of average graduate income have gone down by about a third in the past fifteen years; but on the other hand, this isn't likely to cost very much either, so really, who cares?
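For the curious, here's roughly how a back-of-envelope number like that falls out.  The loan balance and term below are my assumptions, chosen only to show the order of magnitude:

```python
# Illustrative amortization comparison (balance and term are assumptions):
# the provincial portion of a typical loan is small, since most debt is federal.
def monthly_payment(balance, annual_rate, months=114):  # assumed 9.5-year term
    if annual_rate == 0:
        return balance / months
    r = annual_rate / 12
    return balance * r / (1 - (1 + r) ** -months)

balance = 2_000  # assumed typical provincial (non-federal) balance
saving = monthly_payment(balance, 0.037) - monthly_payment(balance, 0.0)
print(f"monthly saving from 0% interest: ${saving:.2f}")  # a few dollars a month
```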

No, the problem isn’t so much the proposed program as it is the tagline that’s gone along with it. To wit: “The government shouldn’t be making a profit from student debt”.


I mean, where to begin with this stonking bit of nonsense?

The worst-case interpretation of this is that the NDP actually believes that “interest” equals “profit”, or, to put it another way, that money has no time-value.  Read literally, it suggests that all interest is usury.  The NDP is sometimes accused of being stuck in the 70s as far as economic policy is concerned; this particular slogan suggests it might be more 1370s than 1970s.

More likely, though, this is the NDP aping Massachusetts Senator Elizabeth Warren, who has been saying these kinds of things about US student loans for a few years now.  The essence of the critique is this: governments borrow money cheaply and lend to students at a higher rate (in the US, the rate on subsidized undergraduate Stafford loans is the 10-year Treasury rate plus 250 basis points, and somewhat higher for other types of public loans).  The gap between the two rates is needed because of course the government loses money on loans through loan defaults (it also loses money by assuming the loan interest while a student is in school, but that's a separate issue).  For reasons beyond comprehension, the US government does not base its financial calculations for student loans on actuarial reports linked to actual student behaviour, but rather on "standard conventions", one of which essentially assumes no loan losses at all.  It is by using this convention – i.e. basically ignoring all actual costs – that Warren came to the conclusion that student loans "make money".  For a more complete description of why this is total nonsense, check out Jason Delisle's work on the subject here, as well as articles from the Atlantic, the Washington Post and the Brookings Institution.

But even to the limited extent the Warren critique makes sense in the US, it doesn't work in Ontario.  OSAP loses money.  A lot of it.  It doesn't publish numbers directly on this, but it's easy enough to work it out.  Ontario 10-year bonds go for about 2.5% these days, and OSAP lends to students at prime + 1%, or about 3.7%.  So Ontario's spread is only 120 basis points, or half the American spread (CSLP loans are different: the feds borrow at 1% and lend at prime plus 250 basis points, for a total spread of 420 basis points).  120 basis points per year is not much when you consider that simply covering the cost of borrowing while students are in school costs twice that each year.  Basically, it means that for someone who borrows for four years, the government loses money every time they pay back the loan in less than eight years.  And that's not counting the cost of defaults, which are in the tens of millions of dollars each year.
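Here's a rough reconstruction of that eight-year break-even figure.  The arithmetic is mine, using deliberately stylized assumptions (money borrowed evenly over four years of school; average outstanding balance in repayment of half the principal):

```python
# Stylized break-even arithmetic for the Ontario loan book (mine, not OSAP's).
gov_rate = 0.025      # ~Ontario 10-year bond yield
student_rate = 0.037  # prime + 1%
spread = student_rate - gov_rate  # ~120 basis points

# In school, the province carries each dollar interest-free. If dollars are
# borrowed evenly over four years, the average dollar is carried ~2 years:
carry_cost = gov_rate * 2  # ~5 cents of carrying cost per dollar lent

# In repayment, the average outstanding balance is roughly half the principal,
# so the spread earns about spread/2 per dollar of principal per year:
breakeven_years = carry_cost / (spread / 2)
print(f"break-even repayment term: {breakeven_years:.1f} years")  # ~8.3
```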

Put simply: Ontario students get to borrow at zero interest while in school, and positive-but-below-market rates after graduation despite default rates which are astronomical by the standards of any other personal loan product.  That costs the government money.  If it defrays some of that cost through an interest rate spread, so be it – that does not constitute “making a profit”.  It is simply stupid of any political party which wishes to be entrusted with public finances to suggest otherwise.

September 20

Sessionals: Equal Pay for Equal Work?

Following up on yesterday's piece about counting sessionals, I thought it would be a useful time to address how sessionals get paid.  Every so often, the Ontario Confederation of University Faculty Associations (OCUFA) issues a press release asking that contract faculty get "equal pay for work of equal value".  And who could be against that?  But what they don't say – because no one wants to say this out loud – is that, in Canada, adjuncts and sessionals are far from being underpaid: for the most part they are actually compensated fairly.  At least according to the standards of the academy itself.

I know that’s an unpopular opinion, but hear me out.  Think about what the correct comparator to a sessional academic is: it is a junior-rank academic, one who has been given assistant professor status but is not yet tenured.  These days in Canada, average pay for such folks is in the $80,000 range (your mileage may vary based on an institution’s location and prestige).

How much of that $80,000 is specifically for teaching?  Well, within the Canadian academy, there is a rule of thumb that a professor’s time should be spent 40% on teaching, 40% on research and 20% on some hazily-defined notion of service.  So, multiply that out and what you find is that only $32,000 of a new tenure-track prof’s salary is devoted to teaching.

Now break that down per class.   Depending on the institution, a professor is (in theory at least) teaching either four or five semester-long classes per academic year (2/2 or 3/2, in the academic vernacular).  Divide that $32,000 payment for teaching by four and you get $8,000 per one-semester class; divide it by five and you get $6,400.  An “equal work for equal pay” number therefore needs to be somewhere in that range.
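Spelled out in full (the same numbers as above, nothing new assumed):

```python
# The 40-40-20 per-course arithmetic, made explicit.
salary = 80_000          # rough junior tenure-track salary
teaching_share = 0.40    # the 40-40-20 convention
teaching_pay = salary * teaching_share  # $32,000 of salary is "for teaching"

for courses_per_year in (4, 5):  # a 2/2 or 3/2 load
    print(f"{courses_per_year} courses: ${teaching_pay / courses_per_year:,.0f} per course")
# -> $8,000 at four courses a year, $6,400 at five
```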

Here's what we know about adjuncts' salaries: in 2014, the Higher Education Quality Council of Ontario published a study on salaries of "non-full-time instructors" in the province.  It showed that sessional instructors' salaries in 2012-13 ranged from about $6,000 per course to a little over $8,000 per course (with inflation, they are likely slightly higher now), with most of the big universities clustered in the low-to-mid $7,000 range.  At a majority of institutions, sessionals also get health benefits and may participate in a pension plan.  In 2013, University Affairs, the in-house publication of Universities Canada, published results of a nine-institution survey of sessional lecturer compensation (see here).  This showed a slightly wider range of compensation rates: at Quebec schools they were comparable to or slightly higher than Ontario rates, while elsewhere they were somewhat below.

To recap: if you buy the 40-40-20 logic of professorial pay, most universities in Canada – at least in central Canada – are in fact paying sessionals roughly the same as they are paying early-career tenure-track academics.  In some cases the benefits are not the same, and there may be a case for boosting pay a bit to compensate for that.  But the complaint that sessionals are vastly underpaid for the work they are contracted for?  Hard to sustain.

Sessionals themselves would naturally argue that they do far more than what they are contracted for: they too are staying academically active, doing research, etc.  To which the university response is: fine, but that’s not what we’re paying you for – you’re doing that on your own time.  The fight thus isn’t really about “equal pay”, it’s a fight about the right to be paid for doing research.

And of course OCUFA knows all this.  The math involved is pretty elementary.  It can't really think these staff are underpaid unless it believes a) that the 40-40-20 split is wrong and teaching should be a higher share of time and salary (good luck getting that one past the membership), or b) that sessionals need to be paid not on the same scale as assistant profs but on the scale of associate or full profs (again, I would question the likelihood of OCUFA's membership thinking this is a good idea).

But if neither of those things is true, why does OCUFA use such language?  It’s a mystery worth pondering.

August 10

Ontario’s Quiet Revolution

Last year, the Government of Ontario announced it was moving to a new and more generous system of student grants.  Partly, that was piggybacking on new and enhanced federal grants, and partly it was converting its own massive system of loan forgiveness and tax credits into a system which – more sensibly – delivers the money upfront to students.  For most students from low-income backgrounds, this means they will receive more in grants than they pay in tuition.

Now, while the new federal grants came into place last week (yay!), the new provincial program isn’t due to be introduced until 2017-18.  But the *really* important piece of the Ontario reform actually won’t kick in until even later.  As I noted back here, it’s the move to “net billing” (that is, harmonizing the student aid and institutional application systems) which has the most interesting potential because now students will see net costs at the time of acceptance rather than just sticker costs.  It has been generally appreciated (in part because I keep banging on about it) that this will be revolutionary for students and their perceptions of cost.  What is not as well appreciated is how revolutionary this change will be for institutions.

Currently, Ontario universities use merit scholarships as a major tool in enrolment management.  At the time students are accepted, institutions offer them money based on their grades.  The scale differs a bit between institutions (an 85% average might get you $1,000 at one university and $2,000 at another), but the basic picture is that over two-thirds of entering students receive some kind of financial award, usually for one year.  It's a total waste of money for institutions, but everybody does it – so no institution feels it can stop doing it.

But the effect this money has on students is predicated on the fact that the institutional award offer is the first time anyone has talked about money with them.  In our current system of student aid, you have to be accepted at an institution before you can apply for student aid.  Even $1,000 is a big deal when nobody else is offering you any money.  But as of early 2018, students will learn about their institutional award at exactly the same time as they find out their student aid award.  How will that affect the psychology of the money being offered?  No one knows. How should universities therefore adjust their policy?

Bigger questions abound.  “Net billing” implies that institutions will know the outcome of a student’s provincial need assessment before the student does.  Will they be allowed to adjust their own aid offers as a result?  Could the province stop them from doing so even if they wanted to?

What will new letters of acceptance look like?  When an institution tells a student about tuition, aid, and "net cost", will it be required to lump all aid together, or will it be allowed to label its own portion of the aid separately?  You would think institutions would fight hard to keep the label on their own money, but prohibiting labelling might be the best way to cut down on these scholarships and re-direct the money to better use, something I advocated a couple of years ago.  With no labelling, there would be no incentive to spend on this item, and institutions could back away from it with no opprobrium.  We'll see whether institutions are actually that shrewd or not.

Even if they do retain the right to separate labelling, what will the effect on students be?  Getting an offer of a $1,000 merit scholarship is undoubtedly psychologically different from receiving a $1,000 scholarship on top of a $6,000 need-based grant.  And when placed in context with a tuition fee, the effect may vary again.  In other words, we're heading into a world where Ontario universities – which collectively spend tens of millions of dollars a year on these scholarships – have literally no idea what effect they will have in the minds of the people they are trying to attract.  I suspect we may see one or two institutions re-profile their aid money and head out in very new strategic directions as a result.

Universities have a lot of business-process work to do to make net billing work over the next 12 months or so.  But more importantly, they have some big strategic decisions to make about how to dish out money to students in the absence of much hard intelligence.  How they react will be one of the more interesting stories of 2017-18.
