HESA

Higher Education Strategy Associates

January 25

The Science Policy Review

So, any day now, the report of the Government of Canada’s Science Policy review should be appearing.  What is that, you ask?  Good question.

“Science policy” is one of those tricky terms.  Sometimes it means science as a way of making policy (as when someone claims they want all policy to be “evidence-based”); sometimes it means policy for or about science, and the rules and regulations under which it is funded.  This particular science policy review, chaired by former U of T President David Naylor, is a bit narrower; as the mandate letter and key questions show, this review is fundamentally about funding.  In fact, three sets of questions about funding: funding of fundamental research, funding of facilities/equipment, and funding of “platform technologies” (an irritating piece of innovation-policy jargon which makes a lot more sense in IT than in the rest of science, but whatever).

For the first two sets of questions, there’s a heavy tilt towards the fitness for purpose of existing funding agencies.  The review’s emphasis is not so much “are we spending enough money” (that’s a political decision) but rather “does the way we spend money make sense”.  For example, one might well ask: does a country of 35 million people and fewer than 100 universities actually need three granting councils, plus CFI, the Foundation for Sustainable Development, Brain Canada, Genome Canada, the Canada First Research Excellence Fund… you get the idea.

There was a frisson of excitement last year when the UK decided to fold all their granting councils into One Big Council – might our Science Review recommend something similar?  Personally, I’m not entirely sold on the idea that fewer councils means less paperwork and more coherence (the reasons usually given in favour of rationalization), because policies and agendas can survive institutional mergers.  And as a colleague of mine who used to be quite high up in a central agency once said to me: the reason all these agencies proliferated in the first place was that politicians got frustrated with the traditional granting councils and wanted something more responsive.  Paring them back doesn’t necessarily solve the problem – it just re-sets the clock until the next time politicians get itchy.

This itchiness could happen sooner than you think.  Even as the government has been asking Naylor and his expert panel to come up with a more rational scheme of science management, it emerged a couple of weeks ago that one of the ideas the Liberals had decided to test in their regular pre-budget focus group work was spending mega-millions (billions?) on a scientific “Moonshot”: that is, a huge focused effort on one goal or technology such as – and I quote – driverless cars, unmanned aircraft, or “a network of balloons travelling on the edge of space designed to help people connect to the internet in remote areas or in a crisis situation”.  Seriously.  If any of you thought supporting big science projects over broad-based basic science was a Tory thing, I’m afraid you were sorely mistaken.

Anyways, back to the review.  There’s probably room for the review to provide greater coherence on “big science” and science infrastructure – Nassif Ghoussoub of UBC has provided some useful suggestions here.  There may be some room for reduction in the number of granting agencies (though – bureaucratic turf protection ahoy!) and definitely room to get the councils – especially CIHR – to back off on the idea that every piece of funded research needs to have an end-use in mind (I’d be truly shocked if Naylor didn’t beat the crap out of that particular drum in his final report).

But the problem is that the real challenges in Canadian Science are much more intractable.  Universities hired a lot of new staff in the last fifteen years, both in order to improve their research output and to deal with all those new undergraduates we’ve been letting in.  This leads to more competition.  Meanwhile, government funding has declined somewhat since 2008 – even after that nice little unexpected boost the feds gave the councils last budget.  At the same time, granting councils – most of all CIHR – have been increasing the average size of awards.  Which is great if you can get a grant; the problem is that with stagnant budgets the absolute number of grants is falling.  So what do rational individual researchers do with more competition for fewer awards?  They submit more applications to increase their chances of getting an award.  Except that this drives down acceptance rates still further – on current trends, we’ll be below 10% before too long.

Again, this isn’t just a Canadian phenomenon – we’re seeing similar results in a number of countries.  The only solution (bar more funding, which isn’t really in the Review’s remit) is to give out a larger number of smaller awards.  But this runs directly contrary to the prevailing political wind, which seems to favour making fewer, bigger awards: Canada Excellence Research Chairs (there are rumours of a new round!), CFREF, Moonshots, whatever.  You can make a case for all of those programs, but the question is one of opportunity costs.  CFREF might be brilliant at focusing institutional resources on a particular problem and acting as an anchor for new tech/business clusters: but is it a better use of money than seeding money widely to researchers through the usual peer-review mechanism?  (For the record, I think CFREF makes infinitely more sense than CERCs or Moonshots, but am a bit more agnostic on CFREF vs. granting councils.)

Or, to be more brutal: should we have moonshots and CFREF and a 10% acceptance rate on council competitions, or no moonshots or CFREF and a 20% acceptance rate?  We’ve avoided public discussion on these kinds of trade-offs for too long.  Hopefully, Naylor’s review will bring us the more pointed debate this topic needs.

January 24

Budget Fun at the University of Ottawa

Back in early December, the Ottawa Citizen reported on a controversy at the University of Ottawa.  Basically, the story was that the University is facing a $20 million budget shortfall, the administration is consulting re: how to cut its budget and some people are very upset with some of the proposed solutions.

Of course, cutbacks anywhere, anytime, are unacceptable to someone in the institution.  The library, for instance, is being asked to contemplate a cut of $2 million.  “A source” told the Citizen that such a cut would “decimate its holdings” and – horror of horrors – “severely damage its reputation”.

(Regular readers of this column may recall some data on library statistics I published earlier this year.  A quick check here suggests that even with these cuts, library expenditures per student at Ottawa would still be ahead of the not-reputationally-challenged-last-I-heard Queen’s, and that the University of Ottawa has over the past ten years increased its collections budget by 50%.  Even with the proposed cuts, it would still have the largest net increase in funding of any research university library in the country since 2004.  It would be surprising, to say the least, if any real reputational damage ensued from this.)

How did Ottawa get into this position in the first place?  Here’s what a university spox told the Citizen: “The University of Ottawa is one of many Ontario universities facing financial challenges.  This is due in part to a permanent reduction in revenues from the provincial government of two per cent (over 2012-2013 levels) in government grants per registered students, resulting in decreased funding … At the same time, the university’s expenditures, especially those related to salaries, benefits, and special payments for pension plans continue to grow.”

OK, so let’s parse this.  Government has actually increased funding since 2012-13, not decreased it.  Not by much, admittedly, but it has increased.  And the funding formula hasn’t changed.  So if “dollars per student” are down, it’s because a) the University is admitting more students outside the formula (i.e. international students), or b) its new enrolments are disproportionately in “cheap” subjects like Arts and business rather than expensive ones like medicine, which brings down the ratio of “basic income units” (Ontario government jargon for weighted student units in an enrolment-driven funding formula) to actual students.  The third option is c) that other universities have chosen to grow faster than the University of Ottawa, thus dropping its share of system-wide BIUs slightly.  I’m fairly confident both “a” and “b” are true, though I’m not sure about “c”; in any case, those are institutional policy decisions, not government ones.  Blaming the government for them isn’t really kosher.
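
Since the mechanics of an enrolment-weighted formula can be hard to picture, here is a minimal sketch with made-up numbers (the BIU rate, unit counts, and enrolments are all assumptions for illustration, not actual Ontario figures) of how “dollars per student” can fall even while the grant itself is flat:

```python
# Under a weighted, enrolment-driven formula, funding tracks BIUs,
# so "dollars per student" falls if headcount grows faster than BIUs.
# All numbers below are hypothetical.

def dollars_per_student(grant, students):
    return grant / students

rate_per_biu = 10_000          # $ per basic income unit (assumed)
bius = 50_000                  # weighted units the province funds (assumed)
grant = rate_per_biu * bius    # $500M, unchanged throughout

domestic_students = 40_000
per_student_before = dollars_per_student(grant, domestic_students)

# Now add 5,000 students outside the formula (e.g. international students):
# the grant does not move, but the denominator does.
per_student_after = dollars_per_student(grant, domestic_students + 5_000)

print(per_student_before)  # 12500.0
print(per_student_after)   # ~11111.11
```

The same mechanism works through the weights: growth concentrated in low-weight programs adds students faster than it adds BIUs, with the identical effect on the ratio.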

The government could, of course, be blamed for not putting more money in the system as a whole.  But look – this province had a $17 billion deficit not so long ago.  The 2010 provincial budget was crystal clear that between 2010 and 2016, total program funding was only going to increase by $5 billion (about 4.4%).  In fact, universities as a whole did better than this: they got an increase of about 6% in nominal terms (yes, still a decrease in real terms but actually better than anticipated).  The University of Ottawa got an increase of about 5% in that period.

Would life be easier if government wrote bigger cheques?  Sure.  But everyone has had plenty of warning that government assistance wasn’t going to be increasing very much.  The real question is: given that everyone knew this, why were university expenditures allowed to grow so much?

Well, let’s go back to the summer of 2013, when the university was negotiating a new contract with the Association of Professors of the University of Ottawa (APUO).  Remember, at this point, everyone had known for more than three years that government wasn’t coming through with new money anytime soon.  And yet the institution conceded this astonishing deal.  The APUO’s summary of the deal is here, but the two key bits were:

  • A guaranteed increase in academic/librarian staff complement of 4% (from 1250 to 1311).
  • An increase in pay of 11.5%, on top of progress-through-the-ranks (PTR) increases, over four years (but really three, since the deal was backdated to 2012).

The result: between 2010 and 2016, the period in which government said “sorry guys, no more money”, aggregate expenditures on academic staff salaries – the biggest single line-item in the institution’s budget – increased by 29.4%.
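
As a rough illustration of how these concessions compound, here is a back-of-envelope sketch.  The complement figures and the 11.5% scale increase come from the deal described above; the ~2% annual PTR rate is purely an assumption for illustration:

```python
# Back-of-envelope: headcount growth and per-head pay growth compound
# multiplicatively.  The PTR rate here is an assumed value, not from the deal.

headcount_factor = 1311 / 1250   # complement growth from the agreement (~4.9%)
scale_factor = 1.115             # negotiated 11.5% scale increase
ptr_factor = 1.02 ** 6           # assumed ~2%/yr PTR over the 2010-2016 window

aggregate = headcount_factor * scale_factor * ptr_factor
print(f"{(aggregate - 1) * 100:.1f}%")  # ~31.7% with these assumptions
```

With even modest PTR assumptions, the arithmetic lands in the neighbourhood of the 29.4% aggregate increase cited above: no single item looks outlandish, but the product does.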

Did I mention that the Faculty Union responsible for this $50 million rise in costs has launched a grievance re: the cuts?  No?  Well, now you know.

Of course, that’s not the only reason the university is in trouble.  While U of O made valiant efforts to increase its revenues in other ways (revenue from investments increased, and revenue from fees rose almost 45%, thanks in part to higher fees but equally if not more to increased international student enrolment), it wasn’t exactly doing much to curb expenditures in non-salary items either.  Sure, academic staff wages were increasing at close to 5% per year throughout this period, but other operating costs were rising at a little over 4%.  Bluntly: strong cost controls are not in place.  There’s blame to go around here.

It may seem like I am picking on the University of Ottawa, but trust me: I could tell a story like this about virtually any university in the country.  Governments are not ponying up for higher education the way they used to.  Student fees – international student fees especially – have made up some but not all of the gap.  Faculty unions are demanding pay increases in excess of institutional income growth, and by and large they are getting them.  And cost controls in other areas of spending are scarcely more effective.

Something has to give.

January 23

A Puzzling Pattern in the Humanities

Big news in Alberta the other day: the University of Alberta has decided to cut fourteen (14!) programs in the humanities.  That’s on top of a program cull just two years ago, in which seventeen programs – mostly in Arts – were also axed!  Oh my God!  War on the humanities, etc., etc.

Or at least that’s the way it sounds, until you read the fine print around the announcement and realise that these fourteen programs, collectively, have 30 students enrolled in them. The puzzle here, it seems, is not so much “why are these programs being cancelled” as “why on earth were they ever approved in the first place”?

For the record, here are the programs being axed: majors programs in Latin American studies and Scandinavian studies, and honours programs in classical languages, creative writing, history/classics (combined), religious studies, women and gender studies, comparative literature, French, math (that is, the BA Hons in math, which is completely separate from the BSc in math, which is going nowhere), and Scandinavian studies (again).  And technically they are not being axed; rather, the university is “suspending admissions”, which means that current students will be able to finish their degrees.

Two takeaways from this:

The first is that the term “programs” is a very odd and sometimes misunderstood one.  Universities can get rid of programs without affecting a single job, without even reducing a single course offering.  In the smorgasbord world of North American universities, all programs are essentially virtual.  The infrastructure of a university is essentially the panoply of courses offered by departments.  Academic entrepreneurs can then choose to bundle certain configurations of courses into “programs” (with the approval of a lot of committees and Senate, of course).  Programs need co-ordinators, and co-ordinators get stipends and, more importantly, a small bump in prestige.  But overall, programs are very close to costless, because departments absorb all the costs of delivering the actual courses.  (The real costs are the ludicrous amount of programming time involved in getting registrarial software to recognize all these different degree pathway requirements.)

It doesn’t actually have to be this way.  Harvard’s Faculty of Arts and Science has only about fifty degree programs; pretty much every mid-size Canadian university has twice that.  And there’s no obvious benefit to students in this degree of specialization.  Why, apart from inertia and a desire not to rock the boat, do we put up with it?

A second point, though.  Readers may well ask: “why do these kinds of program cuts always affect the humanities more than any other faculty?”  This is a good question.  And the answer is: because no other faculty hacks itself into ever-tinier pieces the way the humanities does.  Seriously.  This isn’t a question of specialization – every field has that – it’s a question of whether or not to create academic structures and bureaucracies to parallel every specialization.

Imagine, for instance, what biology would look like if it were run like humanities. You’d probably have separate degrees and program co-ordinators for epigenetics, ichnology, bioclimatology, cryobiology, limnology, morphology – the potential list goes on and on. But of course biology doesn’t do that, because biology is not ridiculous. Humanities, on the other hand…

There are lots of good histories of the humanities out there (I recommend Rens Bod’s A New History of the Humanities and James Turner’s Philology: the Origins of the Modern Humanities), but as far as I know no one has ever really looked historically at why the humanities, alone among branches of the academy, chose to Balkanize themselves administratively in such an odd way.  For a set of disciplines which constantly worries about being under attack, you’d think that grouping together in larger units would be an obvious defence posture.  Why not just have big programs in philosophy, languages and literature, and philology/history, and be done with it?

January 20

Puzzles in the youth labour market

A couple of days ago, after looking at employment patterns among recent graduates using Ontario graduate survey data, I promised a look at broader youth labour market data.  I now wish I hadn’t promised that, because Statistics Canada’s CANSIM database is an ungodly mess and has got significantly worse since the last time I tried to use its data.  Too little of the data on employment and income allows users to focus in by age *and* education level, and even getting details down to 5-year age brackets (e.g. 20-24, 25-29), which might be useful for looking at youth labour markets, is frustratingly difficult.

(WHY CAN’T WE HAVE NICE THINGS, STATSCAN? WHY???)

Anyways.

Ok, so let’s start by looking at employment rates. Figure 1 looks at employment rates for Canadians in the 15-19, 20-24 and 25-29 age brackets since the turn of the Millennium.

Figure 1: Employment Rates by Age-group 1999-2016

The takeaway in Figure 1 is that by and large employment rates are steady except for one big hiccup in 2008-9.  In that year, the employment rate for 25-29 year olds fell by about two percentage points, that for 20-24 year olds fell by three and a half percentage points, and that for 15-19 year olds by five percentage points.  Not only did the size of the drop vary inversely with age, so too has subsequent performance.  Employment rates for 25-29 year olds and 20-24 year olds have held fairly steady since 2009; those for 15-19 year olds have continued to fall, and are now over seven percentage points off their 2008 peak.

Ah, you say: employment is one thing: what about hours of work? Aren’t we seeing more part-time work and less full-time work? Well, sort of.

Figure 2: Part-time Employment as a Percentage of Total Employment, by Age-group, 1999-2016

Across all age-groups, the percentage of workers who are part-time (that is, working less than 30 hours per week) rose after 2008.  In the case of the 25-29 year olds, this was pretty minor, rising from 12% in 2008 to 14% today.  Among the 15-19 year-olds the movement was not especially large either, rising from 70% to 74% (remember, we *want* these kids to be part-time: they’re supposed to be in school).  The biggest jump was for the 20-24 group – that is, traditional-aged students and recent graduates – where the part-time share jumped from 29% to 35%.  Now some of that might be due to higher university enrolment rates (workers are more likely to be part-time if they are also studying), but at least some of it is simply a push towards increased casualization of the labour market.

So far, all of this is roughly consistent with what we saw Monday through Wednesday – namely, that there was a one-time shock to employment around 2008, and that the effect is much more pronounced among younger graduates (say, those 6 months out from graduation) than it is among older ones (say, those 24 months after graduation).  What is not quite consistent, though, is what is happening to wages.  Unfortunately, CANSIM no longer makes available any decent time series data on wages or income for the 20-24 or 25-29 age groups (one of these days I am going to have to stump up for a public use microdata file, but today is not that day).  But it does offer some data on what’s going on for 15-24 year olds.  Sub-optimal (25-29 would be best), but still useful.  Here’s what that data looks like:

Figure 3: Hourly and Weekly Wages, 15-24 year-olds, in $2016, 2001-2016

Figure 3 shows hourly and weekly wages for 15-24 year olds, in 2016 dollars.  Hourly wages (measured on the right axis) grew faster than weekly wages (measured on the left axis) because average hours worked fell by 3.5% (this is the shift to part-time work we saw in Figure 2).  Hourly wage growth has not been as strong since 2008 as it was between 2004 and 2008, but it is still up 6% over that time.  It’s probably safe to assume that the situation for 25-29 year olds is not worse than it is for 20-24 year olds.  Which means we have an interesting puzzle here: wage growth for the youth cohort as a whole is positive, at least among those who have wages – but as we saw Monday and Tuesday, wage growth is negative for university graduates.  What’s going on?
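
For readers who want the mechanics of the “in 2016 dollars” adjustment, here is a minimal sketch: nominal wages are deflated by a consumer price index so that different years are comparable.  The CPI values and wage figures below are illustrative assumptions, not Statistics Canada numbers:

```python
# Sketch of a constant-dollar conversion: deflate a nominal wage from an
# earlier year into 2016 dollars using CPI.  All figures are assumed.

def to_2016_dollars(nominal, cpi_year, cpi_2016):
    return nominal * cpi_2016 / cpi_year

cpi = {2008: 114.1, 2016: 128.4}   # illustrative index values (2002 = 100)

nominal_2008_hourly = 13.00
real_2008_hourly = to_2016_dollars(nominal_2008_hourly, cpi[2008], cpi[2016])

# A 2016 wage needs no adjustment, so real growth since 2008 is simply:
nominal_2016_hourly = 15.50
growth = nominal_2016_hourly / real_2008_hourly - 1
print(f"{real_2008_hourly:.2f}", f"{growth:+.1%}")
```

Only wages converted this way can be compared across years; comparing nominal wages would mix real gains with inflation.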

There are two possibilities here. The first is that wage growth since 2008 is stronger for those without university degrees than it is for those with. With the oil/gas boom, that might have been a reasonable explanation up to 2014; it’s hard to see it still being true now. The second is the proposition advanced here earlier this week: that while university graduates may still all cluster at the right-end of the bell curve, as they encompass a greater proportion of the bell curve, the average as a whole necessarily falls.

In short: post-2008, something has happened to the labour market which makes it more difficult for young people to “launch”.  We shouldn’t overstate how big this problem is: employment is down slightly, as is the proportion of employment which is full-time.  But unlike in previous recessions, the youth labour market does not seem to be bouncing back – these changes seem to be permanent, which is a bit disquieting.  But it’s also true that these effects are more severe among the youngest: which is exactly what you’d expect if the labour market were putting a greater emphasis on skills and experience.  By the time youth reach their late 20s, the effects mostly disappear.

In other words, what we are seeing is less “failure to launch” than “delays in the launch”. To the extent anything has changed, it’s that the transition to the labour market is on average a little bit rougher and a little bit slower than it used to be, but that’s likely as much to do with the expansion of access to university as it is a skills-biased change in the labour market.

January 19

American Higher Education Under Trump

Tomorrow, Donald Trump will be sworn in as the 45th President of the United States (actually, the 44th person to be President: Grover Cleveland’s two non-consecutive terms screw up the count).  What does this mean for higher education?

First off, let’s recollect that where higher education is concerned, the US, like Canada, is a federation where the main decisions about funding public education are made at the state level.  Decreased state investment in institutions and consequent rises in tuition have given the federal government a larger though indirect role in the system, because the salience of student aid has risen.  And of course, the government spends an awful lot of money on scientific research, primarily but not exclusively through the National Institutes of Health (NIH) and the National Science Foundation (NSF).  And let’s also recollect that while the President names the Secretary of Education, a lot of control over specific budget items rests with Congress, which, despite being controlled by Republicans, will have ideas of its own.

Recall that Trump barely spoke about higher education during the campaign, other than endorsing an even-more-expensive version of income-based repayment than the existing one, which was recently discovered to be costing over $50 billion more than expected (short version: he wants to raise the repayment maximum from 10% of income to 12.5% but shorten the time before forgiveness to just 15 years).  Also, his education secretary, Betsy DeVos, is a K-12 specialist (I’m using the term loosely) with very few known views on higher education.  I think it’s a given that their instincts will be anti-regulatory and pro-market (which means things are looking up for private for-profits), but it’s hard to see them initiating a lot of new policy.  Which means the policy reins, such as they are, will likely be held by the Republican Congress and not the White House.

So what to expect?  Well, I think we can rule out any continuation of the Obama White House’s free college agenda, or anything vaguely like it.  That idea won’t disappear, but it’s something that’s going to happen in the states rather than in DC (witness Andrew Cuomo’s decision earlier this month to launch his own Ontario-like free-tuition plan).  Beyond that, you’re likely to see some cutting back on institutional reporting requirements, particularly with respect to Title IX, the federal law on sex-discrimination in education, and possibly a push towards more competency-based education.

Where it gets interesting, though, is on student aid.  It’s not just that we’re likely to see cuts in things like loans to graduate students and (pace Trump’s own views) loan forgiveness.  We may see a return to more private capital in student loans (which would mostly be a bad thing); we may also see institutions required to pay for some of the costs of their own students’ loan defaults (an idea colloquially referred to as requiring institutions to have “skin in the game”).  Some think that the new Congress may push what are known as “Income Share Agreements”, which are kind of like graduate taxes, only the entity giving the student money and then collecting a percentage of income afterwards is some kind of private investment firm rather than government.  One of the most crazy/plausible ideas I’ve heard is from University Ventures’ Ryan Craig, who mused recently on Twitter about setting rules whereby institutions might have to provide a certain fraction of total aid via ISAs in order to be eligible to receive federal aid.

On the research side: who knows?  Clearly, climate science is going to have a hard time.  But health sciences often do well under Republicans; the National Institutes of Health went from $18 billion/year to $30 billion/year under Bush Jr, for instance.  And Trump might decide to do something big and crazy like announcing a lunar base or a Mars mission (the former is a favourite of Newt Gingrich, the latter an obsession of Elon Musk, who suddenly seems quite close with the incoming White House), either of which would have substantial positive ramifications for university science budgets.  So we’ll see.

But put all this into some perspective: as far as Congressional priorities are concerned, changes to student aid are going to come several light years behind repealing Obamacare and dismantling various environmental protections.  The former in particular has some pretty serious budget impacts as repealing Obamacare is going to cost a ton of money.  That’s going to cause a scramble for offsetting budget cuts – one could imagine some pretty big across-the-board cuts in which higher education-related programs will simply be collateral damage.

It’s bound to be interesting, anyway.  Though I for one am glad I get to watch it all from a safe distance.

January 18

More Bleak Data, But This Time on Colleges

Everyone seems to be enjoying data on graduate outcomes, so I thought I’d keep the party going by looking at similar data from Ontario colleges. But first, some of you have written to me suggesting I should throw some caveats on what’s been covered so far. So let me get a few things out of the way.

First, I goofed when saying that there was no data on response rates from these surveys. Apparently there is and I just missed it. The rate this year was 40.1%, a figure which will make all the economists roll their eyes and start muttering about response bias, but which anyone with field experience in surveys will tell you is a pretty good response for a mail survey these days (and since the NGS response rate is now down around the 50% mark, it’s not that far off the national “gold standard”).

Second: all this data on incomes I’ve been giving you is a little less precise than it sounds.  Technically, the Ontario surveys do not ask income; they ask income ranges (e.g. $0-20K, $20-40K, etc.).  When the data is published, either by universities or the colleges, it is turned into more precise-looking figures by assigning each range its mid-point value and then averaging those midpoints.  Yes, yes, kinda dreadful.  Why can’t we just link this stuff to tax data like EPRI does?  Anyways, that means you should probably take the point values with a pinch of salt: but the trend lines are likely still meaningful.
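
A minimal sketch of that midpoint method, with made-up bins and respondent counts (the bin edges and counts below are assumptions for illustration, not survey data):

```python
# Each respondent reports an income *range*; the published "average"
# assigns every range its midpoint and takes a weighted mean.

bins = {                      # (low, high) in dollars -> number of respondents
    (0, 20_000): 5,
    (20_000, 40_000): 20,
    (40_000, 60_000): 50,
    (60_000, 80_000): 25,
}

total = sum((low + high) / 2 * n for (low, high), n in bins.items())
respondents = sum(bins.values())
print(total / respondents)  # 49000.0
```

The result looks precise to the dollar, but it can never be more accurate than the width of the bins – hence the pinch of salt.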

Ok, with all that out of the way, let’s turn to the issue of colleges.  Unfortunately, Ontario does not collect or display data on college graduates’ outcomes the way it does for universities.  There is no data on income, for instance.  And no data on employment 2 years after graduation, either.  The only real point of comparison is employment 6 months after graduation, and even this is kind of painful: for universities the data is available only by field of study; for colleges, only by institution.  (I know, right?)  And even then it is not calculated on quite the same basis: the university figure includes graduates with job offers, while the college one does not.  So you can’t quite do an apples-to-apples comparison, even at the level of the sector as a whole.  But if you ignore that last small difference in calculation and focus not on the point estimates but on the trends, you can still see something interesting.  Here we go:

Figure 1: Employment Rates 6 months after Graduation, Ontario Universities vs. Ontario Colleges, by Graduating Cohort, 1999-2015

So, like I said, ignore the actual values in Figure 1, because they’re calculated in two slightly different ways; instead, focus on the trends.  And if you do that, what you see is that (a blip in 2015 apart) the relationship between employment rates in the college and university sectors looks pretty much the same throughout the period.  Both had a wobble in the early 2000s, and then both took a big hit in the 2008 recession.  Indeed, on the basis of this data it’s hard to make a case that one sector has done better than the other through the latest recession: both got creamed, and neither has yet recovered.

(side point: why does the university line stop at 2013 while the college one goes out to 2015? Because Ontario doesn’t interview university grads until 2 years after grad and then asks them retroactively what they were doing 18 months earlier. So the 2014 cohort was just interviewed last fall and it’ll be a few months until their data is released. College grads *only* get interviewed at 6 months, so data is out much more quickly)

What this actually does is put a big dent in the argument that the problem with youth employment is out-of-touch educators, changing skill profiles, sociologists vs. welders, and all that other tosh people were talking a few years ago.  We’re simply having more trouble than we used to integrating graduates into the labour market.  And I’ll be taking a broader look at that using Labour Force Survey data tomorrow.

January 17

Another Lens on Bleak Graduate Income Data

So, yesterday we looked at Ontario university graduate employment data.  Today I want to zero in a little bit on what’s happening by field of study.

(I can hear two objections popping up already.  First: “why just Ontario?”  Answer: while Quebec, Alberta, British Columbia and the Maritimes – via MPHEC – all publish similar data, they all publish it in slightly different ways, making it irritating (and in some cases impossible) to come up with a composite national figure.  The National Graduate Survey (NGS) in theory does this, but only every five years, and as I explained last week it has made itself irrelevant by changing the survey period.  So, in short, I can’t do national, and Ontario a) is nearly half the country in terms of university enrolments and b) publishes slightly more detailed data than most.  Second: “why just universities?”  Answer: fair point; I’ll be publishing that data soon.

Everyone clear? OK, let’s keep going).

Let’s look first at employment rates 6 months after graduation by field of study (I include only the six largest – Business/Commerce, Education, Engineering, Humanities, Physical Sciences and Social Sciences – because otherwise these graphs would be an utter mess), shown below in Figure 1.  As was the case yesterday, the dates along the x-axis are the cohort graduation year.

ottsyd-20170116-1

Two take-aways here, I think.  The first is that the post-08 recession really affected graduates of all fields more or less equally, with employment rates falling by between 6 and 8 percentage points (the exception is humanities, where current rates are only four percentage points below where they were in 2007).  The second is that pretty much since 2001, it’s graduates in the physical sciences who have had the weakest results.

OK, but as many in the academy say: 6 months isn’t enough to judge anything.  What about employment rates after, say, 2 years?  These are shown below in Figure 2.

ottsyd-20170116-2

This graph is smoother than the previous one, which suggests the market for graduates with two years in the labour market is a lot more stable than that for graduates with just six months.  If you compare the class of 2013 with the class of 2005 (the last one to completely miss the 2008-9 recession), business and commerce students’ employment rates have fallen by only one percentage point, while those in the social sciences have dropped by six percentage points, with the others falling somewhere in between.  One definite point to note for all the STEM enthusiasts out there: there’s no evidence here that students in STEM programs have fared much better than everyone else.

But employment is one thing; income is another.  I’ll spare you the graph of income at six months because really, who cares?  I’ll just go straight to what’s happening at two years.

ottsyd-20170116-3

To be clear, what figure 3 shows is average graduate salaries two years after graduation in real dollars – that is, controlling for inflation.  And what we see here is that in all fields of study, income bops along fairly steadily until 2007 (i.e. class of 2005) at which point things change and incomes start to decline in all six subject areas.  Engineering was down, albeit only by three percent.  But income for business students was down 10%, physical sciences down 16%, and humanities, social sciences and education were down 19%, 20% and 21%, respectively.

This, I shouldn’t need to emphasize, is freaking terrible.  Actual employment rates (link to: previous) may not be down that much, but this drop in early graduate earnings is pretty disastrous for the majority of students.  Until a year or two ago I wasn’t inclined to put a lot of weight on this: average graduate earnings have always popped back after recessions.  This time seems to be different.

Now as I said yesterday, we shouldn’t be too quick to blame this on a hugely changed economy to which institutions are not responding; it’s likely that part of the fall in averages comes from allowing more students to access education in the first place.  As university graduates take up an increasing share of the right-hand side of an imaginary bell curve representing all youth, “average earnings” will naturally decline even if there’s no change in the distribution of earnings as a whole.  And the story might not be as negative if we were to take a five- or ten-year perspective on earnings.  Ross Finnie has done some excellent work showing that in the long term nearly all university graduates make a decent return (though, equally, there is evidence that students with weak starts in the labour force have lower long-term earnings as well, through a process known as “labour market scarring”).

Whatever the cause, universities (and Arts faculties in particular) have to start addressing this issue honestly.  People know in their gut that university graduates’ futures in general (and Arts graduates’ in particular) are not as rosy as they used to be.  So when the Council of Ontario Universities puts out a media release, as it did last month, patting universities on the back for a job well done with respect to graduate outcomes, it rings decidedly false.

Universities can acknowledge challenges in graduate outcomes without admitting that they are somehow at fault.  What they cannot do is pretend there isn’t a problem, or shirk taking significant steps to improve employment outcomes.

January 16

Ever-bleaker Graduate Employment Data?

So just before I quit blogging in December, the Council of Ontario Universities released its annual survey of graduate outcomes, this time of the class of 2013.  The release contained the usual platitudes: “future is bright”, “vast majority getting well-paying jobs”, etc etc.   And I suppose if one looks at a single year’s results in isolation, one can make that case.  But a look at longer-term trends suggests cause for concern.

These surveys began at the behest of the provincial government seventeen years ago.  Every graduating cohort is surveyed twice: once six months after graduation and once two years after graduation.  Students are asked questions about their employment status, their income and about the level of relationship between their job and their education.  COU publishes only high-level aggregate data, so we don’t know about things like response rates, but the ministry seems pleased enough by data quality, so I assume it’s within industry standards.

Figure 1 shows employment rates of graduates six months and two years out.  At the two-year check point, employment rates fell by four points in the immediate wake of the 2008-9 recession (be careful in reading the chart: the x-axis is the graduating class, not the year of the survey, so the line turns down in 2006 because that’s the group that was surveyed in 2008).  Since then, they have recovered by a little more than a point and a half, though further recovery seems stalled.  At the six-month point, things are much worse.  Though employment rates at this point are no longer falling, they remain stubbornly seven percentage points below where they were pre-recession.

Figure 1: Employment Rates, Ontario University Graduates, 6 Months and 2 Years Out, by Graduating Class, 1996-2013

OTTSYD 20170115-1

If you want to paint a good story here, it’s that employment rates at two years out are still within three percentage points of their all-time peak, which isn’t terrible.  But there doesn’t seem much doubt that students are on average taking a bit longer to “launch” than they used to; employment rates six months out seem to have hit a new, and permanently lower, floor.

Now, take a look at what’s happening to starting salaries.  As with the previous graph, I show results at both the six-month and the two-year mark.

 

Figure 2: Average salaries, Ontario University Graduates, 6 Months and 2 Years Out, by Graduating Class, 1996-2013, in $2016

OTTSYD 20170115-2

What we see in Figure 2 is the following:  holding inflation constant, during the late 1990s, recent graduates saw their incomes grow at a reasonably rapid clip.  For most of the 2000s, income was pretty steady for graduates two years out (less so six months out).  But since the 2008 recession, incomes have been falling steadily for several years; unlike the situation with employment rates, we have yet to see a floor, let alone a bounceback.  Real average incomes of the class of 2013 six months after graduation were 11% lower than those of the class of 2005 (the last fully pre-recession graduating class); at 2 years out the gap was 13%.  Somehow these points did not make it into the COU release.

That, frankly, is not good.  But it seems to me that we need to hold on a little bit before hitting panic buttons about universities being a bad deal, not being relevant to a shifting labour market, etc., etc.  Sure, the drop-off in both employment rates and incomes started around the time of the recession, so it’s easy to create a narrative around a changed economy/new normal.  But there’s something else that’s probably playing a role, and that’s an increase in the supply of graduates.

 

Figure 3: Number of Undergraduate Degrees Awarded, Ontario, 1999-2013

OTTSYD 20170115-3

The other big event we need to control for here is the massive expansion of access to higher education.  In 2003, the “double-cohort” arrived on campus and forced government to expand institutional capacity, which did not subsequently shrink.  Compared to the year 2000, the number of graduates has increased by over 50%; such an expansion of supply must have had some effect on average outcomes.  It’s not simply that there are more students competing for jobs – something one would naturally assume would place downward pressure on wages – but also that the average academic ability of graduates has probably dropped somewhat.  Where once graduates represented the top 20% of a cohort in terms of academic ability, now they probably represent the top 30% or so.  Assuming one’s marginal product in the labour market is at least loosely tied to academic ability, that alone would predict a drop in average post-graduation incomes.  To really get a sense of what, if anything, has changed in terms of how higher education affects individuals’ fortunes in the labour market, you’d want to compare not average income vs. average income, but the 66th percentile of income now vs. the 50th percentile of income fifteen years ago.  Over to you, COU, since you could make the microdata public if you wanted to.
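The composition effect described above is easy to demonstrate with a toy simulation (illustrative numbers only, not Ontario data): draw one fixed earnings-potential distribution for a youth cohort, then compare the average for a “top 20%” intake against a “top 30%” intake.

```python
import random
import statistics

# Toy illustration (not real data): earnings potential for a youth cohort,
# drawn once from a single fixed underlying distribution.
random.seed(42)
cohort = sorted(random.gauss(50_000, 12_000) for _ in range(100_000))

# "Graduates" are the top slice of the cohort by (ability-linked)
# earnings potential.  Compare a top-20% intake with a top-30% intake.
top20 = cohort[int(0.80 * len(cohort)):]
top30 = cohort[int(0.70 * len(cohort)):]

mean20 = statistics.mean(top20)
mean30 = statistics.mean(top30)

print(f"Average graduate earnings, top 20% intake: {mean20:,.0f}")
print(f"Average graduate earnings, top 30% intake: {mean30:,.0f}")
# The broader intake has a lower *graduate* average even though the
# underlying cohort distribution never changed.
```

The broader the intake, the lower the graduate average, with no change at all in the economy the simulation represents; that is exactly why comparing average-to-average over an access expansion overstates any decline.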

In short, don’t let institutions off the hook on this, but recognize that some of this was bound to happen anyway because of access trends.

More graduate income data fun tomorrow.

January 13

Restore the NGS!

One of the best things that Statistics Canada ever did in the higher education field was the National Graduates’ Survey (NGS). OK, it wasn’t entirely Statscan – NGS has never been a core product funded from the Statscan budget but rather funded periodically by Employment and Social Development Canada (ESDC) or HRDC or HRSDC or whatever earlier version of the department you care to name – but they were the ones doing the execution. After a trial run in the late 1970s (the results of which I outlined back here), Statscan tracked the results of the graduating cohorts of 1982, 1986, 1990, 1995, 2000 and 2005 two and five years after graduation (technically, only the 2-year was called NGS – the 5-year survey was called the Follow-up of Graduates or FOG but no one used the name because it was too goofy). It became the prime way Canada tracked transitions from post-secondary education to the labour market, and also issues related to student debt.

Now NGS was never a perfect instrument. Most of the income data could have been obtained much more simply through administrative records, the way Ross Finnie is currently doing at EPRI. We could get better data on student debt if provinces ever got their act together and actually released student data on a consistent and regular basis (I’m told there is some chance of this happening in the near future). It didn’t ask enough questions about activities in school, and so couldn’t examine the effects of differences in provision (except for, say, field of study) on later outcomes. But for all that it was still a decent survey, and more to the point one with a long history, which allowed one to make solid comparisons over time.

Then along come the budget-cutting exercises of the Harper government. ESDC decides it only has enough money for one survey, not two. Had Statscan or ESDC bothered to consult anyone about what to do in this situation, the answer would almost certainly have been: keep the 2-year survey and ditch the 5-year one. The 5-year survey was always beset with the twin problems of iffy response rates and being instantly out of date by the time it came out (“that was seven graduating classes ago!” people would say – “what about today’s graduates?”). But the 2-year? That was gold, with a decent time series going back (in some topic areas) almost 30 years. Don’t touch that, we all would have said, FOR GOD’S SAKE DON’T TOUCH IT, LEAVE IT AS IT IS.

But of course, Statscan and ESDC didn’t consult and they didn’t leave it alone. Instead of sticking with a survey two years out, they decided to survey students three years out, thereby making the results for labour market transitions totally incompatible with the previous six iterations of the survey. They spent millions to get a whole bunch of data that was hugely sub-optimal, because they murdered a perfectly good time series to get it.

I have never heard a satisfactory explanation as to why this happened. I think it’s either a) someone said: “hey, if we’re ditching a 2-year and a 5-year survey, why not compromise and make a single 3-year survey?” or b) Statscan drew a sample frame from institutions for the 2010 graduating class, ESDC held up the funding until it was too late to do a two-year sample and then when it eventually came through Statscan said, “well we already have a frame for 2010, so why not sample them three years out instead of doing the sensible thing and going back and getting a new frame for the 2011 cohort which would allow us to sample two years out”. To be clear, both of these possible reasons are ludicrous and utterly indefensible as a way to proceed with a valuable dataset, albeit in different ways. But this is Ottawa so anything is possible.

I have yet to hear anything about what, if anything, Statscan and ESDC plan to do about surveying the graduating cohort of 2015. If they were going to return to a two-year cycle, surveying would have to happen this spring; if they’re planning on sticking with three, the survey would happen in spring 2018. But here’s my modest proposal: there is nothing more important about NGS than bringing back the 2-year survey frame. Nothing at all. Whatever it takes, do it two years out. If that means surveying the class of 2016 instead of 2015, do it. We’ll forget the class of 2010 survey ever happened. Do not, under any circumstances, try to build a new standard based on a 3-year frame. We spent 30 years building a good time series at 24 months out from graduation. Better to have a one-cycle gap in that time series than to spend another 30 years building up an equally good time series at 36 months from graduation.

Please, Statscan. Don’t mess this up.

January 12

Post-Brexit Options

One highly amusing by-product of the frantic Canada-EU-Walloon trade negotiation finale last fall was watching the UK government suddenly realize that negotiating agreements with a 27-country trade bloc is actually really difficult and that this Brexit thing is almost certainly not going to end well.  Which of course has some reasonably significant implications for UK universities.  But how exposed are UK universities to Brexit?

Arguably, the bigger post-Brexit implications have to do with staff who may be denied residency, future staff who won’t be allowed entry and broken research partnerships with EU-funded colleagues on the continent.  But I’m going to limit my analysis here to the student intake because it’s a little easier to quantify.

Let’s look at what’s at stake for the UK in terms of international student numbers.

UK International Student Numbers by Country of Origin


ottsyd-20170111-1

Source: UK Council for International Student Affairs. EU shown in red, non-EU in blue

Somewhat surprisingly (to me at least), only about 30% of the UK’s international student body comes from the EU, with Germany and France the largest source countries.  That’s about 125,000 students, paying roughly £9,000 per year, so that’s a £1.1B hit to the sector.  That sounds big (and of course it’s nothing to be sneezed at), but in a sector worth around £33B, it’s not *that* crucial.

Now, how much of this money would institutions actually give up if Brexit goes through?  That’s still a big unknown, because it depends on how many foreigners will be allowed to get visas post-2019 and whether or not students will be counted within the cap.  For the past few years – since now-PM Theresa May became Home Secretary in 2010, in fact – the Home Office has been including non-EU students in the cap, and as a result international student numbers have been falling for some time and are now about a third lower than they were before the Cameron government took office.  A similar result with EU students would mean a loss of about £400 million to the sector.

But, say some, that’s without accounting for any loss from higher tuition fees.  Pre-Brexit, EU students pay what domestic students pay.  Post-Brexit, they will in theory pay a higher “international” fee.  These fees depend on the type of course undertaken: they average £13,394 for lecture-based programs, £15,034 for laboratory-based programs and £24,169 for clinical disciplines (see here for more details).  Some feel that a shift to these higher fees may deter even more students.  Frankly, this is a weak argument: if institutions really want foreign students, they can lower the fees (the bigger threat is probably these students’ loss of access to UK student loans, without which many might find even the current fees a struggle to bear).  And anyways, these higher fees mean that if UK universities only lost 1/3 of their EU students, they’d actually be up on the deal thanks to higher tuition rates.
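A quick back-of-the-envelope on the figures above makes the point (rounded numbers from the post; I use the lecture-based fee as the conservative case, so the blended average across lab and clinical courses would come out higher):

```python
# Back-of-envelope on the figures cited above (fees in GBP per year).
eu_students = 125_000
home_fee = 9_000            # current (pre-Brexit) fee EU students pay
intl_fee_lecture = 13_394   # average international fee, lecture-based courses

revenue_now = eu_students * home_fee          # roughly the GBP 1.1bn at stake

# Scenario in the text: lose a third of EU students, but charge
# the remaining two-thirds the higher international rate.
retained = eu_students * (2 / 3)
revenue_after = retained * intl_fee_lecture

print(f"Current EU fee revenue: GBP {revenue_now / 1e9:.2f}bn")
print(f"Post-Brexit scenario:   GBP {revenue_after / 1e9:.2f}bn")

# Loss rate at which the higher fee exactly offsets the lost students:
# retained_fraction * intl_fee = home_fee  =>  loss = 1 - home_fee/intl_fee
break_even_loss = 1 - home_fee / intl_fee_lecture
print(f"Break-even loss rate: {break_even_loss:.0%}")
```

At the lecture-based rate alone, losing a third of students is roughly revenue-neutral; any mix that includes lab or clinical fees tips the scenario into a gain, which is the sense in which universities would be “up on the deal.”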

Anyways, as you can tell, I’m not convinced that the loss of EU students is in fact a major challenge to the UK higher ed sector, though obviously it might be to specific universities that are overweight in this group.  It certainly makes you wonder why some institutions are musing about creating “overseas” campuses inside the EU (see here, here).  The answer, primarily, is that these proposed campuses are about trying to get around research collaboration barriers more than they are about gaining student numbers.  I can’t actually imagine many EU countries (or the EU itself) would be daft enough to leave such loopholes open, but you never know.  In any event, branch campuses are high-cost, high-risk, and for students tend to be very much second choice to home institutions.  If there are a lot of frustrated wannabe-English students in Europe as a result of Brexit, they’re probably likelier to head to Ireland or North America than to University of East Anglia – Lens, or University of Chichester-Malmo.

In short, the student-side of Brexit should be a lot less concerning than the staff side of Brexit.
