Higher Education Strategy Associates

Tag Archives: employment

May 16

Jobs: Hot and Not-So-Hot

Remember when everyone was freaking out because there were too many sociology graduates and not enough welders? When otherwise serious people like Ken Coates complained about the labour market being distorted by the uninformed choices of 17- to 19-year-olds? 2015 seems like a long time ago.

Just for fun the other day, I decided to look at which occupations have fared best and worst in Canada over the past ten years (OK, I grant you my definition of fun may not be universal). Using public data, the most granular categories I can look at are two-digit National Occupational Classification (NOC) codes, so some of these categories are kind of broad. But anyway, here are the results:

Table 1: Fastest-growing occupations in Canada, 2007-2017


See any trades in there? No, me neither. Four of the top ten fastest-growing occupations are health-related in one way or another. There are two sets of professional jobs – law/social/community/government services (which includes educational consultants, btw) and natural/applied sciences – which pretty clearly require bachelor's if not master's degrees. There are three other categories (admin/financial supervisors, technical occupations in art, and paraprofessional occupations in legal, social, etc.) which have a hodgepodge of educational requirements but on balance probably contain more college than university graduates. And then there is the category of retail sales supervisors and specialized sales occupations, which takes in everything from head cashiers to real estate agents and aircraft sales representatives. Hard to know what to make of that one. But the other nine all seem to require training which sits pretty squarely within traditional post-secondary specialties.

Now, what about the ten worst-performing occupations?

Table 2: Fastest-shrinking occupations in Canada, 2007-2017

This is an interesting grab bag. I'm fairly sure, given the amount of whining about managerialism one hears these days, that it will be a surprise to most people that the single worst-performing job sector in Canada is "senior management occupations". It's probably less of a surprise that four of the bottom ten occupations are manufacturing-related, and that two others which are highly susceptible to automation – distribution, tracking and scheduling occupations, and office support occupations – are there, too. But interestingly, almost none of these occupations, bar senior managers, have significant numbers of university graduates in them. Many wouldn't necessarily have a lot of college graduates either, at least outside the manufacturing and resource sectors.
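
(For the technically inclined: the ranking itself is trivial to reproduce. Here's a minimal sketch in Python. The file name and column names are invented for illustration; the real source would be something like Statistics Canada's public employment-by-occupation tables.)

    import pandas as pd

    # Sketch: rank ten-year employment growth by two-digit NOC category.
    # Assumes a hypothetical CSV with columns: noc2, occupation,
    # emp_2007, emp_2017 (employment counts, in thousands).
    df = pd.read_csv("noc2_employment.csv")
    df["pct_change"] = (df["emp_2017"] - df["emp_2007"]) / df["emp_2007"] * 100

    fastest = df.sort_values("pct_change", ascending=False).head(10)
    slowest = df.sort_values("pct_change").head(10)
    print(fastest[["noc2", "occupation", "pct_change"]].to_string(index=False))
    print(slowest[["noc2", "occupation", "pct_change"]].to_string(index=False))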

Allow me to hammer this point home a bit, for anyone who is inclined ever again to take Ken Coates or his ilk seriously on the subject of young people's career choices. Trades are really important in Canada. But the industries they serve are cyclical. If we counsel people to go into these areas, we need to be honest that they are going to face fat years and lean years – sometimes lasting as long as a decade at a time. On the other hand, professional occupations (nearly all requiring university study) and health occupations (a mix of university and college study) are long-term winners.

Maybe students knew that all along, and behaved accordingly.  When it comes to their own futures, they’re pretty smart, you know.


January 18

More Bleak Data, But This Time on Colleges

Everyone seems to be enjoying data on graduate outcomes, so I thought I’d keep the party going by looking at similar data from Ontario colleges. But first, some of you have written to me suggesting I should throw some caveats on what’s been covered so far. So let me get a few things out of the way.

First, I goofed when saying that there was no data on response rates from these surveys. Apparently there is and I just missed it. The rate this year was 40.1%, a figure which will make all the economists roll their eyes and start muttering about response bias, but which anyone with field experience in surveys will tell you is a pretty good response for a mail survey these days (and since the NGS response rate is now down around the 50% mark, it’s not that far off the national “gold standard”).

Second: all this data on incomes I've been giving you is a little less precise than it sounds. Technically, the Ontario surveys do not ask for income; they ask for income ranges (e.g. $0-20K, $20-40K, etc.). When the data is published, either by the universities or the colleges, it is turned into more precise-looking figures by assigning each response the mid-point value of its range and then averaging those points. Yes, yes, kinda dreadful. Why can't we just link this stuff to tax data the way EPRI does? Anyway, that means you should probably take the point values with a pinch of salt: but the trend lines are likely still meaningful.
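
(If it helps, here's the mid-point method in miniature – a sketch in Python. The band boundaries and respondent counts are invented, and the open-ended top band is treated as closed purely for simplicity; in practice, what midpoint you assign to it is one more arbitrary choice.)

    # Sketch: estimating mean income from banded survey responses.
    bands = {                     # (lower, upper) in dollars: respondents
        (0, 20_000): 45,
        (20_000, 40_000): 210,
        (40_000, 60_000): 320,
        (60_000, 80_000): 95,
        (80_000, 100_000): 30,    # pretend the top band is closed
    }

    total_n = sum(bands.values())
    mean_income = sum((lo + hi) / 2 * n for (lo, hi), n in bands.items()) / total_n
    print(f"Estimated mean income: ${mean_income:,.0f}")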

OK, with all that out of the way, let's turn to the issue of colleges. Unfortunately, Ontario does not collect or display data on college graduates' outcomes the way it does for universities. There is no data on income, for instance. And no data on employment two years after graduation, either. The only real point of comparison is employment six months after graduation, and even this is kind of painful: for universities the data is available only by field of study; for colleges, it is only available by institution. (I know, right?) And even then it is not calculated on quite the same basis: the university figure includes graduates with job offers, while the college figure does not. So you can't quite do an apples-to-apples comparison, even at the level of the sector as a whole. But if you ignore that last small difference in calculation and focus not on the point estimates but on the trends, you can still see something interesting. Here we go:

Figure 1: Employment Rates 6 months after Graduation, Ontario Universities vs. Ontario Colleges, by Graduating Cohort, 1999-2015


So, like I said, ignore the actual values in Figure 1, because they're calculated in two slightly different ways; instead, focus on the trends. And if you do that, what you see is that (a blip in 2015 apart) the relationship between employment rates in the college and university sectors looks pretty much the same throughout the period. Both had a wobble in the early 2000s, and then both took a big hit in the 2008 recession. Indeed, on the basis of this data, it's hard to make a case that one sector has done better than the other through the latest recession: both got creamed, and neither has yet recovered.
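
(To make that definitional difference concrete, here's a toy sketch in Python – all counts invented – showing how counting job offers in the numerator nudges a "university-style" rate above a "college-style" rate computed from identical underlying data.)

    # Sketch: two employment-rate definitions applied to the same graduates.
    def employment_rate(employed, offers, labour_force, include_offers):
        numerator = employed + (offers if include_offers else 0)
        return numerator / labour_force * 100

    employed, offers, labour_force = 820, 40, 1_000
    print(employment_rate(employed, offers, labour_force, include_offers=True))   # university-style: 86.0
    print(employment_rate(employed, offers, labour_force, include_offers=False))  # college-style: 82.0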

(Side point: why does the university line stop at 2013 while the college one goes out to 2015? Because Ontario doesn't interview university grads until two years after graduation and then asks them retrospectively what they were doing 18 months earlier. So the 2014 cohort was only interviewed last fall, and it'll be a few months until their data is released. College grads *only* get interviewed at six months, so their data comes out much more quickly.)

What this actually does is put a big dent in the argument that the problem for youth employment is out-of-touch educators, changing skill profiles, sociologists vs. welders, and all that other tosh people were talking a few years ago. We're just having more trouble than we used to integrating graduates into the labour market. And I'll be taking a broader look at that using Labour Force Survey data tomorrow.

January 17

Another Lens on Bleak Graduate Income Data

So, yesterday we looked at Ontario university graduate employment data. Today I want to zero in a little bit on what's happening by field of study.

(I can hear two objections popping up already. First: "why just Ontario?" Answer: while Quebec, Alberta, British Columbia and the Maritimes – via MPHEC – all publish similar data, they all publish it in slightly different ways, making it irritating (and in some cases impossible) to come up with a composite national figure. The National Graduate Survey (NGS) in theory does this, but only every five years, and as I explained last week it has made itself irrelevant by changing the survey period. So, in short, I can't do national, and Ontario a) is nearly half the country in terms of university enrolments and b) publishes slightly more detailed data than most. Second: "why just universities?" Answer: "fair point, I'll be publishing that data soon".

Everyone clear? OK, let’s keep going).

Let's look first at employment rates six months after graduation by field of study (I include only the six largest – Business/Commerce, Education, Engineering, Humanities, Physical Sciences and Social Sciences – because otherwise these graphs would be an utter mess), shown below in Figure 1. As was the case yesterday, the dates along the x-axis are cohort graduation years.

[Figure 1: Employment rates six months after graduation, by field of study, Ontario universities]

Two take-aways here, I think. The first is that the post-2008 recession affected graduates of all fields more or less equally, with employment rates falling by between six and eight percentage points (the exception is humanities, where current rates are only four percentage points below where they were in 2007). The second is that, pretty much since 2001, it's graduates in the physical sciences who have had the weakest results.

OK, but as many in the academy say: six months isn't enough to judge anything. What about employment rates after, say, two years? These are shown below in Figure 2.

[Figure 2: Employment rates two years after graduation, by field of study, Ontario universities]

This graph is smoother than the previous one, which suggests the market for graduates with two years in the labour market is a lot more stable than that for graduates with just six months. If you compare the class of 2013 with the class of 2005 (the last one to completely miss the 2008-09 recession), business and commerce students' employment rates have fallen by only one percentage point, while those in the social sciences have dropped by six percentage points, with the others falling somewhere in between. One definite point to note for all the STEM enthusiasts out there: there's no evidence here that students in STEM programs have fared much better than everyone else.

But employment is one thing; income is another.  I’ll spare you the graph of income at six months because really, who cares?  I’ll just go straight to what’s happening at two years.

[Figure 3: Average salaries two years after graduation, in real dollars, by field of study, Ontario universities]

To be clear, what Figure 3 shows is average graduate salaries two years after graduation in real dollars – that is, controlling for inflation. And what we see here is that in all fields of study, income bops along fairly steadily until 2007 (i.e. the class of 2005), at which point things change and incomes start to decline in all six subject areas. Engineering was down, albeit only by 3%. But income for business students was down 10%, physical sciences down 16%, and humanities, social sciences and education were down 19%, 20% and 21%, respectively.
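
(For anyone unsure what "real dollars" means in practice: you deflate each cohort's nominal salaries by a price index so they are all expressed in a common year's dollars. A minimal sketch in Python; the index values are roughly in line with Statistics Canada's all-items CPI but are used here purely for illustration.)

    # Sketch: converting nominal salaries to real (base-year) dollars.
    cpi = {2007: 111.5, 2015: 126.6}   # illustrative index values
    BASE_YEAR = 2015

    def to_real(nominal, year):
        # Express a salary earned in `year` in BASE_YEAR dollars.
        return nominal * cpi[BASE_YEAR] / cpi[year]

    print(round(to_real(45_000, 2007)))  # a 2007 salary of $45,000 -> ~$51,094 in 2015 dollars
    print(round(to_real(45_000, 2015)))  # 45000, unchanged in the base year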

This, I shouldn't need to emphasize, is freaking terrible. Actual employment rates may not be down that much, but this drop in early graduate earnings is pretty disastrous for the majority of students. Until a year or two ago I wasn't inclined to put a lot of weight on this: average graduate earnings have always popped back after recessions. This time seems to be different.

Now, as I said yesterday, we shouldn't be too quick to blame this on huge changes in the economy to which institutions are not responding; it's likely that part of the fall in averages comes from allowing more students to access education in the first place. As university graduates take up an increasing share of the right-hand side of an imaginary bell curve representing the earnings of all youth, "average graduate earnings" will naturally decline even if there's no change in the average or distribution of earnings as a whole. And the story might not be as negative if we were to take a five- or ten-year perspective on earnings. Ross Finnie has done some excellent work showing that in the long term nearly all university graduates make a decent return (though, equally, there is evidence that students with weak starts in the labour force have lower long-term earnings as well, through a process known as "labour market scarring").
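
(That compositional point is easy to demonstrate with a toy simulation – everything below is invented for illustration. Hold the earnings of all youth fixed, and let "graduates" expand from the top quarter to the top 40% of that fixed distribution: the graduate average falls even though nobody's earnings changed.)

    import numpy as np

    # Sketch: access expands, graduate averages fall, distribution unchanged.
    rng = np.random.default_rng(42)
    earnings = rng.normal(50_000, 15_000, 1_000_000)  # all youth, held fixed

    for share in (0.25, 0.40):
        cutoff = np.quantile(earnings, 1 - share)
        grads = earnings[earnings >= cutoff]
        print(f"graduates = top {share:.0%} -> mean ${grads.mean():,.0f}")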

Whatever the cause, universities (and Arts faculties in particular) have to start addressing this issue honestly. People know in their gut that university graduates' futures in general (and Arts graduates' in particular) are not as rosy as they used to be. So when the Council of Ontario Universities puts out a media release, as it did last month, patting universities on the back for a job well done with respect to graduate outcomes, it rings decidedly false.

Universities can acknowledge challenges in graduate outcomes without admitting that they are somehow at fault. What they cannot do is pretend there isn't a problem, or shirk taking significant steps to improve employment outcomes.

September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it's pretty innocuous ("we need better/clearer data") but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole "national approach" thing. Coates and Auld don't make the argument directly, but they manage to slip a federal role in there. Canada, they write, "lacks a commitment to truly high-level educational accomplishment", needs a "national strategy for higher education improvement", and so "the Government of Canada and its provincial and territorial partners should identify some useful outcomes". To be blunt: no, they shouldn't. I know there is a species of anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the Constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates and Auld's point about the need for better data applies to provinces individually as well as collectively. They all need to get into the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here's where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It's an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronetcies united by grievances about parking and teaching loads that they really are). But when you look at things like scholarly output, graduation rates and career outcomes, the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has a medical program and large graduate programs in the hard sciences, it will have greater scholarly output. It's the palette of program offerings, rather than their quality, which makes the difference in inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we'd find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find them looking rather similar. Does that mean there's been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada, at least) location matters a lot more than educational quality when you're dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his Education Policy Research Initiative (EPRI) showed big inter-institutional gaps in graduates' incomes (I will get round to covering Ross's excellent work on the blog soon, I promise). "Ah, interesting!" people said. "Look at the inter-institutional differences! Now we can talk quality!" Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (I know because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).
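
(One crude way to see what "taking inputs into account" might look like: regress the outcome on an input measure and compare institutions on the residual rather than the raw number. The sketch below, in Python with invented data, is a toy value-added adjustment – not EPRI's method, just the general idea.)

    import numpy as np

    # Sketch: compare institutions on outcome-minus-expected, not raw outcome.
    entering_grade = np.array([78, 85, 91, 82, 88])                   # input measure
    grad_income = np.array([46_000, 52_000, 58_000, 50_000, 53_000])  # outcome

    b, a = np.polyfit(entering_grade, grad_income, 1)  # income = a + b*grade
    value_added = grad_income - (a + b * entering_grade)

    for i, va in enumerate(value_added):
        print(f"institution {i}: raw ${grad_income[i]:,}, adjusted {va:+,.0f}")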

Now, maybe Coates and Auld have thought all this through and I'm getting nervous for no reason, but their article's focus on institutional performance, when most relevant outcomes are driven by geography, program mix and selectivity, suggests to me that there's a desire here to impose some simple rough justice on some pretty complicated cause-and-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but "grading" institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.