HESA

Higher Education Strategy Associates

Category Archives: PSE Outcomes

June 12

The Nordstrom Philologist

People are always nattering on about skills for the new economy, but apart from some truly unhelpful ideas like “everyone should learn to code”, they are usually pretty vague on specifics about what that means.  But I think I have solved that.

What the economy needs – or more accurately, what enterprises (private and public) need – is more Nordstrom Philologists.

Let me explain.

One of the main consequences of the management revolutions of the last couple of decades has been the decline of middle-management.  But, as we are now learning, one of the key – if unacknowledged – functions of middle-management was to act as a buffer between clients and upper management on the one side, and raw new employees on the other.  By doing so, they could bring said new employees along slowly into the culture of the company, show them the ropes and hold their hands a bit as they gained in confidence and ability in dealing with new and unfamiliar situations.

But that’s gone at many companies now.  New employees are now much more likely to be thrown headfirst into challenging situations.  They are more likely to be dealing with clients directly, which of course means they have greater responsibility for the firm’s reputation and its bottom line.  They are also more likely to have to report directly to upper management, which requires a level of communication skills and overall maturity which many don’t have.

When employers say young hires “lack skills”, this is what they are talking about.  Very few complain that the “hard skills” – technical skills related to the specific job – are missing. Rather, what they are saying is they lack the skills to deal with clients and upper management.  And broadly, what that means is, they can’t communicate well and they can’t figure out how to operate independently without being at the (senior) boss’ door every few minutes asking “what should I do now”?

When it comes to customer service, everyone knows Nordstrom is king.  And a large part of that has to do with its staff and its commitment to customer care.  Communications are at the centre of what Nordstrom does, but it’s not communicating to clients; rather, it’s listening to them.  Really listening, I mean: understanding what clients actually want, rather than just what they ask for.  And then finding ways to make sure they get what they need.  That’s what makes clients and/or citizens feel valued.  And it’s what the best employees know how to provide.

And then there’s philology* – the study of written texts.  We don’t talk much about this discipline anymore in North America, since its constituent parts have been partitioned into history, linguistics, religious studies and a tiny little bit into art history (in continental Europe it retains a certain credibility as an independent discipline).  The discipline consists essentially in constructing plausible hypotheses from extremely fragmentary information: who wrote the Dead Sea scrolls?  Are those Hitler diaries real?  And so on.   It’s about understanding cultural contexts, piecing together clues.

Which is an awful lot like day-to-day business.  There’s no possible way to learn how to behave in every situation, particularly when the environment is changing rapidly.  Being effective in the workplace is to a large degree about developing modes of understanding and action based on some simple heuristics and a constant re-evaluation of options as new data becomes available.  And philology, the ultimate “figure it out for yourself” discipline, is excellent training for it (history is a reasonably close second).

That’s pretty much it.  Nordstrom for the really-listening-to-client skills, philology for the figuring-it-out-on-your-own-and-getting-stuff-done skills.  Doesn’t matter what line of business you’re in, these are the competencies employers need.  And similarly, it doesn’t matter what field of study is being taught, these are the elements that need to be slipped into the curriculum.

*(On the off-chance you want to know more about philology, you could do a lot worse than James Turner’s Philology: The Forgotten Origins of the Modern Humanities.  Quite a useful piece on the history of thought). 

May 16

Jobs: Hot and Not-So-Hot

Remember when everyone was freaking out because there were too many sociology graduates and not enough welders?  When otherwise serious people like Ken Coates complained about the labour market being distorted by the uninformed choices of 17-19 year-olds?  2015 seems like a long time ago.

Just for fun the other day, I decided to look at which occupations have fared best and worst in Canada over the past ten years (ok, I grant you my definition of fun may not be universal).  Using public data, the most granular data I can look at are two-digit National Occupation Codes, so some of these categories are kind of broad.  But anyway, here are the results:

Table 1: Fastest-growing occupations in Canada, 2007-2017


See any trades in there?  No, me neither.  Four out of the top ten fastest-growing occupations are health-related in one way or another.  There are two sets of professional jobs – law/social/community/government services (which includes educational consultants, btw) and natural/applied sciences – which pretty clearly require bachelor’s if not master’s degrees.  There are three other categories (Admin/financial supervisors, Technical occupations in art, and paraprofessional occupations in legal, social, etc) which have a hodgepodge of educational requirements but on balance probably have more college than university graduates.   And then there is the category retail sales supervisors and specialized sales occupations, which takes in everything from head cashiers to real estate agents and aircraft sales representatives.  Hard to know what to make of that one.  But the other nine all seem to require training which is pretty squarely in traditional post-secondary education specialties.

Now, what about the ten worst-performing occupations?

Table 2: Fastest-shrinking Occupations in Canada 2007-2017

This is an interesting grab bag.  I’m fairly sure, given the amount of whining about managerialism one hears these days, that it will be a surprise to most people that the single worst-performing job sector in Canada is “senior management occupations”.  It’s probably less of a surprise that four of the bottom ten occupations are manufacturing-related, and that two others which are highly susceptible to automation – Distribution, Tracking and Scheduling, and Office Support Occupations – are there, too.  But interestingly, almost none of these occupations, bar senior managers, have significant numbers of university graduates in them.  Many wouldn’t necessarily have a lot of college graduates either, at least outside the manufacturing and resources sectors.

Allow me to hammer this point home a bit, for anyone who is inclined to ever again take Ken Coates or his ilk seriously on the subject of young people’s career choices.  Trades are really important in Canada.  But the industries they serve are cyclical.  If we counsel people to go into these areas, we need to be honest that people in these areas are going to have fat years and lean years – sometimes lasting as long as a decade at a time.  On the other hand, professional occupations (nearly all requiring university study) and health occupations (a mix of university and college study) are long-term winners.

Maybe students knew that all along, and behaved accordingly.  When it comes to their own futures, they’re pretty smart, you know.

 

May 10

Why Education in IT Fields is Different

A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth.  (It’s brilliant.  Read it).  It’s an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry.  And the answer, it turns out, is that despite all the investment in capital which permitted vast jumps in labour productivity, in fact wages didn’t rise that much at all.  Like, for about fifty years.

Sound familiar?

What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution.  And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills.  It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers.  Human capital was highly firm-specific.

The upshot of all this: in fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through “learning by doing”.  There’s simply no way to learn the skills in advance.  That meant that workers had lower levels of bargaining power, because they couldn’t necessarily use the skills acquired at one job at another.  It also meant, not to put too fine a point on it, that formal education became much less important compared to “learning by doing”.

The equivalent industry today is Information Technology.  Changes in the industry happen so quickly that it’s difficult for institutions to provide relevant training; it’s still to a large extent a “learning by doing” field.  Yet, oddly, the preoccupation among governments and universities is: “how do we make more tech graduates”?

The thing is, it’s not 100% clear the industry even wants more graduates.  It just wants more skills.  If you look at how community colleges and polytechnics interact with the IT industry, it’s often through the creation of single courses which are designed in response to very specific skill needs.  And what’s interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field.  That means that these college IT courses are true “microcredentials” in the sense that they are short, potentially stackable, and have recognized labour market value.  Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding “bootcamps” attempt to replicate this with varying degrees of success, though since they are usually starting with people from outside the industry, it’s not as clear that the credentials they offer are viewed the same way by industry).

Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT.  The problem is most people are not in IT.  Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries.  You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages.  There is simply no labour-market value to disaggregating a nursing credential, so why bother?

And so the lesson here is this: IT work is a pretty specific type of work in which much store is put in learning-by-doing and formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials.  But most of the world of work doesn’t work that way.  And as a result, it’s important not to over-generalize future trends in education based on what happens to work in IT.  It’s sui generis.

Let tech be tech.  And let everything else be everything else.  Applying tech “solutions” to non-tech “problems” isn’t likely to end well.

April 05

Student/Graduate Survey Data

This is my last thought on data for a while, I promise.  But I want to talk a little bit today about the increasing misuse of student and graduate surveys.

Back about 15 years ago, the relevant technology for email surveys became sufficiently cheap and ubiquitous that everyone started using them.  I mean, everyone.  So what has happened over the last decade and a half has been a proliferation of surveys and with it – surprise, surprise – a steady decline in survey response rates.  We know that these low-participation surveys (nearly all are below 50%, and most are below 35%) are reliable, in the sense that they give us similar results year after year.  But we have no idea whether they are accurate, because we have no way of dealing with response bias.

Now, every once in a while you get someone with the cockamamie idea that the way to deal with low response rates is to expand the sample.  Remember how we all laughed at Tony Clement when he claimed the (voluntary) National Household Survey would be better than the (mandatory) Long-Form Census because the sample size would be larger?  Fun times.  But this is effectively what governments do when they decide – as the Ontario government did in the case of its sexual assault survey – to carry out what amounts to a (voluntary) student census.

So we have a problem: even as we want to make policy on a more data-informed basis, we face the problem that the quality of student data is decreasing (this also goes for graduate surveys, but I’ll come back to those in a second).  Fortunately, there is an answer to this problem: interview fewer students, but pay them.

What every institution should do – and frankly what every government should do as well – is create a balanced, stratified panel of about 1000 students.   And it should pay them maybe $10/survey to complete surveys throughout the year.  That way, you’d have good response rates from a panel that actually represented the student body well, as opposed to the crapshoot which currently reigns.  Want accurate data on student satisfaction, library/IT usage, incidence of sexual assault/harassment?  This is the way to do it.  And you’d also be doing the rest of your student body a favour by not spamming them with questionnaires they don’t want.
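Mechanically, building such a panel is the easy part.  Here is a rough sketch in Python of proportional allocation across strata – the strata and enrolment numbers are entirely invented for illustration:

```python
# Proportional allocation of a 1,000-student panel across strata,
# using the largest-remainder method so the total comes out exact.
# Strata and enrolment figures below are purely illustrative.
enrolment = {
    ("Arts", "Years 1-2"): 6000,
    ("Arts", "Years 3-4"): 4500,
    ("Science", "Years 1-2"): 5000,
    ("Science", "Years 3-4"): 3500,
    ("Graduate", "All"): 6000,
}

PANEL_SIZE = 1000

def allocate(enrolment, panel_size):
    total = sum(enrolment.values())
    # Exact (fractional) quota for each stratum.
    quotas = {k: v * panel_size / total for k, v in enrolment.items()}
    # Round down, then hand remaining seats to the largest remainders.
    alloc = {k: int(q) for k, q in quotas.items()}
    shortfall = panel_size - sum(alloc.values())
    by_remainder = sorted(quotas, key=lambda k: quotas[k] - int(quotas[k]),
                          reverse=True)
    for k in by_remainder[:shortfall]:
        alloc[k] += 1
    return alloc

panel = allocate(enrolment, PANEL_SIZE)
```

At $10 a survey, ten surveys a year, that panel costs about $100,000 annually – which is the point of the parenthetical below.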

(Costly?  Yes.  Good data ain’t free.  Institutions that care about good data will suck it up).

It’s a slightly different story for graduate surveys.  Here, you also have a problem of response rates, but with the caveat that at least as far as employment and income data is concerned, we aren’t going to have that problem for much longer.  You may be aware of Ross Finnie’s work linking student data to tax data to work out long-term income paths.  An increasing number of institutions are now doing this, as indeed is Statistics Canada for future versions of its National Graduate Survey (I give Statscan hell, deservedly, but for this they deserve kudos).

So now that we’re going to have excellent, up-to-date data on employment and income, we can re-orient our whole approach to graduate surveys.  We can move away from attempted censuses with a couple of not-totally-convincing questions about employment and re-shape them into what they should be: much more qualitative explorations of graduate pathways.  Give me a stratified sample of 2,000 graduates explaining in detail how they went from being a student to having a career (or not) three years later, rather than asking 50,000 graduates a closed-ended question about whether their job is “related” to their education, any day of the week.  The latter is a boring box-checking exercise; the former offers the potential for real understanding and improvement.

(And yeah, again: pay your survey respondents for their time.  The American Department of Education does it on their surveys and they get great data.)

Bottom line: We need to get serious about ending the Tony Clement-icization of student/graduate data. That means getting serious about constructing better samples, incentivizing participation, and asking better questions (particularly of graduates).  And there’s no time like the present. If anyone wants to get serious about this discussion, let me know: I’d be overjoyed to help.

January 26

An Amazing Statscan Skills Study

I’ve been hard on Statscan lately because of their mostly-inexcusable data collection practices.  But every once in awhile the organization redeems itself.  This week, that redemption takes the form of an Analytical Studies Branch research paper by Marc Frenette and Kristyn Frank entitled Do Postsecondary Graduates Land High-Skilled Jobs?  The implications of this paper are pretty significant, but also nuanced and susceptible to over-interpretation.  So let’s go over in detail what this paper’s about.

The key question Frenette & Frank are answering is “what kinds of skills are required in the jobs in which recent graduates (defined operationally here as Canadians aged 25-34 with post-secondary credentials) find themselves”.  This is not, to be clear, an attempt to measure what skills these students possess; rather it is an attempt to see what skills their jobs require.  Two different things.  People might end up in jobs requiring skills they don’t have; alternatively, they may end up in jobs which demand fewer skills than the ones they possess.  Keep that definition in mind as you read.

The data source Frenette & Frank use is something called the Occupational Information Network (O*NET), which was developed by the US Department of Labor.  Basically, the Department spends ages interviewing employees, employers, and occupational analysts to work out the skill levels typically required in hundreds of different occupations.  For the purpose of this paper, the skills analyzed and rated include reading, writing, math, science, problem solving, social skills, technical operation, technical design and analysis, and resource management (i.e. management of money and people).  Frenette & Frank then take all that data and transpose it onto Canadian occupational definitions, so they can assign skill levels along nine different dimensions to each Canadian occupation.  Then they use National Household Survey data (yes, yes, I know) to look at post-secondary graduates and what kind of occupations they have.  On this basis, at the level of the individual, they can link highest credential received to the skills required in the graduate’s occupation.  Multiply that over a couple of million Canadians and Frenette and Frank have themselves one heck of a database.
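For the mechanically minded: the linkage step amounts to joining occupation-level skill ratings onto individual graduate records by occupation code.  A toy sketch in Python – the codes, fields and scores below are invented for illustration, not actual O*NET values or real NOC codes:

```python
# Attaching occupation-level skill ratings to individual records by
# occupation code, in the spirit of the Frenette & Frank linkage.
# All codes and scores are invented placeholders.
skill_by_occupation = {
    "2147": {"reading": 4.5, "writing": 4.0, "math": 4.8},  # e.g. an engineering NOC
    "4011": {"reading": 4.2, "writing": 4.4, "math": 3.1},  # e.g. a teaching NOC
}

graduates = [
    {"id": 1, "credential": "BEng", "noc": "2147"},
    {"id": 2, "credential": "PhD",  "noc": "4011"},
]

def attach_skills(graduates, skill_by_occupation):
    """Return graduate records augmented with the skill levels their job requires."""
    out = []
    for g in graduates:
        skills = skill_by_occupation.get(g["noc"], {})
        # Prefix with job_ to stress these are the job's requirements,
        # not the graduate's own measured skills.
        out.append({**g, **{f"job_{k}": v for k, v in skills.items()}})
    return out

linked = attach_skills(graduates, skill_by_occupation)
```

The `job_` prefix is the whole point of the paper’s framing: these are properties of the occupation, not of the person holding it.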

So, the first swing at analysis is to look at occupational skill requirements by level of education.   With only a couple of exceptions – technical operations being the most obvious one – these more or less all rise according to the level of education. The other really amusing exception is that apparently PhDs do not occupy/are not offered jobs which require management skills.  But it’s when they get away from level of education and move to field of study that things get really interesting.  To what extent are graduates from various fields of study employed in jobs that require,  for instance, high levels of reading comprehension or writing ability?  I reproduce Frenette & Frank’s results below.

[Figure: Reading comprehension requirements of jobs held by graduates, by field of study]

Yep.  You read that right.  The jobs with the highest reading comprehension requirements are the ones held by engineering graduates.  Humanities?  Not so much.

[Figure: Writing requirements of jobs held by graduates, by field of study]

It’s pretty much the same story with writing, though business types tend to do better on that measure.

[Figure: Critical thinking requirements of jobs held by graduates, by field of study]

…and critical thinking rounds out the set.

So what’s going on here?  How is it that humanities (“We teach people to think!“) get such weak scores while “mere” professional degrees like business and engineering do so well?  Well, let’s be careful about interpretation.  These charts are not saying that BEng and BCom grads are necessarily better than BA grads at reading, writing and critical thinking, though one shouldn’t rule that out.  They’re saying that BEng and BCom grads get jobs with higher reading, writing and critical thinking requirements than do BAs.  Arguably, it’s a measure of underemployment rather than a measure of skill.  I’m not sure I personally would argue that, but it is at least arguable.

But whatever field of study you’re from, there’s a lot of food for thought here.  If reading and writing are such a big deal for BEngs, should universities wanting to give their BEngs a job boost spend more time giving them communication skills?  If you’re in social sciences or humanities, what implications do these results have for curriculum design?

I know if I were a university President, these are the kinds of questions I’d be asking my deans after reading this report.

January 20

Puzzles in the youth labour market

A couple of days ago, after looking at employment patterns among recent graduates using Ontario graduate survey data, I promised a look at broader youth labour market data. I now wish I hadn’t promised that, because Statistics Canada’s CANSIM database is an ungodly mess and has got significantly worse since the last time I tried to use its data. Too little of the data on employment and income allows users to focus in by age *and* education level, and even getting details down to 5-year age brackets (e.g. 20-24, 25-29), which might be useful for looking at youth labour markets, is frustratingly difficult.

(WHY CAN’T WE HAVE NICE THINGS, STATSCAN? WHY???)

Anyways.

Ok, so let’s start by looking at employment rates. Figure 1 looks at employment rates for Canadians in the 15-19, 20-24 and 25-29 age brackets since the turn of the Millennium.

Figure 1: Employment Rates by Age-group 1999-2016


The takeaway in figure 1 is that by and large employment rates are steady except for one big hiccup in 2008-9. In that year, the employment rate for 25-29 year olds fell by about two percentage points, that for 20-24 year olds fell by three and a half percentage points, and that for 15-19 year olds by five percentage points. Not only did the size of the drop vary inversely with age, so too has subsequent performance. Employment rates for 25-29 year-olds and 20-24 year olds have held fairly steady since 2009; those for 15-19 year olds have continued to fall, and are now over seven percentage points off their 2008 peak.

Ah, you say: employment is one thing: what about hours of work? Aren’t we seeing more part-time work and less full-time work? Well, sort of.

Figure 2: Part-time Employment as a Percentage of Total Employment, by Age-group, 1999-2016


Across all age-groups, the percentage of workers who are part-time (that is, working less than 30 hours per week) rose after 2008. In the case of the 25-29 year olds, this was pretty minor, rising from 12% in 2008 to 14% today. Among the 15-19 year-olds the movement was not especially large either, rising from 70% to 74% (remember, we *want* these kids to be part-time: they’re supposed to be in school). The biggest jump was for the 20-24 group – that is, traditional-aged students and recent graduates – where the part-time share jumped from 29% to 35%. Now some of that might be due to higher university enrolment rates (workers are more likely to be part-time if they are also studying), but at least some of it is simply a push towards increased casualization of the labour market.

So far, all of this is roughly consistent with what we saw Monday through Wednesday – which is that there was a one-time shock to employment around 2008, and that the effect is much more pronounced among younger graduates (say, those 6 months out from graduation) than it is among older ones (say, those 24 months after graduation). What is not quite consistent, though, is what is happening to wages. Unfortunately, CANSIM no longer makes available any decent time series data on wages or income for the 20-24 or 25-29 age groups (one of these days I am going to have to stump up for a public use microdata file, but today is not that day). But it does offer some data on what’s going on for 15-24 year olds. Sub-optimal (25-29 would be best) but still useful. Here’s what that data looks like:

Figure 3: Hourly and Weekly Wages, 15-24 year-olds, in $2016, 2001-2016


Figure 3 shows hourly and weekly wages for 15-24 year olds, in 2016 dollars. Hourly wages (measured on the right axis) grew faster than weekly wages (measured on the left axis) because average hours worked fell by 3.5% (this is the shift to part-time work we saw in figure 2). Hourly wage growth has not been as strong since 2008 as it was between 2004 and 2008, but it is still up 6% over that time. It’s probably safe to assume that the situation for 25-29 year olds is not worse than it is for 20-24 year olds. Which means we have an interesting puzzle here: wage growth for the youth cohort as a whole is positive, at least among those who have wages – but as we saw Monday and Tuesday, wage growth is negative for university graduates. What’s going on?
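(For clarity, “in $2016” just means nominal wages deflated by the Consumer Price Index. The arithmetic, sketched in Python – the CPI values below are invented placeholders, not actual Statistics Canada figures:)

```python
# Converting a nominal hourly wage to constant 2016 dollars using a CPI
# series. The CPI values here are made up for illustration only.
cpi = {2001: 97.8, 2008: 114.1, 2016: 128.4}

def to_2016_dollars(nominal_wage, year, cpi, base_year=2016):
    """Deflate a nominal wage from `year` into base-year dollars."""
    return nominal_wage * cpi[base_year] / cpi[year]

# A $12.00/hour wage earned in 2008, restated in 2016 dollars.
real_2008_wage = to_2016_dollars(12.00, 2008, cpi)
```

Any real wage growth claim in the figure is a comparison of numbers that have all been put through this kind of adjustment first.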

There are two possibilities here. The first is that wage growth since 2008 is stronger for those without university degrees than it is for those with. With the oil/gas boom, that might have been a reasonable explanation up to 2014; it’s hard to see it still being true now. The second is the proposition advanced here earlier this week: that while university graduates may still all cluster at the right-end of the bell curve, as they encompass a greater proportion of the bell curve, the average as a whole necessarily falls.

In short: post-2008, something has happened to the labour market which makes it more difficult for young people to “launch”. We shouldn’t overstate how big this problem is: employment is down slightly, as is the proportion of employment which is full-time. But unlike previous recessions, the youth labour market does not seem to be bouncing back – these changes seem to be permanent, which is a bit disquieting. But it’s also true that these effects are more severe among the youngest: which is exactly what you’d expect if the labour market was putting a greater emphasis on skills and experience. By the time youth get to their late 20s, the effects mostly disappear.

In other words, what we are seeing is less “failure to launch” than “delays in the launch”. To the extent anything has changed, it’s that the transition to the labour market is on average a little bit rougher and a little bit slower than it used to be, but that’s likely as much to do with the expansion of access to university as it is a skills-biased change in the labour market.

January 18

More Bleak Data, But This Time on Colleges

Everyone seems to be enjoying data on graduate outcomes, so I thought I’d keep the party going by looking at similar data from Ontario colleges. But first, some of you have written to me suggesting I should throw some caveats on what’s been covered so far. So let me get a few things out of the way.

First, I goofed when saying that there was no data on response rates from these surveys. Apparently there is and I just missed it. The rate this year was 40.1%, a figure which will make all the economists roll their eyes and start muttering about response bias, but which anyone with field experience in surveys will tell you is a pretty good response for a mail survey these days (and since the NGS response rate is now down around the 50% mark, it’s not that far off the national “gold standard”).

Second: all this data on incomes I’ve been giving you is a little less precise than it sounds.  Technically, the Ontario surveys do not ask for income; they ask for income ranges (e.g. $0-20K, $20-40K, etc).  When the data is published, either by universities or the colleges, it is turned into more precise-looking figures by assigning each response the mid-point value of its range and then averaging those points.  Yes, yes, kinda dreadful.  Why can’t we just link this stuff to tax data like EPRI does?  Anyways, that means you should probably take the point values with a pinch of salt: but the trend lines are likely still meaningful.
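For the record, here is what that midpoint method amounts to, sketched in Python – the brackets and response counts are invented for illustration:

```python
# Estimating an "average income" from banded survey responses by assigning
# each bracket its midpoint, as described above. Counts are invented.
brackets = [
    ((0, 20000), 50),        # $0-20K: 50 respondents
    ((20000, 40000), 120),   # $20-40K
    ((40000, 60000), 200),   # $40-60K
    ((60000, 80000), 80),    # $60-80K
]

def midpoint_average(brackets):
    """Average income estimated from bracket midpoints weighted by counts."""
    total_income = sum((lo + hi) / 2 * n for (lo, hi), n in brackets)
    total_n = sum(n for _, n in brackets)
    return total_income / total_n

avg_income = midpoint_average(brackets)
```

The sketch makes the weakness obvious: everyone in a $20K-wide band is treated as earning exactly the band’s middle, and open-ended top brackets (“$100K+”) have no defensible midpoint at all.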

Ok, with all that out of the way, let’s turn to the issue of colleges. Unfortunately, Ontario does not collect or display data on college graduates’ outcomes the way it does for universities. There is no data on income, for instance. And no data on employment 2 years after graduation, either. The only real point of comparison is employment 6 months after graduation, and even this is kind of painful: for universities the data is available only by field of study; for colleges, it is only available by institution. (I know, right?) And even then it is not calculated on quite the same basis: universities include graduates with job offers while colleges do not. So you can’t even quite do an apples-to-apples comparison, even at the level of the sector as a whole. But if you ignore that last small difference in calculation and focus not on the point-estimates but on the trends, you can still see something interesting. Here we go:

Figure 1: Employment Rates 6 months after Graduation, Ontario Universities vs. Ontario Colleges, by Graduating Cohort, 1999-2015


So, like I said, ignore the actual values in Figure 1 because they’re calculated in two slightly different ways; instead, focus on the trends. And if you do that, what you see is (a blip in 2015 apart) that the relationship between employment rates in the college and university sectors looks pretty much the same throughout the period. Both had a wobble in the early 2000s, and then both took a big hit in the 2008 recession. Indeed, on the basis of this data, it’s hard to make a case that one sector has done better than the other through the latest recession: both got creamed, and neither has yet recovered.

(side point: why does the university line stop at 2013 while the college one goes out to 2015? Because Ontario doesn’t interview university grads until 2 years after grad and then asks them retroactively what they were doing 18 months earlier. So the 2014 cohort was just interviewed last fall and it’ll be a few months until their data is released. College grads *only* get interviewed at 6 months, so data is out much more quickly)

What this actually does is put a big dent in the argument that the problem for youth employment is out-of-touch educators, changing skill profiles, sociologists v. welders and all that other tosh people were talking about a few years ago. We’re just having more trouble than we used to integrating graduates into the labour market. And I’ll be taking a broader look at that using Labour Force Survey data tomorrow.

January 17

Another Lens on Bleak Graduate Income Data

So, yesterday we looked at Ontario university graduate employment data.  Today I want to zero in a little bit on what’s happening by field of study.

(I can hear two objections popping up already.  First: “why just Ontario”?  Answer: while Quebec, Alberta, British Columbia and the Maritimes – via MPHEC – all publish similar data, they all publish it in slightly different ways, making it irritating (and in some cases impossible) to come up with a composite national figure.  The National Graduate Survey (NGS) in theory does this, but only every five years, and as I explained last week it has made itself irrelevant by changing the survey period.  So, in short, I can’t do national, and Ontario a) is nearly half the country in terms of university enrolments and b) publishes slightly more detailed data than most.  Second, “why just universities”?  Answer: “fair point, I’ll be publishing that data soon”.

Everyone clear? OK, let’s keep going).

Let’s look first at employment rates 6 months after graduation by field of study (I include only the six largest – Business/Commerce, Education, Engineering, Humanities, Physical Sciences and Social Sciences – because otherwise these graphs would be an utter mess), shown below in Figure 1.  As was the case yesterday, the dates along the x-axis are the cohort graduation year.

[Figure 1]

Two take-aways here, I think.  The first is that the post-08 recession really affected graduates of all fields more or less equally, with employment rates falling by between 6 and 8 percentage points (the exception is humanities, where current rates are only four percentage points below where they were in 2007).  The second is that pretty much since 2001, it’s graduates in the physical sciences who have had the weakest results.

OK, but as many in the academy say: 6 months isn’t enough to judge anything.  What about employment rates after, say, 2 years?  These are shown below in Figure 2.

[Figure 2]

This graph is smoother than the previous one, which suggests the market for graduates with 2 years in the labour market is a lot more stable than the one for graduates with just 6 months.  If you compare the class of 2013 with the class of 2005 (the last one to completely miss the 2008-9 recession), business and commerce students’ employment rates have fallen by only one percentage point, while those in social sciences have dropped by six percentage points, with the others falling somewhere in between.  One definite point to note for all those STEM enthusiasts out there: there’s no evidence here that students in STEM programs have fared much better than everyone else.

But employment is one thing; income is another.  I’ll spare you the graph of income at six months because really, who cares?  I’ll just go straight to what’s happening at two years.

[Figure 3]

To be clear, what Figure 3 shows is average graduate salaries two years after graduation in real dollars – that is, controlling for inflation.  And what we see here is that in all fields of study, income bops along fairly steadily until 2007 (i.e. the class of 2005), at which point things change and incomes start to decline in all six subject areas.  Engineering was down, albeit only by 3%.  But income for business students was down 10%, physical sciences down 16%, and humanities, social sciences and education were down 19%, 20% and 21%, respectively.

This, I shouldn’t need to emphasize, is freaking terrible.  Actual employment rates (link to: previous) may not be down that much, but this drop in early graduate earnings is pretty disastrous for the majority of students.  Until a year or two ago I wasn’t inclined to put a lot of weight on this: average graduate earnings have always popped back after recessions.  This time seems to be different.

Now as I said yesterday, we shouldn’t be too quick to blame this on a hugely changed economy to which institutions are not responding; it’s likely that part of the fall in averages comes from allowing more students to access education in the first place.  As university graduates take up an increasing share of the right-hand side of an imaginary bell curve representing all youth, “average graduate earnings” will naturally decline even if there’s no change in the distribution of earnings as a whole.  And the story might not be as negative if we were to take a five- or ten-year perspective on earnings.  Ross Finnie has done some excellent work showing that in the long term nearly all university graduates make a decent return (though, equally, there is evidence that students with weak starts in the labour force have lower long-term earnings as well, through a process known as “labour market scarring”).

Whatever the cause, universities (and Arts faculties in particular) have to start addressing this issue honestly.  People know in their gut that university graduates’ futures in general (and Arts graduates’ in particular) are not as rosy as they used to be.  So when the Council of Ontario Universities puts out a media release, as it did last month, patting universities on the back for a job well done with respect to graduate outcomes, it rings decidedly false.

Universities can acknowledge challenges in graduate outcomes without admitting that they are somehow at fault.  What they cannot do is pretend there isn’t a problem, or shirk taking significant steps to improve employment outcomes.

January 16

Ever-bleaker Graduate Employment Data?

So just before I quit blogging in December, the Council of Ontario Universities released its annual survey of graduate outcomes, this time of the class of 2013.  The release contained the usual platitudes: “future is bright”, “vast majority getting well-paying jobs”, etc etc.   And I suppose if one looks at a single year’s results in isolation, one can make that case.  But a look at longer-term trends suggests cause for concern.

These surveys began at the behest of the provincial government seventeen years ago.  Every graduating cohort is surveyed twice: once six months after graduation and once two years after graduation.  Students are asked about their employment status, their income, and the relationship between their job and their education.  COU publishes only high-level aggregate data, so we don’t know about things like response rates, but the ministry seems pleased enough with data quality, so I assume it’s within industry standards.

Figure 1 shows employment rates of graduates six months and two years out.  At the two-year check point, employment rates fell by four points in the immediate wake of the 2008-9 recession (be careful in reading the chart: the x-axis is the graduating class, not the year of the survey, so the line turns down in 2006 because that’s the group that was surveyed in 2008).  Since then they have recovered by a little more than a point and a half, though further recovery seems stalled.  At the six-month point, things are much worse.  Though employment rates at this point are no longer falling, they remain stubbornly seven percentage points below where they were pre-recession.

Figure 1: Employment Rates, Ontario University Graduates, 6 Months and 2 Years Out, by Graduating Class, 1996-2013

[Figure 1]

If you want to paint a good story here, it’s that employment rates at 2 years out are still within three percentage points of their all-time peak, which isn’t terrible.  But there doesn’t seem much doubt that students are on average taking a bit longer to “launch” than they used to; employment rates six months out seem to have hit a new, permanently lower floor.

Now, take a look at what’s happening to starting salaries.  As with the previous graph, I show results at both the six-month and the two-year mark.


Figure 2: Average Salaries, Ontario University Graduates, 6 Months and 2 Years Out, by Graduating Class, 1996-2013, in $2016

[Figure 2]

What we see in Figure 2 is the following: holding inflation constant, during the late 1990s recent graduates saw their incomes grow at a reasonably rapid clip.  For most of the 2000s, income was pretty steady for graduates two years out (less so at six months out).  But since the 2008 recession, incomes have been falling steadily; unlike the situation with employment rates, we have yet to see a floor, let alone a bounceback.  Real average incomes of the class of 2013 six months after graduation were 11% lower than those of the class of 2005 (the last fully pre-recession graduating class); at 2 years out, the gap was 13%.  Somehow these points did not make it into the COU release.
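For readers curious about the “in $2016” adjustment, here is a minimal sketch of the deflation arithmetic involved: a nominal salary is scaled by the ratio of the 2016 price index to the index in the earnings year. The CPI values below are illustrative assumptions (roughly a 2002=100 base), not the actual series used for these figures.

```python
# Illustrative CPI values (approx. 2002=100 base); assumptions for this
# sketch, not the actual index behind Figure 2.
cpi = {2005: 107.0, 2013: 122.8, 2016: 128.4}

def to_2016_dollars(nominal: float, year: int) -> float:
    """Deflate a nominal salary from `year` into constant 2016 dollars."""
    return nominal * cpi[2016] / cpi[year]

# A hypothetical class-of-2005 grad at $45,000 nominal vs. a class-of-2013
# grad at $46,000 nominal: the later grad earns more in nominal terms but
# less in real terms.
real_2005 = to_2016_dollars(45_000, 2005)  # about $54,000 in 2016 dollars
real_2013 = to_2016_dollars(46_000, 2013)
assert real_2005 > real_2013
```

The point of the exercise: comparisons across graduating classes only mean anything once everything is expressed in the same year’s dollars, which is why the gaps quoted above are in real, not nominal, terms.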

That, frankly, is not good.  But it seems to me that we need to hold on a little bit before hitting panic buttons about universities being a bad deal, not being relevant to a shifting labour market, etc.  Sure, the drop-off in both employment rates and incomes started around the time of the recession, so it’s easy to create a narrative around a changed economy, a “new normal”, and so on.  But there’s something else that’s probably playing a role, and that’s an increase in the supply of graduates.


Figure 3: Number of Undergraduate Degrees Awarded, Ontario, 1999-2013

[Figure 3]

The other big event we need to control for here is the massive expansion of access to higher education.  In 2003, the “double cohort” arrived on campus, which forced government to expand institutional capacity – capacity that did not subsequently shrink.  Compared to the year 2000, the number of graduates has increased by over 50%; such an expansion of supply must have had some effect on average outcomes.  It’s not simply that there are more students competing for jobs – something one would naturally assume would place downward pressure on wages – but also that the average academic ability of graduates has probably dropped somewhat.  Where once graduates represented the top 20% of a cohort in terms of academic ability, now they probably represent the top 30% or so.  Assuming one’s marginal product in the labour market is at least loosely tied to academic ability, that alone would predict a drop in average post-graduation incomes.  To really get a sense of what, if anything, has changed in terms of how higher education affects individuals’ fortunes in the labour market, you’d want to compare not average income vs. average income, but the 66th percentile of graduate income now vs. the 50th percentile fifteen years ago.  Over to you, COU, since you could make the microdata public if you wanted to.
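The percentile arithmetic behind that suggestion can be checked with a quick simulation (purely illustrative numbers: a standard-normal “ability” score and the 20%/30% graduation cutoffs mentioned above). Widening the intake drags the group average down even though the cohort’s distribution is unchanged, while the 66.7th percentile of a top-30% group and the median of a top-20% group both correspond to the cohort’s 90th percentile, and so line up almost exactly.

```python
import random
import statistics

random.seed(1)

# Hypothetical "academic ability" scores for a youth cohort: standard
# normal, purely for illustration.
n = 100_000
cohort = sorted(random.gauss(0, 1) for _ in range(n))

top20 = cohort[n * 80 // 100:]   # graduates when 20% of the cohort graduates
top30 = cohort[n * 70 // 100:]   # graduates after expansion to 30%

# Widening the intake lowers the group average with no change in the
# underlying cohort distribution.
assert statistics.mean(top30) < statistics.mean(top20)

# Like-for-like comparison: the 66.7th percentile of the top-30% group and
# the median of the top-20% group are both the cohort's 90th percentile.
def percentile(sorted_xs, p):
    return sorted_xs[int(p * (len(sorted_xs) - 1))]

p67_top30 = percentile(top30, 2 / 3)
p50_top20 = percentile(top20, 0.5)
assert abs(p67_top30 - p50_top20) < 0.01
```

In other words, a falling graduate average is exactly what a wider-access system should produce; only a like-for-like percentile comparison of the sort sketched here would reveal whether the return to a degree itself has changed.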

In short, don’t let institutions off the hook on this, but recognize that some of this was bound to happen anyway because of access trends.

More graduate income data fun tomorrow.

December 07

Two (Relatively) Good News Studies

A quick summary of two studies that came out this week which everyone should know about.

Programme for International Student Assessment (PISA)

On Tuesday, the results for the 2015 PISA tests were released.  PISA is, of course, that multi-country assessment of 15 year-olds in math, science and reading which takes place every three years and is managed by the Organization for Economic Co-operation and Development (OECD).  PISA is not a test of curriculum knowledge (in an international context that would be really tough); what it is instead is a test of how well individuals’ knowledge of reading, math and science can be applied to real-world challenges.  So the outcomes of the test can best be thought of as some sort of measure of cognitive ability in various domains.

In addition to taking the tests, students also answer questions about themselves, their study habits and their family background, and schools provide information about the kinds of resources they have and what kind of curriculum structure they use.  So there is an awful lot of background information about each student who takes the test, which permits some pretty interesting and detailed cross-national examination of the determinants of this cognitive ability.  And from this kind of analysis, the good folks at OECD have determined that government policy is best focused in four areas.

But heck, nobody wants to hear about that; what everybody wants to know is “where did we rank”?  And the answer is: pretty high.  The short version is here and the long version here, but here are the headlines: Out of the 72 countries where students took the test, Canada came 2nd in Reading, 7th in Science and 10th in Math.  If you break things down to the sub-jurisdictional level (Canada vastly oversamples compared to other countries so that it can get results at a provincial level), BC comes first in the world for reading (Singapore second, Alberta third, Quebec fourth and Ontario fifth).  In Science, Alberta and British Columbia come second and third in the world (behind only Singapore which as a country came top in every category).  In Math, the story is not quite as good, but Quebec still cracks the top three.

CMEC also has a publication out which goes into more depth at the provincial level (available here).  The short story is our four big provinces do well across the board but the little ones less so (in some cases much less so).  Worth a glance if comparing provinces rather than countries is your thing.

One final little nugget from the report: the survey taken by students asks if the students see themselves heading towards a Science-based career in the future.  In Canada, 34% said yes, the second highest of any country in the survey (after the US).  I’d like to think this will put to rest all the snarky remarks about how kids aren’t sufficiently STEM-geared these days (<cough> Ken Coates <cough>), but I’m not holding my breath.

Statscan Report on Youth Employment

Statistics Canada put out some interesting data on youth employment, by Rene Morisette, on Monday.  It’s one of those half-full/half-empty stories: the youth unemployment rate is back down to 13%, where it was in 1976 (and hence lower than it has been for most of the intervening 40 years), but the percentage of youth working full-time has dropped.  The tricky part of this analysis – not really covered by the paper – is that the comparison in both time periods excludes students.  That makes for a tricky comparison because there are proportionately about three times as many students as there were 40 years ago.  To put that another way, there are a lot fewer bright kids – that is, the kind likely to get and keep jobs – outside school now than in 1976.  So it’s not quite an apples-to-apples comparison, and it’s hard to know what having more young people in school actually does to the employment rate.

Aside from data on employment rates, the report (actually a condensation of some speaking notes and graphs from a presentation made earlier this year) also includes a mishmash of other related data, from differing recent youth employment trends in oil provinces vs. non-oil provinces (short version: they’re really different) to gender differences in graduate wage premiums (bigger for women than men, which may explain participation rate differences), to trends in overall graduate wage premiums.  Intriguingly, these rose through the 80s and 90s but are now declining back to 1980 levels, though whether that is due to an increase in the supply of educated labour or reflects broader changes in the labour market such as the “Great Reversal” in the demand for cognitive skills that UBC’s David Green and others have described is a bit of a mystery.

But don’t take my word for it: have a skim through the report (available here).  Well worth a few minutes of your time.
