Higher Education Strategy Associates

Tag Archives: Employment

January 26

An Amazing Statscan Skills Study

I’ve been hard on Statscan lately because of their mostly-inexcusable data collection practices.  But every once in a while the organization redeems itself.  This week, that redemption takes the form of an Analytical Studies Branch research paper by Marc Frenette and Kristyn Frank entitled Do Postsecondary Graduates Land High-Skilled Jobs?  The implications of this paper are pretty significant, but also nuanced and susceptible to over-interpretation.  So let’s go over in detail what this paper’s about.

The key question Frenette & Frank are answering is “what kinds of skills are required in the jobs in which recent graduates (defined operationally here as Canadians aged 25-34 with post-secondary credentials) find themselves”.  This is not, to be clear, an attempt to measure what skills these students possess; rather it is an attempt to see what skills their jobs require.  Two different things.  People might end up in jobs requiring skills they don’t have; alternatively, they may end up in jobs which demand fewer skills than the ones they possess.  Keep that definition in mind as you read.

The data source Frenette & Frank use is something called the Occupational Information Network (O*NET), which was developed by the U.S. Department of Labor.  Basically, they spend ages interviewing employees, employers, and occupational analysts to work out the skill levels typically required in hundreds of different occupations.  For the purpose of this paper, the skills analyzed and rated include reading, writing, math, science, problem solving, social, technical operation, technical design and analysis, and resource management (i.e. management of money and people).  They then take all that data and transpose it onto Canadian occupational definitions.  So now they can assign skill levels along nine different dimensions to each Canadian occupation.  Then they use National Household Survey data (yes, yes, I know) to look at post-secondary graduates and what kind of occupations they have.  On the basis of this, at the level of the individual, they can link highest credential received to the skills required in their occupation.  Multiply that over a couple of million Canadians and Frenette and Frank have themselves one heck of a database.
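Mechanically, the linkage is a pair of table joins: required-skill ratings keyed by occupation, merged onto individual graduate records, then averaged by credential or field of study.  Here is a minimal sketch of that logic; all occupation names, skill scores, and graduate records below are invented for illustration (real O*NET ratings are keyed to US SOC codes and must first be mapped to Canadian NOC codes):

```python
from statistics import mean

# Invented O*NET-style required-skill ratings, keyed by occupation.
# These describe what the *job* demands, not what the worker possesses.
skill_requirements = {
    "civil engineer": {"reading": 5.1, "writing": 4.6, "math": 5.5},
    "retail clerk":   {"reading": 2.8, "writing": 2.2, "math": 2.5},
    "policy analyst": {"reading": 5.4, "writing": 5.2, "math": 3.9},
}

# Invented graduate records: (highest credential, field of study, occupation),
# standing in for the National Household Survey linkage.
graduates = [
    ("bachelor's", "engineering", "civil engineer"),
    ("bachelor's", "humanities",  "retail clerk"),
    ("bachelor's", "humanities",  "policy analyst"),
]

def avg_required_skill(records, skill, field=None):
    """Average skill level required by the jobs held, optionally by field."""
    scores = [skill_requirements[occ][skill]
              for (_, fld, occ) in records
              if field is None or fld == field]
    return round(mean(scores), 2)

print(avg_required_skill(graduates, "reading", field="engineering"))  # 5.1
print(avg_required_skill(graduates, "reading", field="humanities"))   # 4.1
```

The point the toy example preserves: the output is a property of the occupations graduates land in, so a field whose grads sort into less demanding jobs scores lower regardless of what its grads can actually do.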

So, the first swing at analysis is to look at occupational skill requirements by level of education.   With only a couple of exceptions – technical operations being the most obvious one – these more or less all rise according to the level of education. The other really amusing exception is that apparently PhDs do not occupy/are not offered jobs which require management skills.  But it’s when they get away from level of education and move to field of study that things get really interesting.  To what extent are graduates from various fields of study employed in jobs that require,  for instance, high levels of reading comprehension or writing ability?  I reproduce Frenette & Frank’s results below.


Yep.  You read that right.  Higher reading comprehension skill requirements are for jobs occupied by Engineers.  Humanities?  Not so much.


It’s pretty much the same story with writing, though business types tend to do better on that measure.


…and critical thinking rounds out the set.

So what’s going on here?  How is it that the humanities (“We teach people to think!“) get such weak scores and “mere” professional degrees like business and engineering do so well?  Well, let’s be careful about interpretation.  These charts are not saying that BEng and BCom grads are necessarily better than BA grads at reading, writing and critical thinking, though one shouldn’t rule that out.  They’re saying that BEng and BCom grads get jobs with higher reading, writing and critical thinking requirements than do BAs.  Arguably, it’s a measure of underemployment rather than a measure of skill.  I’m not sure I personally would argue that, but it is at least arguable.

But whatever field of study you’re from, there’s a lot of food for thought here.  If reading and writing are such a big deal for BEngs, should universities wanting to give their BEngs a job boost spend more time giving them communication skills?  If you’re in social sciences or humanities, what implications do these results have for curriculum design?

I know if I were a university President, these are the kinds of questions I’d be asking my deans after reading this report.

September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it’s pretty innocuous (“we need better/clearer data”) but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole “national approach” thing. Coates and Auld don’t make the argument directly, but they manage to slip a federal role in there: Canada “lacks a commitment to truly high-level educational accomplishment”, needs a “national strategy for higher education improvement”, and so “the Government of Canada and its provincial and territorial partners should identify some useful outcomes”. To be blunt: no, they shouldn’t. I know there is a species of anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates & Auld’s point about the need for better data applies to provinces individually as well as collectively. They all need to get in the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here’s where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It’s an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronetcies united by grievances about parking and teaching loads they really are). But when you look at things like scholarly output, graduation rates, and career outcomes, the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has medical programs, and large graduate programs in hard sciences, it will have greater scholarly output. It’s the palette of program offerings rather than their quality that makes the difference when making inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we’d find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find them looking rather similar. Does that mean there’s been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada at least) location matters a lot more than educational quality when you’re dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his EPRI showed big inter-institutional gaps in graduates’ incomes (I will get round to covering Ross’ excellent work on the blog soon, I promise). “Ah, interesting!” people said. “Look At The Inter-Institutional Differences Now We Can Talk Quality”. Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (and I know this because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).

Now, maybe Coates and Auld have thought all this through and I’m getting nervous for no reason, but their article’s focus on institutional performance when most relevant outcomes are driven by geography, program and selectivity suggests to me that there’s a desire here to impose some simple rough justice over some pretty complicated cause-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but  “grading” institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.

January 28

The Future of Work (and What it Means for Higher Education), Part 2

Yesterday we looked at a few of the hypotheses out there about how IT is destroying jobs (particularly: good jobs).  Today we look at how institutions should react to these changes.

If I were running an institution, here’s what I’d do:

First, I’d ask every faculty to come up with a “jobs of the future report”.  This isn’t the kind of analysis that makes sense to do at an institutional level: trends are going to differ from one part of the economy (and hence, one set of fields of study) to another.  More to the point, curriculum gets managed at the faculty level, so it’s best to align the analysis there.

In their reports, all faculties would need to spell out: i) who currently employs their grads, and in what kinds of occupations (an answer of “we don’t know” is unacceptable – go find out); ii) what the long-term economic outlook is for those industries and occupations; iii) how susceptible the tasks in those occupations are to computerization (there are various places to look for this information, but this from two scholars at the University of Oxford is a pretty useful guide); and iv) how senior people in those industries and occupations see technology affecting employment – which means actually going out and talking to them.

This last point is important: although universities and colleges keep in touch with labour market trends through various types of advisory boards, the question that tends to get asked is “how are our grads doing now?  What improvements could we make so that our next set of grads is better than the current one?”  The emphasis is clearly on the very short-term; rarely if ever are questions posed about medium-range changes in the economy and what those might bring.  (Not that this is always front and centre in employers’ minds either – you might be doing them a favour by asking the question.)

The point of this exercise is not to “predict” jobs of the future.  If you could do that you probably wouldn’t be working in a university or college.  The point, rather, is to try to highlight certain trends with respect to how information technology is re-aligning work in different fields over the long-term.  It would be useful for each faculty to present their findings to others in the institution for critical feedback – what has been left out?  What other trends might be considered? Etc.

Then the real work begins: how should curriculum change in order to help graduates prepare for these shifts?  The answer in most fields of study would likely be “not much” in terms of mastery of content – a history program is going to be a history program, no matter what.  But what probably should change are the kinds of knowledge gathering and knowledge presentation activities that occur, and perhaps also the methods of assessment.

For instance, if you believe (as economist Tyler Cowen suggests in his book Average is Over) that employment advantage is going to come to those who can most effectively mix human creativity with IT, then in a statistics course (for instance), maybe put more emphasis on imaginative presentation of data, rather than on the data itself.  If health records are going to be electronic, shouldn’t your nursing faculty be developing a lot of new coursework involving the manipulation of information on databases?  If more and more work is being done in teams, shouldn’t every course have at least one group-based component?  If more work is going to happen across multi-national teams, wouldn’t it be advantageous to increase language requirements in many different majors?

There are no “right” answers here.  In fact, some of the conclusions people will come to will almost certainly be dead wrong.  That’s fine.  Don’t sweat it.  Because if we don’t look forward at all, if we don’t change, then we’ll definitely be wrong.  And that won’t serve students at all.

January 27

The Future of Work (and What it Means for Higher Education), Part 1

Back in the 1990s when we were in a recession, Jeremy Rifkin wrote a book called The End of Work, which argued that unemployment would remain high forever because of robots, information technology, yadda yadda, whatever.  Cue the longest peacetime economic expansion of the century.

Now, we have a seemingly endless parade of books prattling on about how work is going to disappear: Brynjolfsson and McAfee’s The Second Machine Age, Martin Ford’s Rise of the Robots, Jerry Kaplan’s Humans Need Not Apply, Susskind and Susskind’s The Future of the Professions: How Technology will Transform the Work of Human Experts (which deals specifically with how info tech and robotics will affect occupations such as law, medicine, architecture, etc.), and, from the World Economic Forum, Klaus Schwab’s The Fourth Industrial Revolution. Some of these are insightful (such as the Susskinds’ effort, though their style leaves a bit to be desired); others are hysterical (Ford); while others are simply dreadful (Schwab: seriously, if this is what rich people find insightful we are all in deep trouble).

So how should we evaluate claims about the imminent implosion of the labour market?  Well first, as Martin Wolf says in this quite sober little piece in Foreign Affairs, we shouldn’t buy into the hype that “everything is different this time”.  Technology has been changing the shape of the labour market for centuries, sometimes quite rapidly.  We will go on changing.  The pace may accelerate a bit, but the idea that things are suddenly going to “go exponential” is simply wrong.  Just because we can imagine technology creating loads of radical disruption doesn’t mean it’s going to happen.  Remember the MOOC revolution, which was going to wipe out universities?  Exactly.

But just because the wilder versions of these stories are wrong doesn’t mean important things aren’t happening.  The key is to be able to lose the hype.  And to my mind, the surest way to get past the hype is to clear your mind of the idea that advances in robotics or information technology “replace jobs”.  This is simply wrong; what they replace are tasks.

We get a bit confused by this because we remember all the jobs that were lost to technology in manufacturing.  But what we forget is that the century-old technology of the assembly line had long turned jobs into tasks, with each individual performing a single task, repetitively.  So in manufacturing, replacing tasks looked like replacing jobs.  But the same is not true of the service sector (which covers everything from shop assistants to lawyers).  This work is not, for the most part, systematic and routinized, and so while IT can replace tasks, it cannot replace “jobs”  per se.  Jobs will change as certain tasks get automated, but they don’t necessarily get wiped out.  Recall, for instance, the story I told about ATMs a few months ago: that although ATMs had become ubiquitous over the previous forty years, the number of bank tellers not only hadn’t decreased, but had actually increased slightly.  It’s just that, mainly, they were now doing a different set of tasks.

Where I think there are some real reasons for concern is that a lot of the tasks that are being routinized are precisely the ones we used to give to new employees.  Take law, for instance, where automation is really taking over document analysis – that is, precisely the stuff they used to get articling students to do.  So now what do we do for an apprenticeship path?

Working conditions always change over time in every industry, of course, but it seems reasonable to argue that jobs in white-collar industries – that is, the ones for which a university education is effectively an entry requirement – are going to change substantially over the next couple of decades.  Again, it’s not job losses; rather, it is job change.  And the question is: how are universities thinking through what this will mean for the way students are taught?  Too often, the answer is some variation on “well, we’ll muddle through the way we always do”.  Which is a pretty crap answer, if you ask me.  A lot more thought needs to go into this.  Tomorrow, I’ll talk about how to do that.

September 02

Some Basically Awful Graduate Outcomes Data

Yesterday, the Council of Ontario Universities released the results of the Ontario Graduates’ Survey for the class of 2012.  This document is a major source of information regarding employment and income for the province’s university graduates.  And despite the chipperness of the news release (“the best path to a job is still a university degree”), it actually tells a pretty awful story when you do things like, you know, place it in historical context, and adjust the results to account for inflation.

On the employment side, there’s very little to tell here.  Graduates got hit with a baseball bat at the start of the recession, and despite modest improvements in the overall economy, their employment rates have yet to recover anything like their former heights.

Figure 1: Employment Rates at 6-Months and 2-Years After Graduation, by Year of Graduating Class, Ontario

Now those numbers aren’t good, but they basically still say that the overwhelming majority of graduates get some kind of job after graduation.  The numbers vary by program, of course: in health professions, employment rates at both 6-months and 2-years out are close to 100%; in most other fields (Engineering, Humanities, Computer Science), it’s in the high 80s after six months – it’s lowest in the Physical Sciences (85%) and Agriculture/Biological Sciences (82%).

But changes in employment rates are mild compared to what’s been happening with income.  Six months after graduation, the graduating class of 2012 had average incomes 7% below those of the class of 2005 (the last class to have been entirely surveyed before the 2008 recession).  Two years after graduation, its incomes were 14% below the 2005 class’s.

Figure 2: Average Income of Graduates at 6-Months and 2-Years Out, by Graduating Class, in Real 2013/4* Dollars, Ontario

*For comparability, the 6-month figures are converted into real Jan 2013 dollars in order to match the timing of the survey; similarly, the 2-year figures are converted into June 2014 dollars.

This is not simply a case of incomes stagnating after the recession: incomes have continued to deteriorate long after a return to economic growth.  And it’s not restricted to just a few fields of study, either.  Of the 25 fields of study this survey tracks, only one (Computer Science) has seen recent graduates’ incomes rise in real terms since 2005.  Elsewhere, it’s absolute carnage: education graduates’ incomes are down 20%; Humanities and Physical Sciences down 19%; Agriculture/Biology down 18% (proving once again that, in Canada, the “S” in “STEM” doesn’t really belong, labour market-wise).  Even Engineers have seen a real pay cut (albeit by only a modest 3%).
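The adjustment behind these comparisons is the standard one: deflate each class’s nominal income to a common-dollar basis with the Consumer Price Index before taking the percentage change.  A minimal sketch of that arithmetic, with invented CPI values and incomes (the real calculation would use Statscan’s all-items CPI for the relevant survey months):

```python
# Hypothetical all-items CPI values for the two survey dates.
cpi = {2006: 109.1, 2013: 122.8}

def to_real(nominal, survey_year, base_year):
    """Convert a nominal-dollar figure into base-year dollars via CPI ratio."""
    return nominal * cpi[base_year] / cpi[survey_year]

# Invented 6-month incomes: class of 2005 surveyed in 2006 (nominal dollars),
# class of 2012 surveyed in Jan 2013 (so already in 2013 dollars).
class_2005 = to_real(42000, survey_year=2006, base_year=2013)
class_2012 = 44000.0

pct_change = (class_2012 / class_2005 - 1) * 100
print(f"{pct_change:.1f}%")  # → -6.9%
```

Note the shape of the result: the 2012 class is ahead in nominal dollars ($44,000 vs. $42,000) yet behind in real terms, which is exactly why skipping the inflation adjustment, as the COU release does, flatters the trend.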

Figure 3: Change in Real Income of Graduates, Class of 2012 vs. Class of 2005, by Time Since Graduation, for Selected Fields of Study

Now, we need to be careful about interpreting this.  Certainly, part of this is about the recession having hit Ontario particularly harshly – other provinces may not see the same pattern.  And in some fields of study – Education for instance – there are demographic factors at work, too (fewer kids, less need for teachers, etc.).  And it’s worth remembering that there has been a huge increase in the number of graduates since 2005, as the double cohort – and later, larger cohorts – moved through the system.  This, as I noted back here, was always likely to affect graduate incomes, because it increased competition for graduate jobs (conceivably, it’s also a product of the new, wider intake, which resulted in a small drop in average academic ability).

But whatever the explanation, this is the story universities need to care about.  Forget tuition or student debt, neither of which is rising in any significant way.  Worry about employment rates.  Worry about income.  The number one reason students go to university, and the number one reason governments fund universities to the extent they do, is because, traditionally, universities have been the best path to career success.  Staying silent about long-term trends, as COU did in yesterday’s release, isn’t helpful, especially if it contributes to a persistent head-in-the-sand unwillingness to proactively tackle the problem.  If the positive career narrative disappears, the whole sector is in deep, deep trouble.

April 13

Five Questions for Ken Coates

So, Ken Coates of the University of Saskatchewan published a paper the week before last arguing that there were too many university students and not enough trades students, so we should reduce university enrolments by a third and what the hell is wrong with kids today anyway?  Despite being not much more than a warmed-over version of the paper he co-authored with Rick Miner in IRPP a couple of years ago, it got some attention because it played directly into both the elitist view of universities (all these students devalue the degree!) and the weird view some in Canada have that the only problem with the labour market is that workers are too stupid to see the opportunities in front of them.

The paper is a hot mess of unfounded assertions and questionable logic, raising at least five questions (I’d guess readers can come up with a few more of their own) that I think the author needs to answer before the paper can be taken seriously.

1. Why does Coates keep saying today’s young people feel too “entitled”? What does he mean by this disparaging term?  What evidence is there to suggest this generation displays a greater sense of entitlement than any previous generation?  Or is this just an arrogant way of saying youth don’t do what Coates thinks they should do?  (Also: does Coates spend a lot of time yelling at kids to get off his lawn?)

2. Why does Coates repeatedly denigrate the idea that “the labour market should be directed by the uninformed educational choices of 17-19 year-olds”?  Has it not ever been thus?  Was there some golden age in Canadian history when the state or business made career decisions on young peoples’ behalf and where economic outcomes were demonstrably better?  Can Coates name a democratic nation where 17-19 year-olds don’t make their own educational choices?  

3. Why, if as Coates claims, no one can know the future of the labour market, is he so damn sure we need more college/trades graduates?  Coates: “it is extremely difficult to anticipate downstream market demand for employees”.  Coates: “Governments have a poor track record when it comes to picking winners in the economy”.  Well, if that’s true, isn’t this entire paper – which is based on the idea that we know that more college/trades education and less university education is a good idea – an enormous waste of time?  (Or, more simply, “wrong”?)

4. What evidence does Coates have for saying Canadians are defaulting “to the traditional view that a university degree is the best avenue to prosperity” and “turning their children’s dreams against blue-collar work”?  Here’s a quick summary of educational attainment for Canadian males, aged 30 or under, who did their post-secondary education in Canada:

Figure 1: Highest Level of Educational Attainment, Males Aged 30 and Under, Canada, 2010

Got that?  Among males under 30, there are almost as many apprentice and trades certificate holders as there are bachelor’s holders.  Throw the colleges in and it’s more than two to one.  Another way to look at the data is to compare the number of males with bachelor’s degrees to the number working in those “in-demand” areas Coates is continually babbling about – construction trades, mechanics, precision production, transportation, and all engineering sub-fields – who hold less than a bachelor’s degree.  Here are the numbers:

Figure 2: Bachelor’s Degree Holders vs. Workers in Five Key Trades, Males under 30, Canada, 2010

In short: Averse to blue-collar work?  Not even vaguely true.


5. Why does Coates think blue-collar work is so hot anyway?  The problem with blue-collar work – apart from the fact that it’s seriously gender-biased – is that it’s cyclical.  A lot of people in Canada – including Coates, apparently – forget that because the current commodities cycle has been going on so long, but when commodities prices fall, blue-collar outcomes are pretty terrible.  Back in the mid-90s, when oil was cheap, we only had about a quarter as many apprentices as we do now.  In the 1980s, unemployment rates for trades grads were over 15%.  How good do you think blue-collar work will look if oil is permanently back down to $50 and China’s growth rate heads down to 3%?

Coates does have a point in that universities need to do more to make their graduates employable, and he’s also right that more post-secondary learning needs to be experiential in nature.  But to go from there and say that we need fewer university graduates is just a baseless assertion.  He can and should do better.

February 04

The “Skills for Jobs Blueprint”

I don’t pay as much attention as I should on this blog to matters British Columbian, mostly because I don’t get out there often enough.  But the province’s “Skills for Jobs  Blueprint” cries out for some critical treatment, because frankly it’s not all that smart.

Turn back the clock a bit: in April 2014, the BC government rolled out a series of policies that were collectively branded as the “Skills for Jobs Blueprint”.  Much of it consisted of relatively sensible changes to trades training in view of the upcoming Liquefied Natural Gas (LNG) mega-project.  However, included in this package was some other stuff that sounded like it had been dreamt up on the back of a cocktail napkin.  These included: more generous student aid for students enrolled in disciplines related to “high-demand” occupations, and requiring institutions to spend at least 25% of their budgets on disciplines related to “high-demand” occupations (to be phased in by 2017-18).

The student aid pledge was just silly: if these are truly high-demand occupations, they’ll pay more, and students will have less trouble repaying their loans.  Why would you give more money to these people? The requirement for institutional spending had the potential to be ridiculous, but wasn’t necessarily so.  Whatever purists might think, public authorities spend money on higher education mainly to improve the local economy; and besides, depending on how broadly “high-demand” occupations were defined, institutions might already be spending 25%.  There was the possibility, in other words, that it would require no change at all on institutions’ part.  But that would depend crucially on how BC defined “high-demand”.

This is where it gets maddening.  When the government finally released its definition of high-demand, it had nothing to do with a skills gap, and was not in any way based on analyses of supply and demand.  Instead, it was simply the 60 occupations with the most job openings.  Or, put differently: according to the government of BC, the highest-demand occupations are simply the 60 largest occupations.  Oy.

Now, it’s hard to tell whether institutions actually line up 25% of their spending on priority disciplines related to the “big 60”, since BC doesn’t work on any kind of funding formula.  However, it is possible to reverse engineer this kind of thing by looking at enrolment patterns, and assuming that spending weights are similar to what one would see in other provinces (read: Ontario and Quebec), as we demonstrated back here.  Which is what my colleague Jackie Lambert did.

The results were instructive.  Quite clearly, all colleges meet the test.  Among universities, it’s slightly more complicated.  If you simply take all enrolments in the academic programs most directly related to 59 of the 60 “most desired” occupations, and weight them in the ON/QC style, you find that province-wide, these programs already make up 32% of expenditures, and all universities except Emily Carr would meet the 25% cut.  However, the 60th occupation with the most “demand” is university professors (yes, really), which technically can be filled by doctoral students from any program.  Throw those in and you end up with almost 47% of all dollars being spent on “priority” areas.
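Jackie’s reverse-engineering exercise boils down to a weighted-share calculation: multiply each program’s enrolment by a funding-formula cost weight, then ask what fraction of the weighted total sits in “priority” programs.  A toy sketch of that arithmetic – the program names, weights, and enrolments below are all invented, and real Ontario/Quebec formula weights differ by program and level:

```python
# Hypothetical funding-formula cost weights (relative cost per student).
cost_weights = {"engineering": 2.0, "nursing": 1.8, "business": 1.2,
                "humanities": 1.0, "doctoral (any)": 4.0}

# Hypothetical enrolments at one institution.
enrolments = {"engineering": 3000, "nursing": 1500, "business": 4000,
              "humanities": 6000, "doctoral (any)": 1200}

# Stand-ins for programs tied to the "big 60" occupations.
priority = {"engineering", "nursing", "business"}

def weighted_share(selected):
    """Estimated % of spending in the selected programs, via cost weights."""
    total = sum(cost_weights[p] * n for p, n in enrolments.items())
    chosen = sum(cost_weights[p] * n for p, n in enrolments.items()
                 if p in selected)
    return round(100 * chosen / total, 1)

print(weighted_share(priority))                       # share without PhDs
print(weighted_share(priority | {"doctoral (any)"}))  # counting doctoral students
```

The toy numbers illustrate the same lever as the real exercise: because doctoral students carry heavy cost weights, deciding whether “university professor” counts as a priority occupation swings the estimated share dramatically.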

Ideally, this result would mean the province could just declare victory (“Look!  25%! We showed them!”) and go home.  But these days, government can’t just be seen to be ordering institutions about; they have to actually be ordering them about.  So my guess is BC will avoid declaring victory, and instead use the ambiguity created by the lack of a funding formula to jerk institutions around a bit (“Spend here!  Don’t spend there!”), just to show everyone who’s boss.

Plus ça change…

September 08

Some Scary Graduate Income Numbers

Last week, the Council of Ontario Universities put out a media release with the headline “Ontario University Graduates are Getting Jobs”, and trumpeted the results of the annual provincial graduates survey, which showed that 93% of undergraduates had jobs two years after graduation, and their average income was $49,398.  Hooray!

But the problem – apart from the fact that it’s not actually 93% of all graduates with jobs, but rather 93% of all graduates who are in the labour market (i.e. excluding those still in school) – is that the COU release neither talks about what’s going on at the field of study level, nor places the data in any kind of historical context.  Being a nerd, I collect these things when they come out each year and put the results in a little Excel sheet.  Let’s just say that when you do compare these results to earlier years, things look considerably less rosy.

Let’s start with the employment numbers, which look like this:

Figure 1: Employment Rate of Ontario Graduates 2 Years Out, Classes of 1999 to 2011

Keep your eye on the class of 2005 – this was the last group to be measured 2 years out before the recession began (i.e. in 2007).  They had overall employment rates of about 97%, meaning that today’s numbers actually represent a 4-point drop from there.  If you really wanted to be mean about it, you could equally say that graduate unemployment in 2013 has more than doubled since 2007.  But look also at what’s happened to the Arts disciplines: in the first four years of the slowdown, their employment rates fell about two percentage points more than the average (though, since the class of ’09, their employment rate has levelled out).

Still, one might think: employment rates in the 90s – not so bad, given the scale of the recession.  And maybe that’s true.  But take a look at the numbers on income:

Figure 2: Average Income (in $2013) 2 Years After Graduation, Ontario Graduating Classes from 2003-2011, Selected Disciplines

Figure 2 is unequivocally bad news.  The average in every single discipline is below where it was for the class of 2005.  Across all disciplines, the average is down 13%.  Engineering and Computer Science are down the least, and have made some modest real gains in the last couple of years; for everyone else, the decline is in double-digits.  Business: down 11%.  Humanities: down 20%.  Physical Sciences: down 22% (more evidence that generalizations about STEM disciplines are nonsense).
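The declines above are simple percent changes from the class-of-2005 baseline, computed in constant ($2013) dollars.  A minimal sketch of the calculation, using hypothetical round numbers rather than the actual survey figures (which aren’t reproduced here):

```python
def pct_change(baseline, current):
    """Percent change from baseline to current (negative = decline)."""
    return (current - baseline) / baseline * 100

# Hypothetical example: a discipline whose class-of-2005 average income
# was $50,000 (in $2013) and whose class-of-2011 average is $44,000
# would show a 12% real decline.
print(round(pct_change(50_000, 44_000), 1))
```

The key point is that both figures must be in the same year’s dollars before comparing; otherwise inflation alone would manufacture an apparent decline.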

Now, at this point some of you may be saying: “hey, wait a minute – didn’t you say last year that incomes 2 years out were looking about the same as they did for the class of 2005?”  Well, yes – but you may also recall that a couple of days later I walked it back because Statscan did a whoopsie and said: “you know that data we said was two years after graduation?  Actually it’s three years out”.

Basically, the Ontario data is telling us that 2 years out ain’t what it used to be, and the Statscan data is telling us that three years out is the new two; put simply, it now takes 36 months for graduates to reach the point they used to reach in 24.  That’s not a disaster by any means, but it does show that – in Ontario at least – recent graduates are having a tougher time in the recession.

Tomorrow: more lessons in graduate employment data interpretation.

August 11

Improving Career Services Offices

Over the last few years, what with the recession and all, there has been increased pressure on post-secondary institutions to ensure that their graduates get jobs.  Though that’s substantially the result of things like curriculum and one’s own personal characteristics, landing a job also depends on being able to get interviews and to do well in them.  That’s where Career Services Offices (CSOs) come in.

Today, HESA released a paper that looks at CSOs and their activities.  The study explores two questions.  The first deals specifically with university CSOs: what qualities and practices are associated with offices that receive high satisfaction ratings from their students?  The second deals with college career services – here we did not have an outcome measure like the Globe and Mail’s satisfaction ratings, so we focussed on a relatively simple question: how do their structure and offerings differ from what we see in the university sector?

Let’s deal with that second question first: college CSOs tend to be smaller and less sophisticated than those at universities of the same size.  At first glance, that seems paradoxical – these are career-focussed organizations, aren’t they?  But the reason for this is fairly straightforward: to a large extent, the responsibility for making connections between students and employers resides at the level of the individual program rather than with some central, non-academic service provider – a lot of what takes place in a CSO at universities takes place in the classroom at colleges.

Now, to universities, and the question: what is it that makes for a good career services office?  To answer it, we interviewed CSO staff at high-, medium-, and low-performing institutions (as measured by the Globe and Mail’s pre-2012 student satisfaction surveys) to work out what practices distinguished the high-performers.  It turns out that budget, staff size, and location aren’t really the issue.  What really matters are the following:

  • Use of Data.  Everybody collects data on their operations, but not everyone puts it to good use.  What distinguishes the very best CSOs is that they have an effective, regular feedback loop to make sure insights in the data are being used to modify the way services are delivered.
  • Teaching Job-seeking Skills.  Many CSOs view their mission as making as many links as possible between students and employers.  The very best-performing CSOs find ways to teach job search and interview skills to students, so that they can more effectively capitalize on any connections.
  • Better Outreach Within the Institution.  It’s easy to focus on making partnerships outside the institution.  The really successful CSOs also make partnerships inside the institution.  One of the key relationships to be nurtured is with academic staff.  Students, for better or for worse, view profs as frontline staff and ask them lots of questions about things like jobs and careers.  At many institutions, profs simply aren’t prepared for questions like that, and don’t know how to respond.  The best CSOs take the time to reach out to them and partner with them, ensuring profs have tools at their disposal to answer those questions and to direct students to the right resources at the CSO.

If you want better career services, there’s your recipe.  Bonne chance.

September 03

What The Heck Did You THINK Was Going to Happen?

I’m a bit bewildered by some of the recent commentary about declining returns to education, most notably last week’s paper from CIBC on the subject.  While the actual report was not nearly as stupid as the reams of press coverage that followed it, it still had a few howlers, and it definitely lacked critical thinking.

First, the howlers.  1) The returns to Bachelor’s degrees are not declining; they are, in fact, growing at a slightly slower rate than returns at other levels of education, which isn’t the same thing.  2) The gap between college and university graduates is closing, but it’s because college grads are doing better, not because university grads are doing worse.  3) Yes, the difference in unemployment rates between university and high school graduates is, as the report says, only about 1.5 percentage points (down considerably over the last decade or so).  But why emphasize that fact when the gap in employment rates – which are presumably much more important, and yet went unmentioned in the report – remains over 12 percentage points?  There’s too much cherry-picking of data here for my taste.

But look, here’s the bigger picture: it really shouldn’t be a surprise if graduate wages are stagnating, and there’s one very simple reason for this: there are way more graduates than there used to be.  Between the late ’90s and the late ’00s, the country went from having 600,000 undergraduates to having 900,000 undergraduates.  That’s an extra 75,000-90,000 graduates hitting the labour market every year.  That’s a heck of a supply shock.
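The back-of-envelope arithmetic behind that supply-shock figure: in steady state, annual graduate output is roughly total enrolment divided by average program length.  A sketch, assuming a four-year average (a simplification I’m adding for illustration – shorter programs and attrition push the real number around, which is why the text gives a range):

```python
enrolment_late_90s = 600_000
enrolment_late_00s = 900_000
avg_program_length = 4  # years; a simplifying assumption

# Steady-state graduates per year = enrolment / program length
grads_before = enrolment_late_90s / avg_program_length  # 150,000 per year
grads_after = enrolment_late_00s / avg_program_length   # 225,000 per year
extra = grads_after - grads_before                      # 75,000 extra per year

print(int(extra))
```

That lines up with the low end of the 75,000-90,000 range quoted above; assuming a shorter average time-to-completion gets you toward the high end.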

The surprise, frankly, isn’t that university graduates’ wages aren’t climbing as quickly as those of college and high school graduates.  The surprise is that they’re rising at all.  This suggests that there is, in fact, enormous labour market demand for the skills provided by university graduates; if there weren’t, wages would have decreased.

I pointed this out on Twitter the day the CIBC paper came out only to learn that for many people – including people who would describe themselves as fiercely progressive – even the hint that relative rates of return might be falling turned them into foaming conservatives with respect to university admissions.  Too many students!  We need a labour market policy!  Etc., etc.

I mean, what exactly did everyone think was going to happen when we allowed enrolment to rise by 50%?  That there would be no change in returns?  And even if there was a slight fall in returns – who cares?  In a democracy, isn’t it better to have 150 people earning good returns than 100 people earning brilliant ones?
