Higher Education Strategy Associates

Category Archives: PSE Outcomes

October 05

Why Companies Value(d) Higher Education

I recently read the book A Perfect Mess: The Unlikely Ascendancy of American Higher Education by David Labaree. It's very good – in fact, the first two chapters are for my money the best short history of pre-1900 American higher education ever written. I'm going to refer to this book a few times over the next couple of weeks. But today, I want to talk about an engaging little passage he penned about how business came to view college (that is, American "college", our universities) as an indispensable prerequisite to white-collar jobs.

There was a time, of course, when this wasn't the case. Well after the Civil War, medicine and law remained jobs filled through apprenticeship rather than something that required higher education. Andrew Carnegie famously said he had known few young men who were not injured by a higher education system that stamped the fire and energy out of them. The conversion of men like Carnegie into boosters rather than detractors was key to higher education becoming a mass phenomenon. If business had kept that attitude, the system today would look completely different (mainly, it would be much, much smaller).

So what was it that changed? According to Labaree, business leaders came to appreciate college because of changes both in the structure of higher education and the structure of business. Business changed because management – that is, the general ability to apply verbal and cognitive skills to organizational problems – became more important. And higher education provided this training in three ways.

First, the importation of German ideas about how to run universities led to an increased emphasis on giving students broad assignments which they would need to work out on their own.  This gave graduates experience with the kind of autonomy and problem-solving needed in the modern workplace.  Second, the need to navigate the complex formal and informal social, academic and administrative hierarchies of the university gave graduates the skills needed to navigate the similarly complicated hierarchies of modern corporations.  Third, the socialization process that colleges put students through to promote institutional loyalty (the arrival of intercollegiate sports at the turn of the twentieth century was a big help here) was also important: businesses value loyalty highly.

It's a persuasive theory, and I think it also speaks to what may be wrong in the college/business interface in the present day. If you look at universities, they're still training and socializing students much as they always did: problem-solving, hierarchy-navigating, and loyalty-inducing (granted, that third one has been much more prominent in American and Japanese institutions than elsewhere, but the point is that it hasn't become any less important over time, whatever the starting baseline). But business has changed. Partly it's about speed, but I would argue it is more about permeability. In an age of flatter corporate hierarchies, young trainees are required to deal with actors outside the company at a fairly early stage in order to get things done. They are doing front-line sales and communications more often. They are dealing with external suppliers more often. Success in the new business world relies not simply on navigating the internal environment; it requires a lot more horizontal networking and engaging with external hierarchies.

And this, I would argue, is something higher education – be it in colleges or universities – hasn’t yet figured out how to impart.  And it’s not entirely clear that the Work Integrated Learning fad du jour is a solution, since all that really does is transpose a student temporarily from one hierarchy to another.  (To be clear: I’m not saying WIL isn’t valuable, I’m just saying it doesn’t solve this particular problem).

I should emphasize that I’m not carping here.  It’s not obvious to me how universities could actually give students this kind of experience, and many may well say either that it’s impossible or that they shouldn’t try.  Fair enough.  But just remember that if institutions can’t perform their training/socializing role to employers’ satisfaction, there’s no reason to think higher education will continue to receive the public support it currently does, either.

Given that, working out how to give students those kinds of experiences – either in class or through extra-curricular activity – is probably worth a ponder.

October 04

New Quality Measurement Initiatives

One of the holy grails in higher education – if you’re on the government or management side of things, anyway – is to find some means of actually measuring institutional effectiveness.  It’s all very well to note that alumni at Harvard, Oxford, U of T (pick an elite university, any elite university) tend to go on to great things.  But how much of that has to do with them being prestigious and selective enough to only take the cream of the crop?  How can we measure the impact of the institution itself?

Rankings, of course, were one early way to try to get at this, but they mostly looked at inputs, not outputs. Next came surveys of student "engagement", which were OK as far as they went but didn't really tell you anything about institutional performance (though they did tell you something about curriculum and resources). Then came the Collegiate Learning Assessment and later the OECD's attempt to build on it, which was called the Assessment of Higher Education Learning Outcomes, or AHELO. AHELO was of course unceremoniously murdered two years ago by the more elite higher education institutions and their representatives (hello, @univcan and @aceducation!) who didn't like its potential to be used as a ranking (and, in fairness, the OECD probably leant too hard in that direction during the development phase, which wasn't politically wise).

So what’s been going on in quality measurement initiatives since then?  Well, two big ones you should know about.

The first is one being driven out of the Netherlands called CALOHEE (which is, sort of, short for "Measuring and Comparing Achievements of Learning Outcomes in Higher Education in Europe"). It is being run by more or less the same crew that developed the Tuning Process about a decade ago, and who also participated in AHELO, though they have broken with the OECD since then. CALOHEE builds on Tuning and AHELO in the sense that it is trying to create a common framework for assessing how institutions are doing at developing students' knowledge, skills and competencies. It differs from AHELO in that if it is successful, you probably won't be able to make league tables out of it.

One underlying assumption of AHELO was that all programs in a particular area (e.g. Economics, Civil Engineering) were trying to impart the same knowledge, skills and competencies – this was what made giving them a common test valid. But CALOHEE assumes that there are inter-institutional differences that matter at the subject level. And so while students will still get a common test, the scores will be broken up in ways that are relevant to each institution, given each institution's set of desired learning outcomes. So Institution X's overall score in History relative to Institution Y's is irrelevant, but their scores in, for instance, "social responsibility and civic awareness" or "abstract and analytical thinking" might be, if they both say that's a desired learning outcome. Thus, comparing learning outcomes in similar programs across institutions becomes possible, but only where both programs have similar goals.

The other big new initiative is south of the border and it's called the Multi-State Collaborative to Advance Quality Student Learning (why can't these things have better names? This one's so bad they don't even bother with an initialism). This project still focuses on institutional outcomes rather than program-level ones, which reflects a really basic difference of understanding of the purpose of undergraduate degrees between the US and Europe (the latter caring a whole lot less, it seems, about well-roundedness in institutional programming). But, crucially in terms of generating acceptance in North America, it doesn't base its assessment on a (likely low-stakes) test. Rather, samples of ordinary student course work are scored according to various rubrics designed over a decade or more (see here for more on the rubrics and here for a very good Chronicle article on the project as a whole). This makes the outcomes measured more authentic, but implicitly the only things that can be measured are transversal skills (critical thinking, communication, etc.) rather than subject-level material. This will seem perfectly fine to many people (including governments), but it's likely to be eyed suspiciously by faculty.

(Also, implicitly, scoring like this on a national scale will create a national cross-subject grade curve, because it will be possible to see how an 80 student in Engineering compares to an 80 in History, or an 85 student at UMass to an 85 student at UWisconsin. That should be fun.)

All interesting stuff and worth tracking.  But notice how none of it is happening in Canada.  Again.  I know that after 25 years in this business the lack of interest in measurable accountability by Canadian institutions shouldn’t annoy me, but it does.   As it should anyone who wants better higher education in this country.  We can do better.

June 12

The Nordstrom Philologist

People are always nattering on about skills for the new economy, but apart from some truly unhelpful ideas like “everyone should learn to code”, they are usually pretty vague on specifics about what that means.  But I think I have solved that.

What the economy needs – or more accurately, what enterprises (private and public) need – is more Nordstrom Philologists.

Let me explain.

One of the main consequences of the management revolutions of the last couple of decades has been the decline of middle management. But, as we are now learning, one of the key – if unacknowledged – functions of middle management was to act as a buffer between clients and upper management on the one side, and raw new employees on the other. By doing so, middle managers could bring new employees along slowly into the culture of the company, show them the ropes, and hold their hands a bit as they gained in confidence and ability in dealing with new and unfamiliar situations.

But that’s gone at many companies now.  New employees are now much more likely to be thrown headfirst into challenging situations.  They are more likely to be dealing with clients directly, which of course means they have greater responsibility for the firm’s reputation and its bottom line.  They are also more likely to have to report directly to upper management, which requires a level of communication skills and overall maturity which many don’t have.

When employers say young hires "lack skills", this is what they are talking about. Very few complain that the "hard skills" – technical skills related to the specific job – are missing. Rather, what they are saying is that young hires lack the skills to deal with clients and upper management. And broadly, what that means is: they can't communicate well, and they can't figure out how to operate independently without being at the (senior) boss's door every few minutes asking "What should I do now?"

When it comes to customer service, everyone knows Nordstrom is king.  And a large part of that has to do with its staff and its commitment to customer care.  Communications are at the centre of what Nordstrom does, but it’s not communicating to clients; rather, it’s listening to them.  Really listening, I mean: understanding what clients actually want, rather than just what they ask for.  And then finding ways to make sure they get what they need.  That’s what makes clients and/or citizens feel valued.  And it’s what the best employees know how to provide.

And then there's philology* – the study of written texts. We don't talk much about this discipline anymore in North America, since its constituent parts have been partitioned into history, linguistics, religious studies and, a tiny little bit, art history (in continental Europe it retains a certain credibility as an independent discipline). The discipline consists essentially in constructing plausible hypotheses from extremely fragmentary information: who wrote the Dead Sea Scrolls? Are those Hitler diaries real? And so on. It's about understanding cultural contexts, piecing together clues.

Which is an awful lot like day-to-day business.  There’s no possible way to learn how to behave in every situation, particularly when the environment is changing rapidly.  Being effective in the workplace is to a large degree about developing modes of understanding and action based on some simple heuristics and a constant re-evaluation of options as new data becomes available.  And philology, the ultimate “figure it out for yourself” discipline, is excellent training for it (history is a reasonably close second).

That’s pretty much it.  Nordstrom for the really-listening-to-client skills, philology for the figuring-it-out-on-your-own-and-getting-stuff-done skills.  Doesn’t matter what line of business you’re in, these are the competencies employers need.  And similarly, it doesn’t matter what field of study is being taught, these are the elements that need to be slipped into the curriculum.

*(On the off-chance you want to know more about philology, you could do a lot worse than James Turner’s Philology: The Forgotten Origins of the Modern Humanities.  Quite a useful piece on the history of thought). 

May 16

Jobs: Hot and Not-So-Hot

Remember when everyone was freaking out because there were too many sociology graduates and not enough welders?  When otherwise serious people like Ken Coates complained about the labour market being distorted by the uninformed choices of 17-19 year-olds?  2015 seems like a long time ago.

Just for fun the other day, I decided to look at which occupations have fared best and worst in Canada over the past ten years (OK, I grant you, my definition of fun may not be universal). Using public data, the most granular level I can look at is two-digit National Occupation Codes, so some of these categories are kind of broad. But anyway, here are the results:

Table 1: Fastest-growing occupations in Canada, 2007-2017


See any trades in there? No, me neither. Four out of the top ten fastest-growing occupations are health-related in one way or another. There are two sets of professional jobs – law/social/community/government services (which includes educational consultants, btw) and natural/applied sciences – which pretty clearly require bachelor's if not master's degrees. There are three other categories (admin/financial supervisors, technical occupations in art, and paraprofessional occupations in legal, social, etc.) which have a hodgepodge of educational requirements but on balance probably contain more college than university graduates. And then there is the category of retail sales supervisors and specialized sales occupations, which takes in everything from head cashiers to real estate agents and aircraft sales representatives. Hard to know what to make of that one. But the other nine all seem to require training which is pretty squarely in traditional post-secondary education specialties.

Now, what about the ten worst-performing occupations?

Table 2: Fastest-shrinking Occupations in Canada 2007-2017

This is an interesting grab bag. I'm fairly sure, given the amount of whining about managerialism one hears these days, that it will be a surprise to most people that the single worst-performing job sector in Canada is "senior management occupations". It's probably less of a surprise that four of the bottom ten occupations are manufacturing-related, and that two others which are highly susceptible to automation – Distribution, Tracking and Scheduling, and Office Support Occupations – are there, too. But interestingly, almost none of these occupations, bar senior managers, have significant numbers of university graduates in them. Many wouldn't even necessarily have a lot of college graduates, at least outside the manufacturing and resources sectors.

Allow me to hammer this point home a bit, for anyone who is inclined to ever again take Ken Coates or his ilk seriously on the subject of young people’s career choices.  Trades are really important in Canada.  But the industries they serve are cyclical.  If we counsel people to go into these areas, we need to be honest that people in these areas are going to have fat years and lean years – sometimes lasting as long as a decade at a time.  On the other hand, professional occupations (nearly all requiring university study) and health occupations (a mix of university and college study) are long-term winners.

Maybe students knew that all along, and behaved accordingly.  When it comes to their own futures, they’re pretty smart, you know.

 

May 10

Why Education in IT Fields is Different

A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth.  (It’s brilliant.  Read it).  It’s an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry.  And the answer, it turns out, is that despite all the investment in capital which permitted vast jumps in labour productivity, in fact wages didn’t rise that much at all.  Like, for about fifty years.

Sound familiar?

What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution.  And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills.  It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers.  Human capital was highly firm-specific.

The upshot of all this: in fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through "learning by doing". There's simply no way to learn the skills in advance. That meant that workers had lower levels of bargaining power, because they couldn't necessarily use the skills acquired at one job at another. It also meant, not to put too fine a point on it, that formal education became much less important compared to "learning by doing".

The equivalent industry today is Information Technology. Changes in the industry happen so quickly that it's difficult for institutions to provide relevant training; it's still to a large extent a "learning by doing" field. Yet, oddly, the preoccupation among governments and universities is: "How do we make more tech graduates?"

The thing is, it's not 100% clear the industry even wants more graduates.  It just wants more skills.  If you look at how community colleges and polytechnics interact with the IT industry, it's often through the creation of single courses which are designed in response to very specific skill needs.  And what's interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field.  That means that these college IT courses are true "microcredentials" in the sense that they are short, potentially stackable, and have recognized labour market value.  Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding "bootcamps" attempt to replicate this with varying degrees of success, though since they are usually starting with people from outside the industry, it's not as clear that the credentials they offer are viewed the same way by industry).

Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT.  The problem is most people are not in IT.  Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries.  You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages.  There is simply no labour-market value to disaggregating a nursing credential, so why bother?

And so the lesson here is this: IT work is a pretty specific type of work in which much store is put in learning-by-doing and formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials.  But most of the world of work doesn’t work that way.  And as a result, it’s important not to over-generalize future trends in education based on what happens to work in IT.  It’s sui generis.

Let tech be tech.  And let everything else be everything else.  Applying tech “solutions” to non-tech “problems” isn’t likely to end well.

April 05

Student/Graduate Survey Data

This is my last thought on data for a while, I promise. But I want to talk a little bit today about the increasing misuse of student and graduate surveys, and what we're doing wrong with them.

Back about 15 years ago, the relevant technology for email surveys became sufficiently cheap and ubiquitous that everyone started using them.  I mean, everyone.  So what has happened over the last decade and a half has been a proliferation of surveys and with it – surprise, surprise – a steady decline in survey response rates.  We know that these low-participation surveys (nearly all are below 50%, and most are below 35%) are reliable, in the sense that they give us similar results year after year.  But we have no idea whether they are accurate, because we have no way of dealing with response bias.
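
To make "reliable but not accurate" concrete, here is a toy simulation (all numbers invented) in which satisfied students are simply more likely to answer the survey. The response rate and the measured result come out almost identical every year, and they are wrong every year:

    import numpy as np

    rng = np.random.default_rng(1)
    for year in (2015, 2016, 2017):
        satisfied = rng.random(50_000) < 0.70        # true satisfaction rate: 70%
        respond_p = np.where(satisfied, 0.40, 0.20)  # satisfied students answer more often
        responded = rng.random(50_000) < respond_p
        print(year, f"response rate {responded.mean():.0%},",
              f"measured satisfaction {satisfied[responded].mean():.1%}")
    # Prints ~34% response and ~82% satisfaction every single year: perfectly
    # consistent ("reliable"), and consistently about 12 points too high.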

Now, every once in a while you get someone with the cockamamie idea that the way to deal with low response rates is to expand the sample. Remember how we all laughed at Tony Clement when he claimed the (voluntary) National Household Survey would be better than the (mandatory) Long-Form Census because the sample size would be larger? Fun times. But this is effectively what governments do when they decide – as the Ontario government did in the case of its sexual assault survey – to carry out what amounts to a (voluntary) student census.

So we have a problem: even as we want to make policy on a more data-informed basis, we face the problem that the quality of student data is decreasing (this also goes for graduate surveys, but I’ll come back to those in a second).  Fortunately, there is an answer to this problem: interview fewer students, but pay them.

What every institution should do – and frankly what every government should do as well – is create a balanced, stratified panel of about 1000 students.   And it should pay them maybe $10/survey to complete surveys throughout the year.  That way, you’d have good response rates from a panel that actually represented the student body well, as opposed to the crapshoot which currently reigns.  Want accurate data on student satisfaction, library/IT usage, incidence of sexual assault/harassment?  This is the way to do it.  And you’d also be doing the rest of your student body a favour by not spamming them with questionnaires they don’t want.
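
For what it's worth, drawing such a panel is not technically hard. Here is a minimal sketch in Python; the records extract, field names and strata are all hypothetical, and a real design would stratify on more than faculty and year of study:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(42)
    # Stand-in for an institutional student-records extract (hypothetical fields).
    students = pd.DataFrame({
        "student_id": range(20_000),
        "faculty": rng.choice(["Arts", "Science", "Business", "Engineering"], 20_000),
        "year": rng.choice([1, 2, 3, 4], 20_000),
    })

    def draw_panel(df, size=1_000):
        # Each faculty-by-year stratum gets panel seats in proportion to its
        # share of total enrolment, so the panel mirrors the student body.
        grouped = df.groupby(["faculty", "year"], group_keys=False)
        return grouped.apply(lambda g: g.sample(
            n=min(len(g), max(1, round(size * len(g) / len(df)))),
            random_state=42))

    panel = draw_panel(students)
    print(len(panel))  # ~1,000 students to survey (and pay) through the year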

(Costly?  Yes.  Good data ain’t free.  Institutions that care about good data will suck it up).

It's a slightly different story for graduate surveys. Here, you also have a problem of response rates, but with the caveat that, at least as far as employment and income data is concerned, we aren't going to have that problem for much longer. You may be aware of Ross Finnie's work linking student data to tax data to work out long-term income paths. An increasing number of institutions are now doing this, as indeed is Statistics Canada for future versions of its National Graduate Survey (I give Statscan hell, deservedly, but for this they deserve kudos).

So now that we're going to have excellent, up-to-date data on employment and income, we can re-orient our whole approach to graduate surveys. We can move away from attempted censuses with a couple of not totally convincing questions about employment and re-shape them into what they should be: much more qualitative explorations of graduate pathways. Give me a stratified sample of 2000 graduates explaining in detail how they went from being a student to having a career (or not) three years later, rather than asking 50,000 students a closed-ended question about whether their job is "related" to their education, every day of the week. The latter is a boring box-checking exercise; the former offers the potential for real understanding and improvement.

(And yeah, again: pay your survey respondents for their time. The US Department of Education does it on its surveys and it gets great data.)

Bottom line: We need to get serious about ending the Tony Clement-icization of student/graduate data. That means getting serious about constructing better samples, incentivizing participation, and asking better questions (particularly of graduates).  And there’s no time like the present. If anyone wants to get serious about this discussion, let me know: I’d be overjoyed to help.

January 26

An Amazing Statscan Skills Study

I've been hard on Statscan lately because of its mostly inexcusable data collection practices. But every once in a while the organization redeems itself. This week, that redemption takes the form of an Analytical Studies Branch research paper by Marc Frenette and Kristyn Frank entitled Do Postsecondary Graduates Land High-Skilled Jobs? The implications of this paper are pretty significant, but also nuanced and susceptible to over-interpretation. So let's go over in detail what this paper's about.

The key question Frenette & Frank are answering is “what kinds of skills are required in the jobs in which recent graduates (defined operationally here as Canadians aged 25-34 with post-secondary credentials) find themselves”.  This is not, to be clear, an attempt to measure what skills these students possess; rather it is an attempt to see what skills their jobs require.  Two different things.  People might end up in jobs requiring skills they don’t have; alternatively, they may end up in jobs which demand fewer skills than the ones they possess.  Keep that definition in mind as you read.

The data source Frenette & Frank use is something called the Occupational Information Network (O*NET), which was developed by the US Department of Labor. Basically, its developers spend ages interviewing employees, employers, and occupational analysts to work out the skill levels typically required in hundreds of different occupations. For the purposes of this paper, the skills analyzed and rated include reading, writing, math, science, problem solving, social skills, technical operation, technical design and analysis, and resource management (i.e. management of money and people). Frenette & Frank take all that data and transpose it onto Canadian occupational definitions. So now they can assign skill levels along nine different dimensions to each Canadian occupation. Then they use National Household Survey data (yes, yes, I know) to look at post-secondary graduates and what kind of occupations they have. On the basis of this, at the level of the individual, they can link highest credential received to the skills required in that person's occupation. Multiply that over a couple of million Canadians and Frenette and Frank have themselves one heck of a database.
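
Mechanically, this kind of linkage is a merge on occupation codes followed by a group-by on credentials. A rough sketch of the idea, with occupation codes, skill values and column names all invented for illustration (this is not the paper's actual data):

    import pandas as pd

    # O*NET-style skill requirements, one row per occupation (values invented).
    skills = pd.DataFrame({
        "noc": ["21XX", "40XX"],
        "reading": [4.2, 4.8],
        "writing": [3.9, 4.6],
        "math": [4.5, 3.1],
    })

    # Person-level survey records, one row per graduate (also invented).
    people = pd.DataFrame({
        "person_id": [1, 2, 3],
        "credential": ["Bachelor's", "PhD", "Bachelor's"],
        "noc": ["21XX", "40XX", "21XX"],
    })

    # Attach each person's occupational skill requirements via their occupation,
    # then summarize the skill levels their jobs demand by highest credential.
    linked = people.merge(skills, on="noc", how="left")
    print(linked.groupby("credential")[["reading", "writing", "math"]].mean())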

So, the first swing at analysis is to look at occupational skill requirements by level of education.   With only a couple of exceptions – technical operations being the most obvious one – these more or less all rise according to the level of education. The other really amusing exception is that apparently PhDs do not occupy/are not offered jobs which require management skills.  But it’s when they get away from level of education and move to field of study that things get really interesting.  To what extent are graduates from various fields of study employed in jobs that require,  for instance, high levels of reading comprehension or writing ability?  I reproduce Frenette & Frank’s results below.

[Figure: Reading comprehension requirements of graduates' jobs, by field of study]

Yep. You read that right. The jobs with the highest reading comprehension requirements are the ones occupied by engineers. Humanities? Not so much.

[Figure: Writing requirements of graduates' jobs, by field of study]

It’s pretty much the same story with writing, though business types tend to do better on that measure.

[Figure: Critical thinking requirements of graduates' jobs, by field of study]

…and critical thinking rounds out the set.

So what's going on here? How is it that the humanities ("We teach people to think!") get such weak scores and "mere" professional degrees like business and engineering do so well? Well, let's be careful about interpretation. These charts are not saying that BEng and BCom grads are necessarily better than BA grads at reading, writing and critical thinking, though one shouldn't rule that out. They're saying that BEng and BCom grads get jobs with higher reading, writing and critical thinking requirements than BAs do. Arguably, it's a measure of underemployment rather than a measure of skill. I'm not sure I personally would argue that, but it is at least arguable.

But whatever field of study you’re from, there’s a lot of food for thought here.  If reading and writing are such a big deal for BEngs, should universities wanting to give their BEngs a job boost spend more time giving them communication skills?  If you’re in social sciences or humanities, what implications do these results have for curriculum design?

I know if I were a university President, these are the kinds of questions I’d be asking my deans after reading this report.

January 20

Puzzles in the youth labour market

A couple of days ago, after looking at employment patterns among recent graduates using Ontario graduate survey data, I promised a look at broader youth labour market data. I now wish I hadn't promised that, because Statistics Canada's CANSIM database is an ungodly mess and has got significantly worse since the last time I tried to use it. Too little of the data on employment and income allows users to filter by age *and* education level, and even getting details down to 5-year age brackets (e.g. 20-24, 25-29), which might be useful for looking at youth labour markets, is frustratingly difficult.

(WHY CAN’T WE HAVE NICE THINGS, STATSCAN? WHY???)

Anyways.

Ok, so let’s start by looking at employment rates. Figure 1 looks at employment rates for Canadians in the 15-19, 20-24 and 25-29 age brackets since the turn of the Millennium.

Figure 1: Employment Rates by Age-group 1999-2016


The takeaway from Figure 1 is that, by and large, employment rates are steady except for one big hiccup in 2008-09. In that year, the employment rate for 25-29 year olds fell by about two percentage points, that for 20-24 year olds fell by three and a half percentage points, and that for 15-19 year olds by five percentage points. Not only did the size of the drop vary inversely with age, so too has subsequent performance. Employment rates for 25-29 year olds and 20-24 year olds have held fairly steady since 2009; those for 15-19 year olds have continued to fall, and are now over seven percentage points off their 2008 peak.

Ah, you say: employment is one thing, but what about hours of work? Aren't we seeing more part-time work and less full-time work? Well, sort of.

Figure 2: Part-time Employment as a Percentage of Total Employment, by Age-group, 1999-2016


Across all age groups, the percentage of workers who are part-time (that is, working less than 30 hours per week) rose after 2008. In the case of the 25-29 year olds, this was pretty minor, rising from 12% in 2008 to 14% today. Among the 15-19 year olds the movement was not especially large either, rising from 70% to 74% (remember, we *want* these kids to be part-time: they're supposed to be in school). The biggest jump was for the 20-24 group – that is, traditional-aged students and recent graduates – where the part-time share jumped from 29% to 35%. Now, some of that might be due to higher university enrolment rates (workers are more likely to be part-time if they are also studying), but at least some of it is simply a push towards increased casualization of the labour market.

So far, all of this is roughly consistent with what we saw Monday through Wednesday – which is that there was a one-time shock to employment around 2008, and that the effect is much more pronounced among younger graduates (say, those 6 months out from graduation) than it is among older ones (say, those 24 months after graduation). What is not quite consistent, though, is what is happening to wages. Unfortunately, CANSIM no longer makes available any decent time series data on wages or income for the 20-24 or 25-29 age groups (one of these days I am going to have to stump up for a public-use microdata file, but today is not that day). But it does offer some data on what's going on for 15-24 year olds. Sub-optimal (25-29 would be best), but still useful. Here's what that data looks like:

Figure 3: Hourly and Weekly Wages, 15-24 year-olds, in $2016, 2001-2016


Figure 3 shows hourly and weekly wages for 15-24 year olds, in 2016 dollars. Hourly wages (measured on the right axis) grew faster than weekly wages (measured on the left axis) because average hours worked fell by 3.5% (this is the shift to part-time work we saw in Figure 2). Hourly wage growth has not been as strong since 2008 as it was between 2004 and 2008, but it is still up 6% over that time. It's probably safe to assume that the situation for 25-29 year olds is not worse than it is for 20-24 year olds. Which means we have an interesting puzzle here: wage growth for the youth cohort as a whole is positive, at least among those who have wages – but as we saw Monday and Tuesday, wage growth is negative for recent university graduates. What's going on?
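
A quick note on the arithmetic relating those two lines: weekly wages are just hourly wages times hours worked, so weekly growth compounds hourly-wage growth with the change in hours. Plugging in the figures above:

    hourly_growth = 0.06   # hourly wages up ~6% since 2008 (right axis)
    hours_change = -0.035  # average hours worked down 3.5% (the part-time shift)

    weekly_growth = (1 + hourly_growth) * (1 + hours_change) - 1
    print(f"implied weekly wage growth since 2008: {weekly_growth:.1%}")  # ~2.3%

So the cohort-wide numbers hang together arithmetically; the puzzle is why graduates specifically are doing worse.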

There are two possibilities here. The first is that wage growth since 2008 has been stronger for those without university degrees than for those with them. With the oil/gas boom, that might have been a reasonable explanation up to 2014; it's hard to see it still being true now. The second is the proposition advanced here earlier this week: that while university graduates may still cluster at the right end of the bell curve, as they come to encompass a greater proportion of that curve, their average necessarily falls.

In short: post-2008, something has happened to the labour market which makes it more difficult for young people to "launch". We shouldn't overstate how big this problem is: employment is down slightly, as is the proportion of employment which is full-time. But unlike previous recessions, the youth labour market does not seem to be bouncing back – these changes seem to be permanent, which is a bit disquieting. But it's also true that these effects are more severe among the youngest, which is exactly what you'd expect if the labour market were putting a greater emphasis on skills and experience. By the time youth get to their late 20s, the effects mostly disappear.

In other words, what we are seeing is less "failure to launch" than "delays in the launch". To the extent anything has changed, it's that the transition to the labour market is on average a little bit rougher and a little bit slower than it used to be, but that's likely as much to do with the expansion of access to university as with any skills-biased change in the labour market.

January 18

More Bleak Data, But This Time on Colleges

Everyone seems to be enjoying data on graduate outcomes, so I thought I’d keep the party going by looking at similar data from Ontario colleges. But first, some of you have written to me suggesting I should throw some caveats on what’s been covered so far. So let me get a few things out of the way.

First, I goofed when saying that there was no data on response rates for these surveys. Apparently there is, and I just missed it. The rate this year was 40.1%, a figure which will make all the economists roll their eyes and start muttering about response bias, but which anyone with field experience in surveys will tell you is a pretty good response for a mail survey these days (and since the NGS response rate is now down around the 50% mark, it's not that far off the national "gold standard").

Second: all this data on incomes I've been giving you is a little less precise than it sounds. Technically, the Ontario surveys do not ask income; they ask income ranges (e.g. $0-20K, $20-40K, etc.). When the data is published, either by universities or the colleges, it is turned into more precise-looking figures by assigning each response the midpoint value of its range and then averaging those points. Yes, yes, kind of dreadful. Why can't we just link this stuff to tax data like EPRI does? Anyway, that means you should probably take the point values with a pinch of salt: but the trend lines are likely still meaningful.
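
If that procedure sounds abstract, here is the entire midpoint method in a few lines of Python (the bands and responses are invented for illustration):

    # Each income band is replaced by its midpoint, then the midpoints are averaged.
    MIDPOINTS = {"$0-20K": 10_000, "$20-40K": 30_000, "$40-60K": 50_000}

    responses = ["$20-40K", "$40-60K", "$20-40K", "$0-20K"]  # hypothetical answers
    average = sum(MIDPOINTS[r] for r in responses) / len(responses)
    print(f"'average income': ${average:,.0f}")  # $30,000: precise-looking, coarse underneath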

OK, with all that out of the way, let's turn to the issue of colleges. Unfortunately, Ontario does not collect or display data on college graduates' outcomes the way it does for universities. There is no data on income, for instance. And no data on employment two years after graduation, either. The only real point of comparison is employment six months after graduation, and even this is kind of painful: for universities the data is available only by field of study; for colleges, it is only available by institution. (I know, right?) And even then it is not calculated on quite the same basis: universities include graduates with job offers, while the college survey does not. So you can't even quite do an apples-to-apples comparison at the level of the sector as a whole. But if you ignore that last small difference in calculation and focus not on the point estimates but on the trends, you can still see something interesting. Here we go:

Figure 1: Employment Rates 6 months after Graduation, Ontario Universities vs. Ontario Colleges, by Graduating Cohort, 1999-2015


So, like I said, ignore the actual values in Figure 1 because they're calculated in two slightly different ways; instead, focus on the trends. And if you do that, what you see is (a blip in 2015 apart) that the relationship between employment rates in the college and university sectors looks pretty much the same throughout the period. Both had a wobble in the early 2000s, and then both took a big hit in the 2008 recession. Indeed, on the basis of this data, it's hard to make a case that one sector has done better than the other through the latest recession: both got creamed, and neither has yet recovered.

(Side point: why does the university line stop at 2013 while the college one goes out to 2015? Because Ontario doesn't interview university grads until two years after graduation, and then asks them retroactively what they were doing 18 months earlier. So the 2014 cohort was just interviewed last fall, and it'll be a few months until their data is released. College grads *only* get interviewed at six months, so their data is out much more quickly.)

What this actually does is put a big dent in the argument that the problem for youth employment is out-of-touch educators, changing skill profiles, sociologists v. welders, and all that other tosh people were talking about a few years ago. We're just having more trouble than we used to integrating graduates into the labour market. And I'll be taking a broader look at that using Labour Force Survey data tomorrow.

January 17

Another Lens on Bleak Graduate Income Data

So, yesterday we looked at Ontario university graduate employment data. Today I want to zero in a little bit on what's happening by field of study.

(I can hear two objections popping up already. First: "Why just Ontario?" Answer: while Quebec, Alberta, British Columbia and the Maritimes – via MPHEC – all publish similar data, they all publish it in slightly different ways, making it irritating (and in some cases impossible) to come up with a composite national figure. The National Graduate Survey (NGS) in theory does this, but only every five years, and as I explained last week it has made itself irrelevant by changing the survey period. So, in short, I can't do national, and Ontario a) is nearly half the country in terms of university enrolments and b) publishes slightly more detailed data than most. Second: "Why just universities?" Answer: fair point, I'll be publishing that data soon.

Everyone clear? OK, let’s keep going).

Let’s look first at employment rates 6 months after graduation by field of study (I include only the six largest – Business/Commerce, Education, Engineering, Humanities, Physical Sciences and Social Sciences – because otherwise these graphs would be an utter mess), shown below in Figure 1.  As was the case yesterday, the dates along the x-axis are the cohort graduation year.

[Figure 1: Employment rates six months after graduation, by field of study]

Two take-aways here, I think.  The first is that the post-08 recession really affected graduates of all fields more or less equally, with employment rates falling by between 6 and 8 percentage points (the exception is humanities, where current rates are only four percentage points below where they were in 2007).  The second is that pretty much since 2001, it’s graduates in the physical sciences who have had the weakest results.

OK, but as many in the academy say: 6 months isn’t enough to judge anything.  What about employment rates after, say, 2 years?  These are shown below in Figure 2.

[Figure 2: Employment rates two years after graduation, by field of study]

This graph is smoother than the previous one, which suggests the market for graduates with two years in the labour market is a lot more stable than that for graduates with just six months. If you compare the class of 2013 with the class of 2005 (the last one to completely miss the 2008-09 recession), business and commerce students' employment rates have fallen by only one percentage point, while those in social sciences have dropped by six percentage points, with the others falling somewhere in between. One definite point to note for all those STEM enthusiasts out there: there's no evidence here that students in STEM programs have fared much better than everyone else.

But employment is one thing; income is another.  I’ll spare you the graph of income at six months because really, who cares?  I’ll just go straight to what’s happening at two years.

[Figure 3: Average salaries two years after graduation, by field of study, in real dollars]

To be clear, what figure 3 shows is average graduate salaries two years after graduation in real dollars – that is, controlling for inflation.  And what we see here is that in all fields of study, income bops along fairly steadily until 2007 (i.e. class of 2005) at which point things change and incomes start to decline in all six subject areas.  Engineering was down, albeit only by three percent.  But income for business students was down 10%, physical sciences down 16%, and humanities, social sciences and education were down 19%, 20% and 21%, respectively.

This, I shouldn't need to emphasize, is freaking terrible. Actual employment rates may not be down that much, but this drop in early graduate earnings is pretty disastrous for the majority of students. Until a year or two ago I wasn't inclined to put a lot of weight on this: average graduate earnings have always popped back after recessions. This time seems to be different.

Now, as I said yesterday, we shouldn't be too quick to blame this on huge changes in the economy to which institutions are not responding; it's likely that part of the fall in averages comes from allowing more students to access education in the first place. As university graduates take up an increasing space on the right-hand side of an imaginary bell curve representing all youth, "average earnings" will naturally decline even if there's no overall change in the average or distribution of earnings as a whole. And the story might not be as negative if we were to take a five- or ten-year perspective on earnings. Ross Finnie has done some excellent work showing that in the long term nearly all university graduates make a decent return (though, equally, there is evidence that students with weak starts in the labour force have lower long-term earnings as well, through a process known as "labour market scarring").
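
That composition effect is easy to demonstrate with a toy simulation (all numbers invented, and graduates stylized as the top slice of the earnings curve, per the argument above): hold the earnings distribution of an entire youth cohort fixed, and simply expand the share of it that holds a degree. "Average graduate earnings" fall even though nobody's actual pay changes:

    import numpy as np

    rng = np.random.default_rng(0)
    # Fixed earnings distribution for a whole youth cohort (stand-in numbers).
    earnings = rng.lognormal(mean=10.5, sigma=0.5, size=100_000)

    # Let "graduates" expand from the top 25% of earners to the top 45%.
    for grad_share in (0.25, 0.35, 0.45):
        cutoff = np.quantile(earnings, 1 - grad_share)
        grads = earnings[earnings >= cutoff]
        print(f"graduate share {grad_share:.0%}: "
              f"average graduate earnings ${grads.mean():,.0f}")
    # The graduate average falls at each step, with no change to anyone's earnings.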

Whatever the cause, universities (and Arts faculties in particular) have to start addressing this issue honestly. People know in their gut that university graduates' futures in general (and Arts graduates' in particular) are not as rosy as they used to be. So when the Council of Ontario Universities puts out a media release, as it did last month, patting universities on the back for a job well done with respect to graduate outcomes, it rings decidedly false.

Universities can acknowledge challenges in graduate outcomes without admitting that they are somehow at fault. What they cannot do is pretend there isn't a problem, or shirk taking significant steps to improve employment outcomes.
