Higher Education Strategy Associates

April 05

Student/Graduate Survey Data

This is my last thought on data for a while, I promise.  But I want to talk a little bit today about what we’re doing wrong with student and graduate surveys, which are increasingly misused.

About 15 years ago, the relevant technology for email surveys became sufficiently cheap and ubiquitous that everyone started using them.  I mean, everyone.  So what has happened over the last decade and a half is a proliferation of surveys and, with it – surprise, surprise – a steady decline in response rates.  We know that these low-participation surveys (nearly all are below 50%, and most are below 35%) are reliable, in the sense that they give us similar results year after year.  But we have no idea whether they are accurate, because we have no way of dealing with response bias.

Now, every once in a while you get someone with the cockamamie idea that the way to deal with low response rates is to expand the sample.  Remember how we all laughed at Tony Clement when he claimed the (voluntary) National Household Survey would be better than the (mandatory) Long-Form Census because the sample size would be larger?  Fun times.  But this is effectively what governments do when they decide – as the Ontario government did in the case of its sexual assault survey – to carry out what amounts to a (voluntary) student census.

So we have a problem: even as we want to make policy on a more data-informed basis, the quality of student data is declining (this also goes for graduate surveys, but I’ll come back to those in a second).  Fortunately, there is an answer: survey fewer students, but pay them.

What every institution should do – and frankly what every government should do as well – is create a balanced, stratified panel of about 1,000 students, and pay them maybe $10 per completed survey throughout the year.  That way, you’d have good response rates from a panel that actually represents the student body well, as opposed to the crapshoot that currently reigns.  Want accurate data on student satisfaction, library/IT usage, or the incidence of sexual assault/harassment?  This is the way to do it.  And you’d also be doing the rest of your student body a favour by not spamming them with questionnaires they don’t want.

(Costly?  Yes.  Good data ain’t free.  Institutions that care about good data will suck it up).
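
To make the mechanics concrete, here is a minimal sketch of how such a panel might be drawn, assuming a hypothetical student-registry table with faculty and year-of-study columns (the column names, strata, and panel size are all illustrative, not any institution's actual design):

# Sketch only: 'registry' is a hypothetical DataFrame of the full student body
# with 'student_id', 'faculty' and 'year_of_study' columns.
import pandas as pd

def draw_panel(registry: pd.DataFrame, panel_size: int = 1000, seed: int = 42) -> pd.DataFrame:
    """Draw a stratified random panel whose faculty/year mix matches the student body."""
    strata = ["faculty", "year_of_study"]
    # Each stratum's share of total enrolment determines its share of panel seats.
    shares = registry.groupby(strata).size() / len(registry)
    allocation = (shares * panel_size).round().astype(int)  # rounding may shift the total slightly

    samples = []
    for key, group in registry.groupby(strata):
        n = min(int(allocation.loc[key]), len(group))  # never sample more than the stratum holds
        samples.append(group.sample(n=n, random_state=seed))
    return pd.concat(samples, ignore_index=True)

# panel = draw_panel(registry)
# Each panellist then gets (say) $10 per completed questionnaire over the year.

In practice you would stratify on whatever dimensions matter locally (campus, program, full-time/part-time status) and top the panel up each term as members graduate; the basic point is that small, well-constructed, and paid beats large, haphazard, and free.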

It’s a slightly different story for graduate surveys.  Here, you also have a problem of response rates, but with the caveat that, at least as far as employment and income data are concerned, we aren’t going to have that problem for much longer.  You may be aware of Ross Finnie’s work linking student data to tax data to work out long-term income paths.  An increasing number of institutions are now doing this, as indeed is Statistics Canada for future versions of its National Graduate Survey (I give Statscan hell, deservedly, but for this they deserve kudos).

So now that we’re going to have excellent, up-to-date data on employment and income, we can re-orient our whole approach to graduate surveys.  We can move away from attempted censuses with a couple of not-totally-convincing questions about employment and re-shape them into what they should be: much more qualitative explorations of graduate pathways.  Give me a stratified sample of 2,000 graduates explaining in detail how they went from being a student to having a career (or not) three years later, rather than 50,000 graduates answering a closed-ended question about whether their job is “related” to their education, every day of the week.  The latter is a boring box-checking exercise; the former offers the potential for real understanding and improvement.

(And yeah, again: pay your survey respondents for their time.  The U.S. Department of Education does it on its surveys, and it gets great data.)

Bottom line: We need to get serious about ending the Tony Clement-icization of student/graduate data. That means getting serious about constructing better samples, incentivizing participation, and asking better questions (particularly of graduates).  And there’s no time like the present. If anyone wants to get serious about this discussion, let me know: I’d be overjoyed to help.

September 02

Some Basically Awful Graduate Outcomes Data

Yesterday, the Council of Ontario Universities released the results of the Ontario Graduates’ Survey for the class of 2012.  This document is a major source of information regarding employment and income for the province’s university graduates.  And despite the chipperness of the news release (“the best path to a job is still a university degree”), it actually tells a pretty awful story when you do things like, you know, place it in historical context, and adjust the results to account for inflation.

On the employment side, there’s very little to tell here.  Graduates got hit with a baseball bat at the start of the recession, and despite modest improvements in the overall economy, their employment rates have yet to regain anything like their former heights.

Figure 1: Employment Rates at 6-Months and 2-Years After Graduation, by Year of Graduating Class, Ontario

Now those numbers aren’t good, but they basically still say that the overwhelming majority of graduates get some kind of job after graduation.  The numbers vary by program, of course: in health professions, employment rates at both 6 months and 2 years out are close to 100%; in most other fields (Engineering, Humanities, Computer Science), they’re in the high 80s after six months – the rates are lowest in the Physical Sciences (85%) and Agriculture/Biological Sciences (82%).

But changes in employment rates are mild compared to what’s been happening with income.  Six months after graduation, the graduating class of 2012 had average incomes 7% below those of the class of 2005 (the last class to have been entirely surveyed before the 2008 recession); two years after graduation, the gap was 14%.

Figure 2: Average Income of Graduates at 6-Months and 2-Years Out, by Graduating Class, in Real 2013/4* Dollars, Ontario

*For comparability, the 6-month figures are converted into real Jan 2013 dollars in order to match the timing of the survey; similarly, the 2-year figures are converted into June 2014 dollars.
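
The arithmetic behind that adjustment is simple, and worth spelling out; here is a small sketch using made-up income and CPI figures purely for illustration (a real calculation would use Statistics Canada’s actual CPI values for the relevant months):

# Illustrative only: the incomes and CPI index values below are placeholders, not real figures.
def to_real_dollars(nominal: float, cpi_when_earned: float, cpi_base: float) -> float:
    """Restate a nominal income in dollars of the base period."""
    return nominal * (cpi_base / cpi_when_earned)

# Class of 2005 at 2 years out was surveyed around June 2007; class of 2012 around June 2014.
income_2005_nominal = 45_000                 # placeholder nominal average, class of 2005
income_2012_nominal = 46_000                 # placeholder nominal average, class of 2012
cpi_june_2007, cpi_june_2014 = 111.9, 125.9  # placeholder CPI index values (2002 = 100)

income_2005_real = to_real_dollars(income_2005_nominal, cpi_june_2007, cpi_june_2014)
change = (income_2012_nominal - income_2005_real) / income_2005_real
print(f"Real change vs. class of 2005: {change:+.1%}")  # negative, despite a higher nominal figure

The point of the exercise: a nominally higher average can still be a real pay cut once several years of inflation are taken into account.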

This is not simply a case of incomes stagnating after the recession: incomes have continued to deteriorate long after the return to economic growth.  And it’s not restricted to just a few fields of study, either.  Of the 25 fields of study this survey tracks, only one (Computer Science) has seen recent graduates’ incomes rise in real terms since 2005.  Elsewhere, it’s absolute carnage: Education graduates’ incomes are down 20%; Humanities and Physical Sciences down 19%; Agriculture/Biology down 18% (proving once again that, in Canada, the “S” in “STEM” doesn’t really belong, labour-market-wise).  Even Engineers have seen a real pay cut, albeit a modest one of 3%.

Figure 3: Change in Real Income of Graduates, Class of 2012 vs. Class of 2005, by Time Since Graduation, for Selected Fields of Study

Now, we need to be careful about interpreting this.  Certainly, part of this is about the recession having hit Ontario particularly hard – other provinces may not see the same pattern.  And in some fields of study – Education, for instance – there are demographic factors at work, too (fewer kids, less need for teachers, etc.).  And it’s worth remembering that there has been a huge increase in the number of graduates since 2005, as the double cohort – and later, larger cohorts – moved through the system.  This, as I noted back here, was always likely to affect graduate incomes, because it increased competition for graduate jobs (conceivably, it’s also a product of the new, wider intake, which resulted in a small drop in average academic ability).

But whatever the explanation, this is the story universities need to care about.  Forget tuition or student debt, neither of which is rising in any significant way.  Worry about employment rates.  Worry about income.  The number one reason students go to university, and the number one reason governments fund universities to the extent they do, is that, traditionally, universities have been the best path to career success.  Staying silent about long-term trends, as COU did in yesterday’s release, isn’t helpful, especially if it contributes to a persistent head-in-the-sand unwillingness to tackle the problem proactively.  If the positive career narrative disappears, the whole sector is in deep, deep trouble.

November 25

Graduate Income Data Miracle on the Rideau

My friend and colleague Ross Finnie has just published a remarkable series of papers on long-term outcomes from higher education, which everyone needs to go read, stat.

What he’s done is take 13 years of student data from the University of Ottawa and link it to income tax data held by Statistics Canada.  That means he can track income patterns by field of study, not over the puny 6-24 month period commonly used by provincial surveys, or the new 36-month standard the National Graduate Survey now uses, but for up to 13 years out.  And guess what?  Those results are pretty good.  Only five years out, all fields of study are averaging at least $60K in annual income.  Income does flatten out pretty quickly after that, but by then, of course, people are earning a pretty solid middle-class living – even the much-maligned Arts grads.
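
For a rough sense of the linkage-and-aggregation involved, here is a hedged sketch using hypothetical graduate and tax-record tables; the column names are invented for illustration, and the actual linkage of course happens inside Statistics Canada’s secure environment, not on anyone’s laptop:

# Sketch only: 'graduates' (person_id, grad_year, field_of_study) and
# 'tax_records' (person_id, tax_year, t4_income) are hypothetical tables.
import pandas as pd

def income_paths(graduates: pd.DataFrame, tax_records: pd.DataFrame) -> pd.DataFrame:
    """Mean income by field of study and years since graduation."""
    linked = graduates.merge(tax_records, on="person_id", how="inner")
    linked["years_out"] = linked["tax_year"] - linked["grad_year"]
    linked = linked[linked["years_out"].between(1, 13)]
    return (linked
            .groupby(["field_of_study", "years_out"])["t4_income"]
            .mean()
            .unstack("years_out"))

# paths = income_paths(graduates, tax_records)
# paths.loc["Humanities"] traces Humanities grads' average income from 1 to 13 years out;
# adding "grad_year" to the groupby gives the cohort-by-cohort comparisons shown in Figure 2.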

Figure 1: Average Post-Graduation Income of Class of 1998 University of Ottawa Graduates, by Field of Study and Number of Years After Graduation, in Thousands of 2011 Constant Dollars

One of the brilliant things about this data set is that you can not only compare across fields of study within a single cohort, but also compare across cohorts for a single field of study.  Finnie’s data shows that in Math/Science, Humanities, Social Science, and Health, income pathways did not vary much from one cohort to another: a 2008 History grad had basically the same early income pathway as one from 1998.  In two other fields, though, it was a different story.  The first is Business, where the 1998 cohort clearly had it a lot better than its later counterparts; at two years out, that cohort was making $10K per year more than later ones, a lead it then maintained for the rest of its career.  In ICT, the fates of the various cohorts were even more diverse.

Figure 2: Average Post-Graduation Income, Selected Cohorts of University of Ottawa Engineering/Computer Science Graduates, by Number of Years After Graduation, in Thousands of 2011 Constant Dollars

This is pretty stunning stuff: thanks to the dot-com bust, the first-year incomes of engineering and computer science graduates in 2004 were exactly half what they were in 2000 ($40,000 vs. $80,000).  If anyone wants to know why kids don’t flock to ICT as a career, consider uncertain returns as a fairly major reason.

Also examined is the question of income by gender:

Figure 3: Average Post-Graduation Income of Class of 1998 University of Ottawa Graduates, by Gender and Number of Years After Graduation, in Thousands of 2011 Constant Dollars

Two interesting things are at work with respect to gender.  The initial income gap of $10,000 in the first year after graduation is almost entirely a field-of-study effect: take out Engineering/Computer Science, and earnings are almost the same.  But after that, the gap widens at a pretty continuous pace across all fields of study.  It’s most pronounced in Business, where top-quartile male incomes really blow the averages out, but the pattern is the same everywhere.  Because of the way the data is collected, it’s impossible to say how much of this reflects differences in labour-market participation and hours worked, and how much reflects differences in hourly pay, but the final result – a gender gap of $20,000 to $25,000 in average earnings, regardless of field of study – is pretty striking.
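
Here is a small sketch of that composition check, assuming a hypothetical long-form table of graduate incomes; the column names and gender codes are illustrative only:

# Sketch only: 'grads' is a hypothetical DataFrame with 'gender' ("M"/"F"),
# 'field_of_study', 'years_out' and 'income' columns.
import pandas as pd

def first_year_gap(grads: pd.DataFrame, exclude: tuple = ()) -> float:
    """Male-minus-female mean income in the first year out, optionally excluding some fields."""
    first = grads[(grads["years_out"] == 1) & (~grads["field_of_study"].isin(exclude))]
    means = first.groupby("gender")["income"].mean()
    return means["M"] - means["F"]

# gap_all = first_year_gap(grads)                                            # roughly the $10K in Figure 3
# gap_ex  = first_year_gap(grads, exclude=("Engineering/Computer Science",))
# If gap_ex is close to zero while gap_all is not, the initial gap is mostly a
# field-of-study composition effect rather than a within-field difference.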

Are there caveats to this data?  Sure.  It’s just one university, located in a town heavy on government and ICT work.  My guess is that elsewhere, things might not look so good in Humanities and Social Science, and ICT outcomes may be less boom-and-bust-y.  But fortunately, Ross is on this one: he is currently building a consortium of institutions across the country to replicate this process, and build a more comprehensive national picture.

Let me press this point a bit on Ross’ behalf: there is no good reason why every institution in the country should not be part of this consortium.  If your institution is not part of it, ask yourself why.  This is the most important new source of data on education Canada has had in over a decade.  Everyone should contribute to it.

N.B. One tiny quibble about the papers: they present everything in monochrome graphic form – no tabular data.  To make the above figures, I’ve had to eyeball the data and re-enter it myself.  Apologies for any deviations from the original.