Higher Education Strategy Associates

Category Archives: Data

January 13

Restore the NGS!

One of the best things that Statistics Canada ever did in the higher education field was the National Graduates’ Survey (NGS). OK, it wasn’t entirely Statscan – NGS has never been a core product funded from the Statscan budget but rather funded periodically by Employment and Social Development Canada (ESDC) or HRDC or HRSDC or whatever earlier version of the department you care to name – but they were the ones doing the execution. After a trial run in the late 1970s (the results of which I outlined back here), Statscan tracked the results of the graduating cohorts of 1982, 1986, 1990, 1995, 2000 and 2005 two and five years after graduation (technically, only the 2-year was called NGS – the 5-year survey was called the Follow-up of Graduates or FOG but no one used the name because it was too goofy). It became the prime way Canada tracked transitions from post-secondary education to the labour market, and also issues related to student debt.

Now, NGS was never a perfect instrument. Most of the income data could have been obtained much more simply through administrative records, the way Ross Finnie is currently doing at EPRI. We could get better data on student debt if provinces ever got their act together and actually released student data on a consistent and regular basis (I’m told there is some chance of this happening in the near future). It didn’t ask enough questions about activities in school, and so couldn’t examine the effects of differences in provision (except for, say, field of study) on later outcomes. But for all that, it was still a decent survey, and more to the point one with a long history, which allowed one to make solid comparisons over time.

Then along came the budget-cutting exercises of the Harper government. ESDC decided it only had enough money for one survey, not two. Had Statscan or ESDC bothered to consult anyone about what to do in this situation, the answer would almost certainly have been: keep the 2-year survey and ditch the 5-year one. The 5-year survey was always beset with the twin problems of iffy response rates and being instantly out of date by the time it came out (“that was seven graduating classes ago!” people would say – “what about today’s graduates?”). But the 2-year? That was gold, with a decent time series going back (in some topic areas) almost 30 years. Don’t touch that, we all would have said, FOR GOD’S SAKE DON’T TOUCH IT, LEAVE IT AS IT IS.

But of course, Statscan and ESDC didn’t consult, and they didn’t leave it alone. Instead of sticking with a survey of students two years out, they decided to survey students three years out, thereby making the results for labour market transitions totally incompatible with the previous six iterations of the survey. They spent millions to get a whole bunch of data that was hugely sub-optimal, because they murdered a perfectly good time series to get it.

I have never heard a satisfactory explanation as to why this happened. I think it’s either a) someone said: “hey, if we’re ditching a 2-year and a 5-year survey, why not compromise and make a single 3-year survey?” or b) Statscan drew a sample frame from institutions for the 2010 graduating class, ESDC held up the funding until it was too late to do a two-year sample and then when it eventually came through Statscan said, “well we already have a frame for 2010, so why not sample them three years out instead of doing the sensible thing and going back and getting a new frame for the 2011 cohort which would allow us to sample two years out”. To be clear, both of these possible reasons are ludicrous and utterly indefensible as a way to proceed with a valuable dataset, albeit in different ways. But this is Ottawa so anything is possible.

I have yet to hear anything about what, if anything, Statscan and ESDC plan to do about surveying the graduating cohort of 2015. If they were going to return to a two-year cycle, surveying would have to happen this spring; if they’re planning on sticking with three, the survey would happen in spring 2018. But here’s my modest proposal: there is nothing more important about NGS than bringing back the 2-year survey frame. Nothing at all. Whatever it takes, do it two years out. If that means surveying the class of 2016 instead of 2015, do it. We’ll forget the Class of 2010 survey ever happened. Do not, under any circumstances, try to build a new standard based on a 3-year frame. We spent 30 years building a good time series at 24 months out from graduation. Better to have a one-cycle gap in that time series than to spend another 30 years building up an equally good time series at 36 months from graduation.

Please, Statscan. Don’t mess this up.

February 17

Some Curious Student Loan Numbers

Every once in a while, it’s good to go searching through statistical abstracts just to see if the patterns you take for granted can still be taken for granted.  So I recently went hunting through some CSLP annual reports and statistical abstracts to see what I could find.  And I’m glad I did, because there are some really surprising numbers in the data.

So here’s the really big take-away: the number of students borrowing from the Canada Student Loan Program rose from 365,363 in 2008-09, to 472,167 in 2012-13.  In just four years, that’s an increase of 29%.  Which is kind of staggering.  It’s therefore important to ask the question: what the heck is going on?  Where are all these new borrowers coming from?
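The growth figure is easy to verify from the two borrower counts; a quick sketch:

```python
# CSLP borrower counts from the statistical abstracts, as cited above.
borrowers_2008_09 = 365_363
borrowers_2012_13 = 472_167

growth = (borrowers_2012_13 - borrowers_2008_09) / borrowers_2008_09
print(f"{growth:.0%}")  # prints "29%"
```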

Well, for one thing, we know it isn’t being led by a new wave of students in private, for-profit institutions.  In fact, the increase occurred across all types of institutions, with a slightly more pronounced growth among students in community colleges.

Figure 1: Growth in Number of Student Borrowers by Type of Institution, 2008-09 to 2012-13

It’s a different story when we look at borrowing growth by province.  Here, we see a more straightforward – and somewhat puzzling – story: borrower numbers are up fairly substantially everywhere west of the Ottawa River; however, numbers are flat, or down slightly, everywhere in the Atlantic (note: because we are looking only at CSLP borrowing, there is no data for Quebec, which has opted out of the program).

Figure 2: Growth in Number of Student Borrowers by Province, 2008-09 to 2012-13

One thing that Figure 2 obscures is the relative size of the provinces, and thus each province’s share of the growth in borrower numbers.  Ontario, where growth in borrower numbers has been 38%, actually accounts for over three-quarters (77%) of all growth in borrowing within the CSLP zone; in total, Ontario now accounts for nearly two-thirds (64%) of the CSLP loan portfolio.

You can’t explain Figure 2 in terms of economic fundamentals: neither the recession’s effects nor education costs were that different in the Atlantic.  To a considerable degree, what Figure 2 is really showing is population change: youth populations in the Atlantic are shrinking, and that is primarily why their borrower numbers are going down (Newfoundland is falling even further because of real declines in costs and – probably – because family incomes rose quickly in this period thanks to the oil boom).

To get a better look at changes in the borrower population by province, we need to look at changes in the percentage of full-time students who are borrowing.  Now, it’s difficult to do this because CSLP itself doesn’t calculate this figure, and doesn’t quite break down figures enough to do it accurately.  So below in Figure 3 what we show is the number of total borrowers (including at private vocational colleges), divided by the number of students enrolled full-time in public universities and colleges.  This will slightly overstate the percentage of students borrowing (borrowers at private colleges make up about 10% of the borrowing population, so mentally adjust the numbers downward if you would); also, the denominator is total students enrolled in the province, not originating in the province, so Nova Scotia’s figure in particular is an undercount because of all the out-of-province students there.  With those caveats in mind, here are the percentages of students borrowing across the country:
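For anyone wanting to replicate the Figure 3 construction, here is a minimal sketch of the proxy just described; the borrower and enrolment counts in the example are illustrative placeholders, not actual CSLP or provincial figures:

```python
def pct_borrowing(borrowers: int, full_time_enrolment: int) -> float:
    """Share of full-time students borrowing.

    Slightly overstated, per the caveats in the text: the numerator
    includes borrowers at private vocational colleges, and the
    denominator counts students enrolled in (not originating from)
    the province.
    """
    return borrowers / full_time_enrolment

# Illustrative only: 98,000 borrowers against 200,000 full-time students.
print(f"{pct_borrowing(98_000, 200_000):.0%}")  # prints "49%"
```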

Figure 3: Percentage of Full-Time Students with Loans, by Province, 2008-09 and 2012-13

The percentage of students borrowing grew in every province except Newfoundland, Saskatchewan, and New Brunswick. But the real story here is Ontario, where the percentage of students borrowing jumped by nine percentage points (from 44% to 53%), which led to a national rise of seven percentage points (42% to 49%). It’s not entirely clear why there was such a jump in Ontario.  The recession there was not that much more severe than elsewhere, and student costs, though high, were not rising that much more quickly than elsewhere.  Part of the answer may be that in the last couple of years the new Ontario Tuition Grant has been in effect, which enticed higher-income students into the student aid system with its outrageous $160,000 family-income cut-off line.  But that can’t be the entire story, as growth in numbers was actually fairly steady from year to year.

What might be going on? My guess is two things.  First, student numbers are expanding in most provinces.  Almost by necessity, if expansion is happening, it is going to happen disproportionately among those who we traditionally call “underserved” (that is, the poor, students with dependents, etc.), who by definition are more likely to be eligible for student aid.  This is to say, what we are seeing here is not evidence of a problem, but rather evidence of student aid working exactly as it should, to expand access.

The second factor is what I call delayed recognition.  Back in the 2000s, student aid eligibility for dependent students was expanded enormously.  Essentially, we went from a situation in 2003 where most families saw eligibility for student aid end at around the $85,000-$90,000 mark in family income, to one in 2006 and thereafter where the cutoff rose to about $160,000 (the number varies a bit by province and family size, but that’s roughly the scale of the change).  However, much to everyone’s surprise, take-up rates barely rose, presumably because CSLP didn’t go out of its way to advertise the changes much.  What may be happening is that families across the country – but especially in Ontario – may finally be cluing in to how much assistance they are entitled to under the post-2006 rules, and acting accordingly.  In other words, this could just be an improvement in take-up rates rather than a deterioration in family and student finances.

November 24

Class Size, Teaching Loads, and that Curious CUDO Data Redux

You may recall that last week I posted some curious data from CUDO, which suggested that the ratio of undergraduate “classes” (we’re not entirely sure what this means) to full-time professors in Ontario was an amazingly low 2.4 to 1.  Three quick follow-ups to that piece.

1.  In the previous post, I offered space on the blog to anyone involved with CUDO who could clear up the mystery of why undergraduate teaching loads appeared to be so low.  No one has taken me up on this offer.  Poor show, but it’s not too late; I hereby repeat the offer in the hope that someone will step forward with something convincing.

2.  I had a couple of people – both in Arts faculties at different medium-sized non-U15 Ontario universities – try to explain the 2.4 number as follows: teaching loads *are* in fact 4 courses per year (2/2), they said.  It’s just that once you count sabbaticals, maternity leaves, high enrolment (profs sometimes get a reduced load if one of their classes is particularly large), leaves for administrative duty, and “buyouts” (i.e. a prof pays to have a sessional teach the class so he/she can do research), you come down to around 2.5.

This is sort of fascinating.  I mean, if this were generally true, it essentially means that universities are managing their staff on the assumption that 35-40% of staff resources are theoretically unavailable for teaching.  Now, obviously all industries overstaff to some extent: sick leaves and maternity leaves happen everywhere.  But 40%?  That sounds extremely high.  It does not speak particularly well of an institution that gets its money primarily for the purpose of teaching.  Again, it would be useful if someone in an institution could confirm/deny, but it’s a heck of a stat.

3.  Turns out there’s actually a way to check this, because at least one university – give it up for Carleton, everyone – actually makes statistics about sessional professors public!  Like, on their website, for everyone to see.  Mirabile dictu.

Anyways, what Carleton says is that in 2014-15, 1,397 “course sections” were taught by contract or retired faculty, which translates into 756.3 “credits”.  At the same time, the university says it has 850 academic staff (actually, 878, but I’m excluding the librarians here).  Assuming they are all meant to teach 2/2, this would be 3,400 “classes” per year.  Now, it’s not entirely clear to me whether the definition of “classes” is closer to “credits” or “course sections”; I kind of think it is somewhere in between.  If it’s the former, then contract/retired faculty are teaching 22.2% of all undergraduate classes; if it’s the latter, then it’s 41.1%.  That’s a wide range, but probably about right.  And since Carleton is a pretty typical Canadian university, my guess is these numbers roughly hold throughout the system.
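Spelled out, the Carleton arithmetic looks like this (the only assumption being the 2/2 load described above):

```python
# Carleton's published 2014-15 figures, as cited above.
sessional_course_sections = 1397
sessional_credits = 756.3
academic_staff = 850  # excluding librarians

assumed_load = 4  # the 2/2 assumption: four classes per professor per year
total_classes = academic_staff * assumed_load  # 3,400 "classes" per year

share_if_credits = sessional_credits / total_classes           # lower bound
share_if_sections = sessional_course_sections / total_classes  # upper bound
print(f"{share_if_credits:.1%} to {share_if_sections:.1%}")  # 22.2% to 41.1%
```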

However, what this doesn’t tell you is what percentage of credit hours are taught by sessionals – if the undergraduate classes taught by these academics are larger, on average, than those taught by full-timers, then the proportion will be even higher than this.  I’ve had numerous conversations with people in a position to know who indicate that in many Ontario Arts faculties, the percentage of undergraduate credit hours taught by sessional faculty is roughly 50%. Elsewhere, of course, mileage may vary, but my guess is that with the possible exception of the Atlantic, this is the case pretty much everywhere.

I could be wrong, of course.  As with my CUDO offer, anyone who wants to step forward with actual data to show how I am wrong is welcome to take over the blog for a couple of days to present the evidence.

November 17

Curious Data on Teaching Loads in Ontario

Back in 2006, university Presidents got so mad at Maclean’s that they stopped providing data to the publication.  Recognizing that this might create the impression that they had something to hide, they developed something called “Common University Data Ontario” (CUDO) to provide the public with a number of important quantitative descriptors of each university.  In theory, this data is of better quality and more reliable than the stuff they used to give Maclean’s.

One of the data elements in CUDO has to do with teaching and class size.  There’s a table for each university, which shows the distribution of class sizes in each “year” (1st, 2nd, 3rd, 4th): below 30, 31-60, 61-90, 91-150, 151-250, and over 250.  The table is done twice, once including just “classes”, and again with slightly different cut-points that include “subsections” as well (things like laboratories and tutorial sections).  I was picking through this data when I realised it could be used to take a crude look at teaching loads, because the same CUDO data also provides a handy count of full-time professors at each institution.  Basically, instead of looking at the distribution of classes, all you have to do is add up the actual number of undergraduate classes offered, divide it by the number of professors, and you get the number of courses per professor.  That’s not a teaching load per se, because many courses are taught by sessionals, and hell will freeze over before institutions release data on that subject.  Thus, any “courses per professor” figure derived from this exercise is going to overstate the amount of undergraduate teaching being done by full-time profs.
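The exercise itself is trivial to reproduce.  A sketch with invented numbers (these are not any institution’s actual CUDO counts):

```python
# Invented class counts by CUDO size band, for illustration only.
undergrad_classes_by_band = {
    "below 30": 400,
    "31-60": 250,
    "61-90": 120,
    "91-150": 80,
    "151-250": 40,
    "over 250": 10,
}
full_time_professors = 370

total_classes = sum(undergrad_classes_by_band.values())  # 900
print(round(total_classes / full_time_professors, 1))    # prints 2.4
```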

Below is a list of Ontario universities, arranged in ascending order of the number of undergraduate courses per full-time professor.  It also shows the number of courses per professor if all subsections are also included.  Of course, in most cases, at most institutions, subsections are not handled by full-time professors but some are; and so assuming the underlying numbers are real, a “true” measure of courses per professors would be somewhere in between the two.  And remember, these are classes per year, not per term.

Classes Per Professor, Ontario, 2013

Yes, you’re reading that right.  According to universities’ own data, on average, professors are teaching just under two and a half classes per year, or a little over one course per semester.  At Toronto, McMaster, and Windsor, the average is less than one course per semester.  If you include subsections, the figure rises to three courses per semester, but of course as we know subsections aren’t usually led by professors.   And, let me just say this again, because we are not accounting for classes taught by sessionals, these are all overstatements of course loads.

Now these would be pretty scandalous numbers if they were measuring something real.  But I think it’s pretty clear that they are not.  Teaching loads at Nipissing are not five times higher than they are at Windsor; they are not three and a half times higher at Guelph than at Toronto.  They’re just not.  And nor is the use of sessional faculty quite so different from one institution to another as to produce these anomalies.  The only other explanation is that there is something wrong with the data.

The problem is: this is a pretty simple ratio; it’s just professors and classes.  The numbers of professors reported by each institution look about right to me, so there must be something odd about the way that most institutions – Trent, Lakehead, Guelph, and Nipissing perhaps excepted – are counting classes.  To put that another way, although it’s labelled “common data”, it probably isn’t.  Certainly, I know of at least one university where the class-size data used within the institution explicitly rejects the CUDO definitions (that is, they produce one set of figures for CUDO and another for internal use because senior management thinks the CUDO definitions are nonsense).

Basically, you have to pick an interpretation here: either teaching loads are much, much lower than we thought, or there is something seriously wrong with the CUDO data used to show class sizes.  For what it’s worth, my money is on it being more column B than column A.  But that’s scarcely better: if there is a problem with this data, what other CUDO data might be similarly problematic?  What’s the point of CUDO if the data is not in fact common?

It would be good if someone associated with the CUDO project could clear this up.  If anyone wants to try, I can give them this space for a day to offer a response.  But it had better be good, because this data is deeply, deeply weird.

November 16

An Interesting but Irritating Report on Graduate Overqualification

On Thursday, the Office of the Parliamentary Budget Officer (PBO) released a report on the state of the Canadian labour market.  It’s one of those things the PBO does because the state of the labour market drives the federal budget, to some extent.  But in this report, the PBO decided to do something different: it decided to look at the state of the labour market from the point of view of recent graduates, and specifically whether graduates are “overqualified” for their jobs.

The methodology was relatively simple: using the Labour Force Survey, determine the National Occupation Code (NOC) for every employed person between the ages of 25 and 34.  Since NOCs are classified according to the level of education they are deemed to require, it’s simple to compare each person’s level of education to the NOC of the job they are in, and on that basis decide whether someone is “overqualified”, “underqualified” or “rightly qualified”.
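In code, that classification logic amounts to something like the following sketch; the three-level education ladder is a simplification invented here for illustration, not the actual NOC coding scheme:

```python
# Simplified education ladder; the real NOC scheme has more levels.
LEVELS = {"high school": 0, "college": 1, "university": 2}

def classify(worker_education: str, noc_required_education: str) -> str:
    """Compare a worker's education with what their job's NOC requires."""
    diff = LEVELS[worker_education] - LEVELS[noc_required_education]
    if diff > 0:
        return "overqualified"
    if diff < 0:
        return "underqualified"
    return "rightly qualified"

print(classify("university", "college"))  # prints "overqualified"
```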

So here’s what the PBO found: over the past decade or so, among university graduates, the rate of overqualification is rising, and the rate of “rightly qualified” graduates is falling.  Among college graduates, the reverse is true.  Interesting, but as it turns out not quite the whole story.

Now, before I get into a read of the data, a small aside: take a look at the way the PBO chose to portray the data on university graduates.

Figure 1: Weaselly PBO Way of Presenting Data on Overqualification Among 25-34 Year Old University Graduates

Wow!  Startling reversal, right?  Wrong.  Take a look at the weaselly double Y-axis.  Here’s what the same data looks like if you plot it on a single axis:

Figure 2: Same Data on University Graduate Overqualification, Presented in Non-Weaselly Fashion

See?  A slightly less sensational story.  Clearly, someone in PBO wanted to spice up the narrative a bit, and did so by making a pretty unforgivable graph, one that was clearly meant to overstate the fundamental situation.  Very poor form from the PBO.

Anyways, what should we make of this change in university graduates’ fortunes?  Well, remember that there was a massive upswing in university access starting at the tail end of the 1990s.  This meant a huge change in attainment rates over the decade.

Figure 3: Attainment Rates Among 25-34 Year-Olds, Canada

What this upswing in the university attainment rate meant was that there were a heck of a lot more university graduates in the market in 2013 than there were, say, a decade earlier.  In fact, 540,000 more, on a base of just over a million – a 53% increase between 1999 and 2013.  Though the PBO doesn’t mention it in the report, it’s nevertheless an important background fact.  Indeed, it likely explains a lot of the pattern change we are seeing.
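The 53% figure is consistent with that base; “just over a million” works out to roughly 1.02 million (an approximation, back-solved here for illustration):

```python
# Figures from the text; the 1999 base is approximate ("just over a million").
additional_graduates = 540_000
base_1999 = 1_020_000

print(f"{additional_graduates / base_1999:.0%}")  # prints "53%"
```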

To see how important that is, let’s look at this in terms of numbers rather than percentages.

Figure 4: Numbers of Rightly-Qualified and Overqualified 25-34 Year Old University Graduates, 1999-2013

In fact, the number of rightly-qualified graduates is up substantially over the last decade, and they’ve been increasing at almost (though not quite) as fast a rate as the number of “overqualified” graduates.  For comparison, here’s the situation in colleges:

Figure 5: Numbers of Rightly-Qualified and Overqualified 25-34 Year Old College Graduates, 1999-2013

As advertised, there’s no question that the trend in college outcomes looks better than the one for universities.  Partly that’s because of improvements in colleges’ offerings, and partly it has to do with the run-up in commodity prices, which made college credentials more valuable (remember the Green-Foley papers? Good times).

What should you take from all of this?  If nothing else, don’t forget that comparing university outcomes over time is hard because of the changing size and composition of the student body.  Remember: the median student today wouldn’t have made it into university 25 years ago.  Average outcomes were always likely to fall somewhat, both because more graduates means more competition for the same jobs, and also because the average academic ability of new entrants is somewhat lower.

It would be interesting, for instance, to see these PBO results while holding high school grades constant – then you’d be able to tell whether falling rates of “rightly-qualified” graduates were due to changing economy/less relevant education, or a changing student body.  But since we can’t, all one can really say about the PBO report is: don’t jump to conclusions.

Especially on the basis of those godawful graphs.

October 13

Statistics Canada is in the Wrong Century

If what you are looking for is agricultural statistics, Statistics Canada is a wondrous place.  See, Statscan even made a fabulous (if oddly truncated) little video about agricultural statistics.

Statscan can tell you *anything* about agriculture.  Monthly oilseed crushing statistics?  No problem (59,387 tonnes in August, in case you were wondering).  It can tell you on a weekly basis the weight of all eggs laid and processed in Canada (week of August 1st = 2.3 million kilograms); it can even break it down by “frozen” and “liquid”.  Want to know the annual value of ranch-raised pelts in Canada?  Statscan’s got you covered.

But let’s not stop here.  Wondering about barley, flaxseed, and canola deliveries for August, by province?  Check.  National stocks of sweetened concentrated whole milk, going back to 1970? Check (for comparison, GDP data only goes back to 1997).  Average farm prices for potatoes, per hundredweight, back to 1908?  Check.

There is even – and this one is my favourite – an annual Mushroom Growers’ Survey.  (Technically, it’s a census of mushroom growers – and yes, this means Statscan expends resources to maintain a register of Canadian mushroom growers; let that sink in for a moment.)  From this survey – the instrument is here – one can learn what percentage of mushrooms grown in Canada are of the Shiitake variety, whether said Shiitake mushrooms are grown on logs, in sawdust, or in pulp mill waste fibres, and then compare whether the value per employee of mushroom operations is greater or lesser for Shiitake mushrooms than for Agaricus or Oyster mushrooms.

According to Statistics Canada, this is actually worth spending money on.  This stuff matters.

Also according to Statistics Canada: the combined value of agriculture, forestry, fishing, and hunting is $25 billion.  Or about $10 billion a year less than the country spends on universities alone.  Total value of educational services is $86 billion a year.

And yet, here are a few things Statscan doesn’t know about education in Canada: the number of first-year students in Canada, the number of part-time instructors at Canadian universities, anything at all about college instructors, access rates to post-secondary education by ethnic background or family income, actual drop-out and completion rates in secondary or post-secondary education, the number of new entrants each year to post-secondary education, the rate at which students transfer between universities and colleges, or within universities and colleges, time-to-completion, rates of student loan default, scientific outputs of PSE institutions, average college tuition, absolutely anything at all about private for-profit trainers… do I need to go on?  You can all list your pet peeves here.

Even on topics they do know, they often know them badly, or slowly.  We know about egg hatchings from two months ago, but have no idea about college and university enrolment from fall 2013.  We have statistics on international students, but they do not line up cleanly with statistics from Immigration & Citizenship.  We get totals on student debt at graduation from the National Graduates Survey, but they are self-reports and are invariably published four years after the student graduates.

What does it say about Canada’s relationship to the knowledge economy when it is official policy to survey mushroom growers annually, but PSE graduates only every five years?  Who in their right mind thinks this is appropriate in this day and age?

Now, look, I get it: human capital statistics are more complicated than agricultural statistics, and they take more work, and you have to negotiate with provinces and institutions, and yadda yadda yadda.  Yes.  All true.  But it’s a matter of priorities.  If you actually thought human capital mattered, it would be measured, just as agriculture is.

The fact that this data gap exists is a governmental problem rather than one resulting from Statscan specifically.  The agency is hamstrung by its legislation (which mandates a substantial focus on agriculture) and its funding.  Nevertheless, the result is that we have a national statistical system that is perfectly geared to the Edwardian era, but one that is not fit for purpose when it comes to the modern knowledge economy.  Not even close.

October 08

Higher Education Data Glasnost

Many people complain that there is a lack of post-secondary data in Canada.  But this is actually not true.  There are tons of data about; it’s just that institutions won’t share or publish much of it.

Let me tell you a little story.  Once upon a time, there was a small, public-minded higher education research company that wanted to create the equivalent of Statistics Canada’s university tuition fee index for colleges.  The company had completed a project like this before, but had done so in a somewhat imprecise way because of the time and effort involved in getting the enrolment data necessary to weight the program-level tuition data.  And so, politely, it began asking colleges for their enrolments by program.

Now, program-level enrolments are not a state secret.  We are talking about publicly-funded institutions here, and given the number of people who use these services, this is very much the definition of public information.  Nor are these data difficult to collect or display.  Institutions know exactly how many students are in each program, because it’s the basis on which they collect fees.

And yet, across most of the country, many institutions have simply refused to provide the data.  The reasons are a mix of the understandable and the indefensible.  Some probably don’t want to do it because it’s a disruptive task outside their workplan.  Others are cautious because they don’t quite know how the data will be used (or they disagree with how it will be used) and are afraid of internal repercussions if it turns out that the shared data ends up making their institution look bad (note: we’re using the data to publish provincial averages, not institutional ones; however, in single-institution provinces like Saskatchewan or Newfoundland and Labrador, this can’t be helped).  A few simply just don’t want to release the data because it’s theirs.

Regardless, it is unacceptable for public institutions to conceal basic operational data for reasons of convenience.  That’s not the way publicly-funded bodies are supposed to operate in a democracy.  And so, regretfully, we’ve had to resort to filing Access to Information (ATI) requests to find out how many students attend public college programs across Canada.  Sad, but true.

It then occurred to me how many of our national higher education data problems could be solved through Access to Information legislation.  Take Simona Chiose’s very good piece in the Globe and Mail last week, in which she tried to piece together what Canadian universities are doing with sessional professors, and where many institutions simply refused to give her data.  If someone simply started hitting the universities with annual ATI requests on sessional lecturers, and publishing the results, we’d have good data pretty quickly.  Ditto for data on teaching loads.  All that excellent comparable data the U-15 collects every year?  You can’t ATI the U-15 because it’s a private entity, but it’s easy-peasy lemon squeezy to ATI all of the U-15 members for their correspondence with the Ottawa office, and get the data that way (or, conversely, ATI the U-15’s correspondence to each university, and get the collected data that way).

Oh, I could go on here.  Want better data on staff and students?  ATI the universities that have factbooks, but refuse to put them online (hello, McGill and UNB!).  Want better data on PhD graduate outcomes?  ATI every university’s commencement programs from last year’s graduation ceremonies, and presto, you’ve got a register of 3,000 or so PhDs, most of whom can be tracked on social media to create a statistical portrait of career paths (this would take a little bit of volunteer effort, but I can think of quite a few people who would provide it, so while not easy-peasy lemon squeezy, it wouldn’t be difficult-difficult lemon difficult, either).

It’s not a cure-all of course.  Even with all that ATI data, it would take time to process the data and put it into usable formats. Plus there’s an agency problem: who’s going to put all these requests together? Personally, I think student unions are the best place to do so, if not necessarily the best-equipped to subsequently handle the data.  And of course institutional data is only part of the equation.  Statistics Canada data has to improve significantly, too, in order to better look at student outcomes (a usable retention measure would be good, as would an annual PSIS-LAD-student aid database link to replace the now-terminally screwed National Graduates Survey).

To be clear, I’m not suggesting going hair-trigger on ATIs.  It’s important to always ask nicely for the data first; sometimes, institutions and governments can be very open and helpful.  But the basic issue is that the data practices of post-secondary institutions in Canada have to change.  Secrecy in the name of protecting privacy is laudable; secrecy in the name of self-interested ass-covering is not.  We need some glasnost in higher education data in this country.  If it takes a wave of ATI requests to do it, so be it.  Eventually, once enough of these ATI results filter into the public realm, institutions themselves will try to get ahead of the curve and become more transparent as a matter of course.

I’d like to think there was a simpler and less confrontational way of achieving data transparency, but I am starting to lose hope.

September 15

Visible Minority Numbers Rise Sharply

I was poking around some data from the Canadian Undergraduate Survey Consortium the other day and I found some utterly mind-blowing data.  Take a look at these statistics on the percentage of first-year students self-identifying as a “visible minority” on the Consortium’s triennial Survey of First Year Students:

Figure 1: Self-Identified Visible Minority Students as a Percentage of Entering Class, 2001-2013

Crazy, right?  Must be all those international students flooding in.

Er, no.  Well, there are more students with permanent residence outside Canada, but they aren’t necessarily driving these numbers, because they represent only about 7% of survey respondents.  If we assume that 80% of these students are themselves visible minorities, and we pull them out of the data, the visible minority numbers look like this:

Figure 2: Visible Minority Students, International* vs. Domestic, 2001-2013

*assumes 80% of students with permanent residence outside Canada are “visible minorities”
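For the curious, the adjustment behind Figure 2 can be sketched in a few lines of code. The numbers below are purely illustrative (CUSC doesn’t publish its microdata in this form); the only real assumptions carried over from the text are the roughly 7% international share and the 80% visible-minority rate among international respondents.

```python
# Sketch of the adjustment described above, with illustrative (not actual
# CUSC) numbers.  Assumes 80% of respondents with permanent residence
# outside Canada self-identify as visible minorities.

def domestic_vm_share(total_respondents, vm_respondents, intl_respondents,
                      intl_vm_rate=0.80):
    """Estimate the visible-minority share among domestic respondents only."""
    intl_vm = intl_respondents * intl_vm_rate        # assumed intl VM count
    domestic = total_respondents - intl_respondents  # domestic respondents
    domestic_vm = vm_respondents - intl_vm           # VM count net of intl
    return domestic_vm / domestic

# Illustrative example: 10,000 respondents, 3,000 self-identified visible
# minority, 700 (7%) with permanent residence outside Canada.
share = domestic_vm_share(10_000, 3_000, 700)
print(round(share, 3))  # → 0.262
```

In other words, stripping out international students under that 80% assumption only shaves a few percentage points off the headline figure, which is why the domestic trend still shows a substantial jump.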

That’s still a heck of a jump.  Maybe it has something to do with the changing demographics of Canadian youth?

Well, we can sort of track this by looking at census data on visible minorities, aged 15-24, from 2001 and 2006, and (yes, yes, I know) the 2011 National Household Survey, and then marry these up with the 2001, 2007, and 2013 CUSC data.  Not perfect, but it gives you a sense of contrasting trends.  Here’s what we find.

Figure 3: Domestic Visible Minority Students as a Percentage of Total vs. Visible Minorities as a Percentage of all 15-24 Year-Olds, 2001, 2007, 2013

So, yes, a greater proportion of domestic youth self-identify as visible minorities, but that doesn’t come close to explaining what seems to be going on here.

What about changes in the survey population?  Well, it’s true that the consortium’s membership isn’t stable, and that there is some movement in participating institutions over time.  If we just look at 2007 and 2013 – a period during which the number of visible minority students almost doubled – we can see how a change in participating schools might have shifted things.

Table 1: Schools Participating in CUSC First-Year Survey, 2007 and 2013

Here’s what stands out to me on that list.  York and Waterloo are in the 2013 survey, but were not there in 2007, which you’d think would skew the 2013 data a bit higher on visible minorities (although not that much – together, these two schools were only 7% of the total sample).  On the other hand, UBC Vancouver was there in the 2007 survey, but not 2013, which you’d think would skew things the other way.  On the basis of this, I’d say changes in school participation probably contributed somewhat to the change, but were not decisive.

I could end this post with a call for better data (always a good thing).  But if a trend is big enough, even bad data can pick it up.  I think that might be what we’re seeing here with the increase in visible minority students.  It’s a big, intriguing story.

September 14

Better Post-Secondary Data: Game On

On Saturday morning, the US Department of Education released the College Scorecard.  What the heck is the College Scorecard, you ask?  And why did they release it on a Saturday morning?  Well, I have no earthly idea about the latter, but as for the former: it’s a bit of a long story.

You might remember that a little over a year ago, President Obama came up with the idea for the US Government to “rate” colleges on things like affordability, graduation rates, graduate earnings and the like.  The thinking was that this kind of transparency would punish institutions that provided genuinely bad value for money by exposing said poor value to the market, while at the same time encouraging all institutions to become more attentive to costs and outcomes.

The problem with the original idea was three-fold.  First, no one was certain that the quality of available data was good enough.  Second, the idea of using the same set of ratings both for quality improvement and to enforce minimum standards was always a bit dicey.  And third, the politics of the whole thing were atrocious – the idea that a government might declare that institution X is better than institution Y was a recipe for angry alumni pretty much everywhere.

So back in July, the Administration gave up on the idea of rating institutions (though it had been quietly backing away from it for months); however, it didn’t give up on the idea of collecting and disseminating the data.  Thus, on Saturday, what it released instead was a “scorecard”: a way to look up data on every institution without actually rating those institutions.  But also – and this is what had nerds in datagasm over the weekend – it released all of the data (click “download all data” here).  Several hundred different fields’ worth.  For 20 years. It’s totally unbelievable.

Some of the data, being contextual, is pretty picayune: want to know which institution has the most students who die within four years of starting school?  It’s there (three separate branches of a private mechanics school called Universal Technical Institute).  But other bits of the data are pretty revealing.  School with the highest average family income? (Trinity College, Connecticut.)  With the lowest percentage of former students earning over $25,000 eight years after graduation? (Emma’s Beauty Academy in Mayaguez, PR.)  With the highest default rates? (Seven different institutions – six private, one public – have 100% default rates.)
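For anyone wondering what “fun to look at” means in practice, this is roughly the kind of lookup involved once the file is downloaded. The table below is a tiny synthetic stand-in, and the column names other than `INSTNM` (which is a real Scorecard field) are hypothetical placeholders – the actual file has several hundred fields with its own naming scheme.

```python
# A minimal sketch of scanning Scorecard-style data for extremes, using a
# synthetic stand-in table.  Column names "median_family_income" and
# "default_rate" are illustrative placeholders, not actual Scorecard fields.

import csv
import io

# Synthetic stand-in for the downloaded Scorecard CSV.
raw = """INSTNM,median_family_income,default_rate
Example College A,32000,0.12
Example College B,91000,0.03
Example Trade School C,28000,0.41
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Which institution has the highest family income?  The highest default rate?
richest = max(rows, key=lambda r: float(r["median_family_income"]))
worst_default = max(rows, key=lambda r: float(r["default_rate"]))

print(richest["INSTNM"])        # → Example College B
print(worst_default["INSTNM"])  # → Example Trade School C
```

The point is simply that once the raw file is public, this sort of question takes three lines of code to answer – which is exactly why the release matters.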

Now, two big caveats about this data.  The first is that institutional-level data isn’t, in most cases, all that helpful (graduate incomes are more a function of field of study than institution, for instance). The second caveat is that the information on former students’ earnings relates only to student aid recipients (it’s a political/legal thing – basically, the government could look up the post-graduation earnings of students who received aid, but not of students who funded themselves).  The government plans to rectify the first caveat ahead of next year’s release; but you had better believe that institutions will fight to their dying breath over the second, because nothing scares them more than transparency.  As a result, while lots of the data is fun to look at, it’s not exactly the kind of stuff with which students should necessarily make decisions (a point made with great aplomb by the University of Wisconsin’s Sara Goldrick-Rab).

Caveats aside, this data release is an enormous deal.  It completely raises the bar for institutional transparency, not just in the United States but everywhere in the world.  Canadian governments should take a good long look at what America just did, and ask themselves why they can’t do the same thing.

No… scratch that.  We ALL need to ask governments why they can’t do this.  And we shouldn’t accept any answers about technical difficulties.  The simple fact is that it’s a lack of political will, an unwillingness to confront the obscurantist self-interest of institutions.

But as of Saturday, that’s not good enough anymore.  We all deserve better.

September 02

Some Basically Awful Graduate Outcomes Data

Yesterday, the Council of Ontario Universities released the results of the Ontario Graduates’ Survey for the class of 2012.  This document is a major source of information regarding employment and income for the province’s university graduates.  And despite the chipperness of the news release (“the best path to a job is still a university degree”), it actually tells a pretty awful story when you do things like, you know, place it in historical context, and adjust the results to account for inflation.

On the employment side, there’s very little to tell here.  Graduates got hit with a baseball bat at the start of the recession, and despite modest improvements in the overall economy, their employment rates have yet to resume anything like their former heights.

Figure 1: Employment Rates at 6-Months and 2-Years After Graduation, by Year of Graduating Class, Ontario

Now those numbers aren’t good, but they basically still say that the overwhelming majority of graduates get some kind of job after graduation.  The numbers vary by program, of course: in health professions, employment rates at both 6-months and 2-years out are close to 100%; in most other fields (Engineering, Humanities, Computer Science), it’s in the high 80s after six months – it’s lowest in the Physical Sciences (85%) and Agriculture/Biological Sciences (82%).

But changes in employment rates are mild compared to what’s been happening with income.  Six months after graduation, the graduating class of 2012 had average incomes 7% below those of the class of 2005 (the last class to be surveyed entirely before the 2008 recession).  Two years after graduation, its incomes were 14% below those of the 2005 class.

Figure 2: Average Income of Graduates at 6-Months and 2-Years Out, by Graduating Class, in Real 2013/4* Dollars, Ontario

*For comparability, the 6-month figures are converted into real Jan 2013 dollars in order to match the timing of the survey; similarly, the 2-year figures are converted into June 2014 dollars.
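The conversion described in that footnote is just a CPI ratio applied to the nominal figures. Here’s a sketch with made-up numbers – the income and CPI values below are illustrative, not actual Statistics Canada or COU figures:

```python
# Sketch of the inflation adjustment described in the footnote, using
# illustrative CPI index values (not actual Statistics Canada figures).

def to_real_dollars(nominal, cpi_survey_period, cpi_base_period):
    """Restate a nominal amount in base-period dollars via the CPI ratio."""
    return nominal * (cpi_base_period / cpi_survey_period)

# Illustrative: a 2005 graduate's 6-month-out income of $41,000, restated
# in Jan 2013 dollars with hypothetical CPI values (2005 ≈ 107, Jan 2013 ≈ 122).
real = to_real_dollars(41_000, 107, 122)
print(round(real))  # → 46748
```

Doing this for every cohort is what makes the year-over-year comparison legitimate: without it, nominal income growth would mask the real decline the figure shows.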

This is not simply a case of incomes stagnating after the recession: incomes have continued to deteriorate long after the return to economic growth.  And it’s not restricted to just a few fields of study, either.  Of the 25 fields of study this survey tracks, only one (Computer Science) has seen recent graduates’ incomes rise in real terms since 2005.  Elsewhere, it’s absolute carnage: Education graduates’ incomes are down 20%; Humanities and Physical Sciences down 19%; Agriculture/Biology down 18% (proving once again that, in Canada, the “S” in “STEM” doesn’t really belong, labour market-wise).  Even Engineers have seen a real pay cut (albeit a modest 3%).

Figure 3: Change in Real Income of Graduates, Class of 2012 vs. Class of 2005, by Time Since Graduation, for Selected Fields of Study

Now, we need to be careful about interpreting this.  Certainly, part of this is about the recession having hit Ontario particularly hard – other provinces may not see the same pattern.  And in some fields of study – Education, for instance – there are demographic factors at work, too (fewer kids, less need for teachers, etc.).  And it’s worth remembering that there has been a huge increase in the number of graduates since 2005, as the double cohort – and later, larger cohorts – moved through the system.  This, as I noted back here, was always likely to affect graduate incomes, because it increased competition for graduate jobs (conceivably, it’s also a product of the new, wider intake, which resulted in a small drop in average academic ability).

But whatever the explanation, this is the story universities need to care about.  Forget tuition or student debt, neither of which is rising in any significant way.  Worry about employment rates.  Worry about income.  The number one reason students go to university, and the number one reason governments fund universities to the extent they do, is because, traditionally, universities have been the best path to career success.  Staying silent about long-term trends, as COU did in yesterday’s release, isn’t helpful, especially if it contributes to a persistent head-in-the-sand unwillingness to proactively tackle the problem.  If the positive career narrative disappears, the whole sector is in deep, deep trouble.
