Higher Education Strategy Associates

A *Tiny* Statscan Mistake on the National Graduates Survey (NGS)

OK, everyone. Gather ’round for a kind of mind-bending story, which totally invalidates much of what I was saying earlier this week about the NGS.

So, NGS is a two-year follow-up of graduates, done every five years: the 2002 survey looked at the class of 2000, and the 2007 survey looked at the class of 2005. Now, as I noted yesterday, for reasons unknown, Statistics Canada chose to wait until 2013 to do its follow-up of 2010 graduates. My strong suspicion is that Employment and Social Development Canada (ESDC) was yanking their chain on funding for so long that they missed their 2012 window, but I don’t know for sure. (NGS, like many surveys, isn’t actually funded by Statscan – it’s paid for by an Ottawa line department. Yes, it’s ridiculous, but that’s the way Ottawa works.)

Now, waiting a year creates a problem because it screws with the time-series. If the 2010 class is interviewed three years out, the results are basically useless, because you can’t legitimately compare them with the 2005 or earlier data. But of course Statscan’s not stupid, I thought to myself: they’ll just ask the questions with respect to a period twelve months earlier and get comparable data that way. Because who in their right mind would deliberately screw with a time-series that goes back 30 years?

And when I asked someone about this a few months ago, that was more or less the answer I got – the survey would be changed to adjust for the difference in timing.  So when the first NGS release tables appeared late last week from Statscan, and they were labelled “2012 labour force activity of 2010 graduates”, I naturally assumed – hey, Statscan’s done the right thing.  And so I published it.

Then, yesterday AM, we received an email from Statscan. It contained a new set of tables, with a note saying that while the data published April 4th was correct, some “mislabeling” had occurred, and that we should destroy the earlier data.

Turns out that what Statscan actually published was data from 2013 – that is, three years after graduation, not two. This made me review the 2013 NGS instrument and realize that their re-adjustment of the instrument to account for the gap in surveys was half-assed at best. With a little bit of fooling around with the data, it might be possible to get numbers on employment and income two years out – but since Statscan has declared that it’s not going to put out a Public Use Microdata File (PUMF) for NGS, that’s essentially impossible for anyone to do independently. Meanwhile, the only data Statscan’s giving out for free is data that is completely non-comparable with any other data in the survey’s 30-year history.

Bra. Vo.

The upshot is: ignore anything you read, from me or anyone else, comparing graduate labour market outcomes over time using NGS. Thanks to Statscan (and possibly ESDC), it’s all worthless.

If there were ever a time to just cut funding to NGS and replace the whole thing with administrative data linkages, it’s now.  The argument for keeping it on the grounds of it being a great time-series just disappeared.


3 Responses to A *Tiny* Statscan Mistake on the National Graduates Survey (NGS)

  1. Frances Woolley says:

    Plus there’s the little thing about most Statistics Canada surveys being limited to people with landlines – hardly a random sample of the population under 30, no?

