Higher Education Strategy Associates

Tag Archives: Statistics Canada

April 12

Access: A Canadian Success Story

Statscan put out a very important little paper on access to post-secondary education on Monday.  It got almost zero coverage despite conclusively putting to bed a number of myths about fees and participation, so I’m going to rectify that by explaining it to y’all in minute detail.

To understand this piece, you need to know something about a neat little Statscan tool called the Longitudinal Administrative Databank (LAD).  LAD randomly selects one in five people filing an income tax form for the first time and follows them for the rest of their lives.  If, at the time someone first files a tax return, they have the same address as someone who is already in the LAD (and who is the right age to have a kid submitting a tax form for the first time), one can make a link between parent and child.  In other words, for roughly 4% of the population, LAD has data on both the individual and the parent, which allows some intergenerational analysis.  Now, because we have tax credits for post-secondary education (PSE), tax data allows us to know who went to post-secondary education and who did not (it can’t tell us what type of institution they attended, only that they attended PSE).  And with LAD’s backward link to parents, that means we can measure attendance by parental income.
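For the mechanically inclined, here is a toy sketch of the kind of tabulation this linkage makes possible. The records and the income cut-off are entirely invented for illustration; the real LAD files look nothing like this:

```python
# Hypothetical linked records: each entry is a sampled filer whose
# parent is also in the LAD, so parental income is observable.
# (person_id, parent_id, parental_income, claimed_pse_tax_credit)
filers = [
    (1, 100, 35_000, True),
    (2, 101, 92_000, True),
    (3, 102, 41_000, False),
    (4, 103, 150_000, True),
]

def attendance_rate(group):
    """Share of a group whose tax records show PSE tax credits."""
    return sum(f[3] for f in group) / len(group)

# A crude two-way income split (the paper uses quintiles)
low = [f for f in filers if f[2] < 50_000]
high = [f for f in filers if f[2] >= 50_000]
print(f"lower-income attendance: {attendance_rate(low):.0%}")
print(f"higher-income attendance: {attendance_rate(high):.0%}")
```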

Got that?  Good.  Let’s begin.

The paper starts by looking at national trends in PSE participation (i.e. university and college combined) amongst 19 year-olds since 2001, by family income quintile.  Nationally, participation rates rose by just over 20%, from 52.6% to 63.8%.  They also rose for every quintile.  Even for youth in the lowest income quintile, participation is now very close to 50%.

 Figure 1: PSE enrolment rates by Income Quintile, Canada 2001-2014


This positive national story about rates by income quintile is somewhat offset by a more complex set of results for participation rates by region.  In the six eastern provinces, the participation rate rose on average by 13.6 percentage points; in the four western provinces, it rose by just 2.8 percentage points (and in Saskatchewan it actually fell slightly).  The easy answer here is that it’s about the resource boom, but if that were the case, you’d expect to see a similar pattern in Newfoundland, and a difference within the west between Manitoba and the others.  In fact, neither is true: Manitoba is slightly below the western average, and Newfoundland had the country’s highest PSE participation growth rate.

 Figure 2: PSE Participation rates by region, 2002-2014


(Actually, my favourite part of Figure 2 is the data showing that 19 year-old Quebecers – who mostly attend free CEGEPs – have a lower participation rate than 19 year-old Ontarians, who pay significant fees, albeit with the benefit of a good student aid system.)

But maybe the most interesting data here is with respect to the closing of the gap between the top and bottom income quintile.  Figure 3 shows the ratio of participation rates of students from the bottom quintile (Q1) to those from the top quintile (Q5), indexed to the ratio as it existed in 2001, for Canada and selected provinces.  So a larger number means Q1 students are becoming more likely to attend PSE relative to Q5s and a smaller number means they are becoming less likely.  Nationally, the gap has narrowed by about 15%, but the interesting story is actually at the provincial level.
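To make the indexing concrete, here is how such a series would be computed, with made-up participation rates chosen only to mirror the roughly 15% national narrowing the paper reports:

```python
# Illustrative (not actual) participation rates by quintile and year
rates = {
    2001: {"Q1": 0.40, "Q5": 0.80},
    2014: {"Q1": 0.49, "Q5": 0.85},
}

# Q1/Q5 ratio each year, then index so that 2001 = 1.0
ratio = {yr: v["Q1"] / v["Q5"] for yr, v in rates.items()}
indexed = {yr: r / ratio[2001] for yr, r in ratio.items()}

# A value above 1.0 means Q1 youth have gained ground on Q5 youth
print(round(indexed[2014], 2))  # 1.15, i.e. the gap narrowed by ~15%
```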

Figure 3: Ratio of Q1 participation rates to Q5 participation rates, Canada and selected provinces, 2001-2014


At the top end, what we find is that Newfoundland and Ontario are the provinces where the gap between rich and poor has narrowed the most.  Given that one of these provinces has the country’s highest tuition and the other the lowest, I think we can safely rule out tuition, on its own, as a plausible independent variable (especially as Quebec, the country’s other low-tuition province, posted no change over the period in question).  At the bottom end, we have the very puzzling case of Saskatchewan, where inequality appears to have got drastically worse over the past decade or so.  And again, though it’s tempting to reach for a resource boom explanation, nothing similar happened in Alberta so that’s not an obvious culprit.

Anyways, here’s why this work is important.  For decades, the usual suspects (the Canadian Federation of Students, the Canadian Centre for Policy Alternatives) have blazed with self-righteousness about the effects of higher tuition and higher debts (debt actually hasn’t increased that much in real terms since 2000, but whatever).  But it turns out there are no such effects.  After more than a decade of slowly rising tuition, and with average debt among those who borrow now over $25,000, not only did participation rates increase, but the participation rate of the poorest quintile rose fastest of all.

And – here’s the kicker – different provincial strategies on tuition appear to have had diddly-squat to do with it.  So the entire argument the so-called progressives make in favour of lower tuition is simply out the window.  That doesn’t mean they will change their position, of course.  They will continue to talk about the need to eliminate student debt because it is creating inequality (it’s actually the reverse, but whatever).  But of course, this makes the free-tuition position even sillier.  If the problem is simply student debt, then why advocate a policy in which over half your dollars go to people who have no debt?

It’s the Ontario result in particular that matters: it proves that a high-tuition/high-aid policy is compatible with a substantial widening of access.  And that’s good news for anyone who wants smart funding policies in higher education.

March 27

Losing Count

Stop me if you’ve heard this story before: Canada is not sufficiently innovative, and part of the reason is that we don’t spend enough on research.  It’s not that we don’t spend enough on *public* research; adjusted for GDP, we are actually above average on that.  What pulls us down in international comparisons is corporate R&D.  Our narrow-minded, short-sighted, resource-obsessed business class spends far less on R&D than its equivalents in most other countries, and that is what gives us such a low overall R&D spend.

Familiar?  It should be; it’s been standard cant in Canada for a couple of decades at least.  And it gets used to argue for two very specific things.  There’s the argument which basically says “look, if private R&D is terrible, we’ll just have to make it up on the public side, won’t we?” – and where else to spend but on university research?  (Universities Canada used to make this argument quite a bit, but not so much lately, AFAIK.)  Then there’s the argument that says: well, under the linear model of innovation in which public “R” leads to private “D”, the problem must be that public “R” is too theoretical or insufficiently focused on areas of national industrial strength – and what we really need to do is make research more applied/translational/whatever.

But what if that story is wrong?

Last year, the Impact Centre at the University of Toronto put out a little-noticed paper called Losing Count. It noted a major problem in the collection and reporting of R&D data.  Starting in 1997, Statistics Canada adopted a definition of Research and Development which aligned with Canada’s tax laws.  This makes perfect sense from a reporting point of view, because it reduces the reporting burden on big corporations (they can use the same data twice).  But from the perspective of measuring Canada against other countries, it’s not so good, because it means the Canadian statistics are different from those in the rest of the world.

Specifically, Canada since 1997 has under-reported Business R&D in two ways.  First, it does not report any R&D in the social sciences and humanities.  All those other OECD countries are reporting research in business, financial management, psychology, information science, etc., but we are not.  Second, work that develops or improves materials, products and processes, but that draws on existing knowledge rather than new scientific or new technological advances is not counted as Research & Development in Canada but is counted elsewhere.

How big a problem is this?  Well, one problem is that literally every time the Canada Revenue Agency tightens eligibility for tax credits, reported business R&D falls.  As this has happened a number of times over the past two decades, it may well be that our declining business R&D figures are more a function of stricter tax laws than of changing business activity.  As for the difference in the absolute amount being measured, it’s impossible to say.  The authors of the study took a sample of ten companies (which they acknowledge is not scientific in any way) and determined that if the broader, more OECD-consistent definition were used, spending on R&D salaries would rise by a factor of three.  If that were true across the board (it probably isn’t), it would shift Canada from being one of the world’s weakest business R&D performers to one of the best.

Still, even if this particular result is not generalizable, the study remains valuable for two reasons.  First, it underlines how tough it is for statistical agencies to capture data on something as fluid and amorphous as research and development in a sensible, simple way.  And second, precisely because the data is so hard to collect, international comparisons are extremely hard to make.  National figures can be off by a wide margin simply because statistical agencies make slightly different decisions about how to collect data efficiently.

The takeaway is this:  the next time someone tells a story about how innovation is being throttled by lack of business spending on research (compared to, say, the US or Sweden), ask them if they’ve read Losing Count.  Because while this study isn’t the last word on the subject, it poses questions that no one even vaguely serious about playing in the innovation space should be able to ignore.

January 27

A Slice of Canadian Higher Education History

There are a few gems scattered through Statistics Canada’s archives. Digging around their site the other day, I came across a fantastic trove of documents published by the Dominion Bureau of Statistics (as StatsCan used to be called) called Higher Education in Canada. The earliest number in this series dates from 1938, and is available here. I urge you to read the whole thing, because it’s a hoot. But let me just focus in on a couple of points in this document worth pondering.

The first point of interest is the enrolment statistics (see page 65 of the PDF, 63 of the document). It won’t of course surprise anyone to know that enrolment at universities was a lot smaller in 1937-38 than it is today (33,600 undergraduates then, 970,000 or so now), or that colleges were non-existent back then. What is a bit striking is the large number of students being taught in universities who were “pre-matriculation” (i.e. high school students). Nearly one-third of all students in universities in 1937-38 had this “pre-matric” status. Now, two-thirds of these were in Quebec, where the “colleges classiques” tended to blur the line between secondary and post-secondary (and, in their new guise as CEGEPs, still kind of do). But outside of British Columbia, all universities had at least some pre-matric students, which would have made these institutions quite different from modern ones.

The second point of interest is the section on entrance requirements at various universities (pages 12-13 of the PDF, pp. 10-11 of the document). With the exception of UNB, every Canadian university east of the Ottawa River required Latin or Greek in order to enter university, as did Queen’s, Western and McMaster. Elsewhere, Latin was an alternative to Mathematics (U of T), or to a modern language (usually French or German). What’s interesting here is not so much the decline in salience of classical languages as the decline in salience of any foreign language. In 1938, it was impossible to gain admission to a Canadian university without first matriculating in a second language, and at a majority of them a third language was required as well. I hear a lot of blah blah about internationalization on Canadian campuses, but 80 years on there are no Canadian universities which require graduates to learn a second language, let alone set this as a condition of entry. An area, clearly, where we have gone backwards.

The third and final bit to enjoy is the section on tuition fees (page 13), which I reproduce here:


*$1 in 1937-38 = $16.26 in 2016
**$1 in 1928-29 = $13.95 in 2016

Be a bit careful in comparing across years here: because of deflation, $100 in 1928 was worth only $85 in 1937 dollars, so institutions which kept nominal prices stable in fact saw a rise in income in real terms. There are a bunch of interesting stories here, including the fact that institutions had very different pricing strategies in the depression. Some (e.g. McGill, Saskatchewan, Acadia) increased tuition, while others (mostly Catholic institutions like the Quebec seminaries and St. Dunstan’s) either held the line or cut fees. Also mildly amusing is the fact that McGill’s tuition for in-province students is almost unchanged since 1937-38 (one can imagine the slogan: “McGill – we’ve been this cheap since the Rape of Nanking!”).

The more interesting point here is that if you go back to the 1920s, not all Canadian universities were receiving stable and recurrent operating grants from provincial governments (of note: nowhere in this digest of university statistics is government funding even mentioned). Nationally, in 1935, all universities combined received $5.4 million from provincial governments – and U of T accounted for about a quarter of that. For every dollar in fees universities received from students, they received $1.22 from government. So when you see that universities were for the most part charging around $125 per student in 1937-38, what that means is that total operating funding per student was maybe $275, or a shade under $4,500 per student in today’s dollars. That’s about one-fifth of today’s operating income per student.
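Spelling that arithmetic out (the $125 and $1.22 figures are the ones described above; the 1937-to-2016 conversion factor of roughly 16 is an assumption consistent with the “shade under $4,500” figure):

```python
fees_per_student = 125        # typical tuition, 1937-38
gov_per_fee_dollar = 1.22     # provincial grant per $1 of student fees
to_2016 = 16.26               # assumed $1 (1937-38) -> 2016 conversion

# Fees plus matching government money, then inflated to today's dollars
operating_then = fees_per_student * (1 + gov_per_fee_dollar)
operating_now = operating_then * to_2016
print(operating_then, operating_now)  # roughly $278 then, ~$4,500 now
```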

While most of that extra per-student income has gone towards making institutions more capital-intensive (scientific facilities in general were pretty scarce in the 1930s), there’s no question that the financial position of academics has improved. If you take a quick gander at page 15, which shows the distribution of professorial salaries, you’ll see that average annual salaries for associate profs were just below $3,500, while those for full professors were probably in the $4,200 range. Even after adjusting for inflation, that means academic salaries were less than half what they are today. Indeed, one of the reasons tenure was so valued back then was that job security made up for the not-stellar pay. Times change.

In any case, explore this document on your own: many hours (well, minutes anyway) of fun to be had here.

January 26

An Amazing Statscan Skills Study

I’ve been hard on Statscan lately because of their mostly-inexcusable data collection practices.  But every once in a while the organization redeems itself.  This week, that redemption takes the form of an Analytical Studies Branch research paper by Marc Frenette and Kristyn Frank entitled Do Postsecondary Graduates Land High-Skilled Jobs?  The implications of this paper are pretty significant, but also nuanced and susceptible to over-interpretation.  So let’s go over in detail what this paper’s about.

The key question Frenette & Frank are answering is “what kinds of skills are required in the jobs in which recent graduates (defined operationally here as Canadians aged 25-34 with post-secondary credentials) find themselves”.  This is not, to be clear, an attempt to measure what skills these students possess; rather it is an attempt to see what skills their jobs require.  Two different things.  People might end up in jobs requiring skills they don’t have; alternatively, they may end up in jobs which demand fewer skills than the ones they possess.  Keep that definition in mind as you read.

The data source Frenette & Frank use is something called the Occupational Information Network (O*NET), which was developed by the US Department of Labor.  Basically, its developers spent ages interviewing employees, employers, and occupational analysts to work out the skill levels typically required in hundreds of different occupations.  For the purpose of this paper, the skills analyzed and rated include reading, writing, math, science, problem solving, social skills, technical operation, technical design and analysis, and resource management (i.e. management of money and people).  Frenette & Frank then take all that data and transpose it onto Canadian occupational definitions, so they can assign levels on nine different skill dimensions to each Canadian occupation.  Then they use National Household Survey data (yes, yes, I know) to look at post-secondary graduates and the occupations they hold.  On this basis, at the level of the individual, they can link highest credential received to the skills required in that person’s occupation.  Multiply that over a couple of million Canadians and Frenette and Frank have themselves one heck of a database.
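The linkage logic is simple enough to sketch. The occupations, scores and credentials below are invented for illustration, not taken from O*NET or the NHS:

```python
from collections import defaultdict

# Hypothetical occupation -> skill-requirement scores (O*NET style)
occ_skills = {
    "civil engineer": {"reading": 4.2, "writing": 3.9},
    "retail clerk":   {"reading": 2.6, "writing": 2.1},
}

# Hypothetical survey records: credential held and occupation worked in
graduates = [
    {"credential": "BEng", "occupation": "civil engineer"},
    {"credential": "BA",   "occupation": "retail clerk"},
]

# Attach each person's job skill requirements via their occupation,
# then average one skill dimension by credential
totals, counts = defaultdict(float), defaultdict(int)
for g in graduates:
    totals[g["credential"]] += occ_skills[g["occupation"]]["reading"]
    counts[g["credential"]] += 1

avg_reading = {c: totals[c] / counts[c] for c in totals}
print(avg_reading)  # {'BEng': 4.2, 'BA': 2.6}
```

Note that this measures the job, not the person: a graduate with superb reading skills stocking shelves gets the retail clerk’s score.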

So, the first swing at analysis is to look at occupational skill requirements by level of education.   With only a couple of exceptions – technical operations being the most obvious one – these more or less all rise according to the level of education. The other really amusing exception is that apparently PhDs do not occupy/are not offered jobs which require management skills.  But it’s when they get away from level of education and move to field of study that things get really interesting.  To what extent are graduates from various fields of study employed in jobs that require,  for instance, high levels of reading comprehension or writing ability?  I reproduce Frenette & Frank’s results below.


Yep.  You read that right.  The jobs with the highest reading comprehension requirements are the ones held by engineers.  Humanities grads?  Not so much.


It’s pretty much the same story with writing, though business types tend to do better on that measure.


…and critical thinking rounds out the set.

So what’s going on here?  How is it that the humanities (“We teach people to think!“) get such weak scores while “mere” professional degrees like business and engineering do so well?  Well, let’s be careful about interpretation.  These charts are not saying that BEng and BCom grads are necessarily better than BA grads at reading, writing and critical thinking, though one shouldn’t rule that out.  They’re saying that BEng and BCom grads get jobs with higher reading, writing and critical thinking requirements than BAs do.  Arguably, it’s a measure of underemployment rather than a measure of skill.  I’m not sure I personally would argue that, but it is at least arguable.

But whatever field of study you’re from, there’s a lot of food for thought here.  If reading and writing are such a big deal for BEngs, should universities wanting to give their BEngs a job boost spend more time giving them communication skills?  If you’re in social sciences or humanities, what implications do these results have for curriculum design?

I know if I were a university President, these are the kinds of questions I’d be asking my deans after reading this report.

January 13

Restore the NGS!

One of the best things that Statistics Canada ever did in the higher education field was the National Graduates’ Survey (NGS). OK, it wasn’t entirely Statscan – NGS has never been a core product funded from the Statscan budget but rather funded periodically by Employment and Social Development Canada (ESDC) or HRDC or HRSDC or whatever earlier version of the department you care to name – but they were the ones doing the execution. After a trial run in the late 1970s (the results of which I outlined back here), Statscan tracked the results of the graduating cohorts of 1982, 1986, 1990, 1995, 2000 and 2005 two and five years after graduation (technically, only the 2-year was called NGS – the 5-year survey was called the Follow-up of Graduates or FOG but no one used the name because it was too goofy). It became the prime way Canada tracked transitions from post-secondary education to the labour market, and also issues related to student debt.

Now NGS was never a perfect instrument. Most of the income data could have been obtained much more simply through administrative records, the way Ross Finnie is currently doing at EPRI. We could get better data on student debt if provinces ever got their act together and actually released student data on a consistent and regular basis (I’m told there is some chance of this happening in the near future). It didn’t ask enough questions about activities in school, and so couldn’t examine the effects of differences in provision (except for, say, field of study) on later outcomes. But for all that, it was still a decent survey, and more to the point one with a long history, which allowed one to make solid comparisons over time.

Then along come the budget-cutting exercises of the Harper government. ESDC decides it only has enough money for one survey, not two. Had Statscan or ESDC bothered to consult anyone about what to do in this situation, the answer would almost certainly have been: keep the 2-year survey and ditch the 5-year one. The 5-year survey was always beset with the twin problems of iffy response rates and being instantly out of date by the time it came out (“that was seven graduating classes ago!” people would say – “what about today’s graduates?”). But the 2-year? That was gold, with a decent time series going back (in some topic areas) almost 30 years. Don’t touch that, we all would have said, FOR GOD’S SAKE DON’T TOUCH IT, LEAVE IT AS IT IS.

But of course, Statscan and ESDC didn’t consult, and they didn’t leave it alone. Instead of sticking with a survey two years out, they decided to survey students three years out, thereby making the results for labour market transitions totally incompatible with the previous six iterations of the survey. They spent millions to get a whole bunch of data which was hugely sub-optimal, because they murdered a perfectly good time series to get it.

I have never heard a satisfactory explanation as to why this happened. I think it’s either a) someone said: “hey, if we’re ditching a 2-year and a 5-year survey, why not compromise and make a single 3-year survey?” or b) Statscan drew a sample frame from institutions for the 2010 graduating class, ESDC held up the funding until it was too late to do a two-year sample and then when it eventually came through Statscan said, “well we already have a frame for 2010, so why not sample them three years out instead of doing the sensible thing and going back and getting a new frame for the 2011 cohort which would allow us to sample two years out”. To be clear, both of these possible reasons are ludicrous and utterly indefensible as a way to proceed with a valuable dataset, albeit in different ways. But this is Ottawa so anything is possible.

I have yet to hear anything about what, if anything, Statscan and ESDC plan to do about surveying the graduating cohort of 2015. If they were going to return to a two-year cycle, surveying would have to happen this spring; if they’re planning on sticking with three, the survey would happen in spring 2018. But here’s my modest proposal: there is nothing more important about NGS than bringing back the 2-year survey frame. Nothing at all. Whatever it takes, do it two years out. If that means surveying the class of 2016 instead of 2015, do it. We’ll forget the Class of 2010 survey ever happened. Do not, under any circumstances, try to build a new standard based on a 3-year frame. We spent 30 years building a good time series at 24 months out from graduation. Better to have a one-cycle gap in that time series than to spend another 30 years building up an equally good one at 36 months from graduation.

Please, Statscan. Don’t mess this up.

December 07

Two (Relatively) Good News Studies

A quick summary of two studies that came out this week which everyone should know about.

Programme for International Student Assessment (PISA)

On Tuesday, the results for the 2015 PISA tests were released.  PISA is, of course, that multi-country assessment of 15 year-olds in math, science and reading which takes place every three years and is managed by the Organization for Economic Co-operation and Development (OECD).  PISA is not a test of curriculum knowledge (in an international context that would be really tough); what it is instead is a test of how well individuals’ knowledge of reading, math and science can be applied to real-world challenges.  So the outcomes of the test can best be thought of as some sort of measure of cognitive ability in various domains.

In addition to taking the tests, students also answer questions about themselves, their study habits and their family background, and schools provide information about the kinds of resources they have and the kind of curriculum structure they use. So there is an awful lot of background information about each student who takes the test, and that permits some pretty interesting and detailed cross-national examination of the determinants of this cognitive ability.  And from this kind of analysis, the good folks at the OECD have determined that government policy is best focused in four areas.

But heck, nobody wants to hear about that; what everybody wants to know is “where did we rank”?  And the answer is: pretty high.  The short version is here and the long version here, but here are the headlines: Out of the 72 countries where students took the test, Canada came 2nd in Reading, 7th in Science and 10th in Math.  If you break things down to the sub-jurisdictional level (Canada vastly oversamples compared to other countries so that it can get results at a provincial level), BC comes first in the world for reading (Singapore second, Alberta third, Quebec fourth and Ontario fifth).  In Science, Alberta and British Columbia come second and third in the world (behind only Singapore which as a country came top in every category).  In Math, the story is not quite as good, but Quebec still cracks the top three.

CMEC also has a publication out which goes into more depth at the provincial level (available here).  The short story is our four big provinces do well across the board but the little ones less so (in some cases much less so).  Worth a glance if comparing provinces rather than countries is your thing.

One final little nugget from the report: the survey taken by students asks if the students see themselves heading towards a Science-based career in the future.  In Canada, 34% said yes, the second highest of any country in the survey (after the US).  I’d like to think this will put to rest all the snarky remarks about how kids aren’t sufficiently STEM-geared these days (<cough> Ken Coates <cough>), but I’m not holding my breath.

Statscan Report on Youth Employment

Statistics Canada put out some interesting data on youth employment, authored by René Morissette, on Monday.  It’s one of those half-full/half-empty stories: the youth unemployment rate is back down to 13%, where it was in 1976 (and hence lower than it has been for most of the intervening 40 years), but the percentage of youth working full-time has dropped.  The tricky part of this analysis – not really covered by the paper – is that the comparison in both time periods excludes students.  That makes for a tricky comparison because there are proportionately about three times as many students as there were 40 years ago.  To put that another way, there are a lot fewer bright kids – that is, the kind likely to get and keep jobs – not in school now than in 1976.  So it’s not quite an apples-to-apples comparison, and it’s hard to know what having more young people in school actually does to the employment rate.

Aside from data on employment rates, the report (actually a condensation of some speaking notes and graphs from a presentation made earlier this year) also includes a mishmash of other related data, from differing recent youth employment trends in oil provinces vs. non-oil provinces (short version: they’re really different) to gender differences in graduate wage premiums (bigger for women than men, which may explain participation rate differences), to trends in overall graduate wage premiums.  Intriguingly, these rose through the 80s and 90s but are now declining back to 1980 levels, though whether that is due to an increase in the supply of educated labour or reflects broader changes in the labour market such as the “Great Reversal” in the demand for cognitive skills that UBC’s David Green and others have described is a bit of a mystery.

But don’t take my word for it: have a skim through the report (available here).  Well worth a few minutes of your time.

October 13

Statistics Canada is in the Wrong Century

If what you are looking for is agricultural statistics, Statistics Canada is a wondrous place.  See, Statscan even made a fabulous (if oddly truncated) little video about agricultural statistics.

Statscan can tell you *anything* about agriculture.  Monthly oilseed crushing statistics?  No problem (59,387 tonnes in August, in case you were wondering).  It can tell you on a weekly basis the weight of all eggs laid and processed in Canada (week of August 1st = 2.3 million kilograms); it can even break it down by “frozen” and “liquid”.  Want to know the annual value of ranch-raised pelts in Canada?  Statscan’s got you covered.

But let’s not stop here.  Wondering about barley, flaxseed, and canola deliveries for August, by province?  Check.  National stocks of sweetened concentrated whole milk, going back to 1970? Check (for comparison, GDP data only goes back to 1997).  Average farm prices for potatoes, per hundredweight, back to 1908?  Check.

There is even – and this one is my favourite – an annual Mushroom Growers’ Survey.  (Technically, it’s a census of mushroom growers – and yes, this means Statscan expends resources to maintain a register of Canadian mushroom growers; let that sink in for a moment.)  From this survey – the instrument is here – one can learn what percentage of mushrooms grown in Canada are of the Shiitake variety, whether said Shiitake mushrooms are grown on logs, in sawdust, or in pulp mill waste fibres, and then compare whether the value per employee of mushroom operations is greater or lesser for Shiitake mushrooms than for Agaricus or Oyster mushrooms.

According to Statistics Canada, this is actually worth spending money on.  This stuff matters.

Also according to Statistics Canada: the combined value of agriculture, forestry, fishing, and hunting is $25 billion.  Or about $10 billion a year less than the country spends on universities alone.  Total value of educational services is $86 billion a year.

And yet, here are a few things Statscan doesn’t know about education in Canada: the number of first-year students in Canada, the number of part-time instructors and professors at Canadian universities, anything at all about college instructors, access rates to post-secondary education by ethnic background or family income, actual drop-out and completion rates in secondary or post-secondary education, the number of new entrants each year to post-secondary education, the rate at which students transfer between universities and colleges, or within universities and colleges, time-to-completion, rates of student loan default, scientific outputs of PSE institutions, average college tuition, absolutely anything at all about private for-profit trainers… do I need to go on?  You can all list your pet peeves here.

Even on topics they do know, they often know them badly, or slowly.  We know about egg hatchings from two months ago, but have no idea about college and university enrolment from fall 2013.  We have statistics on international students, but they do not line up cleanly with statistics from Immigration & Citizenship.  We get totals on student debt at graduation from the National Graduates Survey, but they are self-reports and are invariably published four years after the student graduates.

What does it say about Canada's relationship to the knowledge economy when it is official policy to survey mushroom growers annually, but PSE graduates only every five years?  Who in their right mind thinks this is appropriate in this day and age?

Now, look, I get it: human capital statistics are more complicated than agricultural statistics, and they take more work, and you have to negotiate with provinces and institutions, and yadda yadda yadda.  Yes.  All true.  But it's a matter of priorities.  If you actually thought human capital mattered, it would be measured, just as agriculture is.

The fact that this data gap exists is a problem of government rather than of Statscan specifically.  The agency is hamstrung by its legislation (which mandates a substantial focus on agriculture) and by its funding.  Nevertheless, the result is that we have a national statistical system that is perfectly geared to the Edwardian era, but not fit for purpose when it comes to the modern knowledge economy.  Not even close.

September 10

How StatsCan Measures Changes in Tuition

Every September, Statistics Canada publishes data on “average tuition fees”. It’s a standard date on the back-to-school media calendar, where everyone gets to freak out about the cost of education.  And we all take it for granted that the data StatsCan publishes is “true”.  But there are some… subtleties… to the data that are worth pointing out.

Statistics Canada collects tuition data from individual institutions through the Tuition and Living Accommodation Costs (TLAC) survey.  For each field of study at each institution, TLAC asks for "lower" and "upper" fees separately for Canadian and foreign students, for both graduate and undergraduate students.  Now, in provinces where the "upper" and "lower" figures are the same (e.g. Newfoundland), it's pretty simple to translate lower/upper into an "average".  In Quebec and Nova Scotia, where "upper" and "lower" are functionally equivalent to "in-province" and "out-of-province", averages can be worked out simply by cross-referencing PSIS enrolment data and weighting the numbers according to students' province of origin.  Everywhere else, it's a total mess.  In Ontario, significant variation between "upper" and "lower" numbers is the norm, even inside a single institution (for instance, with different tuition levels for different years of study).  StatsCan uses some kind of enrolment weighting to produce an average, but how the weights are derived is a mystery.  Finally, in a couple of provinces where there are differences between the "lower" and "upper" figures, StatsCan simply uses the "lower" figure as the average.  (No, I have absolutely no idea why.)
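The Quebec/Nova Scotia case – where "lower" and "upper" fees can be averaged by weighting on enrolment – presumably looks something like the following sketch.  All figures here are invented for illustration; real TLAC fee bands and PSIS headcounts differ, and StatsCan's actual weighting procedure is, as noted, opaque.

```python
# Hypothetical sketch of the enrolment-weighting step: combine "lower" and
# "upper" fee bands into one average, using (made-up) headcounts as weights.

def weighted_average_tuition(fee_bands):
    """fee_bands: list of (fee, enrolment) pairs for one field of study."""
    total_enrolment = sum(n for _, n in fee_bands)
    return sum(fee * n for fee, n in fee_bands) / total_enrolment

# Nova Scotia-style case: "lower" = in-province, "upper" = out-of-province.
bands = [(6500, 8000), (7800, 2000)]  # (tuition, headcount) – invented numbers
print(weighted_average_tuition(bands))  # 6760.0
```

The same function covers the trivial Newfoundland-style case too: when "lower" and "upper" are identical, the weights cancel out and the "average" is just the posted fee.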

But the tuition data is squeaky clean compared to the mess that is StatsCan’s data on ancillary fees.  Institutions fill in the ancillary fee part of the questionnaire every year, but usually without much reference to what was reported the year before.   Since StatsCan doesn’t have the staff to thoroughly check the information, institutional figures swing pretty wildly up and down from one year to the next, even though everyone knows perfectly well ancillary fees only ever go in one direction.
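The kind of consistency check StatsCan apparently lacks the staff to run is not complicated: since ancillary fees only ever go up, any reported year-over-year decline is a red flag worth querying with the institution.  A minimal sketch, with invented institutions and figures:

```python
# Flag institutions whose reported ancillary fees *fall* year over year –
# a likely reporting error rather than an actual fee cut.  Data is invented.

reported = {  # institution -> ancillary fees by year (hypothetical)
    "University A": {2013: 850, 2014: 910},
    "University B": {2013: 1200, 2014: 700},   # suspicious drop
}

def suspicious_drops(data):
    flags = []
    for inst, by_year in data.items():
        years = sorted(by_year)
        for prev, cur in zip(years, years[1:]):
            change = by_year[cur] - by_year[prev]
            if change < 0:  # fees reported lower than the year before
                flags.append((inst, prev, cur, change))
    return flags

print(suspicious_drops(reported))  # [('University B', 2013, 2014, -500)]
```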

Another complication is that an "average" is a measure of central tendency – it is affected not just by posted prices, but also by year-to-year shifts in enrolment.  As students switch from cheaper to more expensive programs (e.g. out of humanities and into professional programs), average tuition rises.  As student populations grow more quickly in the more expensive provinces (e.g. Ontario) than in cheaper ones (e.g. Quebec, Newfoundland), average tuition again rises – even if every individual fee stays exactly the same.  Both of these things are in fact happening, and both are small but noticeable contributors to the "higher tuition" phenomenon.

A final complicating factor: the tuition data and the enrolment data by which it's weighted come from completely different years.  Tuition is up-to-the-minute: the 2014-15 data will be from the summer of 2014; the enrolment data by which it is weighted will be from 2012-13.  And, to make things even weirder, when StatsCan presents the 2014-15 data next year as a baseline against which to measure the 2015-16 data, it will do so on the basis of revised figures weighted by an entirely different year's enrolment data (2013-14).

In short, using StatsCan tuition data is a lot like eating sausages: they're easier to digest if you don't know how they're made.

September 08

Some Scary Graduate Income Numbers

Last week, the Council of Ontario Universities put out a media release with the headline "Ontario University Graduates are Getting Jobs", trumpeting the results of the annual provincial graduates survey, which showed that 93% of undergraduates had jobs two years after graduation, and that their average income was $49,398.  Hooray!

But the problem – apart from the fact that it's not actually 93% of all graduates with jobs, but rather 93% of all graduates who are in the labour market (i.e. excluding those still in school) – is that the COU release neither talks about what's going on at the field-of-study level, nor places the data in any kind of historical context.  Being a nerd, I collect these things when they come out each year and put the results in a little Excel sheet.  Let's just say that when you do compare these results to earlier years, things look considerably less rosy.

Let’s start with the employment numbers, which look like this:

Figure 1: Employment Rate of Ontario Graduates 2 Years Out, Classes of 1999 to 2011

Keep your eye on the class of 2005 – this was the last group to be measured two years out before the recession began (i.e. in 2007).  They had an overall employment rate of about 97%, meaning that today's numbers actually represent a 4-point drop.  If you really wanted to be mean about it, you could equally say that graduate unemployment more than doubled between 2007 and 2013.  But look also at what happened to the Arts disciplines: in the first four years of the slowdown, their employment rates fell about two percentage points more than the average (though, since the class of '09, their employment has levelled out).

Still, one might think: employment rates in the low 90s – not so bad, given the scale of the recession.  And maybe that's true.  But take a look at the numbers on income:

Figure 2: Average Income (in $2013) 2 Years After Graduation, Ontario Graduating Classes from 2003-2011, Selected Disciplines

Figure 2 is unequivocally bad news.  The average in every single discipline is below where it was for the class of 2005.  Across all disciplines, the average is down 13%.  Engineering and Computer Science are down the least, and have made some modest real gains in the last couple of years; for everyone else, the decline is in double-digits.  Business: down 11%.  Humanities: down 20%.  Physical Sciences: down 22% (more evidence that generalizations about STEM disciplines are nonsense).

Now, at this point some of you may be saying: "hey, wait a minute – didn't you say last year that incomes two years out were looking about the same as they did for the class of 2005?"  Well, yes – but you may also recall that a couple of days later I walked that back, because Statscan did a whoopsie and said: "you know that data we said was two years after graduation?  Actually, it's three years out".

Basically, the Ontario data is telling us that two years out ain't what it used to be, and the Statscan data is telling us that three years out is the new two; simply put, it now takes 36 months for graduates to reach the point they used to reach in 24.  That's not a disaster by any means, but it does show that – in Ontario, at least – recent graduates are having a tougher time in the recession.

Tomorrow: more lessons in graduate employment data interpretation.

August 26

What Students Really Pay

In a couple of weeks, Statistics Canada will publish its annual Tuition and Living Accommodation Costs (TLAC) survey, which serves as a yearly excuse for the usual suspects to complain about tuition fees.  But sticker price is only part of the equation: while governments and institutions ask students to pay for part of their educational costs, they also find ways to lessen the burden through subsidies like grants, loan remission, and tax expenditures.  And Statscan never bothers to count that stuff.

Today, we at HESA are releasing a publication called The Many Prices of Knowledge: How Tuition and Subsidies Interact in Canadian Higher Education.  Unlike any previous publication, it looks not just at a single sticker price, but rather at the many different possible prices that students face depending on their situation.  We take ten student cases (e.g. first-year dependent student in college, family income = $80,000; married university student, spousal income = $40,000; etc.), and we examine how much each student would be able to receive in grants, tax credits, and loan remission in each of the ten provinces.  It thus allows us to compare up-front net tuition (i.e. tuition minus grants) and all-inclusive net tuition (i.e. tuition minus all subsidies) not just across provinces, but also across different students within a single province.
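The two price concepts are just subtraction, but the distinction matters because the subsidies arrive at different times.  A minimal sketch, with entirely invented figures (the real case-by-case numbers are in the report):

```python
# Up-front net tuition subtracts only point-of-payment grants; all-inclusive
# net tuition also subtracts tax credits and loan remission, which a student
# typically only sees at the end of the year (or after graduation).

def up_front_net(tuition, grants):
    return tuition - grants

def all_inclusive_net(tuition, grants, tax_credits, remission):
    return tuition - grants - tax_credits - remission

# Hypothetical low-income, first-year university student (invented amounts):
tuition, grants, credits, remission = 6000, 3200, 1900, 963
print(up_front_net(tuition, grants))                           # 2800
print(all_inclusive_net(tuition, grants, credits, remission))  # -63
```

Note how different the two answers look: the student writes a cheque for $2,800 in September, even though their all-inclusive net tuition is actually slightly below zero.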

Some nuggets:

  • On average, a first-year, first-time student attending university directly from high school, with a family income of $40,000 or less, receives $63 more in subsidies than they pay in tuition once all subsidies – including graduate rebates – are accounted for (i.e. they pay slightly less than zero tuition on an all-inclusive basis).  If they attend college, they receive roughly $1,880 more in subsidies than they pay in tuition (i.e. -$1,880 net tuition);
  • In Quebec, a first-year, first-time university student from a family earning $40,000 pays, after all government subsidies, -$393 in all-inclusive net tuition.  In Ontario, the same student pays -$200.  But if we were to include institutional aid, the student in Ontario would likely be the one better off, since students in Ontario with entering averages over 80% regularly get $1,000 entrance awards, while students in Quebec tend not to.  For some students at least, Ontario is cheaper than Quebec;
  • On average, college students who are also single parents receive something on the order of $11,000 in non-repayable aid – that is, about $8,500 over and above the cost of tuition.   In effect, it seems to be the policy of nearly all Canadian governments to provide single parents with tuition plus the cost of raising kids in non-repayable aid, leaving the student to borrow only for his/her own living costs.

The upshot of the study is that Canada’s student aid system is indeed generous: in none of our case studies did we find a student who ended up paying more than 62% of the sticker price of tuition when all was said and done, and most paid far less.  But if that’s the case, why are complaints about tuition so rife?

Two reasons, basically.  First, Canada’s aid system may be generous, but it is also opaque.  We don’t communicate net prices effectively to students because institutions, the provinces, and Ottawa each want to get credit for their own contributions.  If you stacked all the student aid up in a comprehensible single pile, no one would get credit.  And we can’t have that.

The second reason is that Canada only provides about a third of its total grant aid at the point where students pay tuition fees.  Nearly all the rest, stupidly, arrives at the end of a year of studies.  More on that tomorrow.
