Higher Education Strategy Associates

Tag Archives: Statistics Canada

March 05

The Changing Face of Vocational Training

Canada’s community colleges are often thought of as places to learn “vocational” skills. But what counts as a “vocational” skill these days doesn’t always line up with popular perceptions. One area where this is particularly true is the trades. My associate Lori McElroy has been doing some interesting work on this which I think is worth sharing.

When we think of colleges and trades, we often think of a lot of programs that are not actually “college level.” There’s apprenticeship technical training which takes place in colleges: roughly 150,000 students per year coming for eight weeks each on average. In FTE terms, these make up less than 5% of what colleges do. There are technical pre-employment programs of less than a year’s length, but these are increasingly rare and not even considered post-secondary level by Statistics Canada.

Within college-level programs – that is, diplomas and certificates – there are of course large numbers of trades and technical programs, but they are a very long way from being the mainstream of college programming. Nationally, only 16% of students in diploma and certificate programs (a definition which excludes all those university-transfer programs in Alberta and B.C. as well as academic DECs in Quebec) are in technical and trades-related programs. And this figure varies substantially from one province to another, ranging from 36% in Nova Scotia to just 9% in British Columbia.

Figure 1: Proportion of Diploma- and Certificate-Program Students Enrolled in Technical and Trade-Related Programs, 2008-09

Source: Statistics Canada, Postsecondary Student Information System

So if trades aren’t the mainstay of colleges, what is? Well, a look at Statistics Canada enrolment data shows pretty clearly that it’s programs that have substantial overlap with university studies.

Figure 2: College Enrolment by Program Type, 2008-09

Source: Statistics Canada, Postsecondary Student Information System

Nearly three out of ten students are in some kind of business program, and another 32% are in some kind of program related to arts, fine arts, social sciences or sciences. In other words, over 60% of college students are studying in areas which are hardly a million miles from what universities do. Obviously, these programs are shorter and more applied than university degrees, but the actual subject matter is reasonably close.

It’s reasonable to look at this as evidence of “academic drift” among colleges. But that implies the drift has all been one way. Take a look at the new programs offered by universities over the last 30-40 years and it quickly becomes clear that the charge of “vocational drift” among universities is at least as valid. The world of work has changed, and a lot of new applied fields lie somewhere between where universities and colleges used to be – gradually, the two are converging in the middle.

Trades will always have an important place in post-secondary education, of course. But the new approach to vocational training seems to involve a much softer set of skills.

February 23

Statistics Canada and the Two Types of Data

People often berate Statistics Canada when it comes to producing data on education in Canada. And not entirely without reason: there are some statistics that Canadians seem to be especially bad at producing. But it’s also worth noting that there are other kinds of data that Statscan is extraordinarily good at capturing – data that researchers in other countries would kill to have.

When it comes to research, there are broadly two types of data. The first is factual, aggregate data, which usually comes from administrative sources and is usually analyzed by means of a time series. We use this approach to measure things like aggregate enrolments, tuition fees, number of professors, university and college finances, etc. Call this Type I data.

Statistics Canada’s record in obtaining Type I data is disappointing. We seem to have a ludicrously difficult time in this country collecting data on part-time academics, college students, or the number of new students entering higher education in a given year. Consistent ancillary fee data is also a challenge. There are a number of reasons why this is so, not all of which are Statistics Canada’s fault. But at the end of the day, the Type I data products Statistics Canada puts out leave a lot to be desired.

But Type II data is a totally different story. Type II is based on surveys, not administrative data, and at its best is longitudinal. This is the stuff that helps researchers draw empirical conclusions about the relationship between inputs and outputs. When it comes to Type II data, Statscan is the envy of the world. The data in the Access and Support to Education and Training Survey does a good job of covering key topics in education across the life cycle, even if the public use file is a bit on the irritatingly useless side. And the Youth in Transition Survey, a magnificent 12-year longitudinal effort linked to the PISA test, is quite simply the best data source for the transition from high school to adulthood anywhere in the world.

So is Statscan doing a good job or a bad job with educational statistics? Well, if what you’re worried about is the ability to compare Canada to other OECD countries every year in Education at a Glance, then it’s doing a bad job, because that’s all Type I data. But if you’re interested in understanding the dynamics of access and outcomes in education, then it’s doing an exceptionally good job.

One small problem though: Statscan’s core funding is devoted entirely to Type I data; the funding for Type II comes entirely from HRSDC, whose commitment to supporting these surveys is best described as “soft.”

Cause for concern.

January 16

No More Boring Surveys

As most of you probably know, we at HESA spend a lot of our time working on surveys. While doing so, we see a lot of different types of survey instruments, especially from governments and institutions. And we’ve come to a major conclusion:

Most of them are really boring.

There was a time – say, fifteen years ago – when doing surveys of applicants, graduates and alumni was relatively rare. There weren’t any surveys of satisfaction, or engagement, or anything else, really. We knew essentially nothing about the composition of the student body, their backgrounds, their finances or their experiences once they arrived on campus. Apart from the National Graduates Survey that Statistics Canada put out every four years, there was really almost nothing out there to tell us about students.

Things started to change in the late 1990s and early 2000s. Statscan and HRDC between them put the Youth in Transition Survey (YITS) into the field, along with the Post-Secondary Education Participation Survey (PEPS) and the Survey of Approaches to Educational Planning (SAEP) (the latter two now being subsumed into the ASETS survey). A group of twenty or so institutions banded together to create the Canadian University Survey Consortium (CUSC); other institutions began signing on to a private-sector initiative (from the company that later became Academica) to look at data about applicants. Provincial governments began surveying graduates more regularly, too; the Millennium Scholarship Foundation also spurred some developments in terms of looking at student financing and students at private vocational colleges. That’s not to forget the post-2004 NSSE boom and the plethora of smaller institutional surveys now being conducted.

This explosion of activity – born of a combination of increasing policy interest in the field and decreasing survey costs – was all to the good. We learned a lot from these surveys, even if the insights they generated weren’t always used to improve programming and policy as much as they might have been.

But it’s getting really stale. Now that we have these surveys, we’ve stopped asking new questions. All anyone seems to want to do is keep re-running the same questions so we can build up time-series. Nothing wrong with time-series, of course – but since change in higher education comes at such a glacial pace, we’re wasting an awful lot of time and money measuring really tiny incremental changes in student engagement and the like rather than actually learning anything new. Where ten years ago we were discovering things, now we’re just benchmarking.

It doesn’t have to be this way. Over the next four days, we’ll be giving you our suggestions for how to change Canadian PSE surveys. It’s time to start learning again.

January 10

A Good Decade for Profs

I was browsing through some Statistics Canada data on university salaries the other day, and I rapidly came to the conclusion that there have been few decades in which it was better to be a prof than the last one. As the following table shows, over the years 2001 to 2009 (the years for which I could get good-quality data from Statscan for free – this email’s not a paying gig, unfortunately), pay for full professors in non-medical disciplines across Canada rose at a rate very close to three times the rate of inflation, and about 85% faster than the average Canadian wage.

Change 2001 to 2009

There was some variation across institutions, of course. Generally speaking, pay increases were slightly higher at large institutions, and were definitely larger the further west one went. But there was no university where the increase in salary was less than the increase in the national average wage. To put it another way, it’s a perfect Lake Wobegon situation, in that all professors are above average.

Above average, but not table-topping. Academics didn’t fare quite as well over the last decade as people from the “forestry, fishing, mining, oil and gas” category – miners in particular have made out like bandits since about 2004. But they did fare substantially better than some other employment categories against which they might plausibly be measured: Finance/Real Estate (37%), Educational Services (27%) and especially Professional, Scientific & Technical (3%) (curious what STEM advocates have to say about that last one? Me too).

Increase in Average Nominal Earnings, by Occupation Sector, 2001 to 2009

(If you’re wondering how academics as a whole can be up 42% when every individual rank is up by more than that – it’s a composition issue. Proportionately, there are a lot more junior rank professors than there used to be, and that drags the average down.)
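For the curious, here is a minimal sketch in Python of how that composition effect works, using invented salary and share figures rather than actual Statscan data: every rank gets the same 50% raise, but because the mix shifts toward lower-paid junior ranks, the overall average rises by only about 43%.

```python
# Toy illustration of the composition effect (all numbers invented, not Statscan data).
# Every rank's average salary rises by 50%, but because the share of lower-paid
# junior faculty grows, the overall average rises by much less.

ranks_2001 = {  # rank: (average salary, share of professoriate)
    "full":      (100_000, 0.45),
    "associate": (80_000, 0.35),
    "assistant": (60_000, 0.20),
}
ranks_2009 = {
    "full":      (150_000, 0.35),  # +50% within rank, smaller share
    "associate": (120_000, 0.35),  # +50% within rank
    "assistant": (90_000, 0.30),   # +50% within rank, larger share
}

def overall_average(ranks):
    """Share-weighted average salary across ranks."""
    return sum(salary * share for salary, share in ranks.values())

avg_2001 = overall_average(ranks_2001)
avg_2009 = overall_average(ranks_2009)
print(f"2001 average: ${avg_2001:,.0f}")  # $85,000
print(f"2009 average: ${avg_2009:,.0f}")  # $121,500
print(f"Overall increase: {avg_2009 / avg_2001 - 1:.1%}")  # about 43%, despite 50% raises in every rank
```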

Now the obvious retort here is that by using the lens of a decade, I’m distorting the longer-term picture. To a considerable extent, the rapid rises of the last decade were a reaction to the 1990s, when academic salaries rose by slightly less than inflation. That’s somewhat persuasive if you’re comparing academic salaries to inflation – less so if you’re comparing them to average wages, which also took a beating in the 1990s.

The cause of this? It’s not a drying up of supply – PhDs are being pumped out faster than ever. Nor, as we have seen, is it competition in the private sector – quite the opposite since professional/scientific wages were so weak. In fact, it’s really hard to avoid the conclusion that the cause is simply availability of government funding. The 90s were a decade of restraint and the 00s weren’t – and salaries changed accordingly.

November 30

Graduate Incomes and Getting Better Data

With most of the world undergoing a serious bout of youth unemployment, there’s been a lot of focus on graduate earnings and whether or not we are “overproducing” graduates. As I’ve noted before, some of this talk is nonsense, but given the times, the focus on outcomes isn’t surprising.

Don’t tell Margaret Wente, but in China the government is actively cutting majors that don’t produce high levels of post-graduation employment. In the U.S., there’s an increasing number of stories (like this one from the Wall Street Journal) trying to point graduates to the “right” disciplines in a tight labour market. As others have pointed out, in fact, a lot of the disciplines with the highest unemployment rates – the ones that really attract attention – actually have really small enrolments. What the data really shows is that graduates of nearly all fields of study in America have unemployment rates lower than those of non-graduates (something that isn’t true in China).

Turning to Canada, we’re clearly doing better than most in terms of unemployment. What we’re not doing so well is producing timely data on graduate outcomes (doesn’t that WSJ data make you drool?). Our best data comes from the National Graduates Survey – which looks at people who graduated in 2005. Not much use in today’s environment.

Admittedly, this kind of data is expensive to collect via surveys, and that’s why a budget-challenged Statistics Canada isn’t rushing out to do more. More data may be available when the National Household Survey results arrive in 2013, but it’s still not clear how useful that survey will be.

But there’s another way to get this data. Statscan has information on nearly all Canadian students in its Post-Secondary Student Information System (PSIS). It is possible, using probabilistic matching, to link this data to the Longitudinal Administrative Database (LAD), which is a database containing the complete tax records of one out of every five taxfilers. With that kind of link, it is possible to get continuous, year-by-year updates on how well students are doing in the labour market, and to report it however we want, even by field of study.
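For readers who haven’t encountered the technique, here is a minimal sketch in Python of the general idea behind probabilistic matching: record pairs that agree on several quasi-identifiers accumulate a score, and pairs scoring above a threshold are treated as links. The field names, weights and threshold are purely illustrative assumptions; this is not Statscan’s actual linkage methodology.

```python
# A minimal sketch of probabilistic record linkage, in the spirit of a PSIS-LAD
# match. NOT Statscan's actual methodology: field names, weights and the
# threshold below are illustrative assumptions only.

psis_record = {"birth_date": "1988-03-14", "postal_code": "K1A 0B1", "sex": "F"}
lad_record = {"birth_date": "1988-03-14", "postal_code": "K1A 0B1", "sex": "F"}

# Fellegi-Sunter-style weights: (score if the field agrees, score if it disagrees).
WEIGHTS = {
    "birth_date":  (6.0, -4.0),
    "postal_code": (4.5, -2.5),
    "sex":         (0.7, -1.5),
}
LINK_THRESHOLD = 8.0  # total score above which a pair is declared a link

def match_score(rec_a, rec_b):
    """Sum the agreement/disagreement weights over the comparison fields."""
    return sum(agree if rec_a[f] == rec_b[f] else disagree
               for f, (agree, disagree) in WEIGHTS.items())

score = match_score(psis_record, lad_record)
print(f"score = {score:.1f}, linked = {score >= LINK_THRESHOLD}")  # score = 11.2, linked = True
```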

A PSIS-LAD link would give us high-quality, timely, policy-relevant labour-market data – all with no new data collection costs. Can someone explain why we aren’t doing this already?

September 14

Data Point of the Week: StatsCan Gets it Wrong in the EAG

So, as noted yesterday, the OECD’s Education at a Glance (EAG) statfest – all 495 pages of it – was just released. Now it’s our turn to dissect some of what’s in there.

Of most immediate interest was chart B5.3, which shows the relative size of public subsidies for higher education as a percentage of public expenditures on education. It’s an odd measure, because having a high percentage could mean either that a country has very high subsidies (e.g., Norway, Sweden) or very low public expenditures (e.g., Chile), but no matter. I’ve reproduced some of the key data from that chart below.

Chart B5.3: Public subsidies for higher education as a percentage of public expenditure on education, selected countries

(No, I’m not entirely clear what “transfers to other entities” means, either. I’m assuming it’s Canada Education Savings Grants, but I’m not positive.)

Anyways, this makes Canada look chintzy, right? But hang on: there are some serious problems with the data.

In 2008, Canada spent around $22 billion on transfers to institutions. For the chart above to be right, Canadian spending on “subsidies” (i.e., student aid) would have to be in the $3.5 to $4 billion range. But that’s not actually true: if you take all the various forms of aid into account, the figure for 2008 is closer to $8 billion.
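As a rough back-of-envelope check in Python, using the figures cited above and the simplifying assumption that the denominator is just transfers plus subsidies (this is an illustration of the scale of the gap, not a reconstruction of the OECD’s exact calculation):

```python
# Back-of-envelope arithmetic using the figures cited above (billions of 2008 dollars).
# Assumes the denominator is simply transfers plus subsidies; this is a rough
# illustration, not a reconstruction of the OECD's actual calculation.

transfers = 22.0          # transfers to institutions, 2008
implied_subsidies = 3.75  # midpoint of the $3.5-4 billion the chart implies
actual_subsidies = 8.0    # all forms of aid, including tax credits and provincial programs

def subsidy_share(subsidies, transfers):
    """Subsidies as a share of total public spending (transfers + subsidies)."""
    return subsidies / (transfers + subsidies)

print(f"Share implied by the chart: {subsidy_share(implied_subsidies, transfers):.1%}")  # ~14.6%
print(f"Share with all aid counted: {subsidy_share(actual_subsidies, transfers):.1%}")   # ~26.7%
```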

What could cause such a discrepancy? Here’s what I’m pretty sure happened:

1) StatsCan didn’t include tax credits in the numbers. Presumably this is because they don’t fit the definition of a loan or a grant, though in reality these measures are a $2 billion subsidy to households. In fairness, the U.S. – the only other country that uses education tax credits to any significant degree – didn’t include them either, but they’re a much bigger deal here in Canada.

2) StatsCan didn’t include any provincial loans, grants or remission either. They have form on this, having done the same thing in the 2009 EAG. Basically, because StatsCan doesn’t have any instrument for collecting data on provincial aid programs, it essentially assumes that such things must not exist. (Pssst! Guys! Next time, ask CMEC for its HESA-produced database of provincial aid statistics going back to 1992!) So, what happens when you add all that in (note: U.S. data also adjusted)?

Chart: Public subsidies as a percentage of public expenditure on education, with tax credits and provincial aid included (Canadian and U.S. data adjusted)

Not so chintzy after all.
