
Higher Education Strategy Associates

Tag Archives: OECD

April 24

Using Comparative Labour Market Outcome Data to Think About Education

So, recently, a colleague sent me some data produced by CMEC on labour market outcomes by educational attainment among 16-65 year-olds.  Here's the first chart, showing outcomes for Canada.

Labour Market Status by Educational Attainment, 16-65 Year-Olds, Canada, 2012 (Source: The Programme for the International Assessment of Adult Competencies, 2012)


And here’s a similar one, showing the same thing for the OECD as a whole.

Labour Market Status by Educational Attainment, 16-65 Year-Olds, OECD, 2012 (Source: The Programme for the International Assessment of Adult Competencies, 2012)


(Just so we’re clear, when talking about the OECD as a whole, “College” means PSE below bachelor’s degree [ISCED 4/5B for higher ed data nerds], “university” means PSE bachelor’s degree or higher [ISCED 5A/6], and “PSE” refers to the two combined.  That’s not a perfect translation of what those qualifications mean in other OECD countries, but it’s close enough.)

If we compare Canada and the OECD, we notice two things right away.  First, there are some pretty massive differences in education levels.  Fully 60% of Canadians have some kind of PSE credential, compared to an OECD-wide average of just 36%.  Second, there is a difference in employment levels: 75% in Canada versus 69% in the OECD.

An optimist (or at least a higher education lobbyist) would no doubt try to link these two factors, and say: Yay Canada!  Our higher attainment rate causes higher labour force participation rates!  And while that’s certainly one way to read the data, it’s not the only way.

Try looking at it like this: in both the OECD and Canada, exactly one-sixth of the PSE-educated population is either unemployed or not looking for work (6/36 in the OECD, 10/60 in Canada).  In Canada, 35% of people with no PSE are either unemployed or not looking for work; in the OECD, 39% are.  Not a huge difference.  A much more pessimistic reading of this data, therefore, is this: to a large degree, Canada educates people to no real purpose. The fact that a higher percentage of our population is educated hasn't appreciably increased the probability (at least vis-à-vis other OECD countries) that those with higher education are employed.
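For the numerically inclined, here is a minimal sketch of that back-of-envelope arithmetic in Python; the shares come straight from the two charts above, and nothing else is assumed.

```python
def non_employment_share(group_share, not_employed_share):
    """Fraction of a group that is unemployed or not looking for work.
    Both arguments are percentage points of the total 16-65 population."""
    return not_employed_share / group_share

# Figures quoted above: 60% of Canadians hold a PSE credential, 10 points of
# which are not employed; the OECD figures are 36% and 6 points.
print(non_employment_share(60, 10))  # 0.1666... -> one-sixth in Canada
print(non_employment_share(36, 6))   # 0.1666... -> one-sixth in the OECD
```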

There is, of course, a third reading of the data: that education levels and employment levels aren’t linked statically in any meaningful way.  National labour markets develop in different ways over time, in response to varied economic conditions.  Countries with varying education/skill compositions can have similar levels of employment (though not necessarily similar levels of national income); conversely, countries with identical sets of skills can have quite different levels of employment and output, depending on a host of other institutional and environmental factors.  As a result, generalizing about economic outcomes based on educational ones is a bit of a mug’s game.

I’d kind of like the first option to be true.  But overall, my money’s on number three.

November 13

Using PIAAC to Measure Value-Added in Higher Ed: US Good, Australia Abysmal

A few weeks ago, when commenting on the PIAAC release, I noted that one could use the results to come up with a very rough-and-ready measure of “value added” in higher education.  PIAAC contains two relevant pieces of data for this: national mean literacy scores for students aged 16-19 completing upper-secondary education, and national mean literacy scores for students aged 16-29 who have completed Tertiary A.  Simply by subtracting the former from the latter, one arrives at a measure of “value added”, which I reproduce below in Figure 1 (the y-axis is difference in PIAAC scores; PIAAC is scored on a 500-point scale, where 7 points are considered to be equal to roughly 1 year of schooling).
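For concreteness, here is a minimal sketch of that calculation in Python; the scores are made-up placeholders, not actual PIAAC results, and the 7-points-per-year conversion is just the OECD rule of thumb mentioned above.

```python
POINTS_PER_YEAR = 7  # OECD rule of thumb: ~7 PIAAC points per year of schooling

def crude_value_added(tertiary_a_mean, upper_secondary_mean):
    """Raw score gap, plus the same gap expressed in approximate years of schooling."""
    gap = tertiary_a_mean - upper_secondary_mean
    return gap, gap / POINTS_PER_YEAR

# Illustrative numbers only, not actual PIAAC results:
gap, years = crude_value_added(296, 275)
print(gap, round(years, 1))  # 21 points, ~3.0 years
```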

Figure 1: Tertiary A Value-Added: Mean Tertiary A PIAAC Score Minus Mean Upper Secondary PIAAC Score

This is a bit crude, though; to be genuinely comparable, one needs to control for the proportion of upper-secondary graduates that actually go on to higher education.  Imagine two countries with the same mean score among upper secondary students, but country X enrols 20% of its upper secondary graduates in Tertiary A while country Y enrols 40%.  One would expect a larger gap between mean high school and mean Tertiary A scores in country X than in country Y, because country X is, comparatively, cherry-picking "better" students.  So we need some way to correct for that.

Fortunately, the OECD’s Education at a Glance provides data both on upper secondary graduation rates, and on tertiary attainment rates by age 30 (indicators A1.2a and A3.1b, if you’re following at home).  From those two figures one can calculate the proportion of upper secondary graduates who go to university.  Since PIAAC publishes not just means, but also scores for the 25th and 75th percentiles, one can now estimate the relevant threshold PIAAC score for getting into a Tertiary program (i.e. if 37.5% of your students get a degree, then the threshold secondary score will be halfway between the mean score and the 75th percentile score).  To get a value-added figure for Tertiary A, one can take the mean score for Tertiary A and subtract this threshold secondary score, rather than the mean secondary score.  The results look like this:

Figure 2: Tertiary A Value-Added: Mean Tertiary A PIAAC Score (16-29 Year-Olds) Minus Threshold Upper Secondary PIAAC Score (16-19 Year-Olds)

This change in methodology only slightly changes the results: the absolute value-added figures are smaller, reflecting the fact that the baseline is now higher, but the rank order of countries is similar.  The US looks pretty good, indicating that its higher education system may compensate for a weak K-12 system; Australia's result, somehow, is negative, meaning that average PIAAC scores for Tertiary A graduates are lower than the threshold score for secondary graduates heading to university.  Which is a bit mind-boggling, to be honest.
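For anyone who wants to kick the tires on the method, here is a minimal sketch of the threshold adjustment described above.  The numbers are illustrative placeholders, treating the mean as the 50th percentile is a simplifying assumption on my part, and the interpolation only makes sense when between 25% and 50% of secondary graduates go on to Tertiary A.

```python
def threshold_secondary_score(sec_mean, sec_p75, tertiary_entry_rate):
    """Estimate the PIAAC score separating the top `tertiary_entry_rate` share of
    secondary graduates, interpolating linearly between the mean (treated here as
    the 50th percentile) and the 75th percentile.  E.g. a 37.5% entry rate puts
    the threshold at the 62.5th percentile, halfway between the mean and the p75."""
    threshold_percentile = 1.0 - tertiary_entry_rate        # 0.375 -> 0.625
    weight = (threshold_percentile - 0.50) / (0.75 - 0.50)  # position between p50 and p75
    return sec_mean + weight * (sec_p75 - sec_mean)

def adjusted_value_added(tertiary_a_mean, sec_mean, sec_p75, entry_rate):
    """Tertiary A mean minus the estimated threshold secondary score."""
    return tertiary_a_mean - threshold_secondary_score(sec_mean, sec_p75, entry_rate)

# Illustrative placeholders: secondary mean 275, 75th percentile 305, 37.5% entry rate.
print(threshold_secondary_score(275, 305, 0.375))  # 290.0 -> halfway between mean and p75
print(adjusted_value_added(296, 275, 305, 0.375))  # 6.0 points of "value added"
```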

I’m looking for feedback here, folks.  What do you think?  Does this work as a model for calculating tertiary value-added?  How could it be improved?  I’m all ears.

October 11

PIAAC: The Results for Aboriginal and Immigrant Canadians

One of the unbelievably cool things about this week's PIAAC release is the degree to which StatsCan and CMEC have gone the extra mile, oversampling not only for every province but also for every territory (a first, to my knowledge) and for Aboriginal populations – although they were not able to include on-reserve populations in the sample.  This allows us to take some truly interesting looks at several vulnerable sub-segments of the population.

Let’s start with the Aboriginal population.  Where numbers permit, we have province-by-province stats on this, albeit only for off-reserve populations.  Check out figure 1:

Figure 1: PIAAC Literacy Scores for Aboriginal and Mainstream Canadians, Selected Provinces

So, good news first: in BC and Ontario, the gap between Aboriginal and mainstream Canadians is down to single digits – this is great news, even if it doesn't include the on-reserve population.  But given the differences in educational attainment, you have to think that a lot of this is down to attainment rates: if one were to control for education, my guess is the remaining difference would be exceptionally small.

The bad news, of course, is: WHAT THE HELL, NUNAVUT?  Jumpin’ Jehosophat, those numbers for the Inuit are awful.  The reason, of course, again comes down to education, with high-school completion rates for the population as a whole being below 50%.  Other territories are better, but not by much.  It’s a reminder of how much work is still needed in Canada’s north.

The immigration numbers are a bit more complicated.  The gap in literacy scores between non-immigrants and immigrants is about 25 points, and this gap is consistent at all levels of education.  That's not because immigrants are less capable; it's because, for the most part, they're taking the test in their second – or possibly third – language (breaking down the data by test-takers' native language confirms this).  As someone pointed out to me on Twitter, the consequence of this is that PIAAC literacy isn't pure literacy, per se – it's a test of how well one functions in society's dominant language.  Conversely, though, since facility in the dominant language clearly has an effect on remuneration, one wonders how much of the oft-discussed gap in salaries between immigrants and native-born Canadians – which seems illogical when looking at educational levels alone – might be understood in light of this gap in "literacy".

A larger point to remember, though, is that the presence of immigrants makes it difficult to use overall PIAAC scores as a commentary on educational systems.  Over 20% of Canadians aged 16-65 are immigrants; most of them did their schooling outside of Canada and, bluntly, they bring down the scores.  Provinces with high proportions of immigrants will naturally see lower scores.  Policymakers should be careful not to let such confounding variables affect their interpretation of the results.

October 10

More PIAAC: The Canadian Story

Yesterday I offered my thoughts on some of the highlights from the international portion of the PIAAC release; today I want to focus on the Canadian results. 

Figure 1 shows the overall literacy scores, by province.

Figure 1: Literacy Scores by Province, PIAAC

At first glance, PIAAC doesn’t seem to be telling us anything we didn’t already know from years of PISA & TIMSS surveys.  Alberta comes first, the Atlantic is mostly a mess, and everybody else is kind of in-between.  But look a little more closely at the data, and a different story emerges.  Remember that PISA and TIMSS are single-cohort snapshots of kids with identical amounts of education; PIAAC is a mashup of multiple cohorts, each with quite different educational patterns.  Because they are measuring such different things, similarities may simply be coincidental.

So let’s see what happens when we try to standardize for age and education.  Figure 2 shows PIAAC literacy scores, by province, for the 25-34 age cohort who possess a university degree:

Figure 2: Literacy Scores by Province, University Graduates Aged 25-34

At face value, Figure 2 is pretty exciting if you're from the Atlantic.  I mean, hey, the OECD says one year of schooling is equal to seven points on the PIAAC scale – which implies that Islanders with university degrees have, on average, literacy scores equivalent to about three extra years of education compared to their left-coast counterparts.  But because of sample sizes, these numbers come with pretty big confidence intervals: PEI and Nova Scotia are outside the margin of error for BC and Saskatchewan, but not for anyone else.  The other six provinces are all essentially equal.

Now take a look at the result for college graduates, aged 25-34:

Figure 3: Literacy Scores by Province, College Graduates Aged 25-34

There's a similar pattern here, but the gaps at either end are bigger, and confidence intervals don't help quite as much.  BC through Manitoba are all within each other's margins of error.  But PEI and Alberta are genuinely ahead of everyone else, except BC; Newfoundland and Saskatchewan come out looking bad no matter what.

Here’s what you should take from this:

1)   Alberta's overall high PIAAC scores are due less to its own education system, and more to its ability to attract talent from elsewhere.  That's the only way to reconcile its overall scores with what we know about its PSE access rates and the performance shown in Figures 2 and 3 above.

2)   Saskatchewan needs to ask some hard questions.  Really hard.

3)   PEI is… odd.  This doesn’t look like a fluke.  But then, if they’ve got all these great skills, why is their economy such a basket case?

4)   Newfoundland is Newfoundland.  Decades of relative poverty take their toll.

5)   Don’t get fooled by small differences – the other six provinces are essentially indistinguishable from one another.

More tomorrow.

October 09

Some Bombshells from the Programme for the International Assessment of Adult Competencies (PIAAC)

So, yesterday saw the release of the first results from the Survey of Adult Skills, a product of the OECD's Programme for the International Assessment of Adult Competencies.  This survey is meant to examine how adults from different countries fare on a set of tests measuring cognitive and workplace skills, such as literacy, numeracy, and ICT skills; perhaps somewhat controversially, some of the OECD's own employees are referring to it as a "ranking" (though, honestly, that does them a grave disservice).  Additionally, Statistics Canada did a seriously massive oversample, which allows us to make comparisons not only between provinces and territories (to my knowledge, this is the first time anyone's gone to the trouble of getting representative data in Nunavut), but also between immigrants and non-immigrants, and between Aboriginal and mainstream Canadians.

Fun, huh?  So much fun it’s going to take me the rest of the week to walk you through all the goodies in here.  Let’s begin.

Most of the media coverage is going to be on the “horse-race” aspects of the survey – who came top, across the entire population – so that’s a good place to start.  The answer is: Japan for literacy and numeracy, Sweden for ICT skills.  Canada is middle of the pack on numeracy and literacy, and slightly above average on ICT.  These Canadian results also hold even when we narrow the age-range down to 16-24 year-olds, which is more than passing strange, since these are the same youth who’ve been getting such fantastic PISA scores for the last few years.  Most national PIAAC and PISA scores correlate pretty well, so why the difference for Canada?  Differences in sampling, maybe?  It’s a mystery which deserves to be resolved quickly.

But here’s the stuff that grabbed my attention: national literacy scores among young graduates, by level of education.  The scores for secondary school graduates are for 16-19 year-olds only; the scores for university graduates are for 16-29 year olds.  Note that scores are out of 500, with 7 points being equivalent (allegedly) to one extra year of schooling.

Figure 1: PIAAC Scores by Country and Level of Education

Eyeball that carefully.  Japanese high school graduates (red bars) have higher literacy levels than university graduates (blue bars) in England, Denmark, Poland, Italy, and Spain.  Think about that.  If you were a university rector in one of those countries, what do you think you’d be saying to your higher education minister right about now?

Another way to look at this data is to examine the "value added" by higher education systems, by comparing the scores of recent university or college graduates (technically, tertiary non-university – always a tricky category) with those of secondary graduates.  Figure 2 shows the differentials for universities:

Figure 2: Difference in PIAAC Scores Between University (Tertiary A) and High School Graduates

And figure 3 shows them for tertiary non-universities (Tertiary B):

Figure 3: Difference in PIAAC Scores Between Tertiary Non-University (Tertiary B) and High School Graduates

There's an awful lot one could say about all this.  But for me it boils down to: 1) the fact that so many countries' Tertiary B value-adds are negative is a bit scary; 2) the Americans, Finns, and Belgians (the Flemish ones, anyway) all have really good value-add results across their whole tertiary systems; 3) the Australians and Koreans appear to have absolutely god-awful value-add results across their whole tertiary systems; and, 4) Canada is just… middle-of-the-pack.  Not bad, but probably not where we should be, given how much higher-than-average our expenditures on higher education are.

More tomorrow.

April 29

Freeing Apprenticeships from the Trades

I was looking at some apprenticeship statistics in a few OECD countries the other day, and I noticed yet another way in which Canada seems to be missing the boat.

It’s not just that our ratio of on-the-job training to classroom training is especially elevated, for no apparent reason.  And it’s not just that our apprenticeships last longer than those in other countries, for no apparent reason.  It’s that our ideas about which occupations are apprenticeable are stuck in the world of medieval craft-guilds.   Virtually all other countries that are serious about apprenticeship programming are finding ways to extend apprenticeships to retail, banking, and health care, but not us.

If you’ve been paying close attention to the apprenticeship file, it’s likely that you’ll have heard some laudatory murmurings about Australia over the last few years, something to the effect of: they doubled the number of apprenticeships; isn’t it great that somebody “gets it”; etc.  But dig a little deeper and you’ll find that what Australia did was to expand the system of apprenticeships into fields where they had not previously been, mainly in the retail business.  The actual number of apprentices in the trades barely increased. The same thing happened in Finland, where enrolments tripled in the early 1990s when apprenticeships were extended into areas like health care, administration, and tourism.

Or take Germany, home of the famed “Dual System” of apprenticeships that everyone loves so much.  Of the top ten apprenticeship occupations, five – retail sales, office administration, business administration, medical administration, and wholesale/export sales – are not apprenticeable trades in Canada.  Indeed, only about 40% of German apprentices are in the traditional trades, which means that the number of trades apprentices per capita in Canada is probably about equal to, or even slightly higher than, what it is in Germany.

Back home, the Canadian Apprenticeship Forum (CAF), which ostensibly is an organization to promote a non-occupationally-specific form of learning, defines apprenticeship as follows: "Apprenticeship is a workplace-based program that teaches people the skills they need in the trades (emphasis mine) to achieve competencies and perform tasks to industry standards."  Federal and provincial governments mouth the same lines.

Our national apprenticeship policies rest on a deliberate conflation of apprenticeships and human resource development for the construction, energy, and natural resources sectors.  In Canada, apprenticeships = trades.  Period.  But if we genuinely believe apprenticeships are a good way to improve the level of skills in the economy, why restrict them to only those sectors of the economy which existed in the 19th century?  Why not emulate the global leaders in apprenticeships, and find ways to extend this form of learning into the service sectors which will dominate the 21st century?

January 08

Left Behind Again

One of the most interesting phenomena in global higher education these days is a movement known as the Tuning Process.  And, surprise, surprise, Canada's allegedly-globally-linked-in, ultra-internationalized universities are nowhere to be found.

The Tuning Process is an exercise in detailing learning outcomes at the program-of-study level – a mostly faculty-driven effort to determine what students should know, and be able to do, by the end of their degree.  What distinguishes Tuning from the kind of learning outcomes process we see at Canadian universities, such as Guelph, is that the outcomes statements aren't the responsibility of faculty members at a single institution; rather, they emerge from the collaborative effort of multiple institutions.

The original Tuning was designed to come up with Europe-wide outcomes statements in a few fields of study.  Since then, it has spread around the world: to Latin America, Russia, and Japan.  More recently, it has expanded to places such as China (where, to be honest, it seems hard to believe there was much practical difference in learning outcomes between institutions, anyway) and Africa (where the degree of faculty particularism makes it really hard to imagine this process taking off).  Globally, Tuning has been at the heart of the OECD’s AHELO project, which aims to compare general cognitive skills and specific subject knowledge.

But perhaps the biggest surprise is what’s happening in the United States.  There, the Lumina Foundation launched a Tuning project about three years ago with a number of US states (Indiana, Minnesota, Texas, and Utah) in a variety of subjects; more recently, they have attempted to do a Tuning process nationally, on a single subject area, through a partnership with the American Historical Association.

Tuning is a big deal.  Though institutional participation in Tuning is everywhere voluntary, the speed at which it is spreading around the world means that within a relatively short space of time degrees that are “tuned” (that is, come complete with widely accepted learning outcomes statements) will be the norm.  Once that’s the case, there will be implications for the ability of the “untuned” to attract students.  In professional programs, this isn’t a huge deal because accreditation serves more or less the same function.  But in other disciplines, while a few institutions are stepping up to the plate, we haven’t yet got to the point where we can have grown-up, national conversations about program outcomes.

We’ll pay for this, eventually, if we don’t board this train.  Someone needs to kick-start this discussion here in Canada.  But who?

May 08

The Shape of Things to Come

Sit down before you look at this graph, which shows new investment in higher education in 2011. The data comes from our annual survey of 40 countries around the world, which together account for over 90% of all enrolments and scientific production.

Change in Public Expenditures on Higher Education, 2011

The basic story here is this: in the OECD, we’ve finally hit what I call “Peak Higher Education”; the point beyond which we can no longer expect any increase in public investment in higher education. Between shrinking youth populations and catastrophic fiscal pressures across most of these countries, there simply isn’t room for expenditures to grow the way they have over most of the past forty years. At the moment, the downward pressure is mostly coming from the United States, and there are some countervailing pressures from newer OECD members (Turkey, Israel and Chile all saw double-digit increases) – but even in the medium term this trend looks unstoppable.

But outside the OECD it's a whole different story: the average country-level increase was 8.4%. Now, some of that was due to inflation, which tends to be much higher in these fast-developing countries than it is here. And the average hides the fact that there is movement in both directions: funding drops in places like Ukraine and Pakistan topped 20%. But the big hitters are furiously increasing spending: India saw a 21% real increase (albeit from a pretty tiny base), and China a 10% one. Small players aren't being left behind, either; Singapore's increase was close to 30% last year.
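A quick note on the inflation point, since it matters for comparing these figures: converting a nominal funding increase into a real one is just standard deflator arithmetic. The sketch below uses made-up numbers purely for illustration.

```python
def real_increase(nominal_increase, inflation):
    """Convert a nominal spending increase into a real, inflation-adjusted one."""
    return (1 + nominal_increase) / (1 + inflation) - 1

# Illustrative only: a 30% nominal increase with 8% inflation is ~20% in real terms.
print(round(real_increase(0.30, 0.08), 3))  # 0.204
```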

Now, we don’t need to get all Lou Dobbs about this. It’s not as though the rise of universities in other parts of the world poses some sort of existential threat to our own schools – the spread of education and knowledge like this is a great thing. But let’s not ignore some of the obvious ramifications. It means increased competition for top faculty, which will drive up prices at the top end of the scale. It means both increased sources of and competition for prestige, which will lead many universities to expand their activities into some very distant parts of the world.

It's not all going to happen overnight. Many of these countries are starting from a very long way behind our universities in terms of resources, so even with very large annual increases, it could be a couple of decades before they reach western levels of expenditure (and because prestige is a stock rather than a flow, it will take another couple of decades after that before institutions in these countries are seen as broadly comparable to ours).

It will happen; it’s just a question of time. Many more years like 2011, and it could happen a lot sooner than you think.

September 14

Data Point of the Week: StatsCan Gets it Wrong in the EAG

So, as noted yesterday, the OECD’s Education at a Glance (EAG) statfest – all 495 pages of it – was just released. Now it’s our turn to dissect some of what’s in there.

Of most immediate interest was chart B5.3, which shows the relative size of public subsidies for higher education as a percentage of public expenditures on education. It’s an odd measure, because having a high percentage could mean either that a country has very high subsidies (e.g., Norway, Sweden) or very low public expenditures (e.g., Chile), but no matter. I’ve reproduced some of the key data from that chart below.

 

(No, I’m not entirely clear what “transfers to other entities” means, either. I’m assuming it’s Canada Education Savings Grants, but I’m not positive.)

Anyways, this makes Canada look chintzy, right? But hang on: there are some serious problems with the data.

In 2008, Canada spent around $22 billion on transfers to institutions. For the chart above to be right would imply that Canadian spending on "subsidies" (i.e., student aid) was in the $3.5 – 4 billion range. But that's not actually true: if you take all the various forms of aid into account, the figure for 2008 is closer to $8 billion.
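Here's the back-of-envelope version of that check, as a rough sketch only: the exact share shown in chart B5.3 isn't reproduced here, and the denominator below (transfers plus aid) is a simplification of the EAG measure, so treat the shares as approximations built from the dollar figures quoted above.

```python
def subsidy_share(transfers_to_institutions, student_aid):
    """Student aid as a share of public spending on higher education
    (transfers to institutions plus aid) - a rough proxy for chart B5.3."""
    return student_aid / (transfers_to_institutions + student_aid)

# Approximate 2008 figures from the text, in $ billions:
print(round(subsidy_share(22, 3.75), 3))  # ~0.146 -> the chintzy-looking share implied by the chart
print(round(subsidy_share(22, 8.0), 3))   # ~0.267 -> the share once tax credits and provincial aid are added
```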

What could cause such a discrepancy? Here’s what I’m pretty sure happened:

1) StatsCan didn't include tax credits in the numbers. Presumably this is because they don't fit the definition of a loan or a grant, though in reality these measures are a $2 billion subsidy to households. In fairness, the U.S. – the only other country that uses education tax credits to any significant degree – didn't include them either, but they're a much bigger deal here in Canada.

2) StatsCan didn't include any provincial loans, grants or remission either. They have form on this, having done the same thing in the 2009 EAG. Because StatsCan doesn't have any instrument for collecting data on provincial aid programs, it essentially assumes that such things must not exist. (Pssst! Guys! Next time, ask CMEC for its HESA-produced database of provincial aid statistics going back to 1992!) So, what happens when you add all that in (note: U.S. data also adjusted)?

 

Not so chintzy after all.

September 13

Education at a Glance

By the time you read this, the first headlines should be coming through from Paris on the 2011 version of OECD’s annual publication, Education at a Glance (EAG). We’ll be taking a deeper look at some of the statistics tomorrow and over the coming weeks, but today I wanted to offer some thoughts on the product itself.

Over the 16 years since EAG was first published, it has had a considerable effect on policy-making around the world. By drawing direct comparisons between national systems, the OECD has kick-started an entire policy sub-culture around benchmarking national outcomes. Canada, however, has had difficulty taking advantage of this explosion of comparative data, because of the challenge of adapting our education data – which is designed for our own policy purposes – to the somewhat abstract categories the OECD uses to make data from such varied countries comparable.

There's been a lot of hysteria about this last point over the years. Back when the Canadian Council on Learning was still around (ok, they technically still exist, but have you seen what they've been putting out since their funding got nuked?), the annual EAG release would reliably be accompanied by anguished wails from CCL, going on about how Statistics Canada's inability to produce comparable data was depriving the country of much of this benchmarking goodness and turning us into some third world backwater.

Slowly, however, Statistics Canada has been getting better at this, so tomorrow's EAG may have more Canada in it than have previous editions. But just remember as you read the press coverage that there are an awful lot of caveats and simplifications that go into EAG in order to make vastly different education systems comparable. For instance, population educational attainment – a measure on which Canada traditionally does very well – is calculated based on labour force survey questionnaires, which use different questions in different countries. So is Canada really the best-educated country, or do we just have slack labour force survey questions?

Caveat lector.
