Higher Education Strategy Associates

Tag Archives: PIAAC

July 14

Paul Cappon, Again

You may have noticed that Paul Cappon – former President of the Canadian Council on Learning – had a paper out last week arguing that what the country really needs is more federal leadership in education.  It is desperately thin.

Cappon starts by dubiously claiming that Canada is in some sort of education crisis.  Canada’s position as a world-leader in PSE attainment is waved away thusly: “this assertion holds little practical value when the competencies of those participants are at the low end compared with other countries”.  In fact, PIAAC data shows that graduates of Canadian universities, on average, have literacy skills above the OECD average.  Cappon backs up his claim with a footnote referencing this Statscan piece, but said document does not reference post-secondary graduates.  Hmm.

And on it goes.  He cites a Conference Board paper putting a $24 billion price tag on skills shortages as evidence of “decline”, even though there’s no evidence that this figure is any worse than it has ever been (Bank of Canada data suggests that skills shortages are always with us, and were worse than at present for most of the aughts), or that a similar methodology would not lead to even bigger figures in other countries.  He cites the low proportion of STEM graduates who are in engineering as cause for concern, despite a total lack of evidence that this percentage has anything at all to do with economic growth.

Ridiculously, he proclaims that the Canadian education system (not higher education – all education from kindergarten onwards) is less effective than the Swiss one because they beat us on some per capita measures of research output (which are, of course, largely funded through pockets entirely separate from the education budget).  Yes, really.  For someone so in favour of evidence-based policy, Cappon’s grasp on what actually constitutes evidence is shockingly tenuous.

Having cobbled together a pseudo-crisis, Cappon concludes that “the principal cause of our relative regression is that other countries are coordinating national efforts and establishing national goals… Canada, with no national office or ministry for education, is mired in inertia, each province and territory doing its best in relative isolation.” Now there is no evidence at all that the existence of national systems or national goals makes a damn bit of difference to PIAAC outcomes.  Germany comes in for all sorts of praise because of its co-operation between national and state governments, but their PIAAC and PISA outcomes are actually worse than Canada’s.  Yet Cappon believes – without a single shred of evidence – that if only there were some entity based in Ottawa that could exercise leadership in education that everything would be better.

Two points: first, our nation rests on a very simple compromise.  Back in 1864, the *only* way in which Catholic, Francophone Lower Canada could be tempted into agreeing to a federal government with representation by population was if that new level of government never, ever, ever got its hands on education.  That was, and is, the deal.  It’s not going to change.  Period.

Second, everyone needs to remember that, not so long ago, the then-Director-General of CMEC suggested exactly such a body to some deputy ministers in Ottawa, and Ottawa duly created a body not dissimilar to what Cappon is describing.  But said DG didn’t tell the provinces he was negotiating this deal with the feds.  When he was offered the leadership of the new organization, he jumped, taking several CMEC senior staff with him.  The provinces, feeling betrayed by the DG’s chicanery, refused to work with the new organization, and it achieved little before being shut down in 2011.

That DG, of course, was Paul Cappon.

This report is essentially a self-justification for a dishonest act performed nearly a decade ago, rather than a genuine contribution to public policy.  If Cappon actually cared about repairing federal-provincial relations around learning, a mea culpa would have been more appropriate.

November 13

Using PIAAC to Measure Value-Added in Higher Ed: US Good, Australia Abysmal

A few weeks ago, when commenting on the PIAAC release, I noted that one could use the results to come up with a very rough-and-ready measure of “value added” in higher education.  PIAAC contains two relevant pieces of data for this: national mean literacy scores for 16-19 year-olds completing upper-secondary education, and national mean literacy scores for 16-29 year-olds who have completed Tertiary A (i.e. university).  Simply by subtracting the former from the latter, one arrives at a measure of “value added”, which I reproduce below in Figure 1 (the y-axis is the difference in PIAAC scores; PIAAC is scored on a 500-point scale, on which 7 points are considered roughly equal to one year of schooling).
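The arithmetic is trivial, but for concreteness, here is a minimal sketch in Python (the scores in the usage line are invented for illustration, not actual PIAAC figures):

```python
# OECD rule of thumb: ~7 PIAAC points equal roughly one year of schooling
PIAAC_POINTS_PER_YEAR = 7

def value_added(tertiary_a_mean, upper_secondary_mean):
    """Raw value-added: mean Tertiary A literacy score minus mean
    upper-secondary literacy score, plus a years-of-schooling equivalent."""
    diff = tertiary_a_mean - upper_secondary_mean
    return diff, diff / PIAAC_POINTS_PER_YEAR

# Hypothetical country: Tertiary A mean of 295, upper-secondary mean of 274
points, years = value_added(295, 274)  # 21 points, i.e. ~3 years of schooling
```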

Figure 1: Tertiary A value-added: Mean Tertiary A PIAAC Score Minus Mean Upper Secondary PIAAC Score

This is a bit crude, though; to be genuinely comparable, one needs to control for the proportion of upper-secondary graduates who actually go on to higher education.  Imagine two countries, both with the same mean score among upper secondary students, but country X enrols 20% of its upper secondary graduates in Tertiary A, while country Y enrols 40%.  One would expect a larger gap between mean high school and mean Tertiary A scores in country X than in country Y because, comparatively, country X is cherry-picking “better” students.  So we need some way to correct for that.

Fortunately, the OECD’s Education at a Glance provides data both on upper secondary graduation rates, and on tertiary attainment rates by age 30 (indicators A1.2a and A3.1b, if you’re following at home).  From those two figures one can calculate the proportion of upper secondary graduates who go on to university.  And since PIAAC publishes not just means, but also scores for the 25th and 75th percentiles, one can estimate the threshold PIAAC score for getting into a Tertiary A program (e.g. if 37.5% of your secondary graduates go on to a degree, the threshold sits at the 62.5th percentile – halfway between the mean score, treated here as the median, and the 75th percentile score).  To get a value-added figure for Tertiary A, one then takes the mean score for Tertiary A and subtracts this threshold secondary score, rather than the mean secondary score.  The results look like this:
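As a sketch of that procedure – with invented percentile scores, and treating the published mean as the median, as the method above implicitly does:

```python
def threshold_score(p25, mean, p75, share_going_on):
    """Estimate the secondary score above which the top `share_going_on`
    of graduates sit, by linear interpolation between the published
    percentiles (the mean stands in for the 50th percentile)."""
    pct = 1.0 - share_going_on          # e.g. 37.5% go on -> 62.5th percentile
    if pct >= 0.5:
        # interpolate between the median (0.50) and the 75th percentile (0.75)
        return mean + (pct - 0.5) / 0.25 * (p75 - mean)
    # interpolate between the 25th percentile (0.25) and the median (0.50)
    return p25 + (pct - 0.25) / 0.25 * (mean - p25)

def tertiary_value_added(tertiary_mean, p25, sec_mean, p75, share_going_on):
    """Mean Tertiary A score minus the estimated entry-threshold secondary
    score (rather than the mean secondary score)."""
    return tertiary_mean - threshold_score(p25, sec_mean, p75, share_going_on)

# Invented country: secondary p25=250, mean=275, p75=300; Tertiary A mean=300;
# 37.5% of secondary graduates go on, so the threshold is halfway between
# the mean and the 75th percentile (287.5), giving a value-added of 12.5
va = tertiary_value_added(300, 250, 275, 300, 0.375)
```

The share going on is just tertiary attainment divided by the upper secondary graduation rate, both taken from the Education at a Glance indicators named above.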

Figure 2: Tertiary A Value-Added: Mean Tertiary A PIAAC Score (16-29 Year-Olds) Minus Threshold Upper Secondary PIAAC Score (16-19 Year-Olds)

This change in methodology only slightly changes the results: absolute scores are smaller, reflecting the fact that the baseline is now higher, but the rank order of countries is similar.  The US looks pretty good, indicating that its higher education system may compensate for a weak K-12 system; Australia’s result, somehow, is negative, meaning that average PIAAC scores for Tertiary A graduates are lower than the threshold score for secondary graduates heading to university.  Which is a bit mind-boggling, to be honest.

I’m looking for feedback here, folks.  What do you think?  Does this work as a model for calculating tertiary value-added?  How could it be improved?  I’m all ears.

November 11

Kevin Lynch is Horribly Wrong

It’s disappointing that Kevin Lynch, former head of the public service in Ottawa, is the latest victim of that peculiarly Canadian disease, where one’s casual knowledge of the German apprenticeship system leads one to lose all critical faculties – as demonstrated in this awful article from the weekend Globe.

The article starts by noting that, “in proficiency in numeracy and literacy among 16-24 year-olds…, Canada is lagging the results for the Nordic countries, Australia and Germany”.  Wrong.  Well, at least partly wrong.  In literacy, the statement is true with respect to Australia, Finland, and Sweden, but differences between Denmark, Germany, Norway, and Canada are statistically insignificant.  And in numeracy, Australia and Norway are identical to Canada (pgs. 72-82 of the PIAAC report).  The article then goes on to note that, “In preparing young Canadians… experiential education appears to be quite valuable, especially for the skilled trades, and here there may be much to learn from others” – “others” apparently meaning Germany.

Leaving aside the fact that German PIAAC results aren’t really better than Canadian ones, it’s hard to understand why Lynch thinks that – even in theory – higher participation in the skilled trades would have strong positive effects on PIAAC scores.  Literacy and numeracy “skills” are quite different from “skilled” trades.

Lynch then sails into the usual puppy love about German vocationalism.  It’s “impressive”, according to him, that 50% of German high school students end up in vocational programs.  As if this was a choice.  As if streaming didn’t enter into it.  As if this streaming didn’t end up disproportionately steering poorer Germans and immigrants into vocational schools.  As if Germans themselves hadn’t noted how this dynamic contributes to Germany having among the most unequal literacy and numeracy outcomes in the OECD.

From there, it’s the usual conflation of apprenticeships with skilled trades, a peculiarly Canadian mistake.  If you look at the top ten occupations for apprenticeships in Germany, only three are in (what we’d call) the skilled trades: mechanic, mechanical engineer, and cook (the other seven – retail sales, office administration, business administration, medical administration, hairdressing, wholesale and export sales, and “sales” – would mostly be taught at colleges in Canada).  And then, to wrap up the article, is the specious argument that this vocational education system is the cause of Germany’s current low level of unemployment (seven years ago Germany had an unemployment rate of 12% – were apprenticeships the cause of that, too?).

Lynch’s argument, then, runs: German youth have better PIAAC skills than Canadian youth (partly wrong), PIAAC skills are improved by skilled trades (huh?), German apprenticeships = skilled trades (wrong), and apprenticeships = lower unemployment (wrong).

Experiential education is, of course, a good thing.  But how about we discuss it without all this irrelevant nonsense about Germany?  It doesn’t improve the quality of our debate, at all.

October 11

PIAAC: The Results for Aboriginal and Immigrant Canadians

One of the unbelievably cool things about this week’s PIAAC release is the degree to which StatsCan and CMEC have gone the extra mile to not only oversample for every province, but also for every territory (a first, to my knowledge), and for Aboriginal populations, as well – although they were not able to include on-reserve populations in their sample.  This allows us to take some truly interesting looks at several vulnerable sub-segments of the population.

Let’s start with the Aboriginal population.  Where numbers permit, we have province-by-province stats on this, albeit only for off-reserve populations.  Check out figure 1:

Figure 1: PIAAC Literacy Scores for Aboriginal and Mainstream Canadians, Selected Provinces.

So, good news first: in BC and Ontario, the gap between Aboriginal and mainstream Canadians is down to single digits – this is great news, even if it doesn’t include the on-reserve population.  But given the differences in educational attainment, you have to think that a lot of this gap is down to attainment rates: if one were to control for education, my guess is the remaining difference would be small.

The bad news, of course, is: WHAT THE HELL, NUNAVUT?  Jumpin’ Jehosophat, those numbers for the Inuit are awful.  The reason, of course, again comes down to education, with high-school completion rates for the population as a whole being below 50%.  Other territories are better, but not by much.  It’s a reminder of how much work is still needed in Canada’s north.

The immigration numbers are a bit more complicated.  The gap in literacy scores between non-immigrants and immigrants is about 25 points, and it is consistent at all levels of education.  That’s not because immigrants are less capable; it’s because, for the most part, they’re taking the test in their second – or possibly third – language (breaking down the data by test-takers’ native language confirms this).  As someone pointed out to me on Twitter, the consequence is that PIAAC “literacy” isn’t pure literacy, per se – it’s a test of how well one functions in society’s dominant language.  Conversely, since facility in the dominant language clearly affects remuneration, one wonders how much of the oft-discussed salary gap between immigrants and native-born Canadians – which seems illogical when looking at education levels alone – might be explained by this gap in “literacy”.

A larger point to remember, though, is that the presence of immigrants makes it difficult to use overall PIAAC scores as a commentary on educational systems. Over 20% of Canadians aged 16-65 are immigrants, and most of these people did their schooling outside of Canada, and, bluntly, they bring down the scores.  Provinces with high proportions of immigrants will naturally see lower scores.  Policymakers should be careful not to let such confounding variables affect their interpretation of the results.
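A toy weighted-average example (all numbers invented) shows how large this composition effect can be:

```python
# Two hypothetical provinces with *identical* school systems: natives score
# 280, immigrants (tested in a second language) score 255.  The only
# difference is the immigrant share of the 16-65 test-taking population.
def overall_mean(native_mean, immigrant_mean, immigrant_share):
    """Population mean as the share-weighted average of the two groups."""
    return (1 - immigrant_share) * native_mean + immigrant_share * immigrant_mean

low_share = overall_mean(280, 255, 0.10)   # 10% immigrants -> mean of 277.5
high_share = overall_mean(280, 255, 0.30)  # 30% immigrants -> mean of 272.5
```

Five points – most of a year of schooling on the OECD’s 7-points-per-year rule of thumb – separate the two provinces, with no difference at all in how well their schools teach.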

October 10

More PIAAC: The Canadian Story

Yesterday I offered my thoughts on some of the highlights from the international portion of the PIAAC release; today I want to focus on the Canadian results. 

Figure 1 shows the overall literacy scores, by province.

Figure 1: Literacy Scores by Province, PIAAC

At first glance, PIAAC doesn’t seem to be telling us anything we didn’t already know from years of PISA & TIMSS surveys.  Alberta comes first, the Atlantic is mostly a mess, and everybody else is kind of in-between.  But look a little more closely at the data, and a different story emerges.  Remember that PISA and TIMSS are single-cohort snapshots of kids with identical amounts of education; PIAAC is a mashup of multiple cohorts, each with quite different educational patterns.  Because they are measuring such different things, similarities may simply be coincidental.

So let’s see what happens when we try to standardize for age and education.  Figure 2 shows PIAAC literacy scores, by province, for the 25-34 age cohort who possess a university degree:

Figure 2: Literacy Scores by Province, University Graduates Aged 25-34

At face value, Figure 2 is pretty exciting if you’re from the Atlantic.  I mean, hey, the OECD says one year of schooling is equal to seven points on the PIAAC scale – which implies that Islanders with university degrees, on average, have literacy scores worth about three years of extra education over the left-coasters.  But because of sample sizes, these numbers come with pretty big confidence intervals: PEI and Nova Scotia are outside the margin of error for BC and Saskatchewan, but not for anyone else.  The other six are all essentially equal.

Now take a look at the result for college graduates, aged 25-34:

Figure 3: Literacy Scores by Province, College Graduates Aged 25-34

There’s a similar pattern here, but the gaps at either end are bigger, and confidence intervals don’t help quite as much.  BC through Manitoba are all within each other’s margins of error.  But PEI and Alberta are genuinely ahead of everyone else, except BC; Newfoundland and Saskatchewan come out looking bad no matter what.

Here’s what you should take from this:

1)   Alberta’s overall high PIAAC scores are due less to its own education system, and more to its ability to attract talent from elsewhere.  That’s the only way you can reconcile their own scores with what we know about their PSE access rates, and the performance shown in the second and third figures above.

2)   Saskatchewan needs to ask some hard questions.  Really hard.

3)   PEI is… odd.  This doesn’t look like a fluke.  But then, if they’ve got all these great skills, why is their economy such a basket case?

4)   Newfoundland is Newfoundland.  Decades of relative poverty take their toll.

5)   Don’t get fooled by small differences – the other six provinces are essentially indistinguishable from one another.

More tomorrow.

October 09

Some Bombshells from the Programme for the International Assessment of Adult Competencies (PIAAC)

So, yesterday saw the release of the first results from the Survey of Adult Skills, a product of the OECD’s Programme for the International Assessment of Adult Competencies.  This survey examines how adults in different countries fare on a set of tests measuring cognitive and workplace skills, such as literacy, numeracy, and ICT skills; perhaps somewhat controversially, some of the OECD’s own employees are referring to it as a “ranking” (though, honestly, that does them a grave disservice).  Additionally, Statistics Canada did a seriously massive oversample, which allows us to make comparisons not only between provinces and territories (to my knowledge, this is the first time anyone’s gone to the trouble of getting representative data in Nunavut), but also between immigrants and non-immigrants, and between Aboriginal and mainstream Canadians.

Fun, huh?  So much fun it’s going to take me the rest of the week to walk you through all the goodies in here.  Let’s begin.

Most of the media coverage is going to be on the “horse-race” aspect of the survey – who came top, across the entire population – so that’s a good place to start.  The answer is: Japan for literacy and numeracy, Sweden for ICT skills.  Canada is middle of the pack on numeracy and literacy, and slightly above average on ICT.  These Canadian results hold even when we narrow the age range down to 16-24 year-olds, which is more than passing strange, since these are the same youth who’ve been getting such fantastic PISA scores for the last few years.  Most national PIAAC and PISA scores correlate pretty well, so why the difference for Canada?  Differences in sampling, maybe?  It’s a mystery that deserves to be resolved quickly.

But here’s the stuff that grabbed my attention: national literacy scores among young graduates, by level of education.  The scores for secondary school graduates are for 16-19 year-olds only; the scores for university graduates are for 16-29 year olds.  Note that scores are out of 500, with 7 points being equivalent (allegedly) to one extra year of schooling.

Figure 1: PIAAC scores by Country and Level of Education

Eyeball that carefully.  Japanese high school graduates (red bars) have higher literacy levels than university graduates (blue bars) in England, Denmark, Poland, Italy, and Spain.  Think about that.  If you were a university rector in one of those countries, what do you think you’d be saying to your higher education minister right about now?

Another way to cut this data is to measure the “value added” by higher education systems: the difference between scores for recent university or college graduates (college being, technically, “tertiary non-university” – always a tricky category) and those for secondary graduates.  Figure 2 shows the differentials for universities:

Figure 2: Difference in PIAAC Scores Between University (Tertiary A) and High School Graduates

And figure 3 shows them for tertiary non-universities (Tertiary B):

Figure 3: Difference in PIAAC Scores Between Tertiary Non-University (Tertiary B) and High School Graduates

There’s an awful lot one could say about all this.  But for me it boils down to: 1) the fact that so many countries’ Tertiary B value-adds are negative is a bit scary; 2) The Americans, Finns, and Belgians (the Flemish ones, anyway) all have really good value-add results across their whole tertiary systems; 3) the Australians and Koreans appear to have absolutely god-awful value-add results across their whole tertiary systems; and, 4) Canada is just… middle-of-the-pack.  Not bad, but probably not where we should be, given how much higher-than-average our expenditures on higher education are.

More tomorrow.