So, yesterday saw the release of the first results from the Survey of Adult Skills, a product of the OECD’s Programme for International Assessment of Adult Competencies. This survey examines how adults from different countries fare on a set of tests measuring cognitive and workplace skills, such as literacy, numeracy, and ICT skills; perhaps somewhat controversially, some of the OECD’s own employees are referring to it as a “ranking” (though, honestly, that does them a grave disservice). Additionally, Statistics Canada did a seriously massive oversample, which allows us to make comparisons not only between provinces and territories (to my knowledge, this is the first time anyone’s gone to the trouble of getting representative data in Nunavut), but also between immigrants and non-immigrants, and between Aboriginal and non-Aboriginal Canadians.
Fun, huh? So much fun it’s going to take me the rest of the week to walk you through all the goodies in here. Let’s begin.
Most of the media coverage is going to focus on the “horse-race” aspects of the survey – who came out on top, across the entire population – so that’s a good place to start. The answer is: Japan for literacy and numeracy, Sweden for ICT skills. Canada is middle of the pack on numeracy and literacy, and slightly above average on ICT. These Canadian results hold even when we narrow the age range down to 16-24 year-olds, which is more than passing strange, since these are the same youth who’ve been getting such fantastic PISA scores for the last few years. Most national PIAAC and PISA scores correlate pretty well, so why the difference for Canada? Differences in sampling, maybe? It’s a mystery that deserves to be resolved quickly.
But here’s the stuff that grabbed my attention: national literacy scores among young graduates, by level of education. The scores for secondary school graduates are for 16-19 year-olds only; the scores for university graduates are for 16-29 year-olds. Note that scores are out of 500, with 7 points being equivalent (allegedly) to one extra year of schooling.
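That 7-points-per-year equivalence is the key to reading all the charts that follow, so here is a minimal sketch of the conversion. The function name and the example scores are my own, purely hypothetical, chosen just to show the arithmetic:

```python
POINTS_PER_YEAR = 7  # OECD's (alleged) equivalence: 7 PIAAC points = 1 year of schooling

def schooling_years_equivalent(score_a: float, score_b: float) -> float:
    """Convert a gap between two PIAAC scores into approximate years of schooling."""
    return (score_a - score_b) / POINTS_PER_YEAR

# Hypothetical illustration: a 21-point gap between two graduate groups
# works out to about three years' worth of schooling.
gap = schooling_years_equivalent(296, 275)
print(f"{gap:.1f} years of schooling equivalent")  # prints "3.0 years of schooling equivalent"
```

Keep that scale in mind when eyeballing the differentials below: a 14-point spread between two countries is not trivia, it is roughly two years of school.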
Figure 1: PIAAC scores by Country and Level of Education
Eyeball that carefully. Japanese high school graduates (red bars) have higher literacy levels than university graduates (blue bars) in England, Denmark, Poland, Italy, and Spain. Think about that. If you were a university rector in one of those countries, what do you think you’d be saying to your higher education minister right about now?
Another way to slice this data is to examine the “value added” by higher education systems: that is, the difference between scores for recent university or college graduates (college being, technically, “tertiary non-university”, always a tricky category) and those for secondary graduates. Figure 2 shows the differentials for universities:
Figure 2: Difference in PIAAC Scores Between University (Tertiary A) and High School Graduates
And figure 3 shows them for tertiary non-universities (Tertiary B):
Figure 3: Difference in PIAAC Scores Between Tertiary Non-University (Tertiary B) and High School Graduates
There’s an awful lot one could say about all this. But for me it boils down to: 1) the fact that so many countries’ Tertiary B value-adds are negative is a bit scary; 2) the Americans, Finns, and Belgians (the Flemish ones, anyway) all have really good value-add results across their whole tertiary systems; 3) the Australians and Koreans appear to have absolutely god-awful value-add results across their whole tertiary systems; and, 4) Canada is just… middle-of-the-pack. Not bad, but probably not where we should be, given how much higher-than-average our expenditures on higher education are.