HEQCO produced a fascinating report on skills last week, which I want to explore in depth. Unfortunately, it has put a few people’s backs up because of a couple of poorly-chosen sentences in the accompanying press release, which I will also explore. But let’s focus on the first bit, because simply putting this study together took an enormous amount of effort that needs to be acknowledged and celebrated.
(Actually, they released two intriguing reports: one on literacy and numeracy and one on critical thinking. But because there is so much to unpack in these, I’m going to limit my comments today to the former. I may get to the latter in a few days.)
To start with, HEQCO got 9 universities (Algoma, Brescia, Brock, McMaster, Nipissing, Queen’s, Guelph, York and Quest) and 11 colleges (Algonquin, Centennial, Conestoga, Fanshawe, Fleming, George Brown, Humber, Sault, Seneca, Sheridan and St. Lawrence) to agree to participate in this study. “Participation” mostly means a) allowing HEQCO to advertise to find and compensate students to take the test, b) administering the test, and c) using a PIN system to attach administrative data to each test, permitting HEQCO to do deeper analysis. The resulting sample is therefore not “random” in any sense, but given the institutions involved and the administrative data provided in the report, I think it’s fair to say that demographically these students are collectively pretty close to the Ontario average.
HEQCO gave one set of tests to students in their graduating year. This makes it possible to look at the skill level of students entering the labour market, which is in itself useful. What they found was that about 30% of said graduates were operating at what PIAAC calls literacy/numeracy levels 1 and 2. For literacy, this means these people have trouble interpreting dense or lengthy text; for numeracy, it means they have trouble interpreting graphs and tables, or dealing with mathematical information when the context in which it is presented is less than completely explicit. A good description of what the levels mean can be found here for literacy and here for numeracy.
That’s interesting information, and somewhat distressing if we wish to claim that PSE is preparing graduates well for the labour market. The problem, I think, is that HEQCO in its press release (but not the actual paper) chose to describe levels 1 and 2 as being below “the minimum required for graduates to perform well in today’s work world”. This unnecessarily raised some hackles because it appears to be commentary on HEQCO’s part, rather than an actual OECD-approved description of what those PIAAC levels mean (the OECD tends to stick to generalities about higher levels of literacy and numeracy leading to greater labour market rewards).
In any event, HEQCO also gave the test to a group of first-year students. This sampling strategy is the same one used by the late, lamented AHELO project and by the Collegiate Learning Assessment. The idea here is that if you test a reasonably comparable group of first-year and final-year students, the latter becomes in effect a synthetic cohort, and you can attribute differences in their scores to the education they received, thus giving you a “value-added” measure of education (you could theoretically get a real cohort and test the same students in first and fourth year, but that takes longer and you get sample attrition). Using this method, HEQCO found almost no “learning gain” between first-year and final-year students.
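For readers who like to see the arithmetic, here is a minimal sketch of the synthetic-cohort logic. The scores below are invented for illustration (they are not HEQCO’s data), and Python is just a convenient notation:

```python
# Minimal sketch of the synthetic-cohort "value-added" calculation.
# Scores are invented for illustration; they are NOT HEQCO's results.
first_year_scores = [262, 271, 255, 280, 268]   # hypothetical entering cohort
final_year_scores = [265, 274, 258, 281, 270]   # hypothetical graduating cohort

def mean(xs):
    return sum(xs) / len(xs)

# The difference in mean scores is attributed to the education received,
# on the assumption that the two cohorts were comparable at entry.
learning_gain = mean(final_year_scores) - mean(first_year_scores)
print(f"Estimated learning gain: {learning_gain:+.1f} points")
```

The whole method stands or falls on that comparability assumption, which is exactly where the trouble starts.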
This was, I thought, maybe the weakest part of the paper. In the US, where this technique was pioneered, researchers can control for the quality of the first-year and final-year students by using average SAT scores. In Canada, where we don’t have SATs or any other standardized admissions testing, it’s a lot harder. By my reading, there was no attempt to control for the relative quality of the two groups of students. And in fact, without getting into too much detail, if you look at figures 11 and 12 on page 39, I think there’s a good case to be made that the first-year students who were tested are an exceptional bunch, and not actually typical of first-year students as a whole. That makes the conclusion that there has been little learning gain somewhat questionable, because the baseline may be off by quite a bit.
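To be concrete about what such a control would look like: if we did have an entry-quality measure like the SAT, the standard fix is to regress scores on a cohort indicator plus that covariate, so the cohort coefficient captures the gain net of differences in entering quality. A toy sketch, again with entirely invented numbers (the entry-quality variable here is a hypothetical admission average, not anything in the HEQCO data):

```python
import numpy as np

# Hypothetical data: test scores, a cohort flag (0 = first year, 1 = final year),
# and an entry-quality measure (e.g., admission average). All numbers invented.
scores  = np.array([262, 271, 255, 280, 268, 265, 274, 258, 281, 270], dtype=float)
cohort  = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)
quality = np.array([78, 82, 74, 90, 80, 77, 81, 73, 89, 79], dtype=float)

# OLS: score = b0 + b1*cohort + b2*quality.
# b1 is the value-added estimate after adjusting for entering quality.
X = np.column_stack([np.ones_like(scores), cohort, quality])
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
print(f"Adjusted learning-gain estimate: {beta[1]:+.1f} points")
```

Without a covariate like this, any difference in entering quality between the two tested groups gets folded straight into the “learning gain” estimate.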
(Minor quibble: I was a little disappointed that there wasn’t more of an attempt to look at patterns by field of study. The sample size wasn’t huge, but some conclusions certainly could have been drawn about graduate numeracy/literacy by discipline. Perhaps there is a follow-up paper to come.)
So, at one level, what we have here is one strong-ish conclusion (that an unfortunate number of graduates seem to have weak literacy/numeracy skills) and one weak conclusion (that there is little apparent learning gain). Put that way, it may not sound like such a big deal. But to me, just getting to this point is a big deal. Putting together a coalition of universities and colleges willing to put in the time and effort to look at numeracy and literacy outcomes is incredibly important. So is understanding that we need to do more to normalize the baseline observations, and stimulating conversations about how to do that.
This is the start of something big, not some kind of end point. The long, hard road to an evidence-based culture in post-secondary education, one that focuses on actual student outcomes, starts with projects like this. So, yeah, the results are less than earth-shattering and the press release could have been better written, but so what? Never let the perfect be the enemy of the good. And in the grand scheme, this project is pretty good indeed.
There is a study that breaks out results by discipline, albeit an American one, in the book Academically Adrift. It used a single, real cohort, measured before commencing and again after second and fourth year.
It concluded that the only things correlated with improvements in critical thinking (as measured by the CLA) were the amount of reading and writing students did and which subject they were in; in fact, the latter turned out to be a proxy for the former.
It would be nice if HEQCO would come up with a similar report but, frankly, I suspect they lack the temperament to really understand it. They’d be so prejudiced in favour of the telos of learning outcomes/job skills that they’d completely miss the fact that an education is inculcated by reading and writing a lot, and not by skipping anything deemed outside “the skills of tomorrow,” or whatever.