No More Boring Surveys

As most of you probably know, we at HESA spend a lot of our time working on surveys. While doing so, we see a lot of different types of survey instruments, especially from governments and institutions. And we’ve come to a major conclusion:

Most of them are really boring.

There was a time – say, fifteen years ago – when surveys of applicants, graduates and alumni were relatively rare. There weren't any surveys of satisfaction, or engagement, or much of anything else. We knew essentially nothing about the composition of the student body, their backgrounds, their finances, or their experiences once they arrived on campus. Apart from the National Graduates Survey that Statistics Canada put out every four years, there was almost nothing out there to tell us about students.

Things started to change in the late 1990s and early 2000s. Statscan and HRDC between them put the Youth in Transition Survey (YITS) into the field, along with the Post-Secondary Education Participation Survey (PEPS) and the Survey of Approaches to Educational Planning (SAEP) (the latter two since subsumed into the ASETS survey). A group of twenty or so institutions banded together to create the Canadian Undergraduate Survey Consortium (CUSC); other institutions began signing on to a private-sector initiative (from the company that later became Academica) to look at data about applicants. Provincial governments began surveying graduates more regularly, too, and the Millennium Scholarship Foundation spurred new work on student financing and on students at private vocational colleges. And that's before counting the post-2004 NSSE boom and the plethora of smaller institutional surveys now being conducted.

This explosion of activity – born of a combination of increasing policy interest in the field and decreasing survey costs – was all to the good. We learned a lot from these surveys, even if the insights they generated weren't always used to improve programming and policy as much as they might have been.

But it’s getting really stale. Now that we have these surveys, we’ve stopped asking new questions. All anyone seems to want to do is keep re-running the same questions so we can build up time-series. Nothing wrong with time-series, of course – but since change in higher education comes at such a glacial pace, we’re wasting an awful lot of time and money measuring really tiny incremental changes in student engagement and the like rather than actually learning anything new. Where ten years ago we were discovering things, now we’re just benchmarking.

It doesn’t have to be this way. Over the next four days, we’ll be giving you our suggestions for how to change Canadian PSE surveys. It’s time to start learning again.
