Higher Education Strategy Associates

Category Archives: surveys

November 19

Stories Arts Faculties Tell Themselves

Here at HESA Towers, we’ve been doing some work on how students make decisions about choosing a university (if you’re interested: the Student Decisions Project was a multi-wave, year-long qualitative longitudinal study that tracked several hundred Grade 12 students as they went through the PSE research, application, and enrolment process.  We also took a more targeted qualitative look at Arts specifically, with the national Prospective Arts Students Survey).  We’ve been trying to do the same for colleges, but it’s a much trickier demographic to survey.

In both studies, one of the questions we asked is what students really want from their education.

Now, at one level, this question is kind of trite.  We know from 15 years of surveys from the Canadian Undergraduate Survey Consortium that students go to university: i) to get better jobs; ii) because they like learning about a particular field; and iii) to make friends and enjoy the “university experience”.

Where it gets a little trickier, however, is when you break this down by field of study.  For most faculties, there tends to be a positive reason to attend.  When it comes to Arts, however, enrolment is often seen as a fall-back option – something you do if you don’t have concrete goals, or if you can’t do anything else.  Now, Arts faculties tend to put a positive spin on this, framing it as students wanting to “find themselves”.  But in deploying this bit of spin, Arts faculties often end up heading in the wrong direction.

One of the problems here is that the notion of students “finding themselves” (not a term students themselves use) is not as straightforward as many think. Broadly, there are three possible definitions.  The first situates “finding yourself” in academic terms: by exploring a lot of different academic options, a student finds something that interests her/him, and becomes academically engaged.  This is one of the reasons that Arts faculties are built around a smorgasbord model, which lets students “taste” as many things as possible, and hence “discover” themselves.

But that’s not the only possible definition of “finding oneself”.  There is another option, in which students essentially view PSE as a cooling out period where they can “find” what they want to do, in a vocational sense.  Yes, they are taking courses, but since they recognize that Arts courses don’t lead directly to employment, they are more or less marking time while they discover how to make their way in the employment world, and think about how and where they want to live.  Then there is a third, slightly different take, in which students view “finding themselves” as the process by which they acquire transversal skills, and the skills of personal effectiveness needed to be successful adults.  School is something they do while they are learning these skills, often for little reason other than that going to school is something they have always done, and in many cases are expected to do.

Though all of these interpretations of “finding yourself” have some currency among students, it probably shouldn’t come as a surprise to learn that the one about “finding yourself” being a voyage of academic discovery is, in fact, the least frequently mentioned by incoming students.  Now, maybe they come around to this view later on, but it is not high on the list of reasons they attend in the first place.  To the extent that they have specific academic interests as a reason for enrolling in Arts, they tend to be just that: specific – they want to study Drama, or History, or whatever.

Which raises two questions.  First, if this is true, what’s the benefit to Arts faculties of maintaining such a wide breadth of requirements?  And second, why aren’t Arts faculties explicitly building more transversal-skills elements into their programs?  Presumably there would be a significant recruitment advantage in doing so.  Someone should give it a whirl.

November 06

What Canadians Think About Universities, and Where Canadian Universities Want To Go

A couple of quick notes about two interesting things from Universities Canada this week.

The first is the release of some public opinion polling, which they commissioned in the spring, regarding universities and other forms of higher education.  You can see the whole thing here, but I want to highlight a couple of slides, in particular.

The first is a question about Canadians’ overall impressions of different types of post-secondary institutions (slide not reproduced here).

It seems Canadians are overwhelmingly positive about most post-secondary institutions (though Quebecers clearly have a few doubts about CEGEPs).  Somewhat perplexingly, UnivCan also felt the need to test Canadians’ opinions about universities in Europe (do Canadians really have deep feelings about French grandes écoles, German Fachhochschulen, and Romanian politehnici?).  Mostly, though, this is all to the good.

But the more interesting set of answers concerns how Canadians rate their own universities (slide not reproduced here).

Turns out Canadians think their universities are world-class, practical, and produce valuable research… but they also really need to change.  Which seems about right to me.  However, one wishes there had been a follow-up question: what kind of change is needed, exactly?

Oftentimes, these kinds of dissonant results (you’re great/please change) give the poll-reader a lot of room to cherry-pick.  Is UnivCan doing this?  Well, maybe.  Take a look at the new “Commitments to Canadians” the Presidents collectively issued this week.  They commit themselves to:

  • Equip all students with the skills and knowledge they need to flourish in work and life, empowering them to contribute to Canada’s economic, social, and intellectual success.
  • Pursue excellence in all aspects of learning, discovery, and community engagement.
  • Deliver a broad range of enriched learning experiences.
  • Put our best minds to the most pressing problems – whether global, national, regional, or local.
  • Help build a stronger Canada through collaboration and partnerships with the private sector, communities, government, and other educational institutions in Canada and around the world.

OK, so some of this is yadda-yadda, whatever kind of stuff (“pursue excellence in everything we do” is utterly devoid of meaning).  But an emphasis on partnerships is good, as is the commitment to preparing students for work and life – in that order.  Something stronger on internships and co-ops would have been better: both UC Chair Elizabeth Cannon and UC President Paul Davidson have spoken a lot about co-ops in recent speeches, but a specific commitment to them is missing from the actual statement.  That’s too bad: co-ops and internships have the potential to be a genuine and unique value proposition for Canadian higher education; our universities do a lot more of them than those in other developed countries.  And pretty much everyone loves them, bar the sniffy types who disdain them as “mere training”.

The issue is follow-through, of course, and Lord knows shifting institutional cultures ain’t easy.  But one gets the sense that Canadian universities are absorbing the change message, and acting upon it.  That’s good news.

Have a good weekend.

September 22

Where Do Students Want to Live?

Today, we at HESA released a paper called Moving On?  How Students Think About Choosing a Place to Live After Graduation, based on a 2011 survey of 1,859 students from across the country.  Obviously, you should go read the whole thing, but for the time-pressed, here are the highlights:

1) Part of the paper’s purpose is to examine the qualities students look for in a place to live.  It turns out Richard Florida’s whole shtick about young educated types looking for cities that are “hip” or “creative” may be somewhat wide of the mark; in fact, students’ main priorities in choosing a place to live are access to good jobs, healthcare services, and a low crime rate.  Access to cultural events and foodie culture rank way, way down the list.  To put that another way: what young educated people look for in a place to live is pretty much what everyone else looks for.

2) A solid majority of students intend to stay in the province in which they graduated.  That said, just over 40% of students are at least open to the idea of moving.  These students are not evenly distributed, however: students in the Prairie provinces (including Alberta) are much more open to moving away than are students in British Columbia.  Equally, students are not open to moving just anywhere – of those open to a move, most have three or fewer potential destination provinces in mind, and are not open to any others (the most commonly-sought destinations are Ontario and British Columbia; Manitoba and Saskatchewan are the least sought).  Only 7% are genuinely open to moving pretty much anywhere in the country.

3) Here’s perhaps the most important piece of news: financial incentives for graduates, such as the tax credits used by Saskatchewan, Manitoba, and New Brunswick, have almost no effect.  We asked students what they expected to earn in their first job in the province they were most likely to call home.  Then we asked them how much it would take to get them to move to each of the other provinces.  For most provinces (BC was the outlier), about a quarter said “nothing could get me to go there” and another 25% said “I’d go for an extra $25,000 or more” (which is really just a polite way of saying “never”).  But, intriguingly, between 13% (Manitoba) and 25% (British Columbia) of all students say they’d move to that province for no extra money, or even a cut in pay – just give them a job and they’d go.  The percentage who say they’d move for an extra $2,000 (roughly the value of the tax credits in SK, MB, and NB)?  About 1%.  Move the financial incentive up to $5,000 and you get another 1%.  And that’s perfectly consistent, right across the country.

The fact is, students are going to move where they’re going to move.  They are either tied to their present spot by networks of friends and family, or they are lured by money, jobs, and prosperity.  A couple of thousand bucks, in the grand scheme of things, just doesn’t seem to matter that much.

All of which raises the question: how come more provinces aren’t making like Nova Scotia and ditching these tax rebate programs?

January 17

Can’t Get No Satisfaction (Data)

Many of you will have heard by now that the Globe and Mail has decided not to continue its annual student survey, which we at HESA ran for the last three years.  The newspaper will continue publishing the annual Canadian University Report, but will now do so without any quantitative ratings.

Some institutions will probably greet this news with a yawn, but for a number of others, the development represents a real blow: they had based large parts of their marketing campaigns on the satisfaction data, and the loss of this data source makes it more difficult for them to differentiate themselves.

When the survey started a decade ago, many were skeptical about the relevance of satisfaction data.  But slowly, as year followed year and schools more or less kept the same scores in each category, people began to realize that satisfaction data was pretty reliable, and might even be indicative of something more interesting.  And as it became apparent that satisfaction scores had a reasonably good correlation with things like “student engagement” (basically: a disengaged student is an unhappy student), it also became clear that “satisfaction” was an indicator that was both simple and meaningful.

Sure, it wasn’t a perfect measure.  In particular, institutional size clearly had a negative correlation with satisfaction.  And there were certainly some extra-educational factors which tended to affect scores, be it students’ own personalities, or even just geography – Toronto students, as we know, are just friggin’ miserable, no matter where they’re enrolled.  But, when read within its proper context (mainly, by restricting comparisons to similarly-sized institutions), it was helpful.

Still, what made the data valid and useful to institutions was precisely what eventually killed it as a publishable product.  The year-to-year reliability assured institutions that something real was being measured, but it also meant that new results rarely generated any surprises.  Good headlines are hard to come by when the data doesn’t change much, and that poses a problem for a daily newspaper.  The Globe stuck with the experiment for a decade, and good on them for doing so; but in the end, the lack of novelty made continuation a tough sell.

So is this the end of satisfaction ratings?  A number of institutions that use the data have contacted us to say that they’d like the survey to continue.  Over the next week or so, we’ll be in intensive talks with institutions to see if this is possible.  Stay tuned – or, if you’d like to drop us a note with your views, you can do so at info@higheredstrategy.com.

January 20

Graduate Surveys We’d Like to See

If there’s one type of Canadian educational survey where complete and utter stasis has set in, it’s graduate surveys. Questions like “are you employed,” “what are your earnings,” and “were you satisfied with your education” aren’t just boring; I think they’re actively making us stupider. There seems to be a general view that because the answers to these questions don’t change very much from year to year, we’re doing as good a job as we ever have.

But labour market results aren’t achieved in a vacuum. Economic conditions (both global and local) play a role, as do demographics. Canada’s labour force, which has been increasing in size since WWII, is predicted to plateau in the next couple of years and then decline slightly thereafter. As employers get desperate for workers, they’ll take anyone (think about Alberta fast-food workers making $17/hour in the boom years); in those conditions, low levels of graduate unemployment can’t be taken as evidence of educational excellence.

In future, universities and colleges are going to be judged on how they make students more productive, not on whether their graduates are employed. That means institutions will need to dig a lot deeper into how students acquire competencies and then put them to use. Surveys can be helpful in working out which elements of a student’s education proved useful and which didn’t. Graduates – even those from disciplines which aren’t vocationally oriented (e.g., the humanities) – have a pretty good sense of which of their courses were useful and which were decorative. Identifying the courses (and professors!) that graduates in the labour force rate highly can be an enormously powerful tool in curriculum revision.

So, here’s a suggestion for graduate surveys: let’s ease up on the strictly quantitative stuff. The next time you do a survey of graduates, don’t ask them if they were satisfied with their education – ask them which class contributed most to their success in the job market. Don’t ask whether they’d recommend a university to a friend – ask them what missing skills they most wished they’d got before leaving school. Trust me, the answers will be revealing.

Finally: stop wasting information, and link individual graduates’ surveys to their student records. It’s not as time-consuming or expensive as you think, and it vastly increases the explanatory power of the available data.
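For the curious, the mechanics of the linkage are not complicated. Below is a minimal sketch of the idea in Python/pandas; the file names and column names (student_id, program, most_useful_course, and so on) are invented for illustration, so substitute whatever identifiers your own student information system actually uses.

    # A minimal sketch of linking graduate-survey responses to student records.
    # File and column names are illustrative assumptions, not a reference to
    # any particular institution's systems.
    import pandas as pd

    # Graduate-survey responses, keyed by a student identifier
    survey = pd.read_csv("grad_survey.csv")        # e.g., student_id, most_useful_course, missing_skill
    # Records pulled from the student information system
    records = pd.read_csv("student_records.csv")   # e.g., student_id, program, entry_average, coop_terms

    # Join on the shared identifier; respondents without a matching record are dropped
    linked = survey.merge(records, on="student_id", how="inner")

    # One example of the added explanatory power: tabulate "most useful course"
    # answers by program, something the survey alone cannot tell you
    print(linked.groupby("program")["most_useful_course"].value_counts())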

As I mentioned Monday, Canadian institutions underwent a data revolution in the late 1990s and early 2000s, but unfortunately a benchmarking agenda took over and the discovery agenda was put to the side. But as we’ve shown over the past three days, it doesn’t have to be that way. Better surveys are possible; we just need to design and execute them.

Let’s do it!

January 19

Faculty Workload Surveys We’d Like to See

While we’re on a roll about surveys, let me muse about one that I think many people in this country would like to see: a survey of academic staff and their workloads.

There is a lot of talk about teaching loads, particularly in comparison with countries where professors get paid less (notably the United States). The problem is that we’re dealing in anecdotes; institutions unfortunately don’t publish hard data on teaching loads, either of the notional or the real variety (and there is a considerable difference between the two). Time was, you could more or less work this out by hand from a course calendar, but those aren’t published anymore, so we need different ways of getting at this information.

One solution would be for Statistics Canada to add one or two fields to its annual survey of full-time academic staff. Right now, all it asks about is field of study, age, rank, and salary. It wouldn’t be an enormous stretch to ask about the number of courses or total credit hours taught.

But why wait for Statscan? Here’s a simple little survey that VP Academics, or Provosts (or whatever they’re called in your neck of the woods) could fill out in a few minutes that would tell us everything we need to know:

Proportion of Tenured and Tenure-Track Professors Teaching Various Course Loads (Rows to Add to 100%) [grid not reproduced in this archive]

No big database searches or anything – just “what’s your best guess about proportions of faculty in each category?” It doesn’t need to be 100% accurate – just give a rough idea.
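Since the grid itself isn’t reproduced in this archive, here is a purely hypothetical sketch of what such a template might look like, expressed in Python. The row and column labels (rank, annual course loads) are my assumptions rather than the original instrument, and the zeros are placeholders awaiting a provost’s best guesses.

    # Hypothetical reconstruction of the kind of grid described above; the labels
    # are assumptions and the zeros are placeholders, not data from any institution.
    COURSE_LOAD_BINS = ["0", "1", "2", "3", "4", "5+"]   # courses taught per year

    # One row per category of professor; each row of guesses should add to ~100%
    grid = {
        "Assistant": dict.fromkeys(COURSE_LOAD_BINS, 0),
        "Associate": dict.fromkeys(COURSE_LOAD_BINS, 0),
        "Full":      dict.fromkeys(COURSE_LOAD_BINS, 0),
    }

    def check_rows(grid, tolerance=5):
        """Flag any row of best guesses that doesn't add up to roughly 100%."""
        for rank, shares in grid.items():
            total = sum(shares.values())
            if abs(total - 100) > tolerance:
                print(f"{rank}: shares sum to {total}%, expected ~100%")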

The results would tell us a lot, wouldn’t they? Especially if you could do it every couple of years and create a time series. Yet, if I were to actually send out that survey, and ask people to fill it in, it’s a dead certainty that almost none would do so, even if given a guarantee of institutional anonymity. Most would find reasonable-sounding rationales for refusing, but really they’d just be afraid of the backlash if the data were ever published.

That, I would suggest, is indicative of a much larger problem: universities are not confident of their ability to explain their labour practices to the general public. But that’s no reason someone shouldn’t try out what would definitely be a very cool survey.

January 18

Student Surveys We’d Like to See

Surveys of current students tend to focus on just a few areas. Apart from questions about demographics and time use, they ask a lot of specific questions about satisfaction with student services along with a few general questions about overall satisfaction.

This is odd, because at the end of the day students don’t actually think student services are central to the overall quality of their PSE experience. What they care about first and foremost is the quality of the teaching they experience. Yet institutional surveys seem determined to avoid asking all but the most banal questions about teaching.

Sure, we have course evaluations. But these are used exclusively (if sparingly – but that’s another story) by departments. The data are never used to learn which kinds of teaching methods work better than others, and never linked to demographic data to see whether there are patterns connecting satisfaction or reported learning to student background, the amount of paid work a student does, and so on. They are, in short, a massive lost opportunity.

What about the National Survey of Student Engagement (NSSE)? Well, despite allegedly being outcome-related, NSSE insists on treating a student’s entire class schedule as a single observation. It does ask how often students “work in teams” or “make presentations in class,” but most students have a mix of classes, some of which have these elements and some of which don’t. If you’re trying to understand how different teaching styles affect students, this stuff needs to be broken out class by class. NSSE, for all its cost, is essentially useless for this purpose.
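To make the point concrete, here is a minimal sketch of the “one row per class” layout being argued for, with a toy tabulation attached. The column names and values are invented for illustration; they are not NSSE variables or HESA’s actual instrument.

    # Illustrative "long" layout: one row per (student, class) pair, so the
    # features of each specific class sit next to the student's rating of it.
    # All names and values below are invented for illustration.
    import pandas as pd

    rows = pd.DataFrame({
        "student_id":    [1, 1, 1, 2, 2],
        "class_id":      ["HIST100", "ECON110", "DRAM150", "HIST100", "CHEM101"],
        "group_work":    [False, True, False, False, True],
        "presentations": [True, False, True, True, False],
        "rating":        [4, 2, 5, 3, 3],
    })

    # With class-level rows, you can ask how a specific teaching feature relates
    # to how students rate the class - impossible if the whole schedule is one row.
    print(rows.groupby("group_work")["rating"].mean())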

Getting this data isn’t rocket science. Here at HESA, we regularly run surveys that ask about the details of each of a student’s classes. That’s how we look at class size, how we found out about the characteristics of students’ favourite and least-favourite classes, and how we learned about the effectiveness of e-learning resources in Canadian universities. If we can do it, so can universities and colleges.

From a public policy perspective, the reluctance to look harder at what makes for good teaching looks deeply suspicious. Teaching is what actually matters. It’s why institutions receive tax dollars. Finding out what kinds of pedagogical strategies are most effective should be a moral imperative for anyone who benefits from that amount of public support. But even if that weren’t true, having better knowledge about what works and what doesn’t in teaching is potentially an enormous source of strategic advantage for institutions.

But you can’t act on that advantage until you start collecting the data. So, enough with the student services questions; let’s start asking students real questions about teaching and learning.

January 17

Applicant Surveys We’d Like to See

I’ve always been a bit intrigued by the continuing popularity of applicant surveys. What is it that people expect to see in this year’s results that wasn’t there last year?

There are basically three sets of research questions at the heart of current applicant surveys: who is applying (i.e., the social/ethnic composition of the applicant pool), what tools students are using to get information about institutions, and what students say they are looking for in an institution.

The “who applies” question is an important one, but it’s not one that needs to get asked more than about once every three or four years. The data simply doesn’t change that fast.

At the margin, the “information tools” question does vary a bit with changes in technology. But in essence, the answer is always the same: parents and friends at the top, schools and guidance counsellors in the middle, followed by institutional websites, Maclean’s, and the Globe and Mail, in more or less that order.

(Which is a disaster, of course; some of our recent research on the related topic of student financial aid found that students who relied on information from parents and friends were actually less knowledgeable about the topics at hand than people who had not sought any information at all. Yikes.)

The important question isn’t “where did students get their information,” but “what is it that students think they know about institutions?” Institutions need a much better sense of the state of their own brand if they are going to do anything useful about it. Sure, you can ask applicants whether they think particular universities have “prestige researchers” or offer “a safe environment,” but their answers are mostly ex-ante rationalizations, so why bother?

Bluntly, we know virtually nothing about the process of choice-formation. How do students develop interests in particular fields of study? How do they go about forming a choice set? How much do they understand about the differences between particular institutions? When do they become aware of institutional stereotypes (e.g., Western = “party school,” Trent = tree-huggers), and to what extent do these affect choice?

Admittedly, none of this is easy to get at through a survey. Some of these questions are inevitably qualitative (though analytical software is making post-survey coding ever easier), and even the material that lends itself to quantitative analysis would require a lot of focus-group work to make sense of.
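As a very rough illustration of what post-survey coding can mean in practice, here is a small sketch that assigns thematic codes to open-ended answers using simple keyword matching. The coding scheme and sample responses are invented; a real scheme would be built from the data itself, and real tools are considerably more sophisticated.

    # Toy keyword-based coding of open-ended survey responses. The code labels,
    # keyword stems, and sample answers are invented for illustration only.
    from collections import Counter

    CODES = {
        "communication": ["writing", "presentation", "communicat"],
        "quantitative":  ["statistic", "math", "excel", "data"],
        "networking":    ["contact", "network", "mentor"],
    }

    def code_response(text):
        """Assign zero or more thematic codes to one open-ended response."""
        text = text.lower()
        return {code for code, stems in CODES.items()
                if any(stem in text for stem in stems)}

    sample = [
        "I wish I had learned more statistics and how to work with data.",
        "Nobody taught me how to network or find a mentor.",
    ]
    print(Counter(code for answer in sample for code in code_response(answer)))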

But good things require work. In terms of being able to recruit better students, getting a jump on competitors by gathering data that gets inside the decision-making process is a lot more productive than seeing whether the proportion of applicants saying they are looking for universities with “an emphasis on teaching” has moved another percentage point.

January 16

No More Boring Surveys

As most of you probably know, we at HESA spend a lot of our time working on surveys. While doing so, we see a lot of different types of survey instruments, especially from governments and institutions. And we’ve come to a major conclusion:

Most of them are really boring.

There was a time – say, fifteen years ago – when doing surveys of applicants, graduates, and alumni was relatively rare. There weren’t any surveys of satisfaction, or engagement, or anything else, really. We knew essentially nothing about the composition of the student body, their backgrounds, their finances, or their experiences once enrolled. Apart from the National Graduates Survey that Statistics Canada put out every four years, there was almost nothing to tell us about students.

Things started to change in the late 1990s and early 2000s. Statscan and HRDC between them put the Youth in Transition Survey (YITS) into the field, along with the Post-Secondary Education Participation Survey (PEPS) and the Survey of Approaches to Educational Planning (SAEP) (the latter two now being subsumed into the ASETS survey). A group of twenty or so institutions banded together to create the Canadian Undergraduate Survey Consortium (CUSC); other institutions began signing on to a private-sector initiative (from the company that later became Academica) to look at data about applicants. Provincial governments began surveying graduates more regularly, too, and the Millennium Scholarship Foundation spurred some developments in looking at student financing and at students in private vocational colleges. And that’s not to forget the post-2004 NSSE boom and the plethora of smaller institutional surveys now being conducted.

This explosion of activity – born of a combination of increasing policy interest in the field and decreasing survey costs – was all to the good. We learned a lot from these surveys, even if the insights they generated weren’t always used to improve programming and policy as much as they might have been.

But it’s getting really stale. Now that we have these surveys, we’ve stopped asking new questions. All anyone seems to want to do is keep re-running the same questions so we can build up time-series. Nothing wrong with time-series, of course – but since change in higher education comes at such a glacial pace, we’re wasting an awful lot of time and money measuring really tiny incremental changes in student engagement and the like rather than actually learning anything new. Where ten years ago we were discovering things, now we’re just benchmarking.

It doesn’t have to be this way. Over the next four days, we’ll be giving you our suggestions for how to change Canadian PSE surveys. It’s time to start learning again.