HESA

Higher Education Strategy Associates

October 25

Maslow v. Durkheim in the Canadian University Report

For those of you interested in student ratings of Canadian universities, the Globe and Mail’s Canadian University Report – for which we at HESA do the data work – is out today. I’m not going to recount all the gory details here – they’re available both in the magazine which accompanies today’s paper and online. What I’m going to do instead is outline briefly how the data can be used not just to compare institutions but to answer more profound questions about student experiences.

Most of the literature around “satisfaction” in universities traces back, via the literature on student engagement and student retention, to Durkheim (true story: if you take Durkheim’s work, cross out the words “commit suicide” and write in “drop out of university,” you’re about 80% of the way to summarizing the modern student retention literature). But the thing about most retention and engagement studies is that after you run them through the wringer they all tend to end up with an R-squared of about 0.4. That’s not nothing in social science by any means, but it suggests there’s a lot of other stuff going on that Durkheim can’t explain.

One theory bouncing around the HESA offices is that “belonging” is overrated as an explanation for engagement and satisfaction, and that self-actualization is more important. That, in effect, we need to be looking much more towards Abraham Maslow than to Durkheim for inspiration.

Examining data provided by the 33,870 students who responded to our Globe survey is a great way to test this hypothesis, because of the enormous sample size, the ability to control for all sorts of institutional factors, and (of course) the very detailed information about satisfaction. The survey contains a battery of Durkheim-esque questions about belonging, and also some questions that hint at a Maslowian explanation for satisfaction, notably the one which asks students whether their institution has the “right balance between work and fun.”

Using both sets of questions as independent variables vis-à-vis satisfaction and comparing the results can help us gain insight into the Maslow/Durkheim debate – and where we don’t get clear answers, we can use our monthly CanEd student panel to gather further data (wait till you see next month’s survey!).
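Mechanically, that comparison amounts to regressing satisfaction on each battery of questions and comparing explanatory power. Here’s a minimal sketch on synthetic data – the variable names, coefficients, and numbers are all invented for illustration, and nothing here comes from the actual CUR dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical survey items (names are illustrative, not the real CUR variables):
belonging = rng.normal(5, 1.5, n)   # Durkheim-style "sense of belonging" item
balance = rng.normal(5, 1.5, n)     # Maslow-style "right balance of work and fun" item
# Assume, purely for illustration, that satisfaction loads more heavily on balance:
satisfaction = 1.0 + 0.2 * belonging + 0.6 * balance + rng.normal(0, 1, n)

def r_squared(X, y):
    """Fit OLS with an intercept and return the coefficient of determination."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print(f"Belonging alone: R^2 = {r_squared(belonging, satisfaction):.3f}")
print(f"Balance alone:   R^2 = {r_squared(balance, satisfaction):.3f}")
print(f"Both together:   R^2 = {r_squared(np.column_stack([belonging, balance]), satisfaction):.3f}")
```

With real survey data, whichever battery adds more explanatory power on top of the other is the better candidate driver of satisfaction.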

Over the long term, we think we can build up a pretty good picture of what makes different types of students tick, which will allow us not only to answer questions such as “Why are Toronto Students so Friggin’ Miserable,” but also to answer more profound questions about the sources of student satisfaction.

Over the next few weeks, we’ll be sharing some of this data with you. Stay tuned.

October 24

The Central Canadian Jock Windfall

The other day I was flipping through public policy maven Charles Clotfelter’s new book Big Time Sports in American Universities (you can get the gist via this interview on YouTube). It reminded me to check up a bit on a subject which intrigued me a few years ago, namely the evolution of sports scholarships in Canadian universities.

Fifteen years ago, Canadian athletic scholarships were still both small and rare: even in 2001-02, CIS schools were distributing just $3.4 million in scholarships to 2,439 athletes. By 2009-10, those figures had risen substantially: now, just over 4,000 students per year receive a combined $10 million in athletic scholarships (all data from CIS’s statistics page).

But what’s really interesting is where the increases have occurred. Since 2001-02, scholarships at prestige schools like U of T, UBC and Alberta haven’t increased that much. The Atlantic schools for the most part have kept their increases below the average in the rest of the country (the exceptions being Dalhousie, up 395% to $414,000, and Acadia, up 477% to $550,000), though it is nevertheless significant that this tiny region, home to less than 10% of Canadian students, accounts for 32% of all scholarship spending.

No, the real change in Canadian athletics is happening in big central Canadian universities – basically, the OUA plus the Anglophone Quebec universities. Check out some of these eye-popping percentage increases: McGill up 664% to $222,000, Concordia up 886% to $224,000, Carleton up 1138% to $265,560. And those are only the institutions for which it makes a modicum of sense to express change in percentage terms, i.e., those that were spending more than $20,000 to begin with. We could get into others: Queen’s, up from $18,000 to $177,000; McMaster, up from $5,000 to $218,000; and Wilfrid Laurier, up from $12,000 to $299,000.
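As a quick plausibility check on figures like these, the starting amount is implied by the final amount and the percentage increase, since final = base × (1 + pct/100). A small sketch, using the Carleton figures quoted above:

```python
def implied_base(final_amount, pct_increase):
    """Back out the starting amount implied by a final amount and a % increase."""
    return final_amount / (1 + pct_increase / 100)

# Carleton: up 1138% to $265,560 (figures from the CIS data cited above)
base = implied_base(265560, 1138)
print(f"Implied 2001-02 spending: ${base:,.0f}")  # just over $21,000, consistent with the $20,000 threshold
```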

Is this really where these universities want to be seen to be spending money on the eve of a serious downturn in public funding? Even assuming most of this is from alumni donations (which I suspect is not in fact universally the case), what’s the point? Have these universities become noticeably better at inter-university sport? Or has a scholarships arms race just created windfall benefits for a particular group of students?

Sport is about performance. It would be nice if we thought of university budgets the same way.

October 21

Why are Toronto Students so Friggin’ Miserable? (Part Two)

Today we revisit the issue of why student dissatisfaction in Canada seems to be concentrated in Toronto, aka the Centre of the Universe. We’ll try to answer the simple question – do Toronto schools fare poorly because a disproportionate number of Toronto students live in their parents’ basements?

Our data source today is the HESA-administered survey that fuels the satisfaction results in The Globe and Mail’s Canadian University Report, in which students are asked to express satisfaction on a scale ranging from one (Very Dissatisfied) to nine (Very Satisfied). In practice, students only rarely use the bottom half of the scale, so all institutions receive mean scores greater than five.

Living at home is indeed associated with lower satisfaction (Figure 1). Those who manage to escape their parents’ city entirely are the most satisfied. The difference is small but not insignificant – just over 0.4 points (out of 9) on average between the at-home and away-from-family groups. And Toronto certainly has plenty more kids living at home (Figure 2) – 57% of our Toronto sample lives at home, compared to 33% elsewhere.

Figure 1: Overall Satisfaction with Institution, by Living Arrangement

 

Figure 2: Living Arrangement by Location

So have we found our answer, then? Well, no. As Figure 3 shows, it’s not quite that simple.

Figure 3: Overall Satisfaction with Institution, by Living Arrangement and Location

While Toronto students who are stuck under the ever-vigilant eyes of their parents are indeed the least satisfied, there remains a large (0.6 points out of 9), significant and unexplained satisfaction gap between this group and those who live at home in other cities. Moreover, there’s still clearly a location effect: students who go away to university in Toronto are less happy than students who stay at home elsewhere in Canada.
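For readers curious about the mechanics, a two-way breakdown like the one behind the figure is a one-liner once the microdata are in a dataframe. A sketch with invented numbers (these are not the actual CUR results):

```python
import pandas as pd

# Toy records standing in for survey responses (values invented for illustration)
df = pd.DataFrame({
    "location":     ["Toronto", "Toronto", "Toronto", "Elsewhere", "Elsewhere", "Elsewhere"],
    "arrangement":  ["at home", "away", "at home", "at home", "away", "away"],
    "satisfaction": [5.8, 6.4, 5.6, 6.3, 6.9, 7.1],  # 1-9 scale, as in the CUR
})

# Mean satisfaction for each living-arrangement/location cell
table = df.pivot_table(values="satisfaction", index="arrangement",
                       columns="location", aggfunc="mean")
print(table.round(2))
```

The real analysis is the same operation on ~34,000 rows, with the interesting part being the gaps that survive after the cells are separated out.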

So, living at home is clearly part of the answer, but it’s a long way from answering the question of why Toronto students are so friggin’ miserable. Next week, once we have some data from the new Globe survey (new CUR out on the 25th!), we’ll be delving into issues around institutional size and students’ perceptions of institutional mission.

October 20

What is Research, Anyway?

As we’ve seen repeatedly over the past few weeks, there’s a constituency out there that wants to see greater differentiation of institutions in terms of research-intensiveness. In the vernacular, this comes across as advocating “teaching institutions” to complement “research institutions,” something which occasionally gets incorporated into government policy as it did in British Columbia vis-à-vis the new universities.

This kind of talk, of course, makes much of the professoriate go bananas. And they fire back with good stuff, as Stephen Saideman did, saying that universities aren’t about research vs. teaching, they’re about research and teaching.

But here’s the thing: do we really think both sides mean the same thing when they use the word “research”?

When professors pull out the “my life as a scholar means nothing without research” line, they aren’t necessarily trying to say they all need large research budgets and hordes of grad students and tri-council grants or their lives will be meaningless (well, some might be saying that, but they’re a minority). What they are saying is that research as a process of searching for new knowledge or construction of new meaning – which can be done through low-budget activities like editing journals, writing reviews, etc. – is inherent in the notion of being a scholar, and that institutions where the teaching isn’t done by scholarly people aren’t worthy of being allowed to grant degrees. Fair enough.

On the flip side, when governments say “we want teaching-only institutions,” they’re not saying they wish to ban professors from doing scholarly reading or engaging with colleagues at colloquia, etc. No one’s going to tell professors to give back their SSHRC grants or to stop writing articles. What they are saying is (a) that they don’t want to stump up big bucks for research infrastructure and (b) they would prefer a system that more closely resembles the U.S. public university system where at flagship institutions, professors essentially teach two courses a semester but everywhere else, they teach four. Also fair enough – unless one is prepared to argue that every non-flagship U.S. institution isn’t a “real university” because they don’t focus enough on research.

“Research” encompasses a wide variety of activities of varying intensities and time commitments. If we’re going to talk more about the balance between teaching and research, we need to stop making absolute statements about research and start treating the subject with the nuance it deserves.

October 19

Ducking the Issue

Man, did last week’s Globe editorial on reforming higher education get the bien pensants’ knickers in a knot, or what?

Constance Adamson of OCUFA took the predictable “everything would be fine if only there was more money” line. Over at Maclean’s, Todd Pettigrew made a passionate defence of research and teaching being inextricably entwined, largely echoing a piece from the previous week by McGill’s Stephen Saideman, who argued that universities aren’t teaching vs. research but teaching and research.

Methinks some people doth protest too much.

Let’s take it as read that universities are intrinsically about both teaching and research; there’s still an enormous amount of room for discussion about their relative importance. It may be cute to say that choosing between the two is a false dichotomy, but in the real world profs make trade-offs: when they increase their research activity, they tend to spend less time teaching. This shouldn’t be controversial. It’s just math.

Unfortunately, obfuscating the trade-offs between research and teaching is a stock-in-trade of academia. My particular favourite is the old chestnut about research vs. teaching being a false dichotomy because “the best teachers are often the best researchers.” I’m being restrained when I say that this, as an argument, is a bunch of roadapples. As research has consistently shown, the correlation between the two is essentially zero: being a good researcher has no bearing on the likelihood of being a good teacher, and vice versa.

Look, there’s lots to quibble with in the Globe editorial, not least of which is the ludicrous insouciance with which it treats the concept of quality measurement. But most of its basic points are factually correct: by and large, parents and taxpayers think the main purpose of universities is to teach undergraduates and prepare them for careers (broadly defined). Canadian academics are, in fact, the most highly paid in the world outside the Ivy League and Saudi Arabia. They are also demonstrably doing less teaching than they used to, ostensibly in order to produce more research.

Anyone who can’t understand why that combination of facts might provoke at least some questioning about value for money really needs to get out more.

One of the sources of miscommunication here is that the seemingly simple term “research” is actually a very contested term which means enormously different things to different people. More on this tomorrow.

October 18

Well, That Was Interesting

The Report of the Expert Panel on R&D, that is. It’s an intriguing and well-written piece of work (kudos to Peter Nicholson), at least as much for what it doesn’t say as what it does.

There are three things this report does extremely well: i) it explains the mind-boggling number of tiny programs the federal government supports; ii) it graphically shows how the Scientific Research and Experimental Development program massively overshadows all other programs combined; and iii) it amusingly tells the government in no uncertain terms that the bit in its mandate about evaluating the relative effectiveness of programs was a crock, because no data exist to allow such a comparison.

(Seriously – read the first three pages of chapter 5, because they set a whole new standard in expert panels rejecting the premise of their terms of reference.)

Those pieces of fabulousness aside, the panel came up with six main recommendations and a bunch of subsidiary ones, which you can find in short form here. Some of them are completely innocuous, such as “encouraging more collaboration” and “simplifying forms”; some are only mildly innocuous, like “having a national dialogue about innovation.” Some of them are neither innocuous nor radical but simply overdue, especially the recommendations on improving venture capital funding.

Others are more radical, such as the creation of a one-stop shop known as the Industrial Research and Innovation Council. The effectiveness of such bodies has to be taken somewhat on faith; single windows have their advantages, but they depend on effective management, which not all arm’s-length organizations have. The most intriguing mandate proposed for this body is the creation of a national “business innovation talent strategy.” It’s a fascinating idea, but one which depends on a great deal of co-operation among several ministries across two levels of government, plus educational institutions. There’s some potential for crashing and burning here – but potential for real innovation breakthroughs as well.

The headline recommendation, though, has to be the call to “transform” the National Research Council, through a process which skeptics might call “asset stripping.” Basically, all 17 of its institutes are to be incorporated into a government agency, turned into non-profits, or integrated with a university or universities. University government relations offices should have a ball with that last one: let the lobbying begin!

But most amazing of all – there is nothing in the recommendations about any of the NSERC or tri-council programs that were included in the review. Not. One. Word.

How did that happen? My guess is we’ll never find out. But I’ll bet it’s a really good story.

October 17

The Future of Canadian R&D – Round One

The Mowat Institute showed some canny timing by releasing its paper, Canada’s Innovation Underperformance: Whose Policy Problem Is It?, on the Friday before the federal government’s Research and Development Review Panel reports. It was a real master-class in media management.

The report, authored by Tijs Creutzberg, doesn’t break a lot of new ground; in many ways it’s just a lit review, albeit a very nicely-written one. Basically, it argues two things: i) that our government innovation strategies are overly biased towards tax-credits and make insufficient use of direct cash support and ii) that there is too much overlap between federal and provincial policy instruments.

Though the first issue got the lion’s share of the media attention Friday, it’s actually the place where the report is thinnest. The report’s “evidence” basically consists of one graph which shows Canada as a policy outlier in its reliance on tax credits (not news if you’ve been keeping up with the OECD literature), and two paper citations on the benefits of direct subsidies over tax credits (one of which, if you bother to look it up, actually says nothing at all about the relative efficacy of direct support versus tax credits). Creutzberg may well be right about this, but on the evidence presented, it’s hard to tell.

On the second issue – that of carving out more rational policy roles for the federal and provincial governments – Creutzberg oozes good sense about the importance of place and regions in research, and then comes up with an eminently logical way of dividing up policy responsibilities between the two. The problem is that some of the recommendations come off sounding a tad too idealistic. However sensible it might be to get the federal government out of direct cluster-specific subsidies or for provinces to abjure sector-specific tax credits, it’s really, really hard to imagine it ever happening. Forget theories of federalism – those programs win votes, and politicians don’t give up vote-winners easily.

Untouched in Creutzberg’s paper is the issue of how all those federal billions that go to university research play into our research and innovation system. That is likely going to be the centerpiece of today’s paper from the Expert Panel. There’s a serious air of anticipation about this report; despite rumours of a divided panel not a single leak has taken place, which in Ottawa is about as rare as a Senators’ playoff run. It should be interesting.

Tune in tomorrow for more.

October 14

Teaching, Testing, Grading

In the last couple of months, some very interesting memes have started to take shape around the role of the professoriate.

Grade inflation – or grade compression as some would have it – is of course quite real. Theories for it vary; there’s the “leave-us-alone-so-we-can-do-research theory,” and also the “professors-are-spineless-in-the-face-of-demanding-students theory.” Regardless of the cause, the agent is clear: professors. They simply haven’t held a consistent standard over time, and that’s a problem.

About two months ago, the Chronicle put together a very interesting article on Western Governors University and how they’ve managed to avoid grade inflation. Simply put: they don’t let teachers grade. Rather, they leave that job to cohorts of assessors, all of whom possess at least a Master’s degree in the subject they are grading, and who are quite separate from the instructors.

This kind of makes sense: teachers are subject matter experts, but they aren’t expert assessors, so why not bring in people who are? Unlike professors, who have to put up with course evaluations, independent assessors have no incentive to skew grades.

One could take this further. Not only are professors not experts at grading, but they aren’t necessarily experts at devising tests, either. Solution? Step forward Arnold Kling of George Mason University, who recommends improving testing by having outside professionals take a professor’s lecture notes and course readings and fashion a test on the basis of them.

Are there good reasons to try these ideas? On grading, the gains might be in quality rather than cost. Informally, TAs do a lot of the grading in large classrooms anyway, so it’s not as if we aren’t already quasi-outsourcing this stuff. But TAs have no more assessment expertise than professors, so professionalizing the whole thing might be beneficial. On testing, you might not get a cost advantage unless you had some economies of scale (i.e., you’d need multiple participating institutions to make it worthwhile), though again there may be quality advantages.

Of course, to get any cost savings at all on either of these, you’d need to get professors to explicitly trade their testing and marking responsibilities in these areas for greater class loads. Have them teach three courses a term instead of two, but do less in each of them. It’s hard to say if anyone would bite on that one; but given coming funding crunches, it might be worth somebody at least trial-ballooning these ideas during their next collective bargaining round.

October 13

Why the “Great Disruption” is Bogus

So apparently Inigral CEO Michael Staton – who by and large is a sensible guy – has been talking up this idea about higher education being about to undergo a “Great Disruption.” Why he thinks this is the case isn’t clear – he spends most of his Inside Higher Ed article explaining why higher education isn’t, contra some of higher education’s weirder critics, in a bubble, but he does think everyone needs to spend a lot of money adapting to it right now.

This is what’s known as “talking your own book.” We consultants all do it to some degree, but that doesn’t mean we’re always right. And in this case, I happen to think Staton’s dead wrong.

The best analogy I can think of here is the frenzy over the “Death of Distance” in the mid-1990s. You may recall that there was briefly an intellectual fad for thinking that the “information superhighway” (younger readers: yes, some people really called it that) would render place irrelevant, allow people to work and collaborate from wherever they were, and make large urban agglomerations ever less relevant.

Some of that occurred, of course, but as it turned out place started to matter more than ever. We like talking to people all over the world on the Internet, but we like physically working with and learning from others in the flesh.

And so it is for higher education. Obviously, technological change is having a very big effect on the way we store, use and relate to information. At the margin, we can improve undergraduate learning outcomes using technology, though as we pointed out in a study a few weeks ago, we need to get a lot better at integrating the technology.

But is this stuff going to replace a traditional undergraduate degree for the 18-24 crowd (which is, after all, still the core business of just about every university in the world)? Absolutely not. Students and their parents think it’s as important as ever to get their education in the flesh. There is no reason at all to think that this is going to change, and every reason to believe that parents will continue to pay top dollar for a developmental experience which is deeply based in intensive human contact.

As I noted last week, one university business line (adult professional education) is vulnerable to new technologies. Everything else, for the foreseeable future, is going to be remarkably insensitive to technological change. Period.

October 12

Enough About National Goals, Already

In any discussion of Canadian post-secondary education, you know you’re about to approach an impasse when someone starts blaming some real or imagined ill on the lack of “national goals” or the absence of a federal ministry of higher education.

Honestly, who cares? The lack of a Department of Education until the Carter administration didn’t stop the U.S. from creating one of the world’s great PSE systems. Our lack of one hasn’t prevented us from having world-class research funding or one of the most accessible systems of higher education globally.

But that hasn’t stopped the Canadian Council on Learning from devoting its valedictory report to the subject of – yes, you guessed it – how terrible it is that Canada has no national goals in education.

Once it became clear that CCL, a federally-funded entity, was not going to have the support or co-operation of the provinces (due in no small measure to CEO Paul Cappon’s own behavior – he secretly worked with the feds to set up the Council while still working for the Council of Ministers of Education, Canada), it was always going to struggle to find a role or a niche. The council decided early on that banging the drum for “national goals” (which is partially but not entirely code for “more federal involvement”) was going to be it.

From then on, virtually any problem one could name in higher education was a problem of national goals. CCL had a hammer, and every problem was a nail. Just read the CCL report and see. Immigrant skills not meeting labour market demand? That’s a result of not having national goals in PSE. StatsCan unable to make its data comparable with the OECD’s? National goals, again. The rather more sensible propositions – that poor immigration policy or a lack of imagination at StatsCan might be the issue – don’t even enter the picture.

The report baselessly asserts that more national action could reduce the male-female attainment gap or improve apprenticeship completion rates. Even more baselessly, it asserts that Canada has no quality assurance agencies in PSE when in fact seven of ten provinces have one, with an eighth (Saskatchewan) about to bring one in.

It’s the same kind of magical thinking used by Quebec separatists. The nature of the actual problem barely matters: once we have achieved separatism/adopted national policy goals in PSE, those problems will disappear. From an organization that once aspired to thought leadership in the field, it’s a disappointingly simplistic and ahistorical approach.
