HESA

Higher Education Strategy Associates

Author Archives: Alex Usher

November 01

More Shanghai Needed

I’m in Shanghai this week, a guest of the Center for World-Class Universities at Shanghai Jiao Tong University for their biennial conference. It’s probably the best spot on the international conference circuit to watch how governments and institutions are adapting to a world in which their performance is being measured, compared and ranked on a global scale.

In discussions like this the subject of rankings is never far away, all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, also known as the Shanghai Rankings. This is one of the three main competing world university rankings, the others being the Times Higher Education Supplement (THES) rankings and the QS World University Rankings.

The THES and QS rankings are both commercially-driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).

In terms of reliability, there’s really no comparison. Look at the correlations among the indicators within each ranking: in THES and QS they are very weak (meaning the final results are highly sensitive to the weightings), while in the Shanghai rankings they are very strong (meaning the results are more robust). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of where concentrations of scientific talent lie around the world, the QS and THES rankings are in many respects simply measuring reputation.
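To see why weak inter-indicator correlations make a composite ranking fragile, here is a minimal simulation sketch. It is not based on any ranking’s actual data; the weights, indicator counts and correlation levels are invented purely for illustration:

```python
# Illustrative sketch: how inter-indicator correlation affects the stability
# of a composite ranking under different weighting schemes. All numbers here
# are invented; nothing is drawn from THES, QS or Shanghai data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_universities, n_indicators = 200, 5

def rank_stability(rho):
    """Rank-correlate two composite scores built from indicators that share
    a pairwise correlation of rho, under two different weighting schemes."""
    cov = np.full((n_indicators, n_indicators), rho)
    np.fill_diagonal(cov, 1.0)
    scores = rng.multivariate_normal(np.zeros(n_indicators), cov,
                                     size=n_universities)
    w1 = np.array([0.4, 0.3, 0.1, 0.1, 0.1])  # one plausible weighting
    w2 = np.array([0.1, 0.1, 0.1, 0.3, 0.4])  # a different, equally plausible one
    corr, _ = spearmanr(scores @ w1, scores @ w2)
    return corr

print("weakly correlated indicators: ", rank_stability(0.2))
print("strongly correlated indicators:", rank_stability(0.8))
```

When the indicators barely correlate, two defensible weighting schemes can produce noticeably different league tables; when they correlate strongly, almost any weighting yields much the same order.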

(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)

Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with their more catholic inclusion policy (500 institutions ranked, compared to the THES’s 200), they allow more institutions to compare themselves to the best in the world – at least as far as research is concerned.

October 31

The Trouble with Sniping

Much of the HESA staff was in Fredericton last week at the annual meeting of the Canadian Institutional Research and Planning Association where, as usual, a good and informative time was had by all (hat tip to the CIRPA organizing committee).

But something happened there which bothered me quite a bit: namely, a keynote address in which ACCC President Jim Knight began taking gratuitous potshots at the university sector. I’ve been wondering ever since if this was just an off-night for him, or a sign of a potentially very damaging split in the post-secondary lobbying world.

In the main, Knight’s talk was competent if not especially exciting. The aim of the speech was to list off the challenges the country was facing in terms of growth and social inclusion, and how he thought the post-secondary sector in general and colleges in particular could rise to meet these challenges – pretty standard Ottawa stuff, really.

Where things got a bit dicey was when Knight decided, in the midst of a very relevant discussion about the need for the sector to prove its “relevance” to government, to go on a tangent about how the university community really needed to wake up and smell the coffee on the relevance issue because, while colleges could prove themselves supremely relevant by churning out job-ready graduates, universities, well, you know…

This kind of talk really isn’t helpful. Colleges and universities each have their roles to play in equipping the country’s population with the skills to thrive in the modern economy. Both sectors do very well by international standards, and we have reason to be proud of both. No doubt each sector has its strengths and weaknesses, and the rapid, ongoing shifts in the world of work pose significant challenges to graduates of specific programs in both sectors.

But what on earth is to be gained by one sector pointing the finger at the other and claiming superiority?

I’m really hoping this was just a bad night out for Mr. Knight. Because if this is actually a strategy, if ACCC really thinks the path to success in the new political and economic environment is to get their elbows up and start jostling with the university sector, then both sectors are in trouble.

Good things happen when the community sends positive, co-operative messages to the public. Bad things happen when it doesn’t. Simple as that.

October 28

Why are Toronto Students so Friggin’ Miserable? (Part 3)

So, back to our favourite hobby of delving into the causes of Toronto students’ misery. Today we’re looking at the issue of institutional size and asking the question: are Toronto schools Too Big Not to Fail?

(For those of you tired of hearing about Toronto, bear with us: you can learn a lot about satisfaction generally by following this series.)

First, to put this all in perspective: this year’s Canadian University Report data shows that Toronto students are really unhappy (Figure 1). On a nine-point scale, they rate their satisfaction 0.75 points lower than students from elsewhere in Canada. Given that no school receives an overall satisfaction score less than 5.8 or greater than 8.2, this is a rather substantial difference (1.5 standard deviations, if your unit of analysis is university means).
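For the curious, the arithmetic behind that parenthetical is simple division; a quick sketch (the 0.5-point standard deviation of institutional means is inferred from the figures above, not quoted from the raw data):

```python
# Effect size of the Toronto satisfaction gap, in standard-deviation units.
# The 0.5 SD of university means is an inference from the text (a 0.75-point
# gap described as 1.5 SDs), not a figure from the underlying dataset.
toronto_gap = 0.75            # points on the nine-point satisfaction scale
sd_of_university_means = 0.5  # assumed; implied by the 1.5-SD claim
print(toronto_gap / sd_of_university_means)  # -> 1.5
```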

One obvious suspect is size. Toronto has some of the largest institutions in Canada, and smaller schools generally do better on the CUR, as a quick glance at the results charts in this year’s edition (or at Figure 2) will tell you.

So, given that a majority of students in our sample attend massive schools like University of Toronto or York University, is “Colossal U” the barrier to Toronto’s satisfaction? A closer look at the data suggests not. Toronto actually has institutions in all four CUR size categories. While the size hypothesis could account for low satisfaction grades at U of T’s St. George campus (B-) or York (C+) (Ryerson’s B actually hugs the national average for schools its size), it hardly explains sentiments at Mississauga (C+) or Scarborough (C+). OCAD’s B-minus is perhaps the ultimate proof; it’s not bad for a Toronto school, but still the lowest score nationally among very small institutions. That’s what teaching Torontonians will do to you.

In short, Toronto’s misery is not concentrated in any one institution or even one type of institution; it’s spread among the big and the small alike, as Figure 3 demonstrates.

So much for that obvious explanation. Maybe the problem is that we’re asking the wrong question: instead of looking at sources of dissatisfaction, we need to take a harder look at what factors (other than size) are associated with satisfaction – in particular, the correlations between satisfaction and certain types of institutional and student characteristics. Stay tuned.

— Alex Usher and Jason Rogers.

For the record, no actual Torontonians work at Higher Education Strategy Associates.

October 27

It Was 20 Years Ago Today*

…that the Report of the Commission of Inquiry on Canadian University Education was released.

In 1990, in the midst of deficit crises, national unity crises, etc., AUCC members decided that the only way to focus public attention on education was to appoint an independent commissioner, Dr. Stuart Smith, to shine a spotlight on their own activities. It worked, but probably not in the way they intended.

The first few pages of the report deal in the banalities used by every university president since Jesus was born: essentially, “the system is strong and healthy but could use more money.” That taken care of, Smith then took a vicious left turn from the script and laid into universities for neglecting their teaching mission and spending too much time on scientific research.

To say university presidents felt betrayed would be an understatement. They were not amused by the rather strong implication that their research mission was interfering with their teaching mission (now where have we heard that before?), and weren’t shy about saying so.

Reading the report today, one is struck both by what has changed and what hasn’t. It’s hard not to read the recommendations around credit transfer, the lack of data on faculty teaching loads or the imbalance of incentives around teaching and research and think “plus ça change.” But on the other hand, one can also read the recommendations around access, student assistance, teaching-track faculty, research into higher education and performance indicators and think, “actually, we’ve come a really long way.”

(My favourite recommendation is the one suggesting that all institutions be required to publish the percentage of their budget devoted to helping faculty improve teaching or fund curricular innovation. Yeah it’s unworkable in practice, but it would be deliciously cruel – and probably highly motivational – to have institutions publish numbers that need to be measured in hundredths of percentage points.)

So, lots of progress, but frankly not enough. No one can read the section on teaching and learning and seriously think that the situation has improved in the last twenty years. It’s fair to say that Smith wasn’t providing a balanced picture of universities and their activities in his report. But I think it’s equally fair to say that wasn’t his brief.

Many people speak on behalf of research. Distressingly few, including student leaders, speak to the substance of education itself. The Smith Commission was by some distance the best manifesto for undergraduate education this country has ever produced. We could use another one like it soon.

*I think. It’s hard to tell about things that came out in the pre-Internet era.

October 26

Oh for Heaven’s Sake (Western Canadian Edition)

You may have seen some reporting recently – say, here, here and here – to the effect that I’ve authored a report saying that the intellectual centre of gravity in Canada is moving westward at a rapid rate. You may also have seen me quoted saying things to the effect that it’s a result of sustained funding increases over the past decade in the west, while in Ontario even the major increases seen in McGuinty’s first term were barely able to cope with increased demand, let alone reverse the effects of decades of underfunding.

I do believe all of this (and can argue this point at length, if any of you want to start me off with a beer), but I do feel that I should make it clear that no such report exists.

Here’s what happened: an Ottawa Citizen reporter asked me what the major issues of the Ontario election were in post-secondary education and I pointed to the fact that none of the three parties had promised any increase in PSE funding during the next five years. When asked to describe the possible effects of this, I pointed to the general relative decline of Ontario universities compared to those in the three Western provinces. This last bit, somehow, became the focus of the article.

“Toronto consultant says west is best” is one of those headlines that are hard to resist on the Prairies, so the Calgary Herald and others picked up the story. Then one student newspaper got the wrong end of the stick and assumed that since stories were being published around comments from a think-tank president on a subject, said comments must have originated in a report of some kind. This led to a somewhat surreal CUP podcast in which two journalists discussed a non-existent report.

I thought this was pretty harmless until this new version started getting picked up by places like Globe Campus, at which point I thought “enough is enough.”

So, to be clear: there are a number of Western Canadian universities that rock pretty hard; most eastern universities are struggling to keep up and will continue to do so as they get smacked with the effects of deficit-cutting measures; the difference between east and west is partly money, partly demographics (strong universities are a lagging indicator of economic success, as any academic from China or India could tell you) and in a couple of specific instances it’s about exceptional leadership; and for all these reasons it’s quite fair to call this a shifting of the country’s intellectual centre of gravity.

Just don’t go looking for the report because there isn’t one. Sorry.

October 25

Maslow v. Durkheim in the Canadian University Report

For those of you interested in student ratings of Canadian universities, the Globe and Mail’s Canadian University Report – for which we at HESA do the data work – is out today. I’m not going to recount all the gory details here – they’re available both in the magazine which accompanies today’s paper and online. What I’m going to do instead is outline briefly how the data can be used not just to compare institutions but to answer more profound questions about student experiences.

Most of the literature around “satisfaction” in universities traces back indirectly, through the work on student engagement and student retention, to Durkheim (true story: if you just take Durkheim’s work, cross out the words “commit suicide” and write in “drop out of university,” you’re about 80% of the way to summarizing modern student retention literature). But the thing about most retention and engagement studies is that after you run them through the wringer they all tend to end up with an R-squared of about 0.4. That’s not nothing in social science by any means, but it suggests there’s a lot of other stuff going on that Durkheim can’t explain.

One theory bouncing around the HESA offices is that “belonging” is overrated as an explanation for engagement and satisfaction, and that self-actualization is more important. That, in effect, we need to be looking much more towards Abraham Maslow than to Durkheim for inspiration.

The data provided by the 33,870 students who responded to our Globe survey are a great way to test this hypothesis, thanks to the enormous sample size, the ability to control for all sorts of institutional factors, and (of course) the very detailed information about satisfaction. The survey contains a battery of Durkheim-esque questions about belonging, and also some questions that hint at a Maslowian explanation for satisfaction, notably the one which asks students if their institution has the “right balance between work and fun.”

Using both sets of questions as independent variables vis-à-vis satisfaction and comparing the results can help us gain insights into the Maslow/Durkheim debate – and where we don’t get clear answers, we can use our monthly CanEd student panel to gather further data (wait till you see next month’s survey!).
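To make the method concrete, here is a minimal sketch of that comparison. The file and column names are hypothetical stand-ins invented for illustration; the actual survey items and coding aren’t public:

```python
# Hypothetical sketch: compare how well "belonging" (Durkheim) items and a
# "work/fun balance" (Maslow) item each predict overall satisfaction.
# All file and column names below are invented for illustration.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cur_survey.csv")  # hypothetical extract of the survey data

durkheim_items = ["sense_of_belonging", "campus_community", "peer_connection"]
maslow_items = ["work_fun_balance"]  # the "right balance between work and fun" item

def block_r_squared(items):
    """Fit OLS of overall satisfaction on a block of items; return R-squared."""
    X = sm.add_constant(df[items])
    return sm.OLS(df["overall_satisfaction"], X, missing="drop").fit().rsquared

print("Durkheim (belonging) block R^2:      ", block_r_squared(durkheim_items))
print("Maslow (self-actualization) item R^2:", block_r_squared(maslow_items))
```

Comparing the explained variance of the two blocks (and of both together) is one simple way to judge which account has more empirical traction.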

Over the long term, we think we can build up a pretty good picture of what makes different types of students tick, which will allow us not only to answer questions such as “Why are Toronto Students so Friggin’ Miserable,” but also to answer more profound questions about the sources of student satisfaction.

Over the next few weeks, we’ll be sharing some of this data with you. Stay tuned.

October 24

The Central Canadian Jock Windfall

The other day I was flipping through public policy maven Charles Clotfelter’s new book Big Time Sports in American Universities (you can get the gist via this interview on YouTube). It reminded me to check up a bit on a subject which intrigued me a few years ago, namely the evolution of sports scholarships in Canadian universities.

Fifteen years ago, Canadian athletic scholarships were still both small and rare: even in 2001-02, CIS schools were distributing just $3.4 million in scholarships to 2,439 athletes. By 2009-10, those figures had risen substantially: now, just over 4,000 students per year receive a combined $10 million in athletic scholarships (all data from CIS’s statistics page).

But what’s really interesting is where the increases have occurred. Since 2001-02, scholarships at prestige schools like U of T, UBC and Alberta haven’t increased that much. The Atlantic schools for the most part have kept their increases below the average in the rest of the country (the exceptions being Dalhousie, up 395% to $414,000, and Acadia, up 477% to $550,000), though it is nevertheless significant that this tiny region, home to less than 10% of Canadian students, accounts for 32% of all scholarship spending.

No, the real change in Canadian athletics is happening in big central Canadian universities – basically, the OUA plus the Anglophone Quebec universities. Check out some of these eye-popping percentage increases: McGill, up 664% to $222,000; Concordia, up 886% to $224,000; Carleton, up 1,138% to $265,560. And those are only the ones for which it makes a modicum of sense to express change in percentage terms, i.e., those that were spending more than $20,000 to begin with. We could get into others: Queen’s, up from $18,000 to $177,000; McMaster, up from $5,000 to $218,000; and Wilfrid Laurier, up from $12,000 to $299,000.
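As a back-of-envelope check, you can recover the implied 2001-02 baselines from those “up X% to $Y” figures with a one-line formula; a quick sketch:

```python
# Recover the implied 2001-02 baselines from the "up X% to $Y" figures above.
# Formula: baseline = final / (1 + pct / 100).
increases = {
    "McGill":    (664, 222_000),
    "Concordia": (886, 224_000),
    "Carleton": (1138, 265_560),
}
for school, (pct, final) in increases.items():
    baseline = final / (1 + pct / 100)
    print(f"{school}: ~${baseline:,.0f} in 2001-02 -> ${final:,} in 2009-10")
```

Reassuringly, all three implied baselines come out a shade above $20,000, consistent with the cut-off mentioned above.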

Is this really where these universities want to be seen to be spending money on the eve of a serious downturn in public funding? Even assuming most of this is from alumni donations (which I suspect is not in fact universally the case), what’s the point? Have these universities become noticeably better at inter-university sport? Or has a scholarships arms race just created windfall benefits for a particular group of students?

Sport is about performance. It would be nice if we thought of university budgets the same way.

October 21

Why are Toronto Students so Friggin’ Miserable? (Part Two)

Today we revisit the issue of why student dissatisfaction in Canada seems to be concentrated in Toronto, aka the Centre of the Universe. We’ll try to answer the simple question – do Toronto schools fare poorly because a disproportionate number of Toronto students live in their parents’ basements?

Our data source today is the HESA-administered survey that fuels the satisfaction results in The Globe and Mail’s Canadian University Report, in which students are asked to express satisfaction on a scale ranging from one (Very Dissatisfied) to nine (Very Satisfied). In practice, students only rarely use the bottom half of the scale, so all institutions receive mean scores greater than five.

Living at home is indeed associated with lower satisfaction (Figure 1). Those who manage to escape their parents’ city entirely are the most satisfied. The difference is small but not insignificant – just over 0.4 points (out of 9) on average between the at-home and away-from-family groups. And Toronto certainly has plenty more kids living at home (Figure 2) – 57% of our Toronto sample lives at home, compared to 33% elsewhere.

Figure 1: Overall Satisfaction with Institution, by Living Arrangement


Figure 2: Living Arrangement by Location

So have we found our answer, then? Well, no. As Figure 3 shows, it’s not quite that simple.

Figure 3: Overall Satisfaction with Institution, by Living Arrangement and Location

While Toronto students who are stuck under the ever-vigilant eyes of their parents are indeed the least satisfied, there remains a large (0.6 points out of 9), significant and unexplained satisfaction gap between this group and those who live at home in other cities. Moreover, there’s still clearly a location effect: students who go away to university in Toronto are less happy than students who stay at home elsewhere in Canada.
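The breakdown behind Figure 3 is essentially a two-way table of means. Here is a minimal sketch of how one would produce it (file and column names are hypothetical; the underlying survey data isn’t public):

```python
# Hypothetical sketch: mean satisfaction by location and living arrangement,
# the two-way breakdown behind Figure 3. Names below are invented.
import pandas as pd

df = pd.read_csv("cur_survey.csv")  # hypothetical extract of the survey data

table = (df.groupby(["location", "living_arrangement"])
           ["overall_satisfaction"]
           .mean()
           .unstack())
print(table)
# If living at home were the whole story, the Toronto row and the
# rest-of-Canada row would look alike; the gap that persists within each
# living-arrangement column is the unexplained location effect.
```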

So, living at home is clearly part of the answer, but it’s a long way from answering the question of why Toronto students are so friggin’ miserable. Next week, once we have some data from the new Globe survey (new CUR out on the 25th!), we’ll be delving into issues around institutional size and students’ perceptions of institutional mission.

October 20

What is Research, Anyway?

As we’ve seen repeatedly over the past few weeks, there’s a constituency out there that wants to see greater differentiation of institutions in terms of research-intensiveness. In the vernacular, this comes across as advocating “teaching institutions” to complement “research institutions,” something which occasionally gets incorporated into government policy as it did in British Columbia vis-à-vis the new universities.

This kind of talk, of course, makes much of the professoriate go bananas. And they fire back with good stuff, as Stephen Saideman did, saying that universities aren’t about research vs. teaching, they’re about research and teaching.

But here’s the thing: do we really think both sides mean the same thing when they use the word “research”?

When professors pull out the “my life as a scholar means nothing without research” line, they aren’t necessarily trying to say they all need large research budgets and hordes of grad students and tri-council grants or their lives will be meaningless (well, some might be saying that, but they’re a minority). What they are saying is that research as a process of searching for new knowledge or construction of new meaning – which can be done through low-budget activities like editing journals, writing reviews, etc. – is inherent in the notion of being a scholar, and that institutions where the teaching isn’t done by scholarly people aren’t worthy of being allowed to grant degrees. Fair enough.

On the flip side, when governments say “we want teaching-only institutions,” they’re not saying they wish to ban professors from doing scholarly reading or engaging with colleagues at colloquia, etc. No one’s going to tell professors to give back their SSHRC grants or to stop writing articles. What they are saying is (a) that they don’t want to stump up big bucks for research infrastructure, and (b) that they would prefer a system more closely resembling the U.S. public university system, where professors at flagship institutions essentially teach two courses a semester but everywhere else teach four. Also fair enough – unless one is prepared to argue that non-flagship U.S. institutions aren’t “real universities” because they don’t focus enough on research.

“Research” encompasses a wide variety of activities of varying intensities and time commitments. If we’re going to talk more about the balance between teaching and research, we need to stop making absolute statements about research and start treating the subject with the nuance it deserves.

October 19

Ducking the Issue

Man, did last week’s Globe editorial on reforming higher education get the bien pensants’ knickers in a knot, or what?

Constance Adamson of OCUFA took the predictable “everything would be fine if only there was more money” line. Over at Maclean’s, Todd Pettigrew made a passionate defence of research and teaching being inextricably entwined, largely echoing a piece from the previous week by McGill’s Stephen Saideman, who argued that universities aren’t teaching vs. research but teaching and research.

Methinks some people doth protest too much.

Let’s take it as read that universities are intrinsically about both teaching and research; there’s still an enormous amount of room for discussion about their relative importance. It may be cute to say that choosing between the two is a false dichotomy but in the real world profs make trade-offs: when they increase their research activity, they tend to spend less time teaching. This shouldn’t be controversial. It’s just math.

Unfortunately, obfuscating the trade-offs between research and teaching is a stock in trade of academia. My particular favourite is the old chestnut about research vs. teaching being a false dichotomy because “the best teachers are often the best researchers.” I’m being restrained when I say that this, as an argument, is a bunch of roadapples. As research has consistently shown, the correlation between the two is essentially zero: being a good researcher has no effect on the likelihood of being a good teacher, and vice versa.

Look, there’s lots to quibble with in the Globe editorial, not least of which is the ludicrous insouciance with which it treats the concept of quality measurement. But most of its basic points are factually correct: by and large, parents and taxpayers think the main purpose of universities is to teach undergraduates and prepare them for careers (broadly defined). Canadian academics are, in fact, the most highly paid in the world outside the Ivy League and Saudi Arabia. They are also demonstrably doing less teaching than they used to, ostensibly in order to produce more research.

Anyone who can’t understand why that combination of facts might provoke at least some questioning about value for money really needs to get out more.

One of the sources of miscommunication here is that the seemingly simple term “research” is actually a very contested term which means enormously different things to different people. More on this tomorrow.
