HESA

Higher Education Strategy Associates

November 04

Why Are Toronto Students so Friggin’ Miserable? (Part Quatre)

To date, we’ve been musing about the causes of Toronto students’ dissatisfaction. But let’s put the shoe on the other foot for a bit: what causes student satisfaction to begin with?

One thing we ask students in the Globe Canadian University Report survey is how they perceive a number of dimensions of their institution’s character. For instance, we ask them whether they think their institution’s curriculum is more theoretical or applied, whether the institution is broadly based or focused on a few areas of study, etc. (to save space, we’ll spare you the methodology and wording caveats; contact us directly if you want to know more).

It turns out that students’ perceptions of certain dimensions of institutional character are remarkably closely correlated with satisfaction:

Here’s what Figure 1 tells us: Students are more satisfied if they think their institution isn’t specializing in certain areas, but rather is spreading the (monetary) love about evenly across different fields of study. If they think their curriculum is practical/applied, they’re happier than if they think it’s theoretical. Students who see their schools as expecting them to be self-sufficient or as cautious toward new approaches and ideas are miserable; by contrast, those who see their institution as “nurturing and supportive” or “open to new approaches” are extremely satisfied. Indeed, the range of satisfaction from one end of the scale to the other for both of these measures is over three points: that’s larger than the entire range of institutional satisfaction results in the 2012 Canadian University Report.
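
For the methodologically curious, here’s a minimal sketch of the kind of calculation behind a chart like Figure 1 – made-up numbers and hypothetical column names such as openness_rating and satisfaction, not the actual CUR variables or results:

```python
# Toy sketch: mean satisfaction by response category on one
# institutional-character item. Data and column names are invented.
import pandas as pd

def satisfaction_by_rating(df: pd.DataFrame, character_item: str,
                           satisfaction_col: str = "satisfaction") -> pd.Series:
    """Mean satisfaction for each response category of one character item."""
    return df.groupby(character_item)[satisfaction_col].mean().sort_index()

# A 1-5 item running from "cautious" (1) to "open to new approaches" (5)
toy = pd.DataFrame({
    "openness_rating": [1, 1, 2, 3, 3, 4, 5, 5, 5],
    "satisfaction":    [4.8, 5.2, 5.9, 6.4, 6.6, 7.1, 7.9, 8.1, 8.3],
})
means = satisfaction_by_rating(toy, "openness_rating")
print(means)
print("range across categories:", round(means.max() - means.min(), 1))
```

Run item by item on the real microdata (with the appropriate weighting), something like this produces the category means a chart like Figure 1 plots.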

So, might these factors explain our Toronto problem? Are Toronto institutions seen as too theoretical, closed to new approaches, etc.?

It turns out that Toronto institutions aren’t rated differently from institutions elsewhere in terms of spreading their wealth and are barely different in terms of being open to new ideas. A theoretical curriculum is a big factor for St. George, but not for the rest of Toronto on average (though there is variation – applied curricula at Ryerson and OCAD offset theory at York and the U of T satellites). Only on the issue of being insufficiently nurturing/supportive of students – a measure inversely correlated with institutional size – is there a clear difference between Toronto schools and those elsewhere. But even this is less than it seems. Toronto’s numbers are being driven by the three U of T campuses and OCAD (all of which do badly even once size is taken into account); York’s score is about average, while Ryerson does exceptionally well on this measure.

In short, institutional characteristics are an important driver of satisfaction generally, but can only partially explain our Toronto conundrum.

The search continues next week. Stay tuned.

— Alex Usher and Jason Rogers

November 03

The Robin des Bois of Canadian Higher Education

In its budget this past spring, Jean Charest’s government announced its plans to increase tuition in Quebec by $325 per year for five years, beginning next fall. By 2016-17, the basic undergraduate tuition in Quebec will reach $3,792 for a typical, 30-credit year. While the tuition increase will keep Quebec students’ fees well below the average elsewhere in Canada, the increases still clock in at 75% over five years. Clearly there is potential for a significant impact on enrolment.
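
In case you want to check the arithmetic yourself – a quick back-of-the-envelope calculation, assuming the pre-increase base is simply the 2016-17 figure minus the five annual increments:

$$
5 \times \$325 = \$1{,}625, \qquad \$3{,}792 - \$1{,}625 = \$2{,}167, \qquad \frac{\$1{,}625}{\$2{,}167} \approx 75\%.
$$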

So it was with great pleasure that I read the recent report of the province’s committee on access (en français), the Comité sur l’accessibilité financière aux études.  The report describes the government’s plans to expand the already robust student aid system to ensure the most vulnerable students remain immune from a price shock.

Here’s the bluffer’s guide to Quebec’s tuition increase:

1. Tuition will increase by 75% by 2016-17…

2. Except for low-income students who receive financial aid, since the province will cover the entirety of their expanded tuition bill in the form of non-repayable bursaries; some lower-middle-income students will also benefit from expanded access to the bursary program…

3. Additionally, the province is planning to expand access to its loans and bursaries program by reducing the amount of income it expects a parent or spouse to contribute to a student’s education.

4. All told, the province is ramping up student aid funding by $118.4 million, an amount equivalent to 35% of the new tuition revenue.

The province’s student groups are bellyaching about the tuition increase and the fact that, from a certain perspective, the province is robbing Peter to pay Paul. There is some merit to this view – after all, the expansion of student aid is only necessary because tuition is going up, and it is going to be funded from new tuition revenue. Except that Peter will still be getting a relative deal on higher education, Paul needs the additional support and Mary (the gouvernement) is out of cash. The analogy that fits best is that of an institutionalized Robin Hood: wealthy families will contribute more to a system in dire need of funds, and low-income families won’t find themselves overstretched.

A tip of the cap to the Comité: through research and thoughtful reflection, it has shown exactly how a tuition hike can be made progressive.  Félicitations!

November 02

Many Bolognas

I spent part of October in Bucharest at the Bologna Future of Higher Education conference, trying, as I always do at these things, to get my head around what is happening in European higher education.

Part of the problem of trying to follow the Bologna Process is that there are many Bolognas that exist side by side. There is the “formal” Bologna – which is actually a crashing bore, unless you’re really into diploma supplements and qualifications frameworks and quality assurance processes – and the “informal” Bologna of student-centred learning, social dimensions and the Tuning process (basically, all the stuff Cliff Adelman writes about), which is all pretty groovy and gets most of the attention.

There is the Bologna of the Communiqués, the strong declarations about progress made and future challenges to be met, and the much messier Bologna of the Trenches, where the high phrases meet cold institutional reality. The latter, believe me, is a heck of a lot messier than anyone lets on.

There is European Bologna, which is what everyone agrees to, and there are the many Local Bolognas. Pretty much every country has its own, independent Bologna process because – Bologna being a process rather than a set of objectives or legal obligations – most national governments have been able to slip all sorts of local reforms (sometimes petty and irritating, sometimes decades overdue) past their higher education systems. As a result, the Bologna process has proceeded differently in different countries.

Finally, there is the Bologna of the Politicians (and sometimes Rectors, too), who deal in high politics, and the Bologna of the Education Policy Nerds (my peeps!), who have managed to use the brief policy opening offered by the initial flood of Bologna-mania to initiate and sustain a number of continent-wide discussions about a variety of pedagogical, curricular and managerial modernizations.

It is kind of amazing how all of these different Bolognas manage to co-exist side by side. We Canadians sometimes like to think of ourselves as flexible and pragmatic compared to those stuffy and inflexible continentals, but I’m pretty sure we’d have a nervous breakdown trying to deal with what Europeans take in their stride.

How do they do it? Basically, they don’t get hung up on small ideas like unanimity and full compliance. They get a critical mass of institutions or countries together with a bunch of stakeholders and start moving in one direction on an issue. If the others don’t join or don’t catch up, that’s their problem.

We could do that, too, on files like learning outcomes or credit transfer, if we really tried, and someone were willing to start the ball rolling. But it’s an approach so foreign to our psyche that my guess is it will never happen.

November 01

More Shanghai Needed

I’m in Shanghai this week, a guest of the Center for World-Class Universities at Shanghai Jiao Tong University for their biennial conference. It’s probably the best spot on the international conference circuit to watch how governments and institutions are adapting to a world in which their performance is being measured, compared and ranked on a global scale.

In discussions like this the subject of rankings is never far away, all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, also known as the Shanghai Rankings. This is one of three main competing world rankings in education, the others being the Times Higher Education Supplement (THES) and the QS World Rankings.

The THES and QS rankings are both commercially-driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).

In terms of reliability, there’s really no comparison. If you look at the correlations between the indicators used in each of the rankings, those for THES and QS are very weak (meaning that the final results are highly sensitive to the weightings), while those for the Shanghai rankings are very strong (meaning the results are more robust). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of concentrations of scientific talent around the world, the QS and THES rankings in many respects are simply measuring reputation.

(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)
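
For the statistically inclined, here’s a toy simulation – made-up data and arbitrary weights, not the actual THES, QS or Shanghai methodologies – of why weakly correlated indicators make a composite ranking sensitive to its weights:

```python
# Toy illustration: build a composite ranking from three indicators twice,
# with two different weighting schemes, and see how much the ranking moves.
import numpy as np

rng = np.random.default_rng(0)

def rank_stability(indicator_corr: float, n_schools: int = 200) -> float:
    """Correlation between two composite rankings built from the same three
    indicators but with different (arbitrary) weighting schemes."""
    cov = np.full((3, 3), indicator_corr)
    np.fill_diagonal(cov, 1.0)
    indicators = rng.multivariate_normal(np.zeros(3), cov, size=n_schools)

    score_a = indicators @ np.array([0.7, 0.2, 0.1])  # one weighting scheme
    score_b = indicators @ np.array([0.1, 0.2, 0.7])  # a different weighting scheme
    rank_a = score_a.argsort().argsort()              # convert scores to ranks
    rank_b = score_b.argsort().argsort()
    return float(np.corrcoef(rank_a, rank_b)[0, 1])

print("weakly correlated indicators:  ", round(rank_stability(0.2), 2))
print("strongly correlated indicators:", round(rank_stability(0.9), 2))
```

With weakly correlated indicators, swapping the weights visibly reshuffles the league table; with strongly correlated indicators it barely moves – which is the sense in which the Shanghai results are the more robust of the three.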

Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with their more catholic inclusion policy (500 institutions ranked, compared to the THES’s 200), they allow more institutions to compare themselves to the best in the world – at least as far as research is concerned.

October 31

The Trouble with Sniping

Much of the HESA staff was in Fredericton last week at the annual meeting of the Canadian Institutional Research and Planning Association (CIRPA), where, as usual, a good and informative time was had by all (hat tip to the CIRPA organizing committee).

But something happened there which bothered me quite a bit: namely, a keynote address in which ACCC President Jim Knight began taking gratuitous potshots at the university sector. I’ve been wondering ever since if this was just an off-night for him, or a sign of a potentially very damaging split in the post-secondary lobbying world.

In the main, Knight’s talk was competent if not especially exciting. The aim of the speech was to list off the challenges the country was facing in terms of growth and social inclusion, and how he thought the post-secondary sector in general and colleges in particular could rise to meet these challenges – pretty standard Ottawa stuff, really.

Where things got a bit dicey was when Knight decided, in the midst of a very relevant discussion about the need for the sector to prove its “relevance” to government, to go on a tangent about how the university community really needed to wake up and smell the coffee on the relevance issue because, while colleges could prove themselves supremely relevant by churning out job-ready graduates, universities, well, you know…

This kind of talk really isn’t helpful. Colleges and universities each have their roles to play in equipping the country’s population with the skills to thrive in the modern economy. Both sectors do very well by international standards, and we have reason to be proud of both. No doubt each sector has its strengths and weaknesses, and the rapid, ongoing shifts in the world of work pose significant challenges to graduates of specific programs in both sectors.

But what on earth is to be gained by one sector pointing the finger at the other and claiming superiority?

I’m really hoping this was just a bad night out for Mr. Knight. Because if this is actually a strategy, if ACCC really thinks the path to success in the new political and economic environment is to get their elbows up and start jostling with the university sector, then both sectors are in trouble.

Good things happen when the community sends positive, co-operative messages to the public. Bad things happen when it doesn’t. Simple as that.

October 28

Why Are Toronto Students so Friggin’ Miserable? (Part 3)

So, back to our favourite hobby of delving into the causes of Toronto students’ misery. Today we’re looking at the issue of institutional size and asking the question: are Toronto schools Too Big Not to Fail?

(For those of you tired of hearing about Toronto, bear with us: you can learn a lot about satisfaction generally by following this series.)

First, to put this all in perspective: this year’s Canadian University Report data shows that Toronto students are really unhappy (Figure 1). On a nine-point scale, they rate their satisfaction 0.75 points lower than students from elsewhere in Canada. Given that no school receives an overall satisfaction score below 5.8 or above 8.2, this is a rather substantial difference (1.5 standard deviations, if your unit of analysis is university means).

One obvious suspect is size. Toronto has some of the largest institutions in Canada, and smaller schools generally do better on the CUR, as a quick glance at the results charts in this year’s edition (or at Figure 2) will tell you.

So, given that a majority of students in our sample attend massive schools like the University of Toronto or York University, is “Colossal U” the barrier to Toronto’s satisfaction? A closer look at the data suggests not. Toronto actually has institutions in all four CUR size categories. While the size hypothesis could account for low satisfaction grades at U of T’s St. George campus (B-) or York (C+) (Ryerson’s B actually hugs the national average for schools its size), it hardly explains sentiments at Mississauga (C+) or Scarborough (C+). OCAD’s B- is perhaps the ultimate proof; it’s not bad for a Toronto school, but it’s still the lowest score nationally among very small institutions. That’s what teaching Torontonians will do to you.

In short, Toronto’s misery is not concentrated in any one institution or even one type of institution; it’s spread among the big and the small alike, as Figure 3 demonstrates.

So much for that obvious explanation. Maybe the problem is that we’re asking the wrong question: instead of looking at sources of dissatisfaction, we need to take a harder look at what factors (other than size) are associated with satisfaction – in particular, the institutional and student characteristics that seem to positively affect it. Stay tuned.

— Alex Usher and Jason Rogers.

For the record, no actual Torontonians work at Higher Education Strategy Associates.

October 27

It Was 20 Years Ago Today*

…that the Report of the Commission of Inquiry on Canadian University Education was released.

In 1990, in the midst of deficit crises, national unity crises, etc., AUCC members decided that the only way to focus public attention on education was to appoint an independent commissioner, Dr. Stuart Smith, to shine a spotlight on their own activities. It worked, but probably not in the way they intended.

The first few pages of the report deal in the banalities used by every university president since Jesus was born: essentially, “the system is strong and healthy but could use more money.” That taken care of, Smith then took a vicious left turn from the script and laid into universities for neglecting their teaching mission and spending too much time on scientific research.

To say university presidents felt betrayed would be an understatement. They were not amused by the rather strong implication that their research mission was interfering with their teaching mission (now where have we heard that before?), and weren’t shy about saying so.

Reading the report today, one is struck both by what has changed and what hasn’t. It’s hard not to read the recommendations around credit transfer, the lack of data on faculty teaching loads or the imbalance of incentives around teaching and research and think “plus ça change.” But on the other hand, one can also read the recommendations around access, student assistance, teaching-track faculty, research into higher education and performance indicators and think, “actually, we’ve come a really long way.”

(My favourite recommendation is the one suggesting that all institutions be required to publish the percentage of their budget devoted to helping faculty improve teaching or fund curricular innovation. Yeah it’s unworkable in practice, but it would be deliciously cruel – and probably highly motivational – to have institutions publish numbers that need to be measured in hundredths of percentage points.)

So, lots of progress, but frankly not enough. No one can read the section on teaching and learning and seriously think that the situation has improved in the last twenty years. It’s fair to say that Smith wasn’t providing a balanced picture of universities and their activities in his report. But I think it’s equally fair to say that wasn’t his brief.

Many people speak on behalf of research. Distressingly few, including student leaders, speak to the substance of education itself. The Smith Commission was by some distance the best manifesto for undergraduate education this country has ever produced. We could use another one like it soon.

*I think. It’s hard to tell about things that came out in the pre-Internet era.

October 26

Oh for Heaven’s Sake (Western Canadian Edition)

You may have seen some reporting recently – say, here, here and here – to the effect that I’ve authored a report saying that the intellectual centre of gravity in Canada is moving westward at a rapid rate. You may also have seen me quoted saying things to the effect that it’s a result of sustained funding increases over the past decade in the west, while in Ontario even the major increases seen in McGuinty’s first term were barely able to cope with increased demand, let alone reverse the effects of decades of underfunding.

I do believe all of this (and can argue this point at length, if any of you want to start me off with a beer), but I do feel that I should make it clear that no such report exists.

Here’s what happened: an Ottawa Citizen reporter asked me what the major issues of the Ontario election were in post-secondary education and I pointed to the fact that none of the three parties had promised any increase in PSE funding during the next five years. When asked to describe the possible effects of this, I pointed to the general relative decline of Ontario universities compared to those in the three Western provinces. This last bit, somehow, became the focus of the article.

“Toronto consultant says west is best” is one of those headlines that are hard to resist on the Prairies, so the Calgary Herald and others picked up the story. Then one student newspaper got the wrong end of the stick and assumed that since stories were being published around comments from a think-tank president on a subject, said comments must have originated in a report of some kind. This led to a somewhat surreal CUP podcast in which two journalists discussed a non-existent report.

I thought this was pretty harmless until this new version started getting picked up by places like Globe Campus, at which point I thought “enough is enough.”

So, to be clear: there are a number of Western Canadian universities that rock pretty hard; most eastern universities are struggling to keep up and will continue to do so as they get smacked with the effects of deficit-cutting measures; the difference between east and west is partly money, partly demographics (strong universities are a lagging indicator of economic success, as any academic from China or India could tell you) and in a couple of specific instances it’s about exceptional leadership; and for all these reasons it’s quite fair to call this a shifting of the country’s intellectual centre of gravity.

Just don’t go looking for the report because there isn’t one. Sorry.

October 25

Maslow v. Durkheim in the Canadian University Report

For those of you interested in student ratings of Canadian universities, the Globe and Mail’s Canadian University Report – for which we at HESA do the data work – is out today. I’m not going to recount all the gory details here – they’re available both in the magazine which accompanies today’s paper and online. What I’m going to do instead is outline briefly how the data can be used not just to compare institutions but to answer more profound questions about student experiences.

Most of the literature around “satisfaction” in universities indirectly traces back, through the literatures on student engagement and student retention, to Durkheim (true story: if you just take Durkheim’s work, cross out the words “commit suicide” and write in “drop out of university,” you’re about 80% of the way to summarizing modern student retention literature). But the thing about most retention and engagement studies is that after you run them through the wringer they all tend to end up with an R-squared of about 0.4. That’s not nothing in social science by any means, but it suggests there’s a lot of other stuff going on that Durkheim can’t explain.

One theory bouncing around the HESA offices is that “belonging” is overrated as an explanation for engagement and satisfaction, and that self-actualization is more important. That, in effect, we need to be looking much more towards Abraham Maslow than to Durkheim for inspiration.

Examining data provided by the 33,870 students who responded to our Globe survey is a great way to check these hypotheses because of the enormous sample size, the ability to control for all sorts of institutional factors, and (of course) the very detailed information about satisfaction. The survey contains a battery of Durkheim-esque questions about belonging, and also some questions that hint at a Maslowian explanation for satisfaction, notably the one which asks students if their institution has the “right balance between work and fun.”

Using both sets of questions as independent variables vis-à-vis satisfaction and comparing the results can help us gain insights into the Maslow/Durkheim debate – and where we don’t get clear answers, we can use our monthly CanEd student panel to get further data to answer them (wait till you see next month’s survey!).
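
For the methodologically inclined, here’s a minimal sketch of the kind of comparison we have in mind – made-up data and hypothetical variable names, not the actual Globe survey items, and the real analysis would also control for institutional factors:

```python
# Toy sketch of the Durkheim-vs-Maslow comparison: regress satisfaction on
# each battery of items and compare R-squared. Data and variable names
# (belonging_q1, work_fun_balance, etc.) are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
toy = pd.DataFrame({
    "belonging_q1":     rng.integers(1, 6, n),   # Durkheim-style "belonging" items
    "belonging_q2":     rng.integers(1, 6, n),
    "work_fun_balance": rng.integers(1, 6, n),   # Maslow-style item
})
# Arbitrary data-generating process, purely to make the example run
toy["satisfaction"] = (0.3 * toy["belonging_q1"]
                       + 0.8 * toy["work_fun_balance"]
                       + rng.normal(0, 1, n))

def battery_r_squared(df, items, outcome="satisfaction"):
    """OLS of satisfaction on one battery of items; return the R-squared."""
    X = sm.add_constant(df[items])
    return sm.OLS(df[outcome], X, missing="drop").fit().rsquared

print("Belonging (Durkheim) battery R2:", round(battery_r_squared(toy, ["belonging_q1", "belonging_q2"]), 2))
print("Work/fun balance (Maslow) R2:   ", round(battery_r_squared(toy, ["work_fun_balance"]), 2))
```

In the toy data the answer is baked in, of course; the interesting question is what the real survey data says.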

Over the long term, we think we can build up a pretty good picture of what makes different types of students tick, which will allow us not only to answer questions such as “Why are Toronto Students so Friggin’ Miserable,” but also to answer more profound questions about the sources of student satisfaction.

Over the next few weeks, we’ll be sharing some of this data with you. Stay tuned.

October 24

The Central Canadian Jock Windfall

The other day I was flipping through public policy maven Charles Clotfelter’s new book Big Time Sports in American Universities (you can get the gist via this interview on YouTube). It reminded me to check up a bit on a subject which intrigued me a few years ago, namely the evolution of sports scholarships in Canadian universities.

Fifteen years ago, Canadian athletic scholarships were still both small and rare: even in 2001-02, CIS schools were distributing just $3.4 million in scholarships to 2,439 athletes. By 2009-10, those figures had risen substantially: now, just over 4,000 students per year receive a combined $10 million in athletic scholarships (all data from CIS’s statistics page).

But what’s really interesting is where the increases have occurred. Since 2001-02, scholarships at prestige schools like U of T, UBC and Alberta haven’t increased that much. The Atlantic schools for the most part have kept their increases below the average in the rest of the country (the exceptions being Dalhousie, up 395% to $414,000, and Acadia, up 477% to $550,000), though it is nevertheless significant that this tiny region, home to less than 10% of Canadian students, accounts for 32% of all scholarship spending.

No, the real change in Canadian athletics is happening in big central Canadian universities – basically, the OUA plus the Anglophone Quebec universities. Check out some of these eye-popping percentage increases: McGill, up 664% to $222,000; Concordia, up 886% to $224,000; Carleton, up 1,138% to $265,560. And that’s only the ones for whom it makes a modicum of sense to express change in percentage terms, i.e., who were spending more than $20,000 to begin with. We could get into others: Queen’s, up from $18,000 to $177,000; McMaster, up from $5,000 to $218,000; and Wilfrid Laurier, up from $12,000 to $299,000.
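
For anyone wanting to reproduce the “spending more than $20,000 to begin with” claim, the starting figure can be backed out from the final amount and the percentage increase (my arithmetic, rounded). Using Carleton as the example:

$$
\text{base} = \frac{\text{final}}{1 + \text{percentage increase}}, \qquad \frac{\$265{,}560}{1 + 11.38} \approx \$21{,}500.
$$

The same calculation puts McGill’s and Concordia’s starting points at roughly $29,000 and $23,000 respectively – all comfortably above $20,000.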

Is this really where these universities want to be seen to be spending money on the eve of a serious downturn in public funding? Even assuming most of this is from alumni donations (which I suspect is not in fact universally the case), what’s the point? Have these universities become noticeably better at inter-university sport? Or has a scholarships arms race just created windfall benefits for a particular group of students?

Sport is about performance. It would be nice if we thought of university budgets the same way.
