Higher Education Strategy Associates

Category Archives: Institutions

October 19

Tuition: Walking and Chewing Gum Simultaneously

Since we’re talking tuition this week, I thought I’d take an opportunity to tee off on one of the weakest arguments out there on this subject.  You know, the one that goes like this:

  1. Higher Education is a Public Good
  2. Public Goods should be free
  3. Yay, free tuition.

There are actually two responses to this argument, one narrow and one broad.

The narrow argument is that, in economic terms, the first premise is wrong, and hence the second and third fall apart as well.  Yes, economists believe that public goods – that is, goods which are both non-rivalrous and non-excludable – should be free.  But higher education is neither of these things (see this earlier blog post here for a longer explanation), so the second and third points fail.  QED.

Now this is an unsatisfactory answer to most people, because when non-economists use the term “public good” they actually mean “a good which has a lot of positive externalities”, not “a good which is non-rivalrous and non-excludable”.  This drives some economists a bit batty, but such is life.   And, so the argument goes, since it has lots of positive externalities, we should subsidize it.

Economically, that’s a solid argument.  Having positive externalities means that those who pay for the good don’t reap all the benefits.  Everybody gains from having more doctors, for instance, or more teachers (more lawyers… well, that may have negative externalities).  They gain from having a more educated populace, which tends to be healthier, less prone to crime, more likely to engage civically, etc.  Typically, where buyers don’t capture all the benefits, less of a good gets consumed than it should.  This is a type of market failure, and an argument for government intervention and subsidy (this is the argument for public funding of research, for instance).

BUT.  But, but, but.  Notice the difference here.  With public goods, the argument is that government should be the sole payer because the non-excludability thing literally means that no one can capture the benefits of the investments (lighthouses are the classic if not entirely satisfactory example here).  But with goods with positive externalities, while there is a case for subsidies, there’s not a case for complete subsidization.  To the extent that there are private benefits, the beneficiaries should pay for them.

But that’s not the way the free tuition folks tend to argue things.  It often seems sufficient, as evidenced by this recent Christopher Newfield piece in the Guardian, to say “public good!” – by which he presumably means “a good with positive externalities” – and treat that as reason enough to ask for a 100% subsidy.  Often, this is then contrasted with some “evil” alternative where asking students to pay fees implies a “neoliberal agenda” in which education is conceived of entirely as a “private good”.

This is nonsense.  A good can have both public AND private aspects.  We can subsidize a good AND ask private beneficiaries to contribute at the same time.  In fact, we can even devise policies which subsidize degrees differentially based on the extent to which the benefits of the degree are public (e.g. undergraduate programs in nursing or ECE) or private (MBAs).  We won’t always get those divisions between public and private exactly right, but I can guarantee you that the split we come up with will be more accurate than “100% public”.
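As a toy illustration of that last point, here is how a differential cost-sharing schedule might look in code.  The programs, costs, and benefit shares below are all invented for the sake of the example; none are real estimates:

```python
# Hypothetical cost-sharing by program: government subsidizes the share
# of the benefit assumed to be public, and students pay for the rest.
# All figures are invented for illustration, not real estimates.
annual_cost = {"nursing": 20_000, "arts": 15_000, "mba": 40_000}
public_benefit_share = {"nursing": 0.75, "arts": 0.5, "mba": 0.25}

def split_funding(program):
    """Divide a program's annual cost between government and student
    in proportion to the (assumed) public share of its benefits."""
    cost = annual_cost[program]
    subsidy = cost * public_benefit_share[program]
    return {"government": subsidy, "student": cost - subsidy}
```

Under these made-up shares, an MBA student would cover three-quarters of cost while a nursing student would cover one-quarter; the point is only that any such split, however rough, is more defensible than “100% public”.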

This idea that a good must be either public or private and either completely subsidized or completely marketized should be beneath us.  The world is neither that neat nor that Manichaean.  But just as most of us can walk and chew gum simultaneously, most of us can conceive of a world where goods can be both public and private without blowing a head valve.  And we can design policies accordingly.

October 16

A Guide to Canadian Campuses, 30 Years On

On Saturday, I spent a lovely morning at Mount Allison talking to their Board of Governors.  Afterwards, I scooted across the Nova Scotia border to Amherst, which is home to Amy’s, one of Canada’s most remarkable used bookstores.  There I found a host of historical higher ed treasures (had to make a quick trip to Giant Tiger to buy a bag to get them all on the plane home), the most amusing of which was Linda Frum’s Guide to Canadian Universities, which – to the utter horror of the Canadian university sector – was published 30 years ago last month.

If you weren’t there in the late 1980s, you’ll have no idea about this book or its impact.  It was a bible for hundreds of thousands of university-bound kids.  Pre-internet, pre-rankings, how else could you find out about ALL THE UNIVERSITIES in one convenient place for the low price of $14.95?  And, packed as it was with bitchy, knowing sarcasm, it felt like it was delivering the straight dope in a way that no metrics-laden ranking exercise ever could.  Not that Frum’s dope was necessarily accurate.  Her research mostly consisted of spending a day on each campus and chatting to one or two senior admins and clutches of undergraduates over lunch in a cafeteria or java in a coffee shop.  Superficial, sure, but hundreds of times more informative than anything else available at the time.

Now, there is lots wrong with this book.  Its mere existence is an indictment of the closed nature of Canadian publishing (who asks an unknown 24-year-old to write a major book and name it eponymously unless they are the offspring of a famous broadcaster?).  And it carries a huge whiff of Toronto snobbery, particularly towards smaller universities.  But in its way, it’s genius, and a quick re-read today is worth an hour or two of your time, for three reasons.

The first is that as a period piece it captures some aspects of university life that we’ve nearly completely forgotten today.  The most interesting one has to do with entrance averages, which were clearly – contrary to the way most people seem to remember it – a whole heck of a lot lower then than they are now (the description for York, for instance, ends with “if you want to live a little, and you can afford it, you’d be smart to take your 70 to Acadia, St. FX, Dalhousie, Memorial, UNB or Trent.”)  The occasional comment about having to wait a month to see first-run movies in Halifax, or the pulling power of student unions to attract up-and-coming music acts (Billy Idol played Brock Student Union?) speak to a very different, pre-Internet, entertainment culture that is mostly forgotten.


The second is that while the book is very often completely unfair (not surprisingly, given its research methods), it is nonetheless hysterically funny.  No one writes this way about higher education anymore, and it’s a shame, because it’s very engaging to non-specialists.  My favourites:

“When people say McGill is a great institution, what they really mean is that Montreal is a great city.  Put McGill in Tuktoyaktuk and they wouldn’t be quite so enthusiastic.”

“Queen’s students are absolutely head over heels in love with their school, not to mention themselves”.

“If only to keep up appearances, Western has a faculty.”

The third and final reason to re-read it is to reflect on what has and hasn’t changed in Canada in thirty years.  Change the names of the on- or near-campus bars and restaurants that are “hot”, and double (or in some cases triple) the enrolment numbers at each university, and a lot of this book still seems pretty accurate, which speaks to how durable institutional cultures and reputations are.  That said, some things do change: I’m pretty sure Acadia is no longer the preppy haven it was in 1987, and the description of UBC is completely jarring (“Before OPEC, no matter what UBC did it couldn’t help being the best school west of the Ontario border; now, UBC is a great Canadian university on its way down…the best professors and graduate students are deserting to accept higher salaries over the Rockies”).

One thing I really remember from that time is how much universities hated the book.  The principal criticism was that it was too subjective.  If only we could have some *real* analysis of institutions – you know with facts and data and stuff.  Four years later, Maclean’s came out.  Universities hated it.  Didn’t contextualize institutions enough.  Moral of the story: there’s no way to both compare universities and make them happy unless you conclude they are all brilliant.  Worth keeping in mind when reading criticisms of institutional comparisons.

And do read Linda Frum’s Guide to Canadian Universities if you can find a copy.  An entertaining trip down memory lane no matter what you think of it.


October 11

How Sessionals Undermine the Case for Universities

Last year, I wrote a blog post about what sessionals get paid, and how essentially it works out to about what assistant profs get paid for the teaching component of their jobs and that in this sense at least one could argue that sessionals in fact are getting equal pay for work of equal value.

I got a fair bit of hate mail for that one, mostly because people have trouble distinguishing “is” from “ought”.  People seemed to think that because I was pointing out that pay rates for teaching are pretty close for junior profs and sessionals, everything was therefore hunky-dory.  Not at all.  The heart of the case is that sessionals don’t want to be paid just to teach; they’d like to be paid to do research and all that other scholarly stuff as well.

(Well, some of them would, anyway.  Others have day jobs and are perfectly happy teaching one course a year on the side because it’s fun.  We have no idea how many fall into each category.  Remarkably, Statistics Canada is planning on spending a million dollars to count sessionals in Canadian universities, but in such a way as to shed absolutely no light on this rather important question.  But I digress: for the moment, let us assume that when we are talking about sessionals, we are talking about those who aspire to full-time academia.)

A lot of advocates on behalf of sessionals seem obsessed with arguing their case on “fairness” grounds.  “It’s not fair” that they only get paid to teach while others get paid to teach and research and do whatever the hell service consists of.  To which there is a fairly curt answer, even if most people are too polite to make it: “if you didn’t get hired as a full-time prof, it’s probably because the relevant hiring committee didn’t think you were up to our standards on the whole research thing.”  So this isn’t really a winning argument.

Where universities are much more vulnerable is on the issue of mission.  The whole point of universities – the one thing that gives them their cachet – is that they are supposed to be delivering education in a research-intensive atmosphere.  This is the line of defence that is continually used whenever the issue of offering more degrees in community colleges or polytechnics arises.  “Research-intensive!  Degrees Gotta Be Research-intensive!  Did we mention the Research Intensity thing?”

But given that, why is it that in most of the country’s major universities, over 50% of undergraduate student course-hours are taught by sessionals who are specifically employed so as to not be research-active?

Ok, where the purpose of education is more practice than theory (e.g. law, nursing, journalism), you probably want a lot of sessionals who are working professionals.  In those programs, sessionals complement the mission.  But in Arts?  Science?  Business?  In those fields, the mere existence of sessionals undermines the case for universities’ exclusivity in undergraduate degree-granting.  And however financially advantageous sessionals may be to the university (not an unimportant consideration in an era where public support for universities is eroding), long-term this is a far more dangerous problem.

So the real issue re: universities and sessionals is not one of fairness but of hypocrisy.  If sessionals really wanted to put political pressure on institutions, they would make common cause with colleges and polytechnics.  They would ceaselessly demand documents from institutions through Freedom of Information requests to determine what percentage of student credit hours are taught by sessionals.  They would use that information to loudly back colleges’ claims for more undergraduate degree-granting powers, because really, what’s the difference?  And eventually, governments would relent, because the case for the status quo is really weak.

My guess is that those activists arguing on behalf of sessionals won’t choose this course because their goal is less to smash the system of privileged insiders than it is to join it.    But they’d have a case.  And universities would do well to remember it.

October 06

Of No Fixed Address

Most people usually think of universities as being particularly stable, physically speaking.  Sure, they grow a bit: if they are really ambitious they add a satellite campus here and there – maybe even set one up overseas.  But by and large, the centre of the university itself stays put, right?

Well, not always.  There are some interesting exceptions.

In the first place, the idea of a “university” as a physical place where teaching gets done is not a universal one.  In many places, a university was a place that offered examinations and degrees while the teaching was done somewhere else, like in colleges.  The University of Manitoba started off that way, for instance: individual colleges were scattered around Winnipeg, and U of M just handed out the degrees.  UBC and the University of Victoria, famously, started their lives as colleges which prepared people to take McGill degrees; ditto Brandon and McMaster.  And all of this was more or less based on an example back in England, where the University of London played the same role right across the Empire (a number of African colleges started life as prep colleges for University of London degrees).

Some universities had to move because of wartime exigencies.  After the Japanese invasion of 1937, the majority of Chinese universities – the public ones, anyway – hightailed it to the interior, to Wuhan and then later to Chongqing or Yunnan.  There, many universities would share campuses and what little bits of laboratories and libraries the universities had managed to bring with them.  Peking, Tsinghua and Nankai universities actually merged temporarily to form the Southwest Associated University.  Similarly, during the Korean War, the main universities in Seoul (Yonsei, Korea, and Seoul National) all left town and headed to the (relative) safety of Busan, only returning to the capital when the war was over.  As in China, there was a great deal of co-operation between fugitive universities; some observers say the big prestigious Korean universities have never been as willing to accept credits from other schools as they were at the start of the 1950s.  And sometimes, fugitive universities never make it home.  A number of religious universities in Taiwan, for instance (e.g. Soochow University, Fu Jen Catholic University), were originally located in mainland China but left ahead of the Communist take-over in 1949.

Domestic politics can lead to changes as well.  In Seoul, the challenge of locating a major campus quite close to the centre of political power was brought home to authorities when students from Seoul National University helped overthrow the Syngman Rhee government in 1960.  Rhee’s successor, Park Chung-hee, put a safe distance between students and the regime by relocating the entire campus south of the river the following decade.  In Belgium until the 1960s, the Catholic University of Leuven was in a tricky situation – a prestigious, historically French institution in an area that was mostly Flemish-speaking.  Eventually, being Belgians, they decided that the best course of action was to split the university; basically, the Flemish got the site and the infrastructure while most of the professoriate decamped 20 miles south to a greenfield site in a French-speaking province.  Thank God no one’s suggested that at McGill.

And finally, some universities aren’t where they used to be because, well, they aren’t the same university, even though they may share a name.  Visitors to Salerno might want to visit the local university, thinking it has some connection with the ancient medical school there.  Unfortunately, that university disappeared about 700 years ago; the modern thing up the hill is an expansion of a teacher training college created during the Second World War.

So, yes, universities on the whole are pretty durable, staid and stable institutions.  Doesn’t mean they don’t wander around on occasion.

October 04

New Quality Measurement Initiatives

One of the holy grails in higher education – if you’re on the government or management side of things, anyway – is to find some means of actually measuring institutional effectiveness.  It’s all very well to note that alumni at Harvard, Oxford, U of T (pick an elite university, any elite university) tend to go on to great things.  But how much of that has to do with them being prestigious and selective enough to only take the cream of the crop?  How can we measure the impact of the institution itself?

Rankings, of course, were one early way to try to get at this, but they mostly looked at inputs, not outputs.  Next came surveys of student “engagement”, which were OK as far as they went but didn’t really tell you anything about institutional performance (though they did tell you something about curriculum and resources).  Then came the Collegiate Learning Assessment and later the OECD’s attempt to build on it, which was called the Assessment of Higher Education Learning Outcomes, or AHELO.  AHELO was of course unceremoniously murdered two years ago by the more elite higher education institutions and their representatives (hello, @univcan and @aceducation!) who didn’t like its potential to be used as a ranking (and, in fairness, the OECD probably leant too hard in that direction during the development phase, which wasn’t politically wise).

So what’s been going on in quality measurement initiatives since then?  Well, two big ones you should know about.

The first is one being driven out of the Netherlands called CALOHEE (which is, sort of, short for “Measuring and Comparing Achievements of Learning Outcomes in Higher Education in Europe”).  It is being run by more or less the same crew that developed the Tuning Process about a decade ago, and who also participated in AHELO, though they have broken with the OECD since then.  CALOHEE builds on Tuning and AHELO in the sense that it is trying to create a common framework for assessing how institutions are doing at developing students’ knowledge, skills and competencies.  It differs from AHELO in that if it is successful, you probably won’t be able to make league tables out of it.

One underlying assumption of AHELO was that all programs in a particular area (e.g. Economics, Civil Engineering) were trying to impart the same knowledge, skills and competencies – this was what made giving them a common test valid.  But CALOHEE assumes that there are inter-institutional differences that matter at the subject level.  And so while students will still get a common test, the scores will be broken up in ways that are relevant to each institution, given the set of desired learning outcomes at each institution.  So Institution X’s overall score in History relative to institution Y’s is irrelevant, but their scores in, for instance, “social responsibility and civic awareness” or “abstract and analytical thinking” might be, if they both say that’s a desired learning outcome.  Thus, comparing learning outcomes in similar programs across institutions becomes possible, but only where both programs have similar goals.
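A minimal sketch of how that kind of outcome-by-outcome comparison might be structured, with invented institutions, outcomes and scores (nothing here comes from the actual CALOHEE instruments):

```python
# Each institution declares its own desired learning outcomes and gets a
# score per outcome.  A league-table-style comparison is only meaningful
# on outcomes both institutions have declared.  All data here is invented.
desired_outcomes = {
    "X": {"civic awareness", "abstract thinking"},
    "Y": {"abstract thinking", "communication"},
}
scores = {
    "X": {"civic awareness": 71, "abstract thinking": 64},
    "Y": {"abstract thinking": 68, "communication": 75},
}

def comparable(inst_a, inst_b):
    """Outcomes on which comparing the two institutions is valid."""
    return desired_outcomes[inst_a] & desired_outcomes[inst_b]

def compare(inst_a, inst_b):
    """Score differences, restricted to shared desired outcomes."""
    return {o: scores[inst_a][o] - scores[inst_b][o]
            for o in comparable(inst_a, inst_b)}
```

In this toy setup, X and Y can be compared on “abstract thinking” but their overall scores never meet in a single ranking, which is the design point.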

The other big new initiative is south of the border and it’s called the Multi-State Collaborative to Advance Quality Student Learning (why can’t these things have better names?  This one’s so bad they don’t even bother with an initialism).  This project still focuses on institutional outcomes rather than program-level ones, which reflects a really basic difference of understanding of the purpose of undergraduate degrees between the US and Europe (the latter caring a whole lot less, it seems, about well-roundedness in institutional programming).  But, crucially in terms of generating acceptance in North America, it doesn’t base its assessment on a (likely low-stakes) test.  Rather, samples of ordinary student course work are scored according to various rubrics designed over a decade or more (see here for more on the rubrics and here for a very good Chronicle article on the project as a whole). This makes the outcomes measured more authentic, but implicitly the only things that can be measured are transversal skills (critical thinking, communication, etc) rather than subject-level material.  This will seem perfectly fine to many people (including governments), but it’s likely to be eyed suspiciously by faculty.

(Also, implicitly, scoring like this on a national scale will create a national cross-subject grade curve, because it will be possible to see how an 80 student in Engineering compares to an 80 in history, or an 85 student at UMass to an 85 student at UWisconsin.  That should be fun.)

All interesting stuff and worth tracking.  But notice how none of it is happening in Canada.  Again.  I know that after 25 years in this business the lack of interest in measurable accountability by Canadian institutions shouldn’t annoy me, but it does.   As it should anyone who wants better higher education in this country.  We can do better.

October 03

The Real Competition is Closer Than You Think

I’ve recently been dismissive of the notion that Canada is “falling behind” in higher education, since everyone seems to insist on making ludicrous comparisons with places like China, Switzerland and Singapore.  But upon a little further digging, it turns out that one of our very close competitors is doing rather well these days – one we probably should be worried about, despite the fact that we’ve mostly been ignoring it since the Financial Crisis of ’08.

It’s our neighbour to the south, the United States.

Now, back in the 1990s and early 2000s, when our dollar was worth diddly-squat and the Americans were spending seriously on higher education, the USA dominated Canadian discussion about foreign competitors.  It was all “brain drain” this and “impossible to keep our talent” that.  We were getting creamed.

Since 2008 it’s been a different story.  Everybody remembers 2009-2010, when top American talent was streaming across the border due to a combination of a strong Canadian dollar and savage cut-backs in the US.  And then the Obama stimulus expired in 2011 and everyone knew that was going to put paid to research dollars.  And the states were mostly bankrupt (look at what happened in Illinois!).  And then Trump.  Trump!  Disaster, etc.  The story we’ve been telling ourselves is that American higher ed policy for the last decade or so has been one long exercise in Making Canada Great Again.

The problem is, this story isn’t quite true.

Let’s start by taking a look at what’s been going on in Canada, both at our top research universities (for the purpose of this exercise, the 8 universities that made this year’s Shanghai top 200: Toronto, UBC, McMaster, McGill, Alberta, Montreal, Calgary and Ottawa) and at the rest of the higher education system.  Figure 1 shows the change in real average total institutional per-student expenditures, indexed to 2005-06.  What it shows is that – with some small differences in timing – both research-intensive and non-research-intensive institutions did well late last decade and worse this one, with both ending up with per-student expenditures about 3% lower than they were a decade earlier.

Figure 1: Change in Real Per-student Expenditures by sector, Shanghai top-200 ranked universities vs. rest of sector, Canadian Universities 2005-06 to 2015-16 (indexed to 2005-06)

Source: Statscan FIUC, Statscan PSIS
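For what it’s worth, the indexing behind these figures is mechanically simple: deflate nominal expenditures to constant dollars, divide by enrolment, and rebase so the first year equals 100.  A sketch with made-up numbers (not the Statscan or IPEDS data):

```python
def real_per_student_index(nominal_exp, deflator, enrolment, base=0):
    """Deflate nominal expenditures, divide by enrolment, and index
    so that the base year equals 100."""
    per_student = [e / d / n
                   for e, d, n in zip(nominal_exp, deflator, enrolment)]
    return [100 * x / per_student[base] for x in per_student]

# Invented three-year series for a single institution:
index = real_per_student_index(
    [1_000, 1_060, 1_120],     # $M total expenditures, current dollars
    [1.00, 1.02, 1.05],        # price index, base year = 1.00
    [40_000, 41_000, 43_000],  # enrolment
)
# Nominal spending rises every year, yet real per-student spending can
# still finish below the base year once inflation and enrolment growth
# are taken out -- which is exactly the Canadian pattern in Figure 1.
```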

Now let’s look at similar data for private 4-year colleges in the United States.  Same drill: the blue line represents the average of the 33 private universities which make the Shanghai top 200, and the orange line represents the average for the rest of the sector (about 1,700 institutions in total).  For both, what we see is an unbroken increase in expenditures.  For many years it seemed like the big research heavyweights were leaving the rest of the sector behind, but in the last couple of years both sectors have seen a steady expansion of buying power.  For the decade, the research heavyweights are up 16%, and everyone else up 10%.

Figure 2: Change in Real Per-student Expenditures by sector, Shanghai top-200 ranked universities vs. rest of sector, US Private 4-year colleges 2004-05 to 2014-15 (indexed to 2004-05)

Source: IPEDS

(If you’re wondering why my ten-year span starts and ends a year later in Canada than in the US, the answer is that it turns out – miraculously – there is one piece of data reporting that Statscan does faster and better than the Americans, and that’s reporting on financials.  The 15-16 numbers just aren’t out yet in the US – sorry.)

Ah, you say.  But that’s the private 4-year sector – not really a fair comparison.  Everyone knows those guys have loadsadough.  Compare us with the public sector instead – everyone knows those guys are in deep trouble, right?  Right?

Well….not exactly.  Across the big public research universities – the 37 that make the top 200 in the Shanghai rankings – expenditures are up 15% in real dollars on a decade ago, same as their counterparts in the private 4-year sector.  And the rest of the sector, the poor unloved disrespected non-research-intensive universities, they are up 8% on the decade.

Figure 3: Change in Real Per-student Expenditures by sector, Shanghai top-200 ranked universities vs. rest of sector, US Public 4-year colleges 2004-05 to 2014-15 (indexed to 2004-05)

Source: IPEDS

A reasonable question here is: how did the Canadian post-secondary community just miss this?  And miss it they did, because if they had this data they wouldn’t be making ludicrous claims about places like Switzerland.

The answer, I suspect, is that American universities are, from the perspective of our university leaders, growing in the “wrong way”.  This growth isn’t happening because American governments – federal and state – are plunking down vast amounts of new money.  It’s because institutions are generating more money on their own – either through tuition, or self-generated income.  Now, it’s not that Canadian institutions aren’t doing the same (albeit on a smaller scale) – as I’ve shown elsewhere, what growth there has been in real revenues in Canadian universities over the last five years has mostly come from new fee income.  But when our university lobbyists go to ask for money, it’s simpler to make the case for “investment” (i.e. spending) if the outside competition is mostly government-funded too.  It’s harder to walk in and say: “we need more money so we can compete with American universities who are really good at generating their own funds through fees”, because our politicians are currently allergic to that solution.

Still, the fact of the matter is, American universities are very definitely on an upswing at a time when most other countries – including vaunted China and Switzerland – are flat or even down somewhat.  This deserves more attention.

October 02

Atlantic Blues

One big story from out east that didn’t get a lot of play in the rest of the country was the news that the Nova Scotia government had, over the period 2013-2017, quietly bailed out Acadia University to the tune of $24 million.  This is of course the second time a Nova Scotia government has bailed out a university this decade: the Nova Scotia College of Art and Design (NSCAD) received about $10 million.

This isn’t really a partisan thing: it was an NDP government that bailed out NSCAD and a Liberal one which offered extra help to Acadia.  It’s a structural problem: Nova Scotia is not a very rich province, and it has a lot of universities, only one of which is large enough to have real economies of scale.  This problem was laid out in great detail seven years ago by economist Tim O’Neill in a special report on the system prepared for the Provincial Government but the Dexter government passed on making the difficult decisions.  Now, the new Liberal government has put some extra money in and has allowed institutions to raise a bit more money through tuition, but it doesn’t change the really basic structural challenge the province’s institutions face.  Come next recession, at least one university will be back in the same situation.

From the Acadia story we can glean two things.  First, former President Ray Ivany is clearly a very persuasive guy (but we kind of knew that).  Second, we now have some greater insight into the drafting of the controversial Bill 100, which provided for the possibility of universities to ignore collective agreements if they were in financial exigency and needed to restructure.  Turns out it wasn’t out of the blue: it was in reaction to Acadia telling them they needed a bail-out.  Bill 100 was the Government signalling to everyone in the province’s higher education community: bailouts aren’t the only possible outcome.  Radical restructuring is a possibility too.

How radical?  Well, two neighbouring provinces can give a sense of how bad things can get.  We’ve seen what kind of time Memorial University of Newfoundland is having dealing with cuts of 20% or more to income, with little to no ability to recoup money from tuition fees.  In New Brunswick, cuts to provincial operating grants have been just as severe, if spread out a little more – a 22% drop in real terms from 2010-11 to 2015-16 (see back here for more on provincial changes).  Student numbers have fallen as well, by nearly 15%.  That’s a double-edged sword: while it means the per-student cut in the operating grant is less severe, it also means a lot less fee income coming in.

Figure 1: Change in university enrolments, Atlantic Provinces, 2010-11 = 100

Source: Association of Atlantic Universities

In fact, the only province in the region where things remain reasonably quiet is Prince Edward Island, where UPEI is holding its ground both in terms of government grants and student numbers.  Compared to the rest of the region, that’s a reasonably good place to be.

The fundamental challenge of most of the region’s universities is size.  Small is a great selling point – if you can charge for it.  If you can’t, small just means fragile.  And fragility describes too many Atlantic universities right now.  Acadia won’t be the last university in the region to approach the brink; there’s almost certainly more drama to come in the years ahead.



September 28

China, Switzerland, Singapore

The other day, I questioned a claim made by University of Toronto President Meric Gertler that we were falling behind countries like “China, Switzerland and Singapore”, which have made major recent investments in science and higher education.  First of all, I noted, this was an odd trio, with nothing much to support the claim other than the fact that a few institutions in those countries are doing reasonably well in various university rankings.  Second, I noted that the claim that we are “falling behind” these countries was factually untrue, at least as it relates to investments in higher education.  Today, I’m going to prove that, by comparing recent data on expenditures in top universities in each of these countries to expenditures in ours.

(To be clear before I press on: I’m singling out Gertler for criticism because the Star picked up on his comment. But he’s far from the only one to have made it – I’ve heard that same line a couple of times lately in Canadian PSE.  So think of today’s column as a general warning shot on the topic of international comparisons rather than me taking down anyone in particular).

So let’s look at what’s been happening in each of these countries over the past few years, using data obtained from each institution’s annual & financial reports.  Let’s start with Canada.  Real expenditures per student at top universities rose in the latter half of the aughts and early in this decade.  Since 2012, they have been mixed, rising slightly at Alberta (+6%) and falling slightly at McGill (-2%), UBC (-3%) and Toronto (-4%) and more sharply at McMaster and Montreal (-8%).  Summed across all institutions, it’s a drop of 4%.

 Figure 1 – Per-student Expenditures of Top Canadian Universities, 2012-13 to 2015-16 (in C$2016)

Now let’s take a look at those other countries over the same period.  Let’s start with Switzerland, where we will look at the seven universities that make the Shanghai top 200.  Has there been an increase in expenditures?  Yes, there has: about 5.6% over the last few years.  But that’s been almost exactly matched by a 5% increase in students.  Interestingly, the growth in student numbers has been most pronounced at the best university: ETH Zurich – the institution to which the University of Toronto would be likeliest to compare itself – has seen its funding per student fall by 3% as a result.  Most of the per-student gain comes at the one university – Geneva – which saw a significant fall in student numbers.

Figure 2 – Per-student Expenditures of Top Swiss Universities, 2012 to 2016 (in SFr 2016)

Let’s move over to China.  Data from China is not great (as I noted a couple of weeks ago, Chinese universities are getting more transparent about institutional finances, but getting accurate institutional enrolment numbers is practically impossible).  Still, to the extent we have information, what we see is essentially no change over the past four years – up slightly at Shanghai Jiao Tong, down slightly everywhere else.

Figure 3 – Per-student Expenditures of Top Chinese Universities, 2013 to 2016 (in RMB 2016)

I know that one seems hard to believe given all we hear about “massive Chinese investments” in higher education.  And in the previous decade, say from 2000 to 2010, it’s true that there were massive investments.  But lately, expenditures are not quite keeping up with inflation and growth in student numbers.

(Some people may point to forthcoming investments on the back of China’s recently announced and then re-announced “Double World-Class” program, as per this recent article in Caixin Global.  But the amounts being bandied about here don’t actually seem all that significant.  Sun Yat-sen University, for instance, is breathlessly reported as being in line to receive an extra RMB 480 million as part of this project.  But on a budget of over RMB 6 billion, that’s less than an 8% increase, which given inflation and student growth probably washes out in less than two years.  So treat these claims with skepticism.)

Finally, we have Singapore, where we do see clear-cut growth in real expenditures per student over the last few years, at the National University of Singapore (8%) and Nanyang Technological University (13%).

Figure 4 – Per-student Expenditures of Top Singaporean Universities, 2013 to 2016 (in S$2016)

So, does this back up Gertler’s “greater investment” claims?  No, not quite.  A closer look at the two institutions’ financials shows something more subtle.  In both cases, the universities started spending more relative to their income: NUS went from posting a 5% surplus to a 1.5% surplus between 2012 and 2016, while Nanyang went from about a 5% surplus to a 3% deficit – which implies that in both cases it was changes in financial habits as much as any new investment doing the work.  And some of the work was done by increased tuition fee income.  The increase in research income from government sources is really only doing about a third of the work here.

So, are China, Switzerland and Singapore really “leaving Canada behind”?  In the case of China and Switzerland the answer seems to be: only marginally, if at all.  In Singapore – tiny Singapore – the evidence that more money for research is in play is clearer, but the total “extra funding” in question is maybe $250 million at two universities which between them might have 4,000 tenured faculty.

If this is the best competition we have, I’d say we’re doing OK.


September 25

How to Read a Poll

You may have seen the results of a poll out last week from Abacus Canada for Universities Canada, one which purports to look at “how Canadians feel about universities”.  I suspect you will hear a lot about this poll over the next few months, especially with respect to research and the Naylor report.  But it’s always worth approaching these things with a skeptical eye, so let’s spend a little time looking at the poll and how the questions were put together to see how much stock we should put in the answers.

Let’s start with this first set of questions:

The design of this first question is pretty good.  Respondents were offered two options and asked which one was closer to their opinion of Canadian universities.  Forced choices tend to give better responses than single-answer questions, but the trick is in interpreting the response.  Some will probably try to tell you “73% of Canadians think universities are practical and up-to-date”, but that’s not quite right.  Rather, 73% of Canadians think they are more practical than impractical, and a similar percentage think they are closer to being up-to-date than to being out-of-date.

Much of the poll was devoted to research issues.  These next two questions are, to my mind, the most interesting.  The text is a little small, but the first question asks whether Canadians think research should focus on national competitiveness, and the second question asks whether the best way to lead in innovation is to invest in fundamental research.  The answers to both questions were exactly the same.

This is gold for Universities Canada.  “90% of Canadians agree that we need to invest in fundamental science in order to lead in innovation”.  But this would have been an interesting place to use one of those double-ended questions we saw at the start.  For instance: “Which is closer to your view: that Canada will gain leadership in innovation by investing in basic research, or by investing in applied research?”  The answers would have been instructive, but not nearly as politically useful to Universities Canada.

Now, we get into the more dubiously worded questions. Take the questions where they ask about the importance of funding research. They aren’t just bad questions, they’re bad politics.

They are bad questions because you’re going to get all kinds of social acceptability bias creeping in.  Who is going to say they are against spending money on discovering new medicines?  And they’re potentially bad politics – from universities’ point of view, anyway – because they assume that university research has particular practical endpoints.  Some of it does, of course.  But notice they didn’t ask about supporting research in the humanities.  Ways of reconciling with Indigenous peoples, say.  Or ensuring no 18th century Jesuit manuscript goes unread.  Wonder why.

(When reading a poll, always – always – look for the questions they don’t ask.)

Meanwhile the last set of questions, which are agree/disagree to a set of specific statements, is just shameful.

Look at that third one: “the government should spend more on universities because the upside for Canada is tremendous”.  I’ve heard of leading questions before but this is ridiculous.  Or the fourth one: “Canada has a chance to lead the world in higher education research and innovation”.  We have a chance to become the wine capital of the world, too.  Or World Cricket champions.  A chance, sure, but how much of one?  Meaningless term.

Now take a look at what questions Universities Canada didn’t ask.  They didn’t ask people how important any of these issues were, or where they ranked in their political priorities.  Sure, 86% say Ottawa should spend more on research.  All that tells us is that in the abstract, when no other priorities or financial constraints are in the picture, Canadians think supporting university research is a good idea.  But the real question is: where do they want to direct incremental government expenditure?  If Ottawa has a spare billion this year (and it’s increasingly looking like they will have more than that), where do they want it to go?  University research?  Or something else – like safe water in First Nations communities, cyberdefense, child care, deficit reduction, tax cuts?  And, more to the point from a politician’s point of view, is this an issue that moves votes?  Is anyone going to change their vote if a government does or does not invest in this area?

Those are the real questions.  The fact that Universities Canada didn’t ask these questions (or, possibly, asked them but didn’t publish them) suggests they know that clarity on this point probably wouldn’t help their case.

September 21

Flagship Universities vs World-Class Universities

Almost since the “world-class” university paradigm was established fifteen years ago, the concept has faced a backlash.  The concept was too focussed on research production, it was unidimensional, it took no account of universities’ other missions, etc. etc.  Basically the argument was that if people took the world class university concept seriously, we would have a university monoculture that ignored many important facets of higher education.

The latest iteration of this backlash comes in the form of the idea of “flagship universities”, promoted in the main by John Aubrey Douglass, a higher education expert at UC Berkeley.  Douglass’ idea is essentially that what the world needs is not more “world-class” universities – which he dismisses as being overly focussed on the production of research – but more “flagship” universities.  What’s the difference?  Well, “flagship” universities are – essentially – world class universities with a commitment to teaching top undergraduate students, to providing top-level professional education and to a mission of civic engagement, outreach and economic development.  Basically, all flagship universities are world-class universities, but not vice-versa.  They are world-class universities with a heart, essentially.

Or, that’s what promoters of the “flagship concept” would have you believe.  I would argue the concept is simply one of American academic colonialism, driven by a simplistic belief that all systems would be better if only they all had their own Morrill Acts, Wisconsin Ideas, and California Master Plans.

If you read Douglass’ book on the matter, it’s quite plain that when he says “flagship university” he means roughly the top 20 or so US public universities – Cal, Washington, Virginia, Michigan, etc.  And those are without question great universities, and for the most part appropriate to the settings in which they exist.  But as a guiding concept for universities around the world, it’s at least as inappropriate as the “world class” concept, if not more so, because it assumes that these quintessentially American models will or should work more or less anywhere.

Start with the idea that to be a flagship university you have to have excellent research output.  That takes out nearly all of Africa, India, the Middle East, South-East Asia, Russia and Latin America, just as the “world class” concept does.  Then you must have a research culture with full academic freedom, freedom of expression, etc (say goodbye to China and the rest of Russia).  You must also have a commitment to combine undergraduate and professional education and to a highly selective intake process for both (adieu France, auf wiedersehen, Germany), and a commitment to community outreach in the way Americans think of it (sayonara Japan, annyeong Korea).

What’s left?  Universities in anglophone countries, basically.  Plus Scandinavia and the Netherlands.  That’s it.  But if you then add in the requirement that flagships are supposed to be at the top of an explicit system of higher education institutions (a la California Master Plan), then to some degree you lose everyone except maybe Norway.

Douglass is undoubtedly right in saying that world-class universities are – in practice if not in theory – a pretty reductive view of higher education (though in defence of the group in Shanghai who came up with the concept, it’s fair to say they thought of it as a benchmarking tool, not a policy imperative).  But while the flagship concept cannot be called reductive, it is even more culturally specific than the “world-class” concept, and not readily exportable outside of its home context.

Universities around the world are descended from different traditions.  The governments that pay for them and regulate them have legitimately different conceptions of what they are supposed to achieve and how they are supposed to achieve it.  People happen to have got worked up about “world-class” research universities because research happens to be the only part of university outputs that can be measured in a way that is halfway useful, quantitatively speaking.  The problem lies not with the measurement of research outputs, and not even with the notion of institutions competing with one another, but rather with the notion that there is a single standard for excellence.

The flagship universities vs. world-class universities debate, at heart, is simply an argument about which single standard to use.  Those of us in North America might prefer the flagship model because it speaks to our historic experience and prejudices.  But that’s no reason to think anyone else should adopt it.
