Higher Education Strategy Associates

October 04

New Quality Measurement Initiatives

One of the holy grails in higher education – if you’re on the government or management side of things, anyway – is to find some means of actually measuring institutional effectiveness.  It’s all very well to note that alumni at Harvard, Oxford, U of T (pick an elite university, any elite university) tend to go on to great things.  But how much of that has to do with them being prestigious and selective enough to only take the cream of the crop?  How can we measure the impact of the institution itself?

Rankings, of course, were one early way to try to get at this, but they mostly looked at inputs, not outputs.  Next came surveys of student “engagement”, which were OK as far as they went but didn’t really tell you anything about institutional performance (though they did tell you something about curriculum and resources).  Then came the Collegiate Learning Assessment and later the OECD’s attempt to build on it, which was called the Assessment of Higher Education Learning Outcomes, or AHELO.  AHELO was of course unceremoniously murdered two years ago by the more elite higher education institutions and their representatives (hello, @univcan and @aceducation!), who didn’t like its potential to be used as a ranking (and, in fairness, the OECD probably leant too hard in that direction during the development phase, which wasn’t politically wise).

So what’s been going on in quality measurement initiatives since then?  Well, two big ones you should know about.

The first is one being driven out of the Netherlands called CALOHEE (which is, sort of, short for “Measuring and Comparing Achievements of Learning Outcomes in Higher Education in Europe”).  It is being run by more or less the same crew that developed the Tuning Process about a decade ago, and who also participated in AHELO, though they have broken with the OECD since then.  CALOHEE builds on Tuning and AHELO in the sense that it is trying to create a common framework for assessing how institutions are doing at developing students’ knowledge, skills and competencies.  It differs from AHELO in that, if it is successful, you probably won’t be able to make league tables out of it.

One underlying assumption of AHELO was that all programs in a particular area (e.g. Economics, Civil Engineering) were trying to impart the same knowledge, skills and competencies – this was what made giving them a common test valid.  But CALOHEE assumes that there are inter-institutional differences that matter at the subject level.  And so while students will still get a common test, the scores will be broken up in ways that are relevant to each institution, given each institution’s set of desired learning outcomes.  So Institution X’s overall score in History relative to Institution Y’s is irrelevant, but their scores in, for instance, “social responsibility and civic awareness” or “abstract and analytical thinking” might be, if they both say that’s a desired learning outcome.  Thus, comparing learning outcomes in similar programs across institutions becomes possible, but only where both programs have similar goals.

The other big new initiative is south of the border and it’s called the Multi-State Collaborative to Advance Quality Student Learning (why can’t these things have better names?  This one’s so bad they don’t even bother with an initialism).  This project still focuses on institutional outcomes rather than program-level ones, which reflects a really basic difference of understanding of the purpose of undergraduate degrees between the US and Europe (the latter caring a whole lot less, it seems, about well-roundedness in institutional programming).  But, crucially in terms of generating acceptance in North America, it doesn’t base its assessment on a (likely low-stakes) test.  Rather, samples of ordinary student course work are scored according to various rubrics designed over a decade or more (see here for more on the rubrics and here for a very good Chronicle article on the project as a whole).  This makes the outcomes measured more authentic, but implicitly the only things that can be measured are transversal skills (critical thinking, communication, etc.) rather than subject-level material.  This will seem perfectly fine to many people (including governments), but it’s likely to be eyed suspiciously by faculty.

(Also, implicitly, scoring like this on a national scale will create a national cross-subject grade curve, because it will be possible to see how an 80 student in Engineering compares to an 80 student in History, or an 85 student at UMass to an 85 student at UWisconsin.  That should be fun.)

All interesting stuff and worth tracking.  But notice how none of it is happening in Canada.  Again.  I know that after 25 years in this business the lack of interest in measurable accountability by Canadian institutions shouldn’t annoy me, but it does.   As it should anyone who wants better higher education in this country.  We can do better.

October 03

The Real Competition is Closer Than You Think

I’ve recently been dismissive of the notion that Canada is “falling behind” in higher education, since everyone seems to insist on making ludicrous comparisons with places like China, Switzerland and Singapore.  But upon a little further digging, it turns out that one of our very close competitors is doing rather well these days – one we probably should be worried about, despite the fact that we’ve mostly been ignoring it since the Financial Crisis of ’08.

It’s our neighbour to the south, the United States.

Now, back in the 1990s and early 2000s, when our dollar was worth diddly-squat and the Americans were spending seriously on higher education, the USA dominated Canadian discussion about foreign competitors.  It was all “brain drain” this and “impossible to keep our talent” that.  We were getting creamed.

Since 2008 it’s been a different story.  Everybody remembers 2009-2010, when top American talent was streaming across the border due to a combination of a strong Canadian dollar and savage cut-backs in the US.  And then the Obama stimulus expired in 2011, and everyone knew that was going to put paid to research dollars.  And the states were mostly bankrupt (look at what happened in Illinois!).  And then Trump.  Trump!  Disaster, etc.  The story we’ve been telling ourselves is that American higher ed policy for the last decade or so has been one long exercise in Making Canada Great Again.

The problem is, this story isn’t quite true.

Let’s start by taking a look at what’s been going on in Canada, both at our top research universities (for the purpose of this exercise, the 8 universities that made this year’s Shanghai top 200: Toronto, UBC, McMaster, McGill, Alberta, Montreal, Calgary and Ottawa) and at the rest of the higher education system.  Figure 1 shows the change in real average total institutional per-student expenditures, indexed to 2006.  What it shows is that – with some small differences in timing – both research-intensive and non-research-intensive institutions did well late last decade and worse this one, with both ending up with per-student expenditures about 3% lower than they were a decade earlier.

Figure 1: Change in Real Per-student Expenditures by sector, Shanghai top-200 ranked universities vs. rest of sector, Canadian Universities 2005-06 to 2015-16 (indexed to 2005-06)

Source: Statscan FIUC, Statscan PSIS
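For readers who want to replicate this kind of chart, the underlying calculation is straightforward: deflate total expenditures to constant dollars, divide by enrolment, then index to the base year.  A minimal sketch in Python – the numbers here are invented for illustration, not the actual Statscan figures:

```python
# Real per-student expenditures, indexed to the first year of the series.
# All figures below are made up for illustration only.
years = [2006, 2010, 2016]
nominal_spending = [100_000_000, 125_000_000, 140_000_000]  # total $ spent
cpi = [1.00, 1.07, 1.18]        # price index, base year = 1.00
enrolment = [20_000, 23_000, 26_000]

# Deflate to constant dollars, divide by students, index to base year = 100.
real_per_student = [s / p / n for s, p, n in zip(nominal_spending, cpi, enrolment)]
indexed = [100 * x / real_per_student[0] for x in real_per_student]

for y, v in zip(years, indexed):
    print(f"{y}: {v:.1f}")
```

Note how, in this invented example, nominal spending rises 40% over the decade while the indexed real per-student figure ends up below 100 – which is exactly the kind of pattern Figure 1 shows.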

Now let’s look at similar data for private 4-year colleges in the United States.  Same drill: the blue line represents the average of the 33 private universities which make the Shanghai top 200, and the orange line represents the average for the rest of the sector (about 1,700 institutions in total).  For both, what we see is an unbroken increase in expenditures.  For many years it seemed like the big research heavyweights were leaving the rest of the sector behind, but in the last couple of years both sectors have seen a steady expansion of buying power.  For the decade, the research heavyweights are up 16%, and everyone else up 10%.

Figure 2: Change in Real Per-student Expenditures by sector, Shanghai top-200 ranked universities vs. rest of sector, US Private 4-year colleges 2004-05 to 2014-15 (indexed to 2004-05)

Source: IPEDS

(If you’re wondering why my ten-year span starts and ends a year later in Canada than in the US, the answer is that it turns out – miraculously – there is one piece of data reporting that Statscan does faster and better than the Americans, and that’s reporting on financials.  The 15-16 numbers just aren’t out yet in the US – sorry.)

Ah, you say.  But that’s the private 4-year sector – not really a fair comparison.  Everyone knows those guys have loadsadough.  Compare us with the public sector instead – everyone knows those guys are in deep trouble, right?  Right?

Well… not exactly.  Across the big public research universities – the 37 that make the top 200 in the Shanghai rankings – expenditures are up 15% in real dollars on a decade ago, same as their counterparts in the private 4-year sector.  And the rest of the sector, the poor unloved disrespected non-research-intensive universities?  They are up 8% on the decade.

Figure 3: Change in Real Per-student Expenditures by sector, Shanghai top-200 ranked universities vs. rest of sector, US Public 4-year colleges 2004-05 to 2014-15 (indexed to 2004-05)

Source: IPEDS

A reasonable question here is: how did the Canadian post-secondary community miss this?  And miss it they did, because if they had seen this data they wouldn’t be making ludicrous claims about places like Switzerland.

The answer, I suspect, is that American universities are, from the perspective of our university leaders, growing in the “wrong way”.  This growth isn’t happening because American governments – federal and state – are plunking down vast amounts of new money.  It’s because institutions are generating more money on their own – either through tuition, or self-generated income.  Now it’s not that Canadian institutions aren’t doing the same (albeit on a smaller scale) – as I’ve shown elsewhere, what growth there has been in real revenues in Canadian universities over the last five years has mostly come from new fee income.  But when our university lobbyists go to ask for money, it’s simpler to make the case for “investment” (i.e. spending) if the outside competition is mostly government-funded too.  It’s harder to walk in and say: “we need more money so we can compete with American universities who are really good at generating their own funds through fees”, because our politicians are currently allergic to that solution.

Still, the fact of the matter is, American universities are very definitely on an up-swing at a time when most other countries – including vaunted China and Switzerland – are flat or even down somewhat.  This deserves more attention.

October 02

Atlantic Blues

One big story from out east that didn’t get a lot of play in the rest of the country was the news that the Nova Scotia government had, over the period 2013-2017, quietly bailed out Acadia University to the tune of $24 million.  This is of course the second time a Nova Scotia government has bailed out a university this decade: the Nova Scotia College of Art and Design (NSCAD) received about $10 million.

This isn’t really a partisan thing: it was an NDP government that bailed out NSCAD and a Liberal one which offered extra help to Acadia.  It’s a structural problem: Nova Scotia is not a very rich province, and it has a lot of universities, only one of which is large enough to have real economies of scale.  This problem was laid out in great detail seven years ago by economist Tim O’Neill in a special report on the system prepared for the Provincial Government but the Dexter government passed on making the difficult decisions.  Now, the new Liberal government has put some extra money in and has allowed institutions to raise a bit more money through tuition, but it doesn’t change the really basic structural challenge the province’s institutions face.  Come next recession, at least one university will be back in the same situation.

From the Acadia story we can glean two things.  First, former President Ray Ivany is clearly a very persuasive guy (but we kind of knew that).  Second, we now have some greater insight into the drafting of the controversial Bill 100, which provided for the possibility of universities to ignore collective agreements if they were in financial exigency and needed to restructure.  Turns out it wasn’t out of the blue: it was in reaction to Acadia telling them they needed a bail-out.  Bill 100 was the Government signalling to everyone in the province’s higher education community: bailouts aren’t the only possible outcome.  Radical restructuring is a possibility too.

How radical?  Well, two neighbouring provinces can give a sense of how bad things can get.  We’ve seen what kind of time Memorial University of Newfoundland is having dealing with cuts of 20% or more to income, with little to no ability to recoup money from tuition fees.  In New Brunswick, cuts to provincial operating grants have, if anything, been as severe, if spread out a little more – a 22% drop in real terms from 2010-11 to 2015-16 (see back here for more on provincial changes).  Student numbers have fallen as well, by nearly 15%.  That’s a double-edged sword: while it means the per-student cut in the operating grant is not as severe, it also means a lot less fee income coming in.

Figure 1: Change in university enrolments, Atlantic Provinces, 2010-11 = 100

Source: Association of Atlantic Universities

In fact, the only province in the region where things remain reasonably quiet is Prince Edward Island, where UPEI is holding its ground both in terms of government grants and student numbers.  Compared to the rest of the region, that’s a reasonably good place to be.

The fundamental challenge of most of the region’s universities is size.  Small is a great selling point – if you can charge for it.  If you can’t, small just means fragile.  And fragility describes too many Atlantic universities right now.  Acadia won’t be the last university in the region to approach the brink; there’s almost certainly more drama to come in the years ahead.



September 29

Notes on Medieval Higher Education Finance

No Ving Rhames/Pulp Fiction jokes (you were thinking it, you know you were).  Just a couple of interesting tales from the High Middle Ages to show that in fact there is almost no tale under the sun in higher education which isn’t seven or eight centuries old.

Student loans.  Though the tradition of providing aid to worthy but needy students as a gift (i.e. bursaries) has a history almost as old as universities themselves, the concept of lending money to students – both commercially and at concessionary rates – is nearly as old.  In thirteenth-century Oxford, lending to students by local loan-sharks was considered sufficiently predatory that King Henry III issued a decree limiting interest on loans to two pennies per week per pound (twenty shillings) lent, which works out to an annual rate of about forty-three percent.  To keep students away from such lenders, Oxford encouraged the creation of endowments whose funds could be used to provide students with interest-free loans.  These loans were securitized against a scholar’s possessions – books, fine cutlery, etc. – which would be forfeit if the student did not repay the loan within a year.

(If you’re wondering “how is a book surety for tuition”, the answer, more or less, is that pre-Gutenberg books were incredibly expensive, and not that much less in value than a semester’s worth of lectures).
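As a sanity check on that forty-three percent figure (my arithmetic, not Rashdall’s): a pound was twenty shillings, a shilling twelve pence, so two pence a week on a pound of principal works out as follows, assuming simple interest over a 52-week year:

```python
# Implied annual rate on Henry III's usury cap: two pence per week
# per pound lent.  Assumes simple (non-compounding) interest and a
# 52-week year; 1 pound = 20 shillings = 240 pence.
pence_per_pound = 20 * 12          # 240 pence to the pound
weekly_rate = 2 / pence_per_pound  # ~0.83% per week
annual_rate = weekly_rate * 52     # simple annual rate
print(f"{annual_rate:.1%}")        # → 43.3%
```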

National Security and Higher Education.  After World War II, area studies (that is, interdisciplinary studies of various world regions) took off in the United States, essentially because both institutions and governments decided that if the country was going to run the free world, it might help to know something about the various bits of it.  The CIA had ties to area studies, but so too did the major old-school foundations like Ford, Rockefeller and Carnegie.

But this was by no means the first time that authorities had tried to use universities’ expertise in the humanities to strategic purposes.  In the early 14th century, the Catholic church was still reeling from the loss of the Holy Land to the Arabs, and there was a desire to turn things around in part by trying to convert the infidel.  At the Council of Vienne (that’s Vienne, France, just south of Lyon, not Vienna) in 1311, the church decided to set up endowed chairs in Hebrew, Aramaic and Arabic both in Rome to serve the papacy directly but also at the Ivy League of the High Middle Ages (Paris, Oxford, Bologna and Salamanca).  Not quite knowledge in the service of the state – but given the link between religious control and territorial control, it’s close enough.

Disposing of Extra Cash.  Ok, this one’s a bit different.  Back in the day, administrative positions like Proctor – an overseer of general university and in some cases town life as well – were voted on by the Masters (i.e. tenured staff), in much the way that department chairs are today.  Since these positions entailed an ability to impose charges on students and in some cases townsfolk as well, this was a post that was in some demand.  It wasn’t quite a sinecure, but it was better than just being a lowly Master (the economic and social standing of Masters being a lot lower back then than that of Professor is today).

As with many such official positions in Europe prior to the latter half of the nineteenth century, when these crazy things called “examinations” started being used to award positions, the proctor post was something one purchased rather than won on merit.  On becoming a proctor, the successful candidate would be required to pay a substantial sum of money to the Masters for the privilege.  According to custom, the Masters would immediately repair to a tavern in order to – and I am not making this term up, you can read about it in Rashdall’s Medieval Universities – “Drink the Surplus”.

If anyone wants to get profs more comfortable with the notion of larger numbers of administrative staff on campus, I would suggest perhaps finding a way to revive this particular tradition.

Have a good weekend.

September 28

China, Switzerland, Singapore

The other day, I questioned a claim made by University of Toronto President Meric Gertler that we were falling behind countries like “China, Switzerland and Singapore”, which have supposedly made major recent investments in science and higher education.  First of all, I noted, this was an odd trio, with nothing much to suggest the claim other than the fact that a few institutions in those countries are doing reasonably well in various university rankings.  Second, I noted that the claim that we are “falling behind” these countries was factually untrue, at least as it relates to investments in higher education.  Today, I’m going to prove that, by comparing recent data on expenditures in top universities in each of those countries to expenditures in ours.

(To be clear before I press on: I’m singling out Gertler for criticism because the Star picked up on his comment. But he’s far from the only one to have made it – I’ve heard that same line a couple of times lately in Canadian PSE.  So think of today’s column as a general warning shot on the topic of international comparisons rather than me taking down anyone in particular).

So let’s look at what’s been happening in each of these countries over the past few years, using data obtained from each institution’s annual and financial reports.  Let’s start with Canada.  Real expenditures per student at top universities rose in the latter half of the aughts and early in this decade.  Since 2012, they have been mixed: rising slightly at Alberta (+6%), falling slightly at McGill (-2%), UBC (-3%) and Toronto (-4%), and falling more sharply at McMaster and Montreal (-8%).  Summed across all institutions, it’s a drop of 4%.

Figure 1 – Per-student Expenditures of Top Canadian Universities, 2012-13 to 2015-16 (in C$2016)

Now let’s take a look at those other countries over the same period.  Let’s start with Switzerland, where we will look at the seven universities that make the Shanghai top 200.  Has there been an increase in expenditures?  Yes, there has: about 5.6% over the last few years.  But that’s been almost exactly matched by a 5% increase in students.  Interestingly, the growth in student numbers has been most pronounced at the best university: ETH Zurich – the institution to which the University of Toronto would be likeliest to compare itself – has seen its funding per student fall by 3% as a result.  Most of the per-student gain comes at the one university – Geneva – which saw a significant fall in student numbers.

Figure 2 – Per-student Expenditures of Top Swiss Universities, 2012 to 2016 (in SFr 2016)
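The Swiss numbers are a good reminder that per-student spending is a ratio: growth in expenditures and growth in enrolment largely cancel out.  Using the aggregate figures quoted above (ratios compound; they don’t subtract exactly):

```python
# Per-student change implied by the aggregate Swiss figures quoted
# above: +5.6% total expenditures against +5.0% enrolment.
spending_growth = 0.056
enrolment_growth = 0.050

# Divide the growth factors rather than subtracting the percentages.
per_student_change = (1 + spending_growth) / (1 + enrolment_growth) - 1
print(f"{per_student_change:.2%}")  # → 0.57%
```

In other words, system-wide per-student spending in Switzerland barely moved – about half a percentage point over several years.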

Let’s move over to China.  Data from China is not great (as I noted a couple of weeks ago, Chinese universities are getting more transparent about institutional finances, but getting accurate institutional enrolment numbers is practically impossible).  Still, to the extent we have information, what we see is essentially no change over the past four years – up slightly at Shanghai Jiao Tong, down slightly everywhere else.

Figure 3 – Per-student Expenditures of Top Chinese Universities, 2013 to 2016 (in RMB 2016)

I know that one seems hard to believe given all we hear about “massive Chinese investments” in higher education.  And in the previous decade – say, from 2000 to 2010 – it’s true that there were massive investments.  But lately, expenditures are not quite keeping up with inflation and growth in student numbers.

(Some people may point to forthcoming investments on the back of China’s recently announced and then re-announced “Double World-Class” program, as per this recent article in Caixin Global.  But the amounts being bandied about here don’t actually seem all that significant.  Sun Yat-sen University, for instance, is breathlessly reported as being in line to receive an extra RMB 480 million as part of this project.  But on a budget of over RMB 6 billion, that’s less than an 8% increase, which given inflation and student growth probably washes out in less than two years.  So treat these claims with skepticism.)

Finally, we have Singapore, where we do see clear-cut growth in real expenditures per student over the last few years: 8% at the National University of Singapore and 13% at Nanyang Technological University.

Figure 4 – Per-student Expenditures of Top Singaporean Universities, 2013 to 2016 (in S$2016)

So, does this back up Gertler’s “greater investment” claims?  No, not quite.  A closer look at the two institutions’ financials shows something more subtle.  In both cases, the universities started spending more relative to their income: NUS went from posting a 5% surplus to a 1.5% surplus between 2012 and 2016, while Nanyang went from about a 5% surplus to a 3% deficit.  This implies that in both cases it was changes in financial habits as much as any new investment doing the work.  And some of the work was done by increased tuition fee income.  Increased research income from government sources is really only doing about a third of the work here.

So, are China, Switzerland and Singapore really “leaving Canada behind”?  In the case of China and Switzerland the answer seems to be: only marginally, if at all.  In Singapore – tiny Singapore – the evidence that more money for research is in play is clearer, but the total “extra funding” in question is maybe $250 million at two universities which between them might have 4,000 tenured faculty.

If this is the best competition we have, I’d say we’re doing OK.


September 27

Some Surprising (?) Data on Canadian University Expenditures

I’ve been doing some work on financial data of higher education institutions around the world, and specifically looking at what’s been going on at top research institutions compared to everyone else.  And I thought maybe you all would be interested in what I’ve found for Canada.

For the purpose of this document, I have separated the six institutions in Canada which always come top in the Academic Ranking of World Universities (aka “Shanghai Rankings”) – that’s Toronto, UBC, McGill, McMaster, Alberta and Montreal – from the rest, which allows me to look at “the Big 6” against everyone else.  Over the past fifteen years, those institutions have pretty consistently made up around 20% of the country’s total enrolments and about 30% of institutional expenditures.

Figure 1 shows growth in expenditures in real dollars, with 2000-2001 as the base year.  And basically what it shows you is that Canadian universities had a money-hose aimed at them in the early 2000s, with annual increases in the 9-10% range.  These big increases continued for slightly longer at the Big 6 than in the rest of the system, but by about 2006 they were growing at the same rate again (around 4% per year after inflation), and by 2011 growth had leveled off completely and was holding steady in real terms.  The more dramatic and historically-challenged in the community call this “austerity”, but actually it’s just steady-state.

Figure 1: Change in Canadian University Expenditures, 2000-01 to 2014-15, indexed to 2000-01 

But wait, you say, haven’t participation rates been going up?  Weren’t there a lot of new students to accommodate in there?  There were indeed.  See Figure 2.

Figure 2: Change in Canadian University Enrolments, 2000-01 to 2014-15, indexed to 2000-01 


Two things about changes in enrolments: first, they increased at roughly the same pace in the top 6 as everywhere else, and second, they increased pretty consistently across the time period, rather than jumping radically at first and then tailing off, as expenditures did.

From the foregoing you can probably do the math in your head and work out what per-student expenditures look like: rising a bit in the 00s and falling a bit since about 2010.  And this is in fact exactly what we see:

Figure 3: Per-student Expenditures at Canadian Universities, 2000-01 to 2014-15

But there’s a bit of a difference here.  If you’re not at one of the big six, per-student expenditures in real dollars have been falling ever so slightly for seven or eight years and now sit right about where they were fifteen years ago.  If you’re at one of the big six, they’ve been falling only since the turn of the decade, and they currently sit about 20% above where they were at the turn of the millennium.  The big institutions, which already spent a lot more per student because they had more expensive programs, did more research, etc., in a sense “pulled away” from the rest of the pack over the course of the aughts.

But actually what was most interesting to me in this exercise was looking at how heterogeneous the “top 6” actually are in terms of expenditures.  At one end, you have Montreal, which sneaks into the world top 200 with expenditures of only about $30,000 per student (Montreal includes both Polytechnique and HEC for these purposes), which in fact makes it look – financially, anyway – more like the “everyone else” category than the rest of the Big 6.  Toronto, McGill, and McMaster all spend just north of $40,000 per student.  But the big western universities – Alberta and UBC – spend substantially more: over $50,000 in the case of the former and nearly $60,000 in the case of the latter.

Figure 4: Per-student Expenditures at Canadian “Top 6” institutions, 2000-01 to 2015-16

These graphs pose many interesting questions, but here are the three posers I’d most like to see explained:

i)      why didn’t Canadian universities derive any economies of scale when enrolments increased?

ii)     what are University of Alberta and UBC buying with their extra $10,000 per student per year?

iii)    what does Montreal do that allows it to get into the upper reaches of global research universities with a comparatively pedestrian budget?

Tomorrow, we can look a bit at how some of Canada’s “close competitors” stack up.

September 26

Arguing for Science in All the Wrong Ways

You can tell it’s pre-budget consultation time in Ottawa because university Presidents are writing op-eds about the importance of research and backing the Naylor Report.  But man, are they ever unconvincing.

Let’s start with University of Toronto President Meric Gertler’s September 12th Toronto Star op-ed, entitled “Don’t Let the World Pass Us By on Science”.  The sentiment is fine, I suppose, but the specific evidence Gertler uses to back up his claim is – to put it politely – weak.  The op-ed says that we are falling behind countries like “China, Switzerland and Singapore”; the evidence for this is that their universities are bouncing up various international rankings tables while Canadian universities are just staying stable.

Three points here.  First, the idea that Canada needs to spend billions on research so that U of T and a handful of other institutions can be rankings big-shots is perhaps the worst possible argument for supporting the Naylor report.  Second: “China, Switzerland and Singapore”?  What kind of trio is that?  If those three – which between them don’t have more than a dozen genuinely world-class universities – are the only ones we are worrying about, doesn’t that imply that in fact we are doing reasonably well compared to the traditional powerhouses like the US, UK and Germany?  Third, Gertler’s assertion that these three countries are “making aggressive investments in scientific infrastructure and researchers” just isn’t borne out by the data, at least over the last four or five years: in none of those countries are university budgets keeping pace with inflation and growth in student numbers (I’ll be showing this in more detail in a series of blogs over the next few weeks – stay tuned).

The second article of note saw Gertler team up with McGill’s Suzanne Fortier and UBC’s Santa Ono (let’s portmanteau them collectively as “Gerforno” to keep things simple) to pen an op-ed for the Globe and Mail called Ottawa must improve research funding – or risk losing the innovation race.  This one was a bit better as a pitch: now the rationale for investment is Canadian prosperity rather than U-15 bragging rights.  But the logic underpinning it is deeply problematic and points to the continuing poverty of the Canadian innovation debate.

The Gerforno argument runs like this: “hey, look at the breakthroughs Professors at our three fantastic universities have made: Geoffrey Hinton (artificial intelligence, U of T), Bernard Belleau (molecular pharmacology, McGill) and Michael Smith (biotech, UBC).  Their brilliant work in basic science led to huge advances that turned into big companies and lots of jobs, etc.  See?  Basic research equals innovation equals Canadian prosperity!”

Now, it’s definitely gutsy of Gerforno to make this argument on national economic grounds when both Hinton and Belleau sold their companies to foreigners (Google and Shire Pharmaceuticals, respectively) and Smith chose to found his company in Seattle.  But leaving that aside, this is still just arguing from anecdote.  It doesn’t even attempt to draw conclusive links between basic science and economic growth.

Canada is a tiny country: in terms of population, 0.5% of the world; in terms of GDP, maybe 2%; in terms of science, maybe 4%.  The firms that employ Canadians import – at a minimum – 95% of the technology with which they work.  Adding a billion or two to university research does not change that.  We will always be takers of technology, and – as importantly – most of the economic impact of ideas generated by Canadian scientists is going to occur outside our borders.  That’s how open science works in an open economy.

The effect of that extra billion or so on national levels of productivity is going to be tiny.  Sure, we might be able to generate a few more high-tech or life-sciences companies, but that’s if and only if the rest of the innovation infrastructure – venture capital, skilled managers able to take companies from start-up to production to IPO, intellectual property regimes, tax structures, legislation limiting non-competes, etc. – is all working as well.  And I think there’s a fair bit of evidence to suggest that all of this isn’t in working order.

At the national level, research capacity is a necessary but insufficient condition for innovation.  Universities, though, have taken to pretending that this is not true – that it is sufficient, and that a pot of gold lies at the end of the rainbow if only we can get CIHR and NSERC budgets up another few percent.  They therefore have every reason to play down the possibility that more money for scientific research may be – from the point of view of innovation and economic growth – wasted.  Or that, in the absence of the rest of the ecosystem being in working order, more research money may be like pushing on a string.

Now, that’s not a reason on its own to dismiss the Naylor Report or the idea of increasing research budgets.  Maybe we ought to do it because science is good in and of itself and a more science-oriented society is a better society.  But if we’re going to justify science in the name of economic growth, we have to do better than arguing by anecdote and pretending that there’s some kind of straight unmediated line between research and growth.

Gerforno can do better than this.  Heck, we can do better than this.  We have to.

September 25

How to Read a Poll

You may have seen the results of a poll out last week from Abacus Canada for Universities Canada, one which purports to look at “how Canadians feel about universities”.  I suspect you will hear a lot about this poll over the next few months, especially with respect to research and the Naylor report.  But it’s always worth approaching these things with a skeptical eye, so let’s spend a little time looking at how the questions were put together to see how much stock we should put in the answers.

Let’s start with this first set of questions:

The design of this first question is pretty good.  Respondents were offered two options and asked which one was closer to their opinion of Canadian universities.  Forced choices tend to give better responses than single-answer questions, but the trick is in interpreting the response.  Some will probably try to tell you “73% of Canadians think universities are practical and up-to-date”, but that’s not quite right.  Rather, 73% of Canadians think they are more practical than impractical, and a similar percentage think they are closer to being up-to-date than to being out-of-date.

Much of the poll was devoted to research issues.  These next two questions are, to my mind, the most interesting.  The text is a little small, but the first question asks whether Canadians think research should focus on national competitiveness, and the second asks whether the best way to lead in innovation is to invest in fundamental research.  The answers to both questions were exactly the same.

This is gold for Universities Canada.  “90% of Canadians agree that we need to invest in fundamental science in order to lead in innovation”.  But this would have been an interesting place to use one of those double-ended questions we saw at the start.  For instance: “Which is closer to your view: that Canada will gain leadership in innovation by investing in basic research, or by investing in applied research?”  The answers would have been instructive, but not nearly as politically useful to Universities Canada.

Now we get into the more dubiously worded questions.  Take the ones that ask about the importance of funding research.  They aren’t just bad questions, they’re bad politics.

They are bad questions because you’re going to get all kinds of social acceptability bias creeping in.  Who is going to say they are against spending money on discovering new medicines?  And they’re potentially bad politics – from universities’ point of view, anyway – because they assume that university research has particular practical endpoints.  Some of it does, of course.  But notice they didn’t ask about supporting research in the humanities.  Ways of reconciling with Indigenous peoples, say.  Or ensuring no 18th-century Jesuit manuscript goes unread.  Wonder why.

(When reading a poll, always – always – look for the questions they don’t ask.)

Meanwhile, the last set of questions, which asks respondents to agree or disagree with a set of specific statements, is just shameful.

Look at that third one: “the government should spend more on universities because the upside for Canada is tremendous”.  I’ve heard of leading questions before but this is ridiculous.  Or the fourth one: “Canada has a chance to lead the world in higher education research and innovation”.  We have a chance to become the wine capital of the world, too.  Or World Cricket champions.  A chance, sure, but how much of one?  Meaningless term.

Now take a look at what questions Universities Canada didn’t ask.  They didn’t ask people how important any of these issues were, or where they ranked in their political priorities.  Sure, 86% say Ottawa should spend more on research.  All that tells us is that in the abstract, when no other priorities or financial constraints are in the picture, Canadians think supporting university research is a good idea.  But the real question is: where do they want to direct incremental government expenditure?  If Ottawa has a spare billion this year (and it’s increasingly looking like it will have more than that), where do they want it to go?  University research?  Or something else – like safe water in First Nations communities, cyberdefense, child care, deficit reduction, tax cuts?  And, more to the point from a politician’s point of view, is this an issue that moves votes?  Is anyone going to change their vote if a government does or does not invest in this area?

Those are the real questions.  The fact that Universities Canada didn’t ask them (or, possibly, asked them but didn’t publish the results) suggests they know that clarity on this point probably wouldn’t help their case.

September 22

Twenty Years Ago Sunday

Five years ago I wrote the following blog, under the headline “fifteen years ago today”.  I think it’s worth running again (with a couple of minor alterations).

On September 24th, 1997, Jean Chrétien rose in the House of Commons to present his reply to the Speech from the Throne. About half-way in, he noted casually that there would likely be a financial surplus that year (a miracle, considering where we’d been in 1995). And he was planning to blow it on something called “Millennium Scholarships.”

Until that exact moment, his caucus had been in the dark about the idea. Indeed, cabinet had been in the dark until the day before. So, too, had the Privy Council Office – Chrétien had deliberately kept them out of the loop because he knew they’d hate it on section 93 grounds and try to stop him.

The way the project was pursued in the run-up to the 1998 budget didn’t do the Foundation any favours. There were two basic problems. The first was that it wasn’t clear for months whether these were going to be merit scholarships or need-based grants (in French, the word “bourse” covers both). The public servants at HRDC and Paul Martin wanted it to be about need because they saw the political hay people were making about increasing student debt (note: unlike today, this was a time when debt actually was increasing quite rapidly); Pierre Pettigrew and the Finance mandarins wanted it to be about merit, but for different reasons. Pettigrew had his eye on Quebec and its not unreasonable complaint that the feds were duplicating a provincial program, and thought a more merit-based program would take the edge off that argument.  Finance, I think, wanted merit because the top folks there wanted a culture shift in Canada to promote merit (they were also pretty much all Queen’s grads, as far as I could tell, which may or may not explain the fixation).

In the end, 95% was distributed “primarily” on the basis of need, while 5% went to merit. This mix was about right; broad fears of rising tuition and debt required a policy response that emphasized need. Conversely, had more been allocated to merit, the Excellence Awards the Foundation eventually developed would have been devalued – part of what made them special was the fact that they weren’t available to the tens of thousands of students originally envisaged.

The second problem was that no one in Ottawa – including HRDC – really understood how student aid worked. The result was a commitment to give the Foundation’s need-based aid to students with “the highest need” – that is, to exactly the students who already received grants from the provinces. The result was that Millennium awards ended up saving provincial aid programs a bucket-load of money. The Foundation did its best to get provinces to re-invest that money in things that would benefit students. Apart from in Nova Scotia, it was reasonably successful, though it didn’t always seem that way to the students who received bursaries, and with some justification those students were sometimes disappointed; Eddie Goldenberg, the Prime Minister’s senior political advisor – whose views on fed-prov relations were… well, let’s just say they lacked subtlety – was apoplectic.

This isn’t the place to recount the Foundation’s history (for that, I recommend Silver Donald Cameron’s book A Million Futures). All you really need to know is that for ten years, the Foundation ran a national social program that wasn’t based in Ottawa and wasn’t one-size-fits-all. It ran a merit program that was much more than just money-for-marks, and was rigorous about using empirical research to improve our understanding of how to improve access to higher education. It was just a different way of doing student aid.

Now, I’m biased, of course.  I worked at the Foundation.  I met my wife there.  Had Chrétien not risen in the House that day, my daughter literally would not exist, and the world would be deprived of its smartest and most beautiful 8-year-old ballet dancer/sumo enthusiast.  But even if none of that were true, I’d still stand by my final comment from five years ago:

The Foundation was created on the back of a cocktail napkin, and suffered from a profoundly goofy governance structure. But within the boundaries of that cocktail napkin, a lot of neat stuff happened. And even though some of what was best about the Foundation has been taken up by the federal government since its demise, the country’s still worse off now that the Foundation’s gone.

September 21

Flagship Universities vs World-Class Universities

Almost since the “world-class” university paradigm was established fifteen years ago, the concept has faced a backlash.  It was too focussed on research production, it was unidimensional, it took no account of universities’ other missions, etc. etc.  Basically, the argument was that if people took the world-class university concept seriously, we would end up with a university monoculture that ignored many important facets of higher education.

The latest iteration of this backlash comes in the form of the idea of “flagship universities”, promoted in the main by John Aubrey Douglass, a higher education expert at UC Berkeley.  Douglass’ idea is essentially that what the world needs is not more “world-class” universities – which he dismisses as being overly focussed on the production of research – but more “flagship” universities.  What’s the difference?  Well, “flagship” universities are – essentially – world-class universities with a commitment to teaching top undergraduate students, to providing top-level professional education, and to a mission of civic engagement, outreach and economic development.  Basically, all flagship universities are world-class universities, but not vice-versa: they are world-class universities with a heart.

Or, that’s what promoters of a “flagship concept” would have you believe.  I would argue the concept is simply one of American academic colonialism, driven by a simplistic belief that all systems would be better if only they all had their own Morrill Acts, Wisconsin Ideas, and California Master Plans.

If you read Douglass’ book on the matter, it’s quite plain that when he says “flagship university” he means roughly the top 20 or so US public universities – Cal, Washington, Virginia, Michigan, etc.  And those are without question great universities, for the most part appropriate to the settings in which they exist.  But as a guiding concept for universities around the world, it’s at least as inappropriate as the “world-class” concept, if not more so, because it assumes that these quintessentially American models will or should work more or less anywhere.

Start with the idea that to be a flagship university you have to have excellent research output.  That takes out nearly all of Africa, India, the Middle East, South-East Asia, Russia and Latin America, just as the “world-class” concept does.  Then you must have a research culture with full academic freedom, freedom of expression, etc. (say goodbye to China).  You must also have a commitment to combine undergraduate and professional education and to a highly selective intake process for both (adieu, France; auf wiedersehen, Germany), and a commitment to community outreach in the way Americans think of it (sayonara, Japan; annyeong, Korea).

What’s left?  Universities in anglophone countries, basically.  Plus Scandinavia and the Netherlands.  That’s it.  But if you then add in the requirement that flagships are supposed to be at the top of an explicit system of higher education institutions (a la California Master Plan), then to some degree you lose everyone except maybe Norway.

Douglass is undoubtedly right in saying that world-class universities are – in practice if not in theory – a pretty reductive view of higher education (though in defence of the group in Shanghai who came up with the concept, it’s fair to say they thought of it as a benchmarking tool, not a policy imperative).  But while the flagship concept cannot be called reductive, it is – even more so than the “world-class” concept – culturally specific and not readily exportable outside its home context.

Universities around the world are descended from different traditions.  The governments that pay for them and regulate them have legitimately different conceptions of what they are supposed to achieve and how they are supposed to achieve it.  People happen to have got worked up about “world-class” research universities because research happens to be the only part of university output that can be measured in a way that is half-way useful, quantitatively speaking.  The problem lies not with the measurement of research outputs, nor even with the notion of institutions competing with one another, but with the notion that there is a single standard for excellence.

The flagship universities vs. world-class universities debate, at heart, is simply an argument about which single standard to use.  Those of us in North America might prefer the flagship model because it speaks to our historic experience and prejudices.  But that’s no reason to think anyone else should adopt it.
