Higher Education Strategy Associates

September 21

Flagship Universities vs World-Class Universities

Almost since the “world-class” university paradigm was established fifteen years ago, the concept has faced a backlash.  It was too focussed on research production, it was unidimensional, it took no account of universities’ other missions, etc. etc.  Basically, the argument was that if people took the world-class university concept seriously, we would end up with a university monoculture that ignored many important facets of higher education.

The latest iteration of this backlash comes in the form of the idea of “flagship universities”, promoted mainly by John Aubrey Douglass, a higher education expert at UC Berkeley.  Douglass’ idea is essentially that what the world needs is not more “world-class” universities – which he dismisses as being overly focussed on the production of research – but more “flagship” universities.  What’s the difference?  Well, “flagship” universities are essentially world-class universities with a commitment to teaching top undergraduate students, to providing top-level professional education, and to a mission of civic engagement, outreach and economic development.  Basically, all flagship universities are world-class universities, but not vice-versa.  World-class universities with a heart, if you will.

Or, that’s what promoters of the “flagship” concept would have you believe.  I would argue the concept is simply a form of American academic colonialism, driven by a simplistic belief that all systems would be better if only they all had their own Morrill Acts, Wisconsin Ideas, and California Master Plans.

If you read Douglass’ book on the matter, it’s quite plain that when he says “flagship university” he means roughly the top 20 or so US public universities – Cal, Washington, Virginia, Michigan, etc.  Those are without question great universities, and for the most part appropriate to the settings in which they exist.  But as a guiding concept for universities around the world, the model is at least as inappropriate as the “world-class” one, if not more so, because it assumes that these quintessentially American institutions will or should work more or less anywhere.

Start with the idea that to be a flagship university you have to have excellent research output.  That takes out nearly all of Africa, India, the Middle East, South-East Asia, Russia and Latin America, just as the “world-class” concept does.  Then you must have a research culture with full academic freedom, freedom of expression, etc. (say goodbye to China and the rest of Russia).  You must also have a commitment to combining undergraduate and professional education, with a highly selective intake process for both (adieu France, auf wiedersehen Germany), and a commitment to community outreach in the way Americans think of it (sayonara Japan, annyeong Korea).

What’s left?  Universities in anglophone countries, basically.  Plus Scandinavia and the Netherlands.  That’s it.  But if you then add in the requirement that flagships are supposed to be at the top of an explicit system of higher education institutions (a la California Master Plan), then to some degree you lose everyone except maybe Norway.

Douglass is undoubtedly right in saying that world-class universities represent – in practice if not in theory – a pretty reductive view of higher education (though in defence of the group in Shanghai who came up with the concept, it’s fair to say they thought of it as a benchmarking tool, not a policy imperative).  But while the flagship concept cannot be called reductive, it is even more culturally specific than the “world-class” concept, and not readily exportable outside its home context.

Universities around the world are descended from different traditions.  The governments that pay for them and regulate them have legitimately different conceptions of what they are supposed to achieve and how they are supposed to achieve it.  People happen to have got worked up about “world-class” research universities because research happens to be the only university output that can be measured in a way that is halfway useful, quantitatively speaking.  The problem lies not with the measurement of research outputs, and not even with the notion of institutions competing with one another, but rather with the notion that there is a single standard for excellence.

The flagship universities vs. world-class universities debate, at heart, is simply an argument about which single standard to use.  Those of us in North America might prefer the flagship model because it speaks to our historic experience and prejudices.  But that’s no reason to think anyone else should adopt it.

September 20

How Families Make PSE Choices

Over the last few months at HESA Towers we’ve been doing a lot of interviews of parents of grade 12 students, to help understand what it is that shapes and shifts their perceptions of higher education institutions.  I can’t give away much of the content here (that’s for paying customers), but one issue I do think is worth a mention is what we’re finding about how families make decisions about post-secondary education.

Researchers’ conception of decision-making in post-secondary education is a pretty linear one, at least where traditional-aged students are concerned.  Parents and students, separately or in tandem, research possible career avenues and try to match them with educational pathways and students’ own interests.  They research programs and institutions, and try to judge quality.  They examine their finances – preferably jointly – and discuss what is affordable.  And on the basis of these various pieces of information, they winnow the potential programs/institutions down from a large number to the fairly small set to which one might apply, and finally down to a single institution.  Conceptually, it’s like a funnel, wide at the top and narrowing gradually as students and parents seek and process information.

As a conceptual model, this suffers from just one problem: it’s mostly wrong.

Here’s what we’ve found instead.  The first is that the notion of parents and students “discussing” post-secondary options is valid only if you think of “discussions” as being asynchronous snatches of conversation that stretch over months or even years.  Parents do not really see their role as one of getting students to decide on choices.  In fact, most assume that the more direct they are about discussing or suggesting options, the more their kids will disengage or oppose them.  Instead, parents see their role as almost horticultural.  They “plant seeds” with their kids by suggesting ideas here and there, but more or less allow them to come to their own conclusions.

Another key assumption about family decision-making is that money is a central part of the discussion and plays an important role in the eventual choice of institution.  Here the answer is basically “yes and no”.  Parents do talk to their kids about money in general terms.  A few don’t – some refuse to talk about it so as “not to distract them”, others save money but don’t tell their kids about it to “keep them motivated” – but for the most part parents let their kids know at least in general terms how much money is available.

But when it comes to choosing an institution, money plays an ambiguous role.  It’s pretty clear that most parents would prefer it if their kid stayed home for financial reasons.  For the most part, kids are more than happy to study close to home too, so for them the issue of money simply doesn’t impinge on choice.  Money (or the lack of it) really only comes into play once a student starts coming close to a decision that involves going away to school.

Not surprisingly, parents are reluctant to spend money on kids who they think are unlikely to benefit much by going away to school.  This isn’t just a preference for spending less money rather than more: many parents of grade 12 students simply don’t think their kids are mature enough or organized enough to go away.  But – and here’s where it gets interesting – parents don’t necessarily express this opinion by talking to their kids about money.  Another way they do it is to talk up local schools – or at least avoid talking up more distant ones – during their “planting seeds” discussions and hope the kid comes to the preferred conclusion on his or her own (though to some extent this also reflects greater parental familiarity with local as opposed to more distant institutions).

On the other hand, if the kid is perceived as actually having their act together – partially an issue of grades but also one of having goals and a sense of purpose – money becomes less of an issue for parents.  It’s not that the issue disappears or that they’ll let their kids do whatever they want, but if parents think their kid has their act together, they are more open to allow the student to drive the decision about where to go to school.

So, in other words, much of the “discussion” occurs by way of kids spending hundreds of hours cracking the books (or not cracking them, as the case may be) and thus sending signals to parents about their talents and capacities.  Based on the presence/absence of said capacities, parents gradually, over a number of years, drop hints about institutional and program preferences, sometimes to kids who have a hopelessly short attention span about such things.  Sometime between mid-grade 11 and early grade 12, the students themselves become serious about searching for institutions.  When they start this process, they do so having experienced years of subtle (or maybe not-so-subtle) hints from their parents about what kinds of programs and institutions are acceptable.  From that, they make a choice.  Then, and pretty much only then, do discussions about money explicitly come into the open.  But in many cases that discussion does not need to happen, because the student has already made the “correct” (from the parents’ point of view) choice, which unlocks a contribution sufficient to get the student into school.

As noted above, this is quite different from how most college-choice theories describe the decision-making process.  And I think it has some serious consequences for the way we communicate issues of price and affordability to families and students.  There need to be some very simple, general messages about how aid makes education affordable, wherever one chooses to undertake it, that can be hammered home over and over.  Communicating the specific details of aid programs is almost a total waste of time until very late in the final year of high school, after the institutional choice has already been made.  It’s almost as if two different information products need to be created for two different audiences: a simple and general one for use during the choice process, and a detailed one for afterwards.

There are also quite a lot of implications here for how institutions should sell themselves.  But those we keep for our institutional clients.  Drop us a line (info@higheredstrategy.com) if you’re interested in becoming one.

September 19

Growth of Presidential Compensation

Let’s do another blog on this topic because everyone loves talking executive compensation.

Yesterday we looked at Presidential pay in international comparison and saw that Canadian university Presidents have fairly low pay compared to equivalents in other English-speaking countries.  But, one might argue, that’s the wrong metric.  Maybe the real problem isn’t high pay so much as a relatively quick rise in pay over the past few years.

That’s a fair argument.  But let’s see what the data says.

My data source here is the ever-handy CAUT Almanac (2005, 2010-11 and 2015-16 editions), from which we can obtain data on Presidential salaries for 2003, 2008 and 2013.  There are multiple observations for 58 institutions (44 in 2003, 54 in 2008 and 56 in 2013); any institution for which I have data for only one year is eliminated.  Now with numbers this small, one has to be careful about one or two outliers distorting averages either up or down.  With that in mind, figure 1 shows the evolution of average compensation for university Presidents in Canada.

Figure 1: Average Pay, University Presidents, Canada 2003-2013, in real 2013 dollars

Source: CAUT Almanac

What figure 1 shows is that between 2003 and 2008 average Presidential pay rose by 8% after inflation (or, about 1.6% per year).  However, between 2008 and 2013 the figure fell by a little less than 1%, meaning that for the decade as a whole the average annual increase was 0.68%.
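For anyone who wants to check the annualization, the per-year rates here are compound rather than simple averages. A quick sketch in Python, using the rounded figures from the post:

```python
def annualized_growth(total_growth, years):
    """Convert cumulative growth over a period into a compound annual rate."""
    return (1 + total_growth) ** (1 / years) - 1

# 8% cumulative real rise over 2003-2008 (5 years) -> about 1.6% per year
print(round(annualized_growth(0.08, 5) * 100, 2))            # 1.55

# roughly -1% from 2008-2013 leaves ~6.9% over the full decade
decade_growth = 1.08 * 0.99 - 1
print(round(annualized_growth(decade_growth, 10) * 100, 2))  # 0.67
</imports>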

Surprised?  Skeptical?  Well, with datasets this small, it’s good to be careful.  Not all of these observations are comparable.  In any given year, a few Presidents get hired and others leave their positions.  For these people, the salaries and compensation captured by the Almanac are not particularly helpful, because their salary only covers part of the year.  In a couple of cases, you also get what look like one-off payments which inflate the salary.

To try and get around this problem, let’s look at the change between 2003 and 2008 for every institution for which we have data for both years:

Figure 2: Distribution of Real Presidential Salary Changes, 2003 to 2008

Source: CAUT Almanac

The highest value here is Acadia, and that’s clearly because the 2003 observation was for a President who only came on board in September, thus giving her an artificially low number in the base year.  Similarly, most of the negative numbers are for people who came on board mid-way through the year in 2008, such as Allan Rock (Ottawa), Roseann Runte (Carleton) and Michael Goldbloom (Bishop’s).  But some of the negative numbers are also “resets”, as universities bring compensation down when a new President is installed; for instance, Indira Samarasekera’s 2008 compensation was 28% lower than her predecessor’s in 2003.

Without a lot of fact-checking around appointment dates which I frankly have no interest in doing (free email guys: you get what you pay for) I can’t be sure exactly which Presidents fall into which category.  But assuming that the artificially high and artificially low observations more or less cancel each other out, looking at the median observations should give us a sense of what was going on at the typical institution.  As it turns out, the median here is 20%, compared to the 8% average we saw in figure 1: that’s not a “better” figure, by the way, just a different lens.  For 2008 to 2013, the median is 0% (same as the average), and for 2003 to 2013 the number is again 20%.
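The reason for reaching for the median here is easy to see with a toy example (the numbers below are invented for illustration, not taken from the Almanac):

```python
import statistics

# Hypothetical salary-change percentages: most institutions cluster near
# 20%, but one artificially low base-year salary inflates a change upward
# and two mid-year 2008 appointments drag changes downward.
changes = [18, 19, 20, 20, 21, 22, 85, -40, -45]

print(round(statistics.mean(changes), 1))  # 13.3 -- pulled around by outliers
print(statistics.median(changes))          # 20 -- the typical institution
```
Because partial-year salaries distort in both directions, the median is the more honest summary of what happened at a typical institution.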

Just for amusement, let’s compare this for a second to what’s been going on with professorial pay. Again, the data source is the CAUT Almanac for the same years.  Guess what?  Between 2003 and 2013, the average rise in pay – after inflation – for full professors (the nearest comparator to University Presidents) was 23%.  I suspect that the rapid rise after 2008 has to do with fewer retirements and more professors staying on for more years and receiving annual pay rises.

Figure 3: Comparison of Changes in Presidential and Full Professorial Pay, 2003-2013

Source: CAUT Almanac

To sum up: between 2003 and 2008, presidential pay rose by somewhere between 1.5% and 3.7% per year over inflation, depending on how you look at it.  However, between 2008 and 2013 presidential pay stayed even with inflation.  Meanwhile, average pay for full professors rose steeply after 2008, and over the decade from 2003 to 2013 their average pay rises were higher than those for Presidents – substantially so if we take an average-to-average comparison.

So next time anyone complains about huge pay rises to executives, just remember that professorial pay has been rising faster.  Sauce for the goose, etc.


September 18

Presidential Compensation

Over the summer, the revelation that the University of Alberta paid Indira Samarasekera two full years of administrative leave at over $550,000 per year after the conclusion of her ten-year (two-term) Presidency caused a series of snit-fits, the most notable being this one from Paige MacPherson, the Alberta Director of the Canadian Taxpayers Federation.

As I’ve noted before (here and here), Canadian university Presidents are not that well paid, at least by the standards of other Anglosphere universities.  Paul Kniest, of Australia’s National Tertiary Education Union, helpfully put together a good international comparison which I reproduce below (the blog post from which it is taken is here).

Now those figures are in Australian dollars, but our currencies are close enough to par as to make no odds.  Also, yes, the Canadian number looks a little low; I think it’s because the CAUT Annual Digest – the source for the data – includes all Presidents, even if they are not serving a full year, so there are a few “partial” salaries which bring the average down.  My guess is that if you exclude those, you end up with a figure closer to New Zealand’s.  But we’re nowhere near our neighbours to the South.  For instance, the base pay of our highest-paid President (David Turpin) would place him about 250th in the US, or 118th among public university Presidents.  (Santa Ono, in case you’re wondering, took a cut in base pay of about 20% to move from Cincinnati to UBC).

I suppose one might argue that this is all a kind of “if all your friends were jumping off a bridge” argument.  There are other comparators one could use: hospital executives and senior public servants are the most obvious ones.  But even here, I’m not sure the comparison is a great one.  Neither of those jobs requires raising revenue from market and philanthropic sources to the extent a university Presidency does.  After all, with provincial government funds now providing less than half of university funding in many cases, one could argue that fairer comparisons might be with private industry.  By this logic, some university Presidents – those, say, in Quebec or Newfoundland, where the provincial government foots well over half the bill – might be adequately or even over-paid, but equally the President of a place like U of T ($3 billion in revenue, less than a quarter of which is provincial grant) might be seen as grossly underpaid.

(I think this is probably right, btw.  I’d argue that the real scandal in Presidential salaries is how narrowly banded they are.  The gap in executive pay between, say, UQ Abitibi-Temiscamingue and UBC should be a lot bigger, because the latter is a hell of a lot bigger job.)

The other favourite comparator is of course the Prime Minister/Premier (note that in the UK, the government is now proposing legislation to effectively prohibit university vice-chancellors from making more than the Prime Minister).  After all, he/she is in charge of the Whole Damn Country/Province, why should anyone in a public position make more?  There is some force to this (though one could apply it to the private sector too), but of course PMs and Premiers tend to make a lot of money in what amounts to deferred compensation – making speeches, sitting on corporate Boards, ambassadorships, etc.  But if there is one thing that drives people crazier about university Presidents than their salaries, it’s the idea of deferred compensation (as the Samarasekera snit-fit shows).

Presidents are mostly former academics.  Part of the academic compensation package is sabbaticals – time off from teaching every seventh year to pursue research interests.  As far as I can tell, the idea of “administrative leave” – that is, a fully-paid year off after a five-year term in administration – was originally thought of as analogous in terms of compensation.  It’s not a perfect match, of course – a year off after five years instead of six, full pay instead of 90% pay, etc. – but it’s close.

But some things about deferred pay weren’t analogous.  Turning the leave into lump-sum cash payments for instance.  David Johnston made off with over $1 million that way when he went to become Governor General without anyone saying “boo”, but when Amit Chakma tried the same thing at Western he got roasted.  So no one does that anymore.  In fact, a couple of contracts I’ve seen recently limit deferred compensation to a single year, even if the President serves for more than one term as President.

Are these perks similar to those seen in other countries, or are they Canadian universities’ way of surreptitiously bumping executive pay?  I can find no evidence of this kind of compensation in either the UK or Australia (which doesn’t mean it doesn’t happen; just that a couple of minutes’ googling on my part came up dry).  However, in the US, this kind of thing most definitely happens, and on a much greater scale.

Just to take a couple of examples found with a minimum of research: at the University of Florida, the former President got five years of deferred pay equal to his Presidential salary, though it was structured as a non-compete payment.  At the University of Michigan, the President receives one year’s deferred compensation if he stays at the institution – but he also is guaranteed a $2 million fund to start up a new laboratory.  In many cases, US universities (public and private) don’t offer salaried administrative leave, but do offer boatloads of “deferred payments” which are booked in the year they are earned, and counted separately from base salary (some examples here on p.3).  More broadly, US university Presidents also seem to benefit from a variety of other perks which may not be available to Canadian ones.

In other words, while our Presidents are well-paid, there’s no obvious reason to think they are vastly overpaid either.  And if we have a problem with the idea of deferred compensation, fine: just fold that extra compensation into their salaries during their term and be done with it.  Presidents could then save it, spend it, do what they want with it.  This arrangement would be clearer, cleaner and more transparent than what we do now.


September 15

Why our Science Minister is Going to be Disappointed in Statscan

Last week Statscan sent me a consultation form asking my opinion of their ideas on how to change UCASS (the University and College Academic Staff Survey, which, like most Statscan products containing the word “college”, does not actually include the institutions most of us call “colleges”, i.e. community colleges).  I’ve already said something about this effort back here, to the effect that focussing so much energy on data collection re: part-time staff is a waste of time, but the consultation guide makes me think Statscan is heading into serious trouble with this survey reboot for a completely different set of reasons.

Remember that when the money for all this was announced, the announcement was made by our Minister of Science, Kirsty Duncan.  One of her priorities as Minister has been to improve equity outcomes in scientific hiring, particularly when it comes to things like Canada Research Chairs (see here, for instance).  The focus of her efforts has usually been gender, but she’s also interested in other equity populations – in particular, visible minorities, Indigenous peoples, and persons with disabilities.  So one of the things she charged Statscan with doing in this revived UCASS (recall that Statscan cut the program for five years as a result of Harper-era cuts) is to help shine a light on equity issues in terms of salaries, full-time/part-time status, and career progression.

This is all fine except for one tiny thing.  UCASS is not a questionnaire-based instrument.  It’s an administrative survey.  That means institutions fill in a complicated set of sheets to provide Statscan with hundreds of different aggregated data cuts about their institution (what is the average salary of professors in Classics?  How many professors in chemical engineering are female?  Etc.).  In order to use UCASS to address the demographic questions Duncan wants answered, institutions would first need to know the answers themselves.  That is, they would need to know precisely which instructors have disabilities, or which are “visible minorities”, just as they currently know everyone’s gender.  Which means they would need to find a way to make such disclosures mandatory; otherwise they would not be able to report to Statistics Canada.

I tried this idea out on my twitter audience over the weekend.  Let’s just say it did not go over very well.  A significant number of responses were, essentially: “over my dead body do I give this information to my employer.  If Statscan wants to know this, they can ask me directly.”

Well, yes, they could I suppose, but then the resulting data couldn’t be linked to administrative information on rank and salary without getting each faculty member’s permission, which I can see not always being forthcoming.  In addition, you’d have all sorts of non-response bias issues to deal with, especially if they tried to do this survey every year – my guess is most profs would simply ignore the survey after year 2.  And yes, you’d have to do it frequently because not all disabilities are permanent.

Here’s my suggestion.  Statscan should actually do two surveys.  Keep UCASS more or less the way it is, extend it to colleges (some of which will take a decade to report properly, but that’s life) and part-timers (if they must – frankly, I think more people would be interested in data on non-academic staff than in data on part-time staff), but don’t mess around with the basic structure or try to force professors into reporting their demographic characteristics – other than gender, which is already in there – to their employers, because that’s just more trouble than it’s worth.  Then, every five years or so, do a second survey in which you take a demographic snapshot of the professoriate as a whole.  It will have mediocre completion rates, but it’s better than nothing.

(In fact, universities and colleges could do this themselves if they wanted to at a cost much lower than whatever Statscan will end up paying, but since they almost never collaborate on creating public data without a gun to their heads it seems like some federal intervention is inevitable if anyone wants this done).

This is not what Minister Duncan asked for, I know.  But it’s the only way to get her the data she wants without causing mayhem on campuses.  Hopefully, pragmatism will prevail here.

September 14

Notes on the Finances of China’s Top Universities

One of my distractions over the past summer has been to learn more about Chinese universities.  And, fortunately, this is becoming a lot easier as Chinese universities are starting to put more of their data online.  Today, I just want to take you through a bit of a tour of China’s top universities (roughly the equivalent of the US Ivy League), which are known as the “C9”, most of which now put their financial data online.

So let’s start just by looking at raw annual expenditures (I prefer using expenditures to income as a guide to university size, because expenditures tend to be more stable year-to-year) at these top universities.  Figure 1 shows this by institution for the 2015 calendar year.  Tsinghua leads the pack by a wide margin, at a little over RMB 13 billion.  Peking, Zhejiang and Shanghai Jiao Tong are next at between RMB 8 and 9 billion, followed by Fudan and Xi’an Jiao Tong at between RMB 5 and 6 billion.  The bottom positions are held by the two C9 universities which do not report to the higher education ministry: the University of Science and Technology of China (Chinese Academy of Sciences) and the Harbin Institute of Technology (Ministry of Industry and Information Technology), at RMB 3.4 billion and RMB 2.2 billion, respectively.

Figure 1: Expenditures, in Billions of RMB, Top Chinese Universities, 2015

One interesting piece of information about these institutions is how little of their annual budget actually comes from government.  Figure 2 shows government appropriations as a percentage of annual expenditures (Harbin Institute of Technology is excluded because its financials do not distinguish between public and private sources of revenue).  As it turns out, top Chinese universities actually look a lot like Ontario ones in that they tend to get less than half their money from government.  That said, at most institutions student fees only account for about 15% of total revenue.

Figure 2: Government income as a % of total expenditures, Top Chinese Universities, 2015

Now at this point you may be wondering: RMB 13 billion… is that a lot?  A little?  What’s the frame of reference here?  Well, fair enough.  Let’s put all this into US dollars, just so we’re clear.  And for reference, let’s throw in data for Harvard, Berkeley, U of T and UBC for 2015-16 for comparison.  To do this, I’m converting to USD at the mid-2015 exchange rate of RMB 6.21 = CDN $1.29 = USD $1.  The results are shown in Figure 3: by this measure, only Tsinghua is really up in the North American big leagues.

Figure 3: Total Expenditures, in USD, Top Chinese Universities plus US/Canada Comparators, 2015

But hang on a second.  What if we use purchasing-power parity instead of exchange rates?  Well, actually, this changes things more than you’d think.  Figure 4 shows what happens if you convert the data at the mid-2015 Big Mac Index rate of RMB 3.55 = CDN $1.22 = USD $1.

Figure 4: Total Expenditures, in billions of USD at PPP, Top Chinese Universities plus US/Canada Comparators, 2015

Once adjusted for PPP, Tsinghua moves closer to Harvard, and the next three are more obviously in the big leagues, having all passed UBC.  Now, in fact, PPP probably overstates universities’ buying power somewhat, because for many of the goods universities purchase (top professors, scientific equipment, etc.), the price is global rather than local.  So if you want to think about relative purchasing power, a fair comparison between the institutions is probably somewhere between figure 3 and figure 4.
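The two conversions are just the same division with different denominators; a minimal sketch, using the rates and the approximate Tsinghua figure quoted above:

```python
FX_RATE = 6.21    # RMB per USD, mid-2015 market exchange rate
PPP_RATE = 3.55   # RMB per USD, mid-2015 Big Mac Index rate

def to_usd_billions(rmb_billions, rate):
    """Convert an RMB figure (in billions) to USD billions at a given rate."""
    return rmb_billions / rate

tsinghua = 13.0  # approximate 2015 expenditures, RMB billions
print(round(to_usd_billions(tsinghua, FX_RATE), 2))   # 2.09 at market rates
print(round(to_usd_billions(tsinghua, PPP_RATE), 2))  # 3.66 at PPP
```

The gap between the two numbers is the whole PPP story: the same RMB budget buys roughly 75% more at Big Mac prices than at market exchange rates.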

(If we were to do this from the perspective of “how big is each institution relative to the size and development of the economy” – that is, adjusting for GDP per capita – all the Chinese institutions would rise by a factor of four relative to American ones; i.e., Tsinghua would be three times as large as Harvard.)

Now, what about dollars per student?  For this, I take the student numbers the institutions report to Quacquarelli Symonds (QS) for use on its “top universities” website.  You can take these with a grain of salt: I can’t get QS’ numbers to line up with the data I have directly from any of these institutions, but it’s the most consistent thing we’ve got, so we’ll just have to live with it.

Figure 5: Expenditures per Student, in USD at PPP, Top Chinese Universities plus US/Canada Comparators, 2015

Now Tsinghua is much more clearly in an Ivy-League-approaching kind of position, with expenditures of over $100,000 per student.  That’s not near Harvard, which spends about twice that, but it is a full 25% higher than Berkeley and 150% higher than UBC and Toronto.  Even the Chinese second-tier trio of Shanghai Jiao Tong, Peking and Zhejiang are spending 50% more per student than the top Canadian universities.

In short, the top Chinese universities aren’t, as it is sometimes said, “rising”.  Financially, they’re already comfortably part of the world elite.

September 13

Some Curious Data From OECD Education at a Glance 2017

The OECD put out its annual Education at a Glance publication yesterday.  No huge surprises, except that they appear to have killed one of the most-used tables in the whole book (A.1.2, which compared tertiary attainment rates for 25-34 year olds by type of tertiary program – i.e. college vs. university), which is an enormous bummer.  The finance data says what it pretty much always says: Canada is the #2 spender overall on higher education at 2.6% of GDP (just behind the US at 2.7%).  If you read my analysis last year, the story is still pretty much the same this year.

But there are some interesting nuggets buried away in the report nevertheless – stuff that other media won’t pick up.  I thought I would highlight two of them in particular which pose some thorny questions about Canadian statistical data and what we think we know about higher education.

Let’s start with the data on expenditures per pupil at the tertiary level.  Figure 1 looks at costs in Short-cycle Tertiary Education (meaning career-oriented education, which in Canada’s case means community colleges).

Figure 1: Total Expenditures per Student, Colleges (or equivalent), Select OECD countries

Among major countries, Canada spends the most (from both public and private sources) per college or college-equivalent student.  A couple of countries actually do outspend us (the Austrians and – totally out of left field – the Czechs), but the important point here is that our expenditures are nearly 40% above the OECD average.  And if you’re wondering why the UK and the US aren’t there, it’s because the former has no college equivalent and the latter chooses not to report on colleges, on the batshit-crazy grounds that even if you’re studying for a (college-equivalent) associate’s degree, the fact that this can be laddered up into a full bachelor’s means everything is really degree-level.  Nonsense, I know, but there we are.

Now, let’s do the same with universities:

Figure 2: Total Expenditures per Student, Universities, Select OECD countries

There’s not much in figure 2 we didn’t already know: the US and Canada sit at the top in total expenditure per university student, with Canada over 50% above the OECD average, and Korea way down at the bottom because the Koreans do everything in higher ed on a shoestring.

Now, one new little detail that OECD has added to Education at a Glance this year is that it splits out the portion of total expenditures (that is, combined short-cycle and degree-level) which is devoted to R&D.  And this data is a little odd.

Figure 3: Total R&D Expenditures per Tertiary Student, Selected OECD Countries

There’s nothing obviously egregiously wrong with figure 3 – except for the data on the USA, which is bananas.  Read literally, it suggests that Canadian universities on average spend twice as much on R&D as American ones do and that’s simply false.

(The explanation, I think, is that Canada and possibly some other countries claim that all professors’ time spent on research – notionally 40% of time or thereabouts – counts as “R&D”.  Americans, by contrast, claim that their universities – which only pay staff for nine months a year, with the rest of the time notionally off for research – do not count time that way, preferring to claim that the government is buying profs’ time with research grants.  Basically, they view universities as mailboxes for cheques that pay for staff time, and so all that money gets claimed as government expenditure on R&D, not university expenditure on R&D.  GERD, not HERD, in the innovation policy lingo.  I think, anyway.)

What’s actually a little crazy about figure 3 is that the denominator is all tertiary students, not just degree-level students.  And yet we know that R&D money is pretty heavily concentrated (98%+) in universities.  In a country like Germany where over 99% of tertiary students are in degree-level institutions, that’s not a big deal.  But in Canada, about a third of our students are in short-cycle programs.  Which means, if you do the math, that in fact the R&D expenditures per university student are a little ways north of $9750.  Now here’s figure 3 again, with just degree-level students in the denominator.

Figure 4: Total R&D Expenditures per University Student, Selected OECD Countries
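The arithmetic behind moving from figure 3 to figure 4 is simple enough to write down. The one-third short-cycle share is from the text above; the per-tertiary-student figure is a back-calculated assumption chosen to land on the post’s roughly $9,750 result:

```python
# Re-basing R&D spending from "per tertiary student" to "per university student".
# short_cycle_share (~1/3 for Canada) is from the post; the $6,500 input is
# an assumed back-calculation, not official data.

def rnd_per_university_student(rnd_per_tertiary, short_cycle_share):
    # Nearly all R&D money sits in universities, so spreading it only over
    # degree-level students raises the per-student figure proportionally.
    return rnd_per_tertiary / (1 - short_cycle_share)

canada = rnd_per_university_student(6_500, 1 / 3)
print(round(canada))  # -> 9750, i.e. "a little ways north of $9,750"
```

For a country like Germany, where the short-cycle share is under 1%, the correction is negligible – which is exactly why the choice of denominator flatters some countries and not others.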

And of course, subtracting these numbers means we can revisit figure 2 and work out total non-R&D expenditures per student in universities.  Canada still remains 40% or so ahead of the OECD average, but is now about that far behind the US in per-student expenditure.

Figure 5: Total non-R&D Expenditures per University Student, Selected OECD Countries

Now, to be clear: I’m not saying OECD is wrong, or Statscan is wrong, or anything else like that.  What I’m saying is that there appear to be major inconsistencies in the way institutions report data for international comparative purposes on key concepts like R&D.  And this particular inconsistency means that Canada at least (and possibly others) looks a lot better vis-à-vis the United States than it probably should.

Just something to keep in mind when making comparisons in future – particularly around research expenditures and performance.

September 12

NDP Leadership Race Notes

So the deadline to sign up for the federal NDP leadership passed a couple of weeks ago, and the first deadline for the mail-in ballots is next Monday.  So what to make of the four candidates and their views on post-secondary education?   Based on their platforms and a series of responses to a questionnaire on Science policy from Evidence for Democracy (responses available here), my take is as follows:

Jagmeet Singh.  Nothing.  He has a lot of policy proposals on various topics but effectively nothing on skills, education and how to pay for them.  He is also the only one of the four candidates who specifically avoided making any commitments at all with respect to the Naylor Report.

Charlie Angus.  On the skills side, Angus says he would “establish a labour market partners’ forum so government can work with labour and other stakeholders to develop programs and make Canada’s labour market development programs more accessible by lowering the eligibility requirement.”  I am not entirely sure what this means, though the use of the term “eligibility requirements” seems to imply that he’s talking about skills acquisition as being entirely tied to Employment Insurance, which is somewhat restrictive (even though Angus does simultaneously promise to make it easier to qualify for EI).

On post-secondary generally, Angus says he would work “towards a comprehensive education accord with the provinces that eliminates tuition, ensures adequate funding for research, sets standards for mental health and sexual assault policies, and improves working conditions for students, staff and adjunct or contract faculty on campus,” which suggests ambition if not a totally firm grasp on how federalism works (also: no price tag attached).  He also says he wants to eliminate interest on Canada Student Loans (bad idea), put new money into PSSSP for Indigenous students and extend it to include bridging programs, increase weekly loan limits for all students and better harmonize federal & provincial retraining programs (all excellent ideas).  And finally, with respect to Science, Angus is pro-Naylor (committed to implementing the report, full stop) and anti-superclusters.

Guy Caron.  For a former CFS chair, Caron is awfully quiet about PSE (then again, as an MP from Quebec, his perspective may have changed somewhat).  From an income standpoint, his Basic Income scheme – everyone over 18 gets their income topped up to at least equal Statistics Canada’s Low-Income Cut-Off – would pretty much take care of the need to increase student aid any time soon.  But also in Caron’s platform is a genuinely intriguing mention of an “Activity Account for Lifelong Learning”, which he describes thusly: “financed by contributions from workers, employers, and the federal government, the account will enable its holder to finance lifetime learning and job retraining. It would be portable so that if the individual moved or switched jobs, the account would migrate with them”.  The notion is not developed further, so it’s hard to say exactly what’s intended, but it sounds a lot like a mix of CPP/EI (compulsory deductions) and RESPs (government top-ups) for personal use.  In principle there’s much to like about this kind of idea, though it’s worth remembering that a badly-implemented version of it cost the UK government hundreds of millions of dollars back in 2001.  Caron also supports full implementation of the Naylor report.

Niki Ashton.  This is the big one.  Ashton promises to:

  • Eliminate tuition fees, as per the proposal made by the Canadian Centre for Policy Alternatives.  That would cost $3.5 billion, and still depends on a) provinces being willing to pick up half the bill, and b) provinces being willing to accept massively different levels of federal support to do so (basically, provinces currently doing the most would receive the least under this program, leading to the obvious problems I described back here).
  • Reduce tuition for international students to “affordable levels”.  No details as to what possible mechanism would compel institutions or provinces to go along with this, or whether it has even occurred to her that most HEIs would sharply reduce their intake of international students if this ever happened (unless the feds ponied up a couple of extra billion in compensation).
  • Eliminate interest on Canada Student Loans and double the repayment threshold so students do not need to repay loans if earning under $50,000.  It’s hard to tell from the platform, but this looks like a retroactive commitment – that is, one applying to all outstanding student loans.  That’s an expensive commitment, since international evidence shows that raising the threshold usually has significant knock-on effects in terms of lifetime repayment rates.
  • Increase funding for Aboriginal PSE.  Basically the promise here is to fulfill the TRC recommendation to get rid of the 2% cap (which the Trudeau government already ditched last budget), fund the backlog of First Nations applicants and include Metis students in this funding arrangement.
  • Increase funding for graduate students and “equalize research funding across disciplines”.  My interpretation of this is that it means increasing the SSHRC budget relative to those of NSERC and CIHR, but it’s not 100% clear.
  • With respect to Naylor’s recommendations, Ashton pointedly says she is committed to “addressing” them, but carefully avoids any comment at all on the big issue of a $1.3 billion increase in funding.  The bits she likes involve “re-balancing” funding and handing more money to grad students, post-docs and early-career scientists.  If one were being uncharitable, one might suspect that she cares about government funding for science mainly as an income support mechanism for scientists rather than as a means of actually performing scientific endeavours.

No argument from me on the Indigenous funding, but apart from that, my comments on Ashton’s platform are largely the same ones I had on the Green Party platform in the 2015 election (to which this bears more than a passing resemblance): so many billions of dollars, and not one of them going to increase the quality of provision or increase the number of student seats.  It’s all about cheaper.  Such a waste.

Anyways, if I’m ranking these platforms, Angus probably edges it.  His PSE accord idea is unworkable, but the pledges on Indigenous education and harmonizing training funding are good.  Caron would come second for the originality of his learning account idea.  Ten points to Ashton for thinking PSE is important, another ten for her position on Indigenous education but minus several hundred for the actual, wasteful substance.  Singh is simply missing in action.

The first round of voting takes place October 2nd; should further rounds be required, they will take place over the following two weekends.  Best of luck to all.

September 11

The Growing Importance of Fee Income

I made a little remark last week to the effect that on present trends, student fees would pass provincial funding as a source of revenue for universities by 2020-2021 and combined fed-prov government funding by 2025.  Based on my twitter feed, that seems to have got people quite excited.  But I should have been a little clearer about what I was saying.

First of all, by “on present trends”, I literally meant do the simple/stupid thing and take the annual change from 2014-15 to 2015-16 and stretch it out indefinitely.  One could use longer-term trends but for provincial government funds, the difference is minuscule because the 1-year and 5-year trends are pretty similar.  It’s harder to do that with the federal money because it jumps around a lot on an annual basis (is there a federal infrastructure program in a given year?  Have they given a one-time bump to granting council dollars?  etc.) and so medium term trends are harder to discern.   Second, when I said it would pass government funding, I meant for the entire budget, not just the operating budget (feds don’t really contribute to operating budgets).  And third, I was speaking in terms of national averages: regional averages vary considerably and in some provinces, fees passed government grants as a source of income some time ago.

Anyways, I thought it would be fun to do some inter-provincial comparisons on this.  To make things simple, I’m going to exclude federal funds from the exercise, and just look at provincial grants and student fees.  As previously, the data source is the Statcan/CAUBO Financial Information of Universities and Colleges Survey.

Let’s start by looking at how grants and fees compare to the size of the operating budget of universities in each province.

Figure 1: Provincial grant and fee income as a percentage of operating income, by province, 2015-16

Now, remember: some provincial grant and fee income goes to areas other than the operating budget, and operating income is not restricted to just student fees and government grants.  Thus, you shouldn’t expect the two sets of lines to add up to 100%: in some cases they add to more than 100%, in some cases less.  But no matter – the point here is that already in 2015-16, fees represent a greater portion of the operating budget than government grants in Ontario, and an equal proportion in Nova Scotia.  In BC and PEI, fee and grant income are close-ish, but in the other six provinces government grants predominate.

Now let’s look at the five-year percentage change in income, in real dollars, from fees and grants.  This one is kind of complicated, so bear with me.

Figure 2: Change in income from provincial grants and student fees, by province, 2010-11 to 2015-16

There are seven provinces which share a pattern: increasing real fee income and decreasing real provincial grant income, though the extent varies.  The biggest shifts here are in Ontario and BC.  Quebec is the only province which has seen an increase in income from both sources.  In all eight of these provinces, we can do straight-line projections of the future pretty easily.

But then there are two provinces – Newfoundland and New Brunswick – which have seen net decreases in both sources of income.  Basically, this is what happens when a demographic collapse happens at the same time as a fiscal collapse.  In per-student terms this doesn’t look quite so bad because enrolments are declining, but since staff don’t get paid on a per-student basis that doesn’t help much when it comes to paying the bills.  It’s hard to do straight-line projections with these two because it’s quite clear the fee income declines aren’t going to continue indefinitely (the demographic collapse stabilizes, eventually).  So we’re going to say good-bye to these two for the rest of this analysis, while wishing them the very best in dealing with their rather significant challenges.

Ok, for the remaining eight provinces, let’s combine the info in those last few graphs.  Let’s take the income-by-source data in figure 1, and then apply the trend changes in figure 2 to each province.  The easiest way to show this in a graph is to express fee income as a percentage of provincial grant income.  We can project this out to 2024-25, as seen below in figure 3.

Figure 3: Projected ratio of student fee income to government grant income to 2025, by province
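For the curious, the mechanics of this kind of projection are trivial to reproduce. Here’s a minimal sketch for a single hypothetical province; the method (extend each income stream linearly and find the crossover year) is the one described above, but every dollar figure below is invented for illustration:

```python
# Straight-line projection of provincial grant vs. fee income.
# All dollar figures are hypothetical (real $ millions); only the method
# mirrors the post's approach.

def project(base, annual_change, years):
    """Extend a base-year value linearly over a number of years."""
    return [base + annual_change * t for t in range(years + 1)]

# Hypothetical province: grants falling, fees rising, from a 2015-16 base.
grants = project(1_000, -15, 10)
fees   = project(850, 40, 10)

# First year in which fee income matches or passes grant income.
crossover = next(t for t, (g, f) in enumerate(zip(grants, fees)) if f >= g)
print(crossover)  # years after the base year until fees pass grants
```

With these made-up trends the crossover arrives after three years; in the real data the same calculation produces wildly different answers by province, which is exactly the heterogeneity figure 3 shows.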

What figure 3 really shows is that Canada is heading towards a much more financially heterogeneous higher education system.  For the country as a whole, fee income for universities should surpass provincial government grants in 2020-21.  But this masks huge variation at the provincial level.  Ontario and Nova Scotia (by now) already exceed that level.  BC will get there in three or four years, PEI will get there by 2024-25.  But the other provinces aren’t on track to hit that level until 2030 at the earliest (and in Quebec’s case it’s about 2055).

Another way to think of this is that in about a decade’s time, the funding landscape in places like Quebec, Manitoba and Saskatchewan is going to look the way it did in Ontario ten to fifteen years ago.  At the same time, Ontario’s funding landscape is going to look a lot like that of big American public schools, with less than 30% of the operating budget (and probably something like 15% of total funding) coming from provincial governments.  Differing incentives facing different universities mean they are probably going to be run quite differently too: expect a greater variety of institutional cultures as a result.

Now, as with any straight-line projection, you should take the foregoing with a healthy amount of salt.  Politics matter, and funding trajectories can change.  This is one possible scenario, not necessarily the most likely but simply the one most in line with current trends.

But keep in mind that the above is probably the good news scenario for Ontario.  The bad news scenario would see the percentage of funds coming from fees restricted not by increasing the government grant, but by restricting student intake, or the intake of international students (which is where the big gains in fees are really coming from).  So even if you find this scenario disturbing: be careful what you wish for.

September 08

Data on Sexual Harassment & Sexual Assault in Higher Ed-an Australian Experiment

Earlier this year, I raged a bit at a project that the Ontario government had launched: namely, an attempt to survey every single student in Ontario about sexual assault in a way that – it seemed to me – was likely to be (mis)used for constructing a league table of which institutions had the highest rates of sexual assault.  While getting more information about sexual assault seemed like a good idea, the possibility of a league table – based as it would be on a voluntary survey with pretty tiny likely response rates – was a terrible idea which I suggested needed to be re-thought.

Well, surprise!  Turns out Australian universities actually did this on their own initiative last year.  They asked the Australian Human Rights Commission (AHRC) to conduct a survey almost exactly along the lines I said was a terrible idea. And the results are…interesting.

To be precise: the AHRC took a fairly large sample (a shade over 300,000) of university students – not a complete census the way Ontario is considering – and sent them a well-thought-out survey (the report is here).  The response rate was 9.7%, and the report authors quite diligently and prominently noted the issues with data of this kind – the same issues that bedevil nearly all student survey research, including things like the National Survey of Student Engagement, the annual Canadian University Survey Consortium studies, etc.

The report went on to outline a large number of extremely interesting and valuable findings.  Even if you take the view that these kinds of surveys are likely to overstate the prevalence of sexual assault and harassment because of response bias, the data on things like the perpetrators of assault/harassment, the settings in which it occurs, the reporting of such events and the support sought afterwards are still likely to be accurate, and the report makes an incredible contribution by reporting these in detail (see synopses of the report from CNN and Nature).  And, correctly, the report does not reveal data by institution.

So everything’s good?  Well, not quite.  Though the AHRC did not publish the data, the fact that it possessed data which could be analysed by institution set up a dynamic in which, if the data weren’t released, there would be accusations of cover-up, suppression, etc.  So the universities themselves – separately from the AHRC report – decided to voluntarily release their own data on sexual assaults.

Now I don’t think I’ve ever heard of institutions voluntarily releasing data on themselves which a) allowed direct comparisons between institutions b) on such a sensitive subject and c) where the data quality was so suspect.  But they did it.  And sure enough, news agencies such as ABC (the Australian one) and News Corp immediately turned this crap data into a ranking, which means that for years to come, the University of New England (it’s in small-town New South Wales) will be known as the sexual assault capital of Australian higher education.  Is that label justified?  Who knows?  The data quality makes it impossible to tell.   But UNE will have to live with it until the next time universities do a survey.

To be fair, on the whole the media reaction to the survey was not overly sensationalist.  For the most part, it focussed on the major cross-campus findings and not on institutional comparisons.  Which is good, and suggests that some of my concerns from last year may have been overblown (though I’m not entirely convinced our media will be as responsible as Australia’s).  That said, for data accuracy, a much smaller sample with incentives to produce a much higher response rate would still yield a result with much better data quality than what the AHRC got, let alone the nonsensical census idea Ontario is considering.  The subject is too important to let bad data quality cloud the issue.


Erratum: There was a data transcription error in yesterday’s piece on tuition.  Average tuition in Alberta is $5749 not $5479, meaning it is slightly more expensive than neighbouring British Columbia, not slightly less.
