Higher Education Strategy Associates

May 16

Deans and Multiple Personality Disorders

Imagine two scenarios.  In the first, an academic is threatened with termination if he/she speaks out publicly against the university’s proposed strategic plan.  In the second, a manager is fired for disobeying a direct order from a superior and publicly running down the company he/she works for.  For most readers, I’d guess, the first scenario is abhorrent, and the second quite understandable (if perhaps somewhat harsh).  Yet both scenarios describe precisely what happened to the University of Saskatchewan’s Dean, Robert Buckingham.

The Buckingham incident goes to the heart of a real, live issue in Canadian universities: for whom do deans work – the President and Provost, or the faculty?  Are they management’s tool to keep faculty in line, or do they represent the interests of their faculty in the halls of power?

I don’t think there’s much doubt, in a legal sense, that Deans answer to senior management rather than faculty.  But the way Deans are chosen usually incorporates a large amount of feedback from professors in the faculty concerned, who want to make sure that the Dean is – to the extent possible – simpatico with their interests.  And whether the Dean is a likable figure or not, he/she is very much expected to fight for the interests of that faculty and its members when it comes to things like resource allocation.

So, to Saskatoon where, as part of the university’s restructuring process, the 5-year-old School of Public Health Buckingham headed was slated, along with the School of Dentistry and the College of Medicine, to become part of an enlarged Faculty of Medicine.  The School, which at least in its own eyes is pretty hot stuff, having just received European accreditation for its program, was less than thrilled with the notion of being under the same roof as the College of Medicine, which has had a rough time with accreditation issues for the past few years.

Buckingham fought his corner spiritedly but quietly for several months.  When Deans were recently told that the time for chat was over, and it was time for all the managers to fall in line, Buckingham chose not to do so.  Instead, he wrote a letter (available here) that wound up in the StarPhoenix, in which he effectively implied that: a) the President and Provost lacked courage; and b) the College of Medicine was sub-standard.  Within the next 24 hours, Buckingham was not only removed as Dean, but was also fired as a tenured professor, and escorted from campus.

Now, given the high level of tension on campus, and that Buckingham was only a few weeks away from retirement, it might have made more sense to let this incident go with a reprimand (and indeed, after much media attention, and an emergency meeting called by Advanced Education Minister Rob Norris, the University “reconsidered and reversed” parts of its initial decision).  But make no mistake: in a managerial capacity, it was a fireable offence.  You can’t have your Deans going off and running down their colleagues’ departments in public.

Simply put, the freedom of comment that one has as a faculty member doesn’t apply to management.  Buckingham’s line about “I’ve never seen academics be silenced like this” is somewhat disingenuous: Deans are management, and are held to a different standard.  Saskatchewan was within its rights to ditch him as a Dean; where it overstepped – and has since walked back its decision – was in firing him as a professor, because that raises legitimate issues of academic freedom.  As far as I know, no professor had been dismissed for speaking out about university management since Norman Strax at UNB in 1968, and that’s not a place we want to go back to.

Both sides stepped over the line here, but it’s easy to see how it happened, and how it is likely to happen again.  At the end of the day, deans’ identities and allegiances are split between their role as academics and their role as administrators.  It’s a thankless and occasionally dangerous position.

May 15

Does More Information Really Solve Anything?

One of the great quests in higher education over the past two decades has been to make the sector more “transparent”.  Higher education is a classic example of a “low-information” economy.  As in medicine, consumers have very limited information about the quality of providers, and so “poor performers” cannot easily be identified.  If only there were some way to provide individuals with better information, higher education would come closer to the ideal of “perfect information” (a key part of “perfect competition”), and poor performers would come under pressure from declining enrolments.

For many people, the arrival of university league table rankings held a lot of promise.  At last, some data tools with some simple heuristics that could help students make distinctions with respect to quality!  While some people still hold this view, others have become more circumspect, and have come to realize that most rankings simply replicate the existing prestige hierarchy because they rely on metrics like income and research intensity, which tend to be correlated with institutional age and size.  Still, many hold out hope for other types of information tools to provide this kind of information.  In Europe, the big white hope is U-Multirank; in the UK, it’s the “Key Information Set”; and in Korea, it’s the Major Indicators System.  In the US, of course, you see the same phenomenon at work with the White House’s proposed college ratings system.

What unites all of these efforts is a belief that people will respond to information, if the right type of information is put in front of them in a manner they can easily understand and manipulate.  The arguments have tended to centre on what kind of information is useful and available, and on the right way to display and format the data.  But a study out last month from the Higher Education Funding Council for England asked a much more profound question: is it possible that none of this stuff makes any difference at all?

Now, it’s not an empirical study of the use of information tools, so we shouldn’t get *too* excited about it.  Rather, it’s a literature review, but an uncommonly good one, drawing significantly from sources like Daniel Kahneman and Herbert Simon.  The two key findings (and I’m quoting from the press release here, because it’s way more succinct about this than I could be) are:

1) that the decision-making process is complex, personal and nuanced, involving different types of information, messengers and influences over a long time. This challenges the common assumption that people primarily make objective choices following a systematic analysis of all the information available to them at one time, and

2) that greater amounts of information do not necessarily mean that people will be better informed or be able to make better decisions. 

Now, because HEFCE and the UK government are among those who believe deeply in the “better data leads to better universities via competition” model, the study doesn’t actually say “guys, your approach rests on some pretty whopping and likely incorrect assumptions” – but the report implies it pretty strongly.

It’s very much worth a read, if for no other reason than to remind oneself that even the best-designed, most well-meaning “interventions” won’t necessarily have the intended effects.

May 14

Trends in Applications

Some interesting trend data to review from Ontario today.

First, there’s the fact that applications from secondary schools have dropped by 3% this year, from 92,892 to 89,609 (as of the February snapshot, which for most purposes is as good as the final numbers, since something like 95% of all applicants apply before the end-of-January deadline).  This is a moderately big deal, since it’s the first time since the double cohort that numbers have fallen.

Figure 1: Applications from Secondary Schools by Year, Ontario, 2004-14

Some university officials have waved this away as a result of declining population, but there’s no evidence that the population of 18-year-olds has fallen by 3%.  Statscan hasn’t published data for 2014 yet, but between 2010 and 2013 the number of 18-year-olds actually increased by 2%, even though the agency’s population projections had suggested their numbers would fall somewhat.  In the chart below, which shows the ratio of secondary-school applicants to 18-year-olds over time, I average Statscan’s projection with the actual annual increases of the past four years, and assume a fall of 1.6% in 2014.  So even accounting for population change, university applicant numbers are still down.

Figure 2: Applications from Secondary School as a Percentage of 18-Year-Olds, Ontario, 2004-14

The fact that the percentage of 18-year-olds applying to Ontario universities has fallen is notable, but we shouldn’t overstate the implications.  In the first place, it hasn’t fallen far – just back to where it was in 2012.  Second, these numbers are only for Ontario applicants; they don’t include international students, whose numbers are still rising.  Fact is, most institutions will be OK for a while yet.

More interesting, perhaps, is what’s going on with applications by field of study.  Check out, for instance, what’s happening with the “big four” fields, which account for slightly over 70% of all enrolments.  Applications to Arts subjects have been falling for some time: in 2003, 35% of all applications were to Arts Faculties; now it’s just 27% (albeit of a much larger applicant pool – in absolute numbers, they are about where they were in 2003).  Science and Business have more or less kept their share of applications steady over time, while Engineering has seen its share grow from 8% to 11%.  That might not sound like much, but in absolute terms it represents an increase of 81%, from 5,515 to 9,984.
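A quick arithmetic check, sketched in Python using only the engineering figures quoted above, shows how a three-point gain in share translates into an 81% absolute increase when the overall pool is growing:

```python
# Share vs. absolute numbers: a small share gain can hide a large
# absolute increase when the applicant pool is expanding.
# Figures are the Ontario engineering applications quoted in the text.
eng_start = 5_515   # engineering applications at the start of the period
eng_end = 9_984     # engineering applications at the end of the period

pct_increase = (eng_end - eng_start) / eng_start * 100
print(f"Absolute increase: {pct_increase:.0f}%")  # prints "Absolute increase: 81%"
```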

Figure 3: Arts, Science, Business, and Engineering Applications as Percentage of Total, Ontario, 2004-14

But look a little more closely at the data, at some of the smaller fields of study, and you can see some really amazing shifts in numbers.  Nursing, by some distance, is the “hot” discipline (not surprising, given the 100% placement rate and the $50K-plus starting salaries), with applications increasing by close to 150%.  Social Work has seen applications double, and Math applications are up almost 90%.  Fine Arts applicant numbers have stayed very stable over the past decade; only Journalism has seen a major negative shock, with applications down by over a third from their 2008 peak.

Figure 4: Changes in Application Numbers, Selected Fields of Study, Ontario, 2004-14, Indexed to 2004

The disciplinary shifts are of significantly more importance than one-year changes in total applications.  They show that, over time, students do in fact respond to changes in labour-market conditions, but that it may take a few years for the response to become evident.  Quite properly, students might want to see sustained evidence of change before committing to a different field of study.  And that’s a good thing, whatever the usual labour-market whiners might say.

May 13

U-Multirank: Game On

Those of you who read this blog for the stuff about rankings will know that I have a fair bit of time for the U-Multirank project.  U-Multirank, for those in need of a quick refresher, is a form of alternative rankings that has been backed by the European Commission.  The rankings are based on a set of multi-dimensional, personalizable rankings data, and were pioneered by Germany’s Centre for Higher Education (CHE).

There is no league table here.  Nothing tells you who is “best”.  You just compare institutions (or programs, though in this pilot year these are still pretty thin) on a variety of individual metrics.   The results are shown as a series of letter grades, meaning that, in practice, institutional results on each indicator are banded into five groups – therefore no spurious precision telling you which institution is 56th and which is 57th.

Another great feature is how global these rankings are.  No limiting to a top 200 or 400 in the world, which in practice restricts comparisons to a certain type of research university in a finite number of countries.  Because U-Multirank is much more about profiling institutions than about creating some sort of horse-race amongst them, it’s open to any number of institutions.  In the inaugural year, over 850 institutions from 70 countries submitted information to the rankings, including 19 from Canada.  That instantly makes it the largest of the world’s major ranking systems (excluding the webometrics rankings).

Of course, the problem with comparing this many schools is that you get a lot of apples-to-oranges comparisons across institutional types.  The Big Three rankings (Shanghai, THE, QS) all sidestep this problem by focussing exclusively on research universities, but in an inclusive ranking like this one it’s a bit more difficult.  That’s why U-Multirank includes a filtering tool, based on an earlier project called “U-MAP”, which helps find “like” institutions based on institutional size, mission, disciplinary profile, etc.

Why am I telling you all this?  Because the U-Multirank site just went live this morning.  Go look at it, here.  Play with it.  Let me know what you think.

Personally, while I love the concept, I think there’s still a danger that too many consumers – particularly in Asia – will prefer the precision (however spurious) and simplicity of THE-style league tables to the relativism of personalized rankings.  The worry here isn’t that a lack of users will create financial problems for U-Multirank – it’s financed more than sufficiently by the European Commission, so that’s not an issue; the potential worry is that low user numbers might make institutions – particularly those in North America – less keen to spend the person-hours collecting all the rather specialized information that U-Multirank demands.

But here’s hoping that’s not true.  U-Multirank is the ranking system academia would have developed itself had it had the smarts to get ahead of the curve on transparency instead of leaving that task to the Maclean’s of the world.  We should all wish it well.

May 12

Non-Lieux Universities: Whose Fault?

About four months ago, UBC President Stephen Toope wrote a widely-praised piece called “Universities in an Era of Non-Lieux”.  Basically, the piece laments the growing trend toward the deracinated homogenization of universities around the globe.  He names global rankings and government micro-management of research and enrolment strategies – usually of a fairly faddish variety, as evidenced by the recent MOOC-mania – as the main culprits.

I’m not going to take issue with Toope’s central thesis: I agree with him 100% that we need more institutional diversity; but I think the piece fails on two counts.  First, it leaves out the question of where governments got these crazy ideas in the first place.  And second, when it comes right down to it, the fact is that big research universities are only against institutional diversity insofar as it serves their own interests.

Take global rankings, for instance.  Granted, these can be fairly reductionist affairs.  And yes, they privilege institutions that are big on research.  But where on earth could rankers have come up with the idea that research is what matters to universities, and that big research = big prestige?  Who peddles that line CONSTANTLY?  Who hires based on research ability?  Who makes distinctions between institutions based on research intensity?  Could it possibly be the academic community itself?  Could it be that universities are not so much victims as culprits here?

(I mean, for God’s sake, UBC itself is a member of the “Research Universities Council of BC” – an organization that changed its name just a few years ago so its members would be sure to distinguish themselves from the much more lumpen new [non-research-intensive] universities, which caucus in the much less grandly-named BC Association of Institutes & Universities.  Trust me – no rankers made them do that.  They came up with this idea on their own.)

As for the argument that government imposes uniformity through a combination of meddling and one-size-fits-all funding models, it’s a point that’s hard to argue with.  Canadian governments are notorious for the way they only incentivize size and research, and then wonder why every university wants to be larger and more research-intensive.  But frankly, this has traditionally worked in research universities’ favour.  You didn’t hear a lot of U15 Presidents moaning about research monocultures as long as the money was still flowing entirely in their direction.  So while Toope is quite right that forcing everyone into an applied research direction is silly, the emergence of a focus on applied research actually has a much greater potential to drive differentiation than your average government policy fad.

So, to echo Toope, yes to diversity, no to “non-lieux”.  But let’s not pretend that the drive to isomorphism comes from anywhere but inside the academy.  We have met the enemy and he is us.

May 09

Who’s Progressive?

To the extent that finances act as a barrier to higher education, they are an obstacle to those without resources – that is, those who tend to come from lower-income backgrounds.  It is, therefore, simply common sense that if you want to relieve financial barriers, you concentrate resources among those with the fewest means.

Except, it doesn’t seem to be common sense among many of those who consider themselves “progressive” in Canada.  “Progressives”, for reasons that are almost incomprehensible, prefer solutions that give far more money to students from high-SES backgrounds.  Why?  Good question.

The best way to focus aid is to use grants based on income (or, second best, on assessed need).  By using income-targeting, you can get money to exactly who you want.  Say you have $100M that you want to put towards affordability.  Want to give all of it to students in the lowest income quartile?  You can do that.  Split it between the bottom two income quartiles? You can do that, too.  Or maybe spread it out a little more thinly so that it cuts out gradually – say, 60% to bottom income quartile, 30% second quartile, 10% third quartile?  You can do that, too.  Grants make many different types of investments possible.

Figure 1: Some Possible Distributions of Grants Across Income Quartiles

But some people despise this idea.  Some people say things like, “lower tuition is the best form of student aid”.  Yet because people from richer families are likelier to attend post-secondary education, the distribution of $100 million, if delivered in the form of a tuition cut, looks like this:

Figure 2: Distribution of Benefits of a Tuition Reduction, by Income Quartile

That doesn’t actually look so good compared to a grant, does it?  In fact, it’s even worse than it seems.  That’s because a $100 million reduction in tuition ends up affecting other types of aid as well.  For every dollar of tuition reduced, students and their families lose 21-28 cents (depending on the province) in tax credits.  And, to the extent that anyone has provincial need-based grants (as opposed to the mainly income-based federal ones), a dollar less in tuition means a dollar less in need, which in many cases means a dollar less in grant.  Thus, for high-need students (which is not quite the same thing as low-income students, though there is some overlap), a dollar less in tuition can mean $1.25 less in non-repayable aid.  That is to say, they are worse off after the tuition reduction than they were before.  But the rich kids who had no need of student aid to begin with?  They’re better off by $0.75.
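The per-dollar arithmetic above can be sketched out explicitly.  A minimal sketch, assuming a 25% tuition tax credit (the text gives 21-28 cents, depending on province) and a dollar-for-dollar reduction in need-based grants; the function name is mine, for illustration only:

```python
# Per-dollar effect of a tuition cut on two kinds of students,
# as described above.  TAX_CREDIT_RATE is an assumed mid-range value.
TAX_CREDIT_RATE = 0.25  # text: 21-28 cents per dollar, by province

def net_change(tuition_cut, receives_need_grant):
    """Net change in a student's position per dollar of tuition reduction."""
    credit_lost = tuition_cut * TAX_CREDIT_RATE
    grant_lost = tuition_cut if receives_need_grant else 0.0
    return tuition_cut - credit_lost - grant_lost

print(net_change(1.00, receives_need_grant=True))   # high-need student: -0.25
print(net_change(1.00, receives_need_grant=False))  # no-aid student: 0.75
```

The high-need student ends up a quarter worse off per dollar of tuition cut; the student who never needed aid pockets 75 cents.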

All told then, if you spend $100 million to reduce tuition, the spread of benefits looks something like this:

Figure 3: Distribution of Benefits of a $100 Million Tuition Reduction, by Income Quartile, in Millions

Of the $100 million in reduced tuition, $42 million gets recouped by one level of government or another, either through reduced tuition tax credits or reduced grants.  Of the remainder, only 13% goes to the poorest quartile, and only 38% goes to the bottom half.
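In dollar terms, using only the shares quoted above, the split works out as follows (a sketch; the percentages are as given in the text):

```python
# Dollar flows from a $100 million tuition reduction, per the text above.
total = 100.0                    # $M spent on the tuition cut
recouped = 42.0                  # $M clawed back (tax credits + grants)
remaining = total - recouped     # $58M actually reaching students

bottom_quartile = 0.13 * remaining   # 13% to the poorest quartile
bottom_half = 0.38 * remaining       # 38% to the bottom two quartiles
print(f"${bottom_quartile:.1f}M to the bottom quartile")  # prints "$7.5M ..."
print(f"${bottom_half:.1f}M to the bottom half")          # prints "$22.0M ..."
```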

So, ask yourself: who’s progressive?  The folks who want to give 50, 60, or 100% of their money to kids from the bottom income quartile?  Or the folks who want to give almost three times as much to the top quartile as to the bottom?

May 08

Why (Almost) Everyone Loves International Students (Part 2)

Yesterday, I showed how good international students were for universities’ bottom lines.  But it’s not quite as simple as I made it out to be.  Whether admitting international students makes sense or not depends on four factors:

1)      How much of the income do you get to keep?  In Quebec, international students in “regulated” programs (which include Arts) are worth essentially nothing to institutions because the government claws it all back.  On the other hand, in block-grant provinces (and in Saskatchewan, which is part-formula, part block), international students are basically pure profit.  The only reason to not take international students is if the provincial government might punish you for it, because of fears of crowding out local demand (cf. Alberta).  In most formula-funding provinces, and for Quebec’s unregulated programs, the return is somewhere in-between – institutions can charge what they want for international students, but get zero subsidy for them from the province.

2)      What’s the marginal cost per student?  Remember: marginal, not average.  There is a tendency to think that international students are more financially beneficial in Arts or Business, because average costs are lower there than in Science and Engineering.  And while, to some extent, that’s true, what really matters is how close to capacity each program is.  An extra Engineering student in a class of 29 with a capacity of 30 is actually going to be cheaper than an extra Arts student in a class of 30 with the same capacity, because being the 31st student means starting a new class section, hiring a new instructor, occupying more classroom space, etc.  The problem is that most institutions have only the barest notion of what marginal costs are across the institution at any given time.

3)      What’s the cost of recruitment?  At most mid-sized institutions these days, recruitment costs per international student are – all told – in the $6-7K range, once you take agent fees, overhead, and everything else into consideration.  Assuming the student is coming for four years and is going to generate $60-80K in fees, that’s pretty good (less so if your school has a problem with international student retention).  But it’s even better if you’re McGill, Toronto, or UBC: with that much brand prestige, you don’t need to spend nearly as much.

4)      What’s the opportunity cost?  Now that you know your income and expenses from international students, you can work out what your net income is by field of study.  But opportunity costs matter, too: your potential earnings from domestic students need to be taken into account.  For most institutions outside the big cities, the answer is “nothing”, because the alternative to an international student is no student at all.  In these cases, the decision to admit international students is obvious.  It gets less obvious where you could gain income from a domestic student instead.  At that point, you need to work out how net (not gross) income from an international student compares with the net income – government grants plus tuition fees – from a domestic one.  At some institutions, in some fields of study, it will sometimes make more sense to enroll a domestic student over an international one.  But it’s close.
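Putting the four factors together, the decision reduces to a net-income comparison.  The sketch below is mine, not the post’s: the only numbers taken from the text are the recruitment cost ($6-7K) and four-year fee income ($60-80K); the function name, the clawback rates, and the marginal-cost figure are illustrative assumptions.

```python
# Back-of-envelope net income from one international student, combining
# the four factors above.  All specific numbers are illustrative.
def net_income(four_year_fees, clawback_rate, marginal_cost,
               recruitment_cost, domestic_alternative=0.0):
    kept = four_year_fees * (1 - clawback_rate)  # factor 1: income you keep
    # factors 2-4: marginal cost, recruitment cost, opportunity cost
    return kept - marginal_cost - recruitment_cost - domestic_alternative

# A seat that would otherwise sit empty (opportunity cost = 0) at a
# mid-sized institution in a block-grant province (no clawback):
print(net_income(four_year_fees=70_000, clawback_rate=0.0,
                 marginal_cost=30_000, recruitment_cost=6_500))
# vs. a Quebec "regulated" program, where the government claws it all back:
print(net_income(four_year_fees=70_000, clawback_rate=1.0,
                 marginal_cost=30_000, recruitment_cost=6_500))
```

The same student is worth roughly $33.5K in the first scenario and a loss in the second, which is the whole point of factor 1.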

Got all that?  Good.  Now go build your strategic enrolment plans.

May 07

Why Everyone Loves International Students (Part 1)

A nice simple post today: why universities are going bananas for international students.

The first figure shows undergraduate tuition fees for international students in each province.  They range from a little under $10,000 in Newfoundland to just over $25,000 in PEI.  The national average is $18,840; in Ontario, it is $23,000.

International Undergraduate Tuition Fees by Province, 2012, in $2013

What’s more, fees for international students have been rising quite steadily for two decades.  Over the last 21 years, they have risen by an average of 4% per year in real terms (i.e. over and above inflation).

Average International Undergraduate Tuition Fees by Province, 1990-2012, in $2013

And these fee rises seem to have no effect on demand.  Check out the rise in the number of international students.  Is that great or what?  High fees?  Lots of international students.  Raise fees?  MORE international students!

International Student Enrolments, 1992-2011

Does anyone expect universities to turn down that kind of money, from an apparently inexhaustible source?  Especially when the amount they get from government is flat, and tuition is tightly regulated?

OK, yes, the decision to take in international students is, in fact, marginally more complicated than I’m making it out to be here.  I’ll get to that tomorrow.  But the basic case for international students is right there in those three graphs.

Money talks, you know.  Gotta pay the bills.

May 06


So, there’s this cute little graphic making the rounds on the internet.  Take a look, and tell me what you see:

If you laughed, I’m disappointed.  This joke, to me, represents absolutely everything wrong with the humanities these days.

The joke, essentially, is that scientists are narrow-minded eggheads.  They have knowledge, but not wisdom.  But your lovable humanities types?  Well, they may not know their ass from their elbow as far as recombinant DNA goes, but boy have they got wisdom.  Buckets full of wisdom, actually.  And as far as they are concerned, letting a 40-foot theropod loose in a modern laboratory is asking for trouble.  Scientists, on the other hand, are apparently too stupid to work this out on their own.

I mean, think about this for a moment: pretty much anyone with the intellectual maturity of an 8-year-old who has seen Jurassic Park could understand the dangers of having a T-Rex wandering around (the reptilian kind, anyway – there are also dangers to having 70s glam-rock bands wandering around, but you need to be older to work that one out).  How arrogant do you have to be to assume that only humanities training can give you the wisdom necessary to work this out?

The thing is, scientists are actually really good at working out the ramifications of their discoveries on their own.  Take the 1975 Asilomar Conference, for instance.  When scientists gained the technical ability to start swapping DNA across species in the early 1970s, the entire biological profession took notice.  Concern about the implications of these techniques – whose effects at the time were largely unknown – led the profession to adopt a 16-month moratorium on their use.  The top people in the field then came together at Asilomar to debate the issue, and to come up with guidelines for ensuring the safe use of recombinant DNA techniques (summary available here).  And they did this, so far as I can tell, on their own, without help from superior, wisdom-stuffed humanities types.  Thus the joke, at one level, stems from rank ignorance of how science works.

I get that the humanities feel picked on these days.  What I don’t get is why they react to this not by saying “the humanities have their place”, but rather by exclaiming that “everyone without a humanities degree is a subtlety-free buffoon” (bonus points if you can wedge in something about the humanities and citizenship, thus implying nobody else is as qualified to talk about politics).  It’s juvenile.  And it sure as hell doesn’t win the humanities many friends.

And yes, I know it’s supposed to be a joke.  But it’s a poor one, and reflects poorly on those who make it.

May 05

The MOOC Conversation We Should Have (But Won’t)

In all the hype and backlash about MOOCs, it seems that we forgot to have a really important conversation about what MOOCs actually tell us about traditional higher education.

The thing that freaked absolutely everybody out (some positively, some negatively) about MOOCs was the idea that a single instructor could teach tens of thousands of students around the world, simultaneously.  “Oh my God”, people panicked/enthused, “what will happen to the university once content is available freely everywhere?”  Well, not much, as it turned out.  Certainly, no more than print, radio, television, or videotapes – all previous knowledge-transmission technologies that allegedly threatened the monopoly of official education providers – did.

But the bigger point of MOOCs is that they reminded us that what makes universities special as teaching institutions isn’t that they are superior content providers.  MOOC instructors are usually tenured professors – just like in universities.  And the topics they cover are university-level.  So why do many persist in thinking of them as “less than university”?  Partly, it’s a legalistic reason: they aren’t delivering “credits”, which lead to a “degree”.  But this can be remedied: Coursera’s new X track now has many prestigious institutions giving away certificates of completion in return for completing bundles of related MOOC courses.  It’s not a degree, but it’s getting closer.  What, then, will distinguish MOOCs from “real” universities?

The answer, basically, is the learning environment.  What MOOCs lack are not profs, but meaningful, durable relationships, both among students, and between students and professors.  Yes, they can deliver some classroom interaction through chat groups, etc.  But these by-and-large don’t lead to the kinds of interactions you get on campus just through serendipity.

There’s an architecture of discovery on physical campuses that isn’t, at the moment, replicable online.  It’s in the conversations you have in hallways, libraries, and cafes.  It’s the learning that happens through extra-curricular activities, and arguing with your TA over a beer at the campus bar.  It’s the shared experiences that build up over time.  That’s a university’s real advantage.  A MOOC delivered via Oculus Rift might get you halfway there, but we’re still a ways from that.

But – and here’s the conversation we aren’t having – if we accept that universities are about environment and not about content, why aren’t we putting the environment at the centre of our discussions about universities?  Why do we still hire professors (more or less) exclusively based on research ability?  Why, even on the rare occasions where we take learning outcomes seriously, do we still assume this gets achieved exclusively through what happens in classrooms?  Why aren’t we thinking night and day about how to make higher education a more immersive experience, investing more in the pastoral care (broadly defined) of students?

Yeah, yeah, I know.  The faculty would never wear it.  But isn’t that the real problem?
