Higher Education Strategy Associates

September 30

The Problem with Global Reputation Rankings

I was in Athens this past June, at an EU-sponsored conference on rankings, which included a very intriguing discussion about the use of reputation indicators that I thought I would share with you.

Not all rankings have reputational indicators; the Shanghai (ARWU) rankings, for instance, eschew them completely.  QS and Times Higher Education (THE), however, both weight them heavily (50% for QS, 35% for THE).  Yet this data isn’t entirely transparent.  THE, which releases its World University Rankings tomorrow, hides the actual reputational survey results for teaching and research by combining each of them with other indicators (THE has 13 indicators, but it only shows 5 composite scores).  The reasons for doing this are largely commercial: if, each September, THE actually showed all the results individually, it wouldn’t be able to reassemble the indicators in a different way to produce an entirely separate “Reputation Rankings” release six months later (with concomitant advertising and event sales) using exactly the same data.  Also, its data collection partner, Thomson Reuters, wouldn’t be able to sell the data back to institutions as part of its Global Institutional Profiles Project.

Now, I get it: rankers have to cover their (often substantial) costs somehow, and this re-sale of hidden data is one way to do it (disclosure: we at HESA did this with our Measuring Academic Research in Canada ranking).  But given the impact that rankings have on universities, there is an obligation to get this data right.  And the problem is that neither QS nor THE publishes enough information about its reputation survey to make a real judgement about the quality of the data – and in particular about the reliability of the “reputation” voting.

We know that THE allows survey recipients to nominate up to 30 institutions as being “the best in the world” for research and teaching, respectively (15 from one’s home continent, and 15 worldwide); QS allows 40 (20 from one’s own country, 20 worldwide).  But we have no real idea how many people are actually ticking the boxes for each university.

In any case, an analyst at an English university recently reverse-engineered the published data for UK universities to work out voting totals.  The resulting estimate is that, among institutions in the 150-200 range of the THE rankings, the average number of votes obtained for either research or teaching is in the range of 30 to 40, at best.  Which is astonishing, really.  Given that reputation counts for a third of an institution’s total score, it means there is enormous scope for year-to-year variation: get 40 votes one year and 30 the next, and significant swings in ordinal rankings could result.  It also makes a complete mockery of the “Top Under 50” rankings, where 85% of institutions rank well below the top 200 in the main rankings, and are therefore likely garnering only a couple of votes apiece.  If true, this is a serious methodological problem.
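To see just how fragile scores built on such small vote counts are, here is a quick back-of-envelope sketch.  All the numbers are illustrative: THE does not publish its actual normalization method, so I am simply assuming a score proportional to an institution’s votes relative to the most-voted institution.

```python
# Illustrative only: assumes a reputation score computed as an
# institution's votes as a share of the most-voted institution's total.
# THE's actual normalization method is not public, and the leader's
# vote total here is a made-up number.

def reputation_score(votes, leader_votes):
    """Score out of 100, relative to the most-voted institution."""
    return 100 * votes / leader_votes

LEADER_VOTES = 5_000  # hypothetical vote total for the top institution

# An institution in the 150-200 band: ~40 votes one year, ~30 the next.
year1 = reputation_score(40, LEADER_VOTES)
year2 = reputation_score(30, LEADER_VOTES)

print(f"Year 1 score: {year1:.2f}")
print(f"Year 2 score: {year2:.2f}")
print(f"Relative drop: {(year1 - year2) / year1:.0%}")
```

A ten-vote swing – well within sampling noise at these totals – cuts the indicator by a quarter; weighted at a third of the overall score, that is easily enough to move an institution dozens of places.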

For commercial reasons, it’s impossible to expect the THE to completely open the kimono on its data.  But given the ridiculous amount of influence its rankings have, it would be irresponsible of it – especially since it is allegedly a journalistic enterprise – not to at least allow some third party to inspect its data and give users a better sense of its reliability.  To do otherwise reduces the THE’s ranking exercise to sham social science.

September 29

Differentiation and Branding From the Student Perspective

One question that always comes up (or should come up, anyway) in discussions of university branding and positioning is: “how different is our institution, really?”  Well, for a few years, when we ran the Globe and Mail Canadian University Report survey, we used to ask students questions that would allow us to see how different students thought their university was.  The results were… interesting.

We asked students to locate their institution on an 11-point double-ended scale.  Did they think their institution was more focussed on graduate students or undergraduates?  Was it focussed on a couple of key fields of study, or more spread out?  Focussed on global issues or local ones?  Was it open to new ideas, or cautious about them? Did students have to be self-sufficient, or was the institution a nurturing one?  Was the curriculum more theoretical or applied? Was the student body homogeneous or diverse?  And, broadly, was it professor-centred or student-centred?  We could then plot their answers on a spidergram to show how institutions differed from one another.  What we found was an interesting degree of clustering, which allowed us to make specific categorizations of institutions.

The most definable group of institutions was what we called the Liberal Arts schools – mostly small schools where students described their institution as undergrad-focussed, very nurturing, and not focussed on a particular field of study.  This includes the usual suspects – Acadia, Mt. Allison, St. FX – but also the religious schools (e.g. Redeemer, Trinity Western), the Western colleges (Brescia, Huron, King’s), and Guelph (but not Trent).  The other big cluster of schools was what we called the “graduate-focussed schools”.  This was basically the U-15 – minus McGill, U of T St. George (though the other U of T campuses qualified), Waterloo, Manitoba, and Saskatchewan, but including Carleton, SFU, and Concordia.  These two groups position themselves on the spidergram as follows:

Figure 1: Positioning Results for “Graduate” and “Liberal Arts” Schools

Note: The further a line is to the edge of the spidergram, the more the value tends towards the first word in the axis description (e.g. “graduate” rather than “undergraduate”).

There were a number of individual schools with very distinct profiles.  Waterloo, for instance, looks entirely unlike every other institution.  Students there are much more likely to describe their institution as open to change (a trait which is significantly correlated with student satisfaction), and much more likely to describe it as being “focussed” on a few key areas.  OCAD University was by far the most likely to be described as having an applied curriculum and a student-centred program.  McGill and Toronto (St. George) had weirdly identical results (i.e. their students described them in *exactly* the same terms), with both described as much more global, much more graduate-focussed, and much more sink-or-swim (i.e. NOT nurturing) than other schools.

Figure 2: Positioning Results for OCAD University, University of Waterloo and McGill/University of Toronto (St. George)

Now, here’s the important bit.  The results from every other school in the country – from Trent to Saint Mary’s to Fraser Valley to Saskatchewan and Manitoba – were basically one big lump.  None of these institutions really stood out in any way; basically, students just described these places as “school”.  None had any real distinguishing characteristics on which you could build a sensible brand.  If they had a colour, it would be beige.

If you view higher education as one more social service, where the goal is to provide a uniform product to students everywhere, this is a good thing.  But if you think universities should be distinct entities, catering to varying niche tastes that evolve over time, it’s a pretty depressing picture.

September 26

Te Wānanga o Aotearoa

A couple of weeks ago, I promised I would tell you the story of Te Wānanga o Aotearoa, the entirely Maori-run polytechnic with over 35,000 students.  So here it is.

The 1970s saw significant Aboriginal cultural revivals in many parts of the world.  Aboriginal higher education – or at least the access of aboriginal peoples to mainstream higher education – was a significant part of that.  In Canada, the struggle was mostly about gaining a foothold in mainstream institutions; in the United States, the focus was much more on creating aboriginal-controlled institutions, known as tribal colleges and universities.  In New Zealand, the Maori journey in higher education was similar to the US in that it involved creating their own separate institutions, and then later seeking recognition for them.  The result was a class of institutions called “Wānangas” (a term that roughly equates with “knowledge”).

The first Wānanga (Te Wānanga o Raukawa) focused mostly on language and culture, and had relatively small enrolments (still under 1,000).  The second, Te Wānanga o Aotearoa, was more focussed on skill acquisition – mainly Maori arts and crafts, but also with courses in tourism and computer skills.  They were a fairly marginal institution, too, until they caught a big break in the late 1980s when the Education Act was being re-written.  At the time, New Zealand’s governing Labour Party was putting the country through a major free-market revolution.  In education, that meant caring more about outcomes and outputs than about the provider’s pedigree.  It was also a time when the government was making concerted efforts to improve relations between Maori and Pakeha, and treat the Treaty of Waitangi with some respect.  And so, when it came time to write the Act, they decided to give Wānanga status as a fourth official type of tertiary education, alongside universities, polytechnics, and privates.  That guaranteed them some annual funding, but because the government was in financial straits, there was no money available for capital.                                

That’s where things got interesting.  Part of the whole return to the Treaty of Waitangi involved creating a Treaty Tribunal to adjudicate cases where Maori felt that public policy was not in keeping with the terms of the treaty.  Te Wānanga o Aotearoa decided to challenge the capital funding policy, arguing that they were a recognized form of education but had been unable to benefit, as others had, from public capital spending.  The tribunal agreed with them, handing the Wānanga what looked to be a whopping cash settlement.  Cannily, however, they played a long game and negotiated a reduced settlement on capital in exchange for a straight per-student funding agreement with – and this is crucial – no cap on numbers.

It was at this point that all manner of fun broke loose.  The funding deal allowed Te Wānanga o Aotearoa to indulge all its most entrepreneurial instincts, and the school went from having 1,000 students in 1998 to having 65,000 students in 2002.  This involved a lot of institutional change in terms of widening the scope of the types of programs it offered, and it involved a lot of community delivery – at one point they had over 300 teaching sites.  Its ability to attract significant numbers of non-Maori students – particularly recent immigrants – was another important factor.

But it also involved cutting some corners.  In 2005, the Auditor-General came down hard on the school, suggesting (basically) that while Te Wānanga o Aotearoa may have done wonders in expanding access, it would have been nice if they had kept some actual receipts for their spending.  The resultant tighter enforcement rules drove down enrolments to roughly 30,000, where they remain today.  In the short-term, that caused a bit of a financial crisis, as well as layoffs.  But in the longer term it probably made the organization stronger, and it remains by far the world’s largest aboriginal-controlled institution of higher education, delivering thousands of recognized tertiary credentials each year, including some at the Bachelor’s level.

The Wānanga model is not one we’ve adopted in Canada, but for those leaders in government and aboriginal organizations seeking to expand educational opportunities for aboriginal Canadians, there are a lot of important lessons to be drawn from Te Wānanga o Aotearoa’s experience.  We should pay heed to them.

September 25

Ending the Merit Scholarship Arms Race

Here’s a way the new Ontario Minister of Training, Colleges and Universities, Reza Moridi, could do everyone an enormous service, and win political capital at the same time: force institutions to cut back radically on automatic merit-based entrance scholarships.

Here’s the background: at some point in the 1990s, Canadian institutions hooked onto the idea of giving out entrance awards as a way of managing enrolment.  It was a nice trick to help lock students in early in the admissions process – you register with me by a certain date, and I give you $1,000 (or whatever).  To keep costs down, “merit” was redefined as being exclusively in terms of high school grades and awarded according to a grid – $1,000 if you had over a 90, $750 if you had an 85, etc.   These were dubbed “Automatic Academic” awards by Franca Gucciardi in her 2004 paper Recognizing Excellence.

In the aughts, institutions in Ontario started getting truly stupid about this “merit” money, and an arms race broke out.  Here’s roughly how it worked: one year, school X would decide it was going to try to steal some smart science and engineering students from school Y, and would raise its offer to students with grades over 85% from (say) $1,500 to $2,000.  This worked, and school Y would see its yields go down.  But the following year, school Y would up its own offer by $1,000, and the pendulum would swing back!  And so on, and so forth.  By 2007, Ontario institutions were handing out something on the order of $70 million in this fashion, and there is every reason to think the amount has increased since then.

There is no durable way for an institution to get out in front of the pack in such an arms race – others will always match the awards, and equilibrium will be restored.  All institutions are making themselves worse off, because their net tuitions are all declining; yet no one can disengage, because each is afraid it would lose share to the others.  And into the bargain, we get a complete devaluation of the word “merit”, because institutions are giving these awards to about 70% of incoming freshmen.
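The structure here is a textbook prisoner’s dilemma, and that can be made concrete with a toy payoff model.  All the payoff numbers below are invented for illustration; the point is only the shape of the incentives.

```python
# Toy payoff model of the scholarship arms race (all numbers invented).
# Each school chooses to "hold" or "raise" its automatic entrance award.
# Raising steals yield from a holder; if both raise, market shares are
# unchanged and both have simply cut their net tuition.

# Payoffs: (school_x_payoff, school_y_payoff); higher is better.
PAYOFFS = {
    ("hold", "hold"):   (10, 10),  # status quo: no extra spending
    ("raise", "hold"):  (12,  6),  # X gains yield at Y's expense
    ("hold", "raise"):  ( 6, 12),  # mirror image
    ("raise", "raise"): ( 8,  8),  # same yields, lower net tuition
}

def best_response(opponent_move):
    """School X's best reply to Y's move."""
    return max(["hold", "raise"],
               key=lambda m: PAYOFFS[(m, opponent_move)][0])

# Whatever Y does, X prefers to raise -- so both raise, and both end
# up worse off at (8, 8) than if both had held at (10, 10).
print(best_response("hold"))   # raise
print(best_response("raise"))  # raise
```

Raising is the dominant strategy for each school individually, which is exactly why no institution can unilaterally stand down, and why an outside rule-maker is needed.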

This is a classic collective action problem.  And collective action problems have solutions: imposed government rule-making.  In this case, the Government of Ontario should restrict the ability of universities to give away money based on grades alone to, say, 1% of what they take in via tuition.  If they want to run more complicated merit schemes – things that take into account service, innovation, and what have you – great.  Let ‘em knock themselves out (they won’t of course, because those take actual effort to organize, but whatever).  Or, if they want to give out more money based on need, that’d be OK too.  Just end the insanity around academic awards.

You think universities would oppose this on grounds of competitiveness or institutional autonomy?  Think again.  They’d love for someone to get them out of this arms race, because they’re clearly incapable of doing it themselves.  And of course the Ontario Undergraduate Student Alliance has already called for essentially the same thing, so the student constituency is already covered off.

So, how about it Dr. Moridi? Are you up for a quick win?  Just end the merit arms race.

September 24

The Math at Windsor

Not only is there strike talk at Laurentian, but there is also a strike in the air at Windsor (a one-day strike was held last week, and a full strike is promised for October 1st if no deal is reached).  Bargaining there began earlier this year but, for whatever reason, no progress was made in negotiations over the spring.  After a conciliator was unable to nudge the two sides closer together, the university was legally in a position to impose its offer on the faculty, which it did in early July.  This was a canny piece of timing: by doing it in the summer, the university deprived the union of an immediate strike threat (because who cares if profs go on strike in summer?).

This was a rare case of a university playing hardball on timing; and though this may have wrong-footed the union, they’ve responded by making great rhetorical hay out of having a contract imposed on them.  Now, the strike is no longer about petty monetary demands, it’s about the right to collective bargaining.  Yay, righteousness!  And that’s a big bonus for the union, because if the strike was just about financial proposals, their position would be almost indefensible.

The union position is that the university’s offer – 0%, 0%, and 3% over three years – is inadequate, because staff can’t be expected to accept wage increases below inflation.  While that’s one way of framing the institution’s offer, it glosses over the stonking amount of money the university is offering faculty through its Progression Through the Ranks (PTR) system (for a refresher course on PTR, see here).  Under the university’s offer, every single professor (other than those with over 30 years’ experience) gets an annual pay rise of $2,550.  This isn’t based on merit or anything, the way it is at Alberta or UBC or Waterloo; it’s just for sticking around another year.  On top of that, they get the 0%, 0%, 3%.

Now that doesn’t translate easily into a percentage figure because $2,550 represents a different percentage for each professor, depending on their current pay. But let’s take a stab at it based on known average pay by rank.  Current pay figures are unavailable (thanks for cutting the UCASS faculty salary survey, StatsCan!), but I do have them from 4 years ago – they’ve probably gone up slightly since then, but for giggles let’s use them to take a look at what the university offer means if the PTR is included.


On top of that, there’s something called the “Windsor Salary Scale” – a Windsor-only deal, in place for decades, which means that the Windsor salary scale always rises at the Ontario median.  Based on the past few years’ deals, this would mean average pay rises of another 4% or so (15.4% now for assistant profs, if you’re counting) – though it’s hard to predict exactly, since we don’t know what future salary settlements across Ontario will look like.  On the other hand, professors will also be asked to pay more into their pensions, what with returns being so meagre in our low-interest-rate environment.  So let’s call these two a wash, and stick with the figures in the table above.
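To see how the flat PTR dollars compound with the 0/0/3 scale offer, here is a back-of-envelope sketch.  The $2,550 PTR figure is from the offer itself; the $90,000 starting salary is an assumed round number for illustration, not actual Windsor data, and the order in which PTR and scale are applied is a simplification.

```python
# Back-of-envelope: three years of Windsor's imposed offer for a
# hypothetical professor. PTR = $2,550/year flat (from the offer);
# across-the-board scale increases of 0%, 0%, 3%. The $90,000
# starting salary is an assumed figure, not actual Windsor data.

PTR = 2550
SCALE = [0.00, 0.00, 0.03]

salary = 90_000
for pct in SCALE:
    salary = (salary + PTR) * (1 + pct)  # PTR applied, then scale

total_rise = salary / 90_000 - 1
print(f"Salary after 3 years: ${salary:,.0f}")
print(f"Cumulative rise: {total_rise:.1%}")
```

On these assumptions, the cumulative rise comes out to roughly 12% over three years – around double a 2%-a-year inflation rate – which is the point the headline “0%, 0%, 3%” framing conveniently obscures.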

To summarize, this deal – which, recall, had to be imposed – will see nearly all faculty salaries rise by rates well above inflation (in the case of assistant professors, by a factor of two).  The only ones who will not see a rise equal to inflation are that tiny minority (the 30+ years crew) already at the top of the pay grid, and who in most circumstances will be earning over $150,000.

Remember, this is at a university in a region where first-year enrolment fell by 10% this year, and where the regional youth cohort (Windsor-Essex-Chatham-Kent) is set to shrink by 15% or so over the next six years – meaning the institution will receive even fewer tuition dollars, and a declining share of the total government grants budget, which, if we’re lucky, will itself decline by only 3% in real terms over the next three years.

And still the union said no.

September 23

Another Reason to Get Serious About Measuring Workloads

So I see the Laurentian faculty union is threatening to strike.  The main issues are “workload” (they’d like to have lower undergraduate teaching loads to deal with an influx of graduate students) and pay (they’d like to “close the gap” with the rest of Ontario).

This is where the entire system would be well served by having some understanding of what, exactly, everybody is getting paid for.  Obviously, if you’re doing the same amount and type of work as someone else, you’ve got a pretty good claim to parity.  The problem is that what professors do – that is, their expected workload and outputs – can vary significantly from one place to another.

Let’s take the issue of graduate supervision.  Laurentian profs are doing more of it than they used to – overall, 6% of full-time enrolments at Laurentian were at the graduate level in 2012, up from 4% five years earlier.  But if we’re going to use “the Ontario average” as a goal, it’s worth noting that across the province, 12% of full-time students are graduate students.  So, on average, Laurentian professors do only about half as much graduate supervision as other professors across the province – and probably less, if we were to weight doctoral supervision more highly.

Well, what about undergraduate teaching – maybe they do more of it than others?  On paper, they teach 3/2 (except in Science and Engineering, where it’s 2/2).  That’s the same as at most smaller Ontario institutions, and somewhat more than you’d see at larger institutions, where 2/2 or even 2/1 is the norm.  But that’s not the whole story: class sizes are smaller at Laurentian.  Sixty-seven per cent of all undergraduate classes at Laurentian are under 30 students, compared to just 51% at York (though, surprisingly, the figure at Queen’s is almost the same as Laurentian’s – 65%).  But ask yourself: which takes more work, a 2/1 load with average class sizes of 60, or a 3/2 with an average class size of 30?  Hard to tell.  But how can you make arguments about “equal pay for equal work” unless you know?
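The comparison in that last question can be made concrete with simple student-contact arithmetic.  The class sizes here are the illustrative round numbers from the paragraph above, not measured institutional averages.

```python
# Comparing total annual student-enrolments taught under two loads.
# "3/2" means three courses one term and two the next. Class sizes
# are the illustrative figures from the text, not institutional data.

def students_taught(courses_per_year, avg_class_size):
    return courses_per_year * avg_class_size

small_school = students_taught(3 + 2, 30)  # 3/2 load, classes of 30
large_school = students_taught(2 + 1, 60)  # 2/1 load, classes of 60

print(small_school)  # 150
print(large_school)  # 180
```

By course count, the 3/2 load is two-thirds heavier; by students taught, it is about 17% lighter.  Which counts as “more work” depends entirely on how you weight preparation against marking – which is precisely the comparative data the system lacks.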

Then there’s research output.  If you use tri-council funding as a metric, and normalize for field of study, Laurentian profs in Science and Engineering are winning about 55% of the national average – higher than Ryerson, but less than half of what Carleton gets.  That’s not too bad.  In humanities and social sciences, however, Laurentian wins only 21% of the national average – about a fifth of what they get at Ottawa, and a third of what they get at Laurier (all data from our Measuring Academic Research in Canada paper, available here).  I could go on with data about publications and citations, but you get the idea: Laurentian professors’ research output isn’t all that close to the provincial average.

To recap: Laurentian is a school where (on average) professors have lower graduate teaching responsibilities and research output than the Ontario average, and an undergraduate teaching load that is higher than average in terms of number of classes, but is arguably lower in terms of total students taught.  So where should their pay be, relative to the provincial average?  Probably somewhere below the average, which indeed is where it is.

But the question for this dispute is: how far below?  Better comparative data, combined with some agreement about the relative weight of different parts of the professorial job, would take a lot of heat out of this debate.

September 22

Where Do Students Want to Live?

Today, we at HESA released a paper called Moving On?  How Students Think About Choosing a Place to Live After Graduation, based on a 2011 survey of 1,859 students from across the country.  Obviously, you should go read the whole thing, but for the time-pressed, here are the highlights:

1) Part of the paper’s purpose is to examine the qualities students look for in a place to live.  It turns out Richard Florida’s whole shtick about young educated types looking for cities that are “hip” or “creative” may be somewhat wide of the mark; in fact, students’ main priorities in choosing a place of residence are access to good jobs, healthcare services, and a low crime rate.  Access to cultural events and foodie culture ranks way, way down the list.  To put that another way: what young educated people look for in a place to live is pretty much what everyone else looks for.

2) A solid majority of students intend to stay in the province in which they graduated.  That said, just over 40% of students are at least open to the idea of moving.  However, these students are not evenly distributed: students in the prairie provinces (including Alberta) are much more open to moving away than are students in British Columbia.  Equally, students are not open to moving just anywhere – of the people open to a move, most have three or fewer potential destination provinces in mind, and are not open to any others (the most commonly-sought destinations are Ontario and British Columbia; Manitoba and Saskatchewan are the least).  Only 7% are genuinely open to a move to pretty much anywhere in the country.

3) Here’s perhaps the most important piece of news: financial incentives for graduates, such as the tax credits used by Saskatchewan, Manitoba, and New Brunswick, have almost no effect.  We asked students what they expected to earn in their first job in the province they were most likely to call home.  Then we asked them how much it would take to get them to move to each of the other provinces.  For most provinces (BC was the outlier), about a quarter said “nothing could get me to go there”, and another 25% said “I’d go for an extra $25,000 or more” (which is really just a polite way of saying “never”).  But, intriguingly, between 13% (Manitoba) and 25% (British Columbia) of all students say they’d move to that province for either no extra money or even a cut in pay – just give them a job and they’d go.  The percentage who say they’d move for an extra $2,000 (roughly the value of the tax credits in SK, MB, and NB)?  About 1%.  Move the financial incentive up to $5,000 and you get another 1%.  And that’s perfectly consistent, right across the country.

The fact is, students are going to move where they’re going to move.  They are either tied to their present spot by networks of friends and family, or they are lured by money, jobs, and prosperity.  A couple of thousand bucks, in the grand scheme of things, just doesn’t seem to matter that much.

All of which raises the question: how come more provinces aren’t making like Nova Scotia, and ditching these tax rebate programs?

September 19

Better Know a Higher Ed System: France

France is one of the original homelands of the university: the University of Paris was the first real university outside the Mediterranean basin, and the country was home to six universities by 1500 – only Italy and Spain had more at the time.  But while it has quite ancient roots, France is also, in many respects, home to one of the youngest systems of higher education in Europe, because the entire university system was wiped out during the Revolution, and then developed again from scratch during the Napoleonic period that followed.

Unlike virtually every other system on earth, the French do not put universities at the top of the higher education hierarchy.  Instead, there are what are called “les Grandes Écoles”: peak, specialized institutions that operate only in a limited number of fields – the École des Mines and the École Polytechnique for engineering, the École Normale Supérieure for education, and the École Nationale d’Administration to train the masters of the universe.  Most of these go back two centuries – Polytechnique was an excellent spot for Napoleon to train his gunners – but the ENA actually only dates from the 1940s.

One step down in the hierarchy are the big “Instituts”, which serve as the training ground for the professions, mainly in technology (the IUTs), but also in fields like nursing.  Universities, for the most part (medical studies excepted), are widely viewed as the dregs of the system: the catch-all for people not smart enough to make the grandes écoles, or driven enough to do professional studies.  That’s partly because they are bereft of many prestige disciplines, but it’s also because, historically, they have not been centres of research.  As in many other European countries (notably Germany and Spain), the public research mission was largely the responsibility of a separate body – the Centre National de la Recherche Scientifique (CNRS) – which was not attached to the universities.

Another historical feature of French universities is the degree to which they have been under state control.  Legally, all faculties were part of a single “Université de France” for most of the 19th century.  Universities as we know them – autonomous institutions that pursue their own plans and goals – are fairly recent.  If you’re being generous, they date back to 1968; in fact, they didn’t reach North American levels of autonomy until the loi Pécresse in 2007, though in practice much of the shift happened in the late 1980s.  Prior to that, hiring and promotion were essentially all done through the Ministry; curricula, too, were laid down on national lines by expert committees run from Paris.

Recently, international rankings have been a major spur to change.  When the Academic Ranking of World Universities first appeared in 2003, it created the “choc de Shanghai” – the country was genuinely shocked at how weak its institutions were seen to be.  Much of this was down to system design, of course: the Grandes Écoles couldn’t compete with American multiversities because they were small, single-discipline institutions, and the universities couldn’t compete because the research was all tied up at the CNRS.  But the French government, instead of standing up and saying “this ranking is irrelevant because our structures are different, and frankly our system of research and innovation works pretty well anyway”, decided to engage in a wild bout of policy-making: excellence initiatives, institutional mergers, etc.  It’s all implicitly designed to make the system look more American; though to keep up pretences, if anyone asks, it’s actually about being “world-class”.

Maybe the most interesting development to watch is what’s going on at Paris Saclay – a campus that brings together roughly two dozen universities and scientific institutions in a single spot.  It’s both a federation of universities and a new independent institution.  The governance arrangements look like a nightmare, but the potential is certainly there for it to become a genuinely European super-university.  It’s not the only new university in the world whose founders dream of hitting the Shanghai Top Ten, but it’s probably the one with the best chance of doing so.

September 18

Systematically Valuing the Wrong Things

Michael Staton is a former teacher, venture capitalist, and founder of Uversity, which is a kind of data nerd Strategic Enrolment Management outfit in the US.  He also has some interesting ideas about the “unbundling” of higher education, which have appeared – amongst other places – in Andrew Kelly and Kevin Carey’s recent book Stretching the Higher Education Dollar (ungated copy available here).

His take is that there are four basic groups of services that undergraduate education strives to deliver, and that in at least some of them, institutions face competition from “unbundling”, because the services could be delivered by alternative providers.  While Staton occasionally sounds like he’s drunk too much Kool-Aid when talking about the likelihood of things actually being unbundled, his analysis of the relative ease of unbundling the various services is quite perceptive.

In declining order of substitutability, these four areas are (and I’m paraphrasing his categories a bit here):

  • Academic Content: If there’s one thing MOOCs and other OERs prove, it’s that – at the undergraduate level at least – content is mostly a commodity.  You can get the substance of an undergraduate degree pretty much anywhere, pretty much for free.  Obviously, this isn’t true at a graduate level, which is where all the prestige is, from the institutional perspective.  But from an undergraduate perspective…
  • Certification of Acquired Knowledge and Skills: All universities offer the same credentials, and the power to offer these credentials remains a gift of the state.  Thus, universities as a class are protected, but at the same time they are unable to distinguish themselves from one another through the undergraduate degree offerings.
  • Acquisition of Learning Meta-Skills: Universities and colleges don’t just teach subject knowledge, they teach people how to approach problems and solve them (or at least they’re supposed to).  Those institutions specializing in (say) co-op or Liberal Arts implicitly have a defined approach to meta-skills acquisition, even if they don’t describe it as such.  But most don’t do so.  They seem to think that just saying “we teach kids how to think” is enough.
  • Life-Altering Experiences:   This is the stuff that really can’t be outsourced.  The experiences one has while at university – the friendships, the life lessons, the transition from adolescence to adulthood – simply can’t be replicated in any place other than a traditional campus.  This is what people really pay for.

What’s interesting here, from a strategy point of view, is the complete misalignment of institutional resources with actual sources of value and distinction.  The two areas where distinctions can actually be made between institutions – meta-skills and life-altering experiences – are the areas where institutions spend the least amount of time and effort. In fact, institutions care so little about the latter that they basically leave it to students themselves (which predictably results in some pretty wild swings in quality, not just between institutions, but also over time at the same institution).  Instead, institutions toss all their money into the (from an undergraduate perspective) “useless content hole”.

Somewhere, sometime, a university will decide to bring a laser-like focus to the issues of meta-skills and experience (indeed, one could argue that Minerva University is about halfway there).  Until then, the misalignment – and the mediocrity of undergraduate studies in general – will continue.

September 17

Welcome to the Crisis

I just took a look at the new enrolment confirmation statistics for Ontario universities.  They are jaw-dropping.

Overall, the system experienced its first fall in “number of confirmed enrolments from secondary school” since (I believe) the early 1990s (I say “I believe” because OUAC doesn’t have public stats that go that far back, but I think that’s right).  Ever since the double-cohort, the province’s universities have seen a steady annual 3% bump in total direct-entry enrolments.  That’s been the source of a useful financial cushion for institutions.

Now, however, things are heading into reverse.  The province’s population of 18-year-olds fell by 2.1% in 2014.  The fall in confirmed direct-entry undergraduate numbers was slightly higher – 2.8%, or about 2,050 students (73,002 in 2013 vs. 70,950 in 2014).  But what was striking was less the total than the distribution of these numbers.

Let’s start by looking at changes by institution.  Most institutions managed to hit their previous year’s numbers, more or less.  Queen’s did; Ryerson actually managed to boost its numbers by 6%; and Western somehow found a way to jump its numbers by 12%.  But in a system that’s declining overall, those gains can’t come without losses elsewhere.  And some of these numbers are doozies.  Waterloo is down 8%.  Nipissing and York both fell by a little under 11%.  And OCAD University and Laurier are down by – are you ready for this? – 14%.

Some of these numbers can be offset with increases elsewhere.  The University of Ottawa, for instance, has five hundred fewer direct-entry students, but has added five hundred non-direct entry students (presumably, these are Quebec students with CEGEP diplomas).  And of course, there’s the ever-popular solution of adding more international students.  But these are big bucks that institutions are losing.  At York alone, you’re talking about a $5 million hit in tuition (larger if you factor in what will happen to the operating grant).  If there’s no increase next year, you can double that.

The numbers by field of study are even more stunning.  Overall, there was a loss of 2,050 Ontario secondary students.  The decline in Arts enrolment was 2,600.  Put differently: more than 100% of the decline can be attributed to a fall in Arts enrolment.  Hell, even journalism increased slightly.  This should be a wake-up call to Arts faculties – however good a job you think you’re doing, however intrinsically valuable the Arts may be, kids just aren’t buying it the way they used to.  And if you think that isn’t going to have an effect on your budget lines, think again.  Even at those institutions where responsibility-centred budgeting hasn’t taken hold, cash-strapped universities are going to think twice about filling vacant positions in departments where enrolments are declining.

(And lest you think this is just a “kids-just-want-practical-jobs” thing, keep in mind: college new enrolment is also down 2%.)

Is all this a one-off?  Well, the 18-year-old cohort is going to shrink another 6% over the next three years, which will make it difficult for any institution to maintain its numbers over that period.  That’s going to put pressure on budgets across the board, but most particularly at institutions located in places where the demographic picture is weakest.  It’s going to create even more incentives for hiking international student numbers.  And it’s going to set off changes in the distribution of funding and salary lines within institutions.

It was all fun and games until the enrolment boom stopped.  Now it gets interesting.
