Higher Education Strategy Associates

September 18

Systematically Valuing the Wrong Things

Michael Staton is a former teacher, venture capitalist, and founder of Uversity, a data-nerd Strategic Enrolment Management outfit in the US.  He also has some interesting ideas about the “unbundling” of higher education, which have appeared – amongst other places – in Andrew Kelly and Kevin Carey’s recent book Stretching the Higher Education Dollar (ungated copy available here).

His take is that undergraduate education strives to deliver four basic groups of services, and that in at least some of them institutions face competition from “unbundling”, because those services could be delivered by alternative providers.  While Staton occasionally sounds like he’s drunk too much kool-aid when he talks about the likelihood of things being unbundled, his analysis of the relative ease of unbundling various services is quite perceptive.

In declining order of substitutability, these four areas are (and I’m paraphrasing his categories a bit here):

  • Academic Content: If there’s one thing MOOCs and other OERs prove, it’s that – at the undergraduate level at least – content is mostly a commodity.  You can get the substance of an undergraduate degree pretty much anywhere, pretty much for free.  Obviously, this isn’t true at a graduate level, which is where all the prestige is, from the institutional perspective.  But from an undergraduate perspective…
  • Certification of Acquired Knowledge and Skills: All universities offer the same credentials, and the power to offer these credentials remains a gift of the state.  Thus, universities as a class are protected, but at the same time they are unable to distinguish themselves from one another through the undergraduate degree offerings.
  • Acquisition of Learning Meta-Skills: Universities and colleges don’t just teach subject knowledge, they teach people how to approach problems and solve them (or at least they’re supposed to).  Those institutions specializing in (say) co-op or Liberal Arts implicitly have a defined approach to meta-skills acquisition, even if they don’t describe it as such.  But most don’t do so.  They seem to think that just saying “we teach kids how to think” is enough.
  • Life-Altering Experiences:   This is the stuff that really can’t be outsourced.  The experiences one has while at university – the friendships, the life lessons, the transition from adolescence to adulthood – simply can’t be replicated in any place other than a traditional campus.  This is what people really pay for.

What’s interesting here, from a strategy point of view, is the complete misalignment of institutional resources with actual sources of value and distinction.  The two areas where distinctions can actually be made between institutions – meta-skills and life-altering experiences – are the areas where institutions spend the least amount of time and effort. In fact, institutions care so little about the latter that they basically leave it to students themselves (which predictably results in some pretty wild swings in quality, not just between institutions, but also over time at the same institution).  Instead, institutions toss all their money into the (from an undergraduate perspective) “useless content hole”.

Somewhere, sometime, a university will decide to bring a laser-like focus to the issues of meta-skills and experience (indeed, one could argue that Minerva University is about halfway there).  Until then, the misalignment – and the mediocrity of undergraduate studies in general – will continue.

September 17

Welcome to the Crisis

I just took a look at the new enrolment confirmation statistics for Ontario universities.  They are jaw-dropping.

Overall, the system experienced its first fall in “number of confirmed enrolments from secondary school” since (I believe) the early 1990s (I say “I believe” because OUAC doesn’t have public stats that go that far back, but I think that’s right).  Ever since the double-cohort, the province’s universities have seen a steady annual 3% bump in total direct-entry enrolments.  That’s been the source of a useful financial cushion for institutions.

Now, however, things are heading into reverse.  The number of 18-year-olds in the province fell by 2.1% in 2014.  The fall in confirmed direct-entry undergraduate numbers was slightly larger – 2.8%, or about 2,050 students (73,002 in 2013 vs. 70,950 in 2014).  But what was striking was less the total than the distribution of these numbers.
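As a sanity check, the decline works out from the confirmation totals quoted above (a quick sketch in Python, using only the numbers in this post):

```python
# Confirmed direct-entry enrolments, from the OUAC figures cited above
enrol_2013 = 73_002
enrol_2014 = 70_950

drop = enrol_2013 - enrol_2014           # 2,052 students
pct_drop = drop / enrol_2013 * 100       # ≈ 2.8%

print(f"Decline: {drop} students ({pct_drop:.1f}%)")
```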

Let’s start by looking at changes by institution.  Most institutions managed to hit their previous year’s numbers, more or less.  Queen’s did; Ryerson actually managed to boost their numbers by 6%; and Western somehow found a way to jump its numbers by 12%.  But in a system that’s declining overall, those gains can’t come without losses elsewhere.  And some of these numbers are doozies.  Waterloo is down 8%.  Nipissing and York both fell by a little under 11%.  And OCAD University and Laurier are down by – are you ready for this? – 14%.

Some of these numbers can be offset with increases elsewhere.  The University of Ottawa, for instance, has five hundred fewer direct-entry students, but has added five hundred non-direct entry students (presumably, these are Quebec students with CEGEP diplomas).  And of course, there’s the ever-popular solution of adding more international students.  But these are big bucks that institutions are losing.  At York alone, you’re talking about a $5 million hit in tuition (larger if you factor in what will happen to the operating grant).  If there’s no increase next year, you can double that.

The numbers by field of study are even more stunning.  Overall, there was a loss of 2,050 Ontario secondary students.  The decline in Arts enrolment was 2,600.  Put differently: more than 100% of the decline can be attributed to a fall in Arts enrolment.  Hell, even journalism increased slightly.  This should be a wake-up call to Arts faculties – however good a job you think you’re doing, however intrinsically valuable the Arts may be, kids just aren’t buying it the way they used to.  And if you think that isn’t going to have an effect on your budget lines, think again.  Even at those institutions where responsibility-centred budgeting hasn’t taken hold, cash-strapped universities are going to think twice about filling vacant positions in departments where enrolments are declining.

(And lest you think this is just a “kids-just-want-practical-jobs” thing, keep in mind: college new enrolment is also down 2%.)

Is all this a one-off?  Well, the 18-year-old cohort is going to shrink by another 6% over the next three years, which will make it difficult for any institution to maintain its numbers over that period.  That’s going to put pressure on budgets across the board, but most particularly at institutions located in places where the demographic picture is weakest.  It’s going to create even more incentives for hiking international student numbers.  And it’s going to set off changes in the distribution of funding and salary lines within institutions.

It was all fun and games until the enrolment boom stopped.  Now it gets interesting.

September 16

Trivia Time

This is normally the time of year when I award the “worst-back-to-school story” award.  But I’m not going to this year.  As a result either of this blog’s massive influence, or sheer bad luck, no pundit or op-ed writer wrote anything really stupid this year.  Really, the worst was this credulous CBC news story covering the CCPA’s latest nose-stretcher on tuition fees.  And while it’s needlessly sloppy on the part of the CBC journalist (Dude!  You couldn’t think of one single person outside the CCPA to go to for a reaction quote? Seriously?), it doesn’t really contain the sort of sustained inanity that this award demands.  So I’m leaving it unawarded this year.  Sorry.

But that leaves me with a dilemma – how to entertain the mass readership of this blog without having some op-ed writer on whom to tee off?  So I decided to go with a quiz.  Folks, time to test your knowledge of Canadian universities.  Let’s begin:

1)      Which Canadian University lost ten percent of its inaugural class to Typhoid?

2)      David Dodge recently stepped down as Chancellor of Queen’s University; who was the only other Governor of the Bank of Canada to serve as a university chancellor (and which university was it)?

3)      The YMCA was deeply involved with the creation of two universities in Ontario (York and Carleton), but only one Canadian University actually began its life as the adult learning branch of the local YMCA.  Which one was it?

4)      Only two Canadian universities can boast three or more Prime Ministers among their alumni.  Which ones?

5)      Only one Canadian university has the distinction of having had two Prime Ministers drop out.  Which is it?

6)      Which western Canadian university began its life as an affiliate of McMaster University?

7)      What is a “martlet” and what two Canadian universities have it as their mascot?

8)      The University of Waterloo started life as a satellite college of another institution, and only became independent because of a fight with the parent institution over the issue of co-op education (the Waterloo folks loved it – the parent campus thought it was a supremely dumb idea).  Name the parent institution.

9)      Two Canadian universities are the result of mergers between one Catholic and one Protestant College.  Name them.

10)   Canada is the world’s largest exporter of lentils, and one-third of all world lentils come from strains originally bred at one Canadian university.  Name it.

11)   Which Canadian university has the highest proportion of international students?

12)   Which Australian university has a campus in Canada?

The person with the best set of answers sent to info@higheredstrategy.com gets to choose a future OTTSYD topic; plus, they get huge – HUGE – bragging rights.  Find the answers tomorrow in the grey box.

September 12

Hosanna! *More* Graduate Income Data!

Okay, so I goofed on Tuesday.  Contrary to what I said, Colleges Ontario actually does publish sector-wide data on graduate incomes six months out – they just don’t publish it with the rest of the KPI data.  Instead, it’s at the back of the graduate outcomes section of their excellent annual Environment Scan (thanks to Glenn for the heads up).  So let’s take a look at what they say.

On Tuesday we noted that graduate employment outcomes for college graduates six-months out seemed to have taken a bigger knock in the recession than university graduates.  To wit:

Figure 1: Percent of Ontario Graduates Employed Six-Months Out, by Graduating Class

That said, although employment results have fallen significantly, the picture is somewhat better when you look at the changes in graduate incomes.  Now, looking at “college” outputs is always a bit tricky because colleges offer so many different kinds of credentials.  In Figure 2, we look at change-over-time in incomes for holders of each credential, and also the weighted average for all credentials.

Figure 2: Income of Ontario College Graduates Six-Months After Graduation, by Credential Level, by Graduating Class, in $2011

In one respect, Figure 2 is about what you’d expect: the longer a college program lasts, the more a college graduate makes (graduate certificates are a partial exception in that they are usually one-year, but they are meant to be delivered after four years of university, so the basic rule still holds).  But it also shows that in real terms, the diplomas, advanced diplomas, and graduate certificates have held their value reasonably well, while certificates have lost 4% (degrees have lost more – 5% – but the numbers there are tiny and therefore subject to a bit more volatility).
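For what it’s worth, the “weighted average for all credentials” in Figure 2 is just an enrolment-weighted mean across the credential types.  A quick sketch of that calculation – with made-up incomes and graduate counts, not the actual Colleges Ontario figures:

```python
# Hypothetical six-month-out incomes by credential, in $2011
incomes = {
    "certificate": 28_000,
    "diploma": 32_000,
    "advanced diploma": 36_000,
    "graduate certificate": 41_000,
    "degree": 45_000,
}
# Hypothetical graduate counts per credential (the weights)
enrolment = {
    "certificate": 10_000,
    "diploma": 25_000,
    "advanced diploma": 12_000,
    "graduate certificate": 5_000,
    "degree": 1_000,
}

total = sum(enrolment.values())
weighted_avg = sum(incomes[c] * enrolment[c] for c in incomes) / total
print(f"Weighted average income: ${weighted_avg:,.0f}")
```

Note how the big diploma cohort dominates the average: the tiny degree group barely moves it, which is why volatility in small categories matters less for the all-credential line.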

Now let’s see how that compares to the six-month numbers at universities, which are below in Figure 3:

Figure 3: Income of Ontario University Graduates Six-Months After Graduation, by Field of Study, by Graduating Class, in $2011

Figure 3 shows a slightly different picture than what we saw with the 2-year out data back on Monday.  Humanities and physical sciences still saw the largest fall, but at six months the 2011 computer science grads were 7% up on the class of 2007 (as opposed to 3% down at the 2-year mark), and overall the decline was 6% (as opposed to 13%).  This implies that the job market for recent graduates actually got significantly worse between 2011 and 2013.

Finally, let’s compare college and university averages.

Figure 4: Income of Ontario College and University Graduates Six Months After Graduation, in $2011

Figure 4 shows – unsurprisingly – that university graduates make more than college graduates six-months out.  But it also shows that college credentials seem to be holding their value better than undergraduate degrees – down just 2%, rather than 6% for universities, over the period 2007-2011.

This is one of those rare cases where employment averages and income averages are moving in different directions.  In one sense, everyone wins: in the near future, expect Ontario universities to promote themselves as a way to a safe job, and Ontario colleges to talk about how their credentials hold their value in bad times.  And they’ll both be correct.

September 11

Those OECD Attainment Numbers

The OECD’s annual Education at a Glance publication was released on Tuesday.  There weren’t a whole lot of shockers in there, but one thing that always sets Canadians crowing is the table that looks at tertiary educational attainment because, at first glance, we seem to do really well on that measure.  To wit:

Figure 1: Tertiary Attainment Rates, 25-64 Year Olds, Canada, OECD Average and Select OECD Countries, 2012

Yay Canada!  We’re number one!

Well, hang on a second.  A lot of that is because we’ve been in the mass higher education game longer than anyone else: indeed our 55-64 year olds are in a class of their own.  But when it comes to educating young people, it’s a different story.  To wit:

Figure 2: Tertiary Attainment Rates, 25-34 Year Olds, Canada, OECD Average and Select OECD Countries, 2012

Okay, still not bad.  Our attainment rate is higher among young people than the general population (57% vs. 53%), which means we’re making some progress, albeit slow.  And we’re still 18 points above the OECD average.  But notice we’re third now, behind Korea and Japan.

Now let’s look specifically at degree-level attainment rates – or, what in international educational statistics-speak is known as “Tertiary Level 5A” – as opposed to tertiary rates.

Figure 3: Tertiary 5A Attainment Rates, 25-34 Year Olds, Canada, OECD Average and Select OECD Countries, 2012

See, now this is quite a different picture.  If we’re simply looking at obtaining degrees, Canada is actually below the OECD average.  We’d need to increase our degree attainment rates by almost 50% to be in first place here.  The reason for this, of course, is that unlike most countries, Canada has a big “Tertiary B” sector, as shown below in Figure 4.

Figure 4: Tertiary 5B Attainment Rates, 25-34 Year Olds, Canada, OECD Average and Select OECD Countries, 2012

Interpreting the 5B data is a bit tricky, partly because Tertiary B data looks very different depending on the country, but also partly because the Canadian data is a mess.  In some countries, Tertiary B is purely vocational; in others (for instance, Korea and the US) the figures include junior college associate degrees, which in other countries would be considered incomplete 5A degrees.

In Canada, the Tertiary B figure is mostly traditional community college/polytechnic completers below degree level.  But it also includes a number of other types that make it not entirely comparable to other countries, including:

  • CEGEP Graduates who Never Went on to Universities.  Few other countries would consider people with only 13 years of schooling to be Tertiary B;
  • Trade/Apprenticeship Certificate Holders.  In Europe, where apprenticeship systems are considered part of the secondary system, these kinds of programs would be considered level 4 or even level 3;
  • Private Vocational College Credential Holders.  Again, these are usually one year or less in duration, and it’s unlikely most countries would consider them equivalent to Tertiary B.

Now, there’s nothing sinister here – these differences aren’t an attempt to “juice” our numbers.  The first two issues are the result of structural differences – part and parcel of the difficulties of trying to standardize data internationally across not-entirely-parallel systems.  The third one – private vocational credentials – is an outgrowth of the fact that this data is taken from various Labour Force Surveys, and the wording of the relevant question on our survey is just a little looser than in other countries.

All of this is to say that a “Canada’s Number One” narrative based on the OECD numbers isn’t necessarily warranted.  In some ways, we may even be falling behind.

September 10

How StatsCan Measures Changes in Tuition

Every September, Statistics Canada publishes data on “average tuition fees”. It’s a standard date on the back-to-school media calendar, where everyone gets to freak out about the cost of education.  And we all take it for granted that the data StatsCan publishes is “true”.  But there are some… subtleties… to the data that are worth pointing out.

Statistics Canada collects data on tuition from individual institutions through the Tuition and Living Accommodation Costs (TLAC) survey.  For each field of study at each institution, TLAC asks for “lower” and “upper” fees, separately for Canadian and foreign students, for both graduate and undergraduate students.  Now, in provinces where the “upper” and “lower” figures are the same (e.g. Newfoundland), it’s pretty simple to translate lower/upper into an “average”.  In Quebec and Nova Scotia, where “upper” and “lower” are functionally equivalent to “in-province” and “out-of-province”, averages can be worked out simply by cross-referencing to PSIS enrolment data, and weighting the numbers according to place of student origin.  Everywhere else, it’s a total mess.  In Ontario, significant variation between “upper” and “lower” numbers is the norm, even inside an institution (for instance, with different tuition levels for different years of study).  Somehow, StatsCan uses some kind of enrolment weighting to produce an average, but how the weights are derived is a mystery.  Finally, in a couple of provinces where there are differences between the “lower” and “upper” figures, StatsCan simply uses the “lower” figure as the average.  (No, I have absolutely no idea why.)
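To make the Quebec/Nova Scotia case concrete, the cross-referencing described above amounts to weighting the “lower” (in-province) and “upper” (out-of-province) fees by enrolment counts.  A sketch, with invented fees and headcounts rather than real TLAC/PSIS figures:

```python
# "Lower" and "upper" TLAC fees, read here as in-province / out-of-province
lower_fee, upper_fee = 2_800, 7_500

# Hypothetical PSIS headcounts by student origin
in_prov, out_prov = 180_000, 40_000

# Enrolment-weighted average: each fee counts in proportion to the
# number of students actually paying it
avg_tuition = (lower_fee * in_prov + upper_fee * out_prov) / (in_prov + out_prov)
print(f"Enrolment-weighted average tuition: ${avg_tuition:,.2f}")
```

Because in-province students dominate the headcount, the average lands much closer to the “lower” figure than to the “upper” one.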

But the tuition data is squeaky clean compared to the mess that is StatsCan’s data on ancillary fees.  Institutions fill in the ancillary fee part of the questionnaire every year, but usually without much reference to what was reported the year before.   Since StatsCan doesn’t have the staff to thoroughly check the information, institutional figures swing pretty wildly up and down from one year to the next, even though everyone knows perfectly well ancillary fees only ever go in one direction.

Another complication is that “average” is a central tendency – it is affected not just by posted prices, but also by year-to-year shifts in enrolments.  As students switch from cheaper to more expensive programs (e.g. out of humanities and into professional programs), average tuition rises.  As student populations grow more quickly in the more expensive provinces (e.g. Ontario) than in cheaper ones (e.g. Quebec, Newfoundland), then again average tuitions rise – even if all fees stayed exactly the same.  Both of these things are in fact happening, and are small but noticeable contributors to the “higher tuition” phenomenon.
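You can see the composition effect in a toy example: hold every posted fee constant, shift the enrolment mix toward the pricier program, and the “average” still rises.  (Fees and headcounts below are invented.)

```python
# Posted fees by program -- frozen across both years
fees = {"humanities": 5_500, "engineering": 9_000}

# Enrolment mix shifts toward the pricier program in year two
year1 = {"humanities": 60_000, "engineering": 40_000}
year2 = {"humanities": 50_000, "engineering": 50_000}

def avg_fee(enrol):
    """Enrolment-weighted average tuition for a given mix."""
    total = sum(enrol.values())
    return sum(fees[p] * n for p, n in enrol.items()) / total

print(avg_fee(year1))  # 6900.0
print(avg_fee(year2))  # 7250.0 -- higher, though no posted fee changed
```

A $350 jump in “average tuition” with zero actual fee increases: that’s the composition effect in miniature.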

A final complicating factor: the data on tuition and the data on enrolment by which it’s weighted come from completely different years.  Tuition is up-to-the-minute: the 2014-15 data will be from the summer of 2014; the enrolment data by which it is weighted will be 2012-3.  And, to make things even weirder, when StatsCan presents the ’14-15 data next year as a baseline against which to measure the ’15-16 data, it will be on the basis of revised figures weighted by an entirely different year’s enrolment data (2013-4).

In short, using StatsCan tuition data is a lot like eating sausages: they’re easier to digest if you don’t know how they’re made.

September 09

More Graduate Labour Market Data

Yesterday I showed that recent Ontario university graduates’ incomes are taking a beating, notably in Arts and Sciences.  I’m sure this led to a fair bit of crowing among those who claim we have too many students in university, and they all oughta go to college instead because skills, new economy, yadda yadda.

The problem with that argument is that college grads are getting creamed in the labour market, too.

Now, we can’t compare university and college outcomes in terms of incomes because Colleges Ontario – unlike COU – doesn’t publish income data from the provincial graduate survey.  Nor can we compare graduate employment rates over 2 years because – for reasons that pretty much defy any rational explanation – the Ontario government doesn’t track college graduates past the first six months.  But at the six-month-after-graduation point, we do at least have a point of comparison between the two sectors.  Let’s see how it looks:

Figure 1: Percent of Ontario Graduates Employed Six-Months Out, by Graduating Class

So, as with yesterday, let’s take this graph with a grain of salt – this is the employment rate excluding those who go on to do further study.  That said, what it shows is that while both college and university graduates in Ontario took a hit in the recession, the hit was actually considerably bigger on the college side: what was a one and a half percentage-point gap in six-month-out employment rates before the recession is now almost a five percentage point gap.  And that’s not simply due to good results among professional programs in universities – even Humanities grads have increased the gap.

(NB: the drop at most colleges hasn’t been quite this drastic; but the 6-month employment rate deterioration at some of the big Toronto colleges – 9% at Sheridan and Seneca, and 13% at Centennial – throws the average off quite a bit.)

So the fall in real incomes appears to be generalized across post-secondary education rather than specific to general Arts/Science programs at universities.  Why might this be?  Well, first, it’s worth bearing in mind that our baseline year – 2007 – was a reasonably good year for youth employment in Ontario.  Indeed: only three of the last 23 years saw lower youth unemployment rates than 2007.  So comparing to 2007 means we’re starting from a high base.

Figure 2: Unemployment Rates for 20-24 Year-Olds, Ontario

And the other thing to remember is that the graduating class of 2011 was more than 10% larger than the class of 2007, because of the major expansion of access that occurred in the aughts.  This will tend to move the average downward in two ways: first, through competition (more grads chasing a similar number of jobs could result in the bidding-down of labour prices); and second, through composition effects (as access increased, universities were likely also taking in more students of slightly lower ability, which could also push down the average wage a bit).

None of this is meant to deny that university grads in Ontario are having a bad few years; it is, however, to say that there are a number of structural factors that need to be taken into account before jumping to the conclusion that universities “aren’t worth it”.

September 08

Some Scary Graduate Income Numbers

Last week, the Council of Ontario Universities put out a media release with the headline “Ontario University Graduates are Getting Jobs”, and trumpeted the results of the annual provincial graduates survey, which showed that 93% of undergraduates had jobs two years after graduation, and their income was $49,398.  Hooray!

But the problem – apart from the fact that it’s not actually 93% of all graduates with jobs, but rather 93% of all graduates who are in the labour market (i.e. excluding those still in school) – is that the COU release neither talks about what’s going on at the field of study level, nor places the data in any kind of historical context.  Being a nerd, I collect these things when they come out each year and put the results in a little Excel sheet.  Let’s just say that when you do compare these results to earlier years, things look considerably less rosy.

Let’s start with the employment numbers, which look like this:

Figure 1: Employment Rate of Ontario Graduates 2 Years Out, Classes of 1999 to 2011

Keep your eye on the class of 2005 – this was the last group to be measured 2 years out before the recession began (i.e. in 2007).  They had overall employment rates of about 97%, meaning that today’s numbers actually represent a 4-point drop from there.  If you really wanted to be mean about it, you could equally say that graduate unemployment in 2013 had doubled since 2007.  But look also at what’s happened to the Arts disciplines: in the first four years of the slowdown, their employment rates fell about two percentage points more than the average (though, since the class of ’09, their employment has levelled out).

Still, one might think: employment rates in the 90s – not so bad, given the scale of the recession.  And maybe that’s true.  But take a look at the numbers on income:

Figure 2: Average Income (in $2013) 2 Years After Graduation, Ontario Graduating Classes from 2003-2011, Selected Disciplines

Figure 2 is unequivocally bad news.  The average in every single discipline is below where it was for the class of 2005.  Across all disciplines, the average is down 13%.  Engineering and Computer Science are down the least, and have made some modest real gains in the last couple of years; for everyone else, the decline is in double-digits.  Business: down 11%.  Humanities: down 20%.  Physical Sciences: down 22% (more evidence that generalizations about STEM disciplines are nonsense).

Now, at this point some of you may be saying: “hey, wait a minute – didn’t you say last year that incomes 2 years out were looking about the same as they did for the class of 2005?”  Well, yes – but you may also recall that a couple of days later I called it back because Statscan did a whoopsie and said: “you know that data we said was two years after graduation?  Actually it’s three years out”.

Basically, the Ontario data is telling us that 2 years out ain’t what it used to be, and the Statscan data is telling us that three years out is the new two; put simply, it now takes 36 months for graduates to reach the point they used to reach in 24.  That’s not a disaster by any means, but it does show that – in Ontario at least – recent graduates are having a tougher time in the recession.

Tomorrow: more lessons in graduate employment data interpretation.

September 05

Better Know a Higher Ed System: New Zealand

We don’t hear much up here about New Zealand higher education, mainly because the country’s tiny, and literally located at the end of the earth.  But that’s a pity, because it’s an interesting system with a lot to like about it.

The country’s university system is pretty ordinary: eight universities, three of which were founded in the 19th century, and the rest founded after WWII. All of them are pretty much based on English lines, with just one – Auckland – generally considered to be “world-class”.  Rather, what makes New Zealand an interesting higher education system is what happens outside the universities.

About 30 years ago, New Zealand came close to bankruptcy; in response, the government moved to sharply liberalize the economy.  In education, this meant eliminating established educational monopolies, and widening the ability to provide education: anyone who wanted to deliver a degree or a diploma could do so, provided they could meet an independent quality standard.  Polytechnics – equivalent to our colleges – started offering degrees (in the process becoming an inspiration to our own colleges, some of whom proceeded to push for their own degree-granting status, and labelled themselves “polytechnics”), and hundreds of private providers started offering diplomas.  Despite this liberalization, the system is still able to enforce a qualifications framework, which allows people to stack lower-level qualifications towards higher-level ones – and that’s down to having a serious high-quality regulator in the New Zealand Qualifications Authority.

Another major system feature is the “wānanga”.  The term is a Maori word indicating traditional knowledge, but in practice it has come to mean “Maori polytechnic” (the country’s universities all use the term “Whare Wānanga” – meaning “place of learning” – to translate their names into Maori).  There are three of these, two of which are tiny (fewer than 500 students), and one of which is freaking massive (38,000 today, down from a peak of 65,000 ten years ago). I’ll tell you the story of Te Wānanga o Aotearoa another time, because it deserves its own blog post.  But for the moment just keep in mind that in New Zealand, wānangas are considered the fourth “pillar” of higher education (along with universities, polytechnics, and privates), and that these institutions, entirely run by Maori, have had an enormously positive impact on Maori educational attainment rates (see this previous blog for stats on that).

A last point to note about NZ is its international strategy.  Like our government, New Zealand’s aims in this area are pretty mercantilist: students in = money in = good.  It could not possibly care less about outward mobility or other touchy-feely stuff.  What distinguishes their strategy from ours, however, is that theirs is smart.  Brilliant, actually.  Take a couple of minutes to compare Canada’s laughably thin and one-dimensional policy with Education New Zealand’s unbelievably detailed set of strategies, goals, and tactics laid out not just for the country as a whole, but for each of six key sub-sectors: universities, colleges, privates, primary/secondary schools, the English language sector, and educational service/product providers.  That, my friends, is a strategy.  Now ask yourself: why can’t we produce something that good?

In short, there’s a lot Canadians could learn from New Zealand – if only we paid more attention.

September 04

Who’s Relatively Underfunded?

As I said yesterday, there’s a quick way to check claims of relative underfunding in block-grant provinces: take each institution’s enrolment numbers by field of study from Statscan’s Post-Secondary Student Information System (PSIS), plug those numbers into the Ontario and Quebec funding formulas, and then compare each institution’s hypothetical share of total provincial weighted student units (WSUs) under those formulas to what we know they actually receive via CAUBO’s annual Financial Information of Universities and Colleges (FIUC) Survey.
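For the curious, the exercise boils down to something like the sketch below: apply a formula’s field-of-study weights to each institution’s FTEs, then compare shares of the resulting weighted student units.  The weights and FTEs here are invented placeholders, not the actual Ontario or Quebec formula values:

```python
# Hypothetical funding-formula weights by field of study
weights = {"arts": 1.0, "science": 1.5, "medicine": 5.0}

# Hypothetical FTEs by field for two institutions (stand-ins for PSIS data)
ftes = {
    "Univ A": {"arts": 8_000, "science": 3_000, "medicine": 1_000},
    "Univ B": {"arts": 5_000, "science": 1_000, "medicine": 0},
}

# Weighted student units: each FTE counts according to its field's weight
wsus = {u: sum(f[k] * weights[k] for k in f) for u, f in ftes.items()}
total = sum(wsus.values())

for u, w in wsus.items():
    print(f"{u}: {w / total:.1%} of hypothetical formula funding")
```

The point of the comparison: an institution heavy in high-weight fields (like “Univ A” with its medical students) earns a larger share of WSUs than its raw FTE share suggests, which is exactly the Manitoba/Winnipeg pattern discussed below.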

Simple, right? Well, no, not really, but I have some really talented staff who do this stuff for me (Hi Jackie!), so let’s go look at the data.

Let’s start with Manitoba, where pretty much every second day you can hear the University of Winnipeg making a case about relative underfunding (say what you will about Lloyd Axworthy: the man knows how to keep his message in the newspapers).  But is the claim true?

Figure 1: FTEs, Weighted FTEs, and Actual Funding, Manitoba Universities

Here’s what Figure 1 says:  The University of Manitoba has 69% of the province’s students, but receives 79% of all provincial funding (this is from 2011-12); The University of Winnipeg, on the other hand, has 24% of the students, but only 13% of the total funding.  Clear cut case of underfunding, right?

Well, not entirely.  The fact is that U of M has a lot more students in high-cost disciplines than does Winnipeg.  If U of M were in Ontario, it would get 75% of provincial funding; if it were in Quebec (where the formula is slightly more tilted towards medical disciplines), it would get 77%.  So U of M receives slightly more funding than it would in other provinces, as does – to a relatively greater degree – Brandon University.  And Winnipeg does receive less than it would if it were in another province: $18 million less than if Manitoba used Quebec’s formula, and $25 million less than if Ontario’s were used.  That’s a big gap, but still smaller than it would appear from looking at FTEs alone.

Now, on to New Brunswick.  One has to be a little careful about making inter-institutional comparisons with CAUBO data in New Brunswick because of the peculiar arrangement between UNB and St. Thomas (STU).  Because the two share the former’s campus, the provincial government sends UNB a little bit extra (and STU a little bit less) in order to cover extra costs.  So, with that in mind, let’s look at the data:

Figure 2: FTEs, Weighted FTEs, and Actual Funding, New Brunswick Universities

New Brunswick looks a bit different than Manitoba, where the biggest university is overfunded.  In New Brunswick, it’s UNB that actually seems to be doing badly, receiving 50% of all money when, in Ontario, it would receive 54%, and in Quebec it would receive 59% (and remember, that 50% is actually inflated a bit because of the money to support STU students).  The institution that really seems to be overfunded in New Brunswick is Moncton, which is receiving $13 million more than it would if New Brunswick used either the Quebec or Ontario formulae.

So, yes Virginia, relative underfunding does exist in Manitoba and New Brunswick.  This probably wouldn’t be the case if either province ever bothered to put its institutional funding on an empirical footing, via a funding formula.  But that would create winners (likely Winnipeg & UNB) and losers (likely Brandon, Moncton and, to a lesser extent, Manitoba).  And what politician likes to hear that?
