HESA

Higher Education Strategy Associates

September 10

How StatsCan Measures Changes in Tuition

Every September, Statistics Canada publishes data on “average tuition fees”.  It’s a standard date on the back-to-school media calendar, when everyone gets to freak out about the cost of education.  And we all take it for granted that the data StatsCan publishes is “true”.  But there are some… subtleties… to the data that are worth pointing out.

Statistics Canada collects tuition data from individual institutions through the Tuition and Living Accommodation Costs (TLAC) survey.  For each field of study at each institution, TLAC asks for “lower” and “upper” fees, separately for Canadian and foreign students, at both the graduate and undergraduate levels.  Now, in provinces where the “upper” and “lower” figures are the same (e.g. Newfoundland), it’s pretty simple to translate lower/upper into an “average”.  In Quebec and Nova Scotia, where “upper” and “lower” are functionally equivalent to “in-province” and “out-of-province”, averages can be worked out simply by cross-referencing PSIS enrolment data and weighting the numbers according to place of student origin.  Everywhere else, it’s a total mess.  In Ontario, significant variation between “upper” and “lower” numbers is the norm, even within a single institution (for instance, where tuition differs by year of study).  Somehow, StatsCan uses some kind of enrolment weighting to produce an average, but how the weights are derived is a mystery.  Finally, in a couple of provinces where there are differences between the “lower” and “upper” figures, StatsCan simply uses the “lower” figure as the average.  (No, I have absolutely no idea why.)
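To make the weighting issue concrete, here’s a minimal sketch of what an enrolment-weighted tuition average looks like, using entirely made-up fees and enrolments (StatsCan’s actual procedure, as noted, is opaque):

```python
# Minimal sketch of an enrolment-weighted "average tuition" (all figures
# invented; the real exercise would use TLAC fees and PSIS enrolments).

programs = [
    # (field of study, tuition fee, enrolment)
    ("Humanities", 5_500, 8_000),
    ("Engineering", 9_500, 3_000),
    ("Commerce", 8_200, 4_000),
]

total_enrolment = sum(n for _, _, n in programs)
weighted_avg = sum(fee * n for _, fee, n in programs) / total_enrolment

print(f"${weighted_avg:,.0f}")  # $7,020
```

The entire question is where those enrolment weights come from – and, for most provinces, that derivation is a black box.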

But the tuition data is squeaky clean compared to the mess that is StatsCan’s data on ancillary fees.  Institutions fill in the ancillary fee part of the questionnaire every year, but usually without much reference to what was reported the year before.   Since StatsCan doesn’t have the staff to thoroughly check the information, institutional figures swing pretty wildly up and down from one year to the next, even though everyone knows perfectly well ancillary fees only ever go in one direction.

Another complication is that an “average” is a measure of central tendency – it is affected not just by posted prices, but also by year-to-year shifts in enrolment.  As students switch from cheaper to more expensive programs (e.g. out of humanities and into professional programs), average tuition rises.  As student populations grow more quickly in the more expensive provinces (e.g. Ontario) than in the cheaper ones (e.g. Quebec, Newfoundland), average tuition again rises – even if every posted fee stays exactly the same.  Both of these things are in fact happening, and they are small but noticeable contributors to the “higher tuition” phenomenon.
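A toy example shows how this works – in the sketch below, every posted fee is frozen, yet the average rises simply because 1,000 hypothetical students shift from a cheaper program to a pricier one:

```python
# Toy illustration: every posted fee stays frozen, but the average rises
# because enrolment shifts toward the pricier program.  Figures invented.

fees = {"humanities": 5_000, "professional": 10_000}

enrol_before = {"humanities": 6_000, "professional": 4_000}
enrol_after = {"humanities": 5_000, "professional": 5_000}  # 1,000 students shift

def average_tuition(enrol):
    total = sum(enrol.values())
    return sum(fees[p] * n for p, n in enrol.items()) / total

print(average_tuition(enrol_before))  # 7000.0
print(average_tuition(enrol_after))   # 7500.0 -- no fee changed
```

That’s roughly a 7% jump in “average tuition” with zero fee increases; the same mechanism operates across provinces as well as across programs.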

A final complicating factor: the data on tuition and the enrolment data by which it is weighted come from completely different years.  Tuition is up-to-the-minute: the 2014-15 data will be from the summer of 2014; the enrolment data by which it is weighted will be from 2012-13.  And, to make things even weirder, when StatsCan presents the 2014-15 data next year as a baseline against which to measure the 2015-16 data, it will be on the basis of revised figures weighted by an entirely different year’s enrolment data (2013-14).

In short, using StatsCan tuition data is a lot like eating sausages: both are easier to digest if you don’t know how they’re made.

September 09

More Graduate Labour Market Data

Yesterday I showed that recent Ontario university graduates’ incomes are taking a beating, notably in Arts and Sciences.  I’m sure this led to a fair bit of crowing among those who claim we have too many students in university, and that they all oughta go to college instead because skills, new economy, yadda yadda.

The problem with that argument is that college grads are getting creamed in the labour market, too.

Now, we can’t compare university and college outcomes in terms of incomes because Colleges Ontario – unlike COU – doesn’t publish income data from the provincial graduate survey.  Nor can we compare graduate employment rates over 2 years because – for reasons that pretty much defy any rational explanation – the Ontario government doesn’t track college graduates past the first six months.  But at the six-month-after-graduation point, we do at least have a point of comparison between the two sectors.  Let’s see how it looks:

Figure 1: Percent of Ontario Graduates Employed Six-Months Out, by Graduating Class


So, as with yesterday’s data, let’s take this graph with a grain of salt – this is the employment rate excluding those who go on to further study.  That said, what it shows is that while both college and university graduates in Ontario took a hit in the recession, the hit was considerably bigger on the college side: what was a one-and-a-half percentage-point gap in six-month-out employment rates before the recession is now almost a five percentage-point gap.  And that’s not simply due to good results among professional programs in universities – even Humanities grads have widened the gap.

(NB: the drop at most colleges hasn’t been quite this drastic; but the six-month employment rate deterioration at some of the big Toronto colleges – 9% at Sheridan and Seneca, and 13% at Centennial – throws the average off quite a bit.)

So the fall in real incomes appears to be generalized across post-secondary education rather than specific to general Arts/Science programs at universities.  Why might this be?  Well, first, it’s worth bearing in mind that our baseline year – 2007 – was a reasonably good year for youth employment in Ontario.  Indeed, only three of the last 23 years saw lower youth unemployment rates than 2007.  So comparing to 2007 means we’re starting from a high base.

Figure 2: Unemployment Rates for 20-24 Year-Olds, Ontario


And the other thing to remember is that the graduating class of 2011 was more than 10% larger than the class of 2007, because of the major expansion of access that occurred in the aughts.  This will tend to push the average downward in two ways: first, through competition (more grads chasing a similar number of jobs can bid down the price of labour); and second, through composition effects (as access increased, universities were likely also taking in students of slightly lower ability, which could also push down the average wage a bit).
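These two effects can in principle be separated with a standard shift-share decomposition.  The sketch below uses invented shares and incomes, not the Ontario survey data:

```python
# Hypothetical shift-share sketch: splitting a change in average graduate
# income into a "within-field" effect (incomes falling inside each field)
# and a "composition" effect (the mix of graduates shifting).  All numbers
# here are invented for illustration.

# (share of graduates, average income) by field, for two cohorts
base = {"arts": (0.40, 40_000), "professional": (0.60, 55_000)}
new = {"arts": (0.45, 38_000), "professional": (0.55, 53_000)}

avg_base = sum(share * income for share, income in base.values())
avg_new = sum(share * income for share, income in new.values())

# within: new incomes at old shares; composition: new shares at old incomes
within = sum(base[f][0] * (new[f][1] - base[f][1]) for f in base)
composition = sum((new[f][0] - base[f][0]) * base[f][1] for f in base)
interaction = (avg_new - avg_base) - within - composition

print(f"total change: {avg_new - avg_base:.0f}")  # -2750
print(f"within-field: {within:.0f}, composition: {composition:.0f}")
```

In this toy case, the within-field effect accounts for most of the decline and composition adds the rest – which is the kind of decomposition you’d want to see before attributing the whole drop to a weak labour market.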

None of this is meant to deny that university grads in Ontario are having a bad few years; it is, however, to say that there are a number of structural factors that need to be taken into account before jumping to the conclusion that universities “aren’t worth it”.

September 08

Some Scary Graduate Income Numbers

Last week, the Council of Ontario Universities put out a media release with the headline “Ontario University Graduates are Getting Jobs”, trumpeting the results of the annual provincial graduates survey, which showed that 93% of undergraduates had jobs two years after graduation, and that their average income was $49,398.  Hooray!

But the problem – apart from the fact that it’s not actually 93% of all graduates with jobs, but rather 93% of all graduates who are in the labour market (i.e. excluding those still in school) – is that the COU release neither looks at what’s going on at the field-of-study level, nor places the data in any kind of historical context.  Being a nerd, I collect these things when they come out each year and put the results in a little Excel sheet.  Let’s just say that when you do compare these results to earlier years, things look considerably less rosy.

Let’s start with the employment numbers, which look like this:

Figure 1: Employment Rate of Ontario Graduates 2 Years Out, Classes of 1999 to 2011


Keep your eye on the class of 2005 – this was the last group to be measured two years out before the recession began (i.e. in 2007).  They had an overall employment rate of about 97%, meaning that today’s numbers actually represent a four-point drop from there.  If you really wanted to be mean about it, you could equally say that graduate unemployment doubled between 2007 and 2013.  But look also at what’s happened to the Arts disciplines: in the first four years of the slowdown, their employment rates fell about two percentage points more than the average (though, since the class of ’09, their employment has levelled out).

Still, one might think: employment rates still in the 90%-range – not so bad, given the scale of the recession.  And maybe that’s true.  But take a look at the numbers on income:

Figure 2: Average Income (in $2013) 2 Years After Graduation, Ontario Graduating Classes from 2003-2011, Selected Disciplines


Figure 2 is unequivocally bad news.  The average in every single discipline is below where it was for the class of 2005.  Across all disciplines, the average is down 13%.  Engineering and Computer Science are down the least, and have made some modest real gains in the last couple of years; for everyone else, the decline is in double-digits.  Business: down 11%.  Humanities: down 20%.  Physical Sciences: down 22% (more evidence that generalizations about STEM disciplines are nonsense).

Now, at this point some of you may be saying: “hey, wait a minute – didn’t you say last year that incomes two years out were looking about the same as they did for the class of 2005?”  Well, yes – but you may also recall that a couple of days later I walked it back because StatsCan did a whoopsie and said: “you know that data we said was two years after graduation?  Actually, it’s three years out”.

Basically, the Ontario data is telling us that two years out ain’t what it used to be, and the StatsCan data is telling us that three years out is the new two; simply put, it now takes 36 months for graduates to reach the point they used to reach in 24.  That’s not a disaster by any means, but it does show that – in Ontario at least – recent graduates are having a tougher time in the recession.

Tomorrow: more lessons in graduate employment data interpretation.

September 05

Better Know a Higher Ed System: New Zealand

We don’t hear much up here about New Zealand higher education, mainly because the country’s tiny, and literally located at the end of the earth.  But that’s a pity, because it’s an interesting system with a lot to like about it.

The country’s university system is pretty ordinary: eight universities, three founded in the 19th century and the rest after WWII.  All of them are pretty much modelled on English lines, with just one – Auckland – generally considered to be “world-class”.  Rather, what makes New Zealand an interesting higher education system is what happens outside the universities.

About 30 years ago, New Zealand came close to bankruptcy; in response, the government moved to sharply liberalize the economy.  In education, this meant eliminating established educational monopolies and widening the ability to provide education: anyone who wanted to deliver a degree or a diploma could do so, provided they could meet an independent quality standard.  Polytechnics – the equivalent of our colleges – started offering degrees (in the process becoming an inspiration to our own colleges, some of which proceeded to push for their own degree-granting status and labelled themselves “polytechnics”), and hundreds of private providers started offering diplomas.  Despite this liberalization, the system is still able to enforce a qualifications framework, which allows people to stack lower-level qualifications towards higher-level ones – and that’s down to having a serious, high-quality regulator in the New Zealand Qualifications Authority.

Another major system feature is the “wānanga”.  The term is a Maori word denoting traditional knowledge, but in practice it has come to mean “Maori polytechnic” (the country’s universities all use the term “Whare Wānanga” – meaning “place of learning” – to translate their names into Maori).  There are three of these, two of which are tiny (fewer than 500 students), and one of which is freaking massive (38,000 today, down from a peak of 65,000 ten years ago).  I’ll tell you the story of Te Wānanga o Aotearoa another time, because it deserves its own blog post.  But for the moment, just keep in mind that in New Zealand, wānangas are considered the fourth “pillar” of higher education (along with universities, polytechnics, and privates), and that these institutions, entirely run by Maori, have had an enormously positive impact on Maori educational attainment rates (see this previous blog for stats on that).

A last point to note about NZ is its international strategy.  Like our government, New Zealand’s aims in this area are pretty mercantilist: students in = money in = good.  It could not possibly care less about outward mobility or other touchy-feely stuff.  What distinguishes their strategy from ours, however, is that theirs is smart.  Brilliant, actually.  Take a couple of minutes to compare Canada’s laughably thin and one-dimensional policy with Education New Zealand’s unbelievably detailed set of strategies, goals, and tactics, laid out not just for the country as a whole, but for each of six key sub-sectors: universities, colleges, privates, primary/secondary schools, the English language sector, and educational service/product providers.  That, my friends, is a strategy.  Now ask yourself: why can’t we produce something that good?

In short, there’s a lot Canadians could learn from New Zealand – if only we paid more attention.

September 04

Who’s Relatively Underfunded?

As I said yesterday, there’s a quick way to check claims of relative underfunding in block-grant provinces: take each institution’s enrolment numbers by field of study from StatsCan’s Post-Secondary Student Information System (PSIS), plug those numbers into the Ontario and Quebec funding formulas, and then compare each institution’s hypothetical share of total provincial weighted student units (WSUs) under those formulas to what we know they actually receive via CAUBO’s annual Financial Information of Universities and Colleges (FIUC) survey.

Simple, right? Well, no, not really, but I have some really talented staff who do this stuff for me (Hi Jackie!), so let’s go look at the data.
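For the curious, the mechanics of the check amount to something like the following sketch – the weights, enrolments, and grant figures here are invented placeholders, not the real Ontario/Quebec formula weights or PSIS/FIUC data:

```python
# Sketch of the underfunding check, with invented weights, enrolments, and
# grants standing in for the real Ontario/Quebec formula weights and the
# PSIS/FIUC data.

weights = {"arts": 1.0, "science": 1.5, "medicine": 5.0}  # hypothetical

enrolments = {  # FTEs by field, per institution (hypothetical)
    "Univ A": {"arts": 10_000, "science": 5_000, "medicine": 2_000},
    "Univ B": {"arts": 6_000, "science": 2_000, "medicine": 0},
}

actual_grants = {"Univ A": 300_000_000, "Univ B": 80_000_000}  # hypothetical

# weighted student units (WSUs) per institution under the formula
wsu = {u: sum(weights[f] * n for f, n in e.items()) for u, e in enrolments.items()}
total_wsu = sum(wsu.values())
total_grant = sum(actual_grants.values())

for u in wsu:
    formula_share = wsu[u] / total_wsu              # share the formula would give
    actual_share = actual_grants[u] / total_grant   # share actually received
    print(u, round(formula_share, 3), round(actual_share, 3))
```

An institution whose actual share sits well below its formula share has a prima facie case for relative underfunding; one sitting above it does not.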

Let’s start with Manitoba, where pretty much every second day you can hear the University of Winnipeg making a case about relative underfunding (say what you will about Lloyd Axworthy: the man knows how to keep his message in the newspapers).  But is the claim true?

Figure 1: FTEs, Weighted FTEs, and Actual Funding, Manitoba Universities


Here’s what Figure 1 says: the University of Manitoba has 69% of the province’s students, but receives 79% of all provincial funding (this is from 2011-12); the University of Winnipeg, on the other hand, has 24% of the students, but only 13% of the total funding.  A clear-cut case of underfunding, right?

Well, not entirely.  The fact is that U of M has a lot more students in high-cost disciplines than does Winnipeg.  If U of M were in Ontario, it would get 75% of provincial funding; if it were in Quebec (where the formula is tilted slightly more towards medical disciplines), it would get 77%.  So U of M receives slightly more funding than it would in other provinces, as does – to a relatively greater extent – Brandon University.  And Winnipeg does receive less than it would if it were in another province: $18 million less than if Manitoba used Quebec’s formula, and $25 million less than if it used Ontario’s.  That’s a big gap, but still smaller than it would appear from looking at FTEs alone.

Now, on to New Brunswick.  One has to be a little careful about making inter-institutional comparisons with CAUBO data in New Brunswick because of the peculiar arrangement between UNB and St. Thomas (STU).  Because the two share the former’s campus, the provincial government sends UNB a little bit extra (and STU a little bit less) in order to cover extra costs.  So, with that in mind, let’s look at the data:

Figure 2: FTEs, Weighted FTEs, and Actual Funding, New Brunswick Universities


New Brunswick looks a bit different from Manitoba, where the biggest university is overfunded.  In New Brunswick, it’s UNB that actually seems to be doing badly, receiving 50% of all money when, in Ontario, it would receive 54%, and in Quebec, 59% (and remember, that 50% is actually inflated a bit because of the money to support STU students).  The institution that really seems to be overfunded in New Brunswick is Moncton, which receives $13 million more than it would if New Brunswick used either the Quebec or Ontario formulae.

So, yes Virginia, relative underfunding does exist in Manitoba and New Brunswick.  This probably wouldn’t be the case if either province ever bothered to put its institutional funding on an empirical footing, via a funding formula.  But that would create winners (likely Winnipeg & UNB) and losers (likely Brandon, Moncton and, to a lesser extent, Manitoba).  And what politician likes to hear that?

September 03

“Relative” Underfunding

Institutions always claim to be underfunded.  Seriously, I’ve been at universities in maybe 25 countries – including Saudi Arabia and the Emirates – and I have yet to find an institution that thought it was overfunded.  The reason for this is simple: there’s always just a little bit more quality around the bend, if only you could buy it (the university down the street has a space-shuttle simulator? We need an actual space shuttle to stay competitive!).  So it’s easy to tune out this kind of talk.

The slightly more sophisticated argument is one of relative underfunding.  That is to say: institution A is getting less than it “should”, based on what a selection of other comparable institutions get.  The trick, of course, is to get the comparator right – too often, it’s transparently a plea by institutions in poor provinces to get funded in the same way as some of their peers in wealthier provinces.

One way that governments can avoid this kind of argument is to institute funding formulas (indeed, in many cases, this is precisely the reason they were introduced).  Once a funding formula is created, and institutions are paid according to some kind of algorithm, it becomes tough to argue relative underfunding (that is, unless the formula is specifically re-jigged in such a way as to screw over one particular partner – as Quebec did with its famous “ajustement McGill”).  You can argue that the funding doesn’t weight activities the right way – small institutions tend to argue that fixed costs aren’t properly accounted for, large ones that research activities are never compensated adequately – but you can’t argue being underfunded because the criteria by which money is being distributed are objective.

In Canada, it’s really only Quebec and Ontario that have anything close to pure formula funding, based on input indicators.  Nova Scotia does have a formula, but weirdly only takes a reading of the indicators every decade or so; Saskatchewan has some weird block grant/formula hybrid, which is ludicrously complex for a province with only two institutions.  PEI and Newfoundland don’t really need formulae given that they are single-institution provinces.

That leaves Alberta, British Columbia, Manitoba, and New Brunswick.  In these provinces, money is delivered by block grant rather than on the basis of an algorithm, so there is plenty of scope for institutions to claim being “underfunded” relative to others in their province.   This means that institutions have a perennial rhetorical stick with which to beat government.

Or do they?  In fact, there is a way to check claims of relative underfunding, even in block-grant provinces.  All one needs to do is look at the distribution of money across a province’s institutions and see if it matches the distribution of funds one would see in a province that does have a funding formula (i.e. Quebec and/or Ontario).  If they don’t match, there’s probably a case for underfunding; if they do, there probably isn’t.

Tomorrow, we’ll try this out on Manitoba and New Brunswick.

September 02

Higher Education as a Positional Good

In policy circles, we talk a lot about whether education is a public or a private good (it’s both), and what the implications are for pricing.  But one thing we don’t talk enough about is the extent to which education is a positional good.  And that’s a problem because our decisions on this topic have serious implications for the way we fund higher education.

What’s a positional good?  It’s a good that derives part of its value from the fact that not everybody has it.  It’s kind of like when a particular article of clothing becomes “cool” – if too many people wear it, it ceases to be cool.  Status goods are a zero-sum game – every time someone else gets something I have, its value to me decreases.

Now think about education.  If too many people get a Bachelor’s degree, its value as a signal of skill goes down, even if everyone’s still gaining the same set of skills.  Think about what that means: the consequence of education being a positional good is that as access to higher education increases, the value of a “plain” Bachelor’s degree decreases, and degree-holders have to find new ways to distinguish themselves.

One way to do this is to focus less on the credential and more on where it was obtained.  Thus, one common pattern in higher education is that as access increases, so too does stratification.  Harvard degrees, for instance, have increased significantly in prestige as access to education has improved.  We haven’t seen much of this phenomenon here because – as Joseph Heath recently pointed out – in Canada, our most prestigious institutions accommodate a pretty large proportion of our student body.  Combine McGill, Toronto, and UBC and you’ve got about 14% of the entire student population (the equivalent figure for, say, Yale/Harvard/Princeton is about 0.1%).  Since our top institutions don’t confer (much) exclusivity, Canadians look for higher education distinction in the collection of additional degrees.  Hence the explosion in professional Master’s degrees and (to a lesser extent) PhDs.

What makes higher education so weird as a field of policy is that it’s pretty much the only type of status good that governments subsidize.  What idiot would try to promote universal access to something that, by definition, not everyone can have?

We justify subsidizing degrees because to some extent they do raise skills and productivity, which is good for everyone, not just the people who get the degrees.  But the fact is, when it comes to private returns, in the early career phase at least, what matters is the positional value of the degree.  And not everyone can get into Harvard, or into a PhD program.  If they could, those goods would lose all meaning.  Scarcity is what makes them valuable.

People who cry that there are too many spots in higher education/law school/teacher training are really making an argument that there aren’t enough status goods to go around.  That’s in part the consequence of subsidizing education to improve access – you’re bound to get excess demand.

As they say in the computer business, that’s a feature, not a bug.

August 29

Predicting the Effects of Australian Fee De-regulation

If the Australian government’s plan for fee de-regulation comes to pass, what follows will be one of the greatest experiments ever in higher education.  Institutions will have the right to set fees exactly as they want, which raises two questions: what will they do with that power, and what will the effects be?

Let’s start with the first question.  When institutions in England were given the freedom to set tuition fees up to a maximum of £9,000, nearly all of them immediately jumped to that maximum from their previous level of about £3,300.   Contrary to the government’s hopes, no one tried to compete on price.  Thus, the ceiling quickly became the mode.

De-regulation proponents in Australia say that won’t happen this time.  The problem in England, they say, was the existence of a ceiling – it gave everyone a point of reference around which to cluster.  Take away the ceiling and genuine competition will occur as universities figure out how to deliver different combinations of price and value.  Opponents say this is wishful thinking – the first set of fees to be announced by a prestige institution (read: Group of 8 member) will become a de facto cap, and hence the standard to which everyone else will gravitate.

There’s a story doing the rounds in Australia that supports this idea.  A few years ago, the government allowed institutions to raise fees by up to 25%, which pretty much all of them did – apart from Curtin University in Western Australia.  Instantly, Curtin went from being second preference among local applicants (behind the University of Western Australia) to third (behind Murdoch University).  Through market research, Curtin found that because students and their families can’t judge institutional quality directly, they judge it by inputs – so when Curtin chose a lower price, the signal families received was that Curtin was of inferior quality.

There are contrary examples, of course.  In England, institutions have the power to set fees for both international students and taught (i.e. professional) Master’s programs, and there is lots of variation in pricing.  So what’s the difference?  In a word: guaranteed income-contingent student loans with significant forgiveness provisions.  Domestic undergraduates have them; international and taught Master’s students don’t.  All undergraduates can get a loan to cover their fees up-front, and are not on the hook for the whole amount if their post-graduation incomes aren’t high enough.

So let’s apply that lesson to Australia, which also has an income-contingent loan system (the Higher Education Contribution Scheme, or HECS), albeit one with less generous repayment subsidies than England’s.  HECS will still insulate students from the main financial consequences of the new fees, and so, as in Britain, they will likely absorb the higher fees with very little effect on enrolment.  As a result, institutions will push fee levels quite high, because they can do so without fear of losing students (the exception will be distance students – a more significant chunk of the student body in Australia than in most other OECD countries).  The likelihood is that fees will get quite close to the international student level – and will do so at nearly all institutions.

The real question is: what will institutions do with that money?  The likelihood is that every penny of the extra $5,000 – $10,000 per year students will be asked to pay will be ploughed back into research for prestige reasons.  It won’t be the access disaster some are predicting, but it’s a bad deal for students nonetheless.

August 28

The Australian Experiment (Part 1)

I spent a good part of this month in Australia, talking to people about the radical program introduced in the May budget.  The basics of the system are as follows:

  • A recently-introduced plan of uncapped places, with the government funding as many students as institutions wish to admit, was maintained; however, the average amount of the per-student subsidy will drop by 20%;
  • Tuition fees will be fully de-regulated.  Institutions will be able to charge what they like, subject to the stipulation that fees for domestic students cannot rise above fees for international ones;
  • Institutions will be required to sequester, for student scholarships, 20% of all tuition revenue over and above what is required to make good on the cut in government subsidies (in conception, this is identical to the 30% tuition set-aside in Ontario);
  • The repayment threshold for student loans will be lowered slightly, and students will for the first time be required to pay real interest on the loan (currently, they pay the loan back in constant inflation-adjusted dollars).

The package is seen as favouring the more prestigious and research-intensive universities (known as the Group of 8 – their version of our U-15), since – it is believed – they will be able to charge the higher fees.  And they have made their intentions plain as to where they will go under the new regime: significantly higher fees, reduced undergraduate intake, and ploughing all the extra money back into research, research, and more research.

(Australian universities conceive of research, rankings, and international students as a kind of iron triangle – they need international students for money, good rankings to attract international students, and research to drive their rankings results.  It’s patently not true – the most international-student-intensive university in Australia is Federation University in Ballarat, which is emphatically not research-intensive – but somehow it drives the discourse to the exclusion of all else.)

As might be expected, the sector isn’t unanimously in support.  Students don’t want the higher debt, and the academics’ union is backing them (though, if universities do get more money, the union will surely be arguing for much of it to be directed its members’ way through higher salaries).  The most on-the-fence group is the lower-prestige universities, who aren’t convinced the market will let them charge higher fees, and hence see the whole exercise as one of increasing system stratification.

The budget is at risk of not passing; control of the Senate rests with a number of small parties – some of which are definitely in wing-nut territory (google the terms “Motoring Enthusiast Party”, “preference whisperers”, and “Jacqui Lambie radio interview” to get a sense of the nuttiness) – and the budget contains a number of other highly controversial reforms, such as introducing fees for doctors’ visits.  Some think there’s a deal to be had – acceptance of the fees in return for ditching the student loan interest rate proposals – but it is far from certain.

Complicating matters further, the Group of 8 seem determined to go ahead and announce their 2016 fee structures next month, even before the package’s fate in the Senate is assured.  This, they claim, is necessary to let guidance counsellors and high school students make decisions.  But it’s a hell of a risk – one which raises the stakes for the government.

And what will happen if it passes?  Tune in tomorrow.

August 27

The Problem at the Back End

Yesterday, we talked about how the Canadian aid system is both generous and clumsily organized, with most of it being delivered through tax credits and loan remission – neither of which shows up to reduce tuition at the time of registration.  This is something that needs to change; if we’re giving students this much money, we should at least give it to them in a form that is both useful and comprehensible.  So why can’t we do it?

Our tax credit system goes back to the Diefenbaker era (backstory here), but it really took off in the late 1990s when the education amount increased nearly eightfold in less than a decade.  Why such an increase?  Well, partly it’s because tax credits are politically convenient at election time – you can count them either as new spending or as tax cuts, depending on what audience you’re talking to.  But it’s also because provinces can’t hijack federal tax credits by diverting the money to other causes (e.g. the Harris or Charest tax cuts), or simply allow federal dollars to push out provincial ones (e.g. the Canada Study Grants for Students with Dependants).  Tax credits go straight from Ottawa to taxpayers.  So even though tax credits are a sub-optimal way of delivering student aid, they are superlative as a displacement-free method of delivering federal money.  If you don’t understand why that matters, you haven’t spent enough time in Ottawa.

Where provinces have built on tax credit nuttiness is in giving out large-scale tuition rebates through the tax system.  Most provinces haven’t succumbed, but Saskatchewan, Manitoba, and New Brunswick have – and frankly, they’ve gone bananas with it.  In Manitoba and Saskatchewan, students who finish a first undergraduate degree or college program qualify for their province’s graduate rebate.  For graduates who remain in the province, these programs are so generous that literally everyone ends up receiving more in subsidy than they pay in tuition.  In Saskatchewan, students from families earning over $120,000 can end up receiving $34,000 in various forms of grants and tax credits, while paying only $26,000 in tuition, provided they graduate in four years and stay in the province for seven years after graduation.
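The back-of-envelope version of that Saskatchewan example looks like this – note that the split between the retention rebate and other credits is my assumption for illustration; only the $26,000 and $34,000 totals come from the figures above:

```python
# Back-of-envelope version of the Saskatchewan example.  Only the $26,000
# and $34,000 totals come from the text; the tuition level and the split
# between the retention rebate and other credits are assumed.

years, annual_tuition = 4, 6_500
tuition_paid = years * annual_tuition  # 26,000 over a four-year degree

graduate_retention_rebate = 20_000  # paid out over 7 years in-province
other_grants_and_credits = 14_000   # assumed remainder
total_subsidy = graduate_retention_rebate + other_grants_and_credits  # 34,000

print(total_subsidy - tuition_paid)  # 8000 -- the graduate comes out ahead
```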

This, frankly, is nuts.  What’s the point of paying graduates $20K each to stay in Saskatchewan, regardless of pre- or post-study income?  Even if that argument had a smidgeon of validity fifteen years ago when it was introduced (amidst significant out-migration), now that Saskatchewan is a “have” province it looks more than faintly ridiculous.  Doesn’t Saskatchewan have more pressing needs than to pay middle-class kids to do what they were going to do anyway?

If we’re spending all this money making higher education cheap, we should spend it in a way that is more useful and comprehensible than tax credits.  The problem is, these kinds of subsidies are difficult to re-allocate because they benefit so many people.  But finding a way to re-allocate tax credit money is a must.  Stakeholder representative groups could do everyone a service by banding together and working out ways to give governments the necessary political cover to do the right thing.
