HESA

Higher Education Strategy Associates

March 19

The Canadian Way of Higher Education Subsidies

One of the biggest arguments in student assistance is about who to subsidize and why.  Unfortunately, because we are rarely explicit in the way we talk about subsidies, discussions tend to be a dialogue of the deaf.

One school of thought says we should subsidize students based on their parental income.  Students from poor families need more help to succeed than students from wealthier families; the former should therefore pay less, and we should give them grants to reduce the net cost of attendance.  Then there’s a second school of thought, which says that the way to focus subsidies is to focus on needy graduates.  Forget the upfront subsidies: the people we need to support are the ones who don’t do well out of their education, and as a result remain low-income for years.  The third school of thought holds that everybody should receive the same subsidy no matter what their parents make, or what they make afterwards.  And then there’s a final school of thought, which says we should reward “good behaviour”, however defined.

Other countries are pretty explicit about their choices.  The Americans go pretty heavy on the parental income track (though the beneficial effects of this are counteracted by other funding and policy choices).  The UK is quite explicit about using the graduate income track as a means of subsidy: everybody borrows oodles of money to pay expensive tuition fees, and the ones who make out worse get these loans forgiven (eventually).  Much of Europe – especially Scandinavia – operates under the third school of thought, even at the price (in a few countries) of having an unnecessarily badly-funded system as a result.  The fourth view is surprisingly widely-held around the world, as it applies to anywhere that has a dual-track system of higher education (most of the ex-socialist countries of Europe, much of Latin America and Anglophone Africa) where “meritorious” students get first crack at the subsidies.

In Canada, we have a mish-mash of strategies, partly because we’re a federal system, so coherence is always a problem, and partly because we have a real tendency to reach for solutions before fully articulating the problem.  Our student aid system mostly works on the parental system, but allows students to declare independence relatively early (in practice, age 22), which effectively moves them to the universal system.  We have a relatively generous Repayment Assistance Program (RAP), which uses the needy graduates approach.  And though we aren’t especially heavy on merit awards, our $700M/year Canada Education Savings Grant, which rewards savers, is just a variant on the “reward good behaviour” approach.

You see, Canada just doesn’t do joined-up coordinated approaches; rather, we tend to just reach for whatever looks shiny, and implement it.  The result is a system that spends wildly in all directions, with nothing resembling an underlying philosophy.  Each individual program is arguably successful on its own terms, but the result is a system that is arguably less successful than it could be if we focused spending on one or two of these pathways.

March 18

How ICRs can Become Graduate Taxes: The Case of England

As noted yesterday, graduate taxes and income-contingent loans have many similar features.  They both defer payments until after graduation, and they are usually payable as a percentage of marginal income above a given threshold.  In England right now, the payment scheme on ICR loans is that students pay 9% of whatever income they earn over £21,000 (roughly C$38,000).  The difference between the two is that with a loan you have a set amount to pay, and when it’s paid you’re finished.  With a graduate tax there is no principal, so you just keep paying that fraction of your income for as long as the tax lasts.
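To make the shared mechanics concrete, here is a minimal sketch of the repayment rule (the figures are from above; the function itself is purely illustrative):

```python
def annual_repayment(income: float, threshold: float = 21_000, rate: float = 0.09) -> float:
    """English-style repayment: 9% of income above the threshold (in GBP).

    The same rule applies under an ICR loan and a graduate tax; only the
    stopping condition (principal paid off vs. tax period ending) differs.
    """
    return rate * max(0.0, income - threshold)

# A graduate earning £30,000 pays 9% of the £9,000 above the threshold, i.e. £810;
# one earning below the threshold pays nothing.
print(annual_repayment(30_000))
print(annual_repayment(15_000))
```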

That sounds like a simple and clear delineation, right?  Well, here’s a twist: what if the loan were so big that you had no practical chance of ever paying it off at the set repayment rate?  What would the difference between an ICR and a grad tax be then?  The answer is: practically nothing – and that’s exactly where England finds itself right now.

Let’s step back a bit: in 2010, the UK government decided to let institutions charge tuition up to £9000.  They also decided to allow students to borrow this amount for tuition (plus more, again, for living expenses) under the repayment scheme described above.  When they did this, they were under the misapprehension that universities might actually try to compete for students on price, and hence assumed an average tuition of about £7000.  Rather predictably, average tuition shot straight to £8500.  As a result, it’s quite common for students to be borrowing £12-13,000 per year, or £36-39,000 for a degree (that’s C$66-72,000 – yes, really).

Crazy, right?  Cue all the “intolerable debt burden” stuff.  But wait: these loans aren’t like the ones we’re used to.  Repayment is based on your income rather than size of debt – no graduate is ever required to pay more than 9% of their income over £21,000 in any given year, so the burden in any given year is pretty limited.  And – here’s the kicker – the loan gets forgiven after 30 years.  So, if you don’t finish paying, your obligation disappears without you having any debt overhang. Exactly like a Graduate Tax.

How many won’t pay it off?  Well, these things are difficult to predict, but even over 30 years, paying 9% of your income over £21,000 isn’t likely to completely pay off very many of these loans.  The government’s own financial forecasts are that 35-40% of the total net present value of the loans will have to be forgiven (others put it 8-10% higher).  At a rough estimate, that probably means 70 to 80% of all borrowers will see some loan forgiveness.
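A rough way to see why: run the repayment schedule for 30 years under a flat income, ignoring interest (a big simplification – actual English loans accrue interest, so this if anything understates how many borrowers reach forgiveness):

```python
def balance_after_30_years(debt, income, threshold=21_000, rate=0.09, years=30):
    """Apply the 9%-over-threshold schedule for 30 years, with a flat income
    and no interest (a deliberate simplification for illustration)."""
    for _ in range(years):
        debt -= rate * max(0.0, income - threshold)
        if debt <= 0:
            return 0.0
    return debt  # whatever is left at this point is forgiven

# On a £39,000 loan, a graduate on a flat £30,000 pays £810/year:
# 30 years covers only £24,300, leaving roughly £14,700 to be written off.
print(balance_after_30_years(39_000, 30_000))
```

Even without interest, a middling earner never comes close to clearing the principal, which is why the forgiveness provision does most of the work.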

At this point you start to wonder if debt numbers really matter in this system.  Forget ICR: for most people, the current system is simply one in which government transfers billions of pounds in 2014 to institutions using student loans as a kind of voucher system, then turns a portion of those loans into student grants in 2044 via loan forgiveness.  In the meantime, graduates pay a 9% surtax on income over £21,000.

Altogether, a very wacky system.  Not a model for anyone, really.

March 17

Oregon’s “Pay It Forward” Scheme and the ICR vs. Graduate Tax Problem

You may have heard some rumblings from south of the border over the past few months with respect to a program called Pay It Forward (PIF).  The brainchild of a student group called Students for Educational Debt Reform, this idea was picked up by the Oregon assembly last summer; within a few months, over a dozen state governments were examining similar draft legislation.

The basics of the program are these: instead of paying tuition, students agree to pay a percentage of their future income (the percentages vary by state – in Oregon it’s 0.75% per year of study) for 20 years after graduation.  Some people mistook this for a version of income-contingent loans because it emphasized paying for school after-the-fact rather than up-front, and also because repayments were to be made as a function of income.  But there’s one key difference.  Loans have a limited liability: once you pay off the principal and interest, you’re done.  With PIF, there is no principal – once you start paying into a hypothecated fund, destined for the state’s higher education institutions, you keep on paying for 20 years no matter what.  This is formally known as a “graduate tax”.

Graduate taxes tend to be more progressive than income-contingent loans.  If you’re at the bottom of the income scale, you probably come out better off – you simply never pay anything.  If you’re at the top of the income scale, you’re likely going to pay a lot more because a portion of your income will go into public coffers long after you’d likely have paid off a loan.  Interestingly, the famous Yale Tuition Postponement Option of the early 1970s (designed by Nobel laureate James Tobin, and used by Bill Clinton when he attended law school there) went off the rails for precisely this reason – the richer students got tired of paying for the poorer ones, and started making a fuss.
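A quick sketch of the arithmetic, assuming a four-year degree and flat lifetime incomes (the incomes below are hypothetical; the real scheme would apply to actual yearly earnings, with no income threshold, per the description above):

```python
def pif_total(avg_income, years_of_study=4, rate_per_year=0.0075, duration=20):
    """Total paid under an Oregon-style PIF: 0.75% of income per year of
    study, collected for 20 years (flat-income simplification)."""
    return avg_income * rate_per_year * years_of_study * duration

# A four-year degree means 3% of income for 20 years.  A low earner on
# $25,000 pays about $15,000 in total; a high earner on $150,000 pays
# about $90,000 -- far more than they would under a fixed-principal loan.
print(pif_total(25_000))
print(pif_total(150_000))
```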

One downside to a graduate tax is that it’s harder to collect than a loan.  In the US, for instance, it’s hard to imagine enforcing something like PIF, unless it was instituted nationally (if someone moved from Portland to Chicago, would Illinois be responsible for collecting the PIF contribution?).  A graduate tax was in fact examined relatively thoroughly not once but twice in England (the 1997 Dearing Report and the 2005 fee reform), and was rejected precisely because of concerns about grads evading repayment through emigration.

Another downside is: where exactly does the money come from while you’re waiting for graduates to start earning money?  If tuition is covering 40% of institutional expenditure, someone has to make that income good over the 20 or so years before the grad tax makes up the difference.  It’s not clear who that might be; if the state had money to do this, it probably wouldn’t be faffing around with ideas like PIF.  You could securitize the revenue stream, of course, but that also might get tricky.  Income-contingent loans lack graduate taxes’ most potentially progressive features, but they do have the advantage of: a) being collectable, and b) producing income for institutions in the short term.

There is of course one country that is trying very hard to merge the ideas of ICR and graduate taxes, with some really odd results.  More on the English experiment tomorrow.

March 14

Canadian Higher Ed Exceptionalism, Part 1 (An Occasional Series)

For a while now, I’ve been writing about other national systems of higher education in our “Better Know a Higher Ed System” series, in part to throw Canada’s own policy system into sharp relief.  But sometimes it’s better to look at some things a bit more directly, so today I want to start exploring some areas where Canada really is an exception, globally.  And there’s nowhere we stick out more than in the way we admit students to university.

There are a limited number of ways to admit people to universities.  One of the most common is simply to use the scores from a common secondary matriculation exam as the basis for admission decisions.  Most of la Francophonie works this way, since they’ve all modelled themselves on France’s Baccalaureate.

Another option is to have a national university entrance exam, separate from matriculation.  The most famous of these is China’s gaokao, which draws on a millennia-long Chinese tradition, but which is in fact only 35 years old, and a product of Deng Xiaoping’s post-Mao reforms.  National exams are often a response to widespread cheating and corruption in schools.  In 2004, Russia introduced a new national exam with heavy security measures specifically to try to weed out academic corruption (it was only partially successful – since getting into university is a way for Russian males to avoid the draft, the impetus for academic corruption is pretty powerful).

In some places, individual universities have their own entrance exam, though these tend to exist only where a national exam is already in place.  Japan and Romania are two examples of this: in both countries, the more “elite” universities (e.g. Tokyo, Kyoto, Politehnica, Bucharest) have chosen to ditch the national exams and establish their own, for reasons of prestige, if nothing else.  And then, finally, you have the American options – not a university entrance exam but a national aptitude test, such as the SAT or the ACT.

So, now imagine trying to explain to foreigners how students get accepted to university in Canada.  Only Alberta, with its matriculation exams, has anything like the kind of standardized testing seen almost everywhere else in the world.  In the rest of the country, you’re admitted entirely on the basis of high school marks, which rest to a considerable extent on work portfolios rather than exams, and grading standards are only loosely consistent between schools.  To the extent that there is fairness at all, it comes through the informal judgement of hundreds of admissions officers who, through simple experience, “know” which schools are easy graders, and take this into consideration when awarding places.

From the perspective of most other countries, the Canadian approach looks like sheer lunacy.  The scope for corruption in our system is enormous, but it’s simply not an issue here.  Everyone accepts the professional judgement of admissions officers and there are few complaints.  Such deep trust in the system is what has spared this country (outside Alberta, anyway) the kind of high-stakes exam nightmare that Americans endure.

In short, one thing that makes Canadian higher education exceptional is trust. That’s great, but trust is fragile.  It’s not something we should take for granted.

March 13

Teaching Loads, Fairness, and Productivity

It’s been a long time since I’ve been as disappointed by an article on higher education as I was by the Star’s coverage of the release of the new HEQCO paper on teaching and research productivity.  A really long time.

If you haven’t read the HEQCO paper yet, do so.  It’s great.  Using departmental websites, the authors (Linda Jonker and Martin Hicks) got a list of people teaching in Economics, Chemistry, and Philosophy at ten Ontario universities.  From course calendars, Google Scholar, and tri-council grant databases, they were able to work out each professor’s course load, and whether or not they were “research active” (i.e. whether they had either published something or received a tri-council grant in the past three years).  On the basis of this, they could work out the teaching loads of profs who were research-active vs. those who were not (except in Philosophy, where they reckoned they couldn’t publish the data because there simply weren’t that many profs who met their definition of being research-active).  Here’s what they found:

Annual Course Load by Research Active Status


To be clear, one course here is actually a half course.  So the finding that “non-research-active” professors teach less than one course extra means that there are, in fact, a heck of a lot of non-research-active profs who teach no extra courses, and who teach exactly the same amount as professors who are research active.
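The method can be sketched with a toy dataset (the names and numbers below are invented for illustration, not taken from the paper):

```python
# HEQCO-style comparison: a prof counts as "research active" if they
# published or held a tri-council grant in the past three years; then
# compare average course loads across the two groups.
profs = [
    {"name": "A", "research_active": True,  "courses": 3},
    {"name": "B", "research_active": True,  "courses": 4},
    {"name": "C", "research_active": False, "courses": 4},
    {"name": "D", "research_active": False, "courses": 5},
]

def avg_load(data, active):
    """Average course load for profs with the given research-active status."""
    loads = [p["courses"] for p in data if p["research_active"] == active]
    return sum(loads) / len(loads)

print(avg_load(profs, True))   # → 3.5
print(avg_load(profs, False))  # → 4.5
```

In this made-up example the gap is a full course; the point of the HEQCO finding is that the real gap is smaller than that.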

For reasons of fairness as much as productivity, this seems like a result worth discussing, no?  And yet – here’s where the disappointment comes in – that doesn’t appear to be where the main actors in this little drama want to go with the story.  Rather, they appear to want to make irrelevant asides about the study itself.

Now I say “appear” because it’s possible they have more nuanced views on the subject, and the Star just turned the story into a he-said/she-said.  I want to give them the benefit of the doubt, because the objections printed by the Star are frankly ludicrous.  They amount to the following:

1) Teaching involves more than classroom time: there’s also preparation, grading, etc.  True, but so what?  The question is whether profs who don’t produce research should be asked to teach more.  What “teaching” consists of is irrelevant to that question.

2) Number of courses taught is irrelevant – what matters is the number of students taught.  This is a slightly better argument, though I think most profs would say that the number of courses is a bigger factor in workload than the number of students (4 classes of 30 students is significantly harder than 3 of 40).  But for this to be a relevant argument, you’d need to prove that the profs without a research profile were actually teaching systematically larger classes than their research-active counterparts.  There’s no evidence either way on this point, though I personally would lay money against it.

Here’s the deal: you can quibble with the HEQCO data, but it needs to be acknowledged: i) that the data could be better, but that it is institutions themselves who hold the data and are preventing this question from being examined in greater depth; and, ii) that this is one of the best studies ever conducted on this topic in Canada.  Kvetching about definitions is only acceptable from those actively working to improve the data and make it public.  Anyone who’s kvetching, and not doing that, quite frankly deserves to be richly ignored.

March 12

The Skills “Crisis”: Why Good People Can’t Get Jobs

There’s a very slim volume out from Wharton Press called Why Good People Can’t Get Jobs.  It’s by Peter Cappelli, a management professor from the University of Pennsylvania, who adapted the book from a series of articles he wrote for the Wall Street Journal in 2010 and 2011.  Not all of it applies to Canada (it’s a very US-focussed book), but enough of it does that I think it’s worth a read for everyone with an interest in the skills debate.

The book takes a simple “myth-busting” approach to the skills debate, much of which would be familiar to those of you who read this blog regularly (notably, with respect to how skills shortages are defined, and whether or not employers have considered the simple approach of “raising wages” as a way to solve said shortages).  But Cappelli makes three additional specific points that I think need to be more fully considered by everyone involved in the skills debate:

1) Electronic job applications have revolutionized large-company hiring practices – but not necessarily for the better.  Because the internet has vastly lowered the barriers to application, companies have been flooded with applications.  Their response has been to automate the search process.  What tends to happen is that employers, in an attempt to keep numbers manageable, simply search for keywords on CVs – keywords that screen out far too many people.  This leads to a situation where the only people eligible for the job are people who have already done the job.  (There’s also an amusing anecdote about an HR firm CEO who suspected this was happening at his own company, and so sent in his own CV, incognito.  He was rejected.)

2) Hiring new workers isn’t like shopping at Home Depot.  For any given body of work that a company undertakes, many different hiring strategies exist.  You could, for instance, do a job with a few highly-skilled workers and a lot of low-skilled workers, or an intermediate number of intermediate-skilled workers.  While certain job-specific skills are necessary, companies mainly need portfolios of skills across their entire workforce.  And the most important skill is the ability to work hard and be adaptable – precisely the kind of thing that hiring managers have trouble determining from keyword searches.

3) North America (he says the US, but I think Canada fits this definition too) is the only place in the world that thinks of companies as consumers of skills.  Pretty much everywhere else in the world, they are thought of at least partly as producers of skills, because they do radical things like “training”.  If we have elevated expectations of our post-secondary institutions, why do we not have elevated expectations of employers as well?  Sure, it’s great when colleges and universities turn out prepared graduates, talented graduates, adaptable graduates.  But fully-trained, already-able-to-do-the-job graduates?  Employers have to be more realistic, and step up to the plate themselves.

All in all, a worthwhile contribution to the debate.  Pick it up.

March 11

A European Perspective on Three-Year Degrees

Glen Murray may be gone, but the allure of three-year bachelor’s degrees remains.  In future, my guess is that they’ll be much like the German apprenticeship system – an educational deus ex machina that successive generations of Canadian politicians will “discover” anew every couple of years.  So it’s probably worth asking, after roughly a decade of Bologna implementation, how Europeans themselves feel the whole experience is panning out. My own sense from talking to people across the continent is that, while no one thinks the three-year bachelor’s degrees are a failure, no one considers them a triumph, either.

For much of Europe, the adoption of a three-year bachelor’s degree was an act of division, not subtraction. That’s because in Germany, and most countries to its north and east, the pre-Bologna initial degree was not a 4-year bachelor’s but a 5- or even 6-year degree, equivalent to our master’s degree.  The move to divide these degrees into a 3-year bachelor’s and a 2-year master’s seemed to make sense for three reasons: first, because governments were indeed looking for ways to reduce student time-to-completion; second, the creation of a new credential seemed like an opportunity to get universities to focus on a new type of student, who wanted less theory and more practice; and third, for those who were dubious about the first two reasons, there was an overriding desire not to get left behind in the creation of a single, pan-European Higher Education Area with harmonized degree-lengths.

On the demand side, it’s been a bigger-than-expected challenge to get students to take shorter programs.  In Germany, for instance, 80-90% of bachelor’s graduates go on to get a master’s, because everyone assumes that this is what businesses will want.  And they’re not wrong: in Finland, post-graduation employment rates for master’s grads are nearly 20 points higher than for bachelor’s grads (for university graduates, anyway – Polytechnic bachelor’s degree-holders do better).

It’s been no easier on the providers’ side.  When you’re used to giving 6 years of instruction to someone before giving them a credential, it’s not super-obvious how to cope with doing something useful in half the time.  In a number of cases, institutions left their five-year programs more or less unchanged, and just handed out a credential after three years (which makes at least some sense if 80-90% of people are going on anyway).  Where compression has actually occurred, what tends to happen is that institutions elect to keep courses on technical, disciplinary skills, and get rid of pesky things like electives, and courses that help build transversal skills.  The result is a set of much narrower, less flexible degrees than before.

At least part of the problem is that there hasn’t been a lot of progress in terms of finding ways to deliver both “soft skills” and technical skills in the same courses, which permit delivery of a more rounded curriculum without extending time-to-completion.  But innovative curriculum planners are in short supply at the best of times; it’s the sort of thing that probably should have been considered before engaging in a continent-wide educational experiment like this.

All of which is to say: three-year degrees are not easy to design or deliver, and they don’t necessarily work in the labour market, either.  Shorter completion times are good, but caveat emptor.

March 10

Could We Eliminate Sessionals if We Wanted To?

Last week, when I was writing about sessionals, I made the following statement:

“Had pay levels stayed constant in real terms over the last 15 years, and the surplus gone into hiring, the need for sessionals in Arts & Science would be practically nil”.

A number of you wrote to me, basically calling BS on my statement.  So I thought it would be worthwhile to show the math on this.

In 2001-02, there were 28,643 profs without administrative duties in Canada, collectively making $2.37 billion, excluding benefits.  In 2009-10, there were 37,266 profs making $4.29 billion, also excluding benefits.  Adjusting for inflation, that’s a 56% increase in total compensation – but, of course, much of that is taken up by having more profs.  If we also control for the increase in the number of professors, what we have left is an increase of 18.8%, or $679 million (in 2009 dollars).

How many new hires could you make with that?  Well, the average assistant prof in 2009 made $90,000.  So, simple math would suggest that 7,544 new assistant profs could have been hired for that amount.  That means that had professors’ salaries stayed even in real terms, universities could have hired 16,167 new staff in that decade, instead of the 8,623 they actually did.

(Okay, I’m oversimplifying a bit.  There are transaction costs to landing new professors.  And hiring that many young profs all at once would just be storing up financial chaos 5-15 years down the road, as they gain in seniority.  So $679 million probably wouldn’t buy you that many new profs.  But on the other hand, if you were doing some hiring, you’d spend less money on sessionals, too, so it’s probably not far off.)

Would that number of new hires have eliminated the need for sessionals?  Hard to say, since we have no data either on the number of sessionals, or the number of courses they collectively teach.  What we can say is that if 7,500 professors had been hired, the student:faculty ratio would have fallen from 25:1 to 22:1, instead of rising – as, in fact, it did – to 27:1. That’s a pretty significant change no matter how you slice it.
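For those who want to check the back-of-envelope arithmetic, here it is, recomputed directly from the figures quoted above:

```python
# Back-of-envelope from the post's own figures (2009 dollars).
extra_real_compensation = 679e6   # real pay increase beyond inflation and headcount growth
avg_assistant_salary    = 90_000  # average assistant prof salary, 2009

hypothetical_hires = extra_real_compensation / avg_assistant_salary
print(round(hypothetical_hires))  # → 7544

# Effect on the student:faculty ratio (27:1 across 37,266 profs in 2009-10):
profs_2009 = 37_266
students   = 27 * profs_2009
print(round(students / (profs_2009 + hypothetical_hires)))  # → 22
```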

(The question remains, though: would you want to give up sessionals, even if you could?  As I pointed out last week, in many programs sessionals perform a vital role of imparting practical, real-world experience to students.  And even where that’s not their primary function, they act as swing labour, helping institutions cope with sudden surges of students in particular fields of study.  They have their uses, you know.)

Now, I’m not suggesting that professors should have foregone all real wage increases over a decade, in order to increase the size of the professoriate.  But I am suggesting that universities have made some choices in terms of pay settlements that have affected their ability to hire enough staff to teach all the students they’ve taken on.  The consequence – as I noted before – is more sessionals.  But it very definitely did not need to be that way.

March 07

Those Times Higher Education Reputation Rankings, 2014

The Times Higher Education (THE) Rankings came out yesterday.  Compared to previous years, there was very little fanfare for the release this time.  And that’s probably because the results weren’t especially interesting.

The thing to understand about rankings like this is that they are both profoundly true and profoundly trivial.  A few universities are undoubtedly seen as global standards, and so will always be at the top of the pile.  Previous THE rankings have shown that there is a “Big Six” in terms of reputation: Harvard, Stanford, Berkeley, MIT, Cambridge, and Oxford – this year’s results again show that no one else comes close to them in terms of reputation.  Then there are another thirty or so who can more or less hold their position at the top from year-to-year.

After that, though, results are capricious.  Below 50th position, the Times neither assigns specific ranks (it presents data in tens, i.e., 51st-60th, 61st-70th, etc.), nor publishes the actual reputation score, because even they don’t think the scores are reliable.  Just for kicks, I divided this year’s top 100 into those groups of ten – a top ten, a next ten, a third ten, and so on – to see how many institutions were in the same group last year.  Here’s what I got:

Number of Institutions in Each Ten-Place Grouping in THE Reputation Rankings, Which Remained in Same Grouping, 2013 and 2014


You’d expect a little movement from group-to-group – someone 71st last year rising to 69th this year, for instance – but this is just silly.  Below about 40th spot, there’s a lot of essentially random survey noise because the scores are so tight together that even small variations can move an institution several places.
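For the curious, the grouping exercise amounts to this (the institutions and ranks below are invented for illustration):

```python
def band(rank):
    """Ten-place band for a rank: 1-10 -> 0, 11-20 -> 1, and so on."""
    return (rank - 1) // 10

def stayers(last_year, this_year):
    """Count institutions in the same ten-place band in both years.
    Each argument maps institution -> rank (1..100)."""
    return sum(1 for u in this_year
               if u in last_year and band(last_year[u]) == band(this_year[u]))

ranks_2013 = {"U1": 5, "U2": 15, "U3": 71}
ranks_2014 = {"U1": 7, "U2": 22, "U3": 69}
print(stayers(ranks_2013, ranks_2014))  # → 1 (only U1 stays in its band)
```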

A few American universities rose spectacularly this year – Purdue came in at 49th, despite not even cracking the top 100 in the previous year’s rankings; overall, there were 47 (up 3 from last year) American universities in the top 100.  Seoul National University was the biggest riser within the top 50, going from 41st to 26th, which may suggest that people are noticing quality in Korean universities (Yonsei also cracked the top 100 for the first time), or it may just mean more Koreans responded to the survey (within limits, national response rates do matter – THE re-weights responses by region, but not by country; if you’re in a region with a lot of countries, like Europe or Asia, and your numbers go up, it can tilt the balance a bit).  Surprisingly, Australian universities tanked in the survey.

The American result will sound odd to anyone who regularly reads the THE and believes their editorial line about the rise of the East and decline of the West in higher education.  But what do you expect?  Reputation is a lagging indicator.  Why anyone thinks it’s worth measuring annually is a bit mysterious.

March 06

Sessionals

The plight of sessional lecturers (or, as they call them in the US, “adjuncts”) is possibly the only issue in higher education that generates even more overblown rhetoric than tuition fees.  Any time people start evoking slavery as a metaphor, you know perspective has flown the coop.

Though data on sessional numbers in Canada are non-existent, no one disputes that their numbers are rising, and that they are becoming an increasingly central part of major universities’ staffing plans.  In large Ontario universities, it’s not uncommon for certain faculties to have 40-50% of their total credit hours taught by sessionals.  Wage data is scarce, too, though last year University Affairs produced a worthwhile survey on sessionals’ working conditions.  The numbers vary from place to place, but let’s just say that relying solely on sessional wages must be pretty challenging.

A problem in generalizing about sessionals is that they come in two distinct varieties.  First are the mid/late-career professionals who already make good money from full-time employment elsewhere, and who help provide relevant, up-to-date content based on practical experience in programs like Law and Nursing.  For them, sessional teaching is a way to pick up an extra cheque, and maybe have some fun doing it.  Outside Arts & Science, this is the dominant model of sessionals, and universities are much the better for their presence.

In Arts & Sciences, on the other hand, sessionals are much more likely to be recent PhD graduates looking to get a foothold on the academic ladder.  Unable for the moment to make the tenure track, taking multiple sessional gigs lets them stay within the university system, but prevents them from doing what they (and indeed the entire higher ed system) value most: research.  As a result, being a sessional can sometimes take one further from the tenure track, rather than closer to it.  The sessional “crisis”, needless to say, focuses on this latter group, rather than on the professionals.

What’s truly bizarre about the discourse on sessionals is the frankly conspiratorial view of the cause of the “crisis”.  But there’s no mystery here: universities, for the most part, get paid by governments and students according to how much teaching they do; despite this, they pay their academic staff to spend roughly half their time doing stuff other than teaching.  Unsurprisingly, this results in there being more teaching duties than available teaching time.  Hence the need for sessionals (a need that has only grown larger as research has increased in importance).

And why is their pay so low?  Partly, it’s a free market and there’s a heck of a lot of people willing to do academic work for very little pay.  But partly it’s because institutions have made a conscious choice to prioritize pay rises for existing full-time staff (gotta pay more for research excellence!) over hiring new full-time staff.  Had pay levels stayed constant in real terms over the last 15 years, and the surplus gone into hiring, the need for sessionals in Arts & Science would be practically nil.

Basically, no one “decided” to create an academic underclass of sessionals.  Rather, they are an emergent property of a system where universities mostly earn money for teaching, but spend a hell of a lot of it doing research.
