Higher Education Strategy Associates (HESA) – Tag Archives: Netherlands

October 04

New Quality Measurement Initiatives

One of the holy grails in higher education – if you’re on the government or management side of things, anyway – is to find some means of actually measuring institutional effectiveness.  It’s all very well to note that alumni at Harvard, Oxford, U of T (pick an elite university, any elite university) tend to go on to great things.  But how much of that has to do with them being prestigious and selective enough to only take the cream of the crop?  How can we measure the impact of the institution itself?

Rankings, of course, were one early way to try to get at this, but they mostly looked at inputs, not outputs.  Next came surveys of student “engagement”, which were OK as far as they went but didn’t really tell you anything about institutional performance (though they did tell you something about curriculum and resources).  Then came the Collegiate Learning Assessment and, later, the OECD’s attempt to build on it, the Assessment of Higher Education Learning Outcomes, or AHELO.  AHELO was of course unceremoniously murdered two years ago by the more elite higher education institutions and their representatives (hello, @univcan and @aceducation!), who didn’t like its potential to be used as a ranking (and, in fairness, the OECD probably leaned too hard in that direction during the development phase, which wasn’t politically wise).

So what’s been going on in quality measurement initiatives since then?  Well, two big ones you should know about.

The first is one being driven out of the Netherlands called CALOHEE (which is, sort of, short for “Measuring and Comparing Achievements of Learning Outcomes in Higher Education in Europe”).  It is being run by more or less the same crew that developed the Tuning Process about a decade ago, and who also participated in AHELO, though they have broken with the OECD since then.  CALOHEE builds on Tuning and AHELO in the sense that it is trying to create a common framework for assessing how institutions are doing at developing students’ knowledge, skills and competencies.  It differs from AHELO in that, if it is successful, you probably won’t be able to make league tables out of it.

One underlying assumption of AHELO was that all programs in a particular area (e.g. Economics, Civil Engineering) were trying to impart the same knowledge, skills and competencies – this was what made giving them a common test valid.  CALOHEE, by contrast, assumes that there are inter-institutional differences that matter at the subject level.  And so while students will still get a common test, the scores will be broken up in ways that reflect each institution’s own set of desired learning outcomes.  So Institution X’s overall score in History relative to Institution Y’s is irrelevant, but their scores in, for instance, “social responsibility and civic awareness” or “abstract and analytical thinking” might be, if both institutions say that’s a desired learning outcome.  Thus, comparing learning outcomes in similar programs across institutions becomes possible, but only where both programs have similar goals.

The other big new initiative is south of the border and it’s called the Multi-State Collaborative to Advance Quality Student Learning (why can’t these things have better names?  This one’s so bad they don’t even bother with an initialism).  This project still focuses on institutional outcomes rather than program-level ones, which reflects a really basic difference in how the US and Europe understand the purpose of undergraduate degrees (the latter caring a whole lot less, it seems, about well-roundedness in institutional programming).  But, crucially in terms of generating acceptance in North America, it doesn’t base its assessment on a (likely low-stakes) test.  Rather, samples of ordinary student course work are scored according to various rubrics designed over a decade or more (see here for more on the rubrics and here for a very good Chronicle article on the project as a whole).  This makes the outcomes measured more authentic, but implicitly the only things that can be measured are transversal skills (critical thinking, communication, etc.) rather than subject-level material.  This will seem perfectly fine to many people (including governments), but it’s likely to be eyed suspiciously by faculty.

(Also, implicitly, scoring like this on a national scale will create a national cross-subject grade curve, because it will be possible to see how an 80 student in Engineering compares to an 80 in History, or an 85 student at UMass to an 85 student at UWisconsin.  That should be fun.)

All interesting stuff and worth tracking.  But notice how none of it is happening in Canada.  Again.  I know that after 25 years in this business the lack of interest in measurable accountability by Canadian institutions shouldn’t annoy me, but it does.   As it should anyone who wants better higher education in this country.  We can do better.

March 26

Three Stories to Watch in Europe

Europe’s been reasonably quiet for the last few months as far as higher education is concerned, but there are now a number of interesting stories to watch.

Here’s the lowdown on three of them:

In Hungary, the ruling right-wing Fidesz party has announced a wholesale change to the way it funds higher education.  It’s looking to abolish (within the state system at least) a number of courses deemed to be “non-productive” (e.g. communications), and to require others to become fully tuition-fee funded.  Tuition fees have a complicated history in Hungary.  The country adopted them in 1996 and applied them in a dual-track fashion (kids with good secondary marks could go for free; others could attend the same courses if they paid a fee).  In 2008, a voter-initiated referendum on the abolition of tuition fees passed by a 4-to-1 margin.  But this was never implemented: the party that promoted the referendum – Fidesz – promptly attained power and reneged on the deal (much to the relief of the universities who relied on the fees).

It’s fair to say that since attaining power, Fidesz’s policies towards higher education have been pretty nightmarish.  The number of funded places has declined by more than half; an attempt in 2013 to make free places contingent on students signing an agreement to work in Hungary for twice the length of their university course was foiled only by the European Commission (which took a dim view of the attempt to restrict mobility rights).  The attempted 2013 reforms drew sustained opposition from students and faculty – a rare event in a country where the governing party has a massive majority.  We’ll see how this new policy plays out.

In the Netherlands, there is a simply fascinating student uprising going on against “managerialist” universities.  It started when the University of Amsterdam announced that a number of different courses in the humanities would merge into a single liberal arts program.  This led to a two-week student sit-in at the Bungehuis (home of the humanities faculty) that ended in a police action, but students resumed the sit-in at a nearby building shortly thereafter.

The students’ critique is not, interestingly enough, about underfunding (the humanities faculty has done rather well in terms of funding recently, despite a small drop in enrolments), but rather about the secretive and “anti-democratic” nature of the modern university and – echoes for Canada here – universities losing money on property deals.  It’s struck enough of a chord that the university has put forward a ten-point plan to meet the students’ demands.  On the face of it, there are some big steps forward here, though likely not enough to satisfy protesters, who may feel they’re on a roll: copycat protests broke out last week at the London School of Economics.  My guess is that this peters out in a week or two, but it may be the beginning of some valuable discussions about how universities are managed in Europe.

Finally, one consequence of the economic crisis in Russia is that students are not receiving their government bursaries.  Basically, what appears to have happened is that cash-strapped universities have raided funds received from government to pay for short-term costs (such as making payroll).  This probably isn’t more than a one-week story – eventually bursaries will be sorted out.  But it’s indicative of the kinds of problems Russian higher education – indeed, all Russian institutions – are currently experiencing.

February 26

A More Productive Debate on “Differentiation”

One of the big topics over the past three years in Canada – and particularly in Ontario – has been that of “differentiation”.  The idea of differentiation as a boon to the university system essentially traces back to Adam Smith.  Just as production in Smith’s hypothetical pin factory can be increased many times over by having different workers work on different aspects of pin-making, so too can a university system be made more productive by having institutions concentrate on different aspects of higher education.

It also gains some moral force from the observation that universities tend towards isomorphism.  It’s not that everyone wants to be Harvard or U of T, exactly (though there is a bit of that); rather, it’s that every institution has a drive to make itself more organizationally complex and more research-oriented.  And collegiality makes it difficult for institutions to actually prioritize some fields of study over others: the institutions that are genuinely excellent in some fields, and yet can’t say so out loud, or do anything to promote those areas, for fear of offending other internal constituencies, are too many to count.

So, OK, specialization.  But in what?  The most counterproductive aspect of the current debate on differentiation in Canada is that it tends to revolve around only one possible axis of differentiation (research vs. teaching), and it assumes that the relevant unit of analysis is the institution, as opposed to the discipline.  Neither of these should be a given.

To see how the debate could be re-cast, have a look at how the Dutch have handled the differentiation issue.  A few years ago, the Government struck a Committee on the Future Sustainability of the Dutch Higher Education System, whose report was entitled Threefold Differentiation – it’s a very smart plea for greater mission differentiation, without creating a zero-sum game around research.

The key difference between the Dutch and Canadian approaches is the idea of institutional “profiling”.  Institutions can specialize in lots of different ways: the kinds of students they accept, their involvement in knowledge exchange and interactions with business, their degree of orientation towards internationalization and/or regional engagement.  They can also specialize somewhat in terms of research profile (smaller institutions, for instance, might have doctoral programs in arts or science, but not both).  Thus, the Dutch view of differentiation is significantly more multi-dimensional than ours.  And more dimensions = more ways for institutions to find a “win”.

A second key difference between the Dutch and Canadian approaches is the idea of performance and incentives.  There’s no pretence in the Netherlands that you can pull institutions away from isomorphism without some form of reward: institutions need to be rewarded for their profiles.  (In Canada, of course, the only incentive governments provide is to be big and research-intensive.  And then they whine about institutions choosing not to differentiate themselves.  Go figure.)  But the Dutch are equally clear that these rewards need to be based on transparent and measurable outcome indicators.  In Canada, we’ve sort of grasped the indicator bit, without necessarily tying it to rewarding specific profiles.

In short, there are specialization gains to be had through greater differentiation.  But as long as we define differentiation as being specifically about research vs. teaching, my guess is that we aren’t going to realize them.

December 11

Manageable Debt, Part 2

Yesterday, we looked at the principles underlying the discussion on manageable student debt; today we examine how Canadian governments try to help students manage debt, and whether or not their efforts are as efficient as they could be.

Manageable debt loads are a function of three things: total debt, interest rates, and student income.  The last of these is only marginally susceptible to government control, but governments can control program interest rates and total debt loads through direct subsidies.  What remains odd about Canadian student aid is that our governments use these tools in such an ineffective manner.

Let’s start with interest rates.  Unlike the Netherlands, where student loans carry one low interest rate (equal to the government’s cost of borrowing) throughout the loan’s life, our system has negative real interest rates while students are in school, and then strongly positive rates (prime plus 2.5%) during repayment.  Adopting a more Dutch-like system would raise debt-at-graduation slightly, because interest would be charged while students are in school – for the average student, the increase would be about $2,000.  However, monthly repayments would actually be lowered by about 8%, thanks to lower interest rates after graduation.  Since the Dutch approach is essentially cost-free, it’s a bit of a mystery why we haven’t adopted it, given its obvious positive effects on debt management.
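To make the arithmetic concrete, here is a minimal sketch of the comparison.  All of the inputs – prime at 3%, a government borrowing rate of 2%, a ten-year repayment term, $6,900 borrowed at the start of each of four years – are illustrative assumptions rather than actual program parameters, so the outputs will only roughly match the figures above.

```python
# Rough comparison of the two interest regimes described above.
# All parameters are illustrative assumptions, not actual program values.

def monthly_payment(principal, annual_rate, months):
    """Standard amortization (annuity) payment for a fixed-rate loan."""
    r = annual_rate / 12
    if r == 0:
        return principal / months
    return principal * r / (1 - (1 + r) ** -months)

ANNUAL_LOAN = 6_900      # assumed: borrowed at the start of each of 4 years
YEARS_IN_SCHOOL = 4
PRIME = 0.03             # assumed prime rate
GOV_RATE = 0.02          # assumed government cost of borrowing
REPAY_MONTHS = 120       # assumed 10-year repayment term

# Current Canadian-style regime: no interest accrues while in school,
# prime + 2.5% during repayment.
debt_current = ANNUAL_LOAN * YEARS_IN_SCHOOL
pay_current = monthly_payment(debt_current, PRIME + 0.025, REPAY_MONTHS)

# Dutch-style regime: the government rate applies for the whole life of the
# loan, so each year's advance compounds until graduation.
debt_dutch = sum(ANNUAL_LOAN * (1 + GOV_RATE) ** n
                 for n in range(1, YEARS_IN_SCHOOL + 1))
pay_dutch = monthly_payment(debt_dutch, GOV_RATE, REPAY_MONTHS)

print(f"Debt at graduation: ${debt_current:,.0f} (current) vs ${debt_dutch:,.0f} (Dutch-style)")
print(f"Monthly payment:    ${pay_current:,.2f} vs ${pay_dutch:,.2f} "
      f"({1 - pay_dutch / pay_current:.0%} lower)")
```

Under those assumptions, debt at graduation rises by roughly $1,400 and the monthly payment falls by roughly a tenth; different rate assumptions move the exact numbers around, but not the direction of the trade-off.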

Or take remission.  What matters in terms of debt manageability is debt at the time a student finishes; yet, nearly all provinces base their debt-management programs on annual debt levels.  Thus, in Ontario, someone who borrows $6,900/year for four years graduates with $27,600 in debt and never sees a penny of remission, while students who borrow $10,000 for a single year (a pretty typical situation, given the rules we have about students becoming independent in their fifth year of studies) get a $3,000 debt write-down from the province.
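A minimal sketch of the contrast (the $7,000 annual threshold is inferred from the $10,000-minus-$3,000 example above, and the $20,000 per-degree cap is purely hypothetical; actual provincial rules are more involved):

```python
# Two remission designs compared, using the figures from the paragraph above.
# The $7,000 annual threshold is inferred from the $10,000-minus-$3,000
# example; the $20,000 per-degree cap is purely hypothetical.

ANNUAL_THRESHOLD = 7_000   # debt above this in any given year is written down
TOTAL_CAP = 20_000         # hypothetical cap on total debt per degree

def annual_remission(yearly_borrowing):
    """Remission calculated year by year (the current provincial approach)."""
    return sum(max(0, amount - ANNUAL_THRESHOLD) for amount in yearly_borrowing)

def total_cap_remission(yearly_borrowing):
    """Remission calculated against total debt at the end of the program."""
    return max(0, sum(yearly_borrowing) - TOTAL_CAP)

students = {"4 years x $6,900": [6_900] * 4,   # $27,600 total
            "1 year x $10,000": [10_000]}      # $10,000 total

for label, borrowing in students.items():
    print(f"{label}: total debt ${sum(borrowing):,}, "
          f"annual-based remission ${annual_remission(borrowing):,}, "
          f"per-degree-cap remission ${total_cap_remission(borrowing):,}")
```

Under the annual rule the single-year borrower gets the entire write-down; under a per-degree cap the ordering flips, and the student with the larger total debt is the one who gets help.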

Why do we do this when it makes so little sense from a debt management perspective?  The answer, mostly, is History.  Remission programs were introduced in the early 1990s to replace grant programs that had become unaffordable.  Since those old programs delivered money on an annual basis, it made political sense for these new ones to do so as well.

But maybe it’s time to take another look at this.  Alberta has long had a policy of capping total debt per degree; and, moreover, it varies the size of this cap by program – doctors, sensibly, are allowed to take on a lot more debt than humanities students.  Properly targeted, this approach could be substantially more efficient at ensuring manageable debt than what most provinces are currently doing.  An Alberta Advantage, indeed.

October 06

Interest on Student Loans – Time to Go Dutch

News out of the U.S. suggests that one possible casualty of that country’s budget crisis is the in-school interest subsidy on student loans. Since Canadian governments almost always end up copying the Americans on student aid eventually (see: income-based grants, rules on institutional designation, workforce-related loan forgiveness, etc.), this seems like a good time for Canada to review its own policies on student loan interest.

Some countries, like Germany and many developing countries, charge no interest at all on student loans (Newfoundland and Labrador eliminated interest on provincial student loans in its 2009 budget).  This is a wastefully expensive way to run a student loan system (once inflation is taken into account, the government is effectively paying students to borrow) and does a poor job of targeting public money to those who need it most.  Other countries, like Australia, charge students inflation but no real interest – somewhat less wasteful, but still not brilliant.  Countries such as the Netherlands charge students a rate based on the cost of government borrowing – that is, they aren’t subsidizing loans, but they are lending to students at a rate substantially below what the market would charge them.  Then there are countries which charge students something close to a market rate, and use the “profit” to cover losses from defaults.

The U.S. and Canada are the only countries that try to mix different approaches – unbelievably cheap loans (i.e., full interest subsidies) while students are in school, and market-like rates when borrowers are in repayment.  From a policy perspective this makes almost no sense whatsoever, as the biggest subsidies go to people who borrow a lot but repay quickly.  In a system where 20% of borrowers repay within two years (largely thanks to hefty gifts from family members), it’s hard to say that this is an effective way to spend money.

For about what we’re paying now, we could move to a Dutch scheme where we charge students the government rate of borrowing throughout the life of their loans.  It would increase student debt at graduation, but also make that debt significantly easier to pay back.  The big winners would be people with low post-graduation incomes who require many years to pay back their debts; the big losers would be people who get family members to pay off their debts after graduation.
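Here is a back-of-the-envelope sketch of that winners-and-losers claim.  The rates, term lengths and borrowing pattern are all assumptions for illustration (prime at 3%, government borrowing at 2%, $6,900 borrowed in each of four years), not actual program parameters.

```python
# Who wins and who loses from a single government-rate loan?  All inputs are
# illustrative assumptions: $6,900 borrowed in each of 4 years, prime = 3%,
# government borrowing rate = 2%.

def monthly_payment(principal, annual_rate, months):
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months) if r else principal / months

def repayment_interest(principal, annual_rate, months):
    """Total interest paid over an amortized repayment period."""
    return monthly_payment(principal, annual_rate, months) * months - principal

GOV_RATE = 0.02
MIXED_RATE = 0.03 + 0.025          # assumed: prime (3%) plus 2.5%
BORROWED = 6_900 * 4               # $27,600 advanced over four years

# Under the single-rate scheme, interest also accrues while in school.
debt_single = sum(6_900 * (1 + GOV_RATE) ** n for n in range(1, 5))
in_school_interest = debt_single - BORROWED

for years in (2, 10):              # fast repayer vs. slow repayer
    months = years * 12
    mixed = repayment_interest(BORROWED, MIXED_RATE, months)
    single = in_school_interest + repayment_interest(debt_single, GOV_RATE, months)
    print(f"Repaid over {years:>2} years: lifetime interest ${mixed:,.0f} "
          f"(current mixed regime) vs ${single:,.0f} (single government-rate loan)")
```

Under those assumptions, the two-year repayer pays a few hundred dollars more in total interest under the single-rate scheme, while the ten-year repayer pays several thousand dollars less – exactly the redistribution described above.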

Which group would you rather got our tax dollars?