HESA

Higher Education Strategy Associates

August 26

What Students Really Pay

In a couple of weeks, Statistics Canada will publish its annual Tuition and Living Accommodation Cost (TLAC) survey, which serves as a yearly excuse for the usual suspects to complain about tuition fees.  But sticker price is only part of the equation: while governments and institutions ask students to pay for part of the educational costs, they also find ways to lessen the burden through subsidies like grants, loan remission, and tax expenditures.  And Statscan never bothers to count that stuff.

Today, we at HESA are releasing a publication called The Many Prices of Knowledge: How Tuition and Subsidies Interact in Canadian Higher Education.  Unlike any previous publication, it looks not just at a single sticker price, but rather at the many different possible prices that students face depending on their situation.  We take ten student cases (e.g. first-year dependent student in college, family income = $80,000; married university student, spousal income = $40,000; etc.), and we examine how much each student would be able to receive in grants, tax credits, and loan remission in each of the ten provinces.  It thus allows us to compare up-front net tuition (i.e. tuition minus grants) and all-inclusive net tuition (i.e. tuition minus all subsidies) not just across provinces, but also across different students within a single province.
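To make the two measures concrete, here is a minimal sketch of the arithmetic in Python, with made-up dollar figures (none of these numbers come from the report):

```python
# Toy illustration of the report's two net-tuition measures.
# All dollar figures are hypothetical; the study computes these
# separately for each student case in each province.

sticker_tuition = 6000   # posted tuition fee
grants          = 3500   # up-front, non-repayable grants
tax_credits     = 1200   # value of tuition and education tax credits
loan_remission  = 1500   # debt forgiven after the year of study

# Up-front net tuition: what the student is out of pocket at registration.
up_front_net = sticker_tuition - grants  # 2500

# All-inclusive net tuition: the cost once every subsidy is counted.
all_inclusive_net = sticker_tuition - (grants + tax_credits + loan_remission)  # -200

# A negative all-inclusive figure means subsidies exceed the sticker price.
print(up_front_net, all_inclusive_net)
```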

Some nuggets:

  • On average, a first-year, first-time student attending university directly from high school, with a family income of $40,000 or less, receives $63 more in subsidies than they pay in tuition, after all subsidies – including graduate rebates – are accounted for (i.e. they pay roughly net zero tuition on an all-inclusive basis).  If they attend college, they receive roughly $1,880 more in subsidies than they pay in tuition (i.e. -$1,880 tuition);
  • A first-year, first-time student attending university in Quebec, from a family with $40K in income, pays -$393 in all-inclusive net tuition after all government subsidies.  In Ontario, the same student pays -$200.  But if we were to include institutional aid, the student in Ontario would likely be the one better off, since students in Ontario with entering averages over 80% regularly get $1,000 entrance awards, while students in Quebec tend not to.  For some students at least, Ontario is cheaper than Quebec;
  • On average, college students who are also single parents receive something on the order of $11,000 in non-repayable aid – that is, about $8,500 over and above the cost of tuition.  In effect, it seems to be the policy of nearly all Canadian governments to provide single parents with non-repayable aid equal to tuition plus the cost of raising kids, leaving the student to borrow only for his/her own living costs.

The upshot of the study is that Canada’s student aid system is indeed generous: in none of our case studies did we find a student who ended up paying more than 62% of the sticker price of tuition when all was said and done, and most paid far less.  But if that’s the case, why are complaints about tuition so rife?

Two reasons, basically.  First, Canada’s aid system may be generous, but it is also opaque.  We don’t communicate net prices effectively to students because institutions, the provinces, and Ottawa each want to get credit for their own contributions.  If you stacked all the student aid up in a comprehensible single pile, no one would get credit.  And we can’t have that.

The second reason is that Canada only provides about a third of its total grant aid at the point where students pay tuition fees.  Nearly all the rest, stupidly, arrives at the end of a year of studies.  More on that tomorrow.

August 25

Back to School 2014

Morning, all.  Ready for the new school year run-down?

One thing already clear is that pretty much the whole sector has finally come to grips with the reality that annual 4% increases in funding aren’t coming back any time soon. That’s causing institutions to think more strategically than they’ve had to in a long time, which is a Good Thing – the downside is that there appear to be some places where this hard thinking is leading to some fairly ugly clashes between management and labour (hel-lo Windsor!).  Contract negotiations this year could be very interesting.

Federally, the big story will likely be how to make the new Canada First Research Excellence Fund work.  Yeah, remember that?  Will it actually make big investments in big universities, the way its U-15 authors hoped?  Or will the usual Canadian political dynamic intervene and turn it into something that distributes excellence funding more widely?  My money’s on option 2, which means in a few years the U-15 will have to convince people to establish an even newer research fund-distributing body, with even tighter funding award criteria.  Fun times.

The spring budget – the last before the next federal election – will almost certainly be about tax cuts, but it will be interesting to see if the feds think there’s something to be gained politically in more higher education spending.  My guess is no, because CFREF was the Tories’ “big swing” for this mandate.  Provincial budgets, I suspect, will all be status quo, but you never know.  The fact is money’s tight, growth is slow, and almost no one (except possibly Alberta) is looking at good times ahead any time soon.  Frankly, we’re only one good financial crisis away from some more swingeing cuts in public spending.

And speaking of swingeing cuts, some of those are on the table Down Under, along with a very radical plan to de-regulate tuition fees.  I spent part of my summer in Australia and New Zealand learning about some of the many interesting policy developments going on there.  Both countries – so similar to our own, yet with intriguing differences – have important lessons for Canadians on the higher education and skills files, and I’ll be talking a lot about them over the next few months, starting Thursday, with a two-part primer on the Australian de-regulation imbroglio.

One of our big summer projects at HESA towers has been following up on our Net Zero Tuition work with a much more detailed look at how students with specific profiles fare in all ten provinces.  It’s possibly the most detailed look ever taken at how student financial assistance plays out on the ground with real students, and we’ll be releasing it tomorrow, and covering it on the blog both Tuesday and Wednesday.  I look forward to the feedback.

But the real event to look forward to this fall is… the Worst Back-to-School Story of the Year Award!  Now in its third year, this award highlights the most ludicrous, under-researched, over-egged story on higher education.  Previous winners include Carol Goar and Gary Mason.  So far this year the back-to-school journalism scene has been pretty quiet, so the field is wide open.  Do send in any nominations to info@higheredstrategy.com.

Back to work!

August 18

Credit-Transfer, Korean Style

There are not many genuinely unique ideas in higher education.  Today, we at HESA are releasing a paper about one of them: Korea’s Academic Credit Bank System (ACBS), available here.

Korea has long had a problem with credit transfer.  Its higher education institutions are fairly rigid in terms of admissions, and few like accepting transfer students.  Another big problem is that Korean males have to do roughly two years (it depends on the service) of universal military service, and they tend to do it before they’ve finished their higher education.  People wanting to combine credits taken at a variety of institutions – including courses taken while on duty – were usually out of luck when they tried to get those credits counted at their old institution.

So, in the 1990s, the Korean government started experimenting with new forms of educational credential.  Their first attempt was to create something called the Self-Study Bachelor’s Degree, in which Koreans could basically take a bunch of challenge exams on their own, and obtain a credential.  But the take-up rate on this was fairly low (lessons here for people who thought MOOCs would create a tsunami of change, had they cared to check), and so the search for solutions was back on.

Eventually, they hit on the answer: if we can’t get existing institutions to agree on transfer credit, why not create a new institution that specializes in granting transfer credit?  In fact, why not create an institution that doesn’t offer any credit of its own, but just develops degree standards and tells people what courses to take in order to meet them?  No more schools telling you: “yeah, you’ve already taken that methods pre-req class, but you haven’t taken our methods pre-req class, so you’ll need to do that material again” – if the class gets accredited by the ACBS, you can use it as a building block to an ACBS degree.
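To make the mechanism concrete, here is a toy sketch in Python of how a credit bank might check a student's accumulated credits against a degree standard.  Every name and number below is invented for illustration; none of it is drawn from the actual ACBS rules.

```python
# Toy model of a credit bank: the bank grants no credit itself, but holds
# degree standards (sets of bank-accredited courses plus a credit total)
# and checks whether a student's credits, earned anywhere, satisfy one.
# All field names and requirements are invented for illustration.

DEGREE_STANDARDS = {
    "social_work": {"required": {"SW101", "SW201", "STATS1"}, "min_credits": 120},
}

def meets_standard(field: str, completed: set, total_credits: int) -> bool:
    """True if the student's accredited courses satisfy the degree standard."""
    std = DEGREE_STANDARDS[field]
    return std["required"] <= completed and total_credits >= std["min_credits"]

# Credits from any accredited provider count, regardless of institution.
print(meets_standard("social_work", {"SW101", "SW201", "STATS1"}, 126))  # True
print(meets_standard("social_work", {"SW101", "STATS1"}, 126))           # False
```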

Does it work?  You might say that.  The ACBS now awards roughly 8% of all degrees in South Korea, second only to the Korea National Open University (KNOU).  It has curricula for over 200 individual fields of study, and increasingly, the ACBS target audience is older and more female, with more and more degrees coming in areas like social work and early childhood education.

Would the ACBS work well outside Korea?  Hard to say: it’s not clear, for instance, that a Canadian ACBS would get a lot of clients.  Yes, the Canadian credit transfer system outside British Columbia and Alberta (which have quite developed transfer mechanisms) is less a “system” than a barrel full of ad-hoc solutions.  But as I showed back here and here, this ad-hoc-ery actually works out okay in practice.  So while a Credit Bank might be a much cleaner solution than the piecemeal kind of approach Ontario has taken through ONCAT, it may also be overkill.

Where this approach might make even more sense is in the US, where, as Cliff Adelman and others have shown, there are millions of people with unfinished degrees who want to get a degree, but who are worried about the time and expense of having to re-start their studies.  The main barrier, really, is: how do you get something like that past accreditation agencies?  But given the potential benefits, it’s a problem worth thinking about.

August 11

Improving Career Services Offices

Over the last few years, what with the recession and all, there has been increased pressure on post-secondary institutions to ensure that their graduates get jobs.  Though that’s substantially the result of things like curriculum and one’s own personal characteristics, landing a job also depends on being able to get interviews and to do well in them.  That’s where Career Services Offices (CSOs) come in.

Today, HESA released a paper that looks at CSOs and their activities.  The study explores two questions.  The first question deals specifically with university CSOs and what qualities and practices are associated with offices that receive high satisfaction ratings from their students.  The second question deals with college career services – here we did not have any outcome measures akin to the Globe and Mail’s satisfaction ratings, so we focussed on a relatively simple question: how do their structure and offerings differ from what we see in the university sector?

Let’s deal with that second question first: college CSOs tend to be smaller and less sophisticated than those at universities of the same size.  At first glance, that seems paradoxical – these are career-focussed organizations, aren’t they?  But the reason for this is fairly straightforward: to a large extent, the responsibility for making connections between students and employers resides at the level of the individual program rather than with some central, non-academic service provider – a lot of what takes place in a CSO at universities takes place in the classroom at colleges.

Now, to universities, and the question: what is it that makes for a good career services department?  To answer this question we interviewed CSO staff at high-, medium-, and low-performing institutions (as measured by the Globe and Mail’s pre-2012 student satisfaction surveys) to try to work out what practices distinguished the high-performers.  So what is it that makes for a really good career services office?  Turns out that the budget, staff size, and location of Career Services Offices aren’t really the issue.  What really matters are the following:

  • Use of Data.  Everybody collects data on their operations, but not everyone puts it to good use.  What distinguishes the very best CSOs is that they have an effective, regular feedback loop to make sure insights from the data are being used to modify the way services are delivered.
  • Teaching Job-seeking Skills.  Many CSOs view their mission as making as many links as possible between students and employers.  The very best-performing CSOs find ways to teach job search and interview skills to students, so that they can more effectively capitalize on any connections.
  • Better Outreach Within the Institution.  It’s easy to focus on making partnerships outside the institution.  The really successful CSOs also make partnerships inside the institution.  One of the key relationships to be nurtured is with academic staff.  Students, for better or for worse, view profs as frontline staff and ask them lots of questions about things like jobs and careers.  At many institutions, profs simply aren’t prepared for questions like that, and don’t know how to respond.  The best CSOs take the time to reach out to staff and partner with them to ensure they have tools at their disposal to answer those questions, and to direct students to the right resources at the CSOs.

If you want better career services, there’s your recipe.  Bonne chance.

July 28

Coming Soon to Ontario: a British Columbia Solution

Unless you’ve been out of the country, or under a rock, for the last couple of years, you’ll be at least vaguely familiar with the concept that the province of Ontario is broke.  So broke, in fact, that it has departed radically from previous practice and, back in 2012, effectively froze physicians’ pay for two years.  Not individual physicians, of course – but on aggregate.  A zero overall increase.  And the government is now working to try to extend this freeze for another couple of years.

That makes good sense.  Health care costs have historically risen at several percentage points above inflation, and have squeezed other parts of the provincial budget (education has held out OK, all things considered, but other budget lines have been less well-protected), and physician costs are a major item in the health care budget.  And so saving money in this area is to be welcomed, even if it is at the expense of physicians – who are a politically tricky group to offend.

So what might the provincial government think of the post-secondary sector handing its employees raises of 3-4% per year, on aggregate?  Do you think aggrieved doctors won’t point to this anomaly during this pay round?  Can you imagine any possible rejoinder the government could offer that would make the least bit of sense?

Don’t you think maybe this situation is about to come to a rather sudden end?

For whatever reason, universities in Ontario have not been able to resist rising wage pressure from full-time faculty.  Despite money getting tighter, they have felt compelled to sign agreements that are plainly beyond their means (Hel-LO, University of Ottawa!).  Some university presidents, unable to deal with this problem on their own, are saying that they need government to step in to play “bad cop”.

What does the “bad cop” look like?  Well, take a look at how BC has handled wage negotiations over the last few years.  There, universities (and all other public sector entities) are given a biennial mandate with respect to employee negotiations.  In 2010 and 2011, the deal was: negotiate what you like, but the net cost of any new compensation deal cannot exceed zero.  In 2012 and 2013, that was changed to: there could be increases, but they had to be balanced dollar-for-dollar with efficiency savings in other areas.  The 2010-11 arrangements, in conception at least, are close to the approach Ontario has taken with its physicians.  And it worked pretty well, at least in the sense that it has kept costs down with a minimum of political fuss.

Not everyone in Ontario agrees that the province needs to take the bad cop approach.  Quite a number of university presidents (you can probably guess which ones) oppose asking the province to legislate on their behalf; not only do they feel it morally incumbent on universities to manage their own books, but they also believe it is tactical suicide to ask governments to legislate – once you go down that road, there’s no telling where they’ll stop micromanaging universities’ affairs.

The problem is, those presidents haven’t exactly been too successful at reining in costs, either (hello again, University of Ottawa!).  So it seems almost inevitable that some sort of BC-like solution is on the cards.  The idea that profs could gain 15% in salary over four years, while physicians get zero, simply isn’t politically tenable.

July 21

University of Saskatchewan Detritus

We all remember this spring’s controversy at the University of Saskatchewan over the firing of Robert Buckingham, which resulted in the resignation of the University’s Provost, Brett Fairbairn, and the firing of the President, Ilene Busch-Vishniac.  Despite all the coverage, a number of key questions were never answered, like “how could anyone possibly think firing a tenured professor was a good idea?”  And, “whose idea was it to fire him anyway – the Provost’s or the President’s?”

We now have more insight, as Fairbairn recently released a five-page letter providing his perspective on events.  Two key points from his account:

  • The decision to fire Buckingham as Dean was a group decision.  The Provost, “leaders responsible for Faculty Relations, HR internal legal expertise and Communications”, and the President (by phone) were all present.  But the key question of whether to dismiss him from the university altogether was referred to HR for further study.  At this point Busch-Vishniac told Fairbairn: “I will stand behind any actions you deem necessary and will not second-guess”.
  • The decision to fire him from both jobs was the HR department’s recommendation.

How HR came to this conclusion isn’t clear; Fairbairn notes that it had happened before at U of S in a case where there had been an irreparable breakdown in relations between employer and employee.  Without knowing the case to which he’s referring, it’s hard to know what to make of this.  Certainly, the employer-employee relationship with Buckingham as a dean was irreparably damaged (which is why they were correct to fire him); it’s not at all clear that he couldn’t have remained as a faculty member, since he wouldn’t have had any real contact with any of the superiors whose trust he had abused as Dean.  For whatever reason, Fairbairn decided to take the “expert advice” from HR, and did so without looping back to the communications people to get their input (which might have been valuable) or checking with Busch-Vishniac.

Far from backing up Fairbairn as promised, Busch-Vishniac threw him under the bus and asked for his resignation three days later.  That was emphatically the wrong call.  From the moment she gave the go-ahead for Buckingham’s dismissal, it was clear that either both of them would stay, or neither would.  Fairbairn decided to go, astutely noting that “the only thing worse than blame and recrimination among senior leaders is mutual recrimination among senior leaders”.

Fairbairn’s letter is a valuable peek into how crises get managed at universities.  I think it shows him as a manager with mostly the right instincts, but who erred in accepting some terrible advice from professionals who should have known better.  Others – mostly people who genuinely have no insight into how major organizations function – will probably see this distinction as irrelevant since the real crime was firing Buckingham as a Dean in the first place.  Former CAUT director James Turk, in particular, has made the “managers should have a right to criticise each other publicly” case – to which the correct response is: “and how much freedom did Turk allow his staff and executive to criticise his management as CAUT director?”.

If I were at the University of Saskatchewan, though, my main question after reading Fairbairn’s letter would be: “how is it that the HR department got off comparatively lightly?”  Food for thought.

July 14

Paul Cappon, Again

You may have noticed that Paul Cappon – former President of the Canadian Council of Learning – had a paper out last week about how what the country really needs is more federal leadership in education.  It is desperately thin.

Cappon starts by dubiously claiming that Canada is in some sort of education crisis.  Canada’s position as a world-leader in PSE attainment is waved away thusly: “this assertion holds little practical value when the competencies of those participants are at the low end compared with other countries”.  In fact, PIAAC data shows that graduates of Canadian universities, on average, have literacy skills above the OECD average.  Cappon backs up his claim with a footnote referencing this Statscan piece, but said document does not reference post-secondary graduates.  Hmm.

And on it goes.  He cites a Conference Board paper putting a $24 billion price tag on skills shortages as evidence of “decline”, even though there’s no evidence that this figure is any worse than it has ever been (Bank of Canada data suggests that skills shortages are always with us, and were worse than at present for most of the aughts), or that a similar methodology would not lead to even bigger figures in other countries.  He cites a low proportion of graduates in STEM fields like engineering as cause for concern, despite a total lack of evidence that this percentage has anything at all to do with economic growth.

Ridiculously, he proclaims that the Canadian education system (not higher education – all education from kindergarten onwards) is less effective than the Swiss one because they beat us on some per capita measures of research output (which are of course largely funded through pockets entirely separate from the education budget).  Yes, really.  For someone so in favour of evidence-based policy, Cappon’s grasp on what actually constitutes evidence is shockingly tenuous.

Having cobbled together a pseudo-crisis, Cappon concludes that “the principal cause of our relative regression is that other countries are coordinating national efforts and establishing national goals… Canada, with no national office or ministry for education, is mired in inertia, each province and territory doing its best in relative isolation.”  Now, there is no evidence at all that the existence of national systems or national goals makes a damn bit of difference to PIAAC outcomes.  Germany comes in for all sorts of praise because of its co-operation between national and state governments, but its PIAAC and PISA outcomes are actually worse than Canada’s.  Yet Cappon believes – without a single shred of evidence – that if only there were some entity based in Ottawa that could exercise leadership in education, everything would be better.

Two points: first, our nation rests on a very simple compromise.  Back in 1864, the *only* way in which Catholic, Francophone Lower Canada could be tempted into agreeing to a federal government with representation by population was if that new level of government never, ever, ever got its hands on education.  That was, and is, the deal.  It’s not going to change.  Period.

Second, everyone needs to remember that there was a time, not long ago, when the Director-General of CMEC suggested exactly such a body to some deputy ministers in Ottawa, and Ottawa created a body not dissimilar to what Cappon is describing.  But said CMEC DG didn’t tell the provinces he was negotiating this deal with the feds.  When he was offered the leadership of the new organization, he jumped, taking several CMEC senior staff with him.  The provinces, feeling betrayed by the DG’s chicanery, refused to work with the new organization, which consequently achieved little before being shut down in 2011.

That DG, of course, was Paul Cappon.

This report is essentially a self-justification for a dishonest act performed nearly a decade ago, rather than a genuine contribution to public policy.  If Cappon actually cared about repairing federal-provincial relations around learning, a mea culpa would have been more appropriate.

July 07

How to Measure Teaching Quality

One of the main struggles with measuring performance in higher education – whether of departments, faculties, or institutions – is how to measure the quality of teaching.

Teaching does not go entirely unmeasured in higher education.  Individual courses are rated by students through course evaluation surveys, which occur at the end of each semester.  The results of these evaluations do have some bearing on hiring, pay, and promotion (though how much bearing varies significantly from place to place), but these data are never aggregated to allow comparisons of quality of instruction across departments or institutions.  That’s partly because faculty unions are wary about using individual professors’ performance data as an input for anything other than pay and promotion decisions, but it also suits the interests of the research-intensive universities who do not wish to see the creation of a metric that would put them at a disadvantage vis-a-vis their less-research-intensive brethren (which is also why course evaluations differ from one institution to the next).

Some people try to get around the comparability issue by asking students about teaching generally at their institution.  In European rankings (and Canada’s old Globe and Mail rankings), many of which have a survey component, students are simply asked questions about the quality of courses they are in.  This gets around the issue of using course evaluation data, but it doesn’t address a more fundamental problem, which is that a large proportion of academic staff essentially believes the whole process is inherently flawed because students are incapable of knowing quality teaching when they see it.  There is a bit of truth here: it has been established, for instance, that teachers who grade more leniently tend to get better course satisfaction scores.  But this is hardly a lethal argument.  Just control for average class grade before reporting the score.
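A minimal sketch of that adjustment in Python, assuming you have per-course average grades and satisfaction scores (the numbers below are invented): regress satisfaction on average grade, and report the residual as the leniency-adjusted score.

```python
# Sketch: adjust course satisfaction for grading leniency by regressing
# satisfaction on average class grade and reporting the residuals.
# All data below is invented for illustration.
import numpy as np

avg_grade    = np.array([68, 72, 75, 80, 85, 88])        # mean grade per course
satisfaction = np.array([3.2, 3.5, 3.6, 4.0, 4.3, 4.5])  # mean rating per course

# Fit satisfaction = a + b * grade by ordinary least squares.
b, a = np.polyfit(avg_grade, satisfaction, 1)

# Residuals: the part of satisfaction that leniency can't explain.
adjusted = satisfaction - (a + b * avg_grade)
print(np.round(adjusted, 3))
```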

It’s not as though there isn’t a broad consensus on what makes for good teaching.  Is the teacher clear about goals and expectations?  Does he or she communicate ideas effectively?  Is he or she available to students when needed?  Are students challenged to learn new material and apply this knowledge effectively?  Ask students those kinds of questions and you can get valid, comparable responses.  The results are more complicated to report than a simple satisfaction score, sure – but it’s not impossible to do so.  And because of that, it’s worth doing.

And even simple questions like “was this a good course?” might be more indicative than we think.  The typical push-back is “but you can’t really judge effectiveness until years later”.  Well, OK – let’s test that proposition.  Why not just ask students about a course they took a few years ago, and compare their answers with the ones they gave in the course evaluation at the time?  If the two are completely different, we can indeed start ignoring satisfaction-type questions.  But we might find that a good result today is in fact a pretty good proxy for results in a few years, and therefore we would be perfectly justified in using it as a measure of teaching quality.
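The test itself is cheap to run.  Here is a sketch, with invented scores, of the core comparison:

```python
# Sketch of the proposed test: correlate ratings given at the end of a
# course with ratings the same students give years later.
# Scores below are invented for illustration.
import numpy as np

at_the_time = np.array([4.1, 3.2, 4.5, 2.8, 3.9, 4.0])  # original evaluations
years_later = np.array([3.9, 3.0, 4.4, 3.1, 3.6, 4.2])  # retrospective ratings

r = np.corrcoef(at_the_time, years_later)[0, 1]
print(round(r, 2))  # a high correlation would justify today's scores as a proxy
```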

Students may be inexperienced, but they’re not dumb.  We should keep that in mind when dismissing the results of teaching quality surveys.

June 30

The Effects of Tuition Fees (Part 2)

As I mentioned last week, a major paper I’ve been working on for over a year with colleagues from DZHW on the subject of the effects of fees was published last Monday by the EC (available here).  In my last post, I talked about how fees affected institutions – today, I want to talk about how they affect students.

In our report, we looked at case studies over 15 years (1995-2010) from nine countries – Austria, Canada, England, Finland, Germany, Hungary, Poland, Portugal, and South Korea.  These countries represent very different experiences.  Some have private higher education, others don’t.  Some have public sectors that charge fees, while others don’t (Hungary and Poland are in the middle, where public universities provide education free for some while charging for others).  And looking across all the different cases, we found the following:

1)      Most tuition increases have no perceptible effect on enrolment.  The only cases where clear-cut effects could be discerned were in England in 2012, where the increase was about $10,000 in a single year, and in the de-regulation of professional fees in Ontario in the mid-90s (where, bizarrely, low-income students were not affected, but middle-income students were).

2)      That said, there are also some clear-cut cases where tuition has been a driver of increased access.  In both Poland and South Korea, major increases in enrolments were driven by the existence of fee-supported places (mainly but not exclusively in private institutions).

3)      Though this is partly a matter of many countries having education data-sets that make Canada’s look enviable, there is very little evidence that changes in fees have done much to change the composition of the student body.  In every country where there is data, underrepresented groups have done better over time, regardless of the fee regime.  Even in the extreme case of England 2012, under-represented groups (the poorest income quintile, black and Asian students) tended to be less affected by the tuition increase than richer, whiter students. (The one exception here is older students, who were disproportionately affected by the changes.)  To the extent that late-entrants to higher education come from poorer backgrounds, this should be seen as a kind of hidden socio-economic effect of fees.

4)      Changes in tuition fees seem to have had no discernible effects on students’ choice of major, and few discernible effects on students’ decisions about where to study (in Canada, for instance, rates of out-of-province study are actually up over the last decade).

5)      It is not so much that fees themselves have no effect; rather, it is that in nearly all cases, fees are introduced with accompanying increases in student aid.  Sometimes the aid is paltry compared to the size of the fees (e.g. South Korea in the 90s), sometimes it is implemented in a fairly clunky way (Germany, mid-2000s).  But it is always there to offset the worst effects of fees.  And in the case of England 2012, it was there to ensure that students weren’t required to pay a single extra penny of costs up-front, which seems to have been a major factor in limiting the impact of the world’s largest-ever tuition increase (in the short-term at least).

The lesson here?  Unless you’re planning on going England-style crazy, the international evidence suggests fee increases are unlikely to affect access in a measurable way.

June 23

The Effects of Tuition Fees (Part 1)

For the last eighteen months or so, I’ve been working on a project with colleagues Dominic Orr and Johannes Wespel of the Deutsche Zentrum für Hochschul- und Wissenschaftsforschung (DZHW) for the European Commission, looking at the effects of changes in tuition fees and fee policies on institutions and students.  The Commission published the results on Friday, and I want to tell you a little bit about them – this week I’ll be telling you about the effects on institutions, and next week I’ll summarize the results with respect to students.

The first question we answered had to do with whether or not a rise in tuition ultimately benefits higher education institutions.  Critics of fees sometimes suggest that extra fees do not in fact result in institutions receiving more money, because governments simply pull a fast one on the public and withdraw public money from the system, thus leaving institutions no better off.  Our examination of nine case studies revealed there were certainly some occasions where this was the case – Canada in the mid-90s, Austria in 2001, and the UK in 2012 – but that in the majority of cases fee increases were accompanied by stable or increased government funding.  Moreover, in all the cases where there was an accompanying decrease in public funding, it was signalled well in advance by governments, and indeed the increase in fees was deliberately designed to be a replacement for public funds. We did not find a case where a government “pulled a fast one”.

The second question we asked was how universities reacted to the introduction of fees: did they suddenly start chasing money and becoming much more sensitive to the demands of students and donors?  The answer, by and large, was no, for three reasons.  First, tuition isn’t the only financial incentive on offer to institutions; particularly if they are already funded on a per-student basis, the introduction or increase of fees isn’t likely to change behaviour.  Second, institutions won’t go after fees in ways that they think will negatively affect their prestige.  In Germany, for instance, many universities have considerable latitude to raise income via teaching through continuing-education-like programs, but in practice they don’t do this, because they believe that engaging in that sort of activity isn’t prestige-enhancing.  And third, institutions often hold off on altering their behaviour because they don’t believe government policy will “stick”.  In Germany, specifically, the feeling was that the introduction of fees was unlikely to last, and so there was no point in getting too invested in attracting new students to take advantage of it.

In fact, although fees in public institutions are often touted as a way to make universities more flexible and more responsive to business, the labour force, etc., this never actually works in reality, because universities are saddled with enormous legacy costs (you can close a program, but you still have to pay the profs), and have a particular self-image that keeps them closely tied to traditional ways.  What does seem to work – at least to some degree – is to allow the emergence of new types of higher education institutions altogether.  In Poland, it was only the emergence of private universities that allowed the system to take on the explosion of demand in the 1990s.  In Finland, an entirely new type of higher education institution (ammattikorkeakoulu, or “Polytechnics”) was developed to take care of applied education, and it has accounted for 80% of all enrolment growth since 1995.

Next week: the effects on students.  See you then.
