HESA

Higher Education Strategy Associates

November 15

Ten Years of Global University Rankings

Last week, I had the honour of chairing a session at the Conference on World-Class Universities in Shanghai.  The conference was held on the 10th anniversary of the release of the first global rankings (both the Shanghai rankings and the Times Higher Ed rankings – then run by QS – appeared for the first time in 2003), and so it was a time for reflection: what have we learned over the past decade?

The usual well-worn criticisms were aired: international rankings privilege the measurable (research) over the meaningful (teaching); they exalt the 1% over the 99%; they are a function of money, not quality; they distort national priorities… you’ve heard the litany.  And these criticisms are no less true just because they’re old.  But there’s another side to the story.

In North America, the reaction to the global rankings phenomenon was muted – that’s because, fundamentally, these rankings measure how closely institutions come to aping Harvard and Stanford.  We all had a reasonably good idea of our pecking order.  What shocked Asian and European universities, and higher education ministries, to the core was to discover just how far behind America they were.  The first reactions, predictably, were anger and denial.  But once everyone had worked through these stages, the policy reaction was astonishingly strong.

It’s hard to find many governments in Europe or Asia that didn’t adopt policy initiatives in response to rankings.  Sure, some – like the empty exhortations to get X institutions into the top 20/100/500/whatever – were shallow and jejune.  Others – like institutional mergers in France and Scandinavia, or Kazakhstan setting up its own rankings to spur its institutions to greater heights – might have been of questionable value.

However, as a Dutch colleague of mine pointed out, rankings have pushed higher education to the front of the policy agenda in a way that nothing else – not even the vaunted Bologna Process – has done.  Country after country – Russia, Germany, Japan, Korea, Malaysia, and France, to name but a few – has poured money into excellence initiatives as a result of rankings.  We can quibble about whether the money could have been better spent, of course, but realistically, if that money hadn’t been spent on research, it would have gone to health or defence – not higher education.

But just as important, perhaps, is the fact that higher education quality is now a global discussion.  Prior to rankings, it was possible for universities to claim any kind of nonsense about their relative global pre-eminence (“no, really, Uzbekistan National U is just like Harvard”).  Now, it’s harder to hide.  Everybody has had to focus more on outputs.  Not always the right ones, obviously, but outputs nonetheless.  And that’s worth celebrating.  The sector as a whole, and on the whole, is better for it.

November 14

Canada’s Bologna Moment

If you can cast your mind back all of three weeks, before the Ford video(s) and Mike Duffy going kamikaze on the Prime Minister, there was some big news out of Ottawa: a Canada-Europe Comprehensive Economic and Trade Agreement (CETA) had finally been reached.  The finer details of the deal are still unavailable, but one thing that has been promised all along is that it will permit the free movement of labour between Canada and Europe.  And that’s a reason for the higher education sector to pay attention.

Freedom of movement is pretty great, when it works.  But the problem with inter-jurisdictional freedom of movement is that it’s easier to achieve in theory than in practice.  Language barriers crop up, for one thing (even within Canada, lots of anglos who would like to move to Montreal don’t, because their language skills aren’t good enough for the local labour market).  There are idiotic regulatory barriers regarding credentials, for another.  But even where a trade agreement gets rid of credential-based regulatory barriers, there’s still the problem of whether employers actually recognize what a credential means, and can hire and pay people accordingly.

This was a problem in Europe back in the 1990s, before there was a standard system of degrees: there was a riot of different credentials on offer across the continent.  A German Diplom was a five-year technical credential, a French Diplôme was a two-year intermediate academic credential on the way to an undergraduate degree, an Armenian Diplom was a secondary school credential – what employer could keep all that straight?  Far easier just to hire a local, whose credential you understand.  So even though the principle of free movement of labour existed in the European Union, the problem of general credential recognition meant that it was limited in practice.

This problem was a big reason why Europe’s governments got behind the Bologna Process.  Only by standardizing the structure of their higher education systems could they turn de jure mobility rights into a de facto mobility reality.  And so the question for Canada now is: will this free movement of labour actually mean anything if our higher education systems aren’t aligned with Europe’s?  Canada can’t actually become part of the Bologna Process – that’s reserved for countries which are part of the Council of Europe – but there’s nothing saying we can’t harmonize our systems with Bologna’s.

There’s no guarantee, of course, that the benefits of a big shift like Bologna harmonization are in fact worth the hassle.  But there’s also no doubt that the signing of CETA means that the time to ask ourselves the big questions about Bologna, and its benefits, is now.

November 13

Using PIAAC to Measure Value-Added in Higher Ed: US Good, Australia Abysmal

A few weeks ago, when commenting on the PIAAC release, I noted that one could use the results to come up with a very rough-and-ready measure of “value added” in higher education.  PIAAC contains two relevant pieces of data for this: national mean literacy scores for students aged 16-19 completing upper-secondary education, and national mean literacy scores for people aged 16-29 who have completed Tertiary A.  Simply by subtracting the former from the latter, one arrives at a measure of “value added”, which I reproduce below in Figure 1 (the y-axis is the difference in PIAAC scores; PIAAC is scored on a 500-point scale, where 7 points are considered roughly equal to one year of schooling).
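
Just to make the arithmetic concrete, here is a minimal sketch of the Figure 1 calculation.  The scores below are illustrative placeholders, not the actual PIAAC country means.

```python
# Rough-and-ready "value added": mean Tertiary A score minus mean upper-secondary score.
# NOTE: all scores here are made-up placeholders, not real PIAAC results.
POINTS_PER_YEAR_OF_SCHOOLING = 7  # rough rule of thumb cited above

countries = {
    # country: (mean upper-secondary score, 16-19; mean Tertiary A score, 16-29)
    "Country A": (272.0, 301.0),
    "Country B": (289.0, 305.0),
}

for name, (secondary_mean, tertiary_a_mean) in countries.items():
    value_added = tertiary_a_mean - secondary_mean
    years_equivalent = value_added / POINTS_PER_YEAR_OF_SCHOOLING
    print(f"{name}: {value_added:+.1f} points (~{years_equivalent:.1f} years of schooling)")
```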

Figure 1: Tertiary A Value-Added: Mean Tertiary A PIAAC Score Minus Mean Upper Secondary PIAAC Score

This is a bit crude, though; to be genuinely comparable, one needs to control for the proportion of upper-secondary graduates who actually go on to higher education.  Imagine two countries with the same mean score among upper-secondary graduates, but where country X enrols 20% of those graduates in Tertiary A and country Y enrols 40%.  One would expect a larger gap between mean high school and mean Tertiary A scores in country X than in country Y because, comparatively, country X is cherry-picking “better” students.  So we need some way to correct for that.

Fortunately, the OECD’s Education at a Glance provides data both on upper-secondary graduation rates and on tertiary attainment rates by age 30 (indicators A1.2a and A3.1b, if you’re following at home).  From those two figures one can calculate the proportion of upper-secondary graduates who go on to Tertiary A.  And since PIAAC publishes not just means but also scores at the 25th and 75th percentiles, one can estimate the threshold PIAAC score for getting into a Tertiary A program: if 37.5% of your secondary graduates go on to get a degree, the marginal entrant sits at the 62.5th percentile, so the threshold score is halfway between the mean score (roughly the 50th percentile) and the 75th percentile score.  To get a value-added figure for Tertiary A, one then subtracts this threshold secondary score – rather than the mean secondary score – from the mean Tertiary A score (there’s a small worked sketch of the calculation below, after Figure 2).  The results look like this:

Figure 2: Tertiary A Value-Added: Mean Tertiary A PIAAC Score (16-29 year-olds) Minus Threshold Upper Secondary PIAAC Score (16-19 year-olds)

This change in methodology only slightly changes the results: absolute scores are smaller, reflecting the fact that the baseline is now higher, but the rank order of countries is similar.  The US looks pretty good, indicating that its higher education system may compensate for a weak K-12 system; Australia’s result, somehow, is negative, meaning that average PIAAC scores for Tertiary A graduates are lower than the threshold score for secondary graduates heading to university.  Which is a bit mind-boggling, to be honest.
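
For anyone who wants to poke at the method, here is a minimal sketch of the threshold adjustment described above, with made-up numbers and assuming the score distribution is roughly linear between the 50th and 75th percentiles.

```python
# Threshold-adjusted value added, per the method sketched above.
# All scores are illustrative placeholders, not real PIAAC or Education at a Glance figures.

def threshold_secondary_score(secondary_mean, secondary_p75, tertiary_share):
    """Estimate the PIAAC score of the 'marginal' Tertiary A entrant.

    tertiary_share: proportion of upper-secondary graduates who complete Tertiary A.
    Assumes entrants come from the top of the secondary distribution, and interpolates
    linearly between the mean (treated as the 50th percentile) and the 75th percentile.
    """
    if not 0 < tertiary_share <= 0.5:
        raise ValueError("this simple interpolation only covers the 50th-75th percentile band")
    entry_percentile = 1.0 - tertiary_share        # e.g. a 37.5% share -> 62.5th percentile
    fraction = (entry_percentile - 0.50) / 0.25    # position between the 50th and 75th percentiles
    return secondary_mean + fraction * (secondary_p75 - secondary_mean)

# Example: 37.5% of upper-secondary graduates complete Tertiary A
secondary_mean, secondary_p75 = 272.0, 296.0       # placeholder secondary scores
tertiary_a_mean = 301.0                            # placeholder Tertiary A mean
threshold = threshold_secondary_score(secondary_mean, secondary_p75, 0.375)
print(f"threshold secondary score: {threshold:.1f}")                 # 284.0, halfway between 272 and 296
print(f"adjusted value added: {tertiary_a_mean - threshold:+.1f}")   # +17.0, versus a naive +29.0
```

If a country’s Tertiary A mean comes out below this threshold – as appears to happen for Australia – the adjusted value-added figure goes negative.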

I’m looking for feedback here, folks.  What do you think?  Does this work as a model for calculating tertiary value-added?  How could it be improved?  I’m all ears.

November 12

Jason Kenney, Liberal?

If you’re among the unhappy few in the habit of reading press releases from the Minister of Employment and Social Development Canada (ESDC, formerly HRSDC, formerly HRDC, etc., etc.), there’s one question that will almost certainly be on your mind these days: what exactly is Minister Jason Kenney up to?

After a period of quiet following the July re-shuffle, in which he obtained the post, Kenney seems to have settled into a pattern of giving speeches that harp on the following themes:

  • Skills shortages.  Ignoring his own department’s official projections on the subject, he has been going on the usual (incorrect) rant about a national crisis, the need for more skilled trades, etc., etc.
  • Apprenticeships.  If only we had more of them, the skills shortage problem wouldn’t be so bad.  As such, he feels justified in hectoring provinces and telling them to smarten up, and to be more like Germany (see yesterday’s One Thought for more on why that’s a waste of time).
  • Education – Labour market misalignment.  If only our high schools taught trades; if only we didn’t funnel people towards universities where the teaching isn’t job-related, etc., etc.

What’s puzzling about all of this isn’t so much that the diagnosis is wrong, but that it’s a wrongness that leads a long way into provincial jurisdiction.  Understandable if you’re a Liberal, perhaps.  But this is a Conservative government that came to office holding certain views about “watertight federalism” – a government led by a Prime Minister who has held only one first ministers’ meeting in seven years, and who seems to believe quite sincerely in letting the provinces do what they want in areas of provincial jurisdiction.

So what is Kenney playing at?

The cynical view is that he’s trying to avoid talking about the Canada Jobs Grant (CJG), the hot mess the government announced in the last budget without consultation, and which increasingly looks like it will die an ignominious death because the provinces won’t play ball.  If he can get a symbolic win or two on some vaguely related fronts (maybe getting the provinces to do something on apprenticeship training), then he can claim a win on skills shortages (the problem the CJG was ostensibly created to solve) before letting the CJG fade away.

But there’s another, altogether more interesting possibility, which is that the Tories are coming around to the historically Liberal position that advanced education has national economic implications too big for a responsible federal government to ignore.  On balance, that’s probably a good thing.  But Kenney will have to learn that, as far as education systems are concerned, the provinces have a combined 1,289 more years of experience than Ottawa does in running them.  A little humility in approaching the subject wouldn’t go amiss.

November 11

Kevin Lynch is Horribly Wrong

It’s disappointing that Kevin Lynch, former head of the public service in Ottawa, is the latest victim of that peculiarly Canadian disease, where one’s casual knowledge of the German apprenticeship system leads one to lose all critical faculties – as demonstrated in this awful article from the weekend Globe.

The article starts by noting that, “in proficiency in numeracy and literacy among 16-24 year-olds…, Canada is lagging the results for the Nordic countries, Australia and Germany”.  Wrong.  Well, at least partly wrong.  In literacy, the statement is true with respect to Australia, Finland, and Sweden, but differences between Denmark, Germany, Norway, and Canada are statistically insignificant.  And in numeracy, Australia and Norway are identical to Canada (pgs. 72-82 of the PIAAC report).  The article then goes on to note that, “In preparing young Canadians… experiential education appears to be quite valuable, especially for the skilled trades, and here there may be much to learn from others” – “others” apparently meaning Germany.

Leaving aside the issue that German PIAAC results aren’t really better than Canadian ones, it’s hard to understand why Lynch thinks that – even in theory – higher participation in the skilled trades would have strong positive effects on PIAAC scores.  Literacy and numeracy “skills” are quite different from “skilled” trades.

Lynch then sails into the usual puppy love about German vocationalism.  It’s “impressive”, according to him, that 50% of German high school students end up in vocational programs.  As if this was a choice.  As if streaming didn’t enter into it.  As if this streaming didn’t end up disproportionately steering poorer Germans and immigrants into vocational schools.  As if Germans themselves hadn’t noted how this dynamic contributes to Germany having among the most unequal literacy and numeracy outcomes in the OECD.

From there, it’s the usual conflation of apprenticeships with skilled trades, a peculiarly Canadian mistake.  If you look at the top ten occupations for apprenticeships in Germany, only three are in (what we’d call) the skilled trades: mechanic, mechanical engineer, and cook (the other seven – retail sales, office administration, business administration, medical administration, hairdressing, wholesale and export sales, and “sales” – would mostly be taught at colleges in Canada).  And then, to wrap up the article, is the specious argument that this vocational education system is the cause of Germany’s current low level of unemployment (seven years ago Germany had an unemployment rate of 12% – were apprenticeships the cause of that, too?).

Lynch’s argument, then, is: German youth have better PIAAC skills than Canadian youths (partly wrong), PIAAC skills are improved by skilled trades (huh?), German apprenticeships = skilled trades (wrong), and apprenticeships = lower unemployment (wrong).

Experiential education is, of course, a good thing.  But how about we discuss it without all this irrelevant nonsense about Germany?  It doesn’t improve the quality of our debate, at all.

November 08

Better Know a Higher Ed System: Qatar

Until about fifteen years ago, Qatar was a pretty typical Gulf country as far as higher education was concerned: its single state university, founded and staffed mostly by Egyptians, satisfied the needs of the small domestic population.  But then the country decided to get serious about higher education.

With help from the RAND corporation, the ruling al-Thani family’s Qatar Foundation established something called Education City, an absolutely unique experiment in cross-border education.  Lots of institutions have set up campuses in foreign countries (including in the Gulf), but Education City was an attempt to create a single super-university, one faculty at a time.  Need an Engineering school?  Bring in Texas A&M.  Med School?  Call Cornell.  And so on.

(College of the North Atlantic – Qatar’s operation is not part of Education City, but it was chosen on a similar basis.  I’m told that the Qataris wanted a Canadian partner for their college because of: a) our reputation for excellence in vocational/professional education; and, b) the fact that our accents are easy to understand.  And then, among the Canadian schools, the Newfs won the competition.  Go figure.)

Since the Emir pays for everything, he got to attach conditions rarely seen in cross-border education, such as insisting that everyone who taught at these schools also had tenure at their home institution, and that knowledge-transfer mechanisms were in place with the existing Qatar University, so that eventually Qatari academics might be able to do all this themselves.  Because, at the end of the day, Education City is an insurance policy against the day the LNG runs out, whereupon Qataris will have to live by their wits as a knowledge-based economy (KBE).

But what made the whole thing surreal was the amount of money the al-Thanis sunk into the venture.  “Are we a KBE yet?” the Prince would ask, after having spent hundreds of millions on the new campus.

“Well, no,” would come the reply.  “The Education City faculty are good, but KBEs need a high level of interaction between universities and tech-oriented businesses.”

“OK,” said the Sheikh.  “Let’s go get some of that.”  Boom: Cisco and Microsoft get a purpose-built $500M campus next to Education City, exempted from all rules about hiring Qatari employees.

Then: “So, do we have a KBE yet?”

“Well, no.  Turns out we can’t get good mid-career faculty to come here because there’s no research career for them.  We need granting councils”.

“How much do those cost?”

“Well, in advanced KBEs, governments spend about 1% of GDP on research”.

“Great, here’s the cheque, let me know when we have a KBE”.

For a whole bunch of reasons, this isn’t going to turn out as well as the Emir would like – but it’s still a fascinating approach to creating a higher education system from scratch.

November 07

International Alliances and Research Agreements

In business, companies strive to increase market share; in higher education, institutions compete for prestige.  This is why, despite whatever you’re told by people in universities, rankings are catnip to university administrations: by codifying prestige, they give institutions actual benchmarks against which they can measure themselves.

But prestige is actually much harder to amass than market share.  Markets can increase in size; prestige is a zero-sum affair (my prestige is directly related to your lack thereof).  And universities have fewer tools than businesses to extend their reach.  Mergers are not unheard of – indeed, the pressure of global rankings has been a factor behind a wave of institutional mergers in France, Russia, and Scandinavia – but these tend to be initiated by governments rather than institutions.  Hostile takeovers are even less common (though UBC’s acquisition of a campus in the Okanagan shows it’s not impossible).

So, what’s a university to do?  Increasingly, the answer seems to be: “make strategic alliances”.

These tend to come in two forms: multi-institutional alliances (like Universitas 21, the Coimbra Group, and the like), and bilateral institutional deals.  Occasionally, the latter exercise can go as far as an ambitious, near-institutional merger (see the Monash-Warwick alliance, for instance), but it usually consists of much simpler initiatives – MOUs between two institutions, designed to promote co-operation in fairly general terms.  There’s a whole industry around this now – both QS and Thomson Reuters offer services to help institutions identify the most promising research partners.  And signing these MOUs seems to take up an increasing amount of time, effort, and air miles among senior managers.

So it’s fair to ask: do these MOUs make any difference at all to research output?  I have no hard evidence on this, but I suspect the returns are actually pretty meagre.  While inter-institutional co-operation is increasing all the time, for the most part these links are organic; that is, they arise spontaneously from individual researchers coming up with cool ideas for collaboration, rather than from top-down direction.  While there’s a lot that governments and institutions can do to promote inter-institutional linkages in general, there’s very little that central administrations can do to promote specific linkages that doesn’t quickly become counterproductive.

Having significant international research links is indeed the sign of a good university – the problem is that for managers under pressure to demonstrate results, organic growth isn’t fast enough.  The appeal of all these MOUs is that they give the appearance of rapid progress on internationalization.  But given the time and money expended on these things, some rigour is called for. This is an area where Board members can, and should, hold their administrations to account, and ask for some reasonable cost-benefit analysis.

November 06

Teach for Canada: Attack of the Kielburger Colonialists

I see the Globe has given some laudatory coverage to something called “Teach for Canada”.  The brainchild of a couple of Bay Street types (who have never themselves taught a class), the idea here is to shamelessly rip off Teach for America (TFA) and apply its methods to the problem of low achievement among the country’s Aboriginal youth.

This is a terrible idea.  And here’s why:

TFA recruits top university graduates right out of their undergraduate program, to do two years of teaching in some of the country’s poorest communities.  The idea is that bright, energetic, idealistic grads can succeed in teaching underprivileged youth, where regular, salaried teachers cannot.  And indeed, there’s some significant evidence that the program does work in terms of raising Math scores – such as this new study from the US Department of Education.

There is, however, no reason to think that this approach would have a similar effect if deployed in Canada among Aboriginal youth.

The reason TFA delivers some modest results is not that its brief training stint and alternative certification are as effective as teachers’ college; rather, it’s that the quality of teachers in US public schools is so patchy.  Teaching isn’t a valued profession in the US, and doesn’t attract top students; the teacher training itself is pretty weak by international standards (see Amanda Ripley’s The Smartest Kids in the World for a decent summary).  Also, schools serving the poorest students tend to get weaker teachers, because funding is local and their tax base can’t support high teacher pay – a problem Canada doesn’t really have to deal with.  Of course, Canada isn’t completely free from these problems, but they’re nowhere near as severe here as they are in the US.

Ah, you say, but what about on reserves?  Doesn’t the argument hold there?

Well, the pay argument certainly does.  But let’s be clear: TFA was designed for urban environments.  TFA staff get ongoing training and mentorship.  TFA staff, for the most part, still get to live in (or close to) hip urban areas.  TFA does not go to reserves in fly-in communities, in part because the number of volunteers would be pretty low, but also because the model itself simply wouldn’t work.

More importantly, perhaps: the idea that what First Nations need are a lot of well-meaning but inexperienced white kids showing up in their communities saying, “we’re here to help!” is plain ludicrous.  There’s no doubt that education for First Nations, particularly those from more remote communities, is in a desperate state, and deserving of vastly more money and policy attention than it currently receives.  But youthful enthusiasm just isn’t a substitute for money and teaching experience.

Teach for Canada is pure do-gooding Kielburger-style colonialism.  It’s an idea that deserves a quick death.

November 05

Owning the Podium

I’m sure many of you saw Western President Amit Chakma’s op-ed in the National Post last week, suggesting that Canadian universities need more government assistance to reach new heights of excellence and “own the podium” in global academia.  I’ve been told that Chakma’s op-ed presages a new push by the U-15 for a dedicated set of “excellence funds” which, presumably, would end up mostly in the U-15’s own hands (for what is excellence if not research done by the U-15?).  All I can say is that the argument needs some work.

The piece starts out with scare metrics to show that Canada is “falling behind”.  Australia has just two-thirds of our population, yet has seven institutions in the QS top 100, compared to Canada’s five!  Why anyone should care about this specific cut-off (use the top 200 in the QS rankings and Canada beats Australia 9 to 8), or this specific ranking (in the THE rankings, Canada and Australia each have four spots), Chakma never makes clear.

The piece then moves on to make the case that, “other countries such as Germany, Israel, China and India are upping their game” in public funding of research (no mention of the fact that Canada spends more public dollars on higher education and research than any of these countries), which leads us to the astonishing non-sequitur that, “if universities in other jurisdictions are beating us on key academic and research measures, it’s not surprising that Canada is also being out-performed on key economic measures”.

This proposition – that public funding of education is a leading indicator of economic performance – is demonstrably false.  Germany has just about the weakest higher education spending in the OECD, and it’s doing just fine, economically.  The US has about the highest, and it’s still in its worst economic slowdown in over seventy-five years.  Claiming that there is some kind of demonstrable short-term link is the kind of thing that will get universities into trouble.  I mean, someone might just say, “well, Canada has the 4th-highest level of public funding of higher education as a percentage of GDP in the OECD – doesn’t that mean we should be doing better?  And if that’s indeed true, and our economy is so mediocre, doesn’t that give us reason to suspect that maybe our universities aren’t delivering the goods?”

According to Chakma, Canada has arrived at its allegedly-wretched state by virtue of having a funding formula which prioritizes bums-in-seats instead of excellence.  But that’s a tough sell.  Most countries (including oh-so-great Australia) have funding formulae at least as demand-oriented as our own – and most are working with considerably fewer dollars per student as well.  If Australia is in fact “beating” us (a debatable proposition), one might reasonably suspect that it has at least as much to do with management as it does money.

Presumably, though, that’s not a hypothesis the U-15 wants to test.

November 04

Concentration vs. Distribution

I’m spending part of this week in Shanghai at the biennial World-Class Universities conference, which is put on by the good folks who run the Shanghai Jiao Tong rankings.  I’ll be telling you more about this conference later, but today I wanted to pick up on a story from the last set of Shanghai rankings, released in August.  You’d be forgiven for missing it – Shanghai doesn’t make the news the way the Times Higher Education rankings do, because its methodology doesn’t allow for much change at the top.

The story had to do with Saudi Arabia.  As recently as 2008, it had no universities in the top 500; now it has four, largely because its institutions have been strategically hiring highly-cited scientists (on a part-time basis, one assumes, but I don’t know that for sure).  King Saud University, which only entered the rankings in 2009, has now cracked the top 200 – by far the fastest rise of any institution in the history of any set of rankings.  But since this doesn’t fit the “East Asian tigers overtaking Europe/America” line that everyone seems eager to hear, nobody reported it.

You see, we’re addicted to this idea that if you have great universities, then great economic development will follow.  There were some surprised comments on Twitter about the lack of a German presence in the rankings.  But why the surprise?  Whoever said that having a few strong top universities is the key to success?

Strong universities benefit their local economies – that’s been clear for decades.  And if you tilt the playing field more towards those institutions then – as David Naylor argued in a very good talk last spring – there’s no question that it will pay some returns in terms of discovery and innovation.  But the issue is one of opportunity costs: would such a concentration of resources create more innovation and spill-over benefits than other possible distributions of funds?  Those who make the argument for concentration (see, for instance, HEQCO’s recent paper on differentiation) seem to take this as given, but I’m not convinced their case is right.

Put it this way: if some government had a spare billion lying around, and the politics of regional envy weren’t an issue, and it wanted to spend the money on higher education, which investment would have the bigger impact: putting it all into a single, “world-class” university?  Spreading it across maybe a half-dozen “good” universities?  Or spreading it across all institutions?  Concentrating the money might do a lot of good for the country (not to mention for the institution at which it was concentrated) – but maybe dispersing it would do more.  As convincing as Naylor’s speech was, this issue of opportunity costs wasn’t addressed.

Or, to go back to Shanghai terminology: if it were up to you to choose, do you think Canada would be better served with one institution in the top ten worldwide (currently none), seven in the top 100 (currently four), or thirty-five in the top 500 (currently twenty-three)?  And what arguments would you make to back up your decision?  I’m curious to hear your views.
