Higher Education Strategy Associates

June 11

Tremors in China

I wanted to draw everyone’s attention to a small article in the Chinese People’s Daily last Wednesday, which is potentially of enormous significance.

Apparently, of the country’s 31 Provinces, Municipalities, and Autonomous Regions, only seven have disclosed their figures with respect to higher education recruitment.  Every single one of them missed its target, some by over 10%.  And these seven provinces represent a mix of economic backgrounds: Anhui and Qinghai are relatively poor interior provinces; Shandong and Fujian are richer coastal ones; and the balance are somewhere in between.  It’s a broad, broad swathe of the country – which makes it unlikely that this is a one-off fluke, and equally unlikely that the trends are much different in the non-reporting provinces.

Some are suggesting this is a demographic thing – but that is frankly nonsense.  Youth cohorts have been shrinking for several years now, and that hasn’t stopped the flood of students heading to higher education.  This is different.  This is a change in the participation rate.  It’s a change in the proportion of people who want to go to higher education.  It’s families finally starting to react to the high level of graduate under-employment.

This was the kind of thing the Chinese government was trying to forestall when it announced plans to convert 600 universities (out of 2,400 in total) into polytechnics.  Indeed, given that the data was for 2013, it might actually have been the cause of the Party’s decision to transform these institutions.  But there’s no guarantee that students want that kind of education either; as I explained back here, a major demand-driver for education in Confucian societies is the perception of moral goodness attached to higher studies, which may not be present in more technologically-oriented programs.  The Party’s assumption that families skeptical about university education will head to polytechnics instead is unproven: it may be university or nothing.

What are the knock-on effects of this?  Remember that Chinese public universities took on $41 billion in debt to expand.  If they don’t have fee-paying students filling those seats, the chances of some universities defaulting are going to rise.  Ultimately, none are likely to fail – the prestige hit on local governments would be too big – but you can see it leading to a general reining-in of university finance.

And the effect on Chinese students heading abroad?  Well, the era of scarcity in Chinese universities is already well and truly over – even before this drop, over 76% of gaokao-takers were getting a place in university.  Foreign universities don’t fulfill a demand-absorption function anymore; they are now very clearly competing on quality with domestic institutions.  So far, there is no indication that this demand is slackening – which suggests a great hunger in China for quality education that not all local universities can yet provide.

But take it as a warning.  Youth numbers are declining.  Demand for university education even within the youth cohort is declining.  Eventually, this may translate into lower demand for foreign education as well.  Institutions that depend too heavily on this market may get burned.

June 10

Crazy Managerial Imperatives Around International Students

One of the weirdest – and I mean totally bat-guano-crazy – things in Canadian higher education is the way recruitment of international students is managed.  Although international student recruitment is often seen simply as a money-spinner for institutions, the fact of the matter is that most institutions aren’t coming close to maximizing revenue from this source.  And that’s not because of any high-minded motive of turning away students they don’t think are suited to their university experience, either.  It’s simply because of the way institutions’ internal budget controls work.

In a regular business, sales forces get the budget they need to hit revenue targets.  And if sales are going well, and management thinks it can generate more revenue by investing more in sales, then the sales force gets more money.  Simple as that.

Compare this to what is happening at many institutions in Canada around international recruitment budgets, where the discussion is more along these lines:

Senior Admin (to international office):  Can you get us more international students?  We could really use the cash.  Budget cuts, you know.

International Office: Um, sure.  But can I have some extra money for that?  Recruitment actually costs money.  Not to mention support once we get them here.

Senior Admin: What?  More money?  Didn’t I just tell you we don’t have any money?

International Office: But… we’ll get our money back and more (NB. There are circumstances where this isn’t true, as described back here, but for the moment let’s assume it is).  You need to spend money to make money.

Senior Admin: Look, there’s a budget freeze on.  If I give non-academic units more money, there’ll be hell to pay.  You know how envious everyone already is that you guys get to fly all over the place?

International Office: (Desperately suppressing the urge to start listing decanal and vice-presidential visits abroad) But you’re not “giving us money”. We generate revenue!

Senior Admin: Yes, but there’s nothing in our budget process that allows us to reflect that.

International Office: (repeatedly bangs head against wall.)

Seriously, this happens.  The idea of investing money to make money later on isn’t entirely foreign (sorry) to universities – but doing so via the international office often isn’t possible.  To a large extent, that’s because of these offices’ historical roots.  Most of them weren’t set up as revenue-generating units – until a decade ago, they were mostly busy doing things like checking students’ health insurance, and helping profs on sabbatical deal with accommodation and paperwork.  As a result, they tend to get lumped in with other administrative units like Student Services or Human Resources (which tend to take a hit when times are bad), rather than with revenue units like Advancement (which usually doesn’t).

(As an aside, I’m pretty sure this is one of the reasons international offices turn to agents, rather than building up their own networks abroad; the latter requires upfront investment, while the former just requires paying for students once they arrive – which, as you can imagine, is a lot easier to sell within the bureaucracy.)

If institutions are serious about playing the international game, they need to get serious about how they fund and manage it.  Too many haven’t bothered to do that.

June 09

Teaching Load Versus Workload

I often get into discussions that go like this:

Me: Over time, the number of classes each professor teaches has gone down.  Places where people used to teach 3/2 (three classes one term, two the other) now teach 2/1.  Places where 4/3 or even 4/4 were common are now 3/2.   This has been one of the main things making higher education more expensive in Canada.

Someone else (usually a prof): Yeah, but classes are so much larger now than they used to be.

Me: Do you not think that teaching fewer classes may be the cause of higher average class sizes?  Do you think that if everyone taught more classes, average class size would fall?

(nota bene: This isn’t the whole story, obviously.  Student-staff ratios have gone up to such a degree that even if profs were teaching the same number of courses, class sizes would still be up a bit.  Though how much is hard to say, because of the changing use of sessional lecturers.)

Someone else: Does it matter?  Same number of students, same amount of work.

Me: Is it?  Is teaching three classes of fifty students actually the same amount of work as five classes of thirty?  Doesn’t the reduction in class-prep time more than make up for any increase in marking?

Someone else: Um, well, yeah.  Probably.  But we’re still doing lots of committee work!  And tenure requirements have become much more punishing than they used to be!  And those teaching loads don’t count graduate student supervisions.

Me: No doubt, committee work can take up a lot of time – though much of it exists simply to make the university less effective.  But the research side isn’t distributed equally across the university, is it?  I mean, we know that the pace of publication falls pretty quickly after tenure is granted (see Figure 3 of this PPP article by Herb Emery).  And not all university research is of the same quality: well over 10% of all Canadian faculty (24% in the humanities) have never had a publication cited by anyone else (HESA research, which we demonstrated back here).

Someone else:  And graduate supervision?

Me: Fair point.  But graduate supervision is all over the place.  Supervising a PhD in Science tends to be more intensive than in Arts.  And course-based Master’s students are increasingly more like undergraduates than doctoral students in the loads they bring.  Hard to measure.

Someone else: But shouldn’t all this be measured?

Me: Of course.  But notice how Canadian university Collective Bargaining Agreements avoid the question of overall workload, even though they often get really specific about teaching loads.  Universities don’t want to measure this stuff because it would expose how many profs are working way too hard, and unions don’t want to measure this stuff because it would expose how many profs aren’t.    Look how hard both sides worked to discredit the HEQCO paper on professorial productivity, which posed exactly that question.

Someone else: Is this ever going to change?

Me: Governments could put pressure on institutions to actually enforce the bits of the CBAs that require faculty to actually do the hard-to-measure stuff (committee work, research).  Junior staff could make more of a fuss within the unions to start ensuring equal treatment of workloads within the bargaining unit.  Short of that, no.

Someone else: Aren’t you a bit cynical?

Me: Around here, hard not to be.

June 06

Governance, Stress-Tests, and Preparing for the Worst

It’s the little things that worry me.  The slowdown in China.  The continuing failure of the Euro-zone to grow.  The fact that the ratio of the US Stock Market Cap to GDP is approaching the levels seen right before the crashes of 2001 and 2008.  Our economy might muddle through, or it might not.

Now add to that economic uncertainty the clear evidence that governments are showing decreasing enthusiasm for supporting higher education – nationally, there’s been a real decline in provincial higher-ed funding over the past four years, to the tune of about three percent.  Better than some sectors, certainly, but still very problematic, given that our universities essentially seize up if their budgets don’t grow by at least 3.5% per year.  Oh, and throw in the clear reluctance of most governments to let tuition rise to compensate for any funding cuts.

Given all this, I’d say there’s a reasonable chance that universities in more than one province are heading for budget cuts on the order of 10% or so.  It’s likeliest in Ontario, but it could happen pretty much anywhere.

Is anyone ready for that?  Does anyone have a plan in their back pocket that would help them get through that kind of restructuring?

I can hear all of you rolling your eyes.  Of course not – who does that?

Well, almost everyone, really.  Any business worth its salt has some pretty clear contingency plans if revenue drops.  Colleges don’t have exact contingency plans per se, but they pretty much all measure break-even points on a per-program basis; if required to cut, they would be able to produce plans very quickly.

But universities?  It is to laugh.  They’ll plan for growth until the cows come home.  But plans to shrink?  Never.

Yet, it’s not as though they can claim blindness to the danger.  It’s not as though universities don’t remember the 1990s, when double-digit cuts occurred.  It’s not as though cuts on a limited scale aren’t already happening.  Despite the dangers, universities continue to merrily sign agreements with faculty that commit them to large expenditure increases in the future (Hey!  U of Ottawa!  Yeah, I’m looking at you!) instead of focussing on contingency plans.

I can sort of understand the reluctance of administrators to take this step, given the predictable faculty backlash.  What’s more puzzling is the absence of any pressure on institutions from their Boards of Governors on this score.  Our whole system of university governance is based on spheres of competence: academics run academic affairs through Senate, while Boards – supposedly filled with men and women with a modicum of business nous – are supposed to take care of the money.  And yet, more often than not, “taking care of the money” means doing fundraising or small-ball stuff like advising on endowment strategies.  It doesn’t seem to involve asking hard questions about medium-to-long-term solvency, or stress-testing institutions to see how they’d fare if things go south.

Yet it should.  The risks institutions face are getting bigger each year.  A crash may not happen; but if it does, we’d all be better off if our responses were based on thoughtful long-term plans rather than the usual beheaded chicken routine that universities seem to prefer.  Boards of Governors are the ones best-placed to make it happen.  They need to step up and do so.

June 05

Articles of Faith

Further to Tuesday’s blog about STEM panics, I note a new report from Canada 2020 – a young-ish organization with pretensions to being the Liberals’ pet think tank – called Skills and Higher Education in Canada: Towards Excellence and Equity.  Authored by the Conference Board’s Daniel Munro, it covers most of the ground you’d expect in a “touch-all-the-bases” report.  And while the section on equity is pretty good, when it comes to “excellence” this paper – like many before it – draws conclusions based more on faith than facts.

Take for example, this passage:

Differences in average literacy skills explain 55 per cent of the variation in economic growth among OECD countries since 1960. With very high skills and higher education attainment rates, it is not surprising to find Canada among the most developed and prosperous countries in the world. But with fewer advanced degree-holders (e.g., Masters and PhDs), and weak performance on workplace education and training, it is also not surprising to find that Canada has been lagging key international peers in innovation and productivity growth for many years.

The first sentence is empirically correct, but things head south rapidly from there.  High average literacy does not imply, as the second sentence suggests, that higher education attainment rates are a cause of prosperity; Germany and Switzerland do OK with low attainment rates, and Korea’s massive higher education expansion was a consequence, rather than a cause, of economic growth.  The final sentence goes even further, implying specifically that the percentage of the population with advanced degrees is a determinant of productivity growth.  This is flat-out untrue, as the figure below shows.

Figure 1: Productivity vs. PhDs per 100K of population, select OECD countries

Countries in this graph: US, UK, NL, NO, BE, CH, D, SE, FI, O, DK, UK, US, IE, FR, CA, JP

The pattern is one you see in a lot of reports: find a stat linking growth to one particular educational attainment metric, then infer from this that any increase on any educational metric must produce growth.  It sounds convincing, but it usually isn’t true.

It’s the same with STEM.  Munro tells us Canada has a higher share of STEM graduates among its university graduates than the OECD average (true – and something we rarely hear), but also intones gravely that Canada lags “key international competitors” like Finland and Germany – which is simply nonsensical.  Our competitive economic position is in absolutely no way affected by the proportion of STEM grads in Finland (it’s Finland, for God’s sake.  Who cares?).  As for Germany, it has a substantially lower overall university attainment rate than Canada, so our number of STEM grads per capita is still higher than theirs (which is surely a more plausible metric as far as the economy is concerned).

I don’t want to come across here as picking on Munro, because he’s hardly the only person who makes these kinds of arguments; such dubious assumptions underpin a lot of Canadian reports on education.  We attempt – for the most part admirably – to benchmark our performance internationally, but then use the results to gee up politicians for action by drawing largely unwarranted conclusions about threats to our competitive position if we aren’t in the top few spots of any given metric.   I don’t doubt these tactics are well-meant (if occasionally a bit cynical), but that doesn’t make them accurate.

The fact is, there are very few proven correlations between attainment metrics in education and economic performance, and even fewer where the arrow of causality runs from education to growth (rather than vice-versa).  If we have productivity problems, are they really related to STEM?  If they are related to STEM, which matters more – increasing STEM PhDs, or improving STEM comprehension among secondary school students?

We have literally no idea.  We have faith that more is better, but little evidence.  And we should be able to do better than that.

June 04

Institutional Strategies: Simulacra or Reinvention?

I recently had the chance to read a re-issue of Simon Marginson and Mark Considine’s The Enterprise University: Power, Governance and Reinvention in Australia.  It’s a heck of a good read; among those currently writing about higher education, Marginson’s probably got the best turn of phrase around.  Some of it – around managerialism, and the role of research expenditure in cementing it – seems a bit dated now, in the sense that no one would any longer find it surprising.  And the section on governing boards is a bit Australia-centric.  But the chapter on institutional diversity is so brilliant that everyone in higher education should read it.  (Ministers and deputy ministers should read it four or five times.)  It’s that good.

In this chapter, Marginson & Considine consider how diversity works in practice.  In Australia – much like in Canada – there is a hierarchy of institutional prestige, based mainly on the order in which they were created within their state/province.  We call it the G-5, they call it the “Sandstones”, but it’s basically the same thing.  Then there are the “Redbricks” (roughly, the rest of the U-15, plus maybe York, Simon Fraser, and Guelph), “Gumtrees” (no real direct equivalent as a class, but think Brock/Laurier), “Unitechs” (Ryerson, maybe) and “New Universities” (the new-ish universities in BC and Alberta).

When the authors examined these institutions, they found that, despite “vertical” diversification, there was no attempt to diversify horizontally – in fact, quite the opposite.  As soon as Australian universities gained the freedom to control their own program mix, they all moved swiftly to offer a broadly similar, comprehensive mix of programming.  At the same time, all institutions were trying to become more research-intensive, with varying degrees of success.

(Is this sounding familiar yet?  Good.)

Marginson & Considine note the forces of isomorphism at work: some come from the fading power of the disciplines (considerably more advanced in Australia than in North America), some from the incentives provided by funding formulae, and some from the model of prestige that serves to reinforce the existing power of the Sandstones.  Basically, there’s no percentage in trying to be anything other than a mini-Sandstone – which is why most institutional strategy in Australia is just a “simulacrum” of strategy.  It isn’t real; it’s just a hollow, low-risk attempt to copy what the big schools do (for an example of this in Canada, see Western).

That said – and this is the bit I found fascinating – Marginson & Considine still found three ways in which Australian institutions managed to diversify and re-invent themselves, if only a little bit.  One was by being the “entrepreneurial university” and engaging in private fund-raising activities (e.g., commercial consultancies, bespoke training for companies, etc.); the second was by going big on globalization and international students (by which they mean not just attracting students from abroad, but also providing training or setting up campuses overseas); and the third was by specializing in distance education.

What’s fascinating about that?  The fact that nearly all Canadian institutions have piled onto the second one, leaving the first and third essentially untouched (yes, we have a distance-ed specialist in Athabasca, but it was set up as a specialist school – no one has tried to reposition themselves through greater distance-ed efforts).  The fact of the matter is that no school has ever felt threatened enough to do anything other than copy the big boys – to offer anything other than a simulacrum of strategy.

I wonder if that will change anytime soon?

June 03

STEM, Shortages, and the Truth About Doctoral Education

Harvard’s Michael S. Teitelbaum came out with an interesting new book last month called Falling Behind? Boom, Bust and the Global Race for Scientific Talent.  Though it’s a very US-focused book, it’s worth a read as a corrective to the occasional hysterics that people have in Canada about our alleged STEM crisis.

The book starts with a wonderful chapter called “No Shortage of Shortages”, which suggests that the current STEM-shortage panic is the sixth in the US since Sputnik.  He also eviscerates the various employer- and research university-led reports that precipitated the most recent crisis talk (Innovate America, Tapping America’s Potential, and Rising Above the Gathering Storm), and shows that the evidence backing up their claims of crisis simply doesn’t hold up.  What does hold up are the structural incentives that various groups have to claim there is a crisis when there is none: universities get more money, professors get more grad students, and employers get more PhDs, or more H-1B visas to enable the hiring of foreigners.

An interesting question Teitelbaum raises is whether it might be possible to create a board or agency with the responsibility of declaring when certain occupations are indeed in shortage.  He correctly lists a whole bunch of structural reasons why it might be difficult to find a respected neutral body that interest groups wouldn’t immediately try to undermine, but he does raise the interesting example of the UK’s Migration Advisory Committee, which has the responsibility of advising government on when shortages in specific skilled professions have become sufficiently acute to merit changes in immigration law.  Certainly something to think about with respect to our own Temporary Foreign Worker Program.

But to my mind the most important chapter – one everyone in higher education should read – is the chapter on the U.S. Academic Production Process.  He makes the point that the production of doctoral students is a function of research grant availability, not of demand for services of doctorally-educated graduates (and certainly not of the needs of academic institutions for new faculty).  Universities want doctoral students (and increasingly, postdocs) because over time, they have become the go-to form of scholarly labour that university research labs require in order to work.  If they have more money – say, if the US government increases the NIH budget by 100% over five years – there will be a huge explosion in the demand for doctoral students, which is entirely unconnected to the labour market demand for doctoral graduates.

This is a simple and unarguable point, but it is rarely stated quite so bluntly.  Eventually, domestic students figure this out, and fewer go into doctoral studies.  But that doesn’t decrease the demand for this kind of labour – so institutions start reaching out more and more for foreign students, particularly from Asia.  For these students, grad student conditions (and those that come afterwards, even in a depressed labour market) still look pretty good compared to what they can get back home.  To his credit, Teitelbaum doesn’t pretend there are any easy answers to this one and, in the end, simply falls back on the idea of requiring institutions to do a better job informing prospective graduate students about the realities of the academic job market – in terms very similar to the ones I proposed back here.

Anyways – pick up Teitelbaum if you get a chance.  It’s a rewarding read.

June 02

Bad Memory

Some really sobering stuff in a paper I just got from Statscan called “Job Market Realities for Post-Secondary Graduates”.  Listen to this:

  • “Graduates of a field with low unemployment and little underemployment were also likely to earn high salaries and be content with their jobs.  They were usually graduates of job-oriented fields such as engineering, teacher training, most health disciplines, business, computer science and some technologies.”
  • “A more general education in subjects with little practical application often (leads) to a lower-paid job which made little use of knowledge and skills acquired during the years of study… those who fared worst held degrees/diplomas in fine and applied arts, humanities and social sciences and some of the sciences.”
  • “Many trades, which do not require post-secondary education, pay better than some occupations held by postsecondary graduates, especially office work.”
  • “Newfoundland offered the highest starting salaries, with Saskatchewan a close second.”
  • “Half of all bachelor’s graduates planned to go back to school within two years.”

Enraging what decades of neo-liberalism has done to our education system, huh?  Why can’t we go back to the 70s, when everyone had a job and humanities and social science graduates weren’t undervalued?

Oh, wait, hang on.  This survey is from the 1970s!  In fact, all of this is from the two-year follow-up of the university and college graduating classes of 1976 (the forerunner of the current National Graduates Survey).

Turns out the 1970s weren’t quite the bonanza that some folks, like Generation Squeeze, make them out to be.  For the class of 1976 in Ontario, the unemployment rate two years out for university graduates was 8%.  For the class of 2010, it was 5%.  Average salary two years out for the class of 1976 was $14,600, or $47,300 in 2012 dollars.  For the class of 2012, the equivalent figure was $49,277.  In 1978, 80% of recent grads said their job had a relationship with their field of study.  In 2012?  82%.
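For anyone who wants to sanity-check that inflation adjustment, here’s a quick sketch.  Note the CPI index values below are approximate assumptions on my part (the class of 1976 was surveyed two years out, so the salary is in roughly 1978 dollars), not figures from the survey itself:

```python
# Rough check of the 1978 -> 2012 salary conversion quoted above.
# CPI values are approximate (2002 = 100 basis) and are assumptions,
# not figures taken from the Statscan paper.
CPI_1978 = 37.5   # approx. Canadian all-items CPI, 1978
CPI_2012 = 121.7  # approx. Canadian all-items CPI, 2012

salary_1978 = 14_600
salary_in_2012_dollars = salary_1978 * CPI_2012 / CPI_1978

# Lands in the neighbourhood of the $47,300 figure quoted in the post
print(round(salary_in_2012_dollars))
```

With those assumed index values, the multiplier works out to about 3.2, which is consistent with the numbers in the post.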

I could go on here, but you get the picture.  There’s very little going on for graduates in the labour market that wasn’t going on forty years ago.  Back then, we were also in a resource boom, and trades looked good compared to Arts and Science.  Jobs in Newfoundland and Saskatchewan looked pretty good compared to jobs in Ontario (we don’t know about Quebec because back during the first PQ government, Quebec institutions weren’t allowed to participate in national studies like this).  This is just a phase we go through.

Of course, some people may look at this result and see stagnation.  Graduates only getting a $2,000 raise in 40 years!   But this is to miss the point almost entirely.  In 1978, the participation rate at Canadian universities was around 10%; we’re now just over 30%.  That is to say, even accounting for population growth, there are three times as many young people getting salaries which, on average, are the same as the coddled, easy-going graduates of the mid-70s.

Nostalgia makes us look at the past with rose-coloured glasses.  But it’s no basis for making policy.  Look past the soft-focus gauze and the rantings of the hell-in-a-handbasket crowd: the fact is, our grads are doing as well as they have at any time in the last 40 years.  We should celebrate that.

May 30

Valuing Foreign Degrees

There was an interesting Statscan paper out yesterday that made some fascinating observations about education, immigration, and human capital.  With the totally hip title The Human Capital Model of Selection and the Economic Outcomes of Immigrants (authors: Picot, Hou, and Qiu), it’s a good example both of what Statscan-type analyses do well and of what they do poorly.

At one level, it’s a very good study.  It uses the Longitudinal Administrative Databank (Statscan’s coolest database – a longitudinal 20% sample of all of the country’s taxfilers) to follow the fates of newcomers to Canada in terms of earnings.  What the authors find is that the very large wage premium that “economic class” immigrants (as opposed to “family class”) with degrees used to enjoy over immigrants without degrees has shrunk substantially in the first few years after entry.  However, over the longer term, the study also finds that educated immigrants have a much steeper earnings slope than those with less education – which is to say that if you shift the lens from “what are immigrants’ labour market experiences in their first three years in Canada” to “what are immigrants’ labour market experiences in their first ten-to-fifteen years in Canada”, you get a much different, and more positive, story.

Now, a lot of people want to know why immigrants with degrees aren’t doing as well in the short term, even if the decline in long-term fortunes isn’t as severe as first thought.  The authors don’t answer this question, but many others have come up with hypotheses.  When you hear stories about immigrants doing worse than they used to in the labour market, even holding education constant, it’s easy to jump to conclusions.  Canadian immigration since the 1980s has increasingly been from Asian countries, so it’s easy enough to conjure up some racism-related theories about the decline.  But I want to point something else out.  Below, I reproduce a table from a recent UNESCO report on higher education systems in Asia.  It shows the distribution of university instructors by level of qualification.

Table 1: Highest Level of Higher Education Instructors’ Academic Attainment, Selected Asian Countries
Here’s the problem: should we really assume that a Bachelor’s degree from Indonesia confers the same skills as one from the US or Europe?  Probably not.  And yet every single Statscan study that looks at education, immigration, and earnings assumes that a degree is a degree, no matter where it’s earned.  I understand why they do that – how else would one judge equivalencies? – but choosing to ignore the differences doesn’t help either.  The reason today’s university-educated immigrants are doing worse than those of 30 years ago may simply be that they have lower average skill levels because of where they went to school.

None of this is to suggest racism isn’t a factor in deteriorating incomes for new immigrants, or that Canadian employers aren’t ridiculous and discriminatory in their demands that new hires have “Canadian experience”.  It’s simply to say that degrees aren’t all made the same, and it would be nice if some of our research on the subject acknowledged this.

May 29

May ’14 Rankings Round-Up

I’ve been remiss the last month or so in not keeping you up to date with some of the big international rankings releases, namely the Leiden Ranking, the Times Higher Top 100 Under 50, and the U21 Ranking of National Higher Education Systems.

Let’s start with Leiden (previous articles on Leiden can be found here, and here), a multidimensional bibliometric ranking that looks at various types of publication and impact metrics.  Because of the nature of the data it uses, and the way it displays results, the rankings are both stable and hard to summarize.  I encourage everyone interested in bibliometrics to take a look and play around with the data themselves to see how the rankings work.  In terms of Canadian institutions, our Big Three (Toronto, UBC, McGill) do reasonably well, as usual (though the sheer volume of publications from Toronto is a bit of a stunner); perhaps more surprising is how Victoria outperforms most of the U-15 on some of these measures.

Next, there’s the U21 National Systems Ranking (which, again, I have previously profiled, back here and here).  This is an attempt to rank not individual institutions, but rather whole national higher education systems, based on Resources, Environment, Connectivity, and Outputs.  The US comes tops, Sweden 2nd, and Canada 3rd overall – we climb a place from last year.  We do this mostly on the basis of being second in the world in terms of resources (that’s right, folks: complain as we all do about funding, and about governments here being nasty enough to merely maintain budgets in real dollars, only Denmark has a better-resourced system than our own), and third in terms of “outputs” (mostly research-based).

We do less well, though, in other areas, notably “Environment”, where we come 33rd (behind Bulgaria, Thailand, and Serbia, among others).  That’s mostly because the ranking effectively penalizes us for: a) being a federation without certain types of top-level national organizations (Germany suffers on this score as well); b) having a system that is too public (yes, really); and c) Statscan data on higher education being either unavailable or totally impenetrable to outsiders.  If you were to ignore some of this weirder stuff, we’d have ranked second.

The innovation in this year’s U21 rankings is the normalization of national scores by per capita GDP.  Canada falls to seventh on this measure (though the Americans fall further, from first to fifteenth).  The Scandinavians end up looking even better than they usually do, but so – interestingly enough – does Serbia, which ranks fourth overall in this version of the ranking.

Finally, there’s the Times Higher Top 100 Under 50, a fun ranking despite some obvious methodological limitations, which I pointed out back here and won’t rehash again.  This ranking changes significantly each year, because the institutions at the top tend to be close to the 50-year cutoff, and so get rotated out as new ones take their place.  Asian universities took four of the top five spots globally (Postech and KAIST in Korea, HKUST in Hong Kong, and Nanyang in Singapore).  Calgary, in 19th place, was the best Canadian performer, but Simon Fraser made 24th, and three other Canadian universities made the list for the first time: Guelph (73rd), UQAM (84th), and Concordia (96th).

Even if you don’t take rankings overly seriously, all three rankings provide ample amounts of thought-provoking data.  Poke around and you’re sure to find at least a few surprises.
