HESA

Higher Education Strategy Associates

Category Archives: Media

June 09

Why we should – and shouldn’t – pay attention to World Rankings

The father of modern university rankings is James McKeen Cattell, a well-known early 20th-century psychologist, scientific editor (he ran the journals Science and Psychological Review) and eugenicist.  In 1903, he began publishing American Men of Science, a semi-regular rating of the country’s top scientists, as judged by university department chairs.  He then hit on the idea of counting how many of these scientists were graduates of the nation’s various universities.  Being a baseball enthusiast, he found it completely natural to arrange these results top to bottom, as in a league table.  Rankings have never looked back.

Because of the league table format, reporting on rankings tends to mirror what we see in sports.  Who’s up?  Who’s down?  Can we diagnose the problem from the statistics?  Is it a problem attracting international faculty?  Lower citation rates?  A lack of depth in left-handed relief pitching?  And so on.

The 2018 QS World University Rankings, released last night, are another occasion for this kind of analysis.  The master narrative for Canada – if you want to call it that – is that “Canada is slipping”.  The evidence for this is that the University of British Columbia fell out of the top 50 institutions in the world (down six places to 51st), and that we now have two fewer institutions in the top 200 (Calgary fell from 196th to 217th, and Western from 198th to 210th) than we used to.

People pushing various agendas will find solace in this.  At UBC, blame will no doubt be placed on the institution’s omnishambular year of 2015-16.  Nationally, people will try to link the results to problems of federal funding and argue how implementing the recommendations of the Naylor report would be a game-changer for rankings.

This is wrong for a couple of reasons.  The first is that it is by no means clear that Canadian institutions are in fact slipping.  Sure, we have two fewer in the top 200, but the number in the top 500 grew by one.  Of those that made the top 500, nine rose in the rankings, nine slipped and one stayed constant.  Even the one high-profile “failure” – UBC – only saw its overall score fall by one-tenth of a point; the fall in the rankings was due more to an improvement in a clutch of Asian and Australian universities.

The second is that in the short-term, rankings are remarkably impervious to policy changes.  For instance, according to the QS reputational survey, UBC’s reputation has taken exactly zero damage from l’affaire Gupta and its aftermath.  Which is as it should be: a few months of communications hell doesn’t offset 100 years of scientific excellence.  And new money for research may help less than people think. In Canada, institutional citations tend to track the number of grants received more than the dollar value of the grants.  How granting councils distribute money is at least as important as the amount they spend.

And that’s exactly right.  Universities are among the oldest institutions in society and they don’t suddenly become noticeably better or worse over the course of twelve months.  Observations over the span of a decade or so are more useful, but changes in ranking methodology make this difficult (McGill and Toronto are both down quite a few places since 2011, but a lot of that has to do with changes which reduced the impact of medical research relative to other fields of study).

So it matters that Canada has three universities which are genuinely top class, and another clutch (between four and ten, depending on your definition), which could be called “world-class”.  It’s useful to know that, and to note if any institutions have sustained, year-after-year changes either up or down.  But this has yet to happen to any Canadian university.

What’s not as useful is to cover rankings like sports, and invest too much meaning in year-to-year movements.  Most of the yearly changes are margin-of-error kind of stuff, changes that result from a couple of dozen papers being published in one year rather than another, or the difference between admitting 120 extra international students instead of 140.   There is not much Moneyball-style analysis to be done when so many institutional outputs are – in the final analysis – pretty much the same.

March 21

Worst Higher Education Article of the Decade (So Far)

Stop the presses.  I have found the worst higher education article of the decade so far.  It is by Don & Alex Tapscott, and it is called The Blockchain Revolution and Higher Education.

How dumb is it?  Solar-powered flashlight dumb.  Tripping over a cordless phone dumb.

The problem is that because it’s Don Tapscott and he is – for reasons that are completely beyond me – treated as some kind of national gem, no one ever calls him on his deep wrongness when it comes to education (remember the time he dramatically announced that the third week of January 2013 was the week universities changed forever?), so the likelihood is that some people may take this article seriously.  And this would be a terrible mistake, because it is nonsense.

A little background.  A blockchain is a distributed database which is both secure and decentralized – something which has interesting applications in finance, electronic money (e.g. bitcoin) and contracts.  Some also claim it will have a huge effect on creative industries because it solves a host of intellectual property issues, but this is more speculative, since it requires a whole lot of legal and policy changes that have yet to materialize.

But databases, no matter how funky and tech-y they are, don’t have many educational uses.  Imagine for a second that it’s 1981 and someone has written an article about how higher education was about to go through a Lotus 1-2-3 Revolution.  They’d have been dismissed as a fantastical dreamer.  Those were better days: sadly, no one will think to do the same to the Tapscotts.

Now, if you can get past the start of the article, where the authors claim the internet hasn’t actually changed how companies do business (yes, really), you will come to the claim that blockchain will revolutionize education in four ways, to wit: student records, pedagogy, student debt and “the meta-university”.  The heart of the argument is that blockchain is going to create a sea change in records management.  Now, student records are admittedly still fairly clunky.  And it’s possible (as I noted back here) that in a decade or so people will come up with universal CVs that standardize and revolutionize the way people describe credentials and achievements.  And it’s even possible that once that happens, people will be able to record those certificates and achievements on a blockchain.  But then… so what?  Blockchain’s main advantage is that records can’t be altered, which means it could be a great way of dealing with fraudulent records.  But that’s not really that big a deal.  It raises the question: for those of us not in the habit of producing fraudulent records, what exactly are the benefits of blockchain?
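To be fair to the one claim that does hold up: the tamper-evidence property is real, and it is easy to illustrate.  Below is a minimal, purely hypothetical Python sketch of a hash chain – the data structure underlying a blockchain, minus the distributed-consensus machinery – showing that altering any stored credential record invalidates the chain.  All record names here are invented for illustration.

```python
import hashlib
import json

def make_block(record: dict, prev_hash: str) -> dict:
    """Bundle a credential record with the hash of the previous block."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return {
        "record": record,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    }

def chain_is_valid(chain: list) -> bool:
    """Re-derive every hash; any altered record breaks verification."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i > 0 else "genesis"
        payload = json.dumps(block["record"], sort_keys=True) + prev
        if block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
    return True

# Build a tiny chain of (hypothetical) credential records.
chain = []
prev = "genesis"
for rec in [{"student": "A", "credential": "BA"},
            {"student": "B", "credential": "MSW"}]:
    block = make_block(rec, prev)
    chain.append(block)
    prev = block["hash"]

assert chain_is_valid(chain)

# Tampering with an earlier record is immediately detectable.
chain[0]["record"]["credential"] = "PhD"
assert not chain_is_valid(chain)
```

That is the whole trick: you can prove a record hasn’t been altered.  Nothing in it tells you whether the record was worth anything in the first place.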

Well, according to the Tapscotts, it basically comes down to the idea that blockchain could let you record competencies and skills in a reliable way, thus completely changing the way universities work.  Huge, massive changes.  And once everyone knows reliably what skills and competencies you have, most of the machinery around universities disappears; higher education becomes one big worldwide open-learning park, and people who can demonstrate through blockchain that they have certain skills and competencies will be paid to teach others the same things, and poof!  No more student debt.

Honestly, I’m not making this up.  This is their claim.

The fact that we are having trouble figuring out skills and competencies outside narrow professional frameworks?  Irrelevant.  The fact that badges and other such newfangled credentials aren’t really being embraced by employers because they are often finicky and vague?  Irrelevant.  The fact that you still need institutions to actually manage the learning process and institutions to measure outcomes (they do not, I acknowledge, need to be the same institutions)?  Irrelevant.  The fact that the absence of blockchain is no barrier to paying people to teach others skills, and yet no one does it, because that’s not really how education works and blockchain changes nothing in that respect?  Irrelevant.

Is higher education over-bureaucratized, insufficiently innovative, in need of a jolt?  Sure.  But a piece of code doesn’t fix all that.  There are lots of other problems – often genuine political problems – which have to be solved before the alleged blockchain revolution can happen.  And just because it can happen doesn’t mean it will.  The kind of higher education system the Tapscotts seem to want is one which – to quote Tressie McMillan Cottom – really only appeals to free-ranging autodidacts.  For other learners, the kind of institution the Tapscotts want is a deeply alienating one.  But techno-fetishist windbags never let issues as small as “what customers want” get in the way of a good fantasy.

In sum: Worst. Higher Ed Article. Ever.  Or pretty close to it.  A pretty much textbook case of how you can be a tech guru while understanding literally nothing of how the world works.

December 10

Reports, Books, and CUDO

It’s getting close to that time of year when I need to sign off for the holidays (tomorrow will be the last blog until January 4th).  So before then, I thought it would be worth quickly catching up on a few things.

Some reports you may have missed.  A number of reports have come out recently that I have been meaning to review.  Two, I think, are of passing note:

i) The Alberta Auditor-general devoted part of his annual report (see pages 21-28) to the subject of risk management of cost-recovery and for-profit enterprises in the province’s post-secondary institutions, and concluded that the government really has no idea how much risk the province’s universities and colleges have taken on in the form of investments, partnerships, joint ventures, etc.  And that’s partly because the institutions themselves frequently don’t do a great job of quantifying this risk.  This issue’s a sleeper – my guess is it will increase in importance as time goes on.

ii) The Ontario auditor general reviewed the issue of University Intellectual Property (unaccountably, this story was overlooked by the media in favour of reporting on the trifling fact that Ontarians have overpaid for energy by $37 billion over the last – wait, what?  How much?).  It was fairly scathing about the province’s current activities in terms of ensuring the public gets value for money for its investments. A lot of the recommendations to universities consisted of fairly nitpicky stuff about documentation of commercialization, but there were solid recommendations on the need to track the impact of technology transfer, and in particular the socio-economic impact.  Again, I suspect similar issues will crop up with increasing frequency for both governments and institutions across the country.

Higher Ed Books of the Year.  For best book, I’m going to go with Lauren Rivera’s Pedigree: How Elite Students Get Elite Jobs, which I reviewed back here.   I’ll give a runner-up to Kevin Carey’s The End of College, about which I wrote a three-part review in March (here, here, and here).  I think the thesis is wrong, and as others have pointed out there are some perspectives missing here, but it put a lot of valuable issues about the future of higher education on the table in a clear and accessible way.

Worst book?  I’m reluctantly having to nominate Mark Ferrara’s Palace of Ashes: China and the Decline of American Higher Education.  I say reluctantly because the two chapters on the development of Chinese higher education are pretty good.  But the thesis as a whole is an utter train wreck.  Basically it amounts to: China is amazing because it is spending more money on higher education, and the US is terrible because it is spending less money on higher education (though he never bothers to actually check how much each country is spending as, say, a proportion of GDP, which is a shame, as he would quickly see that US expenditure remains way above China’s even after adjusting for the difference in GDP).  The most hilarious bits are the ones where he talks about the erosion of academic freedom due to budget cuts, whereas in China… (you see the problem?  The author unfortunately doesn’t).  Dreck.

CUDO: You may recall I had some harsh things to say about the stuff that Common University Dataset Ontario was releasing on class sizes.  I offered a right of reply, and COU has kindly provided one, which I reproduce below, unedited:

We have looked into the anomalies that you reported in your blog concerning data in CUDO on class size.  Almost all data elements in CUDO derive from third party sources (for example, audited enrolment data reported to MTCU, NSSE survey responses) or from well-established processes that include data verification (for example, faculty data from the National Faculty Data Pool), and provide accurate and comparable data across universities. The class size data element in CUDO is an exception, however, where data is reported by universities and not validated across universities. We have determined that, over time, COU members have developed inconsistent approaches to reporting of the class size data in CUDO.

 COU will be working with universities towards more consistent reporting of class size for the next release of CUDO.

With respect to data concerning faculty workload:  COU published results of a study of faculty work in August 2014,  based on data collected concerning work performed by full-time tenured faculty, using data from 2010 to 2012. We recognize the need for further data concerning teaching done by contract teaching staff. As promised in the 2014 report, COU is in the process of updating the analysis based on 2014-15 data, and is expanding the data collection to include all teaching done in universities by both full-time tenured/tenure track faculty and contract teaching staff. We expect to release results of this study in 2016.

Buonissimo.  ‘Til tomorrow.

September 29

Liberal Arts Deserves Better Arguments

You may have noticed that I failed to award a “worst back-to-school piece” for the second year running.  This is because the bad stuff took a while to come out.  Rest assured, it came, and I now present two examples.

First is Heather Mallick’s little missive on Liberal Arts in the Star last week.  The utterly lazy premise is this: advances in ICT have changed the world dramatically, so what matters now is synthesis.  And by God, Liberal Arts gives you synthesis, even if it doesn’t give you science.  So, yay Liberal Arts.

Leaving aside Mallick’s utterly preposterous statement that ISIS would be a kinder and more humane organization if it took more Liberal Arts courses, there are at least three things wrong with her defence of “Liberal Arts”.

1)  The idea that Liberal Arts doesn’t include sciences.  This is a peculiarly Canadian definition of “Liberal Arts”.  Historically, Math and Astronomy are part of the Liberal Arts.  In the United States, the term usually encompasses the basic natural sciences.  For some reason, Canadians choose to use “Liberal Arts” as a synonym for “humanities”.  I have no idea why this is the case, but it bugs me.  Mallick’s hardly alone in this, though, so maybe I should cut her some slack here.

2)  The idea that Liberal Arts lets you “range widely”.  This is not a necessary outcome of Liberal Arts.  It’s true that an awful lot of Arts programs take a smorgasbord approach to curriculum rather than presenting a smaller, more coherent offering, but there remain programs that are pretty prescriptive about the courses one must take (Concordia’s Liberal Arts program, for instance, has a pretty large set of core mandatory courses, which doesn’t leave much room for ranging widely).

3)  The idea that only Liberal Arts/humanities teaches synthesis.  First, it may well be true that Liberal Arts/humanities teaches synthesis (personally, I think it’s part of what my History degree taught me), but the actual evidence in favour of this proposition is fairly slim, partly because humanities profs are so reluctant to see outcomes such as this tested.  In fact, it’s arguable that there are many humanities disciplines (certain areas of postmodernist studies come to mind) where synthesis is about the last thing going on.  Second, for the umpteenth time, the argument that synthesis is not happening elsewhere in the academy is not only irritating and arrogant, but also not grounded in evidence.

The thing is, as silly as these “defending the liberal arts” pieces are, they’re still miles better than the anti-liberal arts pieces.  The worst of these this year, indubitably, is Rex Murphy’s bilious take on the Alex Johnstone affair.  Johnstone, a federal NDP candidate in Hamilton, gained mild notoriety last week for claiming that she – possessor of a BA and an MSW in Peace Studies – had no idea what Auschwitz was, because if she had, she wouldn’t have made some slightly off-colour remarks on Facebook seven years ago.

Why the press believed this line is a bit beyond me: it seems to me this was a transparent ploy to avoid taking responsibility for having said something stupid.  My guess is they did so partly because it would be difficult to prove the opposite, but also partly because if it were true, they could run chinstrokers about how terrible her education must have been.  Colby Cosh took an intellectually respectable shot at it here.  Murphy, on the other hand, went further, and in the process went completely down the rabbit hole.

Murphy’s is a bog-standard hit piece on the humanities: conjure up a few random stories about things that sound (and perhaps are) inane – trigger warnings on Paradise Lost, a goofy thesis title or two about Madonna and Beyoncé – and then claim, with no evidence whatsoever, that this is representative of all humanities, across all of higher education.  Then promise that the classics – apparently the only place where eternal truths can be found – shall be avenged, preferably by force-feeding Jane Austen to undergraduates.  It would be utter tripe even if he hadn’t gone to the trouble of calling a rape survivor at an American Ivy League school not only a liar, but also an airhead who probably doesn’t know anything about Auschwitz either (yes, really).

I wouldn’t worry so much about crap like Murphy’s if humanities had better defenders.  The problem is that true believers think that arguments like Mallick’s are actually convincing.  But to anyone outside the tribe, they look pretty weak.  Time for better arguments.

September 22

David Cameron, Pork, and World-Class Universities

I am going to assume that by now you have all heard about the… um… interesting news regarding British Prime Minister David Cameron, which was in yesterday’s papers.  If you haven’t, then take a quick look here.  Then come back.  Quickly.  Maybe have a shower first.

Ready?

OK, so, my first thought about this story is “I wonder what kind of day Oxford’s PR folk are going to have?”  Because, honestly, at most universities, the idea that some of your students – indeed, some of your most famous alumni – have at some time in the past been involved in on-campus porcine frottage would not be good news.  The press would want to know: what did the university know about these very un-kosher rituals, and when did it find out?  Is it still going on?  Etc., etc.  And you’d have administrators running around campus worrying: what will this do to applications?  What will this do to fundraising?  Disaster!  How quickly can we close down these clubs?

(This, by the way, has nothing to do with whether or not the story is true.  I think there are some very good reasons to think it isn’t.  The source, Lord Ashcroft, has a well-known grudge against Cameron.  And accusations of pig-fiddling are one of the oldest tricks in the political book.  In Fear and Loathing on the Campaign Trail, Hunter S. Thompson described how LBJ had, in his Texas days, told his campaign manager to accuse his opponent of carnal knowledge of sows.  His campaign manager objected, saying they couldn’t call him a pig-f***** because no one would believe it.  To which Johnson replied: “I know, but let’s make the sonofabitch deny it”.)

But no.  On this question yesterday, silence.  No blowback at all on Oxford.  And I can guarantee you that no one – no one – at Oxford thought for a moment about next year’s application figures.  The reason is that everyone knows that whatever else Oxford may be, it’s a playground for Britain’s ruling class.  And let’s face it, the ruling class in Britain are known to get up to some pretty sordid stuff.  So in the popular imagination, it’s already only a small step from membership in the Bullingdon Club to what appears to be a barnyard version of the orgy scene from Eyes Wide Shut.  And not to single out Britain here: the same could more or less be said of Yale, with its various Skull and Bones-type societies.  And nobody (well, not many people, anyway) thinks the worst of them.  Indeed, for a certain demographic, the presence of elite kinkiness probably increases an institution’s attractiveness.

But we can abstract from Oxford to say something more general about World-Class Universities, and it is this: being a world-class university means never having to worry about bad PR.  Alumni in a bestiality/necrophilia story? No problem!  Prestigious science faculty in bizarre twitter rant about how 14-year old Muslim children actually conspired to get themselves accused of bomb-making in order to get an invite to the White House?  It is but a laugh.  PR events that would swamp other institutions simply glide off World-Class universities’ backs.

Academic prestige matters.  Built up over enough time it can shield you from pretty much anything.  If you don’t think that’s a motivating factor in institutions’ prestige-seeking activities, you’re simply not paying attention.

September 03

One Lens for Viewing “Administrative Bloat”

The Globe’s Gary Mason wrote an interesting article yesterday about the Gupta resignation.  Actually, let me qualify: he wrote a very odd article, which ignored basically everything his Globe colleagues Simona Chiose and Frances Bula had reported the previous week, in order to peddle a tale in which the UBC Board fired Gupta for wanting to reduce administrative costs.  This, frankly, sounds insane.  But Mason’s article did include some eye-opening statistics on the increase in administrative staff at UBC over the past few years – such as the fact that, between 2009-10 and 2014-15, professional administrative staff numbers increased by 737, while academic staff numbers increased by only 28.

And so, this seems as good a time as any to start sharing some of the institution-by-institution statistics on administrative & support (A&S) staff I’ve been putting together, which I think you will find kind of interesting.  But before I do that, I want to show you some national-level data that is of interest.  Not on actual staff numbers, mind you – that data doesn’t exist nationally.  However, through the annual CAUBO/Statscan Financial Information of Universities and Colleges (FIUC) survey, we can track how much we pay staff in various university functions.  And that gives us a way to look at where, within the university, administrative growth is occurring.

FIUC tracks both “academic” salaries and “other” (i.e. A&S) salaries across seven categories: “Instruction & Non-Sponsored Research” (i.e. at the faculty level); “Non-Credit Instruction” (i.e. cont. ed); “Library, Computing, and Communications”; “Physical Plant”; “Student Services”; “External Relations” (i.e. Government Relations plus Advancement); and “Administration” (i.e. central administration).  Figure 1 shows the distribution of A&S salary expenditures across these different categories for 2013-14.  A little over 32% of the total is spent at the faculty level, while another 23% is spent in central administration.  Physical plant and student services account for about 11% apiece, while the remaining three areas account for 18% combined.

Figure 1: Distribution of A&S Salaries by Function, in 000s of Dollars, Canada, 2013-14

A zoom-in on the figures for central administration is warranted, as there has been some definitional change over time, which makes time-series analyses a bit tricky.  Back in 1998, the reporting rules were changed in a way that increased reported costs by about 30%.  Then, in 2003, about 15% of this category was hacked off to create a new category, “external relations” – presumably because institutions wanted to draw a distinction between the bits of central administration that increase revenues and those that consume them.  Figure 2 shows how that looks over time.

Figure 2: Expenditure on Administrative & Support Salaries in Central Administration, in 000s of 2014 Real Dollars, Canada

Long story short: from the 80s through to the mid-90s, administrative & support salaries in central administration rose by a little over 3% per year in real terms.  Then, briefly, they fell for a couple of years, before resuming an upward trend.  Ignoring the one-time upward re-adjustment, aggregate A&S salaries in central administration and external relations combined have been rising at 5.3% per year, after inflation, since 1999.  Which is, you know, a lot.
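To see just how much, a quick back-of-the-envelope compounding calculation helps.  Only the 5.3% growth rate comes from the FIUC data; the base of 100 and the 15-year window (1999 to 2014) are illustrative assumptions:

```python
# Compound real growth of A&S salaries in central admin + external relations.
# The 5.3% annual rate is from the FIUC data discussed above; the base value
# of 100 is an arbitrary index, not a dollar figure.
base = 100.0
rate = 0.053   # 5.3% per year, after inflation
years = 15     # roughly 1999 to 2014

level = base * (1 + rate) ** years
print(round(level, 1))  # roughly 217 -- i.e. real spending more than doubles
```

In other words, a category growing at 5.3% real per year more than doubles every 15 years or so, while enrolment and academic salary mass grow nowhere near that fast.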

Now, let’s look at what’s been going on across the university as a whole.  Figure 3 shows changes in total A&S salary paid over time, relative to a 1979 base.  For this graph, I dropped the “non-credit” category (because it’s trivial); for central admin, I’ve both combined it with “external relations”, and corrected for the 1998 definitional change.  Also, for reference, I’ve included two dotted lines, which represent change in student numbers (in red), and change in total academic salary mass (in yellow).

Figure 3: Change in Real Total Academic & Support Salary Spending (1979-80 = 100) by Function, Canada

Since 1979, student FTEs rose 120%, while academic salary mass doubled, after inflation.  A&S spending in libraries and physical plant rose by considerably less than this: 27% and 57%, respectively.  A&S spending on “instruction” (that is, faculty & departmental offices) rose almost exactly in tandem with student numbers.  Spending on A&S salaries in central admin and in ICT rose about twice as fast as that, ending the 35-year period at three-and-a-half times their original level.  But the really huge increases occurred in student services, where expenditures on A&S salaries are now six times as high as they were in 1979.

Over the next couple of weeks, I’ll be able to supplement this picture with institutional data, but the key take-aways for now are as follows: i) “central administration” salaries are growing substantially faster than enrolment and academic salary mass, but they represent less than a quarter of total A&S spending; ii) the largest component of A&S spending – that is, those reporting to academic Deans – is actually growing exactly on pace with enrolment; and, iii) the fastest-growing component of A&S spending is student services.  So, there has been a shift in A&S spending, but it’s not entirely to the bad, unless you’ve got a thing against student services.

More next week.

November 03

Cockroaches

One of the most maddening things about higher education journalism is the widespread assumption of fragility.

Take the notion of vulnerability to technological disruption.  The most recent example of this is a piece from University World News (which really should know better) entitled “Can Universities Survive the Digital Age?”  It’s an absolutely ridiculous question that could only be posed by someone who knew virtually nothing of the history of universities.

Every time there’s a technological innovation, somebody thinks the university must be in trouble.  Only 24 months ago, MOOCs were going to kill universities.  In the mid-1990s there was loads of techno-fetishist nonsense about the Death of Distance, and what it would do to education.  Before that it was computers (check out this cute piece from the 1950s), and in the 1960s and 1970s it was television (remember the University of the Air?) – although the earliest predictions about the effects of TV on education date back to the 1930s.  And before that, it was radio that was going to revolutionize education.  And before that, as Gavin Moodie reminds us in this fine article (longer version available here), Gutenberg and the printing press had the potential to “disrupt” higher education (why go through all that oral disputation in Latin if you could just read a book in the vernacular?).

The fact is, every time there has been a change, universities have found a way to incorporate the new technology.  New technologies never replace previous channels of knowledge transmission – rather, new channels are just added to existing ones (this, of course, is a major reason why technology is usually a source of university cost inflation rather than cost savings).  Universities adapt.  Because the point of a university is simply that it’s a place where people gather to learn and discover using all sorts of tools, not just (as some reformers seem to think) via oral communication.

But it’s not just the techno-fetishists who think universities are fragile.  People are always worrying about whether institutions can survive “cutbacks” (a term people use even though operating grants have been increasing continuously for over 15 years at a rate well above inflation).  Of course they can.  You can cut universities forever, and they will still function.  Look at universities in Africa, operating in conditions of unimaginable penury. Look at universities like Beijing or Tsinghua, which moved their operations thousands of km (on foot) in order to stay open during the Japanese occupation.  Or look back at Canadian universities in the 1930s, when most universities lost half of their funding (or more in the case of the University of Manitoba, where the bursar made off with the endowment).  They survive.

They survive because communities have pride in even the tiniest universities, and sustain them as best they can.  They survive because at least part of the academy will remain to the bitter end, wanting to continue in the mission of transmitting knowledge.  In the midst of wars and famines, universities have survived, sometimes in the most treacherous circumstances imaginable.

Universities are like cockroaches: they are almost impossible to kill.  They’ll be here after the apocalypse.  The idea that a temporary loss of income or some minor technological advance will do them in is simply laughable.

October 30

Times Higher Rankings, Weak Methodologies, and the Vastly Overblown “Rise of Asia”

I’m about a month late with this one (apologies), but I did want to mention something about the most recent version of the Times Higher Education (THE) Rankings.  You probably saw it linked to headlines that read, “The Rise of Asia”, or some such thing.

As some of you may know, I am inherently suspicious about year-on-year changes in rankings.  Universities are slow-moving creatures.  Quality is built over decades, not months.  If you see huge shifts from one year to another, it usually means the methodology is flimsy.  So I looked at the data for evidence of this “rise of Asia”.

The evidence clearly isn’t there in the top 50.  Tokyo and Hong Kong are unchanged in their positions.  Tsinghua, Beijing, and the National University of Singapore are all within a place or two of where they were last year.  In fact, if you just looked at the top 50, you’d think Asia might be going backwards, since one of its big universities (Seoul National) fell out of the top 50, going from 44th to 52nd in a single year.

Well, what if you look at the top 100?  Not much different.  In Korea, KAIST is up a bit, but Pohang is down.  Both the Hong Kong University of Science and Technology and Nanyang were up sharply, though, which is a bit of a boost; however, only one new “Asian” university came into the rankings, and that was Middle East Technical University in Turkey, which rose spectacularly from the 201-225 band last year to 85th this year.

OK, what about the next 100?  Here it gets interesting.  There are bad news stories for Asian universities.  National Taiwan and Osaka each fell 13 places. Tohoku fell 15, Tokyo Tech 16, Chinese University Hong Kong 20, and Yonsei University fell out of the top 200 altogether.  But there is good news too: Bogazici University in Turkey jumped 60 places to 139th, and five new universities – two from China, two from Turkey and one from Korea – entered the top 200 for the first time.

So here’s the problem with the THE narrative.  Most of the evidence for all this “rise of Asia” stuff rests on events in Turkey (which, like Israel, is often considered European rather than Asian – at least if membership in UEFA and Eurovision is anything to go by).  The only reason THE goes on with its “rise of Asia” tagline is that it has a lot of advertisers and a big conference business in East Asia, and it’s good business to flatter them, and damn the facts.

But there’s another issue here: how the hell did Turkey do so well this year, anyway?  Well, for that you need to check in with my friend Richard Holmes, who runs the University Ranking Watch blog.  He points out that a single paper (the one in Physics Letters B, which announced the confirmation of the Higgs boson, and which immediately got cited in a bazillion places) was responsible for most of the movement in this year’s rankings.  And, because the paper had over 2,800 co-authors (including some at those suddenly prominent Turkish universities), and because THE doesn’t fractionally count multiple-authored articles, and because THE’s methodology gives tons of bonus points to universities located in countries where scientific publication counts are low, this absolutely blew some schools’ numbers into the stratosphere.  Other examples of this are the Scuola Normale di Pisa, which came out of nowhere to be ranked 65th in the world, or Federico Santa María Technical University in Chile, which somehow became the 4th-ranked university in Latin America.
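To see how much the fractional-counting decision matters, here’s a toy sketch in Python.  All the numbers are invented for illustration (the real citation counts and author affiliations are beside the point); what it shows is the gap between giving an institution full credit for a mega-authored paper versus credit proportional to its share of the author list:

```python
# Toy illustration (invented numbers): citation credit for one
# hugely cited, hugely co-authored paper under full counting
# (what THE uses) versus fractional counting.

CITATIONS = 2800        # hypothetical citations to the paper
CO_AUTHORS = 2800       # co-authors on the paper
AUTHORS_AT_UNI = 3      # co-authors affiliated with one small university

# Full counting: the institution is credited with the paper's
# entire citation count, regardless of how many of the 2,800
# authors actually work there.
full_credit = CITATIONS

# Fractional counting: credit is split in proportion to the
# institution's share of the author list.
fractional_credit = CITATIONS * AUTHORS_AT_UNI / CO_AUTHORS

print(full_credit)       # full credit: 2800
print(fractional_credit) # fractional credit: 3.0
```

Under full counting, three co-authors are enough to import the paper’s entire citation haul; under fractional counting the same institution gets roughly nothing – which is why one paper can move a whole national system up the table.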

So basically, this year’s “rise of Asia” story was based almost entirely on the fact that a few of the 2,800 co-authors on the “Observation of a new boson…” paper happened to work in Turkey.

THE needs a new methodology.  Soon.

October 28

Responsibility-Centred Budgeting

As I’m on the subject of finances and budgeting these days, I thought it a good time to bring up the topic of “responsibility-centred budgeting” (RCB).  It’s a timely topic, given both this ludicrous article in the Edmonton Journal last week, and the fact that I have one loyal reader who’s been urging me to write about it for months now (Hi, Alan!).

Responsibility-centred budgeting basically says that units (usually faculties, occasionally departments) are responsible for raising their own funds and covering their own costs.  If you use less space, you have more money for other things; if you teach more students, you’ll get more money.  It’s a simple formula, and is very true to the medieval origins of the university, where professors were all required to set their own fees and collect their own money.
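The core mechanic can be sketched in a few lines of Python.  Everything here is hypothetical – the per-credit-hour rate, the space charge, and the unit numbers are invented – but it shows how RCB nets each unit’s teaching revenue against the costs it drives:

```python
# Hypothetical sketch of a responsibility-centred allocation:
# each unit earns revenue for the credit hours it teaches and
# pays for the space it occupies.  All numbers are invented.

REVENUE_PER_CREDIT_HOUR = 300   # $ attributed per credit hour taught
COST_PER_SQ_METRE = 150         # $ charged per square metre occupied

units = {
    # unit: (credit hours taught, space occupied in square metres)
    "Arts":        (120_000, 18_000),
    "Engineering": (60_000, 30_000),
}

net_position = {
    unit: hours * REVENUE_PER_CREDIT_HOUR - space * COST_PER_SQ_METRE
    for unit, (hours, space) in units.items()
}

for unit, net in net_position.items():
    print(f"{unit}: net ${net:,}")
```

The point of the exercise is the incentive, not the arithmetic: teach more students and your line goes up; give back space and your line goes up.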

At this point in the discussion, everyone should quickly go and read Nick Rowe’s Confessions of a Central Planner, which is by far the best thing ever written in Canada on university management.  There’s lots of good stuff in that piece, but pay particular attention to one specific point Rowe makes: universities, for the most part, get paid based on how many students they teach.  Yet within most institutions, there is a Hobbesian war of all-against-all to get other people to teach students.  This is because historically, in most institutions, the link between the number of students taught and the funding one receives is loose at best, non-existent at worst.

Some people – mainly those who literally have no clue about how universities are actually financed – abhor the idea of money following students within the institution.  Often, they (and “they” are almost always profs in Arts) come up with scare stories about how money will all go to Science, Engineering, and Business.  And while it’s certainly true that professional schools tend to be money-spinners, at most institutions the departments that do the most teaching – and hence, do the work to bring in money from tuition and operating grants – are frequently to be found in Arts (typically, English is one of them).  Science and Engineering, despite being thought of as “big money” disciplines, tend to net out pretty even because they are also “big cost” disciplines.

The importance of RCB is mainly that it aligns incentives within institutions.  When units are responsible for generating their income and covering their own costs, they tend to use fewer resources, and focus more on generating income.  That’s good not just for bottom-line reasons, but also because it fundamentally changes an institution’s culture – from one where the (much-derided) central administration is a government to be lobbied for money, to one where everyone is involved in the search for revenue and efficiencies.

RCB is not quite as simple as it sounds – for it to work, each faculty needs staff able to do sophisticated budgeting and course development, which isn’t feasible at smaller institutions.  There still needs to be central oversight to ensure departments don’t go hog-wild hiring tenure-track staff on the basis of short-term profits, or start gaming the system of prerequisites and required courses in order to screw cognate disciplines.  And there’s still a role for central admin to play in redistributing some money to disciplines that could never make it on their own (mostly Fine Arts).

It’s not a panacea, of course, and there are ways it can be misused.  But on the whole, RCB is proving to be an important tool for Canadian universities to deal with their shift to being more tuition-reliant.

September 30

The Problem with Global Reputation Rankings

I was in Athens this past June, at an EU-sponsored conference on rankings, which included a very intriguing discussion about the use of reputation indicators that I thought I would share with you.

Not all rankings have reputational indicators; the Shanghai (ARWU) rankings, for instance, eschew them completely.  But the QS and Times Higher Education (THE) rankings both weight them pretty highly (50% for QS, 35% for THE).  Yet this data isn’t entirely transparent.  THE, which releases its World University Rankings tomorrow, hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).  The reasons for doing this are largely commercial: if, each September, THE actually showed all the results individually, it wouldn’t be able to reassemble the indicators in a different way to have an entirely separate “Reputation Rankings” release six months later (with concomitant advertising and event sales) using exactly the same data.  Also, its data collection partner, Thomson Reuters, wouldn’t be able to sell the data back to institutions as part of its Global Institutional Profiles Project.

Now, I get it: rankers have to cover their (often substantial) costs somehow, and this re-sale of hidden data is one way to do it (disclosure: we at HESA did this with our Measuring Academic Research in Canada ranking).  But given the impact that rankings have for universities, there is an obligation to get this data right.  And the problem is that neither QS nor THE publishes enough information about their reputation surveys to make a real judgement about the quality of their data – and in particular about the reliability of the “reputation” voting.

We know that the THE allows survey recipients to nominate up to 30 institutions as being “the best in the world” for research and teaching, respectively (15 from one’s home continent, and 15 worldwide); the QS allows 40 (20 from one’s own country, 20 world-wide).  But we have no real idea about how many people are actually ticking the boxes on each university.

In any case, an analyst at an English university recently reverse-engineered the published data for UK universities to work out voting totals.  The resulting estimate is that, among institutions in the 150-200 range of the THE rankings, the average number of votes obtained for either research or teaching is in the range of 30 to 40, at best.  Which is astonishing, really.  Given that reputation counts for a third of an institution’s total score, it means there is enormous scope for year-to-year variation – get 40 one year and 30 the next, and significant swings in ordinal rankings could result.  It also makes a complete mockery of the “Top Under 50” rankings, where 85% of institutions rank well below the top 200 in the main rankings, and therefore are likely only garnering a couple of votes apiece.  If true, this is a serious methodological problem.
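The back-of-envelope arithmetic here is worth spelling out.  Assuming (hypothetically) that an institution’s reputation score scales with its vote count, a random wobble from 40 votes to 30 is not noise you can ignore:

```python
# Hypothetical illustration: at vote totals of 30-40, a small
# absolute swing produces a large relative swing in the
# reputation component, and a visible swing in the total score.

REPUTATION_WEIGHT = 1 / 3          # roughly reputation's share of the total

votes_year1, votes_year2 = 40, 30  # a plausible year-to-year fluctuation

# Relative change in the reputation component, assuming the score
# is proportional to votes received.
component_change = (votes_year2 - votes_year1) / votes_year1

# Knock-on effect on the overall score.
overall_change = REPUTATION_WEIGHT * component_change

print(f"reputation component: {component_change:.0%}")
print(f"overall score: {overall_change:+.1%}")
```

A 25% drop in the reputation component, and roughly an 8% hit to the overall score, from a ten-vote fluctuation – more than enough to shuffle institutions packed tightly together in the 150-200 band.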

For commercial reasons, it’s impossible to expect the THE to completely open the kimono on its data.  But given the ridiculous amount of influence its rankings have, it would be irresponsible of it – especially since it is allegedly a journalistic enterprise – not to at least allow some third party to inspect its data and give users a better sense of its reliability.  To do otherwise reduces the THE’s ranking exercise to sham social science.
