Higher Education Strategy Associates

Category Archives: research

January 06

Adult Discussions About Research Policy

Over the winter break, the Toronto Star published an editorial on research funding that deserves to be taken out to the woodshed and clobbered.

The editorial comes in two parts. The first is a reflection on whether or not the Harper government is a “caveman” or just “incompetent” when it comes to science. I suppose it’s progress that the Star gives two options, but frankly the Harper record on science isn’t hard to decode:

  1. The Conservatives like “Big Science” and have funded it reasonably well.
  2. They’re not crazy about the pure, inquiry-driven research the granting councils have traditionally funded, and have kept its growth below inflation as a result (which isn’t great, but is better than what has happened to some other areas of government funding).
  3. They really hate government regulatory science especially when it comes to the environment and have approached it the way the Visigoths approached Rome (axes out, with an intention to cause damage).
  4. By and large they’d prefer if scientists and business would work more closely together; after all, what’s state investment in research and development for if not to increase economic growth?

But that’s not the part of the article that needs a smack upside the head. Rather, it’s these statements:

Again and again, the Conservatives have diverted resources from basic research – science for no immediate purpose other than knowledge-gathering – to private-public partnerships aimed at immediate commercial gain.


…by abandoning basic research – science that no business would pay for – the government is scorching the very earth from which innovation grows.

OK, first of all: the idea that there is a sharp dividing line between “basic” and “applied” research is pure hornswoggle. They aren’t polar opposites; lots of research (including pretty much everything in medicine and engineering) is arguably both. Outside of astronomy/cosmology, very little modern science is for no purpose other than knowledge gathering. There is almost always some thought of use or purpose. Go read Pasteur’s Quadrant.

Second, while the government is certainly making much of its new money conditional on business participation, the government hasn’t “abandoned” basic research. The billions going into the granting councils are still there.

Third, the idea that innovation and economic growth are driven solely or even mainly by domestic basic research expenditures  is simply a fantasy. A number of economists have shown a connection between economic growth and national levels of research and development; no one (so far as I know) has ever proven it about basic research alone.

There’s a good reason for that: while basic research is the wellspring of innovation (and it’s important that someone does basic research), in open economies it’s not in the least clear that every country has to engage in it to the same degree. The Asian tigers, for instance, emphasized “development” for decades before they started putting money into what we would consider serious basic research facilities. And nearly all the technology Canadian industry relies on is American, and would be so even if we tripled our research budgets.

We know almost nothing about the “optimal” mix of R&D, but it stands to reason that the mix is going to be different in different industries based on how close to the technological frontier each industry is in a given country. The idea that there is a single optimal mix across all times and places is simply untenable.

Cartoonishly simple arguments like the Star’s, which imply that any shift away from “basic” research is inherently wrong, aren’t just a waste of time; the “basic = good, applied = bad” line of argument actively infantilizes the Canadian policy debate. It’s long past time this policy discussion grew up.

December 08

Massachusetts, Not Michigan

TD Bank CEO Ed Clark gave an enormously important talk last week, which deserves a lot of attention.  You can get the gist of it from two quotes:

“To return to the path to prosperity, Canada needs to stop wasting time worrying about how to get low-wage jobs back from the U.S. or abroad and start thinking about how to use our well-educated population, immigration policies and public health care to our advantage”.

“Stop competing with Michigan. Start competing with Massachusetts”.

Brilliant.  Couldn’t agree more.  I am fed up with our governments’ weird obsession with showering money and policy attention on blue-collar industries (overwhelmingly male-dominated ones, I might add).  Time to join the 21st century.

BUT.  Let’s be careful about this.

There will no doubt be some who are eager to use Clark’s words as a “Sputnik moment”.  You know, those times when a perceived scientific or economic threat makes politicians more susceptible to spending money on pretty much anything that looks scientific or knowledge-oriented.  Higher education loves Sputnik moments.

The thing is, no one actually knows how to do this well; that is, nobody really knows what kind of spending or programming actually leads to growth.  One of the problems of being on the technological frontier is that everything is uncertain, so there are likely to be a lot of false starts and a lot of waste in the search for the next big thing(s).  But universities (and to a lesser extent colleges) seem like a safe bet, so Sputnik moments are like catnip for them.  All they have to say is “give us bucks because knowledge economy” and politicians will shell out (it’ll be dressed up a bit, but that’s the level of sophistication).

The problem with this argument is that it’s maybe only a quarter of the story.  If pouring money into universities guaranteed economic growth, places like Malaysia, Chile, and Sweden would be world-beaters.  There’s a lot more to high-tech economic growth than education.

To the extent that local research is going to translate into local economic growth, you need managers and entrepreneurs who are skilled at taking research from an idea to proof of concept, and from there to market.  You need pools of investment capital that understand developing technological fields and are prepared to invest patiently in them.  And then of course you need a workforce to actually churn out product.  In the absence of this kind of social and financial infrastructure, funding so-called “basic” research is a lot like pushing on a string as far as generating economic growth is concerned.

Then there’s the issue of graduates and the skills they deliver to the economy.  If you want graduates to generate more economic growth, you need to make sure that institutions are giving students the kinds of skills that will actually raise private-sector productivity levels.  That means much greater attention on the part of employers in articulating the skills and competencies they require as productivity rises, and much greater attention on the part of universities and colleges in equipping students with those skills and competencies so they are employable after graduation.

(Before anyone blows a gasket: I am not suggesting that the purpose of higher education is exclusively to generate skilled labour.  What I am suggesting is that if you’re going to use a Sputnik argument for more money, then you’re going to have to demonstrate how that money improves general productivity.  If you don’t like that deal, then don’t use Sputnik arguments.)

All of which is to say: there’s good news on the horizon in terms of getting everybody focused on a high-innovation, high-productivity economic future in Canada.  But the surest way to waste this opportunity would be to call for wanton expansions of current subsidies to higher education.  Change is needed to make public dollars work more effectively.  We should use this opportunity to make those changes.

October 23

Where the Questions Are

I had planned to continue on today with my series about operating budgets by taking a look at some scenarios for Central Canada, but I’ve been on the east coast for work the past couple days, and so that post will have to wait.  We’ll get back to it shortly, I promise.  But for now, let me turn to something I’ve been thinking about lately.

One of the maddening things about many discussions concerning higher education and business is the crudeness of many popular views on their relationship.  Mostly, we hear about how business’ role is to “contribute” to higher education, either via taxes, or philanthropy, or both (depending on where you are on the political spectrum).  Oftentimes, the role of business is reduced to hiring “our” graduates (and if that’s not happening, then let the agonized introspection begin).

And while those things are all true, what these analyses miss is the truly distinctive role of business, particularly with respect to science: it’s a huge, incomparable reservoir of questions to be answered and problems to be solved.  Of course, people get this at the level of applied research – by definition, when companies engage with higher education on applied research, it’s to solve specific problems – but they have trouble seeing it when it comes to “pure” research.  Partly, that’s due to rhetorical confusion – the phrase “pure” research (a rhetorical device of Vannevar Bush’s, designed to keep money flowing to universities after World War II) implies that interaction between scientists and pretty much anyone else will “contaminate” the research.

But a quick history of 20th-century science shows that this is nonsense.  Much of Einstein’s early work was hugely influenced by his immersion in commercial technology at the Swiss patent office.  Quantum physics began as an accidental discovery by German scientists trying to produce more accurate blackbody radiation standards for the electric lighting industry.  The Manhattan Project wasn’t about meeting commercial needs, but as research goes, it’s about as applied as it gets.  Etc., etc.

The point here is that there are parts of commercial science banging up against the frontiers of the unknown just as much as university science is: just think of what was discovered at Bell Labs, or what Craig Venter has accomplished.  It’s where the rubber hits the road: where the most advanced academic science gets put into practice and tested in real-world conditions.  Under commercial pressure, commercial science looks for every little advantage in learning how to cure disease, design better buildings, and develop new technology.

Even Vannevar Bush didn’t believe “pure” research happened in a vacuum.  Indeed, the justification for “pure” research is always that someone, somewhere, will find an application for it.  If you don’t have an inkling of where your “pure” research findings might actually be applied someday, you probably aren’t conducting your “pure” research in a way that’s very effective, because you’re not asking the right questions.

And this is the real reason universities need to engage with industry: it’s where the best questions are.  And you’re not going to get top-notch research without top-notch questions.

October 01

A Venn Diagram About Skills Gaps

Short and sweet today, folks, as I know you’re all busy.

We’ve done a lot of research over the years at HESA Towers.  We read up on what employers want – and we also do studies that look at how recent graduates fare in the labour market, and what they wish they’d had more of while in university.  And, pretty much without exception, regardless of field of study, those two sources agree on what students need to be better prepared for the labour market.

[Venn diagram: the skills employers say they want, and the skills recent graduates wish they’d acquired – with basic business skills in the overlap.]

So, want to give your grads a boost in the labour market?  Figure out how to give them those basic business skills.  Experiential learning is probably the most effective way to do it, but there are other ways, as well, both inside and outside the classroom.

It’s that simple.  Well, not simple at all really.  But at least the problem is well-defined.

August 11

Improving Career Services Offices

Over the last few years, what with the recession and all, there has been increased pressure on post-secondary institutions to ensure that their graduates get jobs.  Though that’s substantially the result of things like curriculum and one’s own personal characteristics, landing a job also depends on being able to get interviews and to do well in them.  That’s where Career Services Offices (CSOs) come in.

Today, HESA released a paper that looks at CSOs and their activities.  The study explores two questions.  The first deals specifically with university CSOs: what qualities and practices are associated with offices that receive high satisfaction ratings from their students?  The second deals with college career services – here we did not have outcome measures akin to the Globe and Mail’s satisfaction ratings, so we focussed on a relatively simple question: how do their structure and offerings differ from what we see in the university sector?

Let’s deal with that second question first: college CSOs tend to be smaller and less sophisticated than those at universities of the same size.  At first glance, that seems paradoxical – these are career-focussed organizations, aren’t they?  But the reason for this is fairly straightforward: to a large extent, the responsibility for making connections between students and employers resides at the level of the individual program rather than with some central, non-academic service provider – a lot of what takes place in a CSO at universities takes place in the classroom at colleges.

Now, to universities, and the question: what makes for a good career services department?  To answer it, we interviewed CSO staff at high-, medium-, and low-performing institutions (as measured by the Globe and Mail’s pre-2012 student satisfaction surveys) to work out which practices distinguished the high-performers.  It turns out that the budget, staff size, and location of Career Services Offices aren’t really the issue.  What really matters is the following:

  • Use of Data.  Everybody collects data on their operations, but not everyone puts it to good use.  What distinguishes the very best CSOs is that they have an effective, regular feedback loop to make sure insights in the data are being used to modify the way services are delivered.
  • Teaching Job-seeking Skills.  Many CSOs view their mission as making as many links as possible between students and employers.  The very best-performing CSOs find ways to teach job search and interview skills to students, so that they can more effectively capitalize on any connections.
  • Better Outreach Within the Institution.  It’s easy to focus on making partnerships outside the institution.  The really successful CSOs also make partnerships inside the institution.  One of the key relationships to nurture is with academic staff.  Students, for better or for worse, view profs as frontline staff, and ask them lots of questions about things like jobs and careers.  At many institutions, profs simply aren’t prepared for questions like that, and don’t know how to respond.  The best CSOs take the time to reach out to faculty and partner with them, ensuring they have tools at their disposal to answer those questions and to direct students to the right resources at the CSO.

If you want better career services, there’s your recipe.  Bonne chance.

May 12

Non-Lieux Universities: Whose Fault?

About four months ago, UBC President Stephen Toope wrote a widely-praised piece called “Universities in an Era of Non-Lieux”.  Basically, the piece laments the growing trend toward the deracinated homogenization of universities around the globe.  He names global rankings and government micro-management of research and enrolment strategies – usually of a fairly faddish variety, as evidenced by the recent MOOC-mania – as the main culprits.

I’m not going to take issue with Toope’s central thesis: I agree with him 100% that we need more institutional diversity; but I think the piece fails on two counts.  First, it leaves out the question of where governments got these crazy ideas in the first place.  And second, when it comes right down to it, the fact is that big research universities are only against institutional diversity insofar as it serves their own interests.

Take global rankings, for instance.  Granted, these can be fairly reductionist affairs.  And yes, they privilege institutions that are big on research.  But where on earth could rankers have come up with the idea that research was what mattered to universities, and that big research = big prestige?  Who peddles that line CONSTANTLY?  Who makes hiring decisions based on research ability?  Who makes distinctions between institutions based on research intensity?  Could it possibly be the academic community itself?  Could it be that universities are not so much victims as culprits here?

(I mean, for God’s sake, UBC itself is a member of “Research Universities Council of BC” – an organization that changed its name just a few years ago so its members would be sure to distinguish themselves from the much more lumpen new [non-research-intensive] universities who caucus in the much less-grandly named BC Association of Institutes & Universities.  Trust me – no rankers made them do that.  They came up with this idea on their own.)

As for the argument that government imposes uniformity through a combination of meddling and one-size-fits-all funding models, it’s a point that’s hard to argue with.  Canadian governments are notorious for the way they only incentivize size and research, and then wonder why every university wants to be larger and more research-intensive.  But frankly, this has traditionally worked in research universities’ favour.  You didn’t hear a lot of U15 Presidents moaning about research monocultures as long as the money was still flowing entirely in their direction.  So while Toope is quite right that forcing everyone into an applied research direction is silly, the emergence of a focus on applied research actually has a much greater potential to drive differentiation than your average government policy fad.

So, to echo Toope, yes to diversity, no to “non-lieux”.  But let’s not pretend that the drive to isomorphism comes from anywhere but inside the academy.  We have met the enemy and he is us.

November 07

International Alliances and Research Agreements

In business, companies strive to increase market share; in higher education, institutions compete for prestige.  This is why, despite what you’re told by people in universities, rankings are catnip to university administrations: by codifying prestige, they give institutions actual benchmarks against which they can measure themselves.

But prestige is actually much harder to amass than market share.  Markets can increase in size; prestige is a zero-sum affair (my prestige is related directly to your lack thereof).  And universities have fewer tools than businesses to extend their reach.  Mergers are not unheard of – indeed, the pressure of global rankings has been a factor behind a wave of institutional mergers in France, Russia, and Scandinavia – but these tend to be initiated by governments rather than institutions. Hostile take-overs are even less common (though UBC’s acquisition of a campus in the Okanagan shows it’s not impossible).

So, what’s a university to do?  Increasingly, the answer seems to be: “make strategic alliances”.

These tend to come in two forms: multi-institutional alliances (like Universitas 21, the Coimbra Group, and the like), and bilateral institutional deals.  Occasionally, the latter can go as far as ambitious, near-institutional mergers (see the Monash-Warwick alliance, for instance), but it usually consists of much simpler initiatives – MOUs between two institutions, designed to promote co-operation in fairly general terms.  There’s a whole industry around this now – both QS and Thomson Reuters offer services to help institutions identify the most promising research partners.  And signing these MOUs seems to take up an increasing amount of time, effort, and air miles among senior managers.

So it’s fair to ask: do these MOUs make any difference at all to research output?  I have no hard evidence on this, but I suspect that returns are actually pretty meagre.  While inter-institutional co-operation is increasing all the time, for the most part these links are organic; that is, they arise spontaneously from individual researchers coming up with cool ideas for collaboration, rather than from top-down direction.  While there’s a lot that governments and institutions can do to promote inter-institutional linkages in general, there is very little that central administrations can do to promote specific linkages that doesn’t quickly become counterproductive.

Having significant international research links is indeed the sign of a good university – the problem is that for managers under pressure to demonstrate results, organic growth isn’t fast enough.  The appeal of all these MOUs is that they give the appearance of rapid progress on internationalization.  But given the time and money expended on these things, some rigour is called for. This is an area where Board members can, and should, hold their administrations to account, and ask for some reasonable cost-benefit analysis.

November 05

Owning the Podium

I’m sure many of you saw Western President Amit Chakma’s op-ed in the National Post last week, suggesting that Canadian universities need more government assistance to reach new heights of excellence and “own the podium” in global academia.  I’ve been told that Chakma’s op-ed presages a new push by the U-15 for a dedicated set of “excellence funds” which, presumably, would end up mostly in the U-15’s own hands (for what is excellence if not research done by the U-15?).  All I can say is that the argument needs some work.

The piece starts out with scare metrics to show that Canada is “falling behind”.  Australia has just two-thirds our population, yet has seven institutions in the QS top 100, compared to Canada’s five!  Why anyone should care about this specific cut-off (use the top-200 in the QS rankings and Canada beats Australia 9 to 8), or this specific ranking (in the THE rankings, Canada and Australia each have 4 spots), Chakma never makes clear.

The piece then moves on to make the case that, “other countries such as Germany, Israel, China and India are upping their game” in public funding of research (no mention of the fact that Canada spends more public dollars on higher education and research than any of these countries), which leads us to the astonishing non-sequitur that, “if universities in other jurisdictions are beating us on key academic and research measures, it’s not surprising that Canada is also being out-performed on key economic measures”.

This proposition – that public funding of education is a leading indicator of economic performance – is demonstrably false.  Germany has just about the weakest higher education spending in the OECD, and it’s doing just fine, economically.  The US has about the highest, and it’s still in its worst economic slowdown in over seventy-five years.  Claiming that there is some kind of demonstrable short-term link is the kind of thing that will get universities into trouble.  I mean, someone might just say, “well, Canada has the 4th-highest level of public funding of higher education as a percentage of GDP in the OECD – doesn’t that mean we should be doing better?  And if that’s indeed true, and our economy is so mediocre, doesn’t that give us reason to suspect that maybe our universities aren’t delivering the goods?”

According to Chakma, Canada has arrived at its allegedly wretched state by virtue of having a funding formula which prioritizes bums-in-seats instead of excellence.  But that’s a tough sell.  Most countries (including oh-so-great Australia) have funding formulae at least as demand-oriented as our own – and most are working with considerably fewer dollars per student as well.  If Australia is in fact “beating” us (a debatable proposition), one might reasonably suspect that it has at least as much to do with management as it does money.

Presumably, though, that’s not a hypothesis the U-15 wants to test.

October 02

A New Study on Postdocs

There’s an interesting study on postdocs out today, from the Canadian Association of Postdoctoral Scholars (CAPS) and MITACS.  The report provides a wealth of data on postdocs’ demographics, financial status, likes, dislikes, etc.  It’s all thoroughly interesting and well worth a read, but I’m going to restrict my comments to just two of the most interesting results.

The first has to do, specifically, with postdocs’ legal status.  In Quebec, they are considered students.  Outside Quebec, it depends: if their funding comes from internal university funds, they are usually considered employees; but, if their funding is external, they are most often just “fellowship holders” – an indistinct category which could mean a wide variety of things in terms of access to campus services (are they students?  Employees?  Both?  Neither?).  Just taxonomically, the whole situation’s a bit of a nightmare, and one can certainly see the need for greater clarity and consistency if we ever want to make policy on postdocs above the institutional level.

The second – somewhat jaw-dropping – point of interest is the table on page 27, which examines postdocs’ training.

Level of Training Received or Available, in % (The 2013 Canadian Postdoc Survey, Table 3, pg. 27)

[Table not reproduced in this archive.]

As the authors note, being trainees is what makes postdocs a distinct group – it’s basically the only thing that distinguishes them from research associates.  So what should we infer from the fact that only 18% of postdocs report receiving any formal training for career development, 15% for research ethics, and 11% on either presentation skills or grant/proposal writing?  If there’s a smoking gun on the charge that Canadian universities view postdocs as cheap academic labour, rather than as true academics-in-waiting, this table is it.

All of this information is, of course, important; however, this study’s value goes beyond its presentation of new data.  One of its most important lessons comes from the fact that a couple of organizations just decided to get together and collect data on their own.  Too often in this country, we turn our noses up at anything other than the highest-quality data, but since no one wants to pay for quality (how Canadian is that?), we just wring our hands hoping StatsCan will eventually sort it out for us.

But to hell with that.  StatsCan’s broke, and even when it had money it couldn’t get its most important product (PSIS) to work properly.  It’s time the sector got serious about collecting, packaging, and – most importantly – publishing its own data, even if it’s not StatsCan quality.  This survey’s sample selection, for instance, is a bit on the dodgy side – but who cares?  Some data is better than none.  And too often, “none” is what we have.

CAPS/MITACS have done everyone a solid by spending their own time and money to improve our knowledge base about some key contributors to the country’s research effort.  They deserve both to be commended and widely imitated.

May 22

Bad Arguments for Basic Research

Last week’s announcement that the NRC was “open for business” has, if nothing else, revealed how shockingly weak most of the arguments are in favour of “basic” research.

Opponents of the NRC move have basically taken one of two rhetorical tacks.  The first is to present the switch in NRC mandate as the equivalent of the government abandoning basic science.  This is a bit off, frankly, considering that the government spends billions of dollars on SSHRC, NSERC, CIHR, etc.  Even if you’re passionate about basic research, there are still valid questions to be answered about why we should be paying billions of dollars a year to government departments doing basic research when the granting councils fund universities to ostensibly do the same thing.

The second argument is that government shouldn’t support applied science because: a) it’s corporate welfare; and b) all breakthroughs ultimately rely on basic science, so we should fund that exclusively.  It seems as though those who take this line have never heard of Germany’s Fraunhofer Society, a publicly funded network of institutes that does nothing but conduct applied research of direct utility to private enterprises.  It’s generally seen as a successful and useful complement to the government’s investments in basic science through the Max Planck Society, and to my knowledge, Germany has never been accused of being anti-science for creating and funding Fraunhofer.

Another point here: the benefits of “basic” research leak across national borders. Very little of the upstream basic research that drives our economy is Canadian in origin.  So while it’s vitally important that someone, somewhere, puts a lot of money down on risky, non-applied research, individual countries can – and probably should – make some different decisions on basic vs. applied research based on local conditions.

The relative benefit of a marginal dollar investment in applied research vs. basic research depends on the kind of economy a country has, the pattern of firm size, and receptor capacity for research.  It’s not an easy thing to measure accurately – and I’m not suggesting that the current government has based its decision on anything so empirical – but it’s simply not intellectually honest to claim that one is always a better investment than the other.

Opposition to the NRC change is clearly – and probably justifiably – coloured by a more general irritation at a host of this government’s other policies on science and knowledge (Experimental Lakes, long-form census, etc).  But that’s still no excuse for this farrago of flimsy argumentation.  Rational policy-making requires us to engage in something more than juvenile, binary discussions about what kind of research is “best”.
