HESA

Higher Education Strategy Associates

Category Archives: research

October 23

Where the Questions Are

I had planned to continue on today with my series about operating budgets by taking a look at some scenarios for Central Canada, but I’ve been on the east coast for work the past couple days, and so that post will have to wait.  We’ll get back to it shortly, I promise.  But for now, let me turn to something I’ve been thinking about lately.

One of the maddening things about discussions of higher education and business is the crudeness of popular views on their relationship.  Mostly, we hear about how business’ role is to “contribute” to higher education, either via taxes, or philanthropy, or both (depending on where you are on the political spectrum).  Or we hear that the role of business is to hire “our” graduates (and if that’s not happening, then let the agonized introspection begin).

And while those things are all true, what these analyses miss is the real role of business, particularly with respect to science: it’s a huge, incomparable reservoir of questions to be answered, and problems to be solved.  Of course, people get this at the level of applied research – by definition, when companies engage with higher education on applied research, it’s to solve specific problems – but they have trouble seeing it when it comes to “pure” research.  Partly, that’s due to rhetorical confusion: the wording of “pure” research (a rhetorical device of Vannevar Bush, designed to keep money flowing to universities after World War II) implies that interaction between scientists and pretty much anyone else will “contaminate” the research.

But a quick history of 20th-century science will show you that this is nonsense.  Much of Einstein’s early work was hugely influenced by his immersion in commercial technology at the Swiss patent office.  Quantum physics began almost by accident, with German scientists studying blackbody radiation in order to provide better measurement standards for the electric lighting industry.  The Manhattan Project wasn’t about meeting commercial needs, but as research goes, it’s about as applied as it gets.  Etc., etc.

The point here is that there are parts of commercial science banging up against the frontiers of the unknown just as much as university science is: just think of what was discovered at Bell Labs, or what Craig Venter has accomplished.  It’s where the rubber hits the road: where the most advanced academic science gets put into practice and tested in real-world conditions.  And under commercial pressure, that science looks for every little advantage when learning how to cure disease, design better buildings, and develop new technology.

Even Vannevar Bush didn’t believe “pure” research happened in a vacuum.  Indeed, the justification for “pure” research is always that someone, somewhere, will find an application for it.  If you don’t have an inkling of where your “pure” research findings might actually be applied someday, you probably aren’t conducting your “pure” research in a way that’s very effective, because you’re not asking the right questions.

And this is the real reason universities need to engage with industry: it’s where the best questions are.  And you’re not going to get top-notch research without top-notch questions.

October 01

A Venn Diagram About Skills Gaps

Short and sweet today, folks, as I know you’re all busy.

We’ve done a lot of research over the years at HESA Towers.  We read up on what employers want – and we also do studies that look at how recent graduates fare in the labour market, and what they wish they’d had more of while in university.  And pretty much, without exception, regardless of field of study, those two sources agree on what students need to be better-prepared for the labour market.

[Venn diagram: what employers say they want, overlaid on what recent graduates say they needed more of; the overlap is basic business skills.]

So, want to give your grads a boost in the labour market?  Figure out how to give them those basic business skills.  Experiential learning is probably the most effective way to do it, but there are other ways, as well, both inside and outside the classroom.

It’s that simple.  Well, not simple at all really.  But at least the problem is well-defined.

August 11

Improving Career Services Offices

Over the last few years, what with the recession and all, there has been increased pressure on post-secondary institutions to ensure that their graduates get jobs.  Though that’s substantially the result of things like curriculum and one’s own personal characteristics, landing a job also depends on being able to get interviews and to do well in them.  That’s where Career Services Offices (CSOs) come in.

Today, HESA released a paper that looks at CSOs and their activities.  The study explores two questions.  The first deals specifically with university CSOs, and with what qualities and practices are associated with offices that receive high satisfaction ratings from their students.  The second deals with college career services – here we did not have an outcome measure like the Globe and Mail’s satisfaction data, so we focussed on a relatively simple question: how do their structure and offerings differ from what we see in the university sector?

Let’s deal with that second question first: college CSOs tend to be smaller and less sophisticated than those at universities of the same size.  At first glance, that seems paradoxical – these are career-focussed organizations, aren’t they?  But the reason for this is fairly straightforward: to a large extent, the responsibility for making connections between students and employers resides at the level of the individual program rather than with some central, non-academic service provider – a lot of what takes place in a CSO at universities takes place in the classroom at colleges.

Now, to universities, and the question: what makes for a good career services department?  To answer this, we interviewed CSO staff at high-, medium-, and low-performing institutions (as measured by the Globe and Mail’s pre-2012 student satisfaction surveys) to try to work out what practices distinguished the high performers.  It turns out that budget, staff size, and location aren’t really the issue.  What really matters are the following:

  • Use of Data.  Everybody collects data on their operations, but not everyone puts it to good use.  What distinguishes the very best CSOs is that they have an effective, regular feedback loop to make sure insights in the data are being used to modify the way services are delivered.
  • Teaching Job-seeking Skills.  Many CSOs view their mission as making as many links as possible between students and employers.  The very best-performing CSOs find ways to teach job search and interview skills to students, so that they can more effectively capitalize on any connections.
  • Better Outreach Within the Institution.  It’s easy to focus on making partnerships outside the institution.  The really successful CSOs also make partnerships inside it.  One of the key relationships to nurture is with academic staff.  Students, for better or for worse, view profs as frontline staff, and ask them lots of questions about things like jobs and careers.  At many institutions, profs simply aren’t prepared for questions like that, and don’t know how to respond.  The best CSOs take the time to reach out to faculty and partner with them, ensuring they have tools at their disposal to answer those questions and to direct students to the right resources at the CSO.

If you want better career services, there’s your recipe.  Bonne chance.

May 12

Non-Lieux Universities: Whose Fault?

About four months ago, UBC President Stephen Toope wrote a widely praised piece called “Universities in an Era of Non-Lieux”.  Basically, the piece laments the growing trend toward the deracinated homogenization of universities around the globe.  He names global rankings and government micro-management of research and enrolment strategies – usually of a fairly faddish variety, as evidenced by the recent MOOC-mania – as the main culprits.

I’m not going to take issue with Toope’s central thesis: I agree with him 100% that we need more institutional diversity.  But I think the piece fails on two counts.  First, it leaves out the question of where governments got these crazy ideas in the first place.  And second, when it comes right down to it, the fact is that big research universities only care about institutional diversity insofar as it serves their own interests.

Take global rankings, for instance.  Granted, these can be fairly reductionist affairs.  And yes, they privilege institutions that are big on research.  But where on earth could rankers have come up with the idea that research was what mattered to universities, and that big research = big prestige?  Who peddles that line CONSTANTLY?  Who hires based on research ability?  Who makes distinctions between institutions based on research intensity?  Could it possibly be the academic community itself?  Could it be that universities are not so much victims as culprits here?

(I mean, for God’s sake, UBC itself is a member of the “Research Universities’ Council of BC” – an organization that changed its name just a few years ago so its members would be sure to distinguish themselves from the much more lumpen new [non-research-intensive] universities, which caucus in the much less grandly named BC Association of Institutes & Universities.  Trust me – no rankers made them do that.  They came up with this idea on their own.)

As for the argument that government imposes uniformity through a combination of meddling and one-size-fits-all funding models, it’s a point that’s hard to argue with.  Canadian governments are notorious for incentivizing only size and research, and then wondering why every university wants to be larger and more research-intensive.  But frankly, this has traditionally worked in research universities’ favour.  You didn’t hear a lot of U15 Presidents moaning about research monocultures as long as the money was still flowing entirely in their direction.  So while Toope is quite right that forcing everyone into an applied research direction is silly, the emergence of a focus on applied research actually has much greater potential to drive differentiation than your average government policy fad.

So, to echo Toope, yes to diversity, no to “non-lieux”.  But let’s not pretend that the drive to isomorphism comes from anywhere but inside the academy.  We have met the enemy and he is us.

November 07

International Alliances and Research Agreements

In business, companies strive to increase market share; in higher education, institutions compete for prestige.  This is why, despite whatever you’re told by people in universities, rankings are catnip to university administrations: by codifying prestige, they give institutions actual benchmarks against which they can measure themselves.

But prestige is actually much harder to amass than market share.  Markets can increase in size; prestige is a zero-sum affair (my prestige is related directly to your lack thereof).  And universities have fewer tools than businesses to extend their reach.  Mergers are not unheard of – indeed, the pressure of global rankings has been a factor behind a wave of institutional mergers in France, Russia, and Scandinavia – but these tend to be initiated by governments rather than institutions. Hostile take-overs are even less common (though UBC’s acquisition of a campus in the Okanagan shows it’s not impossible).

So, what’s a university to do?  Increasingly, the answer seems to be: “make strategic alliances”.

These tend to come in two forms: multi-institutional alliances (like Universitas 21, the Coimbra Group, and the like), and bilateral institutional deals.  Occasionally, the latter can go as far as ambitious, near-institutional mergers (see the Monash-Warwick alliance, for instance), but it usually consists of much simpler initiatives – MOUs between two institutions, designed to promote co-operation in fairly general terms.  There’s a whole industry around this now – both QS and Thomson Reuters offer services to help institutions identify the most promising research partners.  And signing these MOUs seems to take up an increasing amount of time, effort, and air miles among senior managers.

So it’s fair to ask: do these MOUs make any difference at all to research output?  I have no hard evidence on this, but I suspect that the returns are actually pretty meagre.  While inter-institutional co-operation is increasing all the time, for the most part these links are organic; that is, they arise spontaneously from individual researchers coming up with cool ideas for collaboration, rather than from top-down direction.  And while there’s a lot that governments and institutions can do to promote inter-institutional linkages in general, there’s very little that central administrations can do to promote specific linkages that doesn’t quickly become counterproductive.

Having significant international research links is indeed the sign of a good university – the problem is that for managers under pressure to demonstrate results, organic growth isn’t fast enough.  The appeal of all these MOUs is that they give the appearance of rapid progress on internationalization.  But given the time and money expended on these things, some rigour is called for. This is an area where Board members can, and should, hold their administrations to account, and ask for some reasonable cost-benefit analysis.

November 05

Owning the Podium

I’m sure many of you saw Western President Amit Chakma’s op-ed in the National Post last week, suggesting that Canadian universities need more government assistance to reach new heights of excellence and “own the podium” in global academia.  I’ve been told that Chakma’s op-ed presages a new push by the U-15 for a dedicated set of “excellence funds” which, presumably, would end up mostly in the U-15’s own hands (for what is excellence if not research done by the U-15?).  All I can say is that the argument needs some work.

The piece starts out with scare metrics to show that Canada is “falling behind”.  Australia has just two-thirds of our population, yet has seven institutions in the QS top 100, compared to Canada’s five!  Why anyone should care about this specific cut-off (use the top 200 in the QS rankings and Canada beats Australia 9 to 8), or this specific ranking (in the THE rankings, Canada and Australia each have four spots), Chakma never makes clear.

The piece then moves on to make the case that, “other countries such as Germany, Israel, China and India are upping their game” in public funding of research (no mention of the fact that Canada spends more public dollars on higher education and research than any of these countries), which leads us to the astonishing non-sequitur that, “if universities in other jurisdictions are beating us on key academic and research measures, it’s not surprising that Canada is also being out-performed on key economic measures”.

This proposition – that public funding of education is a leading indicator of economic performance – is demonstrably false.  Germany has just about the weakest higher education spending in the OECD, and it’s doing just fine, economically.  The US has about the highest, and it’s still in its worst economic slowdown in over seventy-five years.  Claiming that there is some kind of demonstrable short-term link is the kind of thing that will get universities into trouble.  I mean, someone might just say, “well, Canada has the 4th-highest level of public funding of higher education as a percentage of GDP in the OECD – doesn’t that mean we should be doing better?  And if that’s indeed true, and our economy is so mediocre, doesn’t that give us reason to suspect that maybe our universities aren’t delivering the goods?”

According to Chakma, Canada has arrived at its allegedly-wretched state by virtue of having a funding formula which prioritizes bums-in-seats instead of excellence.  But that’s a tough sell.  Most countries (including oh-so-great Australia) have funding formulae at least as demand-oriented as our own – and most are working with considerably fewer dollars per student as well.  If Australia is in fact “beating” us (a debatable proposition), one might reasonably suspect that it has at least as much to do with management as it does money.

Presumably, though, that’s not a hypothesis the U-15 wants to test.

October 02

A New Study on Postdocs

There’s an interesting study on postdocs out today, from the Canadian Association of Postdoctoral Scholars (CAPS) and MITACS.  The report provides a wealth of data on postdocs’ demographics, financial status, likes, dislikes, etc.  It’s all thoroughly interesting and well worth a read, but I’m going to restrict my comments to just two of the most interesting results.

The first has to do, specifically, with postdocs’ legal status.  In Quebec, they are considered students.  Outside Quebec, it depends: if their funding comes from internal university funds, they are usually considered employees; but, if their funding is external, they are most often just “fellowship holders” – an indistinct category which could mean a wide variety of things in terms of access to campus services (are they students?  Employees?  Both?  Neither?).  Just taxonomically, the whole situation’s a bit of a nightmare, and one can certainly see the need for greater clarity and consistency if we ever want to make policy on postdocs above the institutional level.

The second – somewhat jaw-dropping – point of interest is the table on page 27, which examines postdocs’ training.

Level of Training Received or Available, in % (The 2013 Canadian Postdoc Survey, Table 3, pg. 27)

[Table not reproduced here; see the report for the full breakdown.]

As the authors note, being trainees is what makes postdocs a distinct group – it’s basically the only thing that distinguishes them from research associates.  So what should we infer from the fact that only 18% of postdocs report receiving any formal training for career development, 15% for research ethics, and 11% on either presentation skills or grant/proposal writing?  If there’s a smoking gun on the charge that Canadian universities view postdocs as cheap academic labour, rather than as true academics-in-waiting, this table is it.

All of this information is, of course, important; however, this study’s value goes beyond its presentation of new data.  One of its most important lessons comes from the fact that a couple of organizations just decided to get together and collect data on their own.  Too often in this country, we turn our noses up at anything other than the highest-quality data, but since no one wants to pay for quality (how Canadian is that?), we just wring our hands hoping StatsCan will eventually sort it out for us.

But to hell with that.  StatsCan’s broke, and even when it had money it couldn’t get its most important product (PSIS) to work properly.  It’s time the sector got serious about collecting, packaging, and – most importantly – publishing its own data, even if it’s not StatsCan quality.  This survey’s sample selection, for instance, is a bit on the dodgy side – but who cares?  Some data is better than none.  And too often, “none” is what we have.

CAPS/MITACS have done everyone a solid by spending their own time and money to improve our knowledge base about some key contributors to the country’s research effort.  They deserve to be both commended and widely imitated.

May 22

Bad Arguments for Basic Research

Last week’s announcement that the NRC was “open for business” has, if nothing else, revealed how shockingly weak most of the arguments are in favour of “basic” research.

Opponents of the NRC move have basically taken one of two rhetorical tacks.  The first is to present the switch in NRC mandate as the equivalent of the government abandoning basic science.  This is a bit off, frankly, considering that the government spends billions of dollars on SSHRC, NSERC, CIHR, etc.  Even if you’re passionate about basic research, there are still valid questions to be answered about why we should be paying billions of dollars a year to government departments doing basic research when the granting councils fund universities to ostensibly do the same thing.

The second argument is to say that government shouldn’t support applied science, because: a) it’s corporate welfare, and b) all breakthroughs ultimately rely on basic science, and so we should fund that exclusively.  It seems as though those who take this line have never heard of the Fraunhofer Society, a publicly funded German agency which does nothing but conduct applied research of direct utility to private enterprises.  It’s generally seen as a successful and useful complement to the government’s investments in basic science through the Max Planck Society, and to my knowledge, Germany has never been accused of being anti-science for creating and funding Fraunhofer.

Another point here: the benefits of “basic” research leak across national borders. Very little of the upstream basic research that drives our economy is Canadian in origin.  So while it’s vitally important that someone, somewhere, puts a lot of money down on risky, non-applied research, individual countries can – and probably should – make some different decisions on basic vs. applied research based on local conditions.

The relative benefit of a marginal dollar investment in applied research vs. basic research depends on the kind of economy a country has, the pattern of firm size, and receptor capacity for research.  It’s not an easy thing to measure accurately – and I’m not suggesting that the current government has based its decision on anything so empirical – but it’s simply not intellectually honest to claim that one is always a better investment than the other.

Opposition to the NRC change is clearly – and probably justifiably – coloured by a more general irritation at a host of this government’s other policies on science and knowledge (Experimental Lakes, long-form census, etc).  But that’s still no excuse for this farrago of flimsy argumentation.  Rational policy-making requires us to engage in something more than juvenile, binary discussions about what kind of research is “best”.

May 08

Fundamental Research

“Scientific discovery is not valuable unless it has commercial value” (John McDougall, NRC president, yesterday).

“Discovery comes from what scientists think is important, not what industry thinks is important.  Fundamental scientific advancement drives innovation, and that is driven by basic research.” (David Robinson, CAUT Associate Executive Director, yesterday).

Some days, the level of discourse in Canadian higher education policy seems to be improving.  Other days, like yesterday, it is full of childish, one-dimensional arguments about the nature of science and research – arguments the rest of the world outgrew fifteen or twenty years ago – and I just want to weep.

The concept of “basic” research was essentially invented by Vannevar Bush in his 1945 work, Science: The Endless Frontier.  In order to press for greater funding of university research, Bush made a sharp distinction between “basic” (or “fundamental”) research, “performed without thought of practical ends” at universities, and “applied” research (something to be left to business and the military) that developed from the former.  To have more of the latter, he conveniently argued, you needed more of the former.

But this neat division was a rhetorical device rather than a meaningful scientific taxonomy.  As Donald Stokes pointed out in his book, Pasteur’s Quadrant, outside of theoretical physics there really aren’t many fields of science where scientists knock about “without thought of practical ends”.  Fundamental research often solves very practical problems that industry faces (true for a great deal of research in Engineering, Computer Science, and Chemistry), or has fairly obvious commercial applications (true for much medical research, for instance).  Discovery, as David Robinson says, does come from “what scientists think is important”, but that raises the question: how do they decide what’s important?  The answer, often, is found by interacting with industry and learning what companies think is important.  If that weren’t true, frankly, the contribution of university science to economic growth would be a hell of a lot smaller than it is.

As for the notion that scientific discovery is not valuable without a commercial application: man, that’s some strong ganja they’re smoking on Montreal Road.   Is mathematics worthless because you can’t patent an equation?  Was Galileo just some flâneur because he never made a penny off heliocentrism?  How the hell can you tell, a priori, whether something has a commercial application?  I mean, Rutherford wasn’t thinking about multi-billion-dollar industries in telecommunications, nuclear power, and quantum computing when he did his gold-foil experiments.  Yet all those industries would be non-existent if we still thought atoms were featureless solid spheres.

As a country, our scientific and academic leaders should do better than this.

April 25

The Leiden Rankings 2013

Though it was passed over in silence here in Canada, the new Leiden university research rankings made a bit of a splash elsewhere last week.  I gave a brief overview of the Leiden rankings last year.  Based on five years’ worth of Web of Science publication and citation data (2008-2012), they are by some distance the best way to compare institutions’ current research output and performance.  The Leiden rankings have always allowed comparisons along a number of dimensions of impact and collaboration; what’s new – and fabulous – this year is that the results can be disaggregated into five broad areas of study (biomedical sciences, life & earth sciences, math & computer science, natural sciences & engineering, and social sciences & humanities).

So how did Canadian universities do?

The big news is that the University of Toronto is #2 in the world (Harvard is #1) in terms of publications, thanks mainly to its gargantuan output in biomedical sciences.  But when one starts looking at impact, the story is not quite as good.  American universities come way out in front on impact in all five areas of study – natural, since they control the journals, and they read and cite each other’s work more often than they do that of foreigners.  The UK is second in all categories (except math & computer science); third place in most fields belongs to the Dutch (seriously – their numbers are stunning), followed by the Germans and Chinese, and then (at a distance) by Canada and Australia.   If you look at each country’s half-dozen or so best universities, sixth or seventh is probably where Canada ranks, both in the sub-fields and overall.

Also of interest is the data on collaboration, specifically the percentage of publications with an international co-author.  That Canada ranks low on this measure shouldn’t be a surprise: Europeans tend to dominate it because so many countries sit cheek by jowl.  But the more interesting finding is just how messy international collaboration is as a measure of anything.  Sure, there are some good schools with high levels of international collaboration (e.g. Caltech).  But any indicator where the top schools are St. Petersburg State and King Saud University probably isn’t a clear-cut measure of quality.

Among Canadian schools, there aren’t many big surprises.  Toronto, UBC, and McGill are the big three; Alberta does well in terms of volume of publications, but badly in terms of impact; and Victoria and Simon Fraser lead the way on international collaborations.

If you have even the slightest interest in bibliometrics, do go and play around with the customizable data on the Leiden site.  It’s fun, and you’ll probably learn something.
