HESA

Higher Education Strategy Associates

September 15

Innovation Policy: Are Universities Part of the Problem?

We’re talking a lot about Innovation in Canada these days. Especially in universities, where innovation policy is seen as a new cash funnel. I would like to suggest that this attitude on the part of universities is precisely part of Canada’s problem when it comes to Innovation.

Here’s the basic issue: innovation – the kind that expands the economy – is something that firms do. They take ideas from here and there and put them together to create new processes or services that fill a market need in a way that creates value (there’s public sector innovation too but the “creating value” thing is a bit trickier, so we’ll leave that aside for now while acknowledging it exists and matters a lot).

Among the many places the ideas come from are higher education institutions (HEIs). Not necessarily local HEIs: ideas travel, so Toronto firms can grab ideas from universities in Texas, Tromso or Tianjin as well as U of T. The extent to which they will focus on ideas generated locally has to do not only with the quality of the local ideas, but also with the way the ideas get propagated locally. Institutions whose faculty are active and involved in local innovation networks will tend to see their ideas picked up more often than those whose faculty are not, partly because contact with local firms generates “better” scientific questions and partly because they will have more people paying attention to their work.

But ideas are really only a part of what matters in innovation. Does the business climate encourage firms to innovate? What’s the structure of business taxation? What kind of management and worker skill base exists? What regulations impede or encourage innovation? What barriers to competition and new entrants exist? What kind of venture capital is available? Does government procurement work in favour of or against new products or services? All of this matters in terms of helping to set firms’ priorities and set them on a more-innovative or less-innovative path.

The problem is, all this stuff is boring to politicians and in some cases, requires directly attacking entrenched interests (in Canada, this specifically has to do with protectionism in agriculture, telecoms and banking). It requires years of discipline and trade-offs and politicians hate discipline and trade-offs. If only there were some other way of talking about innovation that didn’t require such sacrifice.

And here’s where universities step in to enable bad policies. They write about how innovation is “really” about the scientific process. How it’s “really” about high tech industries of the future and hey, look at all these shiny labs we have in Canada, wouldn’t it be great if we had more? And then all of a sudden “innovation” isn’t about “innovation” anymore, it’s about spending money on STEM research at universities and writing cheques to tech companies (or possibly to real estate companies to provide a lot of co-working spaces for startups). Which as far as I can tell seems to be how Innovation Minister Navdeep Bains genuinely approaches his file.

Think I’m exaggerating? Check out this article from Universities Canada’s Paul Davidson about innovation in which the role of firms is not mentioned at all except insofar as they are not handing enough money to universities. Now, I get it: Paul’s a lobbyist and he’s arguing his members’ case for public support, which is what he is paid to do. But what comes across from that article is a sense that for universities, l’innovation, c’est nous. Which, as statements of innovation policy go, is almost Nickelbackian in its levels of wrongness.

I don’t think this is a universal view among universities, by the way. I note SFU President Andrew Petter’s recent article in the same issue of Policy magazine which I think is much clearer in noting that universities are only part of the solution and even then, universities have to get better at integrating with local innovation networks. And of course colleges, by putting themselves at the more applied end of the spectrum, are inherently aware that their role is as an adjunct to firms.

Universities are a part – a crucial part, even – of innovation systems. But they are a small crucial part. Innovation Policy is not (or should not be, anyway) code for “industrial policy in sci/tech things universities are good at”. It is (or should be) about firms, not universities. And we all need to remember that.

September 14

The Canadian Way of Study Abroad

A few years ago, I think around the time that HESA Towers ran a conference on internationalization, I realized there was something weird about the way Canadian higher education institutions talked about study abroad.  They talked about it as helping students “bridge the gap between theory and practice”, “increasing engagement”, and “hands-on learning”.

That’s odd, I thought.  That sounds like experiential learning, not study abroad.  Which is when it hit me: in Canada, unlike virtually everywhere else in the world, study abroad to a large degree is experiential learning.

In Europe, when they say study abroad, they mostly mean study at a foreign institution in the same field through the Erasmus program.  In the US, they may mean this, or they may mean studying in facilities owned by their home universities but located in different countries.  For instance, Wake Forest owns a campus in Venice, Webster University has a campus in Leiden, University of Dallas has one in Rome (have a browse through this list). Basically, if your students are paying megabucks to be at a US campus, the idea is you can’t just give them exchange semesters at some foreign public school because who knows about the quality, the safety, etc.

But look at how Canadian institutions showcase their study abroad: McGill talks up its science station in Barbados.  University of Alberta showcases its international internships.  University of Saskatchewan has a fabulous little nursing program which ties together practicums in East Saskatoon and Mozambique.  The stuff we like to talk about doesn’t seem to actually involve study in the sense of being in a classroom, per se.  That’s not to say our universities don’t have typical study-abroad programs: we’ve got thousands of those.  They’re just not where the sizzle is.  It’s a distinctly Canadian take on the subject.

This brings me to a point about measuring the benefits of study abroad.  Let’s take it for granted that being abroad  for a while makes students more independent, outward-looking, able to problem-solve, etc.  What is it, exactly, about being abroad that actually makes you that way?  Is it sitting in classes in a foreign country?  Is it meeting foreign people in a foreign country?  Is it meeting people from your own culture in a foreign country (too often the main outcome of study abroad programs)?  What about if you actually get to work in a foreign country? And – crucially for the design of some programs – how long does it take for the benefits to kick in?  A week?  A month?  A year?  When do diminishing returns set in?

Despite study-abroad being a multi-billion dollar niche within higher education, we actually don’t know the answer to many of these questions.  There isn’t a lot of work done which picks apart the various elements of “study abroad” to look at relative impact.  There is some evidence from Elspeth Jones in the UK that many of the benefits actually kick in after as little as 2-4 weeks, which suggests there may be cheaper ways of achieving all these purported benefits.

Of course, one of the reasons we have no answers to this is that it’s pretty hard to unpack the “treatment” involved in study abroad.  You can’t, for instance, randomly assign people to a program where they just sit in class, or force people to make friends among locals rather than among the study-abroad group.  But, for instance, it would be possible to look at impacts (using some of the techniques we talked about yesterday) based on length of study abroad period.  It would be possible to compare results of programs that have students mostly sit in class to ones where they do internships.  It would be possible to examine internships based on whether or not participants actually made friends among local students, a question not asked enough in study evaluation work.  It would also be possible to examine this based on destination country: are the benefits higher or lower depending on proficiency in the destination country’s language?

These questions aren’t easily answerable at the level of an individual institution – the sample size on these would simply be too small.  But one could easily imagine a consortium of institutions agreeing to a common impact assessment strategy, each collecting and sharing data about students and also collectively collecting data on non-mobile students for comparative purposes (again, see yesterday’s blog), perhaps through the Canadian Undergraduate Survey Consortium.  It would make a heck of a project.

If anyone’s interested in starting a consortium on this, let me know.  Not only would it be fun, but it might help us actually design study abroad experiences in a more thoughtful, conscious and impactful way.  And we’d find out if the “Canadian Way” is more effective than more traditional approaches.

Worth a try, I think.

September 13

Measuring the Effects of Study Abroad

In the higher education advocacy business, an unhappily large proportion of the research used is of the correlation = causation type.  For instance, many claim that higher education has lots of social benefits like lower crime rates and higher rates of community volunteering on the grounds that outcomes of graduates are better than outcomes of non-graduates in these areas.  But this is shaky.  There are very few studies which look at this carefully enough to eliminate selection bias – that is, that the people who go to higher education were less disposed to crime/more disposed to volunteering to begin with.  The independent “treatment” effect of higher education is much more difficult to discern.

This applies in spades to studying the question of the effects of study abroad.  For instance, one widely quoted study of the Erasmus program showed that five years after graduation, unemployment rates for graduates who had been in a study-abroad program were 23% lower than for those who had not.  But this is suspect.  First of all, “23% lower” actually isn’t all that much for a population where unemployment is about 5% (it means one group has unemployment of 4% and the other 5%, more or less).  Second of all, there is a selection bias here.  The study-abroad and non-study abroad populations are not perfectly identical populations who differ only in that they have been given different “treatments”: they are different populations, one of which has enough drive and courage to pick up sticks to move to another country and (often) study in another language.  It’s quite possible they would have had better employment outcomes anyways.  You can try to limit bias by selecting a control group that mimics the study-abroad population in terms of field of study, GPA, etc., but it’s not perfect and very few studies do so anyway (a very honourable mention here to the GLOSSARI project from Georgia headed by Don Rubin).
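The relative-versus-absolute distinction is easy to trip over, so here is a minimal sketch of the arithmetic (using illustrative round numbers, not figures from the study itself) showing how a “23% lower” headline can shrink to a gap of barely one percentage point:

```python
# Illustrative numbers only: a "23% lower" relative unemployment rate
# amounts to a small absolute gap when baseline unemployment is low.
baseline = 0.05                   # non-mobile graduates: 5% unemployed
mobile = baseline * (1 - 0.23)    # study-abroad graduates: 23% lower
absolute_gap = baseline - mobile  # gap in percentage points

print(f"mobile group unemployment: {mobile:.2%}")        # 3.85%
print(f"absolute gap: {absolute_gap * 100:.2f} points")  # 1.15 points
```

The headline figure is a ratio of two small numbers, which is exactly the situation where relative differences sound far more impressive than they are.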

(Before we go any further: no, I don’t think employability skills are the only reason to encourage study abroad.  I do however think that if universities and colleges are going to frame their claim for more study abroad in economic terms – either by suggesting students will be more employable or making more general claims of increasing economic competitiveness – then it is incumbent on them to actually demonstrate some impact.  Claiming money on an economic imperative and then turning around and saying “oh, that doesn’t matter because well-rounded citizen” doesn’t really wash.)

There are other ways of trying to prove this point about employability, of course.  One is to ask employers if they think study abroad matters.  They’ll usually say yes, but it’s a leap of faith to go from that to saying that study abroad actually is much of a help in landing a job.  Some studies have asked students themselves if they think their study abroad experience was helpful in getting a job.  The answer is usually yes, but it’s hard to interpret what that means, exactly.

Since it’s difficult to work out directly how well internationalization is helping students get jobs, some people try to look at whether or not students get the skills that employers want (self-discipline, creativity, working in teams, etc).  The problem with this approach, of course, is that the only real way to do this is through self-assessment, which not everybody accepts as a way to go (but in the absence of actual testing of specific skills, there aren’t a whole lot of other options).  Alternatively, if you use a pre-post evaluation mechanism, you can at least check on the difference in self-assessment of skills over time, which might then be attributed to time spent in study abroad.  If that’s still not enough to convince you (if, for instance, you suspect that all students’ self-assessments would go up over the space of a few months, because all students are to some degree improving skills all the time), try a pre-post method with a control group, too: if both groups’ self-assessments go up, you can still measure the difference in the rate at which the self-reported skills increase across the two groups.  If they go up more for study-abroad students than for stay-at-homes, then the difference in the rates of growth can, cautiously, be attributed to the study abroad period.
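The pre-post-with-control-group logic above boils down to a simple difference-in-differences calculation.  A sketch with made-up scores (the numbers and the 1-10 scale are purely hypothetical, for illustration):

```python
# Hypothetical self-assessment scores on a 1-10 scale, measured before
# and after a semester.  Both groups improve; the question is whether
# the study-abroad group improves *more* than the stay-at-home group.
abroad_pre, abroad_post = 6.0, 7.2
control_pre, control_post = 6.1, 6.5

abroad_gain = abroad_post - abroad_pre      # 1.2 points
control_gain = control_post - control_pre   # 0.4 points

# Difference-in-differences: the extra gain that can, cautiously,
# be attributed to the study-abroad period itself.
did = abroad_gain - control_gain
print(f"difference-in-differences: {did:.1f} points")  # 0.8
```

The control group soaks up the “everyone improves anyway” effect, which is exactly the objection the pre-post-only design can’t answer.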

Basically: measuring impacts takes time, and is complicated.  And despite lots of people in Canada avowing how important outbound mobility is, we never seem to take the time, care and expense to do the measurement.  Easier, I suppose, to rely on correlations and hope no one notices.

It’s a shame really because I think there are some interesting and specifically Canadian stories to tell about study abroad.  More on that tomorrow.

September 12

How Many Canadian Students Study Abroad? How Many Should?

If you look at the current issue of Policy Options, there is a startling claim made in the sub-headline of an article by Universities Australia CEO Belinda Robinson; namely, that “five times as many Australian undergraduates are studying abroad as their Canadian counterparts”.  It’s not a claim Robinson herself makes – it seems likely that it’s been added by the editorial staff at Policy Options.  The problem is it’s not correct.

The Canadian numbers come from a periodic survey Universities Canada does of its members on internationalization (the last example of this is here).  The last time they did the survey they found that 2.6% of students did a “for-credit” international experience, and another 0.5% did a non-credit course: total, 3.1%.  That’s if you believe universities can actually keep track of this stuff; my guess is there’s a substantial number who can’t or don’t capture this data well – particularly in those cases where students are arranging foreign experiences on their own, so this number is likely at least a bit of an undercount.

Now, in the aforementioned article Robinson noted that over 16% of Australian students had an overseas experience of some kind.  Someone, clearly, took that 16%, divided it by the Canadian 3% and voila!  Over 5 times more!  Except this is apples and oranges: the Canadian figure refers to students who go abroad in any given year while the Australian figure is the percentage who go abroad at some point in their career.   We don’t actually know what percentage of Canadian undergraduates go abroad over the course of their degree.  Back in the days when we at HESA Towers used to run a national student panel, we found that 8% of current students (in the panel, at any rate, which was skewed to upper-year students and hence should come closer to the Australian picture) had had some kind of exchange experience.

(This is one of those things we could answer pretty easily if Statscan put a question about it in the NGS, or if CUSC put it in their triennial surveys of graduating students.  Hint hint.)

Wonky figures aside, Australia does seem to have been doing something right over the past few years, having quadrupled its out-bound student flow since 2000, and perhaps Canada should be emulating it.  But there is a genuine question here about what the “right” number of students going abroad is.  What should our target be?  I’ve seen serious commentators say we shouldn’t be looking at Australia, but rather Germany, where something like 30% of students go abroad at some point (actually, it’s that 30% of “upper-year” students have gone abroad, based on a survey of students – on which measure Canada is, as noted earlier, at about 8%).

This strikes me as a bit pie-in-the-sky.  For a German student, the marginal cost of studying elsewhere in the EU is fairly low; over two-thirds of undergraduates in Germany already live alone (source – the excellent Eurostudent website), and they have a host of potentially awesome international destinations within a couple of hundred dollars’ transportation fare by budget airline.  In Canada, a greater percentage of students live with their parents, so the average marginal cost is going to be higher, not to mention the fact that most of the international destinations we care about (we’re inconsistent about whether or not to call the US an “international experience”; mostly we mean Europe and Asia) are a couple of thousand dollars away, and unless you live in Toronto or Vancouver, the costs of living abroad on one’s own are likely to be somewhat higher than they are back here.   So it’s overall a much more expensive proposition for Canadians than Germans.

And what are the benefits of study abroad?  Anything that might justify the extra expense?  Well, I’ll get into this in some length over the next couple of days, but ask yourself: when’s the last time you heard about a recent graduate getting or losing a job because of having/not having an international experience?  Exactly.  Whatever you might be able to get a corporate exec to say re: the need for global competencies blah blah blah, Canadian employers, be they in the private or public sector, simply don’t seem to care that much about international experiences (lest you think I am being harsh on Canada and its complacency in international affairs, I urge everyone to read Andrea Mandell-Campbell’s book Why Mexicans Don’t Drink Molson.  tl;dr: too often Canadians believe the hokey “The World Needs More Canada” line when in fact the reverse is usually true).

I think it will be hard to make the financial case for raising our rate of outbound mobility, simply because neither students nor governments will put money into this kind of project if there aren’t clear signals from the labour market that the return will balance the expense.  Instead, study abroad will remain what it too often is now: a holiday somewhere nice.  For all the talk of study abroad as an inter-cultural experience, it is striking how many students take their study abroad in the US, the UK, Australia or France: in cultures not dissimilar from their own.

So how can we measure and sell the benefits of study abroad?  Tune in tomorrow.

September 09

Some Intriguing New UK Access Data

The UK’s Higher Education Statistics Agency (also known in these parts as “the other HESA”) put out an interesting report recently on participation in higher education in England (available here).  England is of course of great interest to access researchers everywhere because its massive tuition hike in 2012 is a major natural policy experiment: if there is no clear evidence of changes in access after a tuition hike of that magnitude then we can be more confident that tuition hikes elsewhere won’t have much of an effect either (assuming students are all given loans to cover the fees as they are in England).  I’ve written previously about some of the evidence that has come out to date back here, here, here and here: mostly the evidence has shown little to no effect on low-income students making a direct transition to university, but some effects on older students.

The new (other) HESA report is interesting.  You may have seen the Guardian headline on this, which was that since the change in fees, the percentage of state school students who proceeded to higher education by the age of 19 fell from 66% to 62% in the years either side of the policy change (note: regular state-school students make up a little over 83% of those enrolled in A-level or equivalent courses, with the rest split about equally between selective state schools and independent schools).  On the face of it, that’s a pretty bad result for those concerned about access.

But there are three other little nuggets in the report which the Guardian chose to ignore.  The first was that if you looked simply at those who took A-levels, the drop was much smaller (from 74% to 72%).  Thus the biggest drop was from those taking what are known as “A-level equivalents” (basically, applied A-levels).  The second is that among the very poorest students – that is, those who receive free school meals, essentially all of whom are in the main state sector – enrolment rates essentially didn’t move at all.  They were 21% in 2011/12, 23% in 2012/13 and 22% in 2013/14. All of this is a long way up from 13% observed in 2005, the year before students from families with incomes below £20,000 had to start paying tuition.  Third and last, the progression rate of state school students to the most selective institutions didn’t change at all, either.

So what this means is that the decline was concentrated not on the poor in state schools but on the middle-class, and landed more on students with “alternative” credentials.  That doesn’t make a loss of access any more acceptable, but it does put a crimp in the theory that the drop was *caused* by higher tuition fees.  If “affordability” (or perceived affordability) were the issue, why would it hit middle-income students more than lower-income students?  If affordability were the issue, why would it be differentially affecting those taking alternative credentials?  There are some deeper questions to answer here.

 

September 08

Trends in Canadian University Finance

New income and expenditure data on Canadian universities came out over the summer courtesy of StatsCan and our friends over at the Canadian Association of University Business Officers (CAUBO), so today it’s time to check in on the latest financial trends.

In 2014-15, income at Canadian universities was, overall, a record $35.5 billion (just above 2% of GDP, if you’re counting).  That’s up 1% in real terms over the previous year and up 5% on five years ago (2009-10).  But the composition of that income is changing.  Total government income is down 2% in real terms from last year and down 8% from 2009-10 (the latter being somewhat exaggerated because the base year included a lot of money from the 2009 budget stimulus via the Knowledge Infrastructure Program (KIP)).  Income from student fees, on the other hand, was up 5% on the previous year and up 32% from 2009-10, again taking inflation into account.  That doesn’t mean that fee levels increased that much; this is aggregate income so part (maybe even most) of this change comes from changes in domestic and (more pertinently) international enrollment.
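For readers who want to replicate the “real terms” adjustment: nominal income is restated in constant dollars using a price index before growth is computed, so a healthy-looking nominal increase can still be a real-terms decline.  A minimal sketch, with made-up numbers rather than the actual StatsCan/CAUBO figures:

```python
# Made-up figures: restate start-year income in end-year dollars using
# a CPI index, then compute real growth.  A 10% nominal rise shrinks
# to roughly 0.5% real growth if prices rose about 9.4% over the period.
nominal_start, nominal_end = 100.0, 110.0  # $ billions (illustrative)
cpi_start, cpi_end = 100.0, 109.4          # price index (illustrative)

real_start = nominal_start / cpi_start * cpi_end  # in end-year dollars
real_growth = nominal_end / real_start - 1

print(f"real growth over the period: {real_growth:.1%}")  # 0.5%
```

The same deflation step lies behind every “after inflation” figure in this post; only the index values and years change.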

Figure 1: Change in Real Income by Source, Canadian Universities, 2014-15 vs 2013-14 and 2009-10


Let’s turn to a look at expenditures by type.  Salary mass for academic staff actually fell slightly last year after inflation, but over five years the overall salary budget for academics is up by 10%, after inflation. Again, this isn’t what’s happening to average salaries, it’s what’s happening to aggregate salaries, so it’s partially a function of average salaries and partially a function of staff numbers.  For non-academic salaries, it’s an 11% increase over five years.  And yes, you’re reading that right: labour costs have risen 10% while income has risen only 5%.  Again, that’s exaggerated a bit by fluctuations in incoming funds for capital expenditures, but it’s probably not sustainable in the long term.  Because other elements of the budget are increasing quickly too: for instance, scholarship expenditures rose by 21% over that period to stand at over $1.87 billion.

 Figure 2: Change in Real Expenditures by Type, Canadian Universities, 2014-15 vs 2013-14 and 2009-10


Finally, let’s take a look at expenditures by function within the operating budget.  Operating budgets as a whole are actually up quite a bit – 14% (this is partially offset by falls in the capital and research budgets).  Here’s how the money gets used:

 Figure 3 – Division of Canadian Universities’ Operating Budgets by Expenditure Function, 2014-15


As you’d expect and hope, the lion’s share (57%) of the operating budget goes to instruction and non-sponsored research.  Most of the rest goes on three categories: administration, student services, and physical plant.  Figure 4 shows how growth in each of these areas has differed.

Figure 4: Change in Real Expenditures by Function, Canadian Universities, 2014-15 vs 2013-14 and 2009-10


If you look at the “big four” spending areas, instructional and admin costs rose at roughly the same rate over five years (14% vs. 15%), while student services rose more quickly (21%) and physical plant less so (7%, with a 4% drop in the last year).  Non-credit instruction is up very strongly for reasons I cannot quite fathom.  But look at computing costs (up 31%) and “External Relations” (which includes Government Relations, alumni relations/fundraising and other marketing costs – up 27%).

In sum: i) government funding is down in real dollars but student income has replaced that income and more besides, so that institutional budgets are still increasing at inflation +1% per year; ii) compensation budgets (academic and non-academic alike) are rising faster than income, which is a problem for the medium term; and iii) there are a lot of small-ish budget items that are growing much more quickly than salaries (scholarships, computing, student services etc.), but given that compensation is 60% of the total budget, that’s still where the majority of the restraint needs to happen.

September 07

Unpleasantness at Brock

So, everybody is talking about the kerfuffle at Brock: yet another presidential hire gone wrong, though this time the slamming-on-the-brakes happened before the hire actually started working, which I suppose is progress.

What actually happened?  At the moment, here’s what we know for sure:  Wendy Cukier, a former VP at Ryerson, was offered the President’s job at Brock in December 2015 with a start date of September 1.  She was undergoing what seemed to be a normal transition, starting to meet with faculty, up until a few weeks ago when meetings suddenly ceased.  On Monday August 29th, news emerged that Cukier and the Board had mutually agreed to suspend the appointment, and to look for a new President.  Cukier returned to her professorial position at the Ted Rogers business school at Ryerson.

Now, no one has yet actually asserted in print that the reason for the “mutual” change of heart is a report about Cukier’s alleged bullying of staff while at Ryerson, but many news outlets have reported that an inquiry into such allegations took place, and by putting two facts side by side the journalists clearly expect the reading public to make that leap.  The inquiry into those allegations is said to have occurred in late 2015 (i.e. around the time Cukier’s appointment at Brock occurred), and an investigative report into the allegations is reported to have been received by Ryerson in January 2016 (i.e. after the appointment).  We don’t know what the inquiry’s report said, and news outlets have been careful to avoid directly stating that there was any connection between the two.

Brock, obviously, is a bit screwed now.  Their interim President is the VP Finance & Administration (not an academic and a former CFL player to boot, which has made the faculty union extremely sniffy in an oh-my-God-what-will-other-universities-think-of-us kind of way, which is frankly juvenile).  The some-say acting, some-say interim Provost is an outsider: Martin Singer, the Arts Dean from York who is best remembered for deciding to allow Saudi males to not study with girls in the name of religious accommodation. The VP Research is also interim.  It’s going to be a tough two years working to sort this out.

To the extent anyone is talking about the general implications, there seem to be three.  First, some people have posited that gender is an issue in the affair.  On the facts of this particular case that seems a stretch.  It is however undeniable that recent university President “departures” (let’s call them that) have been disproportionately female (Leavitt at King’s, Ghazzali at UQTR, Lovett-Doust at Nipissing, Scherf at Thompson Rivers, Woodsworth at Concordia, Busch-Vishniac at Saskatchewan, Hitchcock at Queen’s and now Cukier), at least compared to the mostly male population of university presidents.  I’d argue that – contra Jennifer Berdahl and the view that only alpha male behaviour is rewarded in universities – there’s a disproportionate number of individuals in that group who were let go precisely because they were too alpha.  If there’s a gender case to be made here, it might be about what kinds of leadership styles get women promoted to decanal and vice-Presidential positions in the first place.

Second is the role of non-disclosure agreements (NDAs), which again are getting in the way of Everyone’s Right to Know Every Last Detail (though to be honest, Brock’s Board of Governors has 27 members and I’m willing to bet that that’s too many to keep a secret for long).  NDAs don’t get a lot of favourable press and some say they should be done away with, but it’s hard to see how that’s possible. If someone is being let go for some reason that reflects badly on them but which is short of being “with cause”, you can either pay them a small amount of money now and have them leave quietly (I’m actually a bit surprised no one has yet commented publicly on whether there was a payout and if so how big it was), or you can trash them publicly and pay a lot of money after the inevitable lawsuit.  As public institutions, I don’t think universities and colleges have a lot of flexibility on that point.

The third implication people are drawing from this is that here again we have Another Failed Board Search, Why Can’t Boards Get Things Right, Need for Immediate Governance Overhaul, etc.  But I think this is overdone. The Brock University Board Chair has gone on record saying his university “did not know” about the Ryerson report (there is no word about when Brock became aware of it).  But unlike one or two Presidential searches I’ve heard of, Brock actually *did* its homework and interviewed quite a few people about Cukier.  It’s just that, as far as we know, no one at Ryerson told them about the results of the inquiry, presumably because it was a “personnel matter” and hence confidential.  If staff at Ryerson knew about the issue and withheld information from the Brock search committee, that’s hardly something the Brock board can be blamed for.  Sometimes bad things happen even if you do everything by the book.

Finally, let me stress that we don’t yet know the full story.  Maybe we never will.  The staff allegations at Ryerson might only be a small part of the issues involved.  Keep an open mind.  There’s probably more to come.

September 06

Announcements

Guys!  I’ve got it solved!  This whole funding thing!

You know how Liberal MPs are taking up the entire back-to-school season with on-campus announcements of Strategic Investment Fund (SIF) money?  It’s annoying, right?  I mean, this money isn’t some “favour” delivered through hard work and pork-barrelling by the local MP.  It’s technocratically-determined funding decided upon by a professional public service.  And yet all the universities and colleges have to go through this rigmarole, saying “thank you” to the local MP, and having pictures taken that can be used ad nauseam in local media.

OK, I get it.  Politicians need to get “credit”, and it’s not just about personal political advantage (though I suppose that never goes amiss).  It’s important that the public knows how their money is spent, and media “events” help with that process.  To that extent, it’s perfectly legitimate.  But why is it legitimate for some types of spending and not others?  Why do the feds get heaps of publicity for a few hundred million dollars when provinces hand out over a billion dollars a month, year in and year out?

That’s not a novel observation on my part, or anything.  Everyone has had this discussion, of course.  It hasn’t exactly passed unnoticed that announcements of capital projects (especially ribbon-cuttings) get more fanfare than announcements of operating grants.  And there’s a too-smug, too-certain line everyone knows: if only we could do ribbon-cuttings for operating grants, politicians would give money for those, too.

Now, there’s at least some truth to this.  Relative to operating grants, universities and colleges have been getting more money for capital these past fifteen years or so.  And presumably the ability to get good press out of announcing such funding has at least some small role to play in it.

But do we really know that we can’t hold media events for operating grant announcements?  Or have we just never tried?

I mean, clearly, the fact that the money has already been announced is no barrier to getting media out to events.  Every last dime of SIF was announced weeks ago.  Hell, last week the Science Minister showed up at Humber College to re-announce changes to the Canada Student Loans Program that had not only been announced five months earlier but had actually gone into effect four weeks previously.  Timeliness and novelty are clearly not the issue.

Some people might say: “ah, well, you can’t announce operating grants because they aren’t new.”  But this is small-time thinking.  There’s almost always a part of the funding that is new, even if it’s only 1 or 2%.  And what that money is funding changes quite a bit every year.  One year it might be buying RECORD LEVELS OF ENROLLMENT, and in another SIXTY NEW PROFESSORS AND A NEW CENTRE FOR STUDENTS WITH DISABILITIES.  Tie it in with some kind of re-announcement about new goals, multi-year agreements, whatever, and you’ve got yourself a bona fide news event.

Not a ribbon cutting, maybe, but a reason for provincial politicians and institutional officials to be pleasant to one another in public, to explain to the electorate what their money is buying, and have some photos taken.  And who knows?  If people are right that positive media is what begets more capital funding announcements, maybe it’ll help bring operating grants back up a bit too.

So come on, institutional government-relations types and provincial media-flack types.  It can’t be beyond your wit to organize some media for all that massive public investment.  Give it a try.  It can’t be any less legitimate than this interminable parade of SIF announcements to which we’re currently being subjected.

September 02

New Thoughts on Innovation Policy

A new book on innovation policy came out this summer from a guy by the name of Mark Zachary Taylor, who teaches at Georgia Tech.  The book is called The Politics of Innovation: Why Some Countries are Better Than Others at Science and Technology, and to my mind it should be required reading for anyone interested in following Canada’s innovation debate.

First things first: how does Taylor measure how “good” a country is at Science & Technology?  After all, there are lots of ways of measuring inputs to tech, but not many on the outputs side.  The measure Taylor selects is patents.  And yes, this is highly imperfect (though it does correlate reasonably well with other possible measures, like multi-factor productivity), but Taylor doesn’t over-egg the data.  For most of the book, his interest is less in scoring countries and running regressions to explain the scores than in grouping countries into fairly wide buckets: “most innovative”, “mid-level innovative”, and “rapid innovators” showing rapid progress, like Korea and Taiwan.  Canada – probably to the surprise of anyone who follows innovation policy in our country – comes up as one of the “most innovative”, along with Japan, Germany, Sweden and Switzerland.  This may either be a sign of us being too tough on ourselves, or of Taylor being out to lunch (I’m a bit unsure which, to be honest).

But put that aside: what’s important about this book is that it provides a good, critical tour d’horizon of the kinds of institutions that support innovation (research universities, patent protection, etc.) and explicitly rejects the idea that “good institutions” are enough to drive innovation forward.  This seems to me to be quite important.  Much of the innovation commentariat loves playing the game of “look-at-that-institution-in-a-country-I-think-does-better-than-us-we-should-really-have-one-of-those” (think Israel’s government-sponsored venture capital funds, for instance).  The riposte to this is usually “yeah, but that’s sui generis, the product of a very special set of political/institutional factors, and would never work here”.  And that’s true as far as it goes, but Taylor goes a bit further than that.

First, he focuses on how open a country is to both inward and outward flows of knowledge and human capital.  Obviously, higher education plays some role here, but on an economy-wide basis, the real question is: are firms sufficiently well-networked that they can effectively hire abroad or learn about market opportunities in other countries?  Taiwan and Israel have worked this angle very effectively, cultivating ties with targeted groups in the United States and elsewhere (my impression is that Canada does not do this at anything near the same level – one wonders why not).

Second, Taylor doesn’t just stop at asking the question of how nations innovate (answer: they design domestic institutions and policies to lower transaction & information costs, distribute and reduce risk, and reduce market failures in innovation).  He also tries to get at the much more interesting question of why countries innovate.  Why do Finns innovate like mad and Norwegians not?  Why Taiwan and not the Philippines?  Or, for that matter, why the US and not us?  Institutions play some role here, but it’s not the whole story.  Culture matters.

Or, in Taylor’s telling: perceptions of internal and external threat matter.  His argument is that everywhere, the urge to innovate is countered by the wailings of bereavement from those who lose from technological innovation.  In many countries, the political power of losers is sufficient to create a drag on innovation.  Only in places where the country feels an existential threat (e.g. Israel, Taiwan) do political cultures feel they have the necessary social license to ignore the losers and give the innovators free rein.   Taylor calls this “creative insecurity”.

I have to say I don’t find this last bit entirely persuasive.  The bit about losers having too much power is warmed-over Mancur Olson with a tech-specific focus (Taylor goes to some length to say it’s not, but really it is).  And while the second part is a plausible explanation for some places – Singapore, say – his attempt at formalization requires some serious torquing of the data (Finland cannot credibly be described as being under external threat) and/or some very odd historical interpretations (Taylor’s view that Israel was under greater external threat after 1967 than before it would probably not be accepted by many Middle East specialists).

That said, it arguably does explain Canada.  Our resource base gives us an undeniable cushion that other advanced countries lack.  We lack external threats (and since the late-90s we lack internal ones too).  Frankly, we’re just not hungry enough to be top-of-the-pack.  Even in parts of the country that should be hungry – Nova Scotia, for example – there’s simply not that much appetite to sacrifice dollars spent on social policy to make investments in innovation.  See, for instance, the carping over Dalhousie’s participation in MIT’s Regional Entrepreneurship Acceleration Program.

Say it softly: as a country, we might not be cut out for this innovation stuff.  Sure, we like spending money on gee-whizzy tech things and pretending we’re at the cutting edge of this, that or the other, but it’s a long way from that to actually being innovative.  Innovation is tough.  Innovation causes pain and requires sacrifice.  But Canadians prefer comfort to sacrifice:  we can’t get rid of harmful dairy monopolies, our national dress is fleece, etc.

Anyways, read the book.  And have a great weekend.

September 01

Reminder: Hot Jobs and Hot Careers Never Last

There was an interesting piece in the National Post last week about unemployed professionals in the Alberta oil and gas industry.  Amidst the occasional whine about the oil industry being so unloved by the rest of Canada, there is a serious article about what happens to people in specialized professions when the economic tide swings away from that profession.  Some quotes:

Philip Mulder, spokesman for the Association of Engineers and Geoscientists of Alberta (APEGA), said its geoscientists are faring badly in the current climate because they are predominantly employed in oil and gas…PetroLMI/ENFORM, which tracks labour market trends in the sector, said 145,000 people worked directly in oil and gas extraction, pipelines and services in the province on average in the first four months of 2016, down from 176,000 during 2014. Meanwhile, 17,300 were unemployed on average during the first four months of 2016, up from 6,500 during 2014. The rest left the industry by moving elsewhere or to other sectors, retiring, going back to school.

Lemiski, 33, studied for eight years at the University of Alberta to earn a degree in geology and a master’s in earth and atmospheric sciences.  He started his career six years ago as an exploration geologist at Talisman Energy Inc., but was laid off when the company ran into financial difficulties and cut its exploration program. He found work at Nexen Inc., owned by China’s CNOOC Ltd., as a petrophysicist, but that didn’t last long and he was back on the street in March.

I could go on, but you get the idea.
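For what it’s worth, the PetroLMI/ENFORM figures quoted above carry an implied bit of arithmetic worth making explicit: most of the lost jobs did not show up as unemployment.  A quick back-of-envelope sketch (the four input figures are from the article; the “left the industry” number is simply the residual):

```python
# Back-of-envelope check on the PetroLMI/ENFORM figures quoted above.
# Employment and unemployment figures are averages for the first four
# months of each year, as reported in the article.

employed_2014 = 176_000
employed_2016 = 145_000
unemployed_2014 = 6_500
unemployed_2016 = 17_300

jobs_lost = employed_2014 - employed_2016            # fewer people working in the sector
newly_unemployed = unemployed_2016 - unemployed_2014  # more people counted as unemployed
# Residual: people who moved elsewhere, switched sectors, retired,
# or went back to school.
left_industry = jobs_lost - newly_unemployed

print(jobs_lost, newly_unemployed, left_industry)  # 31000 10800 20200
```

In other words, on these figures roughly two-thirds of the decline is people exiting the industry altogether rather than sitting unemployed within it, which is exactly the point the article is making.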

Times are obviously tough for professionals in this industry and it’s hard not to sympathize with people whose profession is vanishing under their feet.  But let me simply point out that this situation could have been predicted by anyone a) with half a brain and b) who remembers what the early 1990s recession was like in Alberta.  Cyclical industries are just that – cyclical.  They rise and they fall.  That’s fine until people forget that both are equally likely.

And of course, the top of a boom is precisely where people tend to forget that.  One of many examples: the Canadian Society of Exploration Geophysicists in 2011, saying Canada would need 6,000 geophysicists to deal with retirements between then and 2020.  Here’s a Petroleum Labour Market Information report saying that the country wasn’t training enough people, and predicting a shortfall of 1,500-2,000 geologists and geophysicists in Alberta alone by 2022.  And so of course there was a drumbeat across the West: why weren’t those out-of-touch universities doing more to graduate more geologists?

This isn’t the first time something like this has happened, of course.  Back in the late 1990s, the Ontario government fell for the pitch of the IT industry that there simply had to be more computer science graduates in the province or woe, disaster, earthquakes, pestilence, etc.  So the Harris government, which had hitherto done nothing but cut the bejesus out of Ontario universities, decided to spend millions to “double the pipeline” in computer science (i.e. double the intake of students in order to double the number of graduates).  Ontario universities took the money and implemented the plan, which delivered a record number of graduates… who hit the job market in 2002, right in the middle of the worst tech bust in history.  Cue thousands of underemployed engineers, at least for a few years.

My point here is not that universities should ignore the labour market.  That would be silly; most students attend university to get a better job, and ignoring the labour market is tantamount to malpractice.  My point, rather, is that one has to be extremely careful about trying to time the market by shifting enrollment into specific programs.  Any job that’s in high demand today has a good chance of looking very different in five years – the time it takes to design a program and get a new boatload of students through it.  All you end up with is a lot of disappointed and underemployed students.

Pay attention to long-term trends?  Certainly.  Short-term booms?  As little as possible.  Even in Alberta.
