Higher Education Strategy Associates

September 28

International Rankings Round-Up

So, the international rankings season is now more or less at an end.  What should everyone take away from it?  Well, here's how Canadian universities did in the three main rankings (the Shanghai Academic Ranking of World Universities, the QS World University Rankings, and the Times Higher Education rankings).

[Table: how Canadian universities fared year-over-year in the ARWU, QS, and Times Higher rankings]

Basically, you can paint any picture you want out of that.  Two rankings say UBC is better than last year and one says it is worse.  At McGill and Toronto, it's 2-1 the other way.  Universities in the top 200?  One says we dropped from 8 to 7, another says we grew from 8 to 9, and a third says we stayed stable at 6.  All three agree we have fewer universities in the top 500, but they disagree as to which ones are out (ARWU figures it's Carleton, QS says it's UQ and Guelph, and for the Times Higher it's Concordia).

Do any of these changes mean anything?  No.  Not a damn thing.  Most year-to-year changes in these rankings are statistical noise; and this year, with all three rankings making small methodological changes to their bibliometric measures, the year-to-year comparisons are especially fraught.

I know rankings sometimes get accused of tinkering with methodology in order to get new results and hence generate new headlines, but in all cases this year's changes made the rankings better: more difficult to game, more reflective of the breadth of academia, or better at handling outlier publications and genuine challenges in bibliometrics.  Yes, the THE rankings threw up some pretty big year-to-year changes and the odd goofy result (do read my colleague Richard Holmes' comments on the subject here), but I think on the whole the enterprise is moving in the right direction.

The basic picture is the same across all of them.  Canada has three serious world-class universities (Toronto, UBC, McGill) and another handful which are pretty good (McMaster, Alberta, Montreal, and then possibly Waterloo and Calgary).  Sixteen institutions make everyone's top 500 (the U-15 plus Victoria and Simon Fraser but minus Manitoba, which doesn't quite make the grade on QS), and then there's another half-dozen on the bubble, making it into some rankings' top 500 but not others (York, Concordia, Quebec, Guelph, Manitoba).  In other words, pretty much exactly what you'd expect in a global ranking.  It's also almost exactly what we here at HESA Towers found when doing our domestic research rankings four years ago.  So: no surprises, no blown calls.

Which is as it should be: universities are gargantuan, slow-moving, predictable organizations.  Relative levels of research output and prestige change very slowly; the most obvious sign of a bad university ranking is rapid changes in position from year to year.  Paradoxically, of course, this makes better rankings less newsworthy.

More globally, most of the rankings show rises for Chinese universities, which is not surprising given the extent to which their research budgets have expanded in the past decade.  The Times threw up two big surprises: first by declaring Oxford the top university in the world, when no other ranker, international or domestic, has them in first place even within the UK; and second by excluding Trinity College Dublin from the rankings altogether because it had submitted some dodgy data.

The next big date on the rankings calendar is the Times Higher Education's attempt to break into the US market.  It's partnering with the Wall Street Journal to create an alternative to the US News and World Report rankings.  The secret sauce of these rankings appears to be a national student survey, which has never been tried in the US before.  However, getting a statistically meaningful sample (say, the 210-student-per-institution minimum we used to use in the annual Globe and Mail Canadian University Report) at every institution currently covered by USNWR would imply an astronomically large sample size – likely north of a million students.  I can pretty much guarantee THE does not have this kind of sample.  So I doubt that we're going to see students reviewing their own institution; rather, I suspect the survey is simply going to ask students which institutions they think are "the best", which amounts to an enormous pooling of ignorance.  But I'll be back with a more detailed review once this one is released.
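
(For the curious, here is roughly where "north of a million" comes from.  A minimal sketch: the institution count and the response rate below are my own illustrative assumptions, not figures from THE or USNWR.)

```python
# Back-of-the-envelope sample-size check (assumptions flagged below).
completes_per_school = 210   # the Globe and Mail CUR minimum cited above
institutions = 1600          # assumed rough size of the USNWR universe
response_rate = 0.25         # assumed typical response rate for student surveys

invitations = institutions * completes_per_school / response_rate
print(f"~{invitations / 1e6:.1f} million students would have to be surveyed")
# -> ~1.3 million
```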

September 27

Lying With Statistics, BC Edition

A couple of weeks ago I came across a story in the Vancouver Sun quoting a Federation of Post-Secondary Educators of BC (FPSE) “report” (actually more of a backgrounder) which contained two eye-catching claims:

  1.  “per-student operating grants have declined by 20 per cent since 2001 when adjusted for inflation.”
  2.  “government revenues from tuition fees have increased by almost 400 per cent since 2001”

The subtext here is clear.  20% down vs. 400% up?  How heinous!  How awful can the Government of British Columbia be?

Well now.  What to make of this dog’s breakfast?

Let’s start with the second point.  First of all, it’s institutional income, and not government income.  But leaving that aside, there was indeed a very big rise in tuition fees back in 2001-2 and 2002-3 (presumably why the authors chose 2001 as a base…if one used 2003 as a base, it would be a very different and much less dramatic story).   But if you simply look at average university tuition (college tuition is untracked) the increase since 2001 is only 110% (in nominal dollars).  Assume the increase for colleges was a bit higher because they were working from a lower base and perhaps we can nudge that up to 125%.  Still: how does one get from there to 400%?

First, remember that the authors (whoever they may be) are talking about aggregate tuition, not average tuition.  So some of this simply reflects an increase in enrolments.  In 2001-02, there were 196,000 students in BC.  In 2013-14, the last year for which we currently have data, there were 277,515 – an increase of 41%.  Back of the envelope, multiply that by the 110% nominal tuition increase and you get to something like a 176% increase.  Still a ways to go to 400%, though.

Second, a lot of students are moving from lower-cost programs to higher-cost programs.  Some of that is happening within universities (e.g., from Arts to Engineering), but in BC it's mostly a function of colleges turning themselves into universities and charging more tuition.  University enrolment rose from 80,388 to 179,917 while college enrolment fell from 116,007 to 97,698.  That's a lot of extra fees.

Third, BC has a lot more international students than it used to, and they pay more in fees on an individual basis than domestic students do.  Add those two factors together and you get another 19% or so increase in aggregate fees, which brings us to a 210% total increase.
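
As a sanity check on the whole chain: since aggregate fee income is enrolment times average fee, the growth rates compound.  Here's a minimal sketch; the size of the combined program-mix-plus-international uplift is my bracketing assumption, and strict compounding lands a little above the rounded figures in the text, but the punchline is the same either way.

```python
# Aggregate fees = enrolment x average fee, so growth factors multiply.
enrolment_growth = 0.41   # 196,000 -> 277,515 students
avg_fee_growth = 1.10     # ~110% nominal increase in average tuition

for uplift in (0.05, 0.19):   # assumed combined mix + international effect
    factor = (1 + enrolment_growth) * (1 + avg_fee_growth) * (1 + uplift)
    print(f"uplift {uplift:.0%}: aggregate fees up {factor - 1:.0%}")
# uplift 5%:  aggregate fees up 211%
# uplift 19%: aggregate fees up 252%
# To reach a 400% increase (a 5x multiple), the uplift would need to be ~69%.
```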

That’s still nowhere near 400%.  So, I went and checked the source data – Statistics Canada’s Financial Information of Universities and Colleges (FIUC) for Universities (cansim 477-0058 if you’re a nerd) and the Financial Information of Community Colleges and Vocational Schools (cansim 477-0060) to try to find an answer.  Here’s what I found:

[Table: aggregate tuition fee income, BC universities and colleges, 2001-02 to 2013-14]

Yeah, so actually not 400% – more like 207%, reasonably close to the 210% from our back-of-the-envelope exercise.  The best excuse I can come up with for the FPSE number is that if you extend the universities figure out another year (to 2014-15), you get to $1.258B, which is almost four times (actually 3.74x) the 2001-02 figure – which is still only a 274% increase.  But you have to a) torque the living daylights out of the numbers and b) actively confuse percentage increases with multiples to get there.
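
The multiples-versus-increases slip deserves spelling out, since it's doing a lot of work here.  A one-line check (the 2001-02 base below is just the value implied by the 3.74x figure):

```python
later = 1258          # $M: universities' fee income extended to 2014-15
base = later / 3.74   # ~$336M: the implied 2001-02 base
print(f"{later / base:.2f}x the base = a {later / base - 1:.0%} increase")
# -> 3.74x the base = a 274% increase, not 400%
```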

But now let’s move over to the other side of the ledger, where the Federation notes a 20% drop in government support per student, adjusted for inflation.  Let’s note straight off the first inconsistency: they’re adjusting the government grants for inflation and not doing the same for tuition.  Second inconsistency: they’re adjusting the government grants for the size of the student population and not doing the same for tuition.

It's easy to see why FPSE does this.  As we've already noted, student numbers were up by 41% between 2001-02 and 2013-14.  Just do the math: a 20% per-student cut while student numbers rise by 41% actually means that government support has risen by 13%.  In real dollars.  (I went back to the source data myself and came up with 14% – close enough.)  Chew on that for a second: FPSE is ragging on a government which has increased funding for post-secondary education by – on average – 1% over and above inflation every year since 2001-02.
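
That step compresses to one line of arithmetic:

```python
# 20% less per student, but 41% more students:
print(f"total real funding change: {0.80 * 1.41 - 1:+.0%}")  # -> +13%
```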

So quite apart from any problems with the 400% number, FPSE is making a deliberate apples-to-oranges comparison by adjusting only one set of figures for student growth and inflation.  Here's how those numbers compare on a number of different apples-to-apples bases (and I'm being nice to FPSE here by allowing different end dates for fees and grants based on different data availability):

[Table: student fees vs. government grants, compared on consistent apples-to-apples bases]

Now, it seems to me there's enough in the Clark government's record to mount a decent attack without resorting to this kind of nonsense.  It certainly under-invests relative to what it could be doing given the province's growing population.  It carries a faux-populist, pro-extraction-industry line to the detriment of investing in expanding knowledge industries.  It has stayed out of step with most of the rest of the country in the last ten years by not improving student assistance.  And a fair, non-torqued comparison between student fees and government grants still shows students are bearing an increasing share of the cost.

So why stoop to using transparently false figures?  One might expect that kind of chicanery from the Canadian Federation of Students, which has form in this area.  But this is from an organization which represents professors: people who actually use statistics in the search for the truth.  So why is the organization which represents them using statistics in a way that wouldn’t pass muster in an undergraduate course?

I’m quite sure most profs wouldn’t be OK with this.  So why do FPSE’s member locals tolerate it?

September 22

MOOCs at Five

It was five years ago last month that Stanford set up the first MOOC.  MOOCs were supposed to change the world: Udacity, Coursera and EdX were going to utterly transform education, putting many universities out of business.  Time to see how that’s going.

(OK, OK: the term MOOC was actually coined for a 2008 University of Manitoba course led by George Siemens and Stephen Downes.  Technically, using Downes' taxonomy, the 2008 MOOC was a "cMOOC" – the "c" standing for connectivist, if I am not mistaken – while the versions that became popular through Coursera, Udacity, EdX, etc. are "xMOOCs", the difference being essentially that learning in the former is more participative and collaborative, while the latter has more in common with textbooks, only with video.  But the Stanford MOOC is what usually gets the attention, so I'm going to date the phenomenon from there.)

In the interests of schadenfreude if nothing else, allow me to take you back to 2012/3 to look at some of the ludicrous things people said about the likely effects of MOOCs.

  • "In 50 years there will only be 10 institutions in the world delivering higher education" (Sebastian Thrun, former CEO of Udacity)
  • "Higher Education is now being disrupted; our MP3 is the massive open online course (or mooc), and our Napster is Udacity" (Clay Shirky)
  • "Higher education is just on the edge of a crevisse (sic)… five years from now these enterprises (i.e. universities) are going to be in real trouble" (Clayton Christensen)

And of course who can forget that breathless cliché-ridden gem of an op-ed from Don Tapscott, about the week that higher education changed forever in January 2013 (i.e. the week he sat in on a couple of seminars on the subject in Davos).  Classic.

So, five years on, where are the MOOC pioneers now?  Well, Sebastian Thrun of Udacity got out of the disrupting-higher-education business early, after coming to the realization that his company "didn't have a good product"; the company pivoted to providing corporate training.  Over at Coursera, the most hyped of the early pioneers, founders Andrew Ng and Daphne Koller have both left the company (Ng left two years ago for Baidu; Koller left last month for one of Alphabet's biotech enterprises).  Shortly after Koller's departure, Coursera released this announcement, which was widely interpreted as the company throwing in the towel on the higher education market and following Udacity down the corporate training route.

EdX, the platform owned jointly by MIT and Harvard, thus seems to be the last MOOC provider standing.  Perhaps not coincidentally, it is also the one which has (arguably) been most successful in helping students translate MOOC work into actual credits.  It has partnered with Arizona State University in its "Global Freshman Academy", and even allows conversion of some credits towards a specific MIT master's program (conditional on actually spending a semester on campus and paying normal fees to finish the program).  These "micro-master's" credentials seem to be catching on, but precisely because they are "micro", they haven't made a big impact on EdX's overall numbers: its user base is still less than half Coursera's.

So what's gone wrong?  It isn't a lack of sign-ups.  The number of people taking MOOCs continues to grow at a healthy clip, with global enrolments to date now over 35 million.  The problem is that there's no revenue model here.  Depending on whose numbers you're using, the share of users paying for some kind of certification (a fee which is usually priced in double digits) is at best around 3%.  So, work that out: 35 million users, a 3% conversion rate, at $50 per paying user, and you've got a grand total of $52.5 million in revenue.  Over five years.  Using content existing institutions are giving them essentially for free, at a production cost of anywhere between $50,000 and $250,000 per course.
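
The revenue ceiling, exactly as the paragraph computes it:

```python
users, conversion_rate, fee = 35_000_000, 0.03, 50  # figures from above
print(f"${users * conversion_rate * fee / 1e6:.1f}M over five years")
# -> $52.5M over five years, against $50k-$250k production costs per course
```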

This is not sustainable and never was.  Whatever valid points MOOC boosters had about the current state of education (and I think they had more than a few), the proposed solution wasn’t one that met the market test.  The basic problem is (and always has been) that higher education is fundamentally a prestige market.  Distance education is low prestige; distance education which doesn’t give out actual course credit doubly so.  You can disguise this by making delivery the domain of top universities, as EdX and to a lesser extent Coursera did – but top institutions don’t want to diminish their prestige by handing out actual credit to the hoi polloi over the internet.   So what you get is this unsatisfying compromise which in the end not enough people want to pay for.

Some of us said this five years ago (here, here and here) when MOOC-mania was in full flow and critical faculties were widely suspended.  Which just goes to show: higher education is the world’s most conservative industry and the rate of successful innovation is tiny.  Your best bet for imagining what higher education looks like in the future is what it looks like today, only more expensive.

 

September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it's pretty innocuous ("we need better/clearer data"), but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole "national approach" thing. Coates and Auld don't make the argument directly, but they manage to slip a federal role in there. Canada, they write, "lacks a commitment to truly high-level educational accomplishment", needs a "national strategy for higher education improvement", and so "the Government of Canada and its provincial and territorial partners should identify some useful outcomes". To be blunt: no, they shouldn't. I know there is a species of anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates & Auld’s point about the need for better data applies to provinces individually as well as collectively. They all need to get in the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here’s where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It's an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronetcies united by grievances about parking and teaching loads that they really are). But when you look at things like scholarly output, graduation rates, and career outcomes, the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has medical programs and large graduate programs in the hard sciences, it will have greater scholarly output. It's the palette of program offerings, rather than their quality, which makes the difference when making inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we'd find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find them all looking rather similar. Does that mean there's been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada at least) location matters a lot more than educational quality when you're dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his EPRI showed big inter-institutional gaps in graduates' incomes (I will get round to covering Ross' excellent work on the blog soon, I promise). "Ah, interesting!" people said. "Look At The Inter-Institutional Differences, Now We Can Talk Quality." Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (I know this because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).

Now, maybe Coates and Auld have thought all this through and I’m getting nervous for no reason, but their article’s focus on institutional performance when most relevant outcomes are driven by geography, program and selectivity suggests to me that there’s a desire here to impose some simple rough justice over some pretty complicated cause-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but  “grading” institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.

September 19

Counting Sessionals

Much rejoicing last Thursday when Science Minister Kirsty Duncan announced that the federal government was re-instating funding for the University and College Academic Staff System (UCASS), which was last run in 2011.  But what caught most people's attention was the coda to the announcement, which said that Statistics Canada was going to "test the feasibility" of expanding the survey to include "part-time and public college staff" (the "C" in UCASS stands for colleges in the Trinity College sense, not the community college sense, so despite the name, public colleges have never been in the survey).

What to make of this?  It seems that by "part-time", Statscan means sessionals/adjuncts/contract faculty.  That's a bit off, because every university I know of makes a very sharp distinction between "part-time" faculty (many of whom are tenured) and "sessionals".  It makes one worry that Statistics Canada doesn't understand universities well enough to use the correct terminology, which in turn bodes ill for their future negotiations with universities around definitions.

Because let’s be clear about this: universities will do almost anything they can to throw sand in the gears on this.  They do not want data on sessionals in the public eye, period.  Oh sure, in public the Presidents will welcome transparency, evidence-based decision-making, etc.  But institutional research shops – the ones who will actually be dealing with Statscan on this file – are Olympic champions in shutting down government attempts to liberate information.  In fact, that’s arguably their main purpose.  They won’t actually say no to anything – they’ll just argue relentlessly about definitions until Statscan agrees to a reduced program of data collection.  Statscan knows this is coming – they have apparently allocated four years (!!!) for negotiations with institutions, but the safest guess is that this simply isn’t going to happen.

And, to be fair to universities, the kind of data UCASS would provide about sessionals would be pretty useless – a lot of work for almost nothing.  UCASS can count individuals and track their average salaries.  But average salary data would be useless: it would conflate people teaching one course with people teaching five.  And since UCASS has no way to track workload (you'd actually need to blow up the survey and start again if you wanted to get at workload, and as interesting as that might be, good luck getting universities to green-light it), the data is meaningless.  Knowing the number of sessionals tells you nothing about what proportion of undergraduates are being taught by sessionals.  Are 200 sessionals teaching one course each worse than 100 teaching two courses apiece?  Of course not.  But if raw numbers are the only thing on offer, then we'll ascribe meanings to them where they arguably shouldn't exist.

You see, "sessionals" are not really a single phenomenon.  Many are professionals who have full-time jobs and like teaching a class on the side, and they're usually a huge boon to a department (especially in professional faculties like law, nursing and business) because they help expose students to a life beyond academia.  Others are PhD students teaching a class while another professor is away – and thus learning valuable skills.  The "bad" sessional positions – the ones people claim to want to stamp out – are those filled by people who have a PhD and are teaching multiple classes the way professors do.  I suspect this is a pretty small percentage of total sessionals, but we don't know for sure.  And adding sessionals to UCASS won't get us any closer to finding out, because even if they wanted to, universities couldn't collect data on which of their employees have other full-time jobs outside the institution.

Despite all the kumbayahs about how this UCASS expansion is going to promote "evidence-based decision-making", I'm genuinely having trouble imagining a single policy problem where the new data would make a difference.  Universities already know how many sessionals they employ and whether the numbers are going up or down; UCASS might let them know how many sessionals other universities employ, but frankly, who cares?  It's not going to make a difference to policy at an institutional level.

If you really wanted to know something about sessionals, you'd probably start by requiring institutions simply to provide contact information for every individual with teaching responsibilities who is not tenure-track, along with the amount paid to them in the previous academic year (note: Statscan couldn't do this, because it would never use the Statistics Act to compel data this way; provincial governments could, however).  Then you'd survey the instructors themselves – number of classes taught, other jobs they have, income from other jobs, etc.  Now, I know some of you are going to say: didn't some folks at OISE do that just recently?  Well, almost.  Yes, they administered almost exactly this kind of survey, but because they weren't drawing their sample from an administrative database, there's no way to tell how representative their sample is and hence how accurate their results are.  Which is kind of important.

So, anyways, two cheers for the return of UCASS.  More data is better than less data.  But the effort to get data on sessionals seems like a lot of work for very little practical return even if universities can be brought round to co-operate.

September 07

Unpleasantness at Brock

So, everybody is talking about the kerfuffle at Brock: yet another presidential hire gone wrong, though this time the slamming-on-the-brakes happened before the hire actually started working, which I suppose is progress.

What actually happened?  At the moment, here's what we know for sure: Wendy Cukier, a former VP at Ryerson, was offered the President's job at Brock in December 2015, with a start date of September 1.  She was undergoing what seemed to be a normal transition, starting to meet with faculty, up until a few weeks ago, when meetings suddenly ceased.  On Monday, August 29th, news emerged that Cukier and the Board had mutually agreed to suspend the appointment and to look for a new President.  Cukier returned to her professorial position at the Ted Rogers School of Management at Ryerson.

Now, no one has yet actually asserted in print that the reason for the "mutual" change of heart is a report about Cukier's alleged bullying of staff while at Ryerson, but many news outlets have reported that an inquiry into such allegations took place, and by putting the two facts side by side the journalists clearly expect the reading public to make that leap.  The inquiry into those allegations is said to have occurred in late 2015 (i.e. around the time Cukier's appointment at Brock occurred); an investigative report into the allegations is reported to have been received by Ryerson in January 2016 (i.e. after the appointment).  We don't know what the inquiry's report said, and news outlets have been careful to avoid directly stating that there was any connection between the two events.

Brock, obviously, is a bit screwed now.  Its interim President is the VP Finance & Administration (not an academic, and a former CFL player to boot, which has made the faculty union extremely sniffy in an oh-my-God-what-will-other-universities-think-of-us kind of way, which is frankly juvenile).  The some-say-acting, some-say-interim Provost is an outsider: Martin Singer, the former Arts Dean from York, best remembered for deciding to allow male Saudi students not to study with women in the name of religious accommodation.  The VP Research is also interim.  It's going to be a tough two years sorting all this out.

To the extent anyone is talking about the general implications, there seem to be three.  First, some people have posited that gender is an issue in the affair.  On the facts of this particular case, that seems a stretch.  It is, however, undeniable that recent university President "departures" (let's call them that) have been disproportionately female (Leavitt at King's, Ghazzali at UQTR, Lovett-Doust at Nipissing, Scherf at Thompson Rivers, Woodsworth at Concordia, Busch-Vishniac at Saskatchewan, Hitchcock at Queen's, and now Cukier), at least compared to the mostly male population of university presidents.  I'd argue that – contra Jennifer Berdahl and the view that only alpha-male behaviour is rewarded in universities – there's a disproportionate number of individuals in that group who were let go precisely because they were too alpha.  If there's a gender case to be made here, it might be about what kinds of leadership styles get women promoted to decanal and vice-presidential positions in the first place.

Second is the role of non-disclosure agreements (NDAs), which again are getting in the way of Everyone’s Right to Know Every Last Detail (though to be honest, Brock’s Board of Governors has 27 members and I’m willing to bet that that’s too many to keep a secret for long).  NDAs don’t get a lot of favourable press and some say they should be done away with, but it’s hard to see how that’s possible. If someone is being let go for some reason that reflects badly on them but which is short of being “with cause”, you can either pay them a small amount of money now and have them leave quietly (I’m actually a bit surprised no one has yet commented publicly on whether there was a payout and if so how big it was), or you can trash them publicly and pay a lot of money after the inevitable lawsuit.  As public institutions, I don’t think universities and colleges have a lot of flexibility on that point.

The third implication people are drawing from this is that here again we have Another Failed Board Search, Why Can't Boards Get Things Right, Need for Immediate Governance Overhaul, etc.  But I think this is overdone.  The Brock University Board Chair has gone on record saying his university "did not know" about the Ryerson report (there is no word on when Brock became aware of it).  But unlike one or two Presidential searches I've heard of, Brock actually *did* its homework and interviewed quite a few people about Cukier.  It's just that, as far as we know, no one at Ryerson told them about the results of the inquiry, presumably because it was a "personnel matter" and hence confidential.  If staff at Ryerson knew about the issue and withheld information from the Brock search committee, that's hardly something the Brock board can be blamed for.  Sometimes bad things happen even if you do everything by the book.

Finally, let me stress that we don’t yet know the full story.  Maybe we never will.  The staff allegations at Ryerson might only be a small part of the issues involved.  Keep an open mind.  There’s probably more to come.

September 06

Announcements

Guys!  I’ve got it solved!  This whole funding thing!

You know how Liberal MPs are taking up the entire back-to-school season with on-campus announcements of Strategic Investment Fund (SIF) money?  It's annoying, right?  I mean, this money isn't some "favour" delivered through hard work and pork-barrelling by the local MP.  It's technocratically determined funding decided upon by a professional public service.  And yet all the universities and colleges have to go through this rigmarole, saying "thank you" to the local MP and having pictures taken that can be used ad nauseam in local media.

OK, I get it.  Politicians need to get "credit", and it's not just about personal political advantage (though I suppose that never goes amiss).  It's important that the public knows how its money is spent, and media "events" help with that process.  To that extent, it's perfectly legitimate.  But why is it legitimate for some types of spending and not others?  Why do the feds get heaps of publicity for a few hundred million dollars when provinces hand out over a billion dollars a month, year in and year out?

That's not a novel observation on my part, or anything.  Everyone has had this discussion, of course.  It hasn't exactly passed unnoticed that announcements of capital projects (especially ribbon-cuttings) get more fanfare than announcements of operating grants.  And there's a too-smug, too-certain line everyone knows, about how "if only we could do ribbon-cuttings for operating grants", politicians would give money for that, too.

Now, there’s at least some truth to this.  Relative to operating grants, universities and colleges have been getting more money for capital these past fifteen years or so.  And presumably the ability to get good press out of announcing such funding has at least some small role to play in it.

But do we really know that we can’t hold media events for operating grant announcements?  Or have we just never tried?

I mean, clearly, the fact that the money has already been announced is no barrier to getting media out to events.  Every last dime of SIF was announced weeks ago.  Hell, last week the Science Minister showed up at Humber College to re-announce changes to the Canada Student Loans Program that had not only been announced five months ago but had actually gone into effect four weeks previously.  Timeliness and novelty are clearly not the issue.

Some people might say: “ah, well, you can’t announce operating grants because they aren’t new.”  But this is small-time thinking.  There’s almost always a part of the funding that is new, even if it’s only 1 or 2%.  And what that money is funding changes quite a bit every year.  One year it might be buying RECORD LEVELS OF ENROLLMENT, and in another SIXTY NEW PROFESSORS AND A NEW CENTER FOR STUDENTS WITH DISABILITIES.  Tie it in with some kind of re-announcement about new goals, multi-year agreements, whatever, and you’ve got yourself a bona fide news event.

Not a ribbon cutting, maybe, but a reason for provincial politicians and institutional officials to be pleasant to one another in public, to explain to the electorate what their money is buying, and have some photos taken.  And who knows?  If people are right that positive media is what begets more capital funding announcements, maybe it’ll help bring operating grants back up a bit too.

So come on, institutional government-relations types and provincial media-flack types.  It can’t be beyond your wit to organize some media for all that massive public investment.  Give it a try.  It can’t be any less legitimate than this interminable parade of SIF announcements to which we’re currently being subjected.

September 02

New Thoughts on Innovation Policy

A new book on innovation policy came out this summer from a guy by the name of Mark Zachary Taylor, who teaches at Georgia Tech.  The book is called The Politics of Innovation: Why Some Countries Are Better Than Others at Science and Technology, and to my mind it should be required reading for anyone following Canada's innovation debate.

First things first: how does Taylor measure how "good" a country is at science and technology?  After all, there are lots of ways of measuring inputs to tech, but not many on the outputs side.  The measure Taylor selects is patents.  And yes, this is highly imperfect (though it does correlate reasonably well with other possible measures, like multi-factor productivity), but Taylor doesn't over-egg the data.  For most of the book, his interest is less in scoring countries and running regressions to explain the scores than in grouping countries into fairly wide buckets ("most innovative", "mid-level innovative", and "rapid innovators" showing quick progress, like Korea and Taiwan).  Canada – probably to the surprise of anyone who follows innovation policy in our country – comes up as one of the "most innovative", along with Japan, Germany, Sweden and Switzerland.  This may either be a sign of us being too tough on ourselves, or of Taylor being out to lunch (I'm a bit unsure which, to be honest).

But put that aside: what's important about this book is that it provides a good, critical tour d'horizon of the kinds of institutions that support innovation (research universities, patent protection, etc.) and explicitly rejects the idea that "good institutions" are enough to drive innovation forward.  This seems to me quite important.  Much of the innovation commentariat loves playing the game of "look-at-that-institution-in-a-country-I-think-does-better-than-us-we-should-really-have-one-of-those" (think Israel's government-sponsored venture capital funds, for instance).  The riposte to this is usually "yeah, but that's sui generis, the product of a very special set of political/institutional factors, and would never work here".  And that's true as far as it goes, but Taylor goes a bit further than that.

First, he focuses on how open a country is to both inward and outward flows of knowledge and human capital.  Obviously, higher education plays some role here, but on an economy-wide basis the real question is: are firms sufficiently well-networked that they can effectively hire abroad or learn about market opportunities in other countries?  Taiwan and Israel have worked this angle very effectively, cultivating ties with targeted groups in the United States and elsewhere (my impression is that Canada does not do this at anything like the same level – one wonders why not).

Second, Taylor doesn't just stop at asking how nations innovate (answer: they design domestic institutions and policies to lower transaction and information costs, distribute and reduce risk, and reduce market failures in innovation).  He also tries to get at the much more interesting question of why countries innovate.  Why do Finns innovate like mad and Norwegians not?  Why Taiwan and not the Philippines?  Or, for that matter, why the US and not us?  Institutions play some role here, but they're not the whole story.  Culture matters.

Or, in Taylor’s telling: perceptions of internal and external threat matter.  His argument is that everywhere, the urge to innovate is countered by the wailings of bereavement from those who lose from technological innovation.  In many countries, the political power of losers is sufficient to create a drag on innovation.  Only in places where the country feels an existential threat (e.g. Israel, Taiwan) do political cultures feel they have the necessary social license to ignore the losers and give the innovators free rein.   Taylor calls this “creative insecurity”.

I have to say I don't find this last bit entirely persuasive.  The bit about losers having too much power is warmed-over Mancur Olson with a tech-specific focus (Taylor goes to some length to say it's not, but really it is).  And while the second part is a plausible explanation for some places – Singapore, say – his attempt at formalization requires some serious torquing of the data (Finland cannot credibly be described as being under external threat) and/or some very odd historical interpretations (Taylor's view that Israel was under greater external threat after 1967 than before it would probably not be accepted by many Middle East specialists).

That said, it arguably does explain Canada.  Our resource base gives us an undeniable cushion that other advanced countries lack.  We lack external threats (and since the late '90s, we lack internal ones too).  Frankly, we're just not hungry enough to be top-of-the-pack.  Even in parts of the country that should be hungry – Nova Scotia, for example – there's simply not that much appetite for sacrificing dollars spent on social policy to make investments in innovation.  See, for instance, the carping over Dalhousie's participation in MIT's Regional Entrepreneurship Acceleration Program.

Say it softly: as a country, we might not be cut out for this innovation stuff.  Sure, we like spending money on gee-whizzy tech things and pretending we’re at the cutting edge of this, that or the other, but it’s a long way from that to actually being innovative.  Innovation is tough.  Innovation causes pain and requires sacrifice.  But Canadians prefer comfort to sacrifice:  we can’t get rid of harmful dairy monopolies, our national dress is fleece, etc.

Anyways, read the book.  And have a great weekend.

September 01

Reminder: Hot Jobs and Hot Careers Never Last

There was an interesting piece in the National Post last week about unemployed professionals in the Alberta oil and gas industry.  In amidst the occasional whine about the oil industry being so unloved by the rest of Canada, there is a serious article about what happens to people in specialized professions when the economic tide swings away from that profession.  Some quotes:

Philip Mulder, spokesman for the Association of Engineers and Geoscientists of Alberta (APEGA), said its geoscientists are faring badly in the current climate because they are predominantly employed in oil and gas…PetroLMI/ENFORM, which tracks labour market trends in the sector, said 145,000 people worked directly in oil and gas extraction, pipelines and services in the province on average in the first four months of 2016, down from 176,000 during 2014. Meanwhile, 17,300 were unemployed on average during the first four months of 2016, up from 6,500 during 2014. The rest left the industry by moving elsewhere or to other sectors, retiring, going back to school.

Lemiski, 33, studied for eight years at the University of Alberta to earn a degree in geology and a master’s in earth and atmospheric sciences.  He started his career six years ago as an exploration geologist at Talisman Energy Inc., but was laid off when the company ran into financial difficulties and cut its exploration program. He found work at Nexen Inc., owned by China’s CNOOC Ltd., as a petrophysicist, but that didn’t last long and he was back on the street in March.

I could go on, but you get the idea.

Times are obviously tough for professionals in this industry and it’s hard not to sympathize with people whose profession is vanishing under their feet.  But let me simply point out that this situation could have been predicted by anyone a) with half a brain and b) who remembers what the early 1990s recession was like in Alberta.  Cyclical industries are just that – cyclical.  They rise and they fall.  That’s fine until people forget that both are equally likely.

And of course, the top of a boom is precisely where people tend to forget that.  One of many examples: the Canadian Society of Exploration Geophysicists in 2011, saying Canada would need 6,000 geophysicists to deal with retirements between then and 2020.  Here's a Petroleum Labour Market Information report saying that the country wasn't training enough people, and predicting a shortfall of 1,500-2,000 geologists and geophysicists in Alberta alone by 2022.  And so of course there was a drumbeat across the west: why weren't those out-of-touch universities doing more to graduate more geologists?

This isn't the first time something like this has happened, of course.  Back in the late 1990s, the Ontario government fell for the pitch of the IT industry that there simply had to be more computer science graduates in the province or woe, disaster, earthquakes, pestilence, etc.  So the Harris government, which had hitherto done nothing but cut the bejesus out of Ontario universities, decided to spend millions to "double the pipeline" in computer science (i.e. double the intake of students in order to double the number of graduates).  Ontario universities took the money and implemented the plan, which delivered a record number of graduates… who hit the job market in 2002, right in the middle of the worst tech bust in history.  Cue thousands of underemployed graduates, at least for a few years.

My point here is not that universities should ignore the labour market.  That would be silly; most students attend university to get a better job, and ignoring the labour market is tantamount to malpractice.  My point, rather, is that one has to be extremely careful about trying to time the market by shifting enrolment into specific programs.  Any job that's in high demand today has a good chance of looking very different in five years – the time it takes to design a program and get a new boatload of students through it.  All you end up with is a lot of disappointed, underemployed students.

Pay attention to long-term trends?  Certainly.  Short-term booms?  As little as possible.  Even in Alberta.

August 30

Know Your Incoming Students (Part 1)

As the school year starts, it's always valuable to take a look at trends among incoming students.  The best tool we have in Canada for doing this is the Canadian Undergraduate Survey Consortium's triennial survey of first-year students (the most recent version is here).  It's not the greatest of instruments: consortium membership changes from cycle to cycle, so the base population is neither equal to the national first-year population nor stable from cycle to cycle.  But since Statistics Canada now declines to do any surveys at all of enrolled students, it's all we've got, and by and large, participating institutions are reasonably representative of the country as a whole.  There's also a problem with CUSC occasionally changing the wording of its questions, which makes time series somewhat difficult – but then again, this is an aggravating habit that Statscan has in spades.  So, with all those difficulties acknowledged, let's begin.

Here’s my favourite chart, on the theme of visible minorities (a subject I tackled a few months ago):

Figure 1: Visible Minority Students as a Share of all First-year students, 2001-2016


Now, part of the increase comes from the way this value is calculated.  The question used simply to be: "Are you a visible minority?"; it now asks directly about ethnicity and infers visible minority status on that basis (basically, if you do not declare yourself as white or Aboriginal, you are considered "visible minority").  So part of the increase may have to do with the change in question phrasing.  But still, this is pretty impressive.  Even if you take out all the international students (not all of whom are visible minorities), you're talking about 33% of all students (compared to just 22% of all Canadians aged 15-24) being visible minority.  That would make Canada possibly the only country in the world where visible minorities have that kind of advantage.
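
For concreteness, here is the inference rule as I read it – a hypothetical sketch of the coding logic, not CUSC's actual code:

```python
def infer_visible_minority(ethnicities: set[str]) -> bool:
    # Respondents reporting only "white" and/or "aboriginal" identities are
    # not counted; reporting any other ethnicity codes as visible minority.
    return not ethnicities <= {"white", "aboriginal"}

print(infer_visible_minority({"white"}))             # False
print(infer_visible_minority({"chinese"}))           # True
print(infer_visible_minority({"white", "chinese"}))  # True
```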

Here’s another intriguing time series where the phrasing of the question makes an enormous difference: students with disabilities.

Figure 2: Percentage of Students Indicating they Have a Disability, 2001-2016


The jump in the last three years is definitely due to the way the question was posed.  In all previous years, the question was simply "Do you have a disability?"  In 2016, the question specifically referenced nine different kinds of disability (including learning disabilities, which accounted for over half the responses).  It then asked if the disability was serious enough to require the university to provide accommodation.  Only 32% of those listing disabilities – or 7% of all students – said yes, which brings us back down to the range of previous years.  Moral: how you ask a question matters a lot.
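
Those two figures also imply how many students listed any disability at all in 2016 – a quick check:

```python
# If 7% of all students = 32% of those listing a disability, then:
print(f"~{0.07 / 0.32:.0%} of respondents listed some form of disability")
# -> ~22%
```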

On to socio-economic background, which this survey measures via parents’ educational background.  This time series is a bit messed up because CUSC changed the wording of the question this year (formerly, the survey asked about each parent separately, now it just asks about “parents’” highest level).  But here goes:

Figure 3: Percentage of Students’ Parents Possessing a Bachelor’s Degree or Higher


It’s hard to know exactly what to make of these results.  Since children tend to hit university about 30 years after their parents do, this graph is to some extent just reflecting the expansion of access to university in the 70s and 80s.  But that’s not quite the whole story.  According to the census, in 2001, 17% of adults aged 45-64 possessed a bachelor’s degree or higher; in 2011 it was 21% (I know, I know, National Household Survey – but still).  So it appears as if more recent cohorts of first year students are slightly more likely to come from better-educated households than their predecessors, which is not a particularly good finding.

More tomorrow.
