Higher Education Strategy Associates

Author Archives: Alex Usher

March 13

Tea Leaves on the Rideau

Last Tuesday, federal Finance Minister Bill Morneau set the date of the federal budget for next Wednesday (March 22), and naturally people are wondering: what goodies are in store?  Without being privy to any inside information, here’s my take on where we are going.

At the press conference announcing the budget date, Minister Morneau dropped some important hints.  The biggest one is that, contrary to what had been heavily promoted for the past year, this budget will not be an “Innovation Budget”, but will represent a “downpayment” on an Innovation Budget.  From this we should probably deduce two things.  One: the feds are broke.  Well, maybe not broke, but certainly unwilling to increase borrowing in the face of a $30 billion deficit, slow growth and adverse demographic trends.  Two: the government has – THANK GOD – attained enough self-awareness to discern that it does not really know what it’s doing on this file.  I noted back here that the Finance Minister’s Economic Council was flatly in opposition to the Innovation Ministry’s ideas about innovation clusters, and the government probably came to the conclusion that making big budget commitments in the face of such disagreement was untenable.

To be clear: I am thrilled with this outcome.  Yes, it’s too bad the feds seem to have wasted a year on this file.  But far better to take a sober second look at the issue and make smart policy rather than to charge forward in order to meet an artificial deadline.  I also take it as a favourable sign that the government has brought Ivey Professor Mike Moffatt – co-author of a large recent piece on Innovation Policy by Canada 2020 – into the ministry on a temporary basis. For one thing, he actually understands what innovation policy means outside the tech sector, a concept which has been missing from ministry discourse since the minute Minister Bains was appointed.

(Many of you have been asking me on Twitter to explain what the hell the terms “Innovation” and “Innovation Policy” actually mean.  Sit tight: we’ll work on that one this week.)

There were also hints from the Minister that this would be a “skills” budget, a sentiment which has left many puzzled.  A year ago, the big near-term issue was supposed to be the renegotiation of Ottawa’s Labour Market Development Agreements with the provinces, which mostly hasn’t happened; since then, there has been no major policy initiative.  There has been – via the consultations on Innovation policy – something of an understanding that skills are a big part of the innovation problem, but government thinking doesn’t appear to have progressed much beyond “more coders!” as a result.  (To a rough approximation, this government’s skills policy is more or less the same as the last government’s, only with all the references to welding taken out and references to coding inserted instead.)

The worry here is that the “big initiative” will in fact be the implementation of the horrifically-named “FutureSkills Lab” promoted by Dominic Barton, chair of Morneau’s Economic Advisory committee (which I described back here).  If that’s the case, we may be about to witness the first really big policy disaster of the Trudeau era.  First of all, no one is going to buy FutureSkills – essentially a kind of policy laboratory – as something which will help Canadians in anything other than the long term.  Second, the feds have yet to discuss the idea meaningfully with the provinces, and without their buy-in this initiative will be Dead on Arrival, just as the Canadian Council on Learning was.

To be clear: I don’t think this is going to be the “big initiative”.  I don’t think the Liberals are that stupid.  But I guess we’ll see.

What about Science?  Here, the news is not good.  You may recall that the Government of Canada commissioned a Fundamental Science Review, and asked the inimitable David Naylor to run it.  Naylor, as requested, submitted the report to the Minister of Science in December.  The Government of Canada has yet to publish it and refuses to answer questions about when it might be published.  Why?  It seems transparently obvious that the government found some of the findings inconvenient, and would prefer to bury the report until after the budget.  Maybe the report suggested the system needed more money (which would have been beyond the committee’s remit, since it was only asked to comment on the management of the system, not its size).  Maybe the report suggested that certain science bodies which the government has already decided to fund were redundant.  Either way, the government seems to have decided the budget will be easier to spin if we haven’t all first read Naylor’s report.  I have a hard time imagining how this could be a harbinger of good news.

In sum: don’t bank on anything big in this budget.  In fact, brace yourself for at least one major piece of goofiness.  Fingers crossed it doesn’t happen, but best to be prepared.

March 03

Mega-Trends in International Higher Education – A Summary

Over the past few weeks, we’ve looked at some of the big changes going on in higher education globally.  To wit:

  • Higher education student numbers are continuing to rise around the world. This massification in many countries is being accompanied by stratification.  Getting a “distinctive” degree at a prestige university remains hard; going abroad remains a good way of getting it.  So increases in international student numbers are likely to continue, ceteris paribus.
  • Institutions in developing countries are unlikely to increase their global prestige level any time soon. Climbing the ladder costs money most developing-world governments don’t have, and in any case, the definition of prestige is changing in ways that make it difficult for universities in developing countries to follow.
  • Demographic forces have been a significant part of the rise in global student numbers; however, for the next decade or so, these trends will not be quite so favourable (though by 2030 they should be trending positive again).
  • Similarly, the end of the commodity super-cycle means a lot of countries that were getting rich off the rise of countries like China are no longer getting richer, in developed-country currency terms, anyway (and even India is not doing well by this measure). This means at least some potential international students are looking for cheaper alternatives.

So what does all this mean?  How do we sum up these trends?

First of all, we need to stop all this nonsense talk about international higher education being a “bubble”.  It’s not.  The fundamentals of demand – rising numbers of students wanting a prestige degree – are strong, as is developed-country universities’ market position as purveyors of prestige degrees.

There are two things which could undermine this.  Demographic headwinds might mean that universities would need to do more to increase the percentage of students studying abroad in order to keep up the trends (rather than simply relying on the overall trends in increased participation).  Clearly, recent economic setbacks and currency slides in a number of countries make it more difficult to do this, at least if you’re an institution in one of the countries where the currency remains strong.  If, like Canada, you’re not, then this is a chance to steal a march on countries who either have strong currencies (the US) or who through some sort of policy lobotomy have decided they don’t want international students (the UK).  In any case, international student numbers have held up for the last few years in the face of these headwinds: the real test is what happens if economic growth starts to stall in China.

The other potential game-changer is one I alluded to a couple of times last year (see here and here): whether sending-country governments start to deliberately shut off the taps, deny students exit visas, and begin discriminating against graduates of foreign universities in the labour market.  A year ago, that might have sounded crazy; today, such moves are by no means unthinkable in Xi’s China, Putin’s Russia or Erdogan’s Turkey.  Others may follow.

In short, there is risk today in the world of international student mobility.  But it is political rather than economic.  All we can do is keep plugging away and hope that the global situation does not get worse.

In the meantime, the OTTSYD will be taking a break for reading week, and will return to our regular schedule on March 13.

March 02

Bravo, New Brunswick

Readers may remember that about this time last year, I was giving the Government of New Brunswick a bit of stick for a botched student aid roll-out. Today I am pleased to give credit where it is due, and congratulate the folks in Fredericton for fixing the problem and developing a much better student aid system.

Let’s go back 12 months to pick up the story.  In February 2016, the Ontario government came up with a fabulous new system which basically promised grants equal to or greater than average tuition for students from low- and middle-income families.  At family incomes above that, students received a declining amount of money out to about $110,000, at which point the grant flattened to a little under $2,000 (a remnant of the government’s ludicrous “30% tuition rebate” from 2011) and then fell to zero at a little over $160,000.  With a bit of clumsiness, this eventually, sort of, got branded as “free tuition for low- and middle-income students”, which it isn’t, quite, but close enough for advertising.  Cue what was seen as a major policy success.

It was such a success that New Brunswick decided to copy it later last spring.  Like Ontario, they built on the change to Canada Student Grants and eliminated some of their own tax credits (including the egregiously wasteful graduate tax rebate) to fund a “Tuition Access Bursary”, which guaranteed a grant equal to tuition (up to a maximum of $10,000, more generous than Ontario) for students from families making under $60,000.  Which is great, right?  Well, yes, except there was no phase-out for the grant.  At $59,999 in family income you were raking in $6,500 or so in grants; at $60,001 you got $1,200 (the federal middle-income grant).  That’s not great social policy.  Making it worse was the fact that families in that $60K-$70K range would also be losing a lot of money in tax credits that both the federal and provincial governments were ending in order to pay for this new benefit; my back-of-the-envelope calculation was that in this range, parents were going to be about $1,200 worse off as a result of the change.

In any case, because I and others pointed out this flaw, the government – after a brief period of defensive blustering – decided it was best to go back to the drawing board and revisit the formula.  They did so, and last week came up with a new “Tuition Relief for the Middle Class”, which basically grafts a declining scale of grants for families earning between $60,000 and $100,000 onto the existing Tuition Access Bursary (which has been renamed the “Free Tuition Program”).  Arguably, the New Brunswick program is now somewhat better than the Ontario program, because 1) it’s not just grants up to “average” tuition, a caveat which I suspect is going to leave a lot of people slightly cheesed off when the Ontario program starts, and 2) it still manages not to subsidize people up to that absurdly high $160K+ threshold that Ontario insists on maintaining.  Ontario gets points for making its aid portable, though – New Brunswick’s program is only available to students who study in-province, which I think is a shame.
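
To see why the original cliff mattered, and what the fix does, here is a minimal sketch in Python.  The dollar figures and the linear shape of the new phase-out are illustrative assumptions drawn from the numbers above, not the official program formulas, and the federal grant is ignored.

```python
# Minimal sketch of the old cliff vs. the new slide, modelling only the
# provincial grant (federal grants ignored). Dollar figures and the linear
# shape of the phase-out are illustrative assumptions, not official formulas.

def old_tab_grant(family_income: float, tuition: float = 6500.0) -> float:
    """Original Tuition Access Bursary: tuition covered (to $10K) under
    $60,000 in family income, nothing at all above it -- the cliff."""
    return min(tuition, 10000.0) if family_income < 60000 else 0.0

def new_ftp_grant(family_income: float, tuition: float = 6500.0) -> float:
    """Free Tuition Program plus Tuition Relief for the Middle Class:
    full grant under $60K, declining (here: linearly) to zero at $100K."""
    grant = min(tuition, 10000.0)
    if family_income < 60000:
        return grant
    if family_income < 100000:
        return grant * (100000 - family_income) / 40000  # sliding scale
    return 0.0

for income in (59999, 60001, 80000, 100001):
    print(f"{income:>7}: old={old_tab_grant(income):7.0f}  "
          f"new={new_ftp_grant(income):7.0f}")
# Old scheme: the grant drops from $6,500 to $0 across a $2 change in income.
# New scheme: a family at $60,001 keeps essentially the full grant, which
# then tapers away smoothly.
```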

The announcement – which you know, hey guys, it’s a good news story! – was marred somewhat by some media sniping about how the number of beneficiaries is about 30% short of what was estimated last year.  To me this is neither here nor there: government cost estimates on year 1 of a new program are often a matter of throwing numbers at a dartboard.  The good news is that there is still money to either raise the entry threshold for the Free Tuition Program or (better still) expand the debt relief program or top up the amount of money available to high-need mature students and parents through the New Brunswick Bursary Program.

Now, all we need for this to be perfect is for New Brunswick to come up with a smart, credible monitoring program to examine the effects of these changes on participation over the next few years.

(New Brunswick folk: that’s on the way, right guys?  Right?  Well, you know where to find me if you need a hand…)

Anyways, as I say, credit where it is due.  Well done, New Brunswick.

March 01

Under-managed universities

I have been having some interesting conversations with folks recently about “overwork” in academia.  It is clear to me that a lot of professors are absolutely frazzled.  It is also clear to me that on average professors work hard – not necessarily because The Man is standing over them with a whip but because as a rule academics are professional and driven, and hey, status within academia is competitive and lots of people want to keep up with the Joneses.

But sometimes when I talk to profs – and for context, the ones I speak to most often are roughly my own age (mid-career) or younger – what I hear a lot about is work imbalance (i.e. some professors doing more work than others) or, to put it more bluntly, how much “deadwood” there is in universities (the consensus answer is somewhere between 20-30%).  And therefore I think it is reasonable to ask: to what extent does some people’s “overwork” stem from the fact that other professors aren’t pulling their weight?

This is obviously something of a sticky question, and I had an interesting time discussing it with a number of interlocutors on Twitter last week.  My impression is that opinion roughly divides into three camps:

1) The Self-Righteous Camp.  “This is ridiculous; I’ve never heard professors talking like this about each other.  We all work hard, and anyway, if anyone is unproductive it’s because they’re dealing with kids or depressed due to the uncaring, neoliberal administration smashing its boot into the face of academia forever…”

2) The Hard Science Camp.  “Well, you know, there are huge differences in workload expectations across the institution – do you know how much work it is to run a lab?  Those humanities profs get away with murder…”

3) The “We’ve Earned It” Camp.  “Hey, look at all the professions where you put in the hours at the start and get to relax later on.  We’re just like that.  Would you want to work hours like a junior your whole life?  And by the way, older profs just demonstrate productivity on a broader basis than just teaching and research….”

There is probably something to each of these points of view.  People do have to juggle external priorities with academic ones at some points in their lives; that said, since most of the people who made the remarks about deadwood have young kids themselves, I doubt that explains the phenomenon.  There probably are different work expectations across faculties; that said, in the examples I was using, my interlocutors were talking about people in their own units, so that doesn’t affect my observation much.  Perhaps there are expectations of taking it easier as careers progress, but I never made the argument that deadwood is related to seniority, so the assumption that this was what caused deadwood was… interesting.  So while acknowledging that all of these points may be worthwhile, I still tend to believe that at least part of the solution to overwork is dealing with the problem of work imbalances.

Now, at some universities – mainly ones which have significantly upped their research profiles in the last couple of decades – this might genuinely be tough, because the expectations of staff who were hired in the 1970s or 1980s might be very, very different from the expectations of those hired today.  Places like Ryerson or MacEwan are obvious examples, but the same can be true at places like Waterloo, which thought of itself as a mostly undergraduate institution even into the early 1990s.  Simply put, there is a huge generational gap at some universities in how people understand “the job”, because they were hired in totally different contexts.

What strikes me about all of this is that neither management nor – interestingly – labour seems to have much interest in measuring workload for the purpose of equalizing it.  Sure, there’s lots of bean counting, especially in the sciences, especially when it comes to research contracts and publications and stuff like that.  But what’s missing is the desire to use that information to adjust individuals’ workloads in order to reach common goals more efficiently.

My impression is that in many departments, “workload management” means, at most, equalizing undergraduate teaching requirements.  Grad supervisions?  Those are all over the place.  “Service”?  Let’s not even pretend that’s well-measured.  Research effort?  Once tenure has been given, it’s largely up to individuals how much they want to do.  The fiercely competitive may take on 40 or 50 hours a week on top of their other duties, others much less.  Department heads – usually elected by professors in the department themselves – have limited incentive and means to get the overachievers to maybe cool it sometimes and the underachievers to up their game.

In short, while it’s fashionable to say that professors are being “micro-managed” by universities, I would argue that on the rather basic task of regulating workload for common good, academics are woefully under-managed.  I’d probably go even further and say most people know they are undermanaged and many wish it could change.  But at the end of the day, academics as a voting mass on Senates and faculty unions consistently seem to prefer undermanagement and “freedom” to management and (perhaps) more work fairness.

I wonder why this is. I also wonder if there is not a gender component to the issue.

What do you think?  Comments welcome.

February 28

The “Not Enough Engineers” Canard

Yesterday I suggested that Ottawa might be as much of the problem in innovation policy as it is the solution.  Today I want to make a much stronger policy claim: that Canada has a uniquely stupid policy discourse on innovation.   And as Exhibit A in this argument I want to present a piece posted over at Policy Options last week.

The article was written by Kaz Nejatian, a former staffer to Jason Kenney and now CEO of a payment technology company (OVERCONFIDENT TECH DUDE KLAXON ALERT).  Basically, the piece suggests that the whole innovation problem is a function of inputs: not enough venture capital and not enough engineers.  Let me take those two pieces separately.

First comes the claim that Canada’s venture capital funding is falling further and further behind that of the United States.  He quotes a blog post from Wellington Financial: “American venture-capital-backed companies raised US$93.37 per capita in 2006, while in Canada we raised US$45.76 per capita.  Nearly a decade later, in 2015, US companies had doubled their performance, raising an average of US$186.23 per capita, while Canadian companies had only inched up to US$49.42.”

There are two problems here.  First, these figures are in US dollars at current exchange rates.  You may remember that 2006 was an extraordinarily good year for the Canadian dollar, and 2015 less so, so this isn’t the best comparison in the world.  Second, they in no way match up with other published data on venture capital as a percentage of GDP.  The reference years are different, but the Conference Board noted that VC funding as a percentage of GDP grew in Canada from 0.06% to 0.1% between 2009 and 2013, and now stands second in the world only to the US (which grew from 0.13% to 0.18%, while all of Europe fell back sharply).  And Richard Florida noted in The Atlantic that in terms of VC funding per capita, Toronto is the only non-American city which cracks the world’s top 20.  I am not sure what to make of these differences; I expect some of it has to do with definitions of venture capital (early-stage vs. late-stage, for example).  But looking at more than one data point throws Nejatian’s hypothesis into doubt.
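
To make that first problem concrete, here is a minimal sketch of how the currency effect alone can manufacture a “gap”.  The per-capita amount is hypothetical and the exchange rates are rough approximations of the annual averages – assumptions for illustration, not the actual figures.

```python
# Rough illustration of the exchange-rate problem with a per-capita
# comparison done in USD. The CAD amount raised is hypothetical and held
# flat across both years; FX rates are approximate annual averages.
CAD_PER_CAPITA = 60.0                      # hypothetical, identical in both years
USD_PER_CAD = {2006: 0.88, 2015: 0.78}     # assumed approximate averages

for year, fx in USD_PER_CAD.items():
    print(year, round(CAD_PER_CAPITA * fx, 2))
# 2006: ~52.80 USD; 2015: ~46.80 USD. Canadian VC activity that is perfectly
# flat in CAD terms shows up as an ~11% "decline" purely because the loonie fell.
```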

But the bigger whopper in this article is the claim that Canada does not educate enough engineers.  Forget the fact that the number of engineering graduates has very little to do with success in innovation, even if you define innovation as narrowly as Nejatian does (i.e. as tech and nothing else): his numbers are simply and outrageously wrong.  He claims Canada produced only 12,000 new engineering grads; in fact, the number of undergraduate degrees awarded in Architecture & Engineering in 2014 was 18,000, and that’s excluding math and computer science (another 5,400), not to mention new graduate degrees in both those areas (another 11,700).  He claims the UK produces 3.5 times the number of engineers per capita that Canada does.  It doesn’t; there is a gap, but it’s not very big – 9% of their degrees go to engineers, compared to 8% of ours (see figure below).  He repeats the scare claim – demolished long ago by Vivek Wadhwa, among others – that India is going to eat our lunch because it graduates 1.5 million engineers per year.  This argument needs to go back to 2006 where it belongs: only a tiny percentage of these engineers are of the calibre of North American E-school graduates, and one recent Times of India piece suggested that 93% of them were not actually employable (which sounds like an exaggeration, but still points to a significant underlying problem).

Figure 1: Science & Engineering Degrees as % of Total Degrees Awarded, Selected OECD Countries


(See what I mean?  The US has the smallest percentage of undergraduate degrees in engineering and yet it leads everyone else in tech.  Apparently that doesn’t matter to Nejatian – all that matters is MOAR ENGINEERS.  I mean, if we increased our proportion of degrees in engineering by about 60%, we could be as innovative as…Italy?)

I could go on, but you get the picture.  This is a terrible argument using catastrophically inaccurate data and yet it gets a place in what is supposed to be our country’s premier publication on public policy.  It’s appalling.  But it fits with the way we talk about innovation in this country.  We focus on inputs rather than processes and relationships.  We see a lack of inputs and immediately try to work out how to increase them rather than asking i) do these inputs actually matter or ii) why are they low in the first place (actually, the only redeeming feature about this article is that it doesn’t make any recommendations, which given the quality of the analysis is really a blessing for all concerned).

Could Canada do with a few more engineers?  Probably.  It’s the one field of study where incomes of new graduates are still rising in real terms, which suggests the demand could support a greater supply.  But the causal link between Engineers and innovation is a vast oversimplification.  If we want better policy in this country, we need to start by improving the quality of the discourse and analysis.  Policy Options has done us all a disservice by letting this piece go out under their name.

February 27

Can Ottawa Do Innovation?

The National Post’s David Akin had a useful article last week entitled Canada Has Failed at Innovation for 100 years: Can The Trudeau Government Change That?  Read it, it’s good.  It’s based around a new-ish Peter Nicholson article in Canadian Public Policy which is unfortunately not available without a subscription.  But Nicholson’s argument appears to be: we’ve done pretty well our entire history as a country copying or importing technology from Americans: what exactly is it that Ottawa is going to do to “shock” us into becoming a massive innovator?

Good question.  But I have a better question: does it make any sense that the federal government is leading on these kinds of policies?  Wouldn’t provinces be better suited to the job?  Knee-jerk centralists (my guess: probably half my subscribers) will find that suggestion pretty horrific.  But hear me out.  There are a number of really good reasons why Ottawa probably isn’t best placed to lead on this file.

First: innovation policy is, to a large extent, about people and skills.  And skills policy has been fully in the hands of provincial governments for over twenty years now.  We accept that provincial governments are closer to local labour markets and local business for skills purposes.  Surely the same is also true for innovation?

Second: Canada is huge.  We’re not like Singapore or Israel or Taiwan, where industries are essentially homogeneous across the entire country.  We are more like China or the US, where a single industry might look completely different in one part of the country than in another.  If you haven’t already read Run of the Red Queen: Government, Innovation, Globalization and Economic Growth in China by Dan Breznitz and Michael Murphree, I recommend it.  Besides showing how innovation can be profitable even when it is not of the “new product”/“blue sky” variety (a truth to which our current government seems utterly oblivious), it shows how the structure of a single industry (in this case, IT) can be utterly different in different parts of a single country.  That’s also true in Canada.  And it’s why it’s tough to draw up decent national policies on a sectoral level.

(A corollary to that second point, which I made back here: because the country is so big, any government attempt to play the “cluster” game in the name of improved innovation is bound to get wrapped up in regional politics pretty quickly.  Anyone who’s watched Montreal and Toronto’s unseemly jockeying for a single big federal investment in Artificial Intelligence will know what I mean.)

Over the course of the past twenty years, of course, many provinces have set up their own innovation ministries or agencies.  But apart from the partial exceptions of Ontario and Quebec, they tend to be poor cousins of the federal ministry: understaffed and not especially well-resourced.  As a result, they’re not at present any more effective than Ottawa in driving innovation.  But that could change with more effective investment.  And of course, Ottawa would always have a role to play: if nothing else, its authority over competition policy means it will always have levers which it can and should use to promote innovation (even if at present it seems extremely reluctant to use this particular lever).

In short, it’s worth considering the hypothesis that it’s not “Canada” which has failed at innovation, but Ottawa.

February 24

Four Mega-Trends in International Higher Education – Catch-up is Hard

One of the perpetual alleged threats to cross-border education is that universities in the developing world might someday rival those in the west. Once that happens, the theory goes, students won’t need to go abroad and the whole international student thing goes up in smoke. It’s not an implausible theory, but it underplays how difficult catching up actually is.

The most basic problem for universities in developing countries is paying staff.  Those talented and fortunate enough to have a terminal degree (by no means a given at universities in many countries) are often quite mobile.  Salaries not high enough at Kenyan universities?  Move to Tanzania, where salaries are higher.  Not high enough in Tanzania?  Try South Africa.  Horizons not big enough in Johannesburg?  There’s always New Zealand.  And so on, and so on.  It’s not quite a global market – more a series of overlapping regional ones.  But the point is that institutions have to pay vastly over the odds in order to keep staff.

The gaps in national pay scales are huge.  In Tanzania, a full professor makes about $3,000 (US) per month.  That may not sound like much, but it’s about 65 times the average private sector wage.  A top academic could double that by heading to South Africa, but even that is roughly six times the average private sector wage there.  Move to Canada or Australia and it doubles again, even though wages of top academics here are roughly three times the average private sector wage.  To put all that another way: it takes the Tanzanian state twenty times as much financial effort, in relative terms, to pay its academics a quarter of what they’d earn in the west.  Now, sure, not all Tanzanian academics would make it in Western universities.  Yet that’s still the price required to keep talent at home.  Financially, just staying in place is a killer; moving up the quality ladder is beyond most countries’ resources.
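
If it helps, here is that back-of-the-envelope arithmetic spelled out, using only the multiples quoted above.  The western salary level is inferred from the “doubles, then doubles again” chain, so treat it as an approximation.

```python
# Back-of-envelope check of the relative-effort claim above. Monthly figures
# in USD; the wage multiples come from the text, and the western salary is
# inferred (doubles via South Africa, doubles again), so it is approximate.
tz_prof_salary = 3000.0             # Tanzanian full professor, per the text
tz_multiple = 65.0                  # ~65x the average private-sector wage
west_multiple = 3.0                 # ~3x the average wage in Canada/Australia
west_prof_salary = 3000.0 * 2 * 2   # ~$12,000/month after two doublings

effort_ratio = tz_multiple / west_multiple      # relative fiscal effort
pay_ratio = tz_prof_salary / west_prof_salary   # fraction of western salary
print(round(effort_ratio, 1), round(pay_ratio, 2))
# ~21.7 and 0.25: roughly twenty times the financial effort to pay academics
# about a quarter of what they'd earn in the west.
```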

Even where top talent can be retained – whether through significant salaries, major research funds or linguistic barriers – the fact is that institutional prestige isn’t just a function of talent.  As my friend Jamil Salmi says, it’s the product of talent, resources, governance (a critical missing factor in too many countries) – and above all, time.  Prestige is a stock, not a flow.  It comes from decades of consistent growth and excellence.  Even if a new university pops up with billions of dollars and a few celebrity professors – say, the King Abdullah University of Science and Technology in Saudi Arabia or Nazarbayev University in Kazakhstan – that doesn’t mean it is instantly going to gain prestige.  To be more precise: it takes decades of graduates having great careers before anyone will consider an institution genuinely prestigious.  And not many developing-world institutions can claim such things.

And in any case, prestige is a moving target.  As I noted back here, the very nature of institutional prestige is changing, shifting from that of a mainly research-based institution to that of a mainly economic-growth-catalysing one.  And as noted back here, it’s very difficult in many developing countries for universities to play such a role: first, because they have few domestic technology-intensive businesses with whom to collaborate, and second, because they rarely have either the academic culture or the leadership to turn them in this new direction.  There are a few – NUS in Singapore, KAIST and Pohang in Korea, some of the C9 in China – that can say this.  Precious few others can.

What does that mean for international education?  Prestige – however you measure it – is a lagging indicator of economic growth, not a leading one.  That means that even if the economic catch-up of developing countries continues, it’s going to be a loooong time before the prestige of those countries’ universities matches that of universities in developed countries.  That doesn’t give developed-country universities a monopoly on international students by any means; south-south student mobility is an increasing percentage of all international student flows.  But it does mean that quality/prestige will remain a source of competitive advantage for these institutions for many years – maybe even decades – to come.

February 23

Garbage Data on Sexual Assaults

I am going to do something today which I expect will not put me in good stead with one of my biggest clients.  But the Government of Ontario is considering something unwise and I feel it best to speak up.

As many of you know, the current Liberal government is very concerned about sexual harassment and sexual assault on campus, and has devoted no small amount of time and political capital to getting institutions to adopt new rules and regulations around said issues.  One can doubt the likely effectiveness of such policies, but not the sincerity of the motive behind them.

One of the tools the Government of Ontario wishes to use in this fight is more public disclosure about sexual assault.  I imagine they have been influenced by how the US federal government collects and publishes statistics on campus crime, including statistics on sexual assaults.  If you want to hold institutions accountable for making campuses safer, you want to be able to measure incidents and show change over time, right?

Well, sort of.  This is tricky stuff.

Let’s assume you had perfect data on sexual assaults by campus.  What would that show?  It would depend in part on the definitions used.  Are we counting sexual assaults/harassment which occur on campus?  Or are we talking about sexual assaults/harassment experienced by students?  Those are two completely different figures.  If the purpose of these figures is accountability and giving prospective students the “right to know” (personal safety is, after all, a significant concern for prospective students), how useful is that first number?  To what extent does it make sense for institutions to be held accountable for things which do not occur on their property?

And that’s assuming perfect data, which really doesn’t exist.  The problems multiply exponentially when you decide to rely on sub-standard data.  And according to a recent Request for Proposals placed on the government tenders website MERX, the Government of Ontario is planning to rely on some truly awful data for its future work on this file.

Here’s the scoop: the Ministry of Advanced Education and Skills Development is planning to do two surveys: one in 2018 and one in 2024.  They plan on getting contact lists of emails of every single student in the system – at all 20 public universities, 24 colleges, and 417 private institutions – and handing them over to a contractor so it can do a survey.  (This is insane from a privacy perspective – the much safer way to do this is to get institutions to send out an email to students with a link to the survey, so the contractor never sees any names without students’ consent.)  Then they are going to send an email to all those students – close to 700,000 in total – offering $5 per head to answer a survey.

It’s not clear what Ontario plans to do with this data.  But the insistence that *every* student at *every* institution be sent the survey suggests to me that they want the option to analyze, and perhaps publish, the data from this anonymous voluntary survey on a campus-by-campus basis.

Yes, really.

Now, one might argue: so what?  Pretty much every student survey works this way.  You send out a message to as many students as you can, offer an inducement, and hope for the best in terms of response rate.  Absent institutional follow-up emails, this approach probably gets you a response rate between 10 and 15% (a $5 incentive won’t move that many students).  Serious methodologists grind their teeth over those kinds of low numbers, but increasingly this is the way of the world.  Phone polls don’t get much better than this.  The surveys we used to do for the Globe and Mail’s Canadian University Report were in that range.  The Canadian University Survey Consortium does a bit better than that because of multiple follow-ups and strong institutional engagement.  But hell, even StatsCan is down to a 50% response rate on the National Graduates Survey.

Is there non-response bias?  Sure.  And we have no idea what it is.  No one’s ever checked.  But these surveys are super-reliable even if they’re not completely valid.  Year after year we see stable patterns of responses, and there’s no reason to suspect that the non-response bias differs across institutions.  So if we see differences in satisfaction of ten or fifteen percent from one institution to another, most of us in the field are content to accept that finding.

So why is the Ministry’s approach so crazy, when it’s just using the same one as everyone else?  First of all, the stakes are completely different.  It’s one thing to be named an institution with low levels of student satisfaction.  It’s something completely different to be called the sexual assault capital of Ontario.  So accuracy matters a lot more.

Second, the differences between institutions are likely to be tiny.  We have no reason to believe a priori that rates differ much by institutions.  Therefore small biases in response patterns might alter the league table (and let’s be honest, even if Ontario doesn’t publish this as a league table, it will take the Star and the Globe about 30 seconds to turn it into one).  But we have no idea what the response biases might be and the government’s methodology makes no attempt to work that out.

Might people who have been assaulted be more likely to answer than those who have not?  If so, you’re going to get inflated numbers.  Might people have reasons to distort the results?  Might a men’s rights group encourage all its members to indicate they’d been assaulted, to show that assault isn’t really a women’s issue?  With low response rates, it wouldn’t take many respondents for that tactic to work.
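
To see how little it takes, here is a minimal simulation sketch.  Every parameter – the true rate, the response rates – is invented for illustration; the point is only that differential response, not actual prevalence, can drive the campus-level differences.

```python
import random

# Two campuses with an IDENTICAL true assault rate, but victims respond to a
# voluntary survey at slightly different rates. All parameters are invented.
random.seed(1)

def observed_rate(n_students, true_rate, rr_victims, rr_others):
    """Estimated rate from a voluntary survey with differential response:
    victims answer with probability rr_victims, everyone else rr_others."""
    responses = []
    for _ in range(n_students):
        assaulted = random.random() < true_rate
        responds = random.random() < (rr_victims if assaulted else rr_others)
        if responds:
            responses.append(assaulted)
    return sum(responses) / len(responses)

TRUE_RATE = 0.05  # identical on both campuses
print(observed_rate(20000, TRUE_RATE, rr_victims=0.15, rr_others=0.10))  # ~7%
print(observed_rate(20000, TRUE_RATE, rr_victims=0.25, rr_others=0.10))  # ~12%
# Same underlying rate, but the campus whose victims respond a bit more
# readily looks substantially "worse" in any league table.
```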

The Government is never going to get accurate overall response rates from this approach.  They might, after repeated tries, start to see patterns in the data: sexual assault is more prevalent in institutions in large communities than in small ones, maybe; or it might happen more often to students in certain fields of study than others.  That might be valuable.  But if the first time the data is published all that makes the papers is a rank order of places where students are assaulted, we will have absolutely no way to contextualize the data, no way to assess its reliability or validity.

At best, if the data is reported only system-wide, it will be weak.  A better alternative would be to go with a smaller random sample and better incentives, so as to obtain higher response rates.  But if it remains a voluntary survey *and* there is some intention to publish on a campus-by-campus basis, then it will be garbage.  And garbage data is a terrible way to support good policy objectives.
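
For a sense of why the smaller random sample is the better trade, here is a quick sampling-error calculation, assuming simple random sampling and an illustrative 5% prevalence (both assumptions, not figures from the RFP):

```python
import math

# Sampling error for a 5% prevalence estimate at various sample sizes
# (95% confidence, simple random sampling assumed). Note what it does NOT
# show: non-response bias, unlike sampling error, does not shrink as n
# grows -- which is the real problem with the census-style approach.
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

for n in (1000, 5000, 100000):
    print(n, f"+/- {100 * margin_of_error(0.05, n):.2f} pts")
# 1,000 respondents: +/- ~1.35 pts; 5,000: +/- ~0.60; 100,000: +/- ~0.14.
# Precision gains flatten out fast; the bias from voluntary response remains.
```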

Someone – preferably with a better understanding of survey methodology – needs to put a stop to this idea.  Now.

February 22

Notes for the NDP Leadership Race

As contestants start to jump into the federal NDP leadership race, it’s only a matter of time before someone starts promising free tuition to all across the land.  Now, I’m not going to rehash why free tuition is both regressive and undesirable (though if you really want to take a gander through the archives on free tuition, have a look here).  But I do think I can do some public service by talking about federalism and higher education, or rather: what the feds can and cannot do in this sphere.

The entire Canadian constitution is based around a compromise on education dating from 1864.  Upper Canada came to the Quebec conference with one overriding aim: representation by population in Parliament, so that its larger population would give it the most seats.  Lower Canada agreed if and only if a second, local, and equal tier of government was created which would have jurisdiction over education and health, because over their dead bodies were a bunch of (mostly) Orangemen going to get their hands on a hallowed set of (mostly) French Catholic institutions.

There’s nothing in there that stops Ottawa from giving money to individuals for the purpose of education.  This is why, despite all the Sturm und Drang, Quebec never put up a legal fight against the Canada Millennium Scholarship Foundation: Ottawa can give cash to whomever it wants, whenever it wants.  But when it comes to dealing with institutions, the feds’ ability to direct money to areas of provincial jurisdiction is subject to provincial veto.  The provinces accept (with limits, in Quebec’s case) that the feds can flow money to institutions for the purposes of academic research; hence the Canada Foundation for Innovation.  They do not accept that Ottawa can send money to institutions for operating purposes.

(Historical footnote: there was a period when nine out of ten of them were prepared to accept this.  Back in the mid-1950s, there was a ruse in which the federal government handed tens of millions of dollars every year (a lot back then) to Universities Canada – then known as the National Conference of Canadian Universities and Colleges – which would then distribute it to institutions.  In theory this was a canny work-around to the constitution.  In practice, it stalled because Duplessis blew a gasket and told Quebec universities that if they touched a dime of that money, he’d take it out of their provincial funding.  Pierre Elliott Trudeau then wrote a wonderful article in Cité Libre called “Federal Grants to Universities” explaining why Duplessis was 100% right and St-Laurent was in kookooland, constitutionally speaking.  It’s a great article; read it if you can.  Anyway, this arrangement lasted into the 1960s, when the feds got out of it and moved to per-capita grants instead.  And that door is now shut: there is no going back through it.)

Politically, there is a fantasy shared by some on the political left that the federal government can simply re-acquire policy leadership in the post-secondary field by passing an act of Parliament and adding great wodges of cash to existing transfers… with strings attached.  I’ve previously (here) torn a strip off the idea of a federal Post-Secondary Education Act, but let me focus here specifically on the idea that a generalized fiscal transfer could actually affect tuition fees.  Let’s just imagine how that discussion would go.

Ottawa: we want to give each of you money so that you bring your tuition fees to zero.  Quebec and Newfoundland, your fees are about $3000, so we’ll give you that per student…

Ontario: Our fees are $7500 a student or so.  Fork it over.

Quebec and Newfoundland: Hold it.

I could go on here about the nuances of fiscal federalism, but basically that’s the problem in a nutshell (for my American readers: in some less disastrous timeline, Hillary Clinton is facing exactly this problem as she attempts to implement her free-tuition promise for public universities).  There are ways the federal government could bribe provinces into lowering tuition.  In fact, something like that actually happened in Nova Scotia as a result of the NDP-Liberal budget deal in the minority Parliament of 2005.  But you wouldn’t necessarily get them all to lower fees by an equal amount, and you definitely wouldn’t get them all to go to zero, because they have vastly different starting points.

So, here’s the quick heads-up to all prospective New Democrat leadership candidates: even if it wanted to, the Government of Canada has no sensible way to eliminate tuition nationally.  If you do manage to form a government, this will be broken promise #1.  So don’t promise it.  Instead, think about ways to support students which don’t involve tuition.  There is a whole whack of things you could do with student assistance instead.  And the best part is: if you use student aid as a tool instead of tuition, you can channel aid to those who actually need it most.

February 21

Two Studies to Ponder

Sometimes, I read research reports which are fascinating but probably wouldn’t make for an entire blog post (or at least a good one) on their own.  Here are two from the last couple of weeks.

Research vs. Teaching

Much of the rhetoric around universities’ superiority over other educational providers is that their teachers are also at the forefront of research (which is true if you ignore sessionals, but you’d need a biblically-sized mote in your eye to miss them).  But on the other hand, research and teaching present (to some extent at least) rival claims on an academic’s time, so surely if more people “specialized” in either teaching or research, you’d get better productivity overall, right?

Anyone trying to answer this question will come up pretty quickly against the problem of how to measure excellence in teaching.   Research is easy enough: count papers or citations or whatever other kind of bibliometric outcome takes your fancy.  But measuring teaching is hard.  One line of research tries to measure the relationship between research productivity and things like student evaluations and peer ratings.  Meta-analyses show zero correlation between the two: high research output has no relationship with perceived teaching quality.  Another line of research looks at research output versus teaching output in terms of contact hours.  No surprise there: these are in conflict.  The problem with those studies is that the definitions of quality are trivial or open to challenge.  Also, very few studies do very much to control for things like discipline type, institutional type, class size, stage of academic career, etc.

So now along comes a new study by David Figlio and Morton Schapiro of Northwestern University, which has a much cleverer way of identifying good teaching.  They look specifically at professors teaching first-year courses and ask two questions: first, how do each professor’s students’ grades deviate from the average in follow-up courses in the same subject?  And second, how many students actually go on from each professor’s first-year class to major in the subject?  The first is meant to measure “deep learning”; the second, how well professors inspire their students.  Both measures are certainly open to challenge, but they are still probably better than the measures used in earlier studies.
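
For intuition, here is a toy sketch of how one might compute two measures of this flavour from course records.  The data layout, column names and values are entirely invented; this is an illustration of the idea, not the authors’ actual code, data or methodology.

```python
import pandas as pd

# Toy records: which first-year professor each student had, the student's
# grade in the follow-up course, and whether they went on to major in the
# subject. All invented for illustration.
records = pd.DataFrame({
    "prof":            ["A", "A", "B", "B", "B", "C"],
    "followup_grade":  [3.6, 3.1, 2.9, 3.3, 2.8, 3.5],
    "majored_in_subj": [True, False, False, True, False, True],
})

overall_mean = records["followup_grade"].mean()
by_prof = records.groupby("prof").agg(
    # "deep learning": how far a professor's students deviate from the
    # overall average in follow-up courses in the same subject
    deep_learning=("followup_grade", lambda g: g.mean() - overall_mean),
    # "inspiration": share of the professor's students who go on to major
    inspiration=("majored_in_subj", "mean"),
)
print(by_prof)
# One could then correlate these columns with per-professor publication
# counts to test the research/teaching relationship.
```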

Yet the result is basically the same as in those earlier studies: a better publishing record is uncorrelated with the teaching quality measures.  That is, some good researchers have good teaching outputs while others don’t.

Institutions should pay attention to this result.  It matters for staffing and tenure policies.  A lot.

Incubator Offsets

Christos Kolympiris of Bath University and Peter Klein of Baylor University have done the math on university incubators and what they’ve found is that there are some interesting opportunity costs associated with them.  The paper is gated, but a summary can be found here.  The main one is that on average, universities see a decrease in both patent quality (as measured by patent citations) and licensing revenues after establishing an incubator.  Intriguingly, the effect is larger at institutions with lower research income, suggesting that the more resources are constrained, the likelier it is that incubator funding is being drawn from other areas of the institutional research effort, which then suffer as a result.

(My guess, FWIW, is that it also has to do with limited management attention span.  At smaller institutions, there are fewer people to do oversight and hence a new initiative takes away managerial focus in addition to money).

This intriguing result is not an argument against university or polytechnic incubators; rather, it’s an argument against viewing such initiatives as purely additive.  The extent to which they take resources away from other parts of the institution needs to be considered as well.  To be honest, that’s probably true of most university initiatives, but as a sector we aren’t hardwired to think that way.

Perhaps we should be.
