Higher Education Strategy Associates

November 25

The Australian Experiment in Cutting Red Tape

One thing everybody hates is red tape – especially pointless reporting requirements which eat up time and money and deliver little to no value.  Of late, Canadian universities have been talking more and more about various types of reporting burden and how they’d really like to be freed from some of it.  For those interested in this subject, it’s instructive to see how the issue has been handled in Australia.

The peak university body in Australia (called – appropriately – Universities Australia) began the drumbeat on this issue about six years ago.  They commissioned an independent third party (self-interested note to university associations: third-party investigations give your policy positions credibility!) to produce an authoritative report on universities’ reporting requirements.  The report went into exhaustive detail on how much staff time and IT resources institutions devote to each of 18 separate data reports required by the Commonwealth government.  What it found was that the median Australian institution was spending 2,000 days of staff time and $800,000-$900,000 per year on these reports, roughly a third of which went on collecting data on research publications.

Now, that may not sound like much.  But that’s only data going to the federal ministry responsible for higher education.  It did not include reporting costs related to quality assurance bodies and submissions to the national higher education regulator(s).  It did not include the costs of research assessment exercises (and certainly didn’t count the cost of applying to various funding agencies for money, which is a whole other nightmare story in and of itself).  It did not count regulation related to state governments (Australia is a federation, but in contrast to Canada, higher education is mostly though not exclusively a federal responsibility), or anything relating to required reporting to the charities commission, reporting on government compacts (similar to Ontario’s MYAs), health agencies, the Australian Bureau of Statistics, professional registration bodies… the list goes on.

The point here is not that institutions should be free of reporting requirements.  If we want transparency and good system stewardship, we need institutions to provide a lot of data – in many cases much more data than they currently provide.  The point is that nobody is co-ordinating those data requests or making any effort to reduce overlap.  If you’re getting data and reporting requests from a dozen or more different bodies, it would be useful if those bodies spoke to each other once in a while.  Also, as a general principle, regulators should keep requests to what is absolutely needed rather than what they might just like to know.  (Appallingly, the Government of Ontario last year attached a rider to a childcare bill that gave itself the right to take any piece of data held by an Ontario university or college, including physical and mental health records – something which in my line of work is known as “as far away from good practice as humanly possible”.)

There were, I think, two key suggestions which came out of this exercise.  One was that the government should be required to post a comprehensive annual list and timetable of institutional reporting.  This was less for the universities’ benefit than the government’s: it helps to be actively reminded of other people’s reporting burdens.  The second was a very sensible suggestion that a streamlining of requirements could be handled through the creation of a national central data repository.  The design of this system is shown in the figure below.


This is similar in spirit to the way North American universities have created “common data sets” in reaction to requests for information from rankers and guide-book makers; where it differs is that it brings data customers into the heart of the data collection process, and it explicitly requires them to put data out into statistical reports for public consumption.  In other words, part of the quid pro quo for more streamlined reporting is more transparent reporting.  Which is a lesson I think Canadian institutions should take to heart.

The results of this were mixed.  The government held its own hearings on regulation, which led to significant cuts to the main higher education regulator, TEQSA.  This left universities somewhat relieved (they got a much lighter-touch regulator as a result) and somewhat horrified (while they liked a light touch for themselves, they were panicked at the prospect of light-touch regulation for the country’s many private providers).  As for the report commissioned by Universities Australia, while the government responded to the review in a very positive manner, very little in terms of concrete change seems to have happened.

Still, reports like these change the tone of the discussion around higher education, at least for a time.  It would be useful to try something similar here – especially in Ontario.

November 24

Who’s More International?

We sometimes think about international higher education as being “a market”. This is not quite true: it’s actually several markets.

Back in the day, international education was mostly about graduate students, specifically at the doctoral level. Students did their “basic” education at home and then went abroad to get research experience or simply to emigrate and become part of the host country’s scientific structure. Nobody sought these students for their money; to the contrary, these students were usually getting paid in some way by their host institution. They were not cash cows, but they did (and still do) contribute significantly to their institutions in other ways, primarily as laboratory workhorses.

In this market, the United States was long the champion, since its institutions were the world’s best and could attract top students from all over the world. In absolute terms, it is still the largest importer of doctoral students. But in percentage terms, many other countries have surpassed it. Most of them, like Switzerland, are pretty small, and small absolute numbers of international students nevertheless make up a huge proportion of the student body (in this case, 55%). The UK and France, however, are both relatively large markets, and despite their size they now lead the US in terms of the percentage of doctoral students who are international (42% and 40% vs. 35%). Canada, at 27%, is right around the OECD average.

Figure 1: International Students at Doctoral Level as Percentage of Total

Let’s turn now to Master’s students, who most definitely *are* cash cows. Master’s programs are short degrees, mainly acquired for professional purposes, and thus people are prepared to pay a premium for good ones. The biggest markets here are in fields like business, engineering and some social sciences. Education could be a very big market for international Master’s students, but tends not to be, because few countries (or institutions, for that matter) seem to have worked out the secret for international programs in what is, after all, a highly regulated profession. In any case, this market segment is where Australia and the UK absolutely dominate, with 40% and 37% of their students being international. Again, Canada is a little better than the OECD average (14% vs. 12%).

Figure 2: International Students at Master’s Level as Percentage of Total

Figure 3 turns to the market which is largest in absolute terms: undergraduate students. Percentages here tend to be smaller because domestic undergraduate numbers are so large, but we’re still talking about international student numbers in the millions. The leader here is – no, that’s not a misprint – Austria, at 19% (roughly half of them come from Germany – for a brief explainer see here). Other countries at the top will look familiar (Great Britain, New Zealand, Australia), and Canada doesn’t look too bad at 8% (which strikes me as a little low) compared to an OECD average of 5%. What’s most interesting to me is the US number: just 3%. That’s a country which – in better days, anyway – has an enormous amount of room to grow its international enrollment, and if it hadn’t just committed an act of immense self-harm would have been a formidable competitor for Canada for years to come.

Figure 3: International Students at Bachelor’s Level as Percentage of Total


Finally, let’s look at sub-baccalaureate credentials, or as the OECD calls them, “short-cycle” programs. These are always a little complicated to compare, because countries’ non-university higher education institutions and credentials are so different. Many countries (e.g. Germany) do not even have short-cycle higher education (they have non-university institutions, but those still give out Bachelor’s degrees). In Canada, obviously, the term refers to diplomas and certificates given out by community colleges. And Canada does reasonably well here: 9% of students are international, compared to 5% across the OECD as a whole. But look at New Zealand: 24% of their college-equivalent enrollments are made up of international students. Some of those will be going to their Institutes of Technology (which in general are really quite excellent), but some will also be students from various Polynesian nations coming to attend one of the Māori Wānanga.

Figure 4: International Students in Short-Cycle Programs as Percentage of Total


Now, if you look across all these categories, two countries stand out as doing really well without being one of the “usual suspects” like Australia or the UK. One is Switzerland, which is quite understandable. It’s a small nation with a few really highly-ranked universities (especially ETH Zurich), it is bordered by three of the biggest countries in the EU (Germany, France, Italy), and it provides higher education in each of their national languages. The more surprising one is New Zealand, which is small, has good higher education but no world-leading institutions, and is located in the middle of nowhere (or, at least, 5,000 miles from the nearest country which is a net exporter of students). Yet it seems able to attract very significant (for them, anyway) numbers of international students in all the main higher education niches. That’s impressive. Canadians have traditionally focused on what countries like Australia and the UK are doing in international higher education because of their past track record. But on present evidence, it’s the Kiwis we should all be watching – and in particular their very savvy export promotion agency, Education New Zealand.

Wellington, anyone?

November 23

Persuading High School Students

Over the years, a lot of people have surveyed incoming university students to find out why they chose a particular institution.  Most of these surveys contain a battery of questions about influencers: i.e. what were the sources of information that a student used to make their decision.  What researchers are looking for, usually, is some indication that school websites or career fairs or Maclean’s rankings or whatever are actually having some impact.  But year after year, students essentially give the same two answers for “top influencers”: namely, “family and friends”.  This doesn’t really help institutions because they have no idea what family and friends are telling the students, where they get their information, etc.  Institutions simply want to understand how to get information about their offerings into the information pipeline.

Here at HESA Towers, we’ve been working on a program of research on this for a couple of years now.  Two years ago, we followed a couple of hundred grade 12 students for a year to look at how the timing and type of information students received changed their views about institutions over time.  This gave us some interesting insights into which sources of data, at which points in time, seemed to make a difference to students.  This year we are doing something similar both with students in grade 11 (one of our big findings in looking at grade 12 students was how many of them had their minds made up about an institution before their final year of studies) and with parents of students in grade 12.  I won’t bore you with the details here (though by all means get in touch if you want details about how to obtain our research – see the grey box below); what I want to do today instead is talk specifically about grade 12 students’ epistemology when it comes to choosing an institution.

Briefly: students know that institutions are selling them something.  From what we’ve seen, they are actually quite sophisticated media consumers – very willing to question institutions and not take for granted what they see on websites.  Actually, not to put too fine a point on it: in general, high school students hate institutional websites.  Like, with the fire of a thousand suns.  There are few if any exceptions.

Now, precisely because students know they are being told something, their fondest wish is to be able to “look under the hood”, so to speak.  They want to be able to hear from other students what it’s like to be at a particular school.  When they do this, they are not thinking like “investors”; they are thinking like “consumers”.  If you’re going to dedicate four years of your life to something, you want to know you won’t be lonely and/or bored.  Choosing an institution is, in many ways, effectively choosing a lifestyle or a “brand” for four years: what students really want to know is whether they will meet people from whom they can learn and with whom they can have a good time.

Many institutions understand this, and their response is to make “real students” available to prospective students to explain from a credible first-person perspective what it is they can expect.  But while high school students appreciate this effort, they know there is still an information asymmetry: high school students have no way of knowing whether these chosen students are reliable guides or not.

Now, the most credible sources of information for grade 12 students are people they already know who can give them the first-hand straight dope.  They might trust adults to tell them about programs, but when it comes to student life, and explaining it in a way a high school student can understand, they’re only going to listen to kids more or less their own age who come from a similar background.  Siblings, first and foremost; but apart from that, the students who most closely meet these criteria are students from one’s own school in the graduating class one year ahead.

Now here’s the bit that I think eludes a lot of people.  A high school student does not need to speak with older classmates in order to obtain the information they need about a particular post-secondary institution.  All they need to do is register where various graduates choose to go to college or university, and they can make their own inferences about institutional brand.  Basically, if you’re a high school student and all the older kids you admired went to institution Y, then that school starts out with a huge advantage in recruiting you, even if you never spoke to any of those students about life at institution Y.  It’s wordless viral marketing, but no less effective for that.

(My son did this in a negative way: he put his efforts into avoiding the institutions which attracted the greatest number of what he considered “douchebags” from the graduating class prior to his own.  I won’t offend the institutions which got eliminated via this process, but let’s just say that via this method he concluded that Wilfrid Laurier must be a decent place to study.)

This is more or less how universities become branded without ever actually spending money on branding.  Students with particular characteristics (the jocks, the tree-huggers – whatever) choose institution X and that then affects how younger students with the same characteristics view each institution.  Breaking this cycle is very hard and goes well beyond a couple of ad campaigns.  Institutions seeking a new kind of student have to pro-actively identify and persuade a different type of student to come to their institution and in some cases actually discourage some of their more traditional students from attending (very difficult to do in an era when money is tight).  Success in this form of persuasion is very time-consuming, and takes a lot of patient work because results will take years to become apparent.  It means paying attention to many, many high schools and actually getting to know and assessing the individual students that come your way from each.

That doesn’t mean marketing to students is hopeless.  There are other parts of an institution’s value proposition that can be emphasized (employability, opportunity, etc.) in ways that will make prospective students sit up and take notice.  It’s just to say that students have already decided a lot about a school long before they first see a website or a viewbook, and marketing campaigns need to be conducted with that in mind.

November 22

Higher Salaries + Lower Workloads = More Sessionals

On Sunday night, the University of Manitoba and its faculty union hashed out a tentative deal to end a three-week strike.  No details are publicly available yet, but I think the dispute – and the likely strategies used to resolve it – are a useful way of understanding some general concepts around the economics of universities in Canada.

Directly or indirectly, institutions get their operating funds from having students sit in classrooms.  Tuition fees are directly related to credit hours and government operating grants are usually at least indirectly related to them.  One might question this in a place like Manitoba, where there is no actual funding formula and money is just handed out as a block on a historical basis, but as I showed back here the distribution of funding between Manitoba institutions actually looks almost exactly like it would if the province were using a weighted enrollment formula system like Quebec’s or Ontario’s.  So we can more or less dispense with that argument and make the simple equation “bums in seats” = revenue.

The main issues at play in the Manitoba dispute were related to salaries (faculty want more) and workload (faculty would like to limit management’s ability to increase it).  Now, if you want a big rise in pay, the university needs to find revenue to compensate.  In general, the way Canadian universities have been meeting faculty pay demands over the last six years or so is to raise enrollment, in particular international student enrollment, because it usually brings in more dollars per student.  On the whole, they’ve been reasonably successful at doing so. But the other faculty demand – maintained or reduced workloads – makes this a difficult trick to pull off.  Even if you fully accept the logic behind reducing workloads, the fact that revenue is a function of bums in seats means that faculty’s two goals are essentially incompatible.  Effectively, what is being demanded is that the university spend more and earn less.

Absent a major tuition increase, there are only two ways to square this circle.  The first, which the faculty association likes to talk about at great length, is to claim that the university can afford to do both, because there are millions of dollars either being salted away in various nefarious ways (which for the most part is nonsense – what on earth do senior administrators have to gain by not spending money?) or being frivolously spent on fixing buildings or that old favourite, “administrative bloat”.  While it’s certainly true that some non-academic expenses have been rising, an awful lot of those increases have been concentrated in areas like IT and student services rather than everyone’s favourite bogeyman, “central administration”.  Undoubtedly some savings could be found in these places and diverted to faculty salaries, but they would be unlikely to do the trick entirely.  According to data from the Financial Information of Universities and Colleges (FIUC), the U of M’s entire “academic salaries” budget was just over $158 million in 2014-15; a 6.9% increase would mean an $11 million hit in salaries alone, plus another $2 million (roughly) in benefits.  In contrast, the entire budget for salaries in central administration is $22 million.

The second way of dealing with the problem is to allow faculty salaries to rise while simultaneously lowering the average cost of instructors.  A contradiction in terms?  Well, no.  All one has to do is hire more sessionals.  Since they are remunerated at – effectively – about a quarter of the rate of a full-time professor, it’s possible both to increase bums in seats (i.e. revenue) and to keep the increase in average instructional costs well below 6.9%.
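The blended-cost arithmetic here is easy to sketch. A minimal illustration in Python – all figures are hypothetical except the roughly 4:1 full-time-to-sessional pay ratio noted above:

```python
# Hypothetical sketch: how hiring more sessionals can hold average
# instructional cost growth well below a 6.9% full-time salary hike.

def avg_cost(n_ft, n_sess, ft_salary, sess_ratio=0.25):
    """Average cost per instructor; sessionals paid sess_ratio of a full-timer."""
    total = n_ft * ft_salary + n_sess * ft_salary * sess_ratio
    return total / (n_ft + n_sess)

# Year 0: 100 full-time professors, 20 sessionals (invented numbers)
before = avg_cost(100, 20, 120_000)

# Year 1: full-timers get a 6.9% raise, and 10 extra sessionals are
# hired to teach added (revenue-generating) sections
after = avg_cost(100, 30, 120_000 * 1.069)

print(f"before: ${before:,.0f}, after: ${after:,.0f}")
print(f"average cost growth: {(after / before - 1):.1%}")
```

With these invented staffing numbers, average instructional cost rises by only about 1% even though full-time salaries went up 6.9% – which is the whole trick.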

I obviously don’t know what’s in the agreement reached Sunday night and there’s not going to be anything in the agreement which explicitly says “let’s go hire more sessionals”.  But it’s implicit in the logic of the faculty’s demands.  Universities don’t like to admit this is how they deal with faculty pay hikes because they are wary of charges of “cheapening” undergraduate education, and faculty unions don’t like to admit this is what happens because GOD FORBID their pay demands have negative externalities.  Still, both sides know exactly how this process works and neither side can claim the least bit of innocence in the process.

It’s the way the game is played.

November 21

The “Poorly Educated” and the US Election

Morning all.  Hope you’ve been well.

During the US election and its aftermath, a lot of the discussion has focused on the issue of education.  Specifically, many pollsters noted large shifts in favour of the Democrats among college-educated whites, and even larger shifts rightward among less-educated whites.  Trump’s statement during the primaries that he “love(d) the poorly educated” was, in retrospect, quite significant.  From this, many on the left have deduced that “education is more important than ever” – a statement almost perfectly calculated to feed every piece of Republican paranoia about leftist indoctrination (and hence likely to make higher education a real target in the 25 states where Republicans control both houses and the Governor’s mansion).

But I think there’s an important caveat to the education story.  Check out the map from the NYT’s excellent The Upshot, which shows in green the counties with the biggest Democrat-to-Republican shift between 2012 and 2016.


If there was an education issue at work here, it was a pretty particular one – one which seems to have manifested itself mostly around the Great Lakes.  It’s the Rust Belt, more or less: places where the economy has been in either relative or absolute decline for 50 years or more.  When this area complains about the economy, it has little to do with Obama’s policies or even trade deals – think back to movies set in this area in the 1970s, like Slap Shot or Breaking Away.  These places were already in deep trouble long before anyone even dreamed of NAFTA.  This is about communities that have been falling apart for half a century.

In the region’s glory days, it was one of the wealthiest regions on the entire globe.  And during that period, it had an unbelievably low education-to-wealth ratio – maybe the lowest of any place in the world, at any time.  Higher education?  Who needed it, when well-paying manufacturing jobs were a dime a dozen?  Human capital was for suckers.  And that, unfortunately, is an attitude which has endured.

Times changed, of course.  Underinvestment in capital put paid to the steel industry, titanic incompetence did the same for the auto industry, and energy costs reduced the competitiveness of pretty much every other sector.  Result: de-industrialization.  There’s nothing unique about this process.  It has happened in many places around the world: Flanders, Lancashire, eastern Germany.  What’s unique about the US Rust Belt is mostly how rich it was before the fall.

There was a lot of post-election commentary about how the Rust Belt’s swing to Trump was really a way of saying “we’re upset because no one listens to us”, but that’s not, strictly speaking, true.  People listen.  It’s just that literally no one knows how to effectively reverse de-industrialization.  Americans have it worse because, for a variety of reasons (most of them rooted in racism), they lack much in the way of a social safety net.  And as the American scholar Tony Carnevale is fond of saying, when a country lacks social programs, education actually becomes the safety net.

But there’s a problem with that.  In declining economies, education is no guarantee of a job, because hiring is low.  So the ones with education leave (to California, say, or New York), and the remaining population is left with lower average skill levels, making economic regeneration even harder.  People come to see education as a vehicle for personal salvation, but also potentially as an agent of community destruction: in creating human capital, communities are also priming it for export.  I’m sure many readers in Atlantic Canada will know that feeling.

And to top it off, you have to remember that what people in the area really liked about the old days wasn’t just the middle-class jobs, but the fact that mental toil wasn’t necessary.  J.D. Vance, in his recent book Hillbilly Elegy (this fall’s de rigueur read for those wanting to “get” Trump voters), notes with some sorrow that the people of his home communities in Ohio and Kentucky lack much in the way of desire for self-improvement through education.  “We don’t study as children and we don’t make our kids study as parents,” he says.  “We hillbillies need to wake the hell up.”  That awakening may happen, but it didn’t on November 8th.  In the key states that swung the election, Trump’s promise to “Make America Great Again” was taken up with the subtext “We Want Jobs That Don’t Require College Again”.

In short: more education, on its own, won’t solve the problems of de-industrialization.  And even if that weren’t true, it’s not clear that more education is a medicine everyone wants to take.

November 11

The New WSJ/Times Higher Education Rankings

Almost the moment I hit send on my last post about rankings, the inaugural Wall Street Journal/Times Higher Education rankings of US universities hit the stands.  They didn’t make a huge splash, mainly because the WSJ inexplicably decided to put the results behind its paywall (which is, you know, BANANAS), but they’re worth looking at because I think in many ways they point the way to the future of rankings in many countries.

So the main idea behind these rankings is to do something different from the US News & World Report (USNWR) rankings, which are a lot like Maclean’s rankings (hardly a surprise, since the latter was explicitly modelled on the former back in 1991).  In part, the WSJ/THE went down the same road Money Magazine did in terms of looking at output data – graduate outcomes like earnings and indebtedness – except that they were able to exploit the huge new database of institution-level data on these things that the Obama administration assembled.  In addition, they went a little further and created their own student survey to get evidence about student satisfaction and engagement.

Now, this last thing may seem like old hat in Canada: after all, the Globe and Mail ran a ranking based on student surveys from 2003 to 2012 (we at HESA were involved from 2006 onwards and ran the survey directly for the last couple of years).  It’s also old hat in Europe, where a high proportion of rankings depend at least in part on student surveys.  But in the US, it’s an absolute novelty.  Surveys usually require institutional co-operation, and organizing this among more than a thousand institutions simply isn’t easy: “top” institutions would refuse to participate, just as they won’t do the CLA, NSSE, AHELO or any measurement system which doesn’t privilege money.

So what the Times Higher team did was effectively what the Globe did in Canada thirteen years ago: find students online, independent of their institutions, and survey them there.  The downside is that the minimum number of responses per institution is quite low (50, compared with the 210 we used to use at the Globe); the very big upside is that students’ voices are being heard and we get some data about engagement.  The result was more or less what you’d expect from the Canadian data: smaller colleges and religious institutions tend to do extremely well on engagement measures (the top three for engagement were Dordt College, Brigham Young and Texas Christian).

So, I give the THE/WSJ effort high marks here.  Sure, there are problems with the data.  The “n” is low, and the resulting numbers have big error margins.  The income figures are only for those who have student loans, and include both those who graduated and those who did not.  But it’s still a genuine attempt to shift rankings away from inputs and towards processes and outputs.
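To put that low “n” in perspective: the margin of error on a survey proportion shrinks only with the square root of the sample size. A quick back-of-envelope sketch (assuming simple random sampling, which an online convenience sample is not, so true errors will be even larger):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% confidence interval half-width for a proportion (worst case p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Compare the WSJ/THE minimum (50) with the old Globe minimum (210)
for n in (50, 210):
    print(f"n = {n:>3}: +/- {margin_of_error(n):.1%}")
```

With n = 50 the margin is roughly ±14 percentage points, versus roughly ±7 at n = 210 – which is why institution-level engagement scores from small samples should be read loosely.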

The problem?  It’s still the same institutions coming in at the top.  Stanford, MIT, Columbia, Penn, Yale… heck, you don’t even hit a public institution (Michigan) until 24th position.  Even when you add all this process and outcome stuff, it’s still the rich schools that dominate.  And the reason for this is pretty simple: rich universities can stay relatively small (giving them an advantage on engagement measures) and take their pick of students, who then tend to have better outcomes.  Just because you’re not weighting resources at 100% of the ranking doesn’t mean you’re not weighting items strongly correlated with resources at 100%.

Is there a way around this?  Yes, two, but neither is particularly easy.  The first is to use some seriously contrarian indicators.  The annual Washington Monthly rankings do this, measuring things like the percentage of students receiving Pell Grants, student participation in community service, and so on.  The other way is to use indicators similar to those used by THE/WSJ, but to normalize them based on inputs like income and incoming SAT scores.  The latter is relatively easy to do, in the sense that the data (mostly) already exists in the public domain, but frankly there’s no market for it.  Sure, wonks might like to know which institutions perform best on some kind of value-added measure, but parents are profoundly uninterested in this.  Given a choice between sending their kids to a school that efficiently gets students from the 25th percentile up to the 75th percentile and sending them to a school with top students and lots of resources, finances permitting they’re going to take the latter every time.  In other words, this is a problem, but it’s a problem much bigger than these particular rankings.
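The normalization approach can be sketched in a few lines: regress the outcome on the inputs, then rank institutions by the residual, i.e. how much better they do than their intake would predict. A toy example – the institution names and every number below are invented purely for illustration:

```python
import numpy as np

# Toy data: one row per institution (all values invented)
names = ["Richford U", "Midstate U", "Valueadd U"]
avg_sat = np.array([1450.0, 1150.0, 1100.0])               # input: incoming SAT
grad_earnings = np.array([72_000.0, 48_000.0, 55_000.0])   # outcome: earnings

# Fit earnings ~ a + b * SAT by ordinary least squares
X = np.column_stack([np.ones_like(avg_sat), avg_sat])
coef, *_ = np.linalg.lstsq(X, grad_earnings, rcond=None)
predicted = X @ coef

# "Value added": actual outcome minus what intake alone would predict
residual = grad_earnings - predicted

for name, r in sorted(zip(names, residual), key=lambda t: -t[1]):
    print(f"{name:<12} {r:+,.0f}")
```

In this toy data the top-SAT school no longer tops the table: the institution whose graduates out-earn their intake-based prediction does. That is exactly the kind of result wonks find interesting and parents, apparently, do not.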

My biggest quibble with these rankings?  The WSJ inexplicably put them behind a paywall, which did much to kill the buzz.  After a lag of three weeks, THE made them public too, but it was too little, too late.  A missed opportunity.  Still, they point the way to the future, because a growing number of national-level rankings are starting to pay attention to outcomes (American rankings, remarkably, are not pioneers here: the Bulgarian national rankings got there several years ago, and with much better data).  Unfortunately, because these kinds of outcomes data are not available everywhere, and are not entirely compatible even where they are, we aren’t going to see these data sources inform international rankings any time soon.  Which is why, mark my words, literally all the interesting work in rankings over the next couple of years is going to happen in national rankings, not international ones.

November 10

Measuring Innovation

Yesterday, I described how the key sources of institutional prestige were beginning to shift away from pure research and publication towards research collaboration with industry.  Or, to put it another way, the kudos now come not from doing research alone, but from participating in the process of turning discoveries into meaningful and commercially viable products.  Innovation, in other words (though that term is not unproblematic).  But while we all have a pretty good grasp of the various ways to measure research output, figuring out how to measure an institution’s performance in terms of innovation is a bit trickier.  So today I want to look at a couple of emerging attempts to do just that.

First out of the gate in this area is Reuters, which has already published two editions of a “top 100 innovative universities” list.  The top three won’t surprise anyone (Stanford, MIT, Harvard) but the next three – Texas, Washington and the Korea Advanced Institute of Science and Technology – might:  it’s a sign at least that some non-traditional indicators are being put in the mix. (Obligatory CanCon section: UBC 50th, Toronto 57th and that’s all she wrote.)

So what is Reuters actually measuring?  Mostly, it’s patents.  Patents filed, success rates of patents filed, the percentage of patents for which coverage was sought in all three of the main patent offices (US, Europe, Japan), patent citations, patent citation impact…you get the idea.  It’s a pretty one-dimensional view of innovation.  The bibliometric bits are slightly more interesting – the percentage of articles co-written with industry partners, citations in articles originating in industry – but that maybe gets you to one and a half dimensions, tops.

Meanwhile, the THE may be inching towards an innovation ranking.  Last year, it released a set of four “innovation indicators”, but only published the top 15 on each indicator (and included in the list some institutions not usually thought of as universities, such as “Wright-Patterson Air Force Base”, the “Scripps Research Institute” and the “Danish Cancer Society”), which suggests this was a pretty quick rip-and-grab from the Scopus database rather than a long, thoughtful, detailed inquiry into the subject.  Two of the four indicators, “resources from industry” and “industry contribution” (i.e. resources from industry as a percentage of total research budget), are based on data from the THE’s annual survey of institutions, and while they may be reasonable indicators of innovation, for reasons I pointed out back here, you should intensely distrust the data.  The other two indicators are both bibliometric: “patent citations” and “industry collaboration” (i.e. co-authorships).  On the whole, THE’s effort is slightly better than Reuters’, but it is still quite narrow.

The problem is that the ways in which universities support innovation in an economic sense are really tough to measure.  One might think that counting spin-offs would be possible, but the definition of a spin-off varies quite a bit from place to place (and it’s tough to know if you’ve caught 100% of said activity).  Co-working space (that is, space where firms and institutions interact) would be another way to measure things, but it is also very difficult to capture.  Economic activity in university tech parks is another, but not all economic activity in tech parks is necessarily university- or even science-based (this is an issue in China and many developing countries as well).  The number of students engaged in firm-based work-integrated learning (WIL) activities would be great, but a) there is no common international definition of WIL and b) almost no one measures this anyway.  Income from patent licensing is easily obtainable in some countries but not others.

What you’d really want, frankly, is a summary of unvarnished opinions about the quality of industry partnerships from the businesses themselves, perhaps weighted by the size of the businesses involved (an 8 out of 10 at Yale probably means more than a 9 out of 10 at Bowling Green State).  We can get these at a national level through the World Economic Forum’s annual competitiveness survey, but not at an institutional level, which is presumably more important.  And that’s to say nothing of the value of finding ways to measure how institutions support innovation other than through industry collaboration.

Anyways, these problems are not insoluble.  They just take imagination and work.  If I were in charge of metrics in Ontario, say, I could think of many ways – some quantitative, some qualitative – that we might use to evaluate this.  Not many of them would translate easily into international comparisons.  For that to happen, a genuine international common data set would need to emerge.  That’s unlikely to happen any time soon, but it’s no reason to throw up our hands.  It would be unimaginably bad if, at the outset of an era where institutions are judged on their ability to be economic collaborators, we allowed patent counts to become the standard way of measuring success.  It’s vitally important that thoughtful people in higher education put some thought into this topic.

November 09

A Second Thought About Half-Way Through A Pretty Awful Day

Forgive the intrusion.  But our neighbour to the South electing a quasi-fascist narcissist isn’t an everyday occurrence.  There are some significant short-term consequences for Canadian higher education, and I thought I would quickly enumerate them so that debate and preparation can begin.

First, the chances of a recession in the next couple of years just shot up quite a bit.  Tearing up NAFTA also means tearing up the FTA: there will be a pause in business investment while everyone works out what on earth the new rules are going to be.  Other forms of protectionist legislation, even if not aimed at us, have the potential to wreak serious havoc as well.  Unlike in previous recessions, interest rate cuts cannot be part of our policy arsenal, as rates are already near zero.  To some people’s minds, that calls for massive Keynesian borrowing-and-spending.  But as we’ve already seen with the first round of Trudeau spending, it’s not at all clear that the intended multiplier effects work very well in a small open economy.  Long story short: provincial governments were never likely to be flush enough to grant serious relief to universities and colleges any time soon, but yesterday’s vote made such prospects even more remote.

Second, the forecast demand for Canada as an international education destination just went Through. The. Roof.  Earlier this week, the annual i-barometer global survey of education agents named Canada the #1 “hot” destination for students.  But now, with a President-elect who degrades women, despises Hispanics and Muslims, and openly consorts with anti-Semites, there is going to be a huge diversion of interest away from the United States, and (since the UK has already hung a huge “Sod Off” sign in its window) this diversion is headed towards exactly three places: New Zealand, Australia, and Canada.  One recent study suggested fully 65% of international students would be less likely to study in the US if Trump were elected.  Even if that over-states the case by a factor of two, we’re talking about a couple of hundred thousand internationally mobile students up for grabs.  Not to mention the almost-certain increase in the number of Americans heading north.

That has a couple of implications.  The main one is that Canadian universities are about to get more pricing power: no more being the discount end of North American higher education.  But we have to up our game significantly.  We have to have a real presence – not just agents – in major export markets.  And we have to improve the experience international students receive as well.  There are significant opportunities here, but also some potentially significant costs.  There’s no time like now to have a really thorough debate about internationalization on our campuses.

Third, while I have been impressed by how some prominent Americans (Jonathan Chait, Lin-Manuel Miranda) are coming out strongly this AM saying (correctly) “Screw moving to Canada, we need to stay and fight”, the fact of the matter is there are going to be a lot of faculty wanting to head north and a lot fewer of our own professors wanting to head south.  Universities will have a much better set of potential hires in front of them for the next couple of years.  This is great news, but we should try not to squander this opportunity the way we squandered the post-2008 rush north.  We can and should use the opportunity to poach selectively, but perhaps not break the bank on salaries while doing so.

(Also: I’m pretty sure we’re not going to be hearing about brain drain and the loss of talent to the US for awhile, so it’s an opportunity as well to re-calibrate some of our arguments about education and the labour market).

So the net effect here for Canadian institutions over the medium term: less government money, more opportunities in international education, and thicker academic labour markets.  On balance, it’s probably more good news than bad, provided we act deliberately and rapidly while ensuring that these moves have wide buy-in on our campuses.

But beyond the simple dollars and cents of it all, there are deeper issues.  A monster has become President of the United States.  Misery is going to fall upon the American people for the next two years if not four: on Blacks, immigrants, women, LGBTQs.  We all know people down there, know what they must be feeling today, and our hearts ache for them.  We need to show solidarity with them whenever we can.  But we also need to be vigilant here in Canada.  We are not immune to nativism and intolerance.

Last night around 11 PM, Dalhousie President Richard Florizone tweeted “When voices of intolerance are loudest don’t be despondent – be emboldened, and even more committed to values of diversity & inclusion.”  And that’s exactly right.  We have to work – and work hard – at these things, and for fairness, every day.  In the end, that kind of hard work is all that ever makes a difference.

November 09

Shifting Sources of Prestige

The currency of academia is prestige.  Professors try to increase theirs by publishing better and better papers, giving talks at conferences and so on.  Becoming more prestigious means offers to co-author with a more illustrious class of academics, increasing the chance of book deals at better university presses, etc.  And at the institutional level, universities become more prestigious by being able to attract and nurture a more prestigious group of professors, something which is done by lavishing them with higher salaries, more research funds, better equipment, better graduate students (and to a lesser degree undergraduate students too).  All this has been clear for a long time.

In any given field, we might know which ten or twenty people are at the top globally – Nobel Prize winners, for instance (speaking of which, this Freakonomics podcast on How to Win a Nobel Prize is hugely informative and entertaining on how the Swedish committees decide who really is “top of the field”).  But after that it is pretty hazy: one’s list of the top twenty health researchers who have yet to win the Nobel for Medicine probably depends a lot on what sub-field you’re in and how you evaluate the last decade’s relative progress in various other sub-fields.  Same with universities before rankings came along.  It doesn’t take a genius to work out that Toronto, McGill and UBC are the top three in Canada.  But after that it gets fuzzy.  If you were in Medicine, you might think number four was McMaster; in Engineering, Waterloo; and in Arts, Montreal or Alberta.

Then along came large bibliometric databases, and shortly thereafter, rankings.  And then we knew how to measure prestige.  We did it by measuring publications, citations, and whatnot: the more, the better.  Universities began managing towards this metric, which built on longstanding trends in most disciplines towards more demanding publication requirements for tenure (the first known use of the phrase “publish or perish” dates from 1942).  Want prestige?  Research. Publish.  Repeat.

But I get the real sense that this is starting to change, for universities if not for individual professors.  I can’t provide much strong evidence here: you won’t see the change in the usual rankings because they are hardwired for old definitions of prestige.  Nevertheless, if you look around at which universities are “hot” and receiving acclaim, it’s not necessarily the ones doing the most publishing; rather, it’s the ones that are actively contributing to the dynamism of their local economies.  MIT’s gradual overtaking of Harvard is one example of this.  But so too is the fuss over institutions like SUNY Albany and its associated nanotech cluster, or Akron and its advanced materials cluster.  In Canada, the obvious example is Waterloo, but even here in Toronto, Ryerson has become a “hot” university in part because of its focus on interacting with business in a couple of key areas such as tech (albeit in quite a different way from Waterloo).

To be clear, it’s not a case of publishing v. working with industry.  Generally speaking, companies like to know that the people they are working with are in fact at the front of their fields; no publishing, no partnership.  But it’s more of a general orientation: increasingly, the prestigious universities are the ones who not only have a concentration of science and engineering talent, but also have a sufficiently outward focus to act as an anchoring institution to one or more industrial clusters.

What’s interesting about this trend is that it has some clear winners and losers.  To even have a hope of anchoring an industrial cluster, you need to be in a mid-size city which already has some industry (even if, as in Akron’s case, it’s down in the dumps).  That works for Canada, because (Queen’s excepted) nearly all our big and prestigious universities are in mid-sized or large cities.  In the US, however, it’s more difficult.  Their universities are often older, built in a time when people believed universities were better off situated away from the “sinful” cities.  And so you have huge research institutions in places like Champaign, Illinois, or Columbia, Missouri, which are going to struggle in this new environment (even places like Madison and Ann Arbor are far enough away from big cities to make things problematic).  Basically, the Morrill Act is now imposing some pretty serious legacy costs on American higher education.

Part of the reason this shift hasn’t been more widely acknowledged is that bibliometrics are a whole lot easier to measure than economic value (and are valued more in tenure discussions).  But some people are starting to have a go at this problem, too.  More on this tomorrow.

November 08

Why I Do This Stuff

It’s Election Day in America.  It’s a day that always makes me think about how I got into this business.

Back in 1992, I was trying to stay out of a godawful job market by doing a Q-year in Economics at McGill (ended disastrously: don’t ask).  On November 2nd, I was sitting with some friends in the Shatner Building reading a New York Times story about the celebrations being planned in Little Rock for the next evening.  It was clearly going to be the biggest party in North America that week.  So we decided to go.

Some local car rental company had unwisely signed a deal with the students’ union offering any club a 3-day rental for $150, so seven of us (mostly from The Tribune) jumped into a Chevy Astro van at 5 o’clock Monday afternoon and drove 23 hours straight from Montreal to Little Rock.  We had a good time while there (I managed to blag my way into a temporary press pass which – with a little help from a local laminating shop – became my ticket to hang out in the basement of the Excelsior Hotel, following Wolf Blitzer around and listening to him bullshit with other reporters).  But for me, the real event of that trip happened the next morning.

We left Little Rock at midnight, needing to get home so some of us could take midterms on Thursday. At about 7AM, we pulled into a McDonald’s near Champaign, Illinois.  By this time, none of us had really bathed or slept properly in about 48 hours and six of the seven of us were smokers (a fairly representative percentage in Montreal in the early 90s), so there was a kind of blue haze that followed us out of the van as we trudged into a nearly-empty restaurant for some coffee and McMuffins.

“You all look like crap,” said the woman behind the counter (note: she did not say “crap”, she used a different word).  We gave her the back story on our trek, and told her that we were coming back from Little Rock.

Somewhat to our bemusement, the woman began to cry.  “You saw the President?”  At first I thought this was just an exaggerated expression of the royal-like deference Americans sometimes display towards the Commander-in-Chief.  But no.  She recovered slightly and said, “Mr. Clinton is our President and my boy is going to go to college”.

So that was it.  Amidst all the back and forth of the campaign: Gennifer Flowers, The Comeback Kid, Ross Perot jumping in, Sister Souljah, Double Bubba, Ross Perot leaving, Murphy Brown, Dan Quayle’s spelling ability, Ross Perot coming back in again, “it’s the economy, stupid”…the main thing this woman had keyed in on was that Clinton was determined to expand access to higher education through a “Domestic G.I. Bill”, by letting all students either a) borrow via an income-contingent loan or b) do national service (that never quite happened, though Clinton did manage to introduce both an income-contingent loan repayment option and AmeriCorps).

The fact that we’d been near the man who was going to make this happen was just a bit overwhelming for her.  And that made a big impression on me.  Among those who have degrees, there’s often a world-weary, cynical pose about higher education “not being worth it” – devalued degrees, crippling student debt, etc.  But for a family that’s never had someone attend post-secondary, the moment they realise they can go is something magical.  They’re not foolish enough to think that going to university or college guarantees anything, but they do know for sure – rightly – that higher education is by far the best way to get a step up to the middle class.  And if you aren’t already in the middle class, that’s a Big Deal.  Something worth devoting a career to, anyway.

And that – in part – is how a visit to an Illinois McDonald’s 24 years ago got me studying student aid and from there, higher education generally.  And why every four years I think about that morning, and that woman, and wonder whether her son succeeded in college or not.  I really wonder.
