Higher Education Strategy Associates

March 15

What Is Innovation/Innovation Policy, Anyway?

I write and tweet a lot about innovation policy, mainly with respect to my frustration with our current government’s two-dimensional views on the subject.  I’ve been meaning to write a piece on how to do innovation policy right, but based on a number of conversations I’ve had with folks, I think it’s important first to deal with two questions: what is innovation, and what is innovation policy?  Because frankly these terms are getting slung around with such abandon that they appear to have lost all meaning, and many people are simply dismissing the policy area as a large waste of time.

Sometimes, it’s hard not to sympathize with this point of view.  This week in a Vancouver Sun op-ed, UBC President Santa Ono described innovation as “a never-ending exchange between the realities of today and the potential of tomorrow.”  To which I think most people would respond with a healthy “you what, mate?”

More helpfully, Ono also included in his article the OECD definition of Innovation: “the implementation of a new or significantly improved product (good or service) or process, a new marketing method, or a new organizational method in business practices, workplace organization or external relations.”  Even apart from any methodological difficulties in tracking this, it is still a bit tricky as a definition.  For instance, not everything which is new is beneficial or adds value. Crystal Pepsi comes to mind here.

But more broadly, the best way to think of innovation as a policy goal is this: can a country/province/whatever become a place where people can put new ideas into practice easily and quickly?  It doesn’t have to be a “new product” (although Canadian governments sometimes act like this is what it means); it can also be a new process.  And it need not be strictly in the commercial sphere; innovation in the public sector is important, too.

Or, at a broader level, how can we be more like Estonia and less like Greece?

Now, as you can imagine, this is tricky because no one actually knows how to be more like Estonia (or Denmark, or Finland – pick a Nordic-ish country).  We can sort of describe what makes them the way they are, but there is no road map to getting from here to there.  But basically the critical questions are:

  1. How do we get ideas to generate and circulate faster?  (This goes back to the Paul Romer question I posed back here).
  2. How do we get people to turn ideas into practice, creating new or improved products and processes?
  3. How do we ensure that new products and processes do not get stamped down simply for reasons of inertia or protecting vested interests?

Some of these issues lend themselves to direct government spending.  Some of them are about regulations and incentives, which governments can also change.  And finally, some of them are issues of culture (institutional and otherwise), which is altogether trickier terrain.

Governments can address the skills part of this through education.  Funding more doctoral students, attracting more top profs, etc., leads to more Highly Qualified Personnel and hence to more ideas generated at the frontier of science.  Funding of basic science also plays a role here.  Governments can change the nature of secondary and undergraduate education in ways that might make workers more likely to be problem-solvers, idea generators and early adopters of other people’s new ideas/technology.  They can incentivize entrepreneurialism through tax policy, to some extent through grants, and – maybe, the jury’s still out – through education as well.  But culture plays a big role here, and credible ideas for how to shift this are thin on the ground (though I think we can all probably agree that a strategy of having Ministers go around encouraging people to “take risks” and “think outside the box” has, to put it politely, a low probability of success).

As for the third part, not squashing new products…obviously regulatory and competition policy really plays a role.  But again, part of it is cultural, and takes place inside firms and other institutions, areas where policy does not easily reach.  Take the medical products industry: how do you combine a culture of looking after patient safety with a culture of encouraging innovation (which by definition means making mistakes on the way to success)?  Or how do you get companies to pursue innovations which may make existing profitable product lines less profitable?

Evidently, this is complicated stuff.  In some ways it is easier to step back and say what Innovation Policy is not.  It is not Science Policy, which is about deciding how to invest in basic research – though Science Policy affects Innovation Policy for obvious reasons.  It is not Growth Policy, which is about finding the highest rates of economic growth in the short term, because Innovation Policy is in the end much more concerned with developing ideas which will matter 10-20 years out than with what will boost growth right now.  It is definitely not Industrial Policy, because it is about economy-wide pre-conditions for industry, not about picking winning industries because they seem to be “hot”.

(I have recently been informed that the Ministry of Innovation actually includes my daily blog posts in its media monitoring.  If whoever is in charge of this operation could mark up that last section IN HUGE RED INK before slipping a copy to Minister Bains, that’d be awesome.  Thanks.)

So that’s my take on the meaning of innovation and innovation policy.  Tomorrow: what an ideal policy looks like.

September 14

The Canadian Way of Study Abroad

A few years ago, I think around the time that HESA Towers ran a conference on internationalization, I realized there was something weird about the way Canadian higher education institutions talked about study abroad.  They talked about it as helping students “bridge the gap between theory and practice”, “increasing engagement”, and “hands-on learning”.

That’s odd, I thought.  That sounds like experiential learning, not study abroad.  Which is when it hit me: in Canada, unlike virtually everywhere else in the world, study abroad to a large degree is experiential learning.

In Europe, when they say study abroad, they mostly mean study at a foreign institution in the same field through the Erasmus program.  In the US, they may mean this, or they may mean studying in facilities owned by their home universities but located in different countries.  For instance, Wake Forest owns a campus in Venice, Webster University has a campus in Leiden, and the University of Dallas has one in Rome (have a browse through this list).  Basically, if your students are paying megabucks to be at a US campus, the idea is that you can’t just give them exchange semesters at some foreign public school, because who knows about the quality, the safety, etc.

But look at how Canadian institutions showcase their study abroad: McGill talks up its science station in Barbados.  University of Alberta showcases its international internships.  University of Saskatchewan has a fabulous little Nursing program which ties together practicums in East Saskatoon and Mozambique.  The stuff we like to talk about doesn’t seem to actually involve study in the sense of being in a classroom, per se.  That’s not to say our universities don’t have typical study-abroad programs: we’ve got thousands of those.  They’re just not where the sizzle is.  It’s a distinctly Canadian take on the subject.

This brings me to a point about measuring the benefits of study abroad.  Let’s take it for granted that being abroad  for a while makes students more independent, outward-looking, able to problem-solve, etc.  What is it, exactly, about being abroad that actually makes you that way?  Is it sitting in classes in a foreign country?  Is it meeting foreign people in a foreign country?  Is it meeting people from your own culture in a foreign country (too often the main outcome of study abroad programs)?  What about if you actually get to work in a foreign country? And – crucially for the design of some programs – how long does it take for the benefits to kick in?  A week?  A month?  A year?  When do diminishing returns set in?

Despite study-abroad being a multi-billion dollar niche within higher education, we actually don’t know the answer to many of these questions.  There isn’t a lot of work done which picks apart the various elements of “study abroad” to look at relative impact.  There is some evidence from Elspeth Jones in the UK that many of the benefits actually kick in after as little as 2-4 weeks, which suggests there may be cheaper ways of achieving all these purported benefits.

Of course, one of the reasons we have no answers to this is that it’s pretty hard to unpack the “treatment” involved in study abroad.  You can’t, for instance, randomly assign people to a program where they just sit in class, or force people to make friends among locals rather than among the study-abroad group.  But, for instance, it would be possible to look at impacts (using some of the techniques we talked about yesterday) based on the length of the study-abroad period.  It would be possible to compare results of programs that have students mostly sit in class with ones where they do internships.  It would be possible to examine outcomes based on whether students actually made friends among local students, a question not asked enough in study-abroad evaluation work.  It would also be possible to examine this based on destination country: are the benefits higher or lower depending on students’ proficiency in the destination country’s language?
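
To make that concrete, here is a minimal sketch of what such comparisons might look like, assuming a pooled, consortium-style dataset with one row per student (mobile and non-mobile) and some pre/post outcome measure.  The file name and column names are hypothetical; the point is simply that once data are collected on a common template, the comparisons above are straightforward.

```python
import pandas as pd

# Hypothetical pooled consortium file; all column names are illustrative only.
df = pd.read_csv("consortium_study_abroad.csv")
df["gain"] = df["post_score"] - df["pre_score"]  # change on some outcome scale

# 1. Gains by length of time abroad (0 weeks = non-mobile comparison group).
length = pd.cut(df["weeks_abroad"], bins=[-0.5, 0.5, 4, 13, 52],
                labels=["non-mobile", "up to 4 weeks", "1-3 months", "3-12 months"])
print(df.groupby(length)["gain"].agg(["mean", "count"]))

# 2. Classroom-based exchanges vs. internship-style placements.
print(df.groupby("program_type")["gain"].agg(["mean", "count"]))

# 3. Did making friends among local students matter?
print(df.groupby("made_local_friends")["gain"].mean())
```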

These questions aren’t easily answerable at the level of an individual institution – the sample size on these would simply be too small.  But one could easily imagine a consortium of institutions agreeing to a common impact assessment strategy, each collecting and sharing data about students and also collectively collecting data on non-mobile students for comparative purposes (again, see yesterday’s blog), perhaps through the Canadian Undergraduate Survey Consortium.  It would make a heck of a project.

If anyone’s interested in starting a consortium on this, let me know.  Not only would it be fun, but it might help us actually design study abroad experiences in a more thoughtful, conscious and impactful way.  And we’d find out if the “Canadian Way” is more effective than more traditional approaches.

Worth a try, I think.

July 25

The low-wage graduate problem

The week before last, the Canadian Centre for the Study of Living Standards (CSLS) put out a report (available here) on trends in low-paid employment in Canada from 1997 to 2014, where “low-paid” means full-time jobs occupied by 20-64 year olds in which the hourly earnings are less than two-thirds of the national median.  It’s an interesting and not particularly sensationalist report based on Labour Force Survey public-use microdata; however, one little factoid has sent many people into a tizzy.  Apparently, the percentage of people with Master’s degrees or PhDs who are in low-wage jobs jumped from 7.7% to 12.4%.  This has led to a lot of commentary about over-education, yadda yadda, from the Globe and Mail, the CBC, and so on.
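
For readers who like to see the mechanics, here is a minimal sketch of how that kind of measure gets built, assuming an LFS-style extract with one row per worker.  The file and column names are made up, and a real replication would use the survey weights, which this toy version ignores.

```python
import pandas as pd

# Hypothetical LFS-style microdata extract; column names are illustrative only.
lfs = pd.read_csv("lfs_extract.csv")

# Population used in the report: full-time workers aged 20 to 64.
ft = lfs[lfs["age"].between(20, 64) & (lfs["full_time"] == 1)]

# Low-wage threshold: two-thirds of the national median hourly wage.
threshold = (2 / 3) * ft["hourly_wage"].median()
ft = ft.assign(low_wage=ft["hourly_wage"] < threshold)

# Incidence overall and by highest level of education (unweighted toy version).
print(ft["low_wage"].mean())
print(ft.groupby("education_level")["low_wage"].mean())
```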

This freak-out is a bit overdone. I won’t argue that the study is good news, but I think there are some things going on underneath the numbers which aren’t given enough of an airing in the media.

As CSLS explains in great detail, the two important findings are, first, that the incidence of low-wage work in the economy has stayed more or less stable and, second, that Canadians on the whole are a lot more educated than they used to be.  This leads to a compositional paradox: even though all seven levels of education saw increases in the incidence of low wages (see Figure below), overall the fraction of Canadians in low-wage jobs dropped ever-so-slightly, from 27.9% in 1997 to 27.6% in 2014.

[Figure: Incidence of low-wage employment by level of education, 1997 vs. 2014]
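
A quick toy calculation (with invented numbers and just two education groups rather than the report’s seven) shows how this kind of compositional effect works: every group’s low-wage rate can rise while the overall rate falls, simply because the workforce shifts toward the group with the lower rate.

```python
# Invented figures, not the CSLS numbers: (low-wage rate, share of workforce in %).
groups_1997 = {"no degree": (0.35, 80), "degree": (0.08, 20)}
groups_2014 = {"no degree": (0.37, 60), "degree": (0.12, 40)}

def overall_rate(groups):
    # Overall incidence is the share-weighted average of the group rates.
    return sum(rate * share for rate, share in groups.values()) / 100

print(round(overall_rate(groups_1997), 3))  # 0.296
print(round(overall_rate(groups_2014), 3))  # 0.27 -- lower overall, though both group rates rose
```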

Now you have to be careful about interpretation here, particularly with respect to charges of “over-education”.  Yes, the proportion of grads in low-wage jobs is going up.  But the average wage income of university graduates is actually increasing: between 1995 and 2010, it rose by 6% after inflation.  And that’s while the number of people in the labour force with a university degree increased by 94%, and the proportion of the labour force with a university degree jumped from 19.3% to 28.7% (I would break out data on Masters/PhD specifically if I could, but public Statscan data does not separate Bachelors from higher degrees). 

What that tells us is that the economy is creating a lot more high-paying jobs which are being filled by an ever-expanding number of graduates.  But at the same time, more graduates are in low-wage jobs, which suggests that while averages are increasing, so is variance around the mean.
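
To see how both can be true at once, here is a second invented example: made-up hourly wages and a fixed low-wage cutoff (in the report the threshold moves with the whole-economy median), just to show a distribution whose mean rises while its lower tail grows.

```python
import statistics

threshold = 16.0  # illustrative low-wage cutoff in dollars per hour

# Invented hourly wages for ten graduates at two points in time.
wages_then = [18, 20, 22, 24, 26, 28, 30, 32, 34, 36]
wages_now = [14, 15, 22, 24, 27, 30, 34, 38, 42, 50]

for label, wages in [("then", wages_then), ("now", wages_now)]:
    mean_wage = statistics.mean(wages)
    low_share = sum(w < threshold for w in wages) / len(wages)
    print(label, round(mean_wage, 1), low_share)
# then: mean 27.0, low-wage share 0.0
# now:  mean 29.6, low-wage share 0.2 -- higher mean, fatter lower tail
```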

Another factor at work here is immigration.  Since the mid-1990s, the number of immigrants over 25 with university degrees has increased from 815,000 (23.2% of all degree holders) to 1.87 million (33% of all degree holders).  It’s not clear how many of those have graduate degrees (thanks, Statscan!), but I think it’s reasonable to assume, given the way our immigration points system works, that the immigrant share among advanced-degree holders is higher still.

The problem is that immigrants with degrees – particularly more recent immigrants – have a really hard time in the Canadian labour market, particularly at the start (see a great Statscan paper on this here).  To some extent this is rational, because the degrees and the skills they confer are genuinely not compatible with what the Canadian labour market demands (see my earlier post on this), and to some extent it reflects various forms of discrimination, but that’s not the point here.  The point is that there are over one million new immigrants with degrees from the past fifteen or so years, many of them holding credentials from overseas institutions.  The CSLS-inspired freak-out is about the fact that over the past 17 years the number of degree-holders in low-wage jobs has increased by 450,000 (of which 130,000 are at the Master’s/PhD level).  Simple logic suggests that most of the problem people are seeing in the CSLS data is more about our inability to integrate educated immigrants than it is about declining returns to education among domestic students.  I know the data CSLS uses doesn’t allow them to look at the results by where a degree was earned, but I’d bet serious money this is the crux of the problem.

So, you know, chill everybody.  Canadian graduates still do OK in the end.  And remember that comparisons of educational outcomes over time that don’t control for immigration need to be taken with a grain of salt.