HESA

Higher Education Strategy Associates

Author Archives: Alex Usher

February 20

Canada’s Rankings Run-up

Canada did quite well out of a couple of university rankings which have come out in the last month or so: the Times Higher Education’s “Most International Universities” ranking, and the QS “Best Student Cities” ranking.  But there’s actually less to this success than meets the eye.  Let me explain.

Let’s start with the THE’s “Most International” ranking.  I have written about this before, saying it does not pass the “fall-down-laughing” test, which is really the only method of testing a ranking’s external validity.  In previous years, the ranking was entirely about which institutions had the most international students, faculty and research collaborations.  These kinds of criteria inevitably favour institutions in small countries with big neighbours and disfavour big countries with few neighbours, so it was no surprise that places like the University of Luxembourg and Qatar University would top the list, and that the United States would struggle to put an institution in the top 100.  In other words, the chosen indicators generated a really superficial standard of “internationalism” that lacked credibility (Times readers were pretty scathing about the “Qatar #1” result).

Now as a result of this, the Times changed its methodology.  Drastically.  They didn’t make a big deal of doing so (presumably not wishing to draw more attention to the rankings’ earlier superficiality), but basically, i) they added a fourth set of indicators (worth 25% of the total) for international reputation, based on THE’s annual survey of academics, and ii) they excluded any institution which didn’t receive at least 100 votes in said academic survey.  (Check out Angel Calderon’s critique of the new rules here for more details, if that sort of thing interests you.)  That last one is a big one: in practice it means the universe for this ranking is only about 200 institutions.

On the whole, I think the result is a better ranking, and one that conforms more closely to what your average academic on the street thinks of as an “international” university.  Not surprisingly, places like Qatar and Luxembourg suddenly vanished from the rankings.  Indeed, as a result of those changes, fully three-quarters of the institutions that were ranked in 2016 disappeared from the rankings in 2017.  Not surprisingly, Canadian universities suddenly shot up as a result.  UBC jumped from 40th to 12th, McGill went from 76th to 23rd, Alberta from 110th to 31st, Toronto from 128th to 32nd, and so on.

Cue much horn-tooting on social media from those respective universities for these huge jumps in “internationality”.  But guys, chill.  It’s a methodology change.  You didn’t do that: the THE’s methodologists did.

Now, over to the second set of rankings, the QS “Best Student Cities”, the methodology for which is here.  The ranking comprises 22 indicators spread over six areas: “university quality” (i.e. how highly ranked, according to QS, are the institutions in that city); “student mix”, a composite of total student numbers, international student numbers and some kind of national tolerance index; “desirability”, a mix of data on pollution, safety, livability (an index compiled by the Economist) and corruption (again, a piece of national-level data); “employer activity”, mostly based on an international survey of employers about institutional quality; “affordability”; and “student view”, students’ own ratings of the city (again, from QS’s own proprietary survey data).

Again, Montreal coming #1 is partly the result of a methodology change.  This is the first year QS added student views to the mix, and Montreal does quite well on that front; eliminate those scores and Montreal comes third.  And while the inclusion of student views in any ranking is to be applauded, you have to wonder about the sample size.  QS says it gets 18,000 responses globally.  Canada represents about 1% of the world’s students, and Montreal institutions represent 10-15% of Canadian students, so if the responses were evenly distributed, there might be 20-odd responses from Montreal in the sample (there are probably more than that, because responses won’t be evenly distributed, but my point is we’re talking small numbers).  So I have my doubts about the stability of that score.  Ditto on the employer ratings, where Montreal somehow comes top among Canadian cities, which I am sure is news to most Canadians.  After all, where Montreal really wins big is on things like “livability” and “affordability”, which is another way of saying the city’s not in especially great shape economically.
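The back-of-envelope arithmetic behind that estimate can be sketched as follows; the shares are the rough figures from the paragraph above, not QS’s actual breakdown:

```python
# Rough estimate of Montreal's likely share of QS's student survey.
# All inputs are approximations cited in the text, not QS's real numbers.
total_responses = 18_000    # QS's reported global sample
canada_share = 0.01         # Canada is roughly 1% of the world's students
montreal_share = 0.125      # Montreal is roughly 10-15% of Canadian students (midpoint)

montreal_responses = total_responses * canada_share * montreal_share
print(f"~{montreal_responses:.0f} Montreal responses if evenly distributed")
```

Twenty-odd responses is far too few to yield a stable city-level score, which is the point.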

So, yeah, some good rankings headlines for Canada: but let’s understand that nearly all of it stems from methodology changes.  And what methodologists give, they can take away.

February 17

Four Mega-trends in International Higher Education – Economics

If there’s one word everyone can agree upon when talking about international education, it’s “expensive”. Moving across borders to go to school isn’t cheap, and so it’s no surprise that international education really got big after large developing countries (mainly but not exclusively China and India) started getting rich in the early 2000s.

How rich did these countries get? Well, for a while, they got very rich indeed. Figure 1 shows per capita income for twelve significant student-exporting countries, in current US dollars, from 1999 to 2011, with the year 1999 as a base. Why current dollars instead of PPP? Normally, PPP is the right measure, but this case is different, because the goods we’re looking at are themselves priced in foreign currencies. Not necessarily USD, true – but we could run the same experiment with euros and we’d see something largely similar, at least from about 2004 onwards. As a result, figure 1 captures both changes in base GDP and changes in exchange rates.

Figure 1: Per Capita GDP, Selected Student Exporting Countries, 1999-2011 (1999=100), in current USD


And what we see in figure 1 is that every country saw per capita GDP rise in USD terms, at least to some degree. The growth was least in Mexico (70% over 12 years) and Egypt (108%). But in the so-called “BRIC” countries, the growth was substantially bigger – 251% in Brazil, 450% in India, 626% in China, and a whopping 1030% in Russia (and yes, that’s from an artificially low base in Russia in 1999, ravaged by the painful transition to a market economy and the 1998 wave of bank failures, but if you want to know why Putin is popular in Russia, look no further). Without this massive increase in purchasing power, the recent flood of international students would not have been possible.
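For readers unused to base-year indices, the rebasing in figures 1 and 2 works like this; the two data points below are invented for illustration (they happen to reproduce the 626% growth figure cited for China), not actual World Bank numbers:

```python
# Rebasing per-capita GDP (current USD) to an index with 1999 = 100,
# as in Figure 1. Values are illustrative, not real national accounts data.
gdp_1999 = 1_000   # hypothetical per-capita GDP in 1999, current USD
gdp_2011 = 7_260   # hypothetical per-capita GDP in 2011, current USD

index_2011 = gdp_2011 / gdp_1999 * 100   # index value with 1999 = 100
growth_pct = index_2011 - 100            # cumulative growth over the period
print(f"2011 index: {index_2011:.0f} (growth of {growth_pct:.0f}%)")
```

Because everything is in current USD, the index mixes real growth, inflation and exchange-rate movement, which is exactly what matters for a family paying tuition priced in a foreign currency.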

But….but but but. That graph ends in 2011, which was the last good year as far as most developing countries are concerned. After that, the gradual end to the commodity super-cycle changed the terms of trade substantially against most of these countries, and in some countries local disasters as well (e.g. shake-outs of financial excess after the good years, sanctions, etc) caused GDP growth to stall and exchange rates to fall. The result? Check out figure 2. Of the 10 countries in our sample, only three are unambiguously better off in USD terms now than they were in 2011: Egypt, Vietnam, and (praise Jesus) China. Everybody else is worse off or (in Nigeria’s case) will be once the 2016 data come in.

Figure 2: Per Capita GDP, Selected Student Exporting Countries, 2011-2015 (2011=100), in current USD


Now, it’s important not to over-interpret this chart. We know that many of these countries have been able to maintain outbound student flows regardless. Yes, reduced affordability makes it harder for students to study abroad – but we also know that global mobility has continued to increase even as many countries have hit the rough economically (caveat: a lot of that is because of continued economic resilience in China, which has yet to hit the rough). Part of the reason is that if a student wants to study abroad and can’t make it to the US, he or she won’t necessarily give up on the idea of going to a foreign university or college: they might just try to find a cheaper alternative. That benefits places whose currencies have been pummelled by the USD in the last few years – places like Canada, Australia and even Russia.

In short: economics matters in international higher education, and economic headwinds in much of the world are making studying abroad a more challenging prospect than it was five years ago. But big swings in exchange rates can open up opportunities for new providers.

February 16

How to Fund (3)

You all may remember that in early 2015, the province of Ontario announced it was going to review its university funding formula.  There was no particular urgency to do so, and many were puzzled as to “why now?”  The answer, we were told, was that the Liberal government thought it could make improvements in the system by changing the funding structure.  Specifically, they said in their consultation document that they thought they could use a new formula to i) improve quality/student experience, ii) support differentiation, iii) enhance sustainability, and iv) increase transparency and accountability.

Within the group of maybe 100 people who genuinely understand this stuff, I think the scoffing over points iii) and iv) was audible as far away as the Maritimes.  Transparency and accountability are nice, but you don’t need a new funding formula to get them.  The Government of Ontario can compel institutions to provide data any time it wants to (and often does).  If institutions are “insufficiently transparent”, it means government isn’t asking for the right data.

As for enhancing sustainability?  HA!  At a system level, sustainability means keeping costs and income in some kind of balance.  Once it became clear that there was no extra government money on the table for this exercise, that tuition fees were off the table, and that the formula would not be used to in any way rein in staff salaries or pensions (as I suggested back here), everybody said “OK, guess nothing’s happening on that front” (we were wrong, as it turned out, as we’ll see in a second).  But the bit about quality, student experience and differentiation got people’s attention.  That sounded like incentivizing certain things.  Output-like things, which would need to be measured and quantified.  So the government was clearly entertaining the idea of some output-based measures, even as late as December 2015, when the report on the consultation went out (see that report here).  Indeed, the number one recommendation was, essentially, “the ministry should apply an outcomes-based lens to all of its investments”.

One year later, the Deputy Minister for Advanced Education sent out a note to all institutions which included the following passage:

 The funding formulas are meant to support stability in funding at a time when the sector is facing demographic challenges while strengthening government’s stewardship role in the sector. The formulas also look to create accountable outcomes, beyond enrollment, that reflect the Strategic Mandate Agreements (SMAs) of each institution.

 As you know, our goal is to focus our sector on high-quality student outcomes and away from a focus on growth. As such, the funding formula models are corridors which give protection on the downside and do not automatically commit funds for growth on the upside.

Some of that may require translation, but the key point does not: all of a sudden, funding formulas were not about applying an outcomes-based lens to investment, they were about “stability”.  Outcomes, yes, but only as they apply to each institution’s SMA, and no one I know in the sector thinks that the funding envelope devoted to promoting SMAs is going to be over five percent.  Which, given that tuition is over 50% of income, means that maybe, at best, about 2% of total funding might be outcome-based.  As I’ve said before, this is not even vaguely enough to affect institutional behaviour.
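The arithmetic behind that “maybe 2%” figure can be sketched like this; the grant share of income is my assumption (grants being somewhat under half of income, since tuition is over 50%), not a published number:

```python
# Rough arithmetic behind the "maybe 2%" claim: if at most 5% of the
# provincial grant envelope is tied to SMA outcomes, and the grant is a
# bit under half of total institutional income, the outcome-based share
# of total funding is tiny.
sma_share_of_grant = 0.05      # upper bound suggested in the text
grant_share_of_income = 0.45   # assumption: grants a bit under half of income

outcome_share_of_total = sma_share_of_grant * grant_share_of_income
print(f"roughly {outcome_share_of_total * 100:.1f}% of total funding")
```

At around two cents on the dollar, no university board is going to reorganize itself around those incentives.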

What happened?  My guess is it’s a mix of four things.  First, there was a change of both Minister and Deputy Minister, and that’s always a crapshoot.  Priorities change, sometimes radically.  Second, the university sector made its displeasure known.  They didn’t do it very publicly, and I have no insider knowledge of what kind of lobbying occurred, but clearly a number of people argued very strenuously that this was a Bad Idea.  One that gored oxen.  Very Bad.  Third, it finally dawned on some people at the top of the food chain that a funding formula change, in the absence of any new revenue tools, meant some institutions would win and others would lose.  And as the provincial government’s recent 180 on Toronto toll roads has shown, this low-in-the-polls government is prepared to go a long way to avoid making any new “losers”.

Finally, that “sustainability” thing came back in a new form.  But now it was no longer about making the system sustainable, but about finding ways to make sure that a few specific small institutions with precarious finances (mostly but not exclusively in northern Ontario) didn’t lose out as adverse demographics and falling student numbers began to eat into their incomes.  Hence the language about corridors “giving protection on the downside”.  It’s ridiculous for three reasons.  One, it’s a half-solution because institutions vulnerable to demographic decline lose at least as much from lost tuition revenue as they do in lost government grant.  Two, it’s a departing horse/open barn door issue: the bulk of the demographic shift has already happened and so to some extent previous losses are just going to be locked in.  Three – and this is most important – the vulnerable institutions make up maybe 8% of total enrolments.  Building an entire new funding system just to solve a problem that affects 8% of students is…I don’t know.  I’m kind of lost for words.  But I bet if you looked it up in the dictionary it would be under “ass backwards”.

And that, my friends, is how Ontario blew a perfectly good chance to introduce a sensible, modern performance-based funding system.  A real shame.  Perhaps others can learn from it.

February 15

How to Fund (2)

As I noted yesterday, in Canada we have some kind of phobia about output-based funding.  In the 1990s, Ontario and Alberta introduced, and then later killed, key performance indicators with funding attached.  Quebec used to pay some money out to institutions based on the number of degrees awarded, not just students enrolled, but they killed that a few years ago too (I’m sure the rumour that it did so because McGill did particularly well on that metric is totally unfounded).

Now, there is no doubt that the history of performance indicators in Canada hasn’t been great.  Those Ontario performance indicators from the 1990s?  They were cockamamie and deserved to die (student loan defaults as a performance measure?  Really?  When defaults are more obviously correlated with program of study, geographic location, and the business cycle?).  But even sensible measures like student completion rates get criticized by the usual suspects (hi OCUFA!), and so governments who even think about basing funding on outputs rather than inputs have to steel themselves to being accused of making institutions “compete” for funding, of creating “winners and losers,” of “neoliberalism,” yadda yadda.  You know the story.

Yet output based funding is not some kind of extremist idea.  Leave aside the nasty United States, where two-thirds of states have some kind of performance-based funding, all of which one way or another are based on student progress and completion.  Let’s look to wonderful, humane Europe, home to all ideas that are progressive and inclusive in higher education.  How do they deal with output-based funding formulae?

Let’s start with Denmark and England, both of which essentially offer 100% of their teaching-related funding on an output basis (these are both countries where institutions are funded separately for research and teaching), because although their formulas are essentially enrolment-weighted ones like Ontario’s and Quebec’s, they only fund courses which students successfully finish.  (Denmark also has another slice of teaching funding which is based on “on-time” student completion).  Students don’t finish, the institution doesn’t get paid.  Period.
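The Danish/English “pay on completion” mechanic is simple enough to sketch; the per-course rate and the records below are invented for illustration, not actual Danish or English tariffs:

```python
# Sketch of completion-based teaching funding: an enrolment-weighted
# formula where only courses students successfully finish are funded.
RATE_PER_COURSE = 1_200  # hypothetical grant per completed course

def teaching_grant(course_records: list[tuple[str, bool]]) -> int:
    """course_records: (course_id, completed). Only completions are paid."""
    return sum(RATE_PER_COURSE for _, completed in course_records if completed)

# Three enrolments, two completions: only the completions generate grant.
records = [("ECON101", True), ("MATH200", False), ("HIST150", True)]
print(teaching_grant(records))
```

The design point is that the incentive sits entirely on the output side: an enrolled student who drops out generates exactly zero teaching income.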

Roughly two-thirds of higher education funding in Finland – yes, vicious neo-liberal Finland – is output-based.  A little more than half of that comes from the student side, based on credit progression, degree completions and the number of employed graduates.  On the research side, output-based funding is based on number of doctorates awarded, publications, and the outcome of research competitions.  It’s a similar situation in the Netherlands where over half the teaching component of funding comes from the number of undergraduate and master’s degrees awarded, while well over half the research funding comes from doctorates awarded plus various metrics of research performance.

All throughout Europe we see similar stories, though few have quite as much funding at risk on performance measures as the four above.  Norway and Italy both have performance-based components (mostly based on degree completions) in their systems, involving 15-25% of total funding.  France provides five percent of its institutional funding based on the number of master’s and bachelor’s degree completions (the latter adjusted in a very sensible way for the quality of the institutions’ students’ baccalaureat results).  Think about that for a moment.  This is France, for God’s sake, a country whose public service laughs at the concept of value for money and in which a major-party Presidential candidate can advocate for a 32-hour week and not be treated as an absolute loon.  Yet they think some output-oriented funding is just fine.

I could go on: all German Länder have at least some performance-based funding, both for student completions and research output, though the structure of these incentives varies significantly.  The Czech Republic, Slovenia, and Flemish Belgium also all have performance-based systems (mainly for student completions).  New Zealand provides 5% of total institutional funding based on a variety of success/completion measures (the exact measures vary a bit, properly, depending on the type of institution).  Finally, Austria and Estonia have mission-based funding systems, but in both cases measures of research performance and student completions form part of their reporting systems.

You get the picture.  Output-based funding is common.  It’s not revolutionary.  It’s been used in many countries without much fuss.  Have there been transition teething troubles?  Yes there have (particularly in Estonia); but with a little foresight and planning those can be mitigated.

And why have they all adopted this kind of funding?  Because funding is an essential tool in steering the system.  Governments can use output-based funding to purchase institutions’ attention and get them to focus on key outcomes.  If, on the other hand, they simply hand over money based on the number of students institutions enroll, then what gets incentivized are larger institutions, not better institutions.

Ontario, with its recent formula review, had a golden opportunity to introduce some of these principles to Canada.  It failed to do so.  I’ll explain why tomorrow.

February 14

How to Fund (1)

Over the next three days, I want to talk about funding formulas.  I know I did this a couple of years ago, at the start of the Ontario funding formula review exercise (see here, here, and here), but it’s worth revisiting, partly because I’m cheesed off at how Ontario managed to botch the review, but also because I’ve been looking at funding formulas in Europe and the US for an article I’ve been writing, and it’s absolutely stunning to me how pretty much everyone except Canada has some kind of performance measurement in their formula, while we can’t because everyone is afraid of creating “winners” and “losers”.  So today I’d like to give you a kind of grand overview of how funding formulas actually work; tomorrow I’ll have an overview of how formulas work in various parts of the world; and then on Thursday I’ll come back to blow off steam about Ontario.

Are you sitting comfortably?  Then I’ll begin.

There are basically seven ways governments can hand money over to institutions.  They are, in more or less ascending order of policy sophistication:

Negotiated Budgets.  Most of the developing world works on this system.  An institution tots up its wish list for the year and shows up at the Minister’s office, which says yea or nay to a variety of requests, and that’s that.  The government is under no obligation to treat institutions in the same manner, and so “favoured” institutions often make out pretty well under this system.  This system tends to exist in countries where trust in institutions is low: effectively, it gives government a line-by-line veto over institutions’ budgets.

Historically-based Lump Sums.  This is more or less how it’s done in most Canadian provinces.  Government looks at what it gave each institution last year (which probably has at least some relationship to costs and outputs), takes a gander at provincial finances this year, and decides what everyone’s going to get in consequence.  It’s a step up on negotiated budgets in the sense that everyone gets treated more equally.  In provinces like Newfoundland and PEI, where there’s only one institution, this system makes sense (because really, why make a formula when there’s only one institution?).  It probably makes less sense in Alberta, which also uses it.

Enrolment-based Funding.  In most of North America, including Canada’s two biggest provinces, the majority of cash transferred by governments to institutions is simply based on the number of students enrolled, with more expensive programs given an extra “weight”, allegedly based on real costs (so a medicine student is worth 5x an arts student, etc.).  These weights vary quite a bit from jurisdiction to jurisdiction, so it’s a bit dubious that they are really based on “actual costs”.  It’s better to think of them as consensual fictions which are mutually convenient for both institutions’ and governments’ planning purposes.  (Intriguingly, during the Ontario funding formula discussions, one of the most urgent pleas from institutions was “don’t mess with the subject weightings”.  Make of that what you will.)
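The weighted-enrolment mechanic can be sketched as follows; the weights (beyond the 5x medicine example in the text) and the per-unit dollar rate are invented for illustration, since real values vary by jurisdiction:

```python
# A minimal sketch of weighted-enrolment funding: each program's students
# are converted into "weighted units" and funded at a common rate.
# Weights (other than medicine = 5x arts, from the text) and the rate
# are hypothetical.
WEIGHTS = {"arts": 1.0, "engineering": 2.0, "medicine": 5.0}
RATE_PER_UNIT = 8_000  # hypothetical dollars per weighted student

def enrolment_grant(enrolments: dict[str, int]) -> float:
    """Total grant = sum over programs of (students x weight) x unit rate."""
    weighted_units = sum(n * WEIGHTS[prog] for prog, n in enrolments.items())
    return weighted_units * RATE_PER_UNIT

# 1,000 arts students and 100 medicine students = 1,500 weighted units.
print(enrolment_grant({"arts": 1000, "medicine": 100}))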

Output-based Funding.  Fitfully, the world is moving to various types of output-based funding.  In the US, that means small amounts of funding based on various measures of progress/completion; in Europe, it’s quite large amounts of funding, usually some combination of student outputs and research outcomes.  Where it is based on students, the money is often weighted by the discipline from which the student graduates, just the way enrolment funding is.  Note that output-based funding is not the same thing as outcome-based funding.  There are very few places which get funded based on student employment rates or student loan default rates (though the Harris government in Ontario did give that a try for a while, and countries like Finland do incorporate employment outcomes in funding decisions at the margins).

Competitive Funding. In Canada, we traditionally think of competitive funding as occurring at the level of the individual researcher.  But increasingly, we’re seeing funding being competitive at the institutional level (think CFI, think CFREF).  In other countries – particularly those which provide money for teaching and research in two separate envelopes – this has been the norm for a long time.

Mission-based funding.  There are a few places – Austria in particular – where funding is at least partially conditional on fulfilling a particular mandate or reaching a set of goals.  (Arguably, British Columbia uses this method, but the conditions are softer than in Austria).  In some ways this is a throwback to a negotiated budget system, but with an actual check for  “return on investment”.

Other Stuff.  Governments hand out money for all kinds of reasons.  Some are recurrent, such as being a Northern university (in Ontario, anyway) or a university serving the French community. In some countries you will see special envelopes for institutions to maintain art galleries and museums.  Then there’s the stuff which is basically play-money for ministers wanting to make a few headlines: throw-away money for a new building here, a new building there, money for mental health initiatives, or entrepreneurship centres, or what have you.  Most often seen in election years.

(If you want to split hairs, there’s an eighth way: subsidizing student tuition through loans and grants.  But we’ll stick to the direct methods of subsidy for now.)

Most jurisdictions, of course, use multiple means of disbursing funds.  For instance, though a majority of the world’s jurisdictions provide funds primarily through “negotiated”, “lump-sum” or “enrolment-based formula” systems, they still often have pockets of money given out competitively, or as “other” funding.

Internationally, what is striking about Canada (and to some extent the US) is how reliant it is on methods 2 and 3, compared to Europe, which on the whole uses method 4 (output-based funding) a lot more.  I’ll show those differences in more detail tomorrow.

February 13

When Should the Education System Say “No”?

There’s an argument going on in the UK right now about re-introducing grammar schools.  Until the 1960s, grammar schools were a selective tier of the secondary system.  Everyone took exams at the age of eleven, and the most academically able were selected to go to these schools, the purpose of which (everyone understood) was to enable people to go to university.  Those who did not pass were essentially out of luck as far as further education went: their choices were circumscribed by the time they were eleven. Germany and some other central European countries still operate on this basis.  For some reason, the current government thinks it’s a good idea to go back to that system.

Like many others, I think it’s wrong for the education system to filter people at an early age.  Among other things, streaming – or any rationing by ability, really – is inevitably classist.  Yes, some poor kids will get through and get “a good education” and by some people’s lights this makes selection an “engine of mobility”.  But far more are consigned to the loser bin at an early age.  And that’s not good: you can’t ask the education system to kill people’s dreams off at such an early age.

But here’s the question: if not then, when?  Should the education system ever say no to someone’s dreams?

We used to say “no” to people a lot.  We used to fail a lot of kids out of high school, and that was OK, because hey, we had to have standards (I note with interest that Ken Coates and Bill Morrison, in their new book Dream Factories, have taken to calling near-universal high school completion rates an obvious example of “dumbing down”.  Nice.)  We used to restrict entry to university a lot.  Heck, 30 years ago we had fewer than half the number of students we have today, and the median student of today would have had trouble accessing university in the late 1980s.  In some parts of Europe, even though they have so-called “open” admissions systems (open, that is, to everyone who passes the exit examination of the top secondary-school stream, such as the baccalaureat or the abitur), it remains policy to fail out large numbers of students after first year who “can’t handle the work” – that is, say yes, then say no.

To a considerable degree, widening access is about learning how not to say no to people.  But to some extent this just puts off the day of reckoning, because after education comes the labour market and the labour market is under no obligation to say “yes” to anyone.  There are more people who want to be professors than there are tenure-track jobs, more people wanting to be lawyers (crazy but true) than there are positions at law firms, more teacher-wannabes than teaching positions.  “No” comes, eventually, at least for some.

Now some people will argue that because the labour market says “no”, the education system also needs to say no – especially when it comes to professional schools.  To these people, the expansion of law schools (or Master’s degrees in education, take your pick) is a travesty.  All those people paying for an education which doesn’t necessarily bring in a huge rate of return?  What we need to do is reduce the number of incoming students so as to raise average rates of return!  (There is a similar argument with doctoral students: there are never going to be enough academic jobs for these students, so why let them in in the first place?)

I get that argument, but to me it doesn’t wash any more than early selection washes.  Yes, there are more wannabe lawyers and teachers than available positions.  But why should anyone but law firms and schools be the ones to say no?  Why should higher education institutions be the gate-keepers?  Until you’ve actually given people a chance to succeed at a professional school, how would you know who the best lawyers/teachers will be anyway?  And how, in practice, would institutional gate-keeping not simply re-introduce class-based outcomes?

The only legitimate argument in favour of limiting enrolment, it seems to me, is if public money is at stake.  At some point, a government which feels it is not getting a good return on its investment because graduates are not getting jobs would be within its rights to stop funding new places.  But if students are spending their own money, as they do for law school, why should anyone want to stop them from spending their own money to pursue their desired career?

Yes, consumers need to be protected from mis-selling, obviously; institutions shouldn’t be allowed to mislead people about the odds of someone eventually saying “no”.   But other than that, the moral case for institutions as gate keepers isn’t much better than that for bringing back grammar schools.

February 10

Four Megatrends in International Higher Education – Demographics

Last week I noted that one of the big factors in international education was the big increase in enrolments around the world, particularly in developing countries.  Part of that big increase had to do with a significant increase in the number of youth around the world who were of “normal” age for higher education – that is, between about 20 and 24.  Between 2000 and 2010, that age-cohort grew by almost 20%, from a little over 500 million to a little over 600 million.  Nearly all (95%) of that growth came from Asia and Africa.

Figure 1: Number of People Aged 20-24, by Continent, 2000 to 2030


But as figure 1 shows, 2010 was a peak year for the 20-24 age group.  Over the course of the 2010s, numbers globally will decline by 10%, and not reach 2010 levels again until 2030 (intriguingly, this is almost exactly true for Canada, as well).  A problem for international higher education?  Well, maybe.  Demography isn’t destiny.  But to get a bit more insight, let’s look at what’s happening to the demographics within each region.

In Europe, the numbers for the 20-24 year-old group are falling drastically.  In Western Europe, the decline is relatively moderate and reflects a gradual drop in the birth rate which has been going on for about fifty years.  In Eastern Europe, the fall is more precipitous, a reflection of the fall in the birth rate during the occasionally catastrophic years of the switch from socialism to capitalism.  In Russia, youth numbers are set to drop by – ready for this? – fifty per cent (or six million people) between 2010 and 2020.

Figure 2: Number of People Aged 20-24, Selected Countries in Europe, 2000 to 2030


In East Asia, the story of the first ten years of the century was the huge increase in youth numbers in China (yes, the one-child rule was in effect, but the previous generation was so large that raw numbers continued to increase anyway).  But once we reach 2010, the process reverses itself.  China’s youth cohort drops by 40% between 2010 and 2020. Similarly, Vietnam’s drops by 20%, as does Japan’s (which additionally lost another 20% between 2000 and 2010).  Of the countries in the region, only Indonesia is still seeing some gentle growth.

Figure 3:  Number of People Aged 20-24, Selected Countries in East Asia, 2000 to 2030


The story changes as we head west in Asia.  India will continue to see rises – albeit small ones – in the number of youth through to 2030 at least.  Pakistan will see an increase of 50%, albeit from a much smaller base.  Numbers in Bangladesh will rise fractionally, while those in Turkey will stay constant.  Iran, however, is heading in the other direction; there, because of the precipitous fall in the birth rate in the 1990s, youth numbers will fall by 40% between 2010 and 2020 (i.e. on a similar scale to China) before recovering slightly by 2030.

Figure 4: Number of People Aged 20-24, Selected Countries in Southern & Western Asia, 2000 to 2030


I’m going to skip the Americas, because numbers there stay pretty constant over the whole period and the graphs therefore look pretty boring (just a bunch of lines as flat as a Keanu Reeves performance).  But here comes Africa, where youth numbers are expanding relentlessly.

Figure 5: Number of People Aged 20-24, Selected Countries in Africa, 2000 to 2030


The six countries portrayed here – Nigeria, Ethiopia, Egypt, Kenya, South Africa and Tanzania – make up just 40% of the continent’s population, but they are quite representative of the continent as a whole.  By 2030, there will be more 20-24 year-olds in Nigeria than there are in North America, and growth in numbers in Tanzania, Kenya and Ethiopia (as well as Nigeria) between 2015 and 2030 will exceed 50%.  The outliers here are South Africa, where youth cohort numbers are going to stay more or less constant, and Egypt, where the numbers drop in the 2010s before starting to grow again in the 2020s.

So what can we learn from all this?  Well, what it means is that overall, youth numbers are shifting from richer and middle-income countries to poorer ones.  While many developed countries like the US, France, Canada and the UK are more or less holding their numbers constant (or, more often, showing a dip in the 2010s and a subsequent rise in the 2020s), we are seeing big, permanent drops in numbers in places like Russia, Iran, China and Vietnam and big increases in places like Nigeria, Pakistan and Kenya.

Ceteris paribus, this is bad news for international student flows because on average, the potential client base is going to be coming from poorer countries.  But keep in mind two things: first, international education is by and large the preserve of the top five percent of the income strata anyway, so national average income may not be that big a deal.  Second, while the size of the base populations may be changing, what really matters for total numbers is the fraction of the total population which chooses to study abroad.  China is a good example here: as our data shows, the youth population is falling drastically but international student numbers are up because an increasing proportion of students are choosing to study abroad.
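The China point is just arithmetic: total outbound numbers are the product of cohort size and the share choosing to study abroad, so a rising share can more than offset a shrinking cohort.  A minimal sketch, using hypothetical figures rather than the actual Chinese data:

```python
# Illustrative sketch (hypothetical numbers, not real Chinese statistics):
# outbound students = youth cohort size x share choosing to study abroad.
pop_2010, rate_2010 = 120_000_000, 0.005   # assumed base-year cohort and share
pop_2020 = pop_2010 * 0.60                 # a 40% demographic drop
rate_2020 = 0.010                          # the outbound share doubles

outbound_2010 = pop_2010 * rate_2010       # base-year outbound students
outbound_2020 = pop_2020 * rate_2020       # outbound students after the drop

# Despite the smaller cohort, outbound numbers rise by about 20%.
growth = outbound_2020 / outbound_2010 - 1
```

Under these assumptions, outbound numbers rise about 20% even as the cohort shrinks by 40% – the participation rate, not the base population, does the work.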

Bottom line: the world youth population is now more or less stable, after decades of growth.  For international education to continue to grow means finding ways to convince people further down the income strata that study abroad is a good investment.

February 09

Skills and Youth

What with the Advisory Council on Growth’s paper on skills, and the Expert Panel on Youth Employment wrapping up, public policy is suddenly back to a focus on skills – and in particular what skills youth should have.  So, let’s talk about that.

While some in the federal government will state forcefully that they are not – repeat, NOT – going to be like the previous government and tell students what fields they should study (read: welding), literally every time skills come up they start babbling about coding, tech and whatnot.  So as near as I can tell, this government is just as directive about skills as the previous one; it’s just that a) they’re pushing a different set of skills, and b) they aren’t actively trashing programs of study they see as less valuable, the way the Tories did with sociology.

The Liberals’ urge to get everyone tech-ing is understandable, if shallow.  What’s the one part of the youth labour market where kids are doing better than ever?  Engineering and computer science.  Are tech-enabled industries the wave of the future?  Well, kinda, depending on your definition of what that means.  But let’s unpack that a little.

Consider what I would call “hard” tech skills: the people who actually do code or computer science for a living. There’s just not that many of them around.  And here’s a secret: even if Canada becomes some kind of massive tech haven, there still won’t be that many around.  It’s simply not a high-employment industry.  Defining it really ambitiously and assuming high rates of growth, these jobs might equal five percent of the labour force.  So, yeah, let’s increase the size of engineering and CS programs, a bit.  But that’s not a skills solution for the economy as a whole.  We need something for the other 95% of the population.

Now, there’s a broader set of tech skills that matter to a broader subsection of the population.  Some people call these “coding skills” but it’s actually closer to digital literacy.  Basically, people who work with databases all the time – whether they are in accounting or sales or advertising or what have you – can become more productive if they better understand the logic behind databases and have some understanding of how algorithms might improve their use.  Artists and designers can command higher salaries if they have some digital skills.  To be clear – this doesn’t mean we need more credentials in these areas.  It means we need more people in the workforce who possess these skills as part of their toolkit.  They could learn this stuff through coding schools or “bootcamps”, or maybe more colleges and universities could integrate these skills into existing programs, but more likely most people are going to acquire these skills informally.  Which is fine, as long as they have them.

But still, put those two sets of tech skills together and you’re covering maybe a quarter of the labour force.  And that’s not good enough.  What are we going to do for everyone else?

No one has a crystal ball that can help understand what jobs of the future look like.  But it does seem the case that if technology is going to be as disruptive as the tech-boosters think it will be, then a lot of jobs are going to be automated.  In fact, human employment will increasingly be concentrated in things that computers or robots cannot do.  And in the main, those are either jobs that require a wide variety of physical skills or jobs that involve judgement and empathy.  Last year, Geoff Colvin wrote a book on this subject called Humans Are Underrated, which is worth reading if you’re into this topic.

Put it this way.  We’ve got a minority of our future workers who will be working hard to make better robots and algorithms to do things humans can’t do (at least not at the price at which computers can do them).  But we’re also going to have a majority of our future workers who are going to have to work hard at making themselves irreplaceable by machines, by employing very human skills like empathy and narrative.  Why in the name of all that’s holy would we focus our energies just on the first group of workers?  Why not acknowledge what’s actually happening in the labour market and say: we’re going to work on both?

A final point about skills and youth.  As I noted back here, something really does seem to have changed in the labour market after 2008.  Full-time employment rates in particular have shifted downwards – but this is much more pronounced among the younger age groups (15-19) than it is among older ones (25-29).  This is consistent with a theory of skills-biased technological change: younger people have fewer skills than older ones.  But be careful here in equating the acquisition of skills with obtaining an education.  Employers want people who can get a job done: by and large, when they talk about “skills shortages” what they actually mean is “experienced worker shortages”, because to them acquired tacit knowledge matters at least as much as formally-acquired knowledge.   To put that a little more concisely: it’s not just that education is more important than ever; experience is also more important than ever, especially for young people.

I know the Expert Panel will be thinking about these issues, because they kindly invited me to a roundtable event last week and we talked about all this (thanks, Vass!).  But the people who really need to be thinking about these issues are colleges and universities – perhaps more the latter than the former.  Study after study over the last two decades has shown that the number one reason students attend university is to get a good job.

As I’ve just run through, jobs are about experience and skills.  Could be tech skills, could be empathy/narrative skills: either is fine.  Slowly, institutions are coming around to the idea that experience matters, and so work-integrated learning is expanding.  Great.  Hard tech skills?  We’ve got a lot of that covered.  Integrating second-level tech skills into other programs in Arts, Science and Business?  Getting there (in some places, anyway).  But the narrative/empathy stuff?  I know some people blather on about how humanities give you these skills somehow by osmosis, but do they really?  Who’s checking?  How is it being measured?  And why on earth would we want to limit that stuff to the Liberal Arts anyway?

If I were a university President, these are the kinds of things I’d be asking my Deans to think about.

February 08

New York, New York

With the Republicans in control of both Congress and the White House for at least the next two years, the fight for “free tuition” is moving to the state level.  And so to New York, where Governor Cuomo has proposed a form of “free tuition” for anyone attending the City University of New York (CUNY) or the State University of New York (SUNY) and whose family earns less than $125,000.  So what does this mean exactly?

Well, to be clear, it’s not the same kind of free tuition Hillary Clinton was offering back in the election campaign.  (There are many kinds of free tuition, as I noted back here; refresh your memory, if you like.)  Clinton was offering – with scant details – a vision where, with enough federal funds, states and their public university systems would agree to stop charging tuition fees to students from families below $125,000 in income (or, roughly, 80% of the student population).  That idea was always a little bit pie-in-the-sky: the impracticalities of it were well covered by Kevin Carey at the time.  What Cuomo is offering instead is a top-up plan to make tuition “net free”.  Basically, he’s going to offer students below the cut-off line whatever amount of grants it takes to equal the amount they pay in tuition.  This payment, to be known as an “Excelsior Scholarship” (really), is thus equivalent to tuition minus any grants the student is already receiving from the federal or state governments via the Pell grant system.
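The top-up mechanism described above can be sketched in a few lines.  This is an illustration of the logic, not the actual program rules; the dollar figures and the function name are my own assumptions based on the description in this post:

```python
# Minimal sketch of the "net free" top-up logic (hypothetical figures).
def excelsior_award(tuition, existing_grants, family_income, cutoff=125_000):
    """Top-up equal to tuition minus existing federal/state grants,
    for students from families below the income cutoff."""
    if family_income >= cutoff:
        return 0.0                            # hard step-function at the cutoff
    return max(tuition - existing_grants, 0)  # never pay more than net tuition

# A low-income student whose grants already cover tuition gets nothing extra:
excelsior_award(tuition=6_470, existing_grants=6_470, family_income=30_000)
# A higher-income student with no grants gets the full tuition amount:
excelsior_award(tuition=6_470, existing_grants=0, family_income=100_000)
```

The two calls at the bottom illustrate the perverse targeting discussed below the fold: the benefit flows entirely to students whose grants don’t already cover tuition, i.e. the better-off.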

Now, you might be saying to yourself: hey, that kind of sounds like the Ontario model.  That’s good, isn’t it?  To which the answer is: yes, it is a lot like the Ontario model.  It’s income-targeted net free tuition.  Except that a) in some respects it’s going to be more like New Brunswick’s, with a big step-function at $125,001 instead of Ontario’s nice smooth slope of benefits, and b) the threshold for getting full benefits is ludicrously high and has perverse consequences.

What do I mean by perverse consequences?  Well, the thing is that for students at the low-income end of the spectrum, federal and state grants already equal tuition.  So literally none of the money involved here is going to help them.  The biggest winners in the Cuomo proposal are precisely those people who get no grants right now – basically those from families with incomes of about $80K and up.  And yet these are the people who have the least trouble going to college right now.

The question here is: if you have a couple of hundred million dollars to spend, why would you give it to a group of people who have no issue attending in the first place?  Why not put money where it will be most effective? Columbia University’s Judith Scott-Clayton suggests there’s good evidence that money going to institutions creates better access outcomes than simply lowering the price.

Even Chile, once very keen on full “gratuidad”, has belatedly come around to this realization.  For budgetary reasons, the government was forced to limit its recent introduction of “free” tuition to students from families in the bottom six deciles of income.  This summer, the Chilean Treasury Department published cost estimates for the program.  In its present state, the fully phased-in cost of the program will be 607 billion pesos (about $1.25 billion Canadian, or about $950M American).  Adding each of the next four deciles raises the price by about 350 billion, or 58%.  That is to say, free tuition for everyone would cost over 2 trillion pesos, or over three times as much as it costs for the bottom six deciles.  That difference is equal to 1.5% of GDP.  And what would be the purpose of spending all that money?  The very fact that it costs so much is a reflection of the fact that participation from these groups is already so high that they don’t really need government help.  What kind of socialist government prioritizes handing over 1.5% of GDP to families in the top four income deciles?
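The Chilean arithmetic is easy to check on the back of an envelope (figures in billions of pesos, taken from the Treasury estimates cited above):

```python
# Back-of-envelope check of the Chilean cost figures (billions of pesos).
base_cost = 607                         # bottom six deciles, fully phased in
per_decile = 350                        # each additional decile, ~58% of the base
full_cost = base_cost + 4 * per_decile  # extending free tuition to all ten deciles

extra = full_cost - base_cost           # cost of adding the top four deciles alone
ratio = full_cost / base_cost           # how much more universal coverage costs
```

This gives a full cost of 2,007 billion pesos (“over 2 trillion”) and a ratio of about 3.3 (“over three times as much”), with the 1,400-billion difference being the 1.5%-of-GDP figure in the text.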

In short, while targeted free tuition makes a great deal of sense, it really does need to be targeted.  If targeting weakens, the program becomes more expensive and less effective.  New York’s plan, clearly, suffers from insufficient targeting.  Ontario’s plan has it about right.  But beware: the Premier occasionally muses about extending the plan to higher income groups and there’s certainly a chance such an idea will make it into the policy conversation as the provincial election approaches.  That way madness and much wasted public funding lies.

February 07

Innovation and Skills Redux

So, yesterday Federal Finance Minister Bill Morneau’s Advisory Council on Economic Growth released five (!) papers on innovation, skills, and a bunch of other things.  I’m sure there’s a lot of ink on these in today’s papers, mainly around proposals to raise the retirement age (which we actually did two years ago, except the Trudeau government reversed it, but now evidence-based policy FTW, as the kids say).  I’ll restrict myself to some brief thoughts about two areas in particular: innovation and skills.

On Innovation:  I must admit I got a bit of a thrill reading page 9 of the report, in which the Council body-slams the Innovation Minister’s ideas about geographically-based innovation “clusters”.  They’re polite about it, “applauding” the Minister for coming up with such a great idea, but then go on to say that they’ve actually read the literature and know what works, and it ain’t clusters.  Hilarious.

What do they propose instead?  Well, it’s something called “innovation marketplaces”.  What are those you ask?  Well, to quote the report they’re “centers of technology and industry activity that are developed and driven by the private sector. An innovation marketplace brings together researchers and entrepreneurs with public and private customers around a common business challenge. These marketplaces match innovation demand from corporations and governments with innovation supply from researchers and entrepreneurs. This matchmaking strengthens supply-chain relationships and the flow of information, thereby fueling further innovation.”

If you think that sounds super hand-wavy, you are not alone.  In practice, there’s some overlap with the ideas Minister Bains has been peddling for months (Artificial Intelligence!  Cleantech!), but these ideas are more focused on industry and less geographically-based, both of which are Good Things.  However, it still equates innovation with new product development, specifically in gee-whizzy tech areas, which is a Bad Thing.  (Non-gee-whizzy sectors get their due in a separate paper on growth; a Good Thing to the extent that at least the Council conceptually understands the difference between Growth Policy and Innovation Policy.  I’m yet to be convinced the Minister has such an understanding.)  So there’s some overlap in ideas but considerable differences in the kinds of programs that are supposed to get us there.

But the budget’s only a couple of weeks away.  How does this circle get squared?   Messily, I suspect.  But we’ll have to wait and see.

On Skills:  According to the report, everything is going to be solved by a new agency going by the godawful name “Futureskills Lab”.  As near as I can tell, this agency is going to be a lot like the Canadian Council on Learning was, only: i) more focused on skills than education (by “skills” they seem to mean tech skills – eight of the ten examples of skills used in the report are tech), ii) more focused on (industry-led) experimentation and dissemination and “what works” and iii) it’s also going to be handed the prize of finally sorting out all that Labour Market Information stuff that Don Drummond has been yelling about for years and no one trusts Statscan to get right.  (I kid….Don Drummond would never raise his voice).

OK, so…there’s nothing wrong with funding lots of experimentation on skills and training.  In fact, it’s a great idea.  Fantastic.  The over-focus on tech skills is <headdesk> inducing, but my guess is that reality will kick in after a year or two and we’ll get a broader and more sensible set of skills priorities.  And there’s nothing wrong with better Labour Market Information, though I’m not particularly convinced that adopting all of Drummond’s recommendations will bring us to some kind of Labour Market Nirvana. (Short version, which maybe I should elaborate in a future blog: what Drummond mostly wants is backward-looking, which is great for economic analysis, not especially helpful for job-seekers or students looking to specialize).

But why do we need a new institution to do all this?  ESDC could fund experiments and analyses thereof.  Statscan could do the LMI stuff.  What advantage does a new institution necessarily have?  I’m not saying there are no advantages: the Millennium Scholarship Foundation is an example of an arguably unnecessary institution which nonetheless was responsible for some pretty interesting policy and delivery innovations.  But the advantages are uncertain and not well-argued in the report.

And there’s another issue.  The Council is keen that FutureSkills Lab be collaborative.  Super collaborative.  Especially with the provinces.  They really like the whole Canada Institute for Health Information (CIHI) model.  Well, the thing is, the federal government did try something similar a decade ago.  It was called the Canadian Council on Learning (CCL) – remember that? It was well-intentioned, but a political disaster because the feds set it up before actually talking to the provinces, leading the latter to essentially boycott it.  More to the point, CIHI works because it is responsible (in part) to the provinces, not just the feds.  If the Council recognizes the importance of this point, it is not evident in the report, which dances back and forth between saying it should “collaborate with” the Forum of Labour Market Ministers (i.e. with provincial governments) and saying it should be “accountable” to them.

I’ll stick my neck out on this one: “accountable to” will fly, “collaborate with” will not.  If the federal government is going to take up this idea from the Council, it needs to make clear to the provinces within the next few days, if not hours, that this is going to be a 100% CIHI clone, accountable to provinces and feds alike, and not a federal creature collaborating with provinces.  If that doesn’t happen, regardless of the merits of more experimentation and better LMI data, this idea is going to be an expensive repeat of the CCL failure.  Federalism still matters.
