Higher Education Strategy Associates

September 21

The China “Crisis”

It’s no secret that China dominates the world market when it comes to sending students abroad.  About 20% of all globally-mobile students are from China; in countries like the US, Canada, and the UK, they are far and away the number one source of foreign students.  (In all three countries, Chinese students account for as many foreign students as the next four source countries combined.)

Now every once in a while – more and more frequently these days – you get some bad economic data from China, and everybody wants to be the first person to predict the coming “China Crisis”: oh Dear Lord, Chinese students are going to disappear, how will everyone cope?

To which I say: chill.  The Chinese market isn’t going anywhere, at least not for economic reasons.

If the argument is that China’s financial turmoil might lower Chinese incomes, and therefore reduce the affordability of foreign education, you need to keep in mind that Chinese families don’t fund education the way we do.  They save.  A lot.  For years.  Unlike North American families, Chinese families don’t try to make things work using their current incomes.  And so unless Zhounior’s savings were fully invested in the Shanghai stock exchange just before the crash, some short-term economic instability isn’t going to matter that much.

And if things get worse?  What if financial instability leads to political instability?  I’d say that’s more likely to lead to an increase in study abroad rather than a decrease.  For wealthy Chinese families, sending students abroad for their education is at least as much about giving kids a foot in the door for emigration as it is a tool with which to advance their careers in China.  Having your kid in a foreign university is a hedge against precisely this kind of political uncertainty.

Now, this doesn’t mean the Chinese market is impervious to decline.  The fall in the size of the Chinese university-age cohort still matters, but that’s a long-term phenomenon, not a short-term one.  The troubles that graduates have in the labour market are real, and are affecting the composition of demand for higher education.  But remember: the proportion of Chinese undergraduates who choose to study abroad every year is 1-2% of the total.  What happens in that 1-2% market is only barely related to what goes on in the mass market.  It’s like trying to guess what’s going on with Mercedes-Benz sales from the sale of Toyota Corollas.  The “mass market” looks nothing like the “elite market”.

The single thing that would most disrupt the flow of students out of China would be a sudden and noticeable increase in the availability of enrolment places at prestigious domestic institutions.  That is, either the big prestigious institutions could expand, or new institutions could join the ranks of the elite; either would reduce the demand for foreign education.  But the former flat-out isn’t happening; and the latter, while not impossible, seems unlikely under present circumstances.

In short, there are solid reasons to prepare for an eventual cresting of demand from China.  But the prospect in the short-term of a bursting of the Chinese student “bubble” is less convincing.  Plan accordingly.

September 18

Party Platform Analysis: The Conservatives

Back again for some more election platform analysis.  This week: the Conservatives.  But first, a caveat.  Part of the problem with trying to analyze party platforms in a 78-day election is that one’s rhythm gets all thrown off.  In a five-week campaign, all of the announceables are pretty much there in the first 21 days or so, so you more or less know when a party’s done announcing things.  In this election, we’re weeks into the campaign and we can’t be completely sure if the parties are done announcing things, unless, like the Greens, they actually publish the entire manifesto at once (an idea which, judging by their behaviour, the other parties find ridiculously passé).  So what I’m about to analyze is the Conservative platform as of Wednesday the 16th of September.  It’s possible there is a little more to come, but I have a feeling there isn’t – if I’m wrong, I will add some analysis later in the campaign.

Now, I should start by acknowledging that there are loads of people in PSE who won’t care a fig what the Conservatives promise, because they think the Harper record consists entirely of some kind of “War on Science”.  Long-time readers will know I’m not a fan of that theory: treatment of science and data within government (e.g. the long-form census) has been pretty horrible, but they haven’t done so badly on funding of academic science.  Arguably, by historic standards, their support has been the second-best of any government in Canadian history.  Their problem, however, is that first place goes to their immediate predecessors.

Anyways, the Tory strategy on higher education in this election seems to be to go with small, but tightly-targeted promises.  The first, released a couple of days after the election call, was a change to the Apprenticeship Job Creation Tax Credit (not to be confused with the much sillier Apprenticeship Completion Bonus). This credit targets employers, which is the right focus, since they are the ones who control the supply of apprenticeship “places”.  Currently, it provides employers with a non-refundable tax credit of up to 10% of wages paid to each first- and second-year apprentice employed, up to a maximum of $2,000 per employee.  The tweak announced on August 3rd was to include third- and fourth-year apprentices, and bump the maximum reclaimable amount to $2,500.
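For concreteness, here is a quick sketch of the credit’s mechanics as described above (the function and the wage figure are mine, for illustration; the rates and caps are the ones in the announcement):

```python
def apprenticeship_credit(wages: float, year: int, post_tweak: bool = False) -> float:
    """Employer's non-refundable Apprenticeship Job Creation Tax Credit.

    Pre-tweak: 10% of wages paid to first- and second-year apprentices,
    to a maximum of $2,000 per employee.  Post-tweak (August 3rd): third-
    and fourth-year apprentices become eligible, and the cap rises to $2,500.
    """
    eligible_years = (1, 2, 3, 4) if post_tweak else (1, 2)
    cap = 2500 if post_tweak else 2000
    if year not in eligible_years:
        return 0.0
    return min(0.10 * wages, cap)

# A third-year apprentice earning $40,000: no credit before the tweak,
# $2,500 after (10% of $40,000 = $4,000, capped).
print(apprenticeship_credit(40_000, 3))                  # 0.0
print(apprenticeship_credit(40_000, 3, post_tweak=True)) # 2500
```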

This is one of those “meh” announcements.  Does it do a lot of good?  Probably not.  The credit makes sense in first and second year because those employees are noobs who require so much supervision that they don’t always add value.  By their third and fourth year, however, apprentices are getting hired because they add value to an employer, not because there’s a tax break involved (and in any case, in a lot of companies, the people doing the taxes don’t always talk to the HR people who make hiring decisions, so the logic model here of how this increases the supply of spaces isn’t perfect).  But on the other hand, it doesn’t do a lot of harm either.  It’s small ball – I didn’t see a cost estimate for this, but it’s got to be somewhere in the $30-50 million range.

The other, better announcement had to do with improvements to the system of Canada Education Savings Grants (CESG).  You remember those?  Introduced in 1998, they initially paid a 20 cent top-up on every dollar placed in a Registered Education Savings Plan (RESP), up to a maximum of $400/year (later increased to $500).  About ten years ago the system was tweaked to create something called an A-CESG, which changed the top-up rate on the first $500 contributed to 40 cents on the dollar for families in the bottom income quartile, and 30 cents on the dollar for those in the second quartile.  In early September, the Conservatives announced they would raise those top-ups again, to 60 cents and 40 cents, respectively.
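The grant mechanics can be sketched like so (the function and the $1,500 contribution are hypothetical, and the annual grant maximum is ignored for simplicity):

```python
def cesg_top_up(contribution: float, quartile: int, proposed: bool = False) -> float:
    """CESG on a year's RESP contribution, using the rates in the post.

    Base rate: 20 cents per dollar.  The A-CESG boosts the rate on the
    first $500 contributed: 40%/30% (bottom/second income quartile) now,
    60%/40% under the Conservative proposal.
    """
    if quartile == 1:
        boosted_rate = 0.60 if proposed else 0.40
    elif quartile == 2:
        boosted_rate = 0.40 if proposed else 0.30
    else:
        boosted_rate = 0.20  # no A-CESG boost above median income
    first_500 = min(contribution, 500)
    remainder = contribution - first_500
    return boosted_rate * first_500 + 0.20 * remainder

# A bottom-quartile family contributing $1,500 in a year:
print(cesg_top_up(1500, quartile=1))                 # 400.0 under current rules
print(cesg_top_up(1500, quartile=1, proposed=True))  # 500.0 under the proposal
```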

Some of the usual suspects dismissed this announcement out-of-hand because “savings are only for the rich”.  That’s idiotic – it’s right there in the design that this money only goes to families with below-median income.  In that sense, this is a tight, targeted, progressive measure.  But as with the apprenticeship credit, you have to wonder if it’s actually going to change anything.  Why give more money to people who are already saving, rather than – say – adjusting the Canada Learning Bond (which essentially kick-starts RESPs for low-income families by making a $500 initial donation) and making it an automatic benefit, instead of an application-based one?  It’s not so much that it’s a bad promise; it’s just less effective than it could be.

This, to my mind, sort of sums up the Conservative record.  They can be counted on to do something every year for post-secondary education: just not always the most effective thing.

Next week: probably the NDP, if they’ve fully released their platform.

September 17

A Global Higher Education Rankings Cheat Sheet

As you likely noticed from the press generated by the release of the QS rankings: it’s now rankings season!  Are you at a university that seems to care about global rankings?  Are you not sure what the heck they all mean, or why institutions rank differently on different metrics?  Here’s a handy cheat-sheet to understand what each of them does, and why some institutions swear by some, but not by others.

Academic Ranking of World Universities (ARWU): Also known as the Shanghai Rankings, this is the granddaddy of world rankings (disclaimer: I sit on the advisory board), having been first out of the gate back in 2003.  It’s mostly bibliometric in nature, and places a pretty high premium on publication in a select few publications.  It also, unusually, scores institutions on how many Nobel Prizes or Fields Medals their staff or alumni have won.  It’s really best thought of as a way of measuring large deposits of scientific talent.  There’s no adjustment for size or field (though it publishes separate ratings for six broad fields of study), which tends to favour institutions that are strong in fields like medicine and physics.  As a result, it’s among the most stable rankings there are: only eleven institutions have ever been in ARWU’s top ten, and the top spot has always been held by Harvard.

Times Higher Education (THE) Rankings: As a rough guide, think of THE as ARWU with a prestige survey and some statistics on international students and staff tacked-on.  The survey is a mix of good and bad.  They seem to take reasonable care in constructing the sample and, for the most part, questions are worded sensibly.  However, the conceit that “teaching ability” is being measured this way is weird (especially since institutions’ “teaching” scores are correlated at .99 with their research scores).  The bibliometrics are different from ARWU’s in three important ways, though.  The first is that they are more about impact (i.e. citations) than publications.  The second is that said citations are adjusted for field, which helps institutions that are strong in areas outside medicine and physics, like the social sciences.  The third is that they are also adjusted for region, which gives a boost to universities outside Europe and North America.  It also does a set of field rankings.

QS Rankings: QS used to do rankings for THE until 2009, when the latter ended the partnership, but QS kept trucking on in the rankings game.  It’s superficially similar to THE in the sense that it’s mostly a mix of survey and bibliometrics.  The former is worth more, and is somewhat less technically sound than the THE’s survey, and it gets regularly lambasted for that.  The bibliometrics are a mix of publication and citation measures.  Its two distinguishing features are: 1) data from a survey of employers soliciting their views on graduate employability; and, 2) it ranks ordinally down to position 500 (other rankings only group in tranches after the first hundred or so institutions).  This latter feature is a big deal if you happen to be obsessed with minute changes in ranking order, and regularly feature in the 200-to-500 range.  In New Zealand, for instance, QS gets used exclusively in policy discussions for precisely this reason.

U-Multirank: Unlike all the others, U-Multirank doesn’t provide data in a league-table format.  Instead, it takes data provided by institutions and allows users to choose their own indicators to produce “personalized rankings”.  That’s the upside.  The downside is that not enough institutions actually provide data, so its usefulness is somewhat less than optimal.

Webometrics Rankings: As a rule of thumb, the bigger, more complicated, and more filled with rich data a university’s website is, the more important a university it is likely to be.  Seriously.  And it actually kind of works.  In any case, Webometrics’ big utility is that it ranks something like 13,000 universities around the world, and so for many countries in the developing world, it’s the only chance for them to see how they compare against other universities.

September 16

An Argument About the Effects of Tuition Reductions

At various times in the past (here, here, and here, for example), I have made the argument that lowering tuition fees is regressive because the benefits will accrue to people who are either the children of the wealthy, or people who will be wealthy, or both.  I have also said that where neither of those conditions is true (for example, some types of community college programs), there is a reasonable case for free tuition.

As a rule, people who disagree with this position make one of three tactical responses.  The first is to hurl abuse, usually with the word “neo-liberal” thrown in for good measure.  These people we can safely ignore.  The second is to take the Hugh McKenzie-CCPA route, which is to say it’s OK to have these kinds of transfers to the rich because they pay more taxes than everyone else.  This is not prima facie idiotic, but it’s a very, very difficult argument to make as a progressive.  In fact, you can only really make it through a syllogism like this: “I am progressive.  I made a statement.  Therefore the statement is progressive”.  Evaluate as you will.

But there is also a (rarer) third response, which says: “ah, but you’re only looking at ceteris paribus results.  Surely free tuition would bring all sorts of new students to the table, and change the benefit calculus.”  Now it is undeniably true that *if* there was a massive shift in demand, then my argument would be wrong.  So let’s look at that *if* – how likely is it to happen?  What would have to happen in order for such a shift to take place?

Let’s look at this logically: would lower fees make anyone less likely to want to attend higher education?  No.  So any shift is not going to come from a fall in demand from upper-income groups, it’s going to have to come from a surge in demand from lower-income youth.  That’s possible, though unproven. There is, for instance, no data from either Manitoba or Newfoundland to suggest that this is what happened when they reduced tuition over a decade ago.  But let’s assume for the moment it’s true.

Now, you have to ask the question: even if aggregate demand increases, are universities likely to take in more students as a result of fee reductions?  Unless you’re also assuming that governments are going to spend a whole extra wad of cash for expansion, on top of cash for eliminating fees (NB: the Green Party plan for free tuition in Canada does not do this; neither does the Chilean free tuition experiment), the answer here is “probably not” (or at least not much).  But if the supply of spaces is more or less fixed, then for any benefit-shifting to happen, additional students from poorer backgrounds are actually going to have to displace richer kids in order to close the gap.  Poor kids in, rich kids out.  That’s not an impossible outcome, but given that: a) universities ration places through grades; and, b) youth from higher-income families have an advantage in terms of academic preparation (go see any number of PISA studies on that one), it seems very unlikely.

But let’s suspend disbelief, and assume governments ARE in fact prepared to both reduce price and expand capacity.  What would happen then?  Well, we don’t know, really.  But we do know that governments have been expanding university capacity tremendously over the past 15 years – partly through higher funding, and partly through higher fees.  And as far as we know (and admittedly we don’t know as much as we should), access has in fact been widened, at least as far as ethno-cultural backgrounds are concerned.  But that raises a question: if you can improve access simply by increasing capacity, why not just do that instead of spending all that money to also make it free?

In short, we know a way to improve access, and it doesn’t involve making higher education free.  Conversely, we know that making higher education free, on its own, is very unlikely to change the social composition very much (i.e. it won’t be effective on its own terms), and therefore will provide extraordinary benefits to children of upper-income families.

September 15

Visible Minority Numbers Rise Sharply

I was poking around some data from the Canadian Undergraduate Survey Consortium the other day and I found some utterly mind-blowing data.  Take a look at these statistics on the percentage of first-year students self-identifying as a “visible minority” on the Consortium’s triennial Survey of First Year Students:

Figure 1: Self-Identified Visible Minority Students as a Percentage of Entering Class, 2001-2013

Crazy, right?  Must be all those international students flooding in.

Er, no.  Well, there are more students with permanent residences outside Canada, but they aren’t necessarily affecting these numbers, because they represent only about 7% of survey respondents.  If we assume that 80% of these students are themselves visible minorities, and we pull them out of the data, the visible minority numbers look like this:

Figure 2: Visible Minority Students, International* vs. Domestic, 2001-2013

*assumes 80% of students with permanent residences outside Canada are “visible minorities”
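The adjustment just described is simple enough to sketch (the 30% overall share below is purely illustrative, not the survey’s actual figure; the 7% and 80% defaults are the assumptions stated above):

```python
def domestic_vm_share(total_vm_share: float,
                      intl_share: float = 0.07,
                      intl_vm_rate: float = 0.80) -> float:
    """Back out the visible-minority share among domestic respondents from
    the overall survey share, assuming intl_share of respondents are
    international students and intl_vm_rate of those are visible minorities."""
    domestic_vm = total_vm_share - intl_share * intl_vm_rate
    return domestic_vm / (1 - intl_share)

# If 30% of all respondents self-identified as visible minorities:
print(round(domestic_vm_share(0.30), 3))  # 0.262
```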

That’s still a heck of a jump.  Maybe it has something to do with the changing demographics of Canadian youth?

Well, we can sort of track this by looking at census data on visible minorities, aged 15-24, from 2001 and 2006, and (yes, yes, I know) the 2011 National Household Survey, and then marry these up with the 2001, 2007, and 2013 CUSC data.  Not perfect, but it gives you a sense of contrasting trends.  Here’s what we find.

Figure 3: Domestic Visible Minority Students as a Percentage of Total vs. Visible Minorities as a Percentage of all 15-24 Year-Olds, 2001, 2007, 2013

So, yes, a greater proportion of domestic youth self-identify as visible minorities, but that doesn’t come close to explaining what seems to be going on here.

What about changes in the survey population?  Well, it’s true that the consortium membership isn’t stable, and that there is some movement in institutions over time.  If we just look at 2007 and 2013 – a period during which the number of visible minority students almost doubled – we can see how a change in participating schools might have shifted things.

Table 1: Schools Participating in CUSC First-Year Survey, 2007 and 2013

Here’s what stands out to me on that list.  York and Waterloo are in the 2013 survey, but were not there in 2007, which you’d think would skew the 2013 data a bit higher on visible minorities (although not that much – together, these two schools were only 7% of the total sample).  On the other hand, UBC Vancouver was there in the 2007 survey, but not 2013, which you’d think would skew things the other way.  On the basis of this, I’d say school participation probably contributed somewhat to the change, but was not decisive.

I could end this post with a call for better data (always a good thing).  But if a trend is big enough, even bad data can pick it up.  I think that might be what we’re seeing here with the increase in visible minority students.  It’s a big, intriguing story.

September 14

Better Post-Secondary Data: Game On

On Saturday morning, the US Department of Education released the College Scorecard.  What the heck is the College Scorecard, you ask?  And why did they release it on a Saturday morning?  Well, I have no earthly idea about the latter, but as for the former: it’s a bit of a long story.

You might remember that a little over a year ago, President Obama came up with the idea for the US Government to “rate” colleges on things like affordability, graduation rates, graduate earnings, and the like.  The thinking was that this kind of transparency would punish institutions that provided genuinely bad value for money by exposing said poor value to the market, while at the same time encouraging all institutions to become more attentive to costs and outcomes.

The problem with the original idea was three-fold.  First, no one was certain that the quality of available data was good enough.  Second, the idea of using the same set of ratings both for quality improvement and to enforce minimum standards was always a bit dicey.  And third, the politics of the whole thing were atrocious – the idea that a government might declare that institution X is better than institution Y was a recipe for angry alumni pretty much everywhere.

So back in July, the Administration gave up on the idea of rating institutions (though it had been quietly backing away from it for months); however, it didn’t give up on the idea of collecting and disseminating the data.  Thus, on Saturday, what it released instead was a “scorecard”; a way to look up data on every institution without actually rating those institutions.  But also – and this is what had nerds in datagasm over the weekend – it released all of the data (click “download all data” here).  Several hundred different fields worth.  For 20 years. It’s totally unbelievable.

Some of the data, being contextual, is pretty picayune: want to know which institution has the most students who die within four years of starting school?  It’s there (three separate branches of a private mechanics school called Universal Technical Institute).  But other bits of the data are pretty revealing.  School with the highest average family income? (Trinity College, Connecticut.)  With the lowest percentage of former students earning over $25,000 eight years after graduation? (Emma’s Beauty Academy in Mayaguez, PR.)  With the highest default rates? (Seven different institutions – six private, one public – have 100% default rates.)

Now, two big caveats about this data.  The first is that institutional-level data isn’t, in most cases, all that helpful (graduate incomes are more a function of field of study than institution, for instance).  The second caveat is that the information about former students and earnings relates only to student aid recipients (it’s a political/legal thing – basically, the government could look up the post-graduation earnings for students who received aid, but not for students who funded themselves).  The government plans to rectify that first caveat ahead of next year’s release; but you better believe that institutions will fight to their dying breath over that second caveat, because nothing scares them more than transparency.  As a result, while lots of the data is fun to look at, it’s not exactly the kind of stuff with which students should necessarily make decisions (a point made with great aplomb by the University of Wisconsin’s Sara Goldrick-Rab).

Caveats aside, this data release is an enormous deal.  It completely raises the bar for institutional transparency, not just in the United States but everywhere in the world.  Canadian governments should take a good long look at what America just did, and ask themselves why they can’t do the same thing.

No… scratch that.  We ALL need to ask governments why they can’t do this.  And we shouldn’t accept any answers about technical difficulties.  The simple fact is that it’s a lack of political will, an unwillingness to confront the obscurantist self-interest of institutions.

But as of Saturday, that’s not good enough anymore.  We all deserve better.

September 11

Party Platform Analysis: The Greens

So, we’ve been in this ghastly election period for several weeks now, but it’s just starting to get interesting, with parties releasing actual platforms.  I’ll be putting together briefs on each of the parties as they come out, starting today.

Let’s start with the Green Party, which is the first to have released a complete platform.  This platform is slimmer than the sprawling 185-page monstrosity the Party had up on its website for the first weeks of the campaign, which contained all sorts of fun stuff, like family policy that had been outsourced to Fathers 4 Justice.  It’s slicker, and presumably represents what the party thinks are the most salable bits of its full range of endorsed policies.

So, here’s what they say they’ll do on post-secondary education.  First, they are going to abolish tuition fees for domestic students, progressively, by 2020.  Second, they will lift the 2% cap on annual increases to the Post-Secondary Student Support Program for First Nations (though why this would be necessary if tuition were eliminated isn’t entirely clear).  Third, they have a plan to cap federal student debt at $10,000 (again, with tuition removed from the need calculation, it’s not entirely clear this would be necessary).  Fourth, they will abolish interest on student loans (what’s left of them), and increase bursaries (though why you’d need to if tuition were abolished, and loans capped, isn’t clear).  This, the platform says, will “jump-start the Canadian economy”.  On top of that, the Party says it is unacceptable that youth unemployment is twice the national average (though, in fact, internationally this is on the low side), so it will be spending $1 billion per year to employ students directly through an Environmental Service Corps.

On science policy, there is a lot about “evidence-based decision making” (though this seems to be conspicuously absent in post-secondary policy), and a promise to restore funding to scientific endeavours within the Government of Canada (e.g. at Parks Canada, Health Canada, etc.), but nothing whatsoever with respect to granting councils.

The costing for all of this is somewhat dubious.  The party puts the cost of eliminating domestic tuition at a mere $5 billion, which is about $3 billion short of where it is, and probably closer to $4 billion by the time the plan is supposed to be rolled out (seriously – just multiply the 1 million domestic university students by average tuition, and you get a number bigger than what the Greens seem to be assuming).  But on the other hand, they’ve probably over-budgeted on student debt relief; they have this costed at $2.5 billion in the year of maximum cost (it declines after that), whereas I can’t see how it would cost more than $1 billion even if they didn’t get rid of tuition (briefly: the average federal debt of those with debt over $10,000 is about $19,000, and only 46% of the 200,000 or so who graduate each year have federal debt over $10,000, so 92k x $9,000 = $828 million).  So while the Party clearly hasn’t a clue what it’s talking about in terms of costing, at least the errors aren’t all in the same direction.
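For the record, the back-of-envelope arithmetic on debt relief runs like this (a sketch using the post’s own figures, nothing more):

```python
# Rough annual cost of capping federal student debt at $10,000,
# using the figures cited in the post.
graduates_per_year = 200_000
share_over_10k = 0.46        # share of graduates with federal debt over $10,000
avg_debt_over_10k = 19_000   # average federal debt within that group
cap = 10_000

relief_per_borrower = avg_debt_over_10k - cap          # $9,000 forgiven each
affected = round(graduates_per_year * share_over_10k)  # ~92,000 borrowers
annual_cost = affected * relief_per_borrower
print(f"${annual_cost:,}")  # $828,000,000 (well under $1 billion)
```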

To its credit, the Party is planning to partially pay for this ambitious agenda by cutting $1.6 billion in education tax credits. But that still leaves a net bill of about $4.5 billion (their numbers – about $7.5 billion in actual fact) in 2019-2020.  And for what?  To make education cheaper for people who already go?  To transfer billions back to the upper middle-class?  To be – as the intro to the policy suggests – more like Germany and Austria, where access rates are actually significantly worse than they are here?

What should we think of such a platform?  Well, even if we ignore the fact that it’s a massive net transfer of wealth to the upper-middle-class (such benefits as the poor would receive from lower tuition would be counteracted by the loss of offsetting subsidies), it’s a pretty poor showing.  Is there really nothing better we could do in higher education with $8 billion than to make it cheaper?  What about using that money to hire more professors?  Do more with IT?  Invest in research?

No, apparently it’s all about cheaper.  And for what?  To be more like Germany and Austria, which have lower access rates than we do?  This is stupid policy made by people who can’t count.  The Greens can and should do better than this.

More as the parties release platform details.

September 10

Improving the Discourse on Skills and Education

Recently, I did a fascinating set of roundtable discussions with employers and employer associations, and it brought home to me how one-dimensional much of our talk is regarding skills.

Broadly speaking, there are four sets of skills employers care about.  The first are job- or occupation-related skills: can a mechanic actually fix a car? Can an architect design buildings? And so on.  By and large, if you ask employers whether universities and colleges are successfully providing their graduates with this skill set, they say yes (in some fields, in some parts of the country, there are complaints that there aren’t enough graduates, but that’s a different story).  And that’s true more or less across blue-collar and white-collar occupations.

Then there’s a set of skills that, in Canada, go by the name: “essential skills” or “foundational skills”.  Most of this is basic literacy/numeracy stuff, but with communication, basic teamwork, and (increasingly) IT skills in there as well.  Here, Canada has a problem, and employers are not shy when it comes to talking about this.  Secondary school dropouts and recent immigrants who have yet to fully master one of our official languages tend to have the most problems with these skills, and the issue is concentrated in certain industries and occupations.  This tends to affect blue-collar jobs more than white-collar ones, but it’s also an issue in lower-level health and social service occupations (especially IT skills).

The third set of skills often gets called “soft-skills” or “integrative skills”.  This involves workplace savvy, primarily in white-collar industries: knowing how to act with clients, basic business and financial skills, how to operate in a multi-disciplinary/multi-functional team, and those somewhat nebulous qualities of critical thinking and problem-solving.  Basically, this is the stuff that Arts faculties claim to give you: the integrative thinking skills that keep businesses running.  They’re not the skills that get you hired, but they’re the skills that get you promoted.  Again, this is an area where employers who need these skills voice frustration with new graduates.

Finally, there are what get termed “leadership skills”.  It’s not always 100% clear what employers mean by this, but it is usually thought of as being different from (and of a higher order than) the integrative thinking skills.  Again, this isn’t desired across the board: companies are hierarchies, and not everyone is at the top, so it’s actually a set of traits necessary in only a few.  But again, companies see these as lacking in young people, though, to be honest, young grads don’t have the experience to be put in leadership roles, and so it’s actually something they’ll need to develop a few years into their careers.

Now, when someone in business starts talking about a skills crisis, we mostly assume they mean that their new hires are lacking some set of skills, and lots of people (some of them inside the system itself) therefore use this as a stick with which to beat educational institutions for not doing their jobs.  But usually, when business says it needs skills, what it actually means is that it needs experienced workers with lots of on-the-job skills.  As such, what educational institutions can contribute in the short-term is pretty marginal.

But even to the extent that institutions can contribute – say, over the medium-to-long term – a simple desire for “more skills” doesn’t help very much.  Skills profiles vary enormously from occupation-to-occupation, and so too do perceptions of which skills are missing in each one.  Even within a single company, needs may vary substantially from one job to the next.  Getting business to be more specific about needs is a huge and urgent task.

Exacerbating this problem is our insistence that programs at different levels of education have to be of common length (mostly 4-year Bachelor’s degrees outside Quebec, mostly 2-year college programs outside Ontario).  For some occupations, this might be too much time in-class; for others it might not be enough.  If you’re running a 2-year program and someone tells you that grads need “more skills”, then the biggest question is: what should be dropped from the existing curriculum?  Forget competency-based education; we’d be a lot better-off if we could just get competency-adjusted curriculum lengths.  But here, governments tend to (unhelpfully) prefer standardized solutions.

Anyways, this is stuff that institutions – particularly community colleges – deal with all the time.  It’s a thankless but necessary job; getting it right is literally the foundation of the nation’s prosperity.

September 09

The Growth of Administration (Part 2)

In yesterday’s blog, I ended on the observation that over the period 2000-2012 at the 12 major universities where we have data (UBC, SFU, Alberta, Calgary, USask, Manitoba, Carleton, York, Toronto, Waterloo, Western, and Memorial) the rate of growth of support staff and administration was 16% faster than the rate of growth of academic staff.  To wit:

Figure 1: Growth in Support/Admin Positions vs Faculty Positions, 12 Large Institutions, 2000-2012

But that’s a 12-institution average.  In fact, very few individual institutions exhibit anything like this pattern.

Figure 2 shows increases in faculty and staff complements at each of the 12 institutions.  Some caution is required with the numbers: notably, while the definitions of “faculty” and “staff” are consistent over time at every institution, they differ across institutions in a number of ways.  So what you want to focus on here, above all, is the inter-institutional differences in the gap between staff and faculty hires.

Figure 2: Patterns in Institutional Staff Growth, 12 Institutions, 2000-2012

(To be clear: “staff” here includes any position that is not an academic post; the term does not pertain only to professional staff.  I’ll get to why this distinction is important later in the post.)

To start at the left side of the graph: Saskatchewan, Toronto, and Alberta are three institutions where administrative/support hiring massively outstripped faculty hiring.  At Saskatchewan, staff numbers went up 68% over the period 2000-2012, compared to faculty growth of just 19%.  At Toronto, the comparable figures were 49% and 2%; at Alberta, it was 53% and 18%.  These are places where claims of administrative bloat seem pretty clear cut.

But move along to the right, and one realizes the problem (if indeed it is one) isn’t universal.  At places like UBC, York, and Carleton, growth in admin/support is only slightly higher than growth in faculty numbers (note: our time period misses some of the recent growth from 2013 and 2014, to which Gary Mason’s article referred).  At Waterloo and Calgary, faculty numbers increased more quickly than admin/support numbers from 2000 to 2012.

(If you’re wondering why SFU and MUN are off to one side, it’s because, over the course of the past decade, these two seem to have had some kind of change in how support staff were counted.  In MUN’s case, it seems to have resulted in a one-time loss of about 300 staff; at SFU, it led to a gain of 300 staff.  As a result of these shifts, SFU’s growth in support staff looks titanic, while Memorial appears to have shed staff.  Neither scenario is likely.  The grey bars for those two institutions are my very crude “best-guess” attempts to adjust for these definitional changes.)

The lesson here is that there really isn’t a single pattern prevalent across all institutions.  At some places, admin/support numbers are clearly growing wildly; at others, they are pretty stable.  It would therefore probably be better if people stopped making generalizations on this topic.

In the next post on this subject, which I’ll do sometime next week, I’ll try to answer the question: what are all these new staff doing, anyway?


September 08

The Growth of Administration (Part 1)

So, last week we talked about growth in non-academic staff; however, due to data limitations, we could only talk about dollars rather than numbers.  This is because no one actually collects non-academic staff numbers in Canada, and so most of the data (and anecdotes) around “administrative bloat” comes from the US.  Last winter, I became sufficiently frustrated with fact-free arguments about “bloated administration” that I devoted part of my holiday to gathering data on this phenomenon.  I never quite finished the project back then, but last week gave me the nudge to try to put all of this data on the table.

You might wonder how I did this, given that no one keeps track of data, nationally.  Well, individual institutions *do* keep track of these numbers.  And some of them even put data up on the web, in things called “factbooks” or “databooks”.  And while the data definitions are sufficiently diverse that you can’t really make a national database from this information, it is possible to track changes over time at individual campuses, and then aggregate those changes across institutions.  So that’s what I did.
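The aggregation step can be sketched in a few lines of code.  To be clear, this is just an illustration of the method, not my actual spreadsheet: the institution names and counts below are hypothetical placeholders, not real factbook figures.

```python
# Sketch: aggregate per-campus counts into a single growth figure.
# All numbers here are hypothetical, not actual factbook data.
counts_2000 = {"Univ A": {"faculty": 1000, "staff": 2000},
               "Univ B": {"faculty": 800,  "staff": 1500}}
counts_2012 = {"Univ A": {"faculty": 1100, "staff": 2700},
               "Univ B": {"faculty": 900,  "staff": 1800}}

def aggregate_growth(start, end, category):
    """Sum one category across all institutions at both time points,
    then return the percentage change between the two totals."""
    total_start = sum(inst[category] for inst in start.values())
    total_end = sum(end[name][category] for name in start)
    return 100 * (total_end - total_start) / total_start

faculty_growth = aggregate_growth(counts_2000, counts_2012, "faculty")
staff_growth = aggregate_growth(counts_2000, counts_2012, "staff")
```

The key design point is that each institution’s counts are only ever compared against its own earlier counts, so differing local definitions of “staff” don’t contaminate the aggregate the way they would in a pooled national database.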

For my sample, I took 25 universities, which collectively comprise about 75% of the national system: the U-15, plus SFU, Victoria, Carleton, York, Ryerson, Guelph, Concordia, UQAM, UNB, and Memorial.  Of these, only about half had usable public institutional data on staff. UQAM, Concordia, Laval, and Montreal have very little institutional data online; apparently, Quebec schools don’t seem to think making data public is particularly important.  McGill’s website indicates that an institutional factbook containing such data exists; unfortunately, it is password-protected, because obviously the public can’t be trusted with such things (UNB’s data is also password-protected).  Dalhousie publishes a little bit of data (mostly about students), and very little else.  Queen’s has a “fast facts” page that touches on faculty numbers, but only back to about 2009.  Finally, Victoria, Guelph, and Ryerson all publish loads of institutional data online, but nothing on non-academic staff.

That leaves us with 14 institutions.  York, Carleton, and Calgary are all officially awesome, and have staff data on their websites going all the way back to 1990.  UBC, SFU, Alberta, Saskatchewan, Manitoba, Western, Waterloo, Toronto, and Memorial all have data back to 2000 (though in some cases, a trip to the Wayback Machine is required to get at it all). McMaster and Ottawa at least have data back to 2005.

So, what patterns do we see when we look at data from these institutions?  Well, if we just look at the national picture from 2005 to 2012 at the 14 institutions (remember: I did this last Christmas – there would probably be another year’s worth of data available if I did this again), we see that support and admin personnel numbers grew by a little over 17%, compared to a rate of faculty growth of about 11%.

Figure 1: Growth in Support/Admin positions vs Faculty positions, 14 large institutions, 2005-2012

However, if we pull Ottawa and McMaster out of the picture (because their data doesn’t go further back than this) and take the long view, back to 2000, we get a more striking picture.  At the 12 institutions where the best data is available, the rate of growth of admin and support staff outstripped academic staff growth by about 16% over twelve years.

Figure 2: Growth in Support/Admin positions vs Faculty positions, 12 large institutions, 2000-2012

We unfortunately cannot tell whether the pattern at our 12 universities is representative of trends across all institutions.  But these 12 collectively account for about 36% of the system by enrolments, which is a pretty big sample, so it’s unlikely that full national trends differ too much from this.

But there are more interesting stories to tell once you drill down into the data at a little more depth.  Tune in tomorrow.

Intrigued by this data so far?  Want to add your institution’s data to this list?  Send us a note at info@higheredstrategy.com.
