Higher Education Strategy Associates

September 15

Visible Minority Numbers Rise Sharply

I was poking around some data from the Canadian Undergraduate Survey Consortium the other day, and found something utterly mind-blowing.  Take a look at these statistics on the percentage of first-year students self-identifying as a “visible minority” in the Consortium’s triennial Survey of First Year Students:

Figure 1: Self-Identified Visible Minority Students as a Percentage of Entering Class, 2001-2013

Crazy, right?  Must be all those international students flooding in.

Er, no.  Well, there are more students with permanent residence outside Canada, but they aren’t necessarily driving these numbers, because they represent only about 7% of survey respondents.  If we assume that 80% of these students are themselves visible minorities, and we pull them out of the data, the visible minority numbers look like this:
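For anyone who wants to replicate that adjustment, it’s simple arithmetic.  Here’s a quick sketch in Python – note that the 30% overall figure in the example is purely illustrative; only the 7% international share and the 80% assumption come from the text above:

```python
# Back-of-envelope adjustment: remove assumed international visible-minority
# students from the overall visible-minority share of respondents.

def domestic_vm_share(total_vm_share, intl_share, intl_vm_rate=0.80):
    """Share of *domestic* respondents who are visible minorities, given
    the overall share, the international share of respondents, and the
    assumed visible-minority rate among international students."""
    intl_vm = intl_share * intl_vm_rate      # intl VM as share of all respondents
    domestic_vm = total_vm_share - intl_vm   # domestic VM as share of all respondents
    return domestic_vm / (1 - intl_share)    # re-base on domestic respondents only

# e.g. if 30% of all respondents self-identify as VM (illustrative),
# 7% are international, and 80% of those are VM:
print(round(domestic_vm_share(0.30, 0.07), 3))
```

Pulling the internationals out only shaves a couple of percentage points off the overall share, which is why Figure 2 still shows a big jump.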

Figure 2: Visible Minority Students, International* vs. Domestic, 2001-2013

*assumes 80% of students with permanent residence outside Canada are “visible minorities”

That’s still a heck of a jump.  Maybe it has something to do with the changing demographics of Canadian youth?

Well, we can sort of track this by looking at census data on visible minorities, aged 15-24, from 2001 and 2006, and (yes, yes, I know) the 2011 National Household Survey, and then marry these up with the 2001, 2007, and 2013 CUSC data.  Not perfect, but it gives you a sense of contrasting trends.  Here’s what we find.

Figure 3: Domestic Visible Minority Students as a Percentage of Total vs. Visible Minorities as a Percentage of all 15-24 Year-Olds, 2001, 2007, 2013

So, yes, a greater proportion of domestic youth self-identify as visible minorities, but that doesn’t come close to explaining what seems to be going on here.

What about changes in the survey population?  Well, it’s true that the consortium membership isn’t stable, and that there is some movement in institutions over time.  If we just look at 2007 and 2013 – a period during which the number of visible minority students almost doubled – we can see how a change in participating schools might have shifted things.

Table 1: Schools Participating in CUSC First-Year Survey, 2007 and 2013

Here’s what stands out to me on that list.  York and Waterloo are in the 2013 survey, but were not there in 2007, which you’d think would skew the 2013 data a bit higher on visible minorities (although not by much – together, these two schools made up only about 7% of the total sample).  On the other hand, UBC Vancouver was in the 2007 survey, but not 2013, which you’d think would skew things the other way.  On this basis, I’d say school participation probably contributed somewhat to the change, but was not decisive.

I could end this post with a call for better data (always a good thing).  But if a trend is big enough, even bad data can pick it up.  I think that might be what we’re seeing here with the increase in visible minority students.  It’s a big, intriguing story.

September 14

Better Post-Secondary Data: Game On

On Saturday morning, the US Department of Education released the College Scorecard.  What the heck is the College Scorecard, you ask?  And why did they release it on a Saturday morning?  Well, I have no earthly idea about the latter, but as for the former: it’s a bit of a long story.

You might remember that, a little over a year ago, President Obama came up with the idea of having the US Government “rate” colleges on things like affordability, graduation rates, graduate earnings, and the like.  The thinking was that this kind of transparency would punish institutions that provided genuinely bad value for money by exposing said poor value to the market, while at the same time encouraging all institutions to become more attentive to costs and outcomes.

The problem with the original idea was three-fold.  First, no one was certain that the quality of the available data was good enough.  Second, the idea of using the same set of ratings both for quality improvement and for enforcing minimum standards was always a bit dicey.  And third, the politics of the whole thing were atrocious – the idea that a government might declare institution X better than institution Y was a recipe for angry alumni pretty much everywhere.

So back in July, the Administration gave up on the idea of rating institutions (though it had been quietly backing away from it for months); however, it didn’t give up on the idea of collecting and disseminating the data.  Thus, what it released on Saturday was instead a “scorecard”: a way to look up data on every institution without actually rating those institutions.  But also – and this is what had nerds in datagasm over the weekend – it released all of the underlying data (click “download all data” here).  Several hundred different fields’ worth.  For 20 years.  It’s totally unbelievable.

Some of the data, being contextual, is pretty picayune: want to know which institution has the most students who die within four years of starting school?  It’s there (three separate branches of a private mechanics school called Universal Technical Institute).  But other bits of the data are pretty revealing.  School with the highest average family income? (Trinity College, Connecticut.)  With the lowest percentage of former students earning over $25,000 eight years after graduation? (Emma’s Beauty Academy in Mayaguez, PR.)  With the highest default rates? (Seven different institutions – six private, one public – have 100% default rates.)

Now, two big caveats about this data.  The first is that institution-level data isn’t, in most cases, all that helpful (graduate incomes are more a function of field of study than of institution, for instance).  The second caveat is that the information on former students and earnings relates only to student aid recipients (it’s a political/legal thing – basically, the government could look up the post-graduation earnings of students who received aid, but not of students who funded themselves).  The government plans to rectify that first caveat ahead of next year’s release; but you better believe that institutions will fight to their dying breath over the second caveat, because nothing scares them more than transparency.  As a result, while lots of the data is fun to look at, it’s not exactly the kind of stuff with which students should necessarily make decisions (a point made with great aplomb by the University of Wisconsin’s Sara Goldrick-Rab).

Caveats aside, this data release is an enormous deal.  It completely raises the bar for institutional transparency, not just in the United States but everywhere in the world.  Canadian governments should take a good long look at what America just did, and ask themselves why they can’t do the same thing.

No… scratch that.  We ALL need to ask governments why they can’t do this.  And we shouldn’t accept any answers about technical difficulties.  The simple fact is that it’s a lack of political will, an unwillingness to confront the obscurantist self-interest of institutions.

But as of Saturday, that’s not good enough anymore.  We all deserve better.

September 11

Party Platform Analysis: The Greens

So, we’ve been in this ghastly election period for several weeks now, but it’s just starting to get interesting, with parties releasing actual platforms.  I’ll be putting together briefs on each of the parties as they come out, starting today.

Let’s start with the Green Party, which is the first to have released a complete platform.  This platform is slimmer than the sprawling 185-page monstrosity the Party had up on its website for the first weeks of the campaign, which contained all sorts of fun stuff, like family policy that had been outsourced to Fathers 4 Justice.  It’s slicker, and presumably represents what the party thinks are the most salable bits of its full range of endorsed policies.

So, here’s what they say they’ll do on post-secondary education.  First, they are going to abolish tuition fees for domestic students, progressively, by 2020.  Second, they will lift the 2% cap on annual increases to the Post-Secondary Student Support Program for First Nations students (though why this would be necessary if tuition were eliminated isn’t entirely clear).  Third, they have a plan to cap federal student debt at $10,000 (again, with tuition removed from assessed need, it’s not entirely clear this would be necessary).  Fourth, they will abolish interest on student loans (what’s left of them), and increase bursaries (though why you’d need to, if tuition were abolished and loans capped, isn’t clear).  All this, the Party says, will “jump-start the Canadian economy”.  On top of that, the Party says it is unacceptable that youth unemployment is twice the national average (though, in fact, internationally this is on the low side), so it will be spending $1 billion per year to employ students directly through an Environmental Service Corps.

On science policy, there is a lot about “evidence-based decision making” (though this seems to be conspicuously absent in post-secondary policy), and a promise to restore funding to scientific endeavours within the Government of Canada (e.g. at Parks Canada, Health Canada, etc.), but nothing whatsoever with respect to granting councils.

The costing for all of this is somewhat dubious.  The party puts the cost of eliminating domestic tuition at a mere $5 billion, which is about $3 billion short of where it is now, and probably closer to $4 billion short by the time the plan is supposed to be rolled out (seriously – just multiply the 1 million domestic university students by average tuition, and you get a number bigger than what the Greens seem to be assuming).  But on the other hand, they’ve probably over-budgeted on student debt relief; they have this costed at $2.5 billion in the year of maximum cost (it declines after that), whereas I can’t see how it would cost more than $1 billion even if they didn’t get rid of tuition (briefly: the average federal debt of those with debt over $10,000 is about $19,000, and only 46% of the 200,000 or so who graduate each year have federal debt over $10,000, so 92,000 x $9,000 = $828 million).  So while the Party clearly hasn’t a clue what it’s talking about in terms of costing, at least the errors aren’t all in the same direction.
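For those following along at home, the debt-cap arithmetic in that last parenthetical works out like this (using the round numbers from the text):

```python
# Rough annual cost of capping federal student debt at $10,000,
# using the round figures quoted above.
graduates_per_year = 200_000
share_over_cap = 0.46        # share with federal debt above $10,000
avg_debt_over_cap = 19_000   # average federal debt within that group
cap = 10_000

affected = int(graduates_per_year * share_over_cap)   # 92,000 graduates
annual_cost = affected * (avg_debt_over_cap - cap)    # $9,000 forgiven apiece
print(f"${annual_cost:,}")   # $828,000,000 -- well under the Greens' $2.5B estimate
```

Even if every one of those inputs is generous, it’s hard to get the bill anywhere near the Party’s figure.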

To its credit, the Party is planning to partially pay for this ambitious agenda by cutting $1.6 billion in education tax credits. But that still leaves a net bill of about $4.5 billion (their numbers – about $7.5 billion in actual fact) in 2019-2020.  And for what?  To make education cheaper for people who already go?  To transfer billions back to the upper middle-class?  To be – as the intro to the policy suggests – more like Germany and Austria, where access rates are actually significantly worse than they are here?

What should we think of such a platform?  Well, even if we ignore the fact that it’s a massive net transfer of wealth to the upper-middle-class (such benefits as the poor would receive from lower tuition would be counteracted by the loss of offsetting subsidies), it’s a pretty poor showing.  Is there really nothing better we could do in higher education with $8 billion than make it cheaper?  What about using that money to hire more professors?  Do more with IT?  Invest in research?

No, apparently it’s all about cheaper.  And for what?  To be more like Germany and Austria, which have lower access rates than we do?  This is stupid policy made by people who can’t count.  The Greens can and should do better than this.

More as the parties release platform details.

September 10

Improving the Discourse on Skills and Education

Recently, I did a fascinating set of roundtable discussions with employers and employer associations, and it brought home to me how one-dimensional much of our talk is regarding skills.

Broadly speaking, there are four sets of skills employers care about.  The first are job- or occupation-related skills: can a mechanic actually fix a car? Can an architect design buildings? And so on.  By and large, if you ask employers whether universities and colleges are successfully providing their graduates with this skill set, they say yes (in some fields, in some parts of the country, there are complaints that there aren’t enough graduates, but that’s a different story).  And that’s true more or less across blue-collar and white-collar occupations.

Then there’s a set of skills that, in Canada, go by the name of “essential skills” or “foundational skills”.  Most of this is basic literacy/numeracy stuff, but with communication, basic teamwork, and (increasingly) IT skills in there as well.  Here, Canada has a problem, and employers are not shy when it comes to talking about it.  Secondary school dropouts and recent immigrants who have yet to fully master one of our official languages tend to have the most problems with these skills, and the issue is concentrated in certain industries and occupations.  It tends to affect blue-collar jobs more than white-collar ones, but it’s also an issue in lower-level health and social service occupations (especially IT skills).

The third set of skills often gets called “soft skills” or “integrative skills”.  This involves workplace savvy, primarily in white-collar industries: knowing how to act with clients, basic business and financial skills, how to operate in a multi-disciplinary/multi-functional team, and those somewhat nebulous qualities of critical thinking and problem-solving.  Basically, this is the stuff that Arts faculties claim to give you: the integrative thinking skills that keep businesses running.  They’re not the skills that get you hired, but they’re the skills that get you promoted.  Again, this is an area where employers who need these skills voice frustration with new graduates.

Finally, there are what get termed “leadership skills”.  It’s not always 100% clear what employers mean by this, but it is usually thought of as being different from (and of a higher order than) integrative thinking skills.  Again, these aren’t desired across the board: companies are hierarchies, and not everyone is at the top, so it’s actually a set of traits necessary in only a few.  But again, companies see these as lacking in young people – though, to be honest, young grads don’t have the experience to be put in leadership roles, so it’s actually something they’ll need to develop a few years into their careers.

Now, when someone in business starts talking about a skills crisis, we mostly assume they mean that their new hires are lacking some set of skills, and lots of people (some of them inside the system itself) therefore use this as a stick with which to beat educational institutions for not doing their jobs.  But usually, when business says it needs skills, what it actually means is that it needs experienced workers with lots of on-the-job skills.  As such, what educational institutions can contribute in the short-term is pretty marginal.

But even to the extent that institutions can contribute – say, over the medium-to-long term – a simple desire for “more skills” doesn’t help very much.  Skills profiles vary enormously from occupation to occupation, and so too do perceptions of which skills are missing in each one.  Even within a single company, needs may vary substantially from one job to the next.  Getting business to be more specific about its needs is a huge and urgent task.

Exacerbating this problem is our insistence that programs at a given level of education have to be of common length (mostly 4-year Bachelor’s degrees outside Quebec, mostly 2-year college programs outside Ontario).  For some occupations, this might be too much time in class; for others, it might not be enough.  If you’re running a 2-year program and someone tells you that grads need “more skills”, then the biggest question is: what should be dropped from the existing curriculum?  Forget competency-based education; we’d be a lot better off if we could just get competency-adjusted curriculum lengths.  But here, governments tend to (unhelpfully) prefer standardized solutions.

Anyways, this is stuff that institutions – particularly community colleges – deal with all the time.  It’s a thankless but necessary job; getting it right is literally the foundation of the nation’s prosperity.

September 09

The Growth of Administration (Part 2)

In yesterday’s blog, I ended on the observation that over the period 2000-2012 at the 12 major universities where we have data (UBC, SFU, Alberta, Calgary, USask, Manitoba, Carleton, York, Toronto, Waterloo, Western, and Memorial) the rate of growth of support staff and administration was 16% faster than the rate of growth of academic staff.  To wit:

Figure 1: Growth in Support/Admin Positions vs Faculty Positions, 12 Large Institutions, 2000-2012

But that’s a 12-institution average.  In fact, very few individual institutions exhibit anything like this pattern.

Figure 2 shows increases in faculty and staff complements at each of the 12 institutions.  Some caution is required with the numbers: notably, while the definitions of “faculty” and “staff” are consistent over time at every institution, they differ across institutions in a number of ways.  So what you want to focus on here, above all, is the inter-institutional differences in the gap between staff and faculty hires.

Figure 2: Patterns in Institutional Staff Growth, 12 Institutions, 2000-2012

(To be clear: “staff” here includes any position that is not an academic post; the term does not pertain only to professional staff.  I’ll get to why this distinction is important later in the post.)

To start at the left side of the graph: Saskatchewan, Toronto, and Alberta are three institutions where administrative/support hiring massively outstripped faculty hiring.  At Saskatchewan, staff numbers went up 68% over the period 2000-2012, compared to faculty growth of just 19%.  At Toronto, the comparable figures were 49% and 2%; at Alberta, it was 53% and 18%.  These are places where claims of administrative bloat seem pretty clear cut.

But move along to the right, and one realizes the problem (if indeed it is one) isn’t universal.  At places like UBC, York, and Carleton, growth in admin/support is only slightly higher than growth in faculty numbers (note: our time period misses some of the recent growth from 2013 and 2014, to which Gary Mason’s article referred).  At Waterloo and Calgary, faculty numbers actually increased more quickly than admin/support numbers from 2000-2012.

(If you’re wondering why SFU and MUN are off to one side, it’s because, over the course of the past decade, these two seem to have had some kind of change in how support staff were counted.  In MUN’s case, it seems to have resulted in a one-time loss of about 300 staff; at SFU, it led to a gain of 300 staff.  As a result of these shifts, SFU’s growth in support staff looks titanic, while Memorial appears to have shed staff.  Neither scenario is likely.  The grey bars for those two institutions are my very crude “best-guess” attempt to adjust for these definitional changes.)

The lesson here is that there really isn’t a single pattern prevalent across all institutions.  At some places, admin/support numbers are clearly growing wildly; at others, they are pretty stable.  It would therefore probably be better if people stopped making generalizations on this topic.

In the next post on this subject, which I’ll do sometime next week, I’ll try to answer the question: what are all these new staff doing, anyway?


September 08

The Growth of Administration (Part 1)

So, last week we talked about growth in non-academic staff; however, due to data limitations, we could only talk about dollars rather than numbers.  This is because no one actually collects non-academic staff numbers in Canada, and so most of the data (and anecdotes) around “administrative bloat” comes from the US.  Last winter, I became sufficiently frustrated with fact-free arguments about bloated administration that I devoted part of my holiday to gathering data on this phenomenon.  I never quite finished the project back then, but last week gave me the nudge to finally put all of this data on the table.

You might wonder how I did this, given that no one keeps track of data, nationally.  Well, individual institutions *do* keep track of these numbers.  And some of them even put data up on the web, in things called “factbooks” or “databooks”.  And while the data definitions are sufficiently diverse that you can’t really make a national database from this information, it is possible to track changes over time at individual campuses, and then aggregate those changes across institutions.  So that’s what I did.

For my sample, I took 25 universities, which collectively comprise about 75% of the national system: the U-15, plus SFU, Victoria, Carleton, York, Ryerson, Guelph, Concordia, UQAM, UNB, and Memorial.  Of these, only about half had usable public institutional data on staff.  UQAM, Concordia, Laval, and Montreal have very little institutional data online; apparently, Quebec schools don’t think making data public is particularly important.  McGill’s website indicates that an institutional factbook containing such data exists; unfortunately, it is password-protected, because obviously the public can’t be trusted with such things (UNB’s data is also password-protected).  Dalhousie publishes a little bit of data (mostly about students), and very little else.  Queen’s has a “fast facts” page that touches on faculty numbers, but only back to about 2009.  Finally, Victoria, Guelph, and Ryerson all publish loads of institutional data online, but nothing on non-academic staff.

That leaves us with 14 institutions.  York, Carleton, and Calgary are all officially awesome, and have staff data on their websites going all the way back to 1990.  UBC, SFU, Alberta, Saskatchewan, Manitoba, Western, Waterloo, Toronto, and Memorial all have data back to 2000 (though in some cases, a trip to the Wayback Machine is required to get at it all). McMaster and Ottawa at least have data back to 2005.

So, what patterns do we see when we look at data from these institutions?  Well, if we just look at the national picture from 2005 to 2012 at the 14 institutions (remember: I did this last Christmas – there would probably be another year’s worth of data available if I did it again), we see that support and admin personnel numbers grew by a little over 17%, compared to faculty growth of about 11%.

Figure 1: Growth in Support/Admin positions vs Faculty positions, 14 large institutions, 2005-2012

However, if we pull Ottawa and McMaster out of the picture (because their data doesn’t go further back than this) and take the long view, back to 2000, we get a more striking picture.  At the 12 institutions where the best data is available, the rate of growth of admin and support staff outstripped academic staff growth by about 16% over twelve years.
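For clarity, here’s the calculation behind those growth comparisons, sketched in Python.  The headcounts are illustrative only – they are not the actual aggregated 12-institution totals:

```python
# Comparing growth rates of two staff groups over the same period.
# Headcounts below are illustrative, not the 12-institution totals.

def growth(start, end):
    """Growth from start to end, as a fraction of the starting value."""
    return (end - start) / start

faculty_growth = growth(10_000, 11_900)   # +19% faculty growth (illustrative)
support_growth = growth(20_000, 27_000)   # +35% admin/support growth (illustrative)
gap = (support_growth - faculty_growth) * 100
print(f"gap: {gap:.0f} percentage points")
```

The “16%” figure in the text is that kind of gap: the difference between the two groups’ growth rates over the period, in percentage points.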

Figure 2: Growth in Support/Admin positions vs Faculty positions, 12 large institutions, 2000-2012

We unfortunately cannot tell whether the pattern at our 12 universities is representative of trends across all institutions.  But these 12 collectively account for about 36% of the system by enrolments, which is a pretty big sample, so it’s unlikely that full national trends differ too much from this.

But there are more interesting stories to tell once you drill down into the data at a little more depth.  Tune in tomorrow.

Intrigued by this data so far?  Want to add your institution’s data to this list?  Send us a note at info@higheredstrategy.com.

September 04

Costing an Inuit University

There is an interesting initiative afoot to create something called the Inuit Nunangat University.  A workshop report on the concept is here.  Today, I thought I would contribute to the debate by looking at what such an initiative might cost.

Some background: the idea of an Arctic university is not new.  Many people have noted that Canada is the only member of the Arctic Council that does not have a university north of the Arctic Circle.  This largely has to do with a lack of major population centres, but no matter.  The Gordon Foundation wrote about this problem a few years ago.  At the time, my take on it was that the Arctic could probably support a small university on the model of the University of Greenland – roughly a dozen faculty working mainly in language and culture, with a bit of professional programming (i.e. BEds) thrown in.

Now, this new proposed university is somewhat hazy regarding scope (not surprising, given that at the moment it’s just a workshop report).  Given that the proposal is for an Inuit university rather than a University of Nunavut, it’s clear that culture and language are going to be at the centre of the institutional mission: this proposal is less a University of the Arctic than an Inuit version of First Nations University.  Clearly, the authors have some big hopes for the future – programs in Science, Medicine, and Engineering are proposed – but equally clearly, any northern university is going to be fairly small for a long time.  The Inuit population of Canada is about 72,000; the population of Nunavut is about 35,000.  The territory only churns out about 240 high school graduates each year, and the local college (Nunavut Arctic College) already enrols about 1,300 students per year.  And some university-bound students will choose a southern university regardless of local options.  Put all that together, and you’re very unlikely to see enrolments at such a university reach 1,000 for a long time; 500 is probably a more realistic upper bound.

In Canada, there are a number of similarly-sized stand-alone universities.  For instance, there is Université Ste. Anne (370 FT students), Canadian Mennonite University (480 FT), and The King’s University, Alberta (670 FT students).  And while these universities are usually pretty tight for money, they are all viable.  But they don’t have research programs to speak of, and they definitely don’t have Engineering or Medical schools attached to them.  These sorts of professional schools simply aren’t feasible without much larger student numbers.

For argument’s sake, let’s say a future Inuit Nunangat University ends up at about 600 students.  That’s close to the size of King’s University in Alberta, which somehow (honestly not sure how they do it) manages to staff faculties of Arts, Social Science, Science, and Business with about 45 full-time professors, on an annual operating budget (in 2013-14) that was just shy of $14 million.  That’s about $21,500 per student – but it doesn’t include any programs that might be considered “high-cost”.  It also assumes you can do all your programming in a single spot, rather than via distance education and community delivery; but that’s anathema in a territory that spans 2 million square kilometres and three time zones.  And there’s also the fact that staff costs are higher in the north.

To get a sense of the adjustment factor you’d need to translate that $21,500 into a Nunavut context, consider the case of Nunavut Arctic College.  It provides programming in something like 25 locations across the territory, and does so at a cost of about $41,000 per student (excluding free services provided to the college by the Government of Nunavut, which would add another $7,700 or so).  That’s roughly two and a half times the per-student cost of college in the rest of the country.  So it seems fair to assume that a King’s-like institution would cost about $21,500 x 2.5 = $53,750 per student.  And that’s just for low-cost programs: no medicine, or engineering, or anything like that.  Total annual cost?  About $32 million.  And that’s before you get to any capital expenditures, or any of the other things on the workshop wish-list, like low tuition, grants, student housing, etc.
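Laying that costing out step by step (the per-student cost and multiplier are the figures quoted above; the 600-student enrolment is, as noted, an assumption):

```python
# The back-of-envelope costing, step by step.
per_student_south = 21_500   # King's-like per-student operating cost
north_multiplier = 2.5       # Nunavut Arctic College vs. southern college costs
students = 600               # assumed enrolment

per_student_north = per_student_south * north_multiplier   # $53,750 per student
total_cost = per_student_north * students                  # about $32 million
print(f"${per_student_north:,.0f} per student; ${total_cost:,.0f} per year")
```

That’s operating costs only – capital, student housing, and the rest of the wish-list would all be on top.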

Now, $32 million is a mind-bogglingly huge amount in the context of Nunavut’s own-source tax revenues, which are only about $180 million per year.  But since close to 90% of the Nunavut budget comes from Ottawa, it is actually equal to only a little under 2% of the entire territorial budget.  That’s still not a small ask, but it is in the realm of the financially possible, provided ambitions around program offerings remain modest.

September 03

One Lens for Viewing “Administrative Bloat”

The Globe’s Gary Mason wrote an interesting article yesterday about the Gupta resignation.  Actually, let me qualify that: he wrote a very odd article, which ignored basically everything his Globe colleagues Simona Chiose and Frances Bula had reported the previous week, in order to peddle a tale in which the UBC Board fired Gupta for wanting to reduce administrative costs.  This, frankly, sounds insane.  But Mason’s article did include some striking statistics on the increase in administrative staff at UBC over the past few years – such as the fact that, between 2009-10 and 2014-15, professional administrative staff numbers increased by 737, while academic staff numbers increased by only 28.  Eye-opening stuff.

And so, this seems as good a time as any to start sharing some of the institution-by-institution statistics on administrative & support (A&S) staff I’ve been putting together, which I think you will find kind of interesting.  But before I do that, I want to show you some national-level data that is of interest.  Not on actual staff numbers, mind you – that data doesn’t exist nationally.  However, through the annual CAUBO/Statscan Financial Information of Universities and Colleges (FIUC) survey, we can track how much we pay staff in various university functions.  And that gives us a way to look at where, within the university, administrative growth is occurring.

FIUC tracks both “academic” salaries and “other” (i.e. A&S) salaries across seven categories: “Instruction & Non-Sponsored Research” (i.e. at the faculty level); “Non-Credit Instruction” (i.e. cont. ed); “Library, Computing, and Communications”; “Physical Plant”; “Student Services”; “External Relations” (i.e. Government Relations plus Advancement); and “Administration” (i.e. central administration).  Figure 1 shows the distribution of A&S salary expenditures across these categories for 2013-14.  A little over 32% of the total is spent at the faculty level, while another 23% is spent in central administration.  Physical plant and student services account for about 11% apiece, while the remaining three areas account for 18% combined.

Figure 1: Distribution of A&S Salaries by Function, in 000s of Dollars, Canada, 2013-14

A zoom-in on the figures for central administration is warranted, as there has been some definitional change over time, which makes time-series analysis a bit tricky.  Back in 1998, the reporting rules were changed in a way that increased reported costs by about 30%.  Then, in 2003, about 15% of this category was hacked off to create a new category, “external relations” – presumably because institutions wanted to draw a distinction between the bits of central administration that increase revenues and those that consume them.  Figure 2 shows how that looks over time.
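If you wanted to build a comparable series across those breaks yourself, the adjustment might look something like the sketch below.  To be clear, the correction factors here (divide post-1998 figures by 1.3; add “external relations” back in from 2003 on) are my own reading of the changes just described, not official FIUC conversion factors:

```python
# Adjust a central-administration salary series for two definitional breaks:
# (1) the 1998 rule change that raised reported costs by roughly 30%, and
# (2) the 2003 split that moved roughly 15% of the category into
#     "external relations".  Factors are approximations, applied multiplicatively.

def comparable_central_admin(year, central_admin, external_relations=0.0):
    """Return a value roughly comparable to the pre-1998 definition."""
    value = central_admin + external_relations  # re-merge the 2003 split
    if year >= 1998:
        value /= 1.30                           # back out the ~30% reporting change
    return value
```

So, for example, `comparable_central_admin(2005, 110, 20)` returns 100.0: the post-split figures re-merged, then deflated by the 1998 reporting change.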

Figure 2: Expenditure on Administrative & Support Salaries in Central Administration, in 000s of 2014 Real Dollars, Canada

Long story short: from the 80s through to the mid-90s, administrative & support salaries in central administration rose by a little over 3% per year in real terms.  Then, briefly, they fell for a couple of years, before resuming an upward trend.  Ignoring the one-time upward re-adjustment, aggregate A&S salaries in central administration and external relations combined have been rising at 5.3% per year, after inflation, since 1999.  Which is, you know, a lot.

Now, let’s look at what’s been going on across the university as a whole.  Figure 3 shows changes in total A&S salary paid over time, relative to a 1979 base.  For this graph, I dropped the “non-credit” category (because it’s trivial); for central admin, I’ve combined it with “external relations” and corrected for the 1998 definitional change.  Also, for reference, I’ve included two dotted lines representing the change in student numbers (in red) and the change in total academic salary mass (in yellow).

Figure 3: Change in Real Total Academic & Support Salary Spending (1979-80 = 100) by Function, Canada

Since 1979, student FTEs have risen 120%, while academic salary mass has doubled, after inflation.  A&S spending in libraries and physical plant rose by considerably less than this – by 27% and 57%, respectively.  A&S spending on “instruction” (that is, faculty & departmental offices) rose almost exactly in tandem with student numbers.  Spending on A&S salaries in central admin and in ICT rose about twice as fast as that, ending the 35-year period at three-and-a-half times their original levels.  But the really huge increase occurred in student services, where expenditures on A&S salaries are now six times as high as they were in 1979.
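The indexing behind a chart like Figure 3 can be sketched in a few lines: each series is rebased so its 1979-80 value equals 100, which is how a 120% rise reads as an endpoint of 220.  The FTE counts below are invented so that the ratio is exactly 2.2x; only that ratio matches the text:

```python
# Each line in a chart like Figure 3 is rebased so its first-year value
# equals 100; a 120% rise then reads as an endpoint of 220. The FTE
# counts below are invented so that the ratio is exactly 2.2x.
def rebase(series, base_year):
    """Index a series to 100 in base_year."""
    base = series[base_year]
    return {yr: 100.0 * v / base for yr, v in series.items()}

student_fte = {1979: 550_000, 2014: 1_210_000}
index = rebase(student_fte, 1979)  # index[2014] == 220.0
```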

Over the next couple of weeks, I’ll be able to supplement this picture with institutional data, but the key take-aways for now are as follows: i) “central administration” salaries are growing substantially faster than enrolment and academic salary mass, but they represent less than a quarter of total A&S spending; ii) the largest component of A&S spending – that is, staff reporting to academic Deans – is actually growing almost exactly on pace with enrolment; and, iii) the fastest-growing component of A&S spending is student services.  So there has been a shift in A&S spending, but it’s not entirely to the bad – unless you’ve got a thing against student services.

More next week.

September 02

Some Basically Awful Graduate Outcomes Data

Yesterday, the Council of Ontario Universities released the results of the Ontario Graduates’ Survey for the class of 2012.  This document is a major source of information regarding employment and income for the province’s university graduates.  And despite the chipperness of the news release (“the best path to a job is still a university degree”), it actually tells a pretty awful story when you do things like, you know, place it in historical context, and adjust the results to account for inflation.

On the employment side, there’s very little to tell here.  Graduates got hit with a baseball bat at the start of the recession, and despite modest improvements in the overall economy, their employment rates have yet to resume anything like their former heights.

Figure 1: Employment Rates at 6-Months and 2-Years After Graduation, by Year of Graduating Class, Ontario

Now, those numbers aren’t good, but they basically still say that the overwhelming majority of graduates get some kind of job after graduation.  The numbers vary by program, of course: in the health professions, employment rates at both 6 months and 2 years out are close to 100%; in most other fields (Engineering, Humanities, Computer Science), they’re in the high 80s after six months.  Rates are lowest in the Physical Sciences (85%) and Agriculture/Biological Sciences (82%).

But changes in employment rates are mild compared to what’s been happening with income.  Six months after graduation, the graduating class of 2012 had an average income 7% below that of the class of 2005 (the last class to have been surveyed entirely before the 2008 recession).  Two years after graduation, its income was 14% below that of the 2005 class.

Figure 2: Average Income of Graduates at 6-Months and 2-Years Out, by Graduating Class, in Real 2013/4* Dollars, Ontario

*For comparability, the 6-month figures are converted into real Jan 2013 dollars in order to match the timing of the survey; similarly, the 2-year figures are converted into June 2014 dollars.
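The adjustment described in the footnote amounts to deflating each nominal income by the ratio of the CPI in the base month to the CPI in the survey month.  A sketch, with hypothetical CPI values and income:

```python
# The footnote's adjustment deflates each nominal income by the ratio of
# the CPI at the base month to the CPI at the survey month. The CPI
# values and the income below are hypothetical placeholders.
def to_real(nominal, cpi_at_survey, cpi_at_base):
    """Restate a nominal amount in base-period dollars."""
    return nominal * cpi_at_base / cpi_at_survey

real_income = to_real(40_000, cpi_at_survey=109.0, cpi_at_base=122.0)
```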

This is not simply a case of incomes stagnating after the recession: incomes have continued to deteriorate long after the return to economic growth.  And it’s not restricted to just a few fields of study, either.  Of the 25 fields of study this survey tracks, only one (Computer Science) has seen recent graduates’ incomes rise in real terms since 2005.  Elsewhere, it’s absolute carnage: Education graduates’ incomes are down 20%; Humanities and Physical Sciences are down 19%; Agriculture/Biology is down 18% (proving once again that, in Canada, the “S” in “STEM” doesn’t really belong, labour market-wise).  Even Engineers have seen a real pay cut (albeit a modest 3%).

Figure 3: Change in Real Income of Graduates, Class of 2012 vs. Class of 2005, by Time Since Graduation, for Selected Fields of Study

Now, we need to be careful about interpreting this.  Certainly, part of it is about the recession having hit Ontario particularly harshly – other provinces may not see the same pattern.  In some fields of study – Education, for instance – there are demographic factors at work, too (fewer kids, less need for teachers, etc.).  And it’s worth remembering that there has been a huge increase in the number of graduates since 2005, as the double cohort – and later, larger cohorts – moved through the system.  This, as I noted back here, was always likely to affect graduate incomes, because it increased competition for graduate jobs (conceivably, it’s also a product of the new, wider intake, which resulted in a small drop in average academic ability).

But whatever the explanation, this is the story universities need to care about.  Forget tuition or student debt, neither of which is rising in any significant way.  Worry about employment rates.  Worry about income.  The number one reason students go to university, and the number one reason governments fund universities to the extent they do, is because, traditionally, universities have been the best path to career success.  Staying silent about long-term trends, as COU did in yesterday’s release, isn’t helpful, especially if it contributes to a persistent head-in-the-sand unwillingness to proactively tackle the problem.  If the positive career narrative disappears, the whole sector is in deep, deep trouble.

September 01

The Tennessee Promise

So, yesterday I talked about a big increase in access in the UK, which seems to have little to do with tuition fees.  Today, let’s talk about a developing story in the United States, where a lowering of net prices seems to have had a big impact on access.

You may recall that in the US over the last couple of years, there has been a growing movement for free community college, something that President Obama picked up on earlier this year.  But before Obama picked up this baton, free community college had already been introduced in Republican Tennessee, where Governor Bill Haslam signed something called the “Tennessee Promise” into law in 2014.

Technically, the Tennessee Promise is not “free tuition”.  It’s only available to students entering straight from high school (which is a bit weird in terms of design, but whatever).  Students have to be full-time, maintain a 2.0 average, meet regularly with a mentor, and perform eight hours of community service per term.  And technically, what it does is reduce your tuition to zero after all other forms of aid and scholarship are taken care of (this is what is known in the business as a “last dollar” scholarship).  If you apply for the award and meet the terms, the government will cover your tuition to the point where your net price is zero.  For a good number of people, this means free tuition with minimal strings attached, so let’s just call it free tuition.

Now, you might expect that with this kind of incentive, enrolment might rise a bit.  And you’d be right.  According to very early results, the number of freshmen is up 29.6% over last year.  Obviously this is a pretty impressive result, but before we get too excited, we should probably find out a little more about where these new students are coming from.  Are they “new” students, or are they mostly students who would have gone to a 4-year college, but have chosen 2-year instead?  And what about students’ financial backgrounds?  If you’re poor enough to be anywhere near the maximum Pell grant ($5,775), the Tennessee Promise provides no additional aid, because tuition at Tennessee community colleges is about $4,000.  So it may well be that what the Tennessee Promise is really doing is providing aid to people higher up the income ladder.  This is a little inefficient, but since (as I noted back here) community college students tend to come from poorer backgrounds anyway, it’s not as regressive as it would be if implemented at 4-year colleges.
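The “last dollar” mechanics above can be sketched in a couple of lines: the Promise pays only whatever tuition remains after other aid, so a student at the Pell maximum gets nothing extra.  The $4,000 tuition and $5,775 Pell figures come from the post; the function name is mine:

```python
# "Last dollar" mechanics: the award tops tuition up to a zero net price
# after all other aid, and never goes below zero. The $4,000 tuition and
# $5,775 Pell maximum are from the post; the function name is invented.
def promise_award(tuition, other_aid):
    """Last-dollar scholarship: pay whatever tuition other aid leaves."""
    return max(0.0, tuition - other_aid)

assert promise_award(4000.0, 5775.0) == 0.0     # Pell alone covers tuition
assert promise_award(4000.0, 1500.0) == 2500.0  # Promise pays the remainder
```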

We should be able to answer these questions in a few weeks (yes, Canadians, in some places data is available in weeks, rather than years).  Even though Tennessee does not track applicants by income the way the UK does, the state’s excellent annual Higher Education Fact Book does contain two pieces of data that will help us track this.  The first is college-going rates by county, which will help us understand whether the jump in participation is concentrated in higher- or lower-income counties, and the second is the percentage of students who are Pell-eligible.  I’ll keep you up-to-date on this when the data is out.

The most intriguing possibility here is that rates of attendance for Pell-eligible students might be rising, even though the Tennessee Promise provides no actual added benefit for many of them.  It may well be that simply re-packaging the way we frame higher education costs (“it’s free!”) matters more than the way we actually fund it (“your tuition is $4,000, and you also have a grant for $4,500”).

This would have significant policy ramifications for us in Canada.  As we noted last year in our publication, The Many Prices of Knowledge, many students at Canadian community colleges face an all-inclusive net price that is negative, or very close to it.  Similarly, poor first-year university students in both Ontario and Quebec have negative net prices.  No one knows it, because we package aid in such a ludicrously opaque fashion, but it’s true.  And if the Tennessee data provides evidence that the packaging of aid matters as much as the content, then it will be time for Canadian governments to re-evaluate that packaging, tout de suite.
