Higher Education Strategy Associates (HESA)

Category Archives: Data

October 08

Higher Education Data Glasnost

Many people complain that there is a lack of post-secondary data in Canada.  But this is actually not true.  There are tons of data about; it’s just that institutions won’t share or publish much of it.

Let me tell you a little story.  Once upon a time, there was a small, public-minded higher education research company that wanted to create the equivalent of Statistics Canada’s university tuition fee index for colleges.  The company had completed a project like this before, but had done so in a somewhat imprecise way because of the time and effort involved in getting the enrolment data necessary to weight the program-level tuition data.  And so, politely, it began asking colleges for their enrolments by program.

Now, program-level enrolments are not a state secret.  We are talking about publicly-funded institutions here, and given the number of people who use these services, this is very much the definition of public information.  Nor are these data difficult to collect or display.  Institutions know exactly how many students are in each program, because it’s the basis on which they collect fees.

And yet, across most of the country, many institutions have simply refused to provide the data.  The reasons are a mix of the understandable and the indefensible.  Some probably don’t want to do it because it’s a disruptive task outside their workplan.  Others are cautious because they don’t quite know how the data will be used (or they disagree with how it will be used), and are afraid of internal repercussions if the shared data ends up making their institution look bad (note: we’re using the data to publish provincial averages, not institutional ones; however, in single-institution provinces like Saskatchewan or Newfoundland and Labrador, this can’t be helped).  A few simply don’t want to release the data because it’s theirs.

Regardless, it is unacceptable for public institutions to conceal basic operational data for reasons of convenience.  That’s not the way publicly-funded bodies are supposed to operate in a democracy.  And so, regretfully, we’ve had to resort to filing Access to Information (ATI) requests to find out how many students attend public college programs across Canada.  Sad, but true.

It then occurred to me how many of our national higher education data problems could be solved through Access to Information legislation.  Take Simona Chiose’s very good piece in the Globe and Mail last week, in which she tried to piece together what Canadian universities are doing with sessional professors, and where many institutions simply refused to give her data.  If someone simply started hitting the universities with annual ATI requests on sessional lecturers, and publishing the results, we’d have good data pretty quickly.  Ditto for data on teaching loads.  All that excellent comparable data the U-15 collects every year?  You can’t ATI the U-15 because it’s a private entity, but it’s easy-peasy lemon squeezy to ATI all of the U-15 members for their correspondence with the Ottawa office, and get the data that way (or, conversely, ATI the U-15’s correspondence to each university, and get the collected data that way).

Oh, I could go on here.  Want better data on staff and students?  ATI the universities that have factbooks, but refuse to put them online (hello, McGill and UNB!).  Want better data on PhD graduate outcomes?  ATI every university’s commencement programs from last year’s graduation ceremonies, and presto, you’ve got a register of 3,000 or so PhDs, most of whom can be tracked on social media to create a statistical portrait of career paths (this would take a little bit of volunteer effort, but I can think of quite a few people who would provide it, so while not easy-peasy lemon squeezy, it wouldn’t be difficult-difficult lemon difficult, either).

It’s not a cure-all of course.  Even with all that ATI data, it would take time to process the data and put it into usable formats. Plus there’s an agency problem: who’s going to put all these requests together? Personally, I think student unions are the best place to do so, if not necessarily the best-equipped to subsequently handle the data.  And of course institutional data is only part of the equation.  Statistics Canada data has to improve significantly, too, in order to better look at student outcomes (a usable retention measure would be good, as would an annual PSIS-LAD-student aid database link to replace the now-terminally screwed National Graduates Survey).

To be clear, I’m not suggesting going hair-trigger on ATIs.  It’s important to always ask nicely for the data first; sometimes, institutions and governments can be very open and helpful.  But the basic issue is that data practices of post-secondary institutions in Canada have to change.  Secrecy in the name of protecting privacy is laudable; secrecy in the name of self-interested ass-covering is not.  We need some glasnost in higher education data in this country.  If it takes a wave of ATI requests to do it, so be it.  Eventually, once enough of the results of these ATI requests filter into the public realm, institutions themselves will try to get ahead of the curve and become more transparent as a matter of course.

I’d like to think there was a simpler and less confrontational way of achieving data transparency, but I am starting to lose hope.

September 24

Youth Unemployment: Some Perspective, Please

Every once in a while, you’ll hear folks talking about the scourge of youth unemployment.  If you’re really lucky, you’ll hear them describe it as a “crisis”.  But how bad is youth unemployment, really?

Well, the quick answer is that you can’t really separate youth unemployment from general unemployment.  As Figure 1  shows, one is a function of the other.

Figure 1: Unemployment Rates, 15 and Over vs. 15-24 Age Groups, Canada, 1976-2015 (Source: CANSIM 282-001.  Seasonally Adjusted)

As Figure 1 also shows, compared to most of the last 40 years, youth unemployment is currently fairly low.  In the 476 months since the Labour Force Survey began, it has been lower than it is today only 29% of the time.  If this is a crisis, it is of exceedingly long duration.
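
For anyone who wants to replicate this, the calculation is trivial once you have the series in hand.  Here’s a minimal sketch, assuming the monthly LFS data has been pulled down to a CSV (the file and column names are mine, for illustration, not the actual CANSIM layout):

```python
# Share of months in which youth unemployment was lower than it is today.
import pandas as pd

lfs = pd.read_csv("lfs_unemployment.csv", parse_dates=["date"])

current = lfs["rate_15to24"].iloc[-1]  # latest youth unemployment rate
share_lower = (lfs["rate_15to24"] < current).mean()
print(f"Youth unemployment was lower than today in {share_lower:.0%} of months")
```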

Now, what some people get upset about is the fact that youth unemployment is “twice the overall rate”.  But is that really historically unique?  Figure 2 shows the answer.

Figure 2: Ratio of 15-24 Unemployment Rate to 15 and Over Unemployment Rate (Source: CANSIM 282-001)

So, there are two things here on which to remark.  The first is that 2:1 isn’t an immutable ratio: it has changed over time, most notably in the mid-90s when it increased significantly.  The second thing is that the ratio is a lot more seasonal than it used to be.  It’s not entirely clear why this happened.  I had thought initially that it might have something to do with increasing PSE participation rates, but that doesn’t seem to be the case.  A mystery worth pursuing, at any rate.
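
Both observations are easy to check yourself.  A quick sketch, under the same hypothetical CSV layout as the earlier one:

```python
# Drift and seasonality in the youth-to-overall unemployment ratio.
import pandas as pd

lfs = pd.read_csv("lfs_unemployment.csv", parse_dates=["date"])
lfs["ratio"] = lfs["rate_15to24"] / lfs["rate_15plus"]

# Drift in the ratio over time: average by decade.
print(lfs.groupby(lfs["date"].dt.year // 10 * 10)["ratio"].mean())

# Seasonality: the within-year spread (max minus min) of the ratio.
spread = lfs.groupby(lfs["date"].dt.year)["ratio"].agg(lambda s: s.max() - s.min())
print(spread.tail(10))  # bigger spreads = more seasonal swing
```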

In any case, we should also ask: how does Canada look in comparison to other countries?  In Figure 3, I show the ratio of youth unemployment to overall unemployment in various countries.  Canada’s current ratio – about 1.96 – is not world-beating, but significantly better than the OECD average (2.2).  It suggests that the question of youth unemployment ratios is actually something all economies – with the exception of the Netherlands and Germany, perhaps – deal with.

Figure 3: Ratio of Youth Unemployment Rate to Overall Unemployment Rate, Selected Countries (Source: OECD)

To get right down to brass tacks: workers gain value with experience.  By definition, then, young workers are, on average, less valuable to employers than older ones.  This is why they have more trouble getting hired.  And this is why, in the end, the only way to bring down youth unemployment is to make young workers more valuable to employers; which is to say, they need more job-ready skills.

Could we do better than we are doing?  Yes, of course.  But even the best countries in the world aren’t doing much better than we are.  So, let’s work on this problem, but maybe tone down the rhetoric about its extent.

September 21

The China “Crisis”

It’s no secret that China dominates the world market when it comes to sending students abroad.  About 20% of all globally-mobile students are from China; in countries like the US, Canada, and the UK, they are far and away the number one source of foreign students.  (In all three countries, Chinese students account for as many foreign students as the next four source countries combined.)

Now, every once in a while – more and more frequently these days – you get some bad economic data from China, and everybody wants to be the first person to predict the coming “China Crisis”: oh Dear Lord, Chinese students are going to disappear, how will everyone cope?

To which I say: chill.  The Chinese market isn’t going anywhere, at least not for economic reasons.

If the argument is that China’s financial turmoil might lower Chinese incomes, and therefore reduce the affordability of foreign education, you need to keep in mind that Chinese families don’t fund education the way we do.  They save.  A lot.  For years.  Unlike North American families, Chinese families don’t try to make things work using their current incomes.  And so unless Zhounior’s savings were fully invested in the Shanghai stock exchange just before the crash, some short-term economic instability isn’t going to matter that much.

And if things get worse?  What if financial instability leads to political instability?  I’d say that’s more likely to lead to an increase in study abroad rather than a decrease.  For wealthy Chinese families, sending students abroad for their education is at least as much about giving kids a foot in the door for emigration as it is a tool with which to advance their careers in China.  Having your kid in a foreign university is a hedge against precisely this kind of political uncertainty.

Now, this doesn’t mean the Chinese market is impervious to decline.  The fall in the size of the Chinese university-age cohort still matters, but that’s a long-term phenomenon, not a short-term one.  The troubles that graduates have in the labour market are real, and are affecting the composition of demand for higher education.  But remember: the proportion of Chinese undergraduates who choose to study abroad every year is 1-2% of the total.  What happens in that 1-2% market is only barely related to what goes on in the mass market.  It’s like trying to guess what’s going on with Mercedes-Benz sales from sales of Toyota Corollas.  The “mass market” looks nothing like the “elite market”.

The single thing that would most disrupt the flow of students out of China would be a sudden and noticeable increase in the availability of enrolment places at prestigious domestic institutions.  That is, either the big prestigious institutions could expand, or new institutions could join the ranks of the elite; either would reduce the demand for foreign education.  But the former flat-out isn’t happening; and the latter, while not impossible, seems unlikely under present circumstances.

In short, there are solid reasons to prepare for an eventual cresting of demand from China.  But the prospect in the short-term of a bursting of the Chinese student “bubble” is less convincing.  Plan accordingly.

September 15

Visible Minority Numbers Rise Sharply

I was poking around some data from the Canadian Undergraduate Survey Consortium the other day and I found some utterly mind-blowing data.  Take a look at these statistics on the percentage of first-year students self-identifying as a “visible minority” on the Consortium’s triennial Survey of First Year Students:

Figure 1: Self-Identified Visible Minority Students as a Percentage of Entering Class, 2001-2013

Crazy, right?  Must be all those international students flooding in.

Er, no.  Well, there are more students with permanent residences outside Canada, but they aren’t necessarily affecting these numbers, because they represent only about 7% of survey respondents.  If we assume that 80% of these students are themselves visible minorities, and we pull them out of the data, the visible minority numbers look like this:

Figure 2: Visible Minority Students, International* vs. Domestic, 2001-2013

*assumes 80% of students with permanent residences outside Canada are “visible minorities”
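
For clarity, here’s that adjustment written out.  The 7% international share and the 80% visible-minority assumption come from the text above; the overall share plugged in at the end is purely illustrative:

```python
# Back out the visible-minority share among domestic students only.
def domestic_vm_share(total_vm, intl_share=0.07, intl_vm_rate=0.80):
    """Remove assumed international visible-minority students from the total."""
    return (total_vm - intl_share * intl_vm_rate) / (1 - intl_share)

print(f"{domestic_vm_share(0.30):.1%}")  # a 30% overall share implies ~26.2% domestic
```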

That’s still a heck of a jump.  Maybe it has something to do with the changing demographics of Canadian youth?

Well, we can sort of track this by looking at census data on visible minorities, aged 15-24, from 2001 and 2006, and (yes, yes, I know) the 2011 National Household Survey, and then marry these up with the 2001, 2007, and 2013 CUSC data.  Not perfect, but it gives you a sense of contrasting trends.  Here’s what we find.

Figure 3: Domestic Visible Minority Students as a Percentage of Total vs. Visible Minorities as a Percentage of all 15-24 Year-Olds, 2001, 2007, 2013

So, yes, a greater proportion of domestic youth self-identify as visible minorities, but that doesn’t come close to explaining what seems to be going on here.

What about changes in the survey population?  Well, it’s true that the consortium’s membership isn’t stable, and that there is some movement in institutions over time.  If we just look at 2007 and 2013 – a period during which the number of visible minority students almost doubled – we can see how a change in participating schools might have shifted things.

Table 1: Schools Participating in CUSC First-Year Survey, 2007 and 2013

Here’s what stands out to me on that list.  York and Waterloo are in the 2013 survey, but were not there in 2007, which you’d think would skew the 2013 data a bit higher on visible minorities (although not by much – together, these two schools were only 7% of the total sample).  On the other hand, UBC Vancouver was there in the 2007 survey, but not 2013, which you’d think would skew things the other way.  On this basis, I’d say school participation probably contributed somewhat to the change, but was not decisive.

I could end this post with a call for better data (always a good thing).  But if a trend is big enough, even bad data can pick it up.  I think that might be what we’re seeing here with the increase in visible minority students.  It’s a big, intriguing story.

September 14

Better Post-Secondary Data: Game On

On Saturday morning, the US Department of Education released the College Scorecard.  What the heck is the College Scorecard, you ask?  And why did they release it on a Saturday morning?  Well, I have no earthly idea about the latter, but as for the former: it’s a bit of a long story.

You might remember that a little over a year ago, President Obama came up with the idea for the US Government to “rate” colleges on things like affordability, graduation rates, graduate earnings, and the like.  The thinking was that this kind of transparency would punish institutions that provided genuinely bad value for money by exposing said poor value to the market, while at the same time encouraging all institutions to become more attentive to costs and outcomes.

The problem with the original idea was three-fold.  First, no one was certain that the quality of available data was good enough.  Second, the idea of using the same set of ratings both for quality improvement and to enforce minimum standards was always a bit dicey.  And third, the politics of the whole thing were atrocious – the idea that a government might declare that institution X is better than institution Y was a recipe for angry alumni pretty much everywhere.

So back in July, the Administration gave up on the idea of rating institutions (though it had been quietly backing away from it for months); however, it didn’t give up on the idea of collecting and disseminating the data.  Thus, what it released on Saturday instead was a “scorecard”: a way to look up data on every institution without actually rating those institutions.  But also – and this is what had nerds in datagasm over the weekend – it released all of the data (click “download all data” here).  Several hundred different fields’ worth.  For 20 years.  It’s totally unbelievable.
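
If you want to play along at home, here’s a minimal sketch of poking at the release.  The file name below follows the release’s one-file-per-academic-year pattern, and the column names (INSTNM for institution, GRAD_DEBT_MDN for median debt) are my best guess at the layout; check the data dictionary that ships with the download:

```python
import pandas as pd

df = pd.read_csv("MERGED2013_14_PP.csv", low_memory=False)
print(df.shape)  # thousands of institutions by several hundred fields

# Suppressed cells arrive as text, so coerce before sorting numerically.
df["GRAD_DEBT_MDN"] = pd.to_numeric(df["GRAD_DEBT_MDN"], errors="coerce")
print(df[["INSTNM", "GRAD_DEBT_MDN"]]
      .sort_values("GRAD_DEBT_MDN", ascending=False)
      .head(10))
```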

Some of the data, being contextual, is pretty picayune: want to know which institution has the most students who die within four years of starting school?  It’s there (three separate branches of a private mechanics school called Universal Technical Institute).  But other bits of the data are pretty revealing.  School with the highest average family income? (Trinity College, Connecticut.)  With the lowest percentage of former students earning over $25,000 eight years after graduation? (Emma’s Beauty Academy in Mayaguez, PR.)  With the highest default rates? (Seven different institutions – six private, one public – have 100% default rates.)

Now, two big caveats about this data.  The first is that institutional-level data isn’t, in most cases, all that helpful (graduate incomes are more a function of field of study than of institution, for instance).  The second caveat is that the information on former students and earnings relates only to student aid recipients (it’s a political/legal thing – basically, the government could look up the post-graduation earnings of students who received aid, but not of students who funded themselves).  The government plans to rectify the first caveat ahead of next year’s release; but you better believe that institutions will fight to their dying breath over the second one, because nothing scares them more than transparency.  As a result, while lots of the data is fun to look at, it’s not exactly the kind of stuff with which students should necessarily make decisions (a point made with great aplomb by the University of Wisconsin’s Sara Goldrick-Rab).

Caveats aside, this data release is an enormous deal.  It completely raises the bar for institutional transparency, not just in the United States but everywhere in the world.  Canadian governments should take a good long look at what America just did, and ask themselves why they can’t do the same thing.

No… scratch that.  We ALL need to ask governments why they can’t do this.  And we shouldn’t accept any answers about technical difficulties.  The simple fact is that it’s a lack of political will, an unwillingness to confront the obscurantist self-interest of institutions.

But as of Saturday, that’s not good enough anymore.  We all deserve better.

September 03

One Lens for Viewing “Administrative Bloat”

The Globe’s Gary Mason wrote an interesting article yesterday about the Gupta resignation.  Actually, let me qualify: he wrote a very odd article, which ignored basically everything his Globe colleagues Simona Chiose and Frances Bula had reported the previous week, in order to peddle a tale in which the UBC Board fired Gupta for wanting to reduce administrative costs.  This, frankly, sounds insane.  But Mason’s article did include some striking statistics on the increase of administrative staff at UBC over the past few years – such as the fact that, between 2009-10 and 2014-15, professional administrative staff numbers increased by 737, while academic staff numbers increased by only 28.  Eye-opening stuff.

And so, this seems as good a time as any to start sharing some of the institution-by-institution statistics on administrative & support (A&S) staff I’ve been putting together, which I think you will find kind of interesting.  But before I do that, I want to show you some national-level data that is of interest.  Not on actual staff numbers, mind you – that data doesn’t exist nationally.  However, through the annual CAUBO/Statscan Financial Information of Universities and Colleges (FIUC) survey, we can track how much we pay staff in various university functions.  And that gives us a way to look at where, within the university, administrative growth is occurring.

FIUC tracks both “academic” salaries and “other” (i.e. A&S) salaries across seven categories: “Instruction & Non-Sponsored Research” (i.e. at the faculty level); “Non-Credit Instruction” (i.e. cont. ed); “Library, Computing, and Communications”; “Physical Plant”; “Student Services”; “External Relations” (i.e. Government Relations plus Advancement); and “Administration” (i.e. central administration).  Figure 1 shows the distribution of A&S salary expenditures across these categories for 2013-14.  A little over 32% of the total is spent at the faculty level, while another 23% is spent in central administration.  Physical plant and student services account for about 11% apiece, while the remaining three areas account for 18% combined.

Figure 1: Distribution of A&S Salaries by Function, in 000s of Dollars, Canada, 2013-14

A zoom-in on the figures for central administration is warranted, as there has been some definitional change over time, which makes time-series analysis a bit tricky.  Back in 1998, the reporting rules were changed in a way that increased reported costs by about 30%.  Then, in 2003, about 15% of this category was hacked off to create a new category, “external relations” – presumably because institutions wanted to draw a distinction between the bits of central administration that increase revenues and those that consume them.  Figure 2 shows how that looks over time.

Figure 2: Expenditure on Administrative & Support Salaries in Central Administration, in 000s of 2014 Real Dollars, Canada

Long story short: from the 80s through to the mid-90s, administrative & support salaries in central administration rose by a little over 3% per year in real terms.  Then, briefly, they fell for a couple of years, before resuming an upward trend.  Ignoring the one-time upward re-adjustment, aggregate A&S salaries in central administration and external relations combined have been rising at 5.3% per year, after inflation, since 1999.  Which is, you know, a lot.

Now, let’s look at what’s been going on across the university as a whole.  Figure 3 shows changes in total A&S salary paid over time, relative to a 1979 base.  For this graph, I dropped the “non-credit” category (because it’s trivial); for central admin, I’ve both combined it with “external relations”, and corrected for the 1998 definitional change.  Also, for reference, I’ve included two dotted lines, which represent change in student numbers (in red), and change in total academic salary mass (in yellow).

Figure 3: Change in Real Total Academic & Support Salary Spending (1979-80 = 100) by Function, Canada

Since 1979, student FTEs rose 120%, while academic salary mass doubled, after inflation.  A&S spending in libraries and physical plant rose by considerably less than this: 27% and 57%, respectively.  A&S spending on “instruction” (that is, faculty & departmental offices) rose almost exactly in tandem with student numbers.  Spending on A&S salaries in central admin and in ICT rose about twice as fast as that, ending the 35-year period at three-and-a-half times their original level.  But the really huge increases occurred in student services, where expenditures on A&S salaries are now six times what they were in 1979.
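
For the curious, the splice-and-index arithmetic behind Figure 3 is straightforward.  Here’s a sketch with invented spending numbers, where the 1.3 factor stands in for the roughly 30% reporting change described earlier:

```python
import pandas as pd

# Invented real-dollar spending figures; the actual values come from FIUC.
admin = pd.Series({1979: 300_000, 1997: 520_000, 1998: 690_000, 2013: 1_200_000})

# Undo the one-time ~30% definitional jump in 1998 before indexing.
adjusted = admin.copy()
adjusted[adjusted.index >= 1998] /= 1.3

index = adjusted / adjusted[1979] * 100  # 1979-80 = 100, as in Figure 3
print(index)
```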

Over the next couple of weeks, I’ll be able to supplement this picture with institutional data, but the key take-aways for now are as follows: i) “central administration” salaries are growing substantially faster than enrolment and academic salary mass, but they represent less than a quarter of total A&S spending; ii) the largest component of A&S spending – that is, those reporting to academic Deans – is actually growing exactly on pace with enrolment; and, iii) the fastest-growing component of A&S spending is student services.  So, there has been a shift in A&S spending, but it’s not entirely to the bad, unless you’ve got a thing against student services.

More next week.

September 02

Some Basically Awful Graduate Outcomes Data

Yesterday, the Council of Ontario Universities released the results of the Ontario Graduates’ Survey for the class of 2012.  This document is a major source of information regarding employment and income for the province’s university graduates.  And despite the chipperness of the news release (“the best path to a job is still a university degree”), it actually tells a pretty awful story when you do things like, you know, place it in historical context, and adjust the results to account for inflation.

On the employment side, there’s very little to tell here.  Graduates got hit with a baseball bat at the start of the recession, and despite modest improvements in the overall economy, their employment rates have yet to return to anything like their former heights.

Figure 1: Employment Rates at 6-Months and 2-Years After Graduation, by Year of Graduating Class, Ontario

Now those numbers aren’t good, but they basically still say that the overwhelming majority of graduates get some kind of job after graduation.  The numbers vary by program, of course: in health professions, employment rates at both 6-months and 2-years out are close to 100%; in most other fields (Engineering, Humanities, Computer Science), it’s in the high 80s after six months – it’s lowest in the Physical Sciences (85%) and Agriculture/Biological Sciences (82%).

But the changes in employment rates are mild compared to what’s been happening with income.  Six months after graduation, the graduating class of 2012 had average incomes 7% below those of the class of 2005 (the last class to have been entirely surveyed before the 2008 recession).  Two years after graduation, its incomes were 14% below the 2005 class.

Figure 2: Average Income of Graduates at 6-Months and 2-Years Out, by Graduating Class, in Real 2013/4* Dollars, Ontario

*For comparability, the 6-month figures are converted into real Jan 2013 dollars in order to match the timing of the survey; similarly, the 2-year figures are converted into June 2014 dollars.
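
For those wondering how the deflation works, here’s a sketch (the CPI values below are placeholders, not actual Statistics Canada figures):

```python
# Each nominal average is deflated by CPI for the month the cohort was surveyed.
def to_real(nominal: float, cpi_then: float, cpi_base: float) -> float:
    """Express a nominal dollar figure in base-period dollars."""
    return nominal * cpi_base / cpi_then

# e.g. a class-of-2005 average observed in Jan 2006, restated in Jan 2013 dollars
print(round(to_real(41_000, cpi_then=108.2, cpi_base=121.3)))  # 45964
```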

This is not simply the case of incomes stagnating after the recession: incomes have continued to deteriorate long after a return to economic growth.  And it’s not restricted to just a few fields of study, either.  Of the 25 fields of study this survey tracks, only one (Computer Science) has seen recent graduates’ incomes rise in real terms since 2005.  Elsewhere, it’s absolute carnage: education graduates’ incomes are down 20%; Humanities and Physical Sciences down 19%; Agriculture/Biology down 18% (proving once again that, in Canada, the “S” in “STEM” doesn’t really belong, labour market-wise).  Even Engineers have seen a real pay cut (albeit by only a modest 3%).

Figure 3: Change in Real Income of Graduates, Class of 2012 vs. Class of 2005, by Time Since Graduation, for Selected Fields of Study

Now, we need to be careful about interpreting this.  Certainly, part of this is about the recession having hit Ontario particularly hard – other provinces may not see the same pattern.  And in some fields of study – Education, for instance – there are demographic factors at work, too (fewer kids, less need for teachers, etc.).  And it’s worth remembering that there has been a huge increase in the number of graduates since 2005, as the double cohort – and later, larger cohorts – moved through the system.  This, as I noted back here, was always likely to affect graduate incomes, because it increased competition for graduate jobs (conceivably, it’s also a product of the new, wider intake, which resulted in a small drop in average academic ability).

But whatever the explanation, this is the story universities need to care about.  Forget tuition or student debt, neither of which is rising in any significant way.  Worry about employment rates.  Worry about income.  The number one reason students go to university, and the number one reason governments fund universities to the extent they do, is because, traditionally, universities have been the best path to career success.  Staying silent about long-term trends, as COU did in yesterday’s release, isn’t helpful, especially if it contributes to a persistent head-in-the-sand unwillingness to proactively tackle the problem.  If the positive career narrative disappears, the whole sector is in deep, deep trouble.

July 16

Student Debt in Canada: Sorry, Still no Crisis

If you’re in the looking-at-student-debt business in Canada, your data sources are limited.  Provinces could publish their debt figures annually, but they don’t.  Canada Student Loans does publish its debt numbers annually, but it includes nothing on provincial debt, so it’s not very useful.  Statistics Canada surveys graduating students every five years, but only three years out from graduation, so the most recent data we have from that source is now five years old.  Kinda sucks.

But there is one other source of data, at least for university graduates.  That’s the triennial survey of graduating students from the Canadian University Survey Consortium, which just released the report on its 2015 data (Hey, Statscan!  17,000 responses, and a turnaround time of under four months!).  This gives us a chance to see what’s been going on over the last few years by comparing the 2012 and 2015 results.

Before I get into the results, a small caveat about the data.  As its name implies, the consortium doesn’t have a fixed membership, and so comparability of results between surveys isn’t perfect.  In 2012, 37 institutions participated (n= 15,111 students), and 34 in 2015 (n=18,114).  Twenty-nine institutions did both surveys, but there was some churn.  In terms of student numbers, the 2015 survey is biased slightly more heavily towards the Atlantic (17% vs. 13%), and less heavily towards Ontario (39% vs. 42%).  Since the latter has been seeing lower average student debt of late because of its 30% tuition rebate program, one would expect a slight bias towards higher debt numbers in the 2015 survey.  In both periods, the survey sample as a whole is overweight in the Atlantic, Saskatchewan, and Manitoba, and underweight in Quebec and Ontario.

Onwards.  Here’s what happened to student debt incidence:

Figure 1: Percentage of Graduating Students With Debt, By Type of Debt

Not to beat around the bush: incidence is down.  By four percentage points for family debt, three each for private bank debt and government debt, and a whopping nine percentage points for “any debt” (and, recall, this is with a population shift towards students slightly more likely to have debt).  For a three-year period, that’s a simply massive change, and one heading in the right direction.

Now, how about average debt levels?

Figure 2: Average Debt Levels (Among Those with Debt), by Source of Debt, in $2015

Here, we have trends going in different directions.  Students are borrowing substantially less from family (for what that’s worth: in all likelihood, a substantial portion of these get forgiven), and marginally less from banks.  But government borrowing is up 6% in real dollars, which more than offsets those changes.  That’s a change for the worse, but it’s at least partially a product of a shifting survey base (my guess is that this accounts for about a quarter of this change).  CUSC does not release data by region, but I think it’s pretty safe to say that the big increases will be found in Alberta, BC, and the Maritimes.

In other words, we’re mostly seeing a continuation of trends that NGS has been showing for a decade now: average debt is rising slightly, but debt incidence is falling (while enrolments are rising, which is counter-intuitive).

Takeaway: As inconvenient as this may be for the hell-in-a-handbasket crowd, there is still no student debt crisis.

May 27

How High Can Pay Go?

A few months ago, in the midst of a very exciting battle of words at Windsor, I got into an internet discussion with a professor who was absolutely outraged by one of the administration’s proposals: namely, to put a ceiling on professors’ salaries, including his, after 30 years of service.

To step back for a moment: collective bargaining agreements generally outline a grid: a series of salary scales (or ladders, or steps – pick your term), usually one for each rank, to determine compensation.  Each rank’s scale has a floor and increments, usually corresponding to years of seniority.  Occasionally, at places like Alberta, UBC, and Waterloo, the increments are conditioned on an annual merit review, in which case it’s possible for a faculty member to see no increase in a year, or to jump more than one step in a single year, but basically the principle is the same.  Compensation increases as faculty move up the scale, and the whole scale gains value every year to compensate for increases in the cost of living.  (For more on the Progression Through the Ranks system, see an earlier post here.)
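
For the uninitiated, a toy version of such a grid looks something like this (the numbers are invented, not taken from any actual collective agreement):

```python
# A floor, fixed increments up to a ceiling, and a cost-of-living factor
# applied to the whole scale each year.
def grid_salary(floor, increment, max_steps, step, cola_factor=1.0):
    """Salary at a given step on the scale, capped at the ceiling."""
    return (floor + min(step, max_steps) * increment) * cola_factor

# With a 30-step ceiling, step 32 pays the same as step 30:
print(grid_salary(90_000, 2_000, max_steps=30, step=32))  # 150000.0
print(grid_salary(90_000, 2_000, max_steps=30, step=30))  # 150000.0
```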

Anyways, this professor was peeved at the thought that his salary (apparently he had 30-plus years as a full professor) could never grow by more than a cost-of-living increment.  “What’s my incentive to even show up to work?” he asked, seriously, while making north of $150,000 per year.  I suggested that salary ceilings were pretty normal, but he claimed this was nonsense.  So I asked one of our policy analysts, Jonathan Williams, to figure out who was right.

Jonathan reviewed collective agreements across 54 Canadian universities to identify the prevalence of maxima, or ceilings, on scale compensation, and the maximum number of steps for Professors, Associate Professors and Assistant Professors. He then compared results across institution types, using the Maclean’s classifications of medical/doctoral, comprehensive, and undergraduate. Those institutions in our sample, but not in Maclean’s (e.g. Athabasca, Vancouver Island), we have left as “unclassified”. Since this is meant to be a short and convenient morning email, we’ll spare you the more detailed methodology report, but feel free to email us if you’re really curious.

Anyways, it turns out the answer is slightly complicated, because while most collective agreements do have ceilings, they don’t always have them for all ranks.  As a result, in Figure 1 we display results not only by institution type, but also by rank.  Across all institutions, over two-thirds have ceilings for Full Professors, and four out of five have them for Assistant Professors.

Figure 1 – Percentage of Institutions With Pay Ceilings, by Academic Rank and Type of Institution

Now, Figure 1 simply measures the number of collective agreements where there is some kind of pay maximum.  In practice, however, some of these scales have so many steps that almost no one will ever hit the top.  For example, there are some collective agreements with more than 35 increments on the pay scale for Full Professors.  Given that most people don’t make Full Professor until at least their mid-40s, only 80-year-olds would ever hit such a maximum (which, even with the elimination of mandatory retirement, seems a bit extreme).  We therefore did a second analysis in which we counted scales containing 30 or more increments (i.e. incorporating 30 or more years of service) as equivalent to not having a ceiling at all.

Figure 2 – Percentage of Institutions With Pay Ceilings, by Academic Rank and Type of Institution (Assuming 30+ Years Per Rank Equivalent to no Ceiling)

Across all institutions, this brings down the percentage with maxima by about ten percentage points.  However, the effect is concentrated at comprehensive universities (like Windsor, as it happens): of the 15 institutions with scales of 30 or more steps, ten were comprehensives.
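
The recount logic is simple enough to state in a few lines.  A sketch, with invented inputs:

```python
# A scale with 30 or more steps is treated the same as having no ceiling at all.
scales = [
    ("U1", "Full", 24),    # ordinary ceiling
    ("U2", "Full", 36),    # 30+ steps: no effective ceiling
    ("U3", "Full", None),  # no ceiling in the agreement
]

def has_effective_ceiling(steps, cutoff=30):
    return steps is not None and steps < cutoff

share = sum(has_effective_ceiling(s) for _, _, s in scales) / len(scales)
print(f"{share:.0%} of these scales have an effective ceiling")  # 33%
```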

However, these are, to some extent, outliers.  If we look at the mean number of increments per rank, the numbers are considerably lower – most agreements have between 14 and 20 increments per rank – which is probably absurdly high for Assistant Profs, but otherwise about right.

Figure 3 – Mean Number of Pay Increments, by Academic Rank and Type of Institution

We can also examine these patterns by region.  Ontario turns out to have the fewest institutions with maxima, especially when it comes to Assistant and Associate Professors, as shown in Figure 4. Institutions in Ontario also, on average, have about five more pay increments per scale than institutions in the rest of the country.

Figure 4 – Percentage of Institutions With Pay Ceilings, by Academic Rank and Region

It should be noted here that shorter times to maxima are not necessarily positive or negative, either for institutions or faculty associations. Agreements with fewer increments tend to have larger increases per increment, meaning professors may earn higher salaries more quickly. Conversely, more years may reflect more modest annual increments.

So there you have it.  Most institutions do in fact have pay grids with ceilings, although in some cases these are more abstract than real.  This was a lot of effort to settle an 8-month-old internet argument, but perhaps some of you will find it useful.

 

April 27

McGill vs. UBC

In eastern parts of the country, if you use the words “the three best universities in Canada”, people look at you slightly oddly.  They know you mean U of T and McGill, but they’re not 100% sure who the third one is.  “UBC?” they ask, uncertainly.  This is pure eastern myopia.  Today, I will advance the proposition that, by most measures, UBC is substantially ahead of McGill, and is in fact the country’s #2 university.

Let’s start with some statistics on size, just to orient ourselves. UBC is the slightly bigger institution, and at both institutions graduate students account for about 26% of all FTEs.

Enrolment and Academic Staff Complement, UBC vs. McGill

Now let’s look at money.  The two institutions have similar-sized endowments, a shade over $1.3 billion each, which is a point in McGill’s favour when adjusted for student body size.  When it comes to budgets, however, there is simply no comparison: UBC has a total budget of $2.2 billion, and an operating budget of $1.1 billion; the equivalents for McGill are $1.4 billion and $620 million.  On an unadjusted basis, UBC takes in $58,500 per FTE student, to McGill’s $40,493 – a 44% gap in UBC’s favour.  If we adjust for student body composition – that is, convert all FTEs into weighted FTEs based on field of study, using the weights used by the Quebec government (see here for more details) – then the gap actually increases somewhat, to 48%.  Point UBC.
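
For those keeping score at home, the weighted-FTE arithmetic looks like this.  The enrolment mix and weights below are invented stand-ins (the real weights are the Quebec government’s); the point is just that income gets divided by a mix-adjusted enrolment count:

```python
fte = {"arts": 20_000, "science": 10_000, "medicine": 3_000}
weight = {"arts": 1.0, "science": 1.5, "medicine": 5.0}  # hypothetical weights

weighted_fte = sum(n * weight[f] for f, n in fte.items())  # 50,000 here
total_income = 1_900_000_000  # illustrative

print(round(total_income / weighted_fte))  # income per weighted FTE: 38000
```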

Figure 1: Total Income per Student and per Weighted Student Unit, 2011-12, UBC vs. McGill

Now let’s look at some measures of research output, like bibliometrics.  This data is taken from the 2013-14 Leiden rankings, the most comprehensive publicly available list of bibliometric indicators.  On sheer volume of publications UBC wins, which probably isn’t surprising given its size.  But UBC is also ahead on both measures of publication impact – normalized citation scores, and the percentage of papers among the 10% most-cited in their field over the past five years – as well as in the percentage of papers involving collaboration with industry.  Only in the category of papers with international collaborators does McGill come out on top.  Point UBC.

World Position in Leiden Rankings on Selected Bibliometric Indicators, UBC vs. McGill (Leiden Rankings, 2014-15)

While we’re at the research output game, we might as well see what the “Big Three” international rankings (Shanghai ARWU, Times Higher, and QS) say, all of which are based mostly either on research or on prestige (as measured by surveys of academics).  Two of the three say: point UBC.

Positions in Major International Rankings, 2014/15, UBC vs. McGill

Does faculty pay matter?  Here’s the most recent average pay data from the two institutions.  UBC wins again, by about 20% at the level of assistant profs, and by 15% above that.  Still: point UBC.

Figure 2: Average academic staff pay by rank, UBC vs. McGill

Now, this is from Statistics Canada’s Full-time University and College Academic Staff Survey, 2009-10.  And yes, that’s old, but it’s the last year for which we have data from both institutions, because Statscan discontinued the survey, and as far as I know, the COU-led replacement survey hasn’t reported anything publicly yet.  Given both institutions’ limits on salary increases over the last few years, I doubt the gap has changed much.

And of course, there’s student experience.  Here are the two universities compared on the main aspects of student satisfaction, using data from the final Canadian University Report.  These are scored on a 9-point scale.  Point: McGill.

Select Measures of Student Satisfaction, Canada

In other words, UBC comes out ahead on most measures.  And when you think about it, this isn’t all that surprising.  It has far more money than McGill, it has vast endowment lands, which represent a huge source of future income, and it is far better positioned to take advantage of the rise of Asia.  Arguably, given the imbalance in resources, the question is: why isn’t UBC even further ahead of McGill than it actually is?  (Or, to reverse that: well done to McGill for being so efficient!)

To conclude: UBC is fairly clearly ahead of McGill – the question now is when will it overtake U of T?
