Higher Education Strategy Associates

Category Archives: Data

September 15

Why our Science Minister is Going to be Disappointed in Statscan

Last week Statscan sent me a consultation form asking my opinions about their ideas on how to change UCASS (the University and College Academic Staff Survey, which, like most Statscan products containing the word “college”, does not actually include the institutions most of us call “colleges”, i.e. community colleges).  I’ve already said something about this effort back here, to the effect that focussing so much effort on data collection re: part-time staff is a waste of time.  But the consultation guide makes me think Statscan is heading into serious trouble with this survey reboot for a completely different set of reasons.

Remember that when the money for all this was announced, the announcement was made by our Minister of Science, Kirsty Duncan.  One of her priorities as Minister has been to improve equity outcomes in scientific hiring, particularly when it comes to things like Canada Research Chairs (see here, for instance).  The focus of her efforts has usually been gender, but she’s also interested in other equity populations – in particular, visible minorities, Indigenous peoples, and persons with disabilities.  So one of the things she has charged Statscan with doing in this revived UCASS (recall that Statscan cut the program for five years as a result of Harper-era cuts) is helping shine a light on equity issues in terms of salaries, full-time/part-time status, and career progression.

This is all fine except for one tiny thing.  UCASS is not a questionnaire-based instrument.  It’s an administrative survey.  That means institutions fill in a complicated set of sheets to provide Statscan with hundreds of different aggregated data cuts about their institution (what is the average salary of professors in Classics?  How many professors in chemical engineering are female?  Etc.).  In order to use UCASS to address the demographic questions Duncan wants answered, institutions would first need to know the answers themselves.  That is, they would need to know precisely which instructors have disabilities, or which are “visible minorities”, just as they currently know everyone’s gender.  Which means they would need to find a way to make such disclosures mandatory; otherwise, they would not be able to report to Statistics Canada.

I tried this idea out on my twitter audience over the weekend.  Let’s just say it did not go over very well.  A significant number of responses were, essentially: “over my dead body do I give this information to my employer.  If Statscan wants to know this, they can ask me directly.”

Well, yes, they could I suppose, but then the resulting data couldn’t be linked to administrative information on rank and salary without getting each faculty member’s permission, which I can see not always being forthcoming.  In addition, you’d have all sorts of non-response bias issues to deal with, especially if they tried to do this survey every year – my guess is most profs would simply ignore the survey after year 2.  And yes, you’d have to do it frequently because not all disabilities are permanent.

Here’s my suggestion.  Statscan should actually do two surveys.  Keep UCASS more or less the way it is, and extend it to colleges (some of which will take a decade to report properly, but that’s life) and to part-timers (if they must – frankly, I think more people would be interested in data on non-academic staff than in data on part-time staff).  But don’t mess around with the basic structure, and don’t try to force professors to report their demographic characteristics – other than gender, which is already in there – to their employers, because that’s just more trouble than it’s worth.  Then, every five years or so, do a second survey in which you take a demographic snapshot of the professoriate as a whole.  It will have mediocre completion rates, but it’s better than nothing.

(In fact, universities and colleges could do this themselves if they wanted to at a cost much lower than whatever Statscan will end up paying, but since they almost never collaborate on creating public data without a gun to their heads it seems like some federal intervention is inevitable if anyone wants this done).

This is not what Minister Duncan asked for, I know.  But it’s the only way to get her the data she wants without causing mayhem on campuses.  Hopefully, pragmatism will prevail here.

September 14

Notes on the Finances of China’s Top Universities

One of my distractions over the past summer has been learning more about Chinese universities.  Fortunately, this is becoming a lot easier, as Chinese universities are starting to put more of their data online.  Today, I just want to take you through a bit of a tour of China’s top universities (roughly the equivalent of the US Ivy League), which are known as the “C9” and most of which now publish their financial data.

So let’s start just by looking at raw annual expenditures (I prefer using expenditures rather than income as a guide to university size, because expenditures tend to be more stable year-to-year) at these top universities.  Figure 1 shows this by institution for the 2015 calendar year.  Tsinghua leads the pack by a wide margin, at a little over RMB 13 billion.  Peking, Zhejiang and Shanghai Jiao Tong are next, at between RMB 8 and 9 billion, followed by Fudan and Xi’an Jiao Tong at between RMB 5 and 6 billion.  The bottom positions are held by the two C9 universities which do not report to the higher education ministry: the University of Science and Technology of China (Chinese Academy of Sciences) and the Harbin Institute of Technology (Ministry of Industry and Information Technology), at RMB 3.4 billion and RMB 2.2 billion, respectively.

Figure 1: Expenditures in Billions of RMB, Top Chinese Universities, 2015

One interesting piece of information about these institutions is how little of their annual budget actually comes from government.  Figure 2 shows government appropriations as a percentage of annual expenditures (Harbin Institute of Technology is excluded because its financials do not distinguish between public and private sources of revenue).  As it turns out, top Chinese universities actually look a lot like Ontario ones in that they tend to get less than half their money from government.  That said, at most institutions student fees only account for about 15% of total revenue.

Figure 2: Government income as a % of total expenditures, Top Chinese Universities, 2015

Now at this point you may be wondering: RMB 13 billion… is that a lot?  A little?  What’s the frame of reference here?  Well, fair enough.  Let’s put all this into US dollars, just so we’re clear.  And for reference, let’s throw in data for Harvard, Berkeley, U of T and UBC for 2015-16 for comparison.  To do this, I’m converting to USD at the mid-2015 exchange rate of RMB 6.21 = CDN $1.29 = USD $1.  The results are shown in Figure 3: by this measure, only Tsinghua is really up in the North American big leagues.

Figure 3: Total Expenditures, in USD, Top Chinese Universities plus US/Canada Comparators, 2015

But hang on a second.  What if we use purchasing-power parity instead of exchange rates?  Well, actually, this changes things more than you’d think.  Figure 4 shows the same data converted at the mid-2015 Big Mac Index rate of RMB 3.55 = CDN $1.22 = USD $1.

Figure 4: Total Expenditures, in billions of USD at PPP, Top Chinese Universities plus US/Canada Comparators, 2015

Once adjusted for PPP, Tsinghua moves closer to Harvard, and the next three are more obviously in the big leagues, having all passed UBC.  Now in fact, PPP probably overstates universities’ buying power somewhat, because for many of the goods universities purchase (top professors, scientific equipment, etc.), the price is global rather than local.  So if you want to think about relative purchasing power, a fair comparison between the institutions is probably somewhere between figure 3 and figure 4.
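For concreteness, here’s the conversion arithmetic behind Figures 3 and 4 as a minimal sketch.  The RMB figures are the rounded values quoted above, not the universities’ exact reported numbers.

```python
# Conversion arithmetic behind Figures 3 and 4. Expenditure figures are
# the rounded RMB values quoted in the text, not exact reported numbers.
MARKET_RATE = 6.21   # mid-2015 market exchange rate, RMB per USD
BIG_MAC_RATE = 3.55  # mid-2015 Big Mac Index (PPP) rate, RMB per USD

expenditures_rmb_bn = {
    "Tsinghua": 13.0,           # "a little over RMB 13 billion"
    "Peking": 8.5,              # between RMB 8 and 9 billion
    "Zhejiang": 8.5,
    "Shanghai Jiao Tong": 8.5,
    "Fudan": 5.5,               # between RMB 5 and 6 billion
    "Xi'an Jiao Tong": 5.5,
}

for name, rmb_bn in expenditures_rmb_bn.items():
    usd_market = rmb_bn / MARKET_RATE  # the basis of Figure 3
    usd_ppp = rmb_bn / BIG_MAC_RATE    # the basis of Figure 4
    print(f"{name}: US${usd_market:.1f}B at market rates, "
          f"US${usd_ppp:.1f}B at PPP")
```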

(If we were to do this from the perspective of “how big is each institution relative to the size and development of the economy” – that is, adjusting for GDP per capita – all the Chinese institutions would rise by a factor of four relative to American ones, i.e. Tsinghua would be three times as large as Harvard.)

Now, what about dollars per student?  For this, I take the student numbers the institutions report to Quacquarelli Symonds (QS) for use on its “top universities” website.  You can take these with a grain of salt: I can’t get QS’s numbers to line up with the data I have directly from any of these institutions, but it’s the most consistent thing we’ve got, so we’ll just have to live with it.

Figure 5: Expenditures per Student, in USD at PPP, Top Chinese Universities plus US/Canada Comparators, 2015

Now Tsinghua is much more clearly in an Ivy-League-approaching kind of position, with expenditures of over $100,000 per student.  That’s nowhere near Harvard, which spends about twice that, but it is a full 25% higher than Berkeley and 150% higher than UBC and Toronto.  Even the Chinese second-tier trio of Shanghai Jiao Tong, Peking and Zhejiang are spending 50% more per student than the top Canadian universities.

In short, the top Chinese universities aren’t, as it is sometimes said, “rising”.  Financially, they’re already comfortably part of the world elite.

September 13

Some Curious Data From OECD Education at a Glance 2017

The OECD put out its annual Education at a Glance publication yesterday.  No huge surprises, except that they appear to have killed one of the most-used tables in the whole book (A.1.2, which compared tertiary attainment rates for 25-34 year olds by type of tertiary program – i.e. college v. university), which is an enormous bummer.  The finance data says what it pretty much always says: Canada is the #2 spender overall on higher education at 2.6% of GDP (just behind the US at 2.7%).  If you read my analysis last year, the story is still pretty much the same this year.

But there are some interesting nuggets buried away in the report nevertheless – stuff that other media won’t pick up.  I thought I would highlight two of them in particular which pose some thorny questions about Canadian statistical data and what we think we know about higher education.

Let’s start with the data on expenditures per pupil at the tertiary level.  Figure 1 looks at costs in short-cycle tertiary education (meaning career-oriented programs, which in Canada’s case means community colleges).

Figure 1: Total Expenditures per Student, Colleges (or equivalent), Select OECD countries

Among major countries, Canada spends the most (from both public and private sources) per college or college-equivalent student.  A couple of countries actually do outspend us (the Austrians and – totally out of left field – the Czechs), but the important point here is that our expenditures are nearly 40% above the OECD average.  And if you’re wondering why the UK and the US aren’t there, it’s because the former has no college equivalent, and the latter chooses not to report on colleges, on the batshit crazy grounds that even if you’re studying for a (college-equivalent) associate’s degree, the fact that this can be laddered up into a full bachelor’s means everything is really degree-level.  Nonsense, I know, but there we are.

Now, let’s do the same with universities:

Figure 2: Total Expenditures per Student, Universities, Select OECD countries

There’s not much in figure 2 we didn’t already know: the US and Canada are at the top in total expenditure per university student, with Canada over 50% above the OECD average, and Korea is way down at the bottom, because the Koreans do everything in higher ed on a shoestring.

Now, one new little detail that OECD has added to Education at a Glance this year is that it splits out the portion of total expenditures (that is, combined short-cycle and degree-level expenditures) which is devoted to R&D.  And this data is a little odd.

Figure 3: Total R&D Expenditures per Tertiary Student, Selected OECD Countries

There’s nothing egregiously wrong with figure 3 – except the data on the USA, which is bananas.  Read literally, it suggests that Canadian universities on average spend twice as much per student on R&D as American ones do, and that’s simply false.

(The explanation, I think, is that Canada and possibly some other countries count all professors’ time notionally spent on research – 40% of time or thereabouts – as “R&D”.  American universities – which only pay staff for 9 months a year, with the rest of the time notionally off for research – do not count time that way, preferring to claim that the government is buying profs’ time with research grants.  Basically, they view universities as mailboxes for cheques to pay for staff time, and so all that money gets claimed as government expenditure on R&D, not university expenditure on R&D.  GERD, not HERD, in the innovation policy lingo.  I think, anyway.)

What’s actually a little crazy about figure 3 is that the denominator is all tertiary students, not just degree-level students.  And yet we know that R&D money is pretty heavily concentrated (98%+) in universities.  In a country like Germany, where over 99% of tertiary students are in degree-level institutions, that’s not a big deal.  But in Canada, about a third of our students are in short-cycle programs.  Which means, if you do the math, that R&D expenditures per university student are actually a little ways north of $9,750.  Now here’s figure 3 again, with just degree-level students in the denominator.

Figure 4: Total R&D Expenditures per University Student, Selected OECD Countries
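Here’s the denominator arithmetic as a minimal sketch.  The starting per-tertiary-student figure is an assumption chosen only to reproduce the “$9,750” result, not OECD’s published number.

```python
# The denominator fix sketched above: since 98%+ of R&D money sits in
# universities, essentially all of it stays in the numerator; only the
# denominator shrinks by the short-cycle third.
rd_per_tertiary_student = 6_500  # hypothetical USD per *tertiary* student
university_share = 2 / 3         # ~1/3 of Canadian students are short-cycle

rd_per_university_student = rd_per_tertiary_student / university_share
print(rd_per_university_student)  # 9750.0
```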

And of course, subtracting these numbers means we can revisit figure 2 and work out total non-R&D expenditures per student in universities.  Canada still remains 40% or so ahead of the OECD average, but is now roughly that far behind the US in per-student expenditure.

Figure 5: Total non-R&D Expenditures per University Student, Selected OECD Countries

Now, to be clear: I’m not saying OECD is wrong, or Statscan is wrong, or anything else like that.  What I’m saying is that there appear to be major inconsistencies in the way institutions report data for international comparative purposes on key concepts like R&D.  And this particular inconsistency means that Canada at least (and possibly others) looks a lot better vis-à-vis the United States than it probably should.

Just something to keep in mind when making comparisons in future – particularly around research expenditures and performance.

September 11

The Growing Importance of Fee Income

I made a little remark last week to the effect that on present trends, student fees would pass provincial funding as a source of revenue for universities by 2020-2021 and combined fed-prov government funding by 2025.  Based on my twitter feed, that seems to have got people quite excited.  But I should have been a little clearer about what I was saying.

First of all, by “on present trends”, I literally meant doing the simple/stupid thing: taking the annual change from 2014-15 to 2015-16 and stretching it out indefinitely.  One could use longer-term trends, but for provincial government funds the difference is minuscule, because the 1-year and 5-year trends are pretty similar.  It’s harder to do that with the federal money, because it jumps around a lot on an annual basis (is there a federal infrastructure program in a given year?  Have they given a one-time bump to granting council dollars?  Etc.), and so medium-term trends are harder to discern.  Second, when I said fees would pass government funding, I meant for the entire budget, not just the operating budget (the feds don’t really contribute to operating budgets).  And third, I was speaking in terms of national averages: provincial averages vary considerably, and in some provinces fees passed government grants as a source of income some time ago.
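For concreteness, here’s that “simple/stupid” method as a minimal sketch, with placeholder dollar figures rather than the actual FIUC numbers.

```python
# Straight-line projection: hold the 2014-15 -> 2015-16 change constant
# and walk forward until fees cross grants. Figures are placeholders.
fees = {2014: 8_700, 2015: 9_000}      # $M, hypothetical
grants = {2014: 11_100, 2015: 11_000}  # $M, hypothetical

fee_step = fees[2015] - fees[2014]        # +300/year, held constant
grant_step = grants[2015] - grants[2014]  # -100/year, held constant

year, fee, grant = 2015, fees[2015], grants[2015]
while fee < grant:  # find the crossover year
    year += 1
    fee += fee_step
    grant += grant_step
print(f"Fees pass grants in {year}-{str(year + 1)[-2:]}")  # 2020-21 here
```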

Anyways, I thought it would be fun to do some inter-provincial comparisons on this.  To make things simple, I’m going to exclude federal funds from the exercise and just look at provincial grants and student fees.  As previously, the data source is the Statscan/CAUBO Financial Information of Universities and Colleges Survey.

Let’s start by looking at how grants and fees compare to the size of the operating budget of universities in each province.

Figure 1: Provincial grant and fee income as a percentage of operating income, by province, 2015-16

Now, remember: some provincial and fee income goes to areas other than the operating budget, and operating income is not restricted to just student fees and government grants.  Thus, you shouldn’t expect the two sets of lines to add up to 100%; in some cases they add to more than 100%, in some cases less.  But no matter: the point here is that already in 2015-16, fees represented a greater portion of the operating budget than government grants in Ontario, and an equal proportion in Nova Scotia.  In BC and PEI, fee and grant income are close-ish, but in the other six provinces government grants predominate.

Now let’s look at the five-year percentage change in income, in real dollars, from fees and grants.  This one is kind of complicated, so bear with me.

Figure 2: Change in income from provincial grants and student fees, by province, 2010-11 to 2015-16

There are seven provinces which share a pattern: increasing real fee income and decreasing real provincial grant income, though the extent varies.  The biggest shifts here are in Ontario and BC.  Quebec is the only province which has seen an increase in income from both sources.  In all eight of these provinces, we can do straight-line projections of the future pretty easily.

But then there are two provinces – Newfoundland and New Brunswick – which have seen net decreases in both sources of income.  Basically, this is what happens when a demographic collapse coincides with a fiscal collapse.  In per-student terms this doesn’t look quite so bad, because enrolments are declining; but since staff don’t get paid on a per-student basis, that doesn’t help much when it comes to paying the bills.  It’s hard to do straight-line projections with these two, because it’s quite clear the fee income declines aren’t going to continue indefinitely (the demographic collapse stabilizes, eventually).  So we’re going to say good-bye to these two for the rest of this analysis, while wishing them the very best in dealing with their rather significant challenges.

OK, for the remaining eight provinces, let’s combine the info in those last few graphs.  Take the income-by-source data in figure 1, and then apply the trend changes in figure 2 to each province.  The easiest way to show this in a graph is to show fee income as a percentage of provincial grant income.  We can show this out to 2024-25, as seen below in figure 3.

Figure 3: Projected ratio of student fee income to government grant income to 2025, by province

What figure 3 really shows is that Canada is heading towards a much more financially heterogeneous higher education system.  For the country as a whole, fee income for universities should surpass provincial government grants in 2020-21.  But this masks huge variation at the provincial level.  Ontario and Nova Scotia (by now) already exceed that level.  BC will get there in three or four years, PEI will get there by 2024-25.  But the other provinces aren’t on track to hit that level until 2030 at the earliest (and in Quebec’s case it’s about 2055).

Another way to think of this is that in about a decade’s time, the funding landscape in places like Quebec, Manitoba and Saskatchewan is going to look the way it did in Ontario ten to fifteen years ago.  At the same time, Ontario’s funding landscape is going to look a lot like big American public schools, with less than 30% of the operating budget (and probably something like 15% of total funding) coming from provincial governments.  Differing incentives facing different universities means they are probably going to be run quite differently too: expect a greater variety of institutional cultures as a result.

Now, as with any straight-line projection, you should take the foregoing with a healthy amount of salt.  Politics matter, and funding trajectories can change.  This is one possible scenario, not necessarily the most likely but simply the one most in line with current trends.

But keep in mind that the above is probably the good news scenario for Ontario.  The bad news scenario would see the percentage of funds coming from fees held down not by increasing the government grant, but by restricting student intake, or the intake of international students (which is where the big gains in fees are really coming from).  So even if you find this scenario disturbing: be careful what you wish for.

September 08

Data on Sexual Harassment & Sexual Assault in Higher Ed: An Australian Experiment

Earlier this year, I raged a bit at a project the Ontario government had launched: namely, an attempt to survey every single student in Ontario about sexual assault in a way that – it seemed to me – was likely to be (mis)used for constructing a league table of which institutions had the highest rates of sexual assault.  While getting more information about sexual assault seemed like a good idea, the possibility of a league table – based as it would be on a voluntary survey with pretty tiny likely response rates – was a terrible idea which I suggested needed to be re-thought.

Well, surprise!  Turns out Australian universities actually did this on their own initiative last year.  They asked the Australian Human Rights Commission (AHRC) to conduct a survey almost exactly along the lines I said was a terrible idea. And the results are…interesting.

To be precise: the AHRC took a fairly large sample (a shade over 300,000) of university students – not a complete census the way Ontario is considering – and sent them a well-thought-out survey (the report is here).  The response rate was 9.7%, and the report authors quite diligently and prominently noted the issues with data of this kind, which are the same ones that bedevil nearly all student survey research, including things like the National Survey of Student Engagement, the annual Canadian Undergraduate Survey Consortium studies, etc.

The report went on to outline a large number of extremely interesting and valuable findings.  Even if you take the view that these kinds of surveys are likely to overstate the prevalence of sexual assault and harassment because of response bias, the data about things like the perpetrators of assault/harassment, the settings in which it occurs, the reporting of such events and the support sought afterwards are still likely to be accurate, and the report makes an incredible contribution by reporting these in detail (see synopses of the report from CNN and Nature).  And, correctly, the report does not reveal data by institution.

So everything’s good?  Well, not quite.  Though the AHRC did not publish the data, the fact that it possessed data which could be analysed by institution set up a dynamic in which, had the data not been released, there would have been accusations of cover-up, suppression, etc.  So the universities themselves – separately from the AHRC report – decided to voluntarily release their own data on sexual assaults.

Now, I don’t think I’ve ever heard of institutions voluntarily releasing data on themselves which a) allowed direct comparisons between institutions, b) concerned such a sensitive subject, and c) was of such suspect quality.  But they did it.  And sure enough, news agencies such as ABC (the Australian one) and News Corp immediately turned this crap data into a ranking, which means that for years to come, the University of New England (it’s in small-town New South Wales) will be known as the sexual assault capital of Australian higher education.  Is that label justified?  Who knows?  The data quality makes it impossible to tell.  But UNE will have to live with it until the next time universities do a survey.

To be fair, on the whole the media reaction to the survey was not overly sensationalist.  For the most part, it focussed on the major cross-campus findings and not on institutional comparisons.  Which is good, and suggests that some of my concerns from last year may have been overblown (though I’m not entirely convinced our media will be as responsible as Australia’s).  That said, for data accuracy, a much smaller sample with incentives to produce a much higher response rate would still deliver much better data quality than what the AHRC got, let alone than the nonsensical census idea Ontario is considering.  The subject is too important to let bad data quality cloud the issue.

 

Erratum: There was a data transcription error in yesterday’s piece on tuition.  Average tuition in Alberta is $5749 not $5479, meaning it is slightly more expensive than neighbouring British Columbia, not slightly less.

September 07

Tuition Fees in Canada, 2017-18

So, yesterday was the annual tuition fee data dump from Statscan.  Probably worth going over the data just a bit to see what the story is.

The data everyone likes to focus on is the “average undergraduate tuition fee by province”.  This year, it looks like this (note that “fees” here do not include ancillary fees, only tuition proper):

Figure 1: Average Domestic Undergraduate Tuition Fees by Province, 2017-18

The other number that people always look out for is the one that shows increases over time.  For reasons that defy easy comprehension, Statscan always publishes these in nominal rather than real dollars, which leads to inflated estimates of tuition increases.  So I’ve put all the figures in 2017 real dollars in Figure 2:

Figure 2: Average Domestic Undergraduate Tuition Fees, Canada, 2006-07 to 2017-18

 

So, accounting for inflation, the increase in tuition fees is 25% over 11 years, or an average of about 2% per annum.
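As a quick check on that annualization (taking the 25% endpoint-to-endpoint figure as given):

```python
# Annualizing a 25% real increase over the 11 years from 2006-07 to 2017-18.
r = 1.25 ** (1 / 11) - 1
print(f"{r:.2%}")  # 2.05% per year, compounded
```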

Now keep in mind that what is being averaged here is tuition fees across all domestic undergraduate students, not fees across all undergraduate programs.  So where a province has one set of fees for in-province students and another for out-of-province students (e.g. Quebec, Nova Scotia), the two get averaged.  Also, even if there is no change in posted tuition, if more students enrol in more expensive programs (e.g. engineering) and fewer in cheaper ones (e.g. Arts), then that will still mean an increase in average tuition.

And tuition does vary a heck of a lot by field of study even at the undergraduate level.  This is actually a nuance of Canadian tuition fee data which is not well understood outside the country, where variable tuition by field tends to be quite rare. Here’s the national average by field:

Figure 3: Undergraduate Average Tuition by Selected Field of Study, 2017-18

Note here that the presence of a few very expensive professional disciplines drags up the average substantially.  In humanities and social sciences, average tuition fees are 12-15% lower than the national average, and in education they are 29% lower.  This is one of those cases where the average price is somewhat higher than the median price – something to keep in mind when thinking about affordability.

There was some other interesting data in yesterday’s release with respect to domestic graduate student fees (up 1.8% in nominal terms, vs. 3.1% for undergraduate fees) and international student fees (surprise, surprise – up 6.1% in nominal dollars), but the above covers the main points of interest.  Nothing terribly exciting, but worth re-capping and putting into context nonetheless.

September 06

Canadian University Finance Statistics (2015-16 Edition)

The 2015-16 version of the Financial Information of Universities and Colleges Survey (which, confusingly, doesn’t include community colleges) was released over the summer.  As in previous years, I’m going to do a little summary of what it tells us about how income and expenditure have changed over one year and five years.  Just so we’re all clear, all figures here are in real (i.e. inflation-adjusted) dollars.  And – caveat – comparisons with 2010-11 are a little weird, because Quebec universities changed their fiscal year-end that year and only reported 11 months of data, meaning that nationally, reported expenditures for that year are probably about 1.5% lower than they normally would be.  This means that 5-year changes are probably slightly inflated compared to reality.

For starters, let’s look at total income by source, which was $34,385 million.  That’s down nearly 5% from the previous year, though a little over 75% of the drop is due to a fall in endowment income (apparently everyone got hammered in 2015).  Income from governments fell by a little under 2%, nearly all due to reductions at the provincial level.  Over the past five years, revenue from government is down by a stonking 12.6%.  However, rising fee income mostly compensates for this: it rose by nearly 5% over 2014-15 and 27% over 2010-11.  For the most part, these increases are coming not from domestic student fees but from increases in international student enrolment.

Figure 1: Change in Total Income by Source, Canadian Universities, 2015-16

What’s really interesting about the total income numbers is how small the government numbers are becoming.  Already as of 2014, the university sector as a whole took in more money from non-governmental sources (fees, donations, sale of goods and services, etc) than it did combined from the federal and provincial governments.  On current trends, income from student fees will surpass provincial government grants to universities in 2020-21, and will pass combined federal and provincial contributions in 2024-2025.  At which point it would be fair to say we will have moved from a public university system to a publicly-assisted university system.

Now, on to the changes in income by fund, which I show below in Figure 2.  This tells a slightly different story.  Operating income kept pace with inflation in 2015-16, and over a five-year period actually increased by 8.8%.  Endowment income fell from about $1.5 billion to about $27 million, a fall of roughly 98%; but this is an erratic income source, and like I said, 2015 was a bad year.  Capital expenditures are down substantially, but recall that in the base year of 2010-11 the feds were sacrificing billions to the Construction Industry Gods to keep the recession at bay; in fact, current capital expenditure is close to the 30-year norm.  The interesting piece is that sponsored research income is down 6% over the past five years.

Figure 2: Change in Income by Fund, Canadian Universities, 2015-16

On to expenditures by type.  Total expenditures are roughly unchanged, either over one year or over five.  If you’re wondering how this is possible when income is down, recall that most of the income drop is in endowment income, which has very little impact on year-to-year spending, since it’s supposed to be salted away anyway.  But while total expenditures are unchanged, some fairly big line items continue to rise over the medium if not the short term: academic salaries by 7.5%, salaries to non-academic staff by 8.3%, total compensation (including benefits) by 8.3%, and scholarships – three-quarters of which go to grad students – by a whopping 16.4% since 2010-11.  Total scholarship expenditures are now just shy of $2 billion, which means institutions are giving back to students over 20 cents of every dollar they collect from students; for domestic students, the figure is closer to 30 cents.

Figure 3: Change in Selected Types of Expenditure, Canadian Universities 2015-16

 

Now you may well ask yourself: wait a minute.  Total expenditures are flat, but salaries and scholarships are rising.  So how does this balance? Well, simple enough: non-salary, non-scholarship expenditures have fallen by 14% in the last five years in constant dollars.  Some of that is just buildings not getting built (no loss, in the eyes of some), but other things are getting squeezed, too; notably, renovations, travel and printing.

Finally, let’s look specifically at what’s going on inside the operating budget (that is, excluding ancillary, capital, research and the like), which accounts for about 60% of the total.  Figure 4 shows that, overall, operating expenditures rose by 14.3% over five years.  How is this possible when operating income only rose 8.8%, you ask?  Mainly because universities have been trimming margins: they were running surpluses five years ago and mostly aren’t any more.

 

Figure 4: Changes in Operating Budget Expenditures, Canadian Universities, 2015-16

The big expenditure increases are in ICT and student services.  In the case of student services, an awful lot of that increase is scholarships.  In ICT, interestingly, the cost of equipment purchased has actually gone down: the increases are in staff costs, consulting contracts, professional fees and equipment rentals.  Make of that what you will.  The biggest piece of the pie – instruction and non-sponsored research (meaning, basically, what it costs to run core academic functions), which takes up about half the operating budget – is up 11.7% over five years.

So there you go.  Don’t say financial reports aren’t fun.

 

June 05

Student Health (Part 3)

You know how it is when someone tries to make a point about Canadian higher education using data from American universities? It’s annoying.  Makes you want to (verbally) smack them upside the head. Canada and the US are different, you want to yell. Don’t assume the data are the same! But of course the problem is there usually isn’t any Canadian data, which is part of why these generalizations get started in the first place.

Well, one of the neat things about the ACHA-NCHA campus health survey I was talking about last week is that it is one of the few data collection instruments in use on both sides of the border.  Same questions, administered at the same time, to tens of thousands of students in both countries.  And as I started to look at the data for 2016, I realized my “Canada is different” rant is – with respect to students and health, at least – almost entirely wrong.  Turns out Canadian and American students are about as alike as two peas in a pod.  It’s kind of amazing, actually.

Let’s start with some basic demographic indicators, like height and weight.  I think I would have assumed automatically that American students would be both taller and heavier than Canadian ones, but figure 1 shows you what I know.

Figure 1: Median Height (Inches) and Weight (Pounds), Canadian vs. US students.

Now, let’s move over to issues of mental health, one of the key topics of the survey. Again, we see essentially no difference between results on either side of the 49th parallel.

Figure 2: Within the last 12 months have you been diagnosed with/treated for…

What about that major student complaint, stress?  The ACHA-NCHA survey asks students to rate the stress they’ve been under over the past 12 months.  Again, the patterns in the two countries are more or less the same.

Figure 3: Within the last 12 months, rate the stress you have been under.

One interesting side-note here: students in both countries were asked about issues causing trauma or being “difficult to handle”. Financial matters were apparently more of an issue in Canada (40.4% saying yes) than in the US (33.7%). I will leave it to the reader to ponder how that result lines up with various claims about the effects of tuition fees.

At the extreme end of mental health issues, we have students who self-harm or attempt suicide. There was a bit of a difference on this one, but not much, with Canadian students slightly more likely to indicate that they had self-harmed or attempted suicide.

Figure 4: Attempts at Self-harm/suicide.

What about use of tobacco, alcohol and illicit substances?  Canadian students are marginally more likely to drink and smoke, but apart from that the numbers look pretty much the same.  The survey, amazingly, does not ask about use of opioids/painkillers, which, if books like Sam Quinones’ Dreamland are to be believed, have made major inroads among America’s young – I’d have been interested to see the data on that.  It does ask about a bunch of other minor drugs – heroin, MDMA, etc. – and none of them really register in either country.

Figure 5: Use of Cigarettes, Alcohol, Marijuana, Cocaine.

This post is getting a little graph-heavy, so let me just run through a bunch of topics where there’s essentially no difference between Canadians and Americans: frequency of sexual intercourse, number of sexual partners, use of most illegal drugs, use of seat belts, likelihood of being physically or sexually assaulted, rates of volunteering…. In fact, one of the few places where you see significant differences between Canadian and American students is in the kinds of physical ailments they report: Canadian students are significantly more likely to report having back pain; Americans, more likely to report allergies and sinus problems.

Actually, the really big differences between the two countries are around housing and social life.  In Canada, less than 2% of students reported being in a fraternity/sorority, compared to almost 10% in the United States.  And as for housing, as you can see, Americans are vastly more likely to live on campus and vastly less likely to live at home.  On balance, that means they are incurring significantly higher costs to attend post-secondary education.  It also probably means campus services are under a lot more pressure in the US than up here.

Figure 6: Student Living Arrangements.

A final point here is with respect to perceptions of campus safety.  We all know the differences in rates of violent crime in the two countries, so you’d expect a difference in perceptions of safety, right?  Well, only a little bit, only at night, and mostly off-campus.  Figure 7 shows perceptions of safety during the day and at night, on campus and in the community surrounding campus.

Figure 7: Perceptions of safety on campus and in surrounding community.

In conclusion: when it comes to student health and lifestyle, apart from housing there do not appear to be many cross-border differences.  We seem to be living in a genuinely continental student culture.

June 01

Student Health (Part 1)

I have been perusing a quite astonishingly detailed survey that was recently released regarding student health.  Run by the American College Health Association, the National College Health Assessment (ACHA-NCHA) is a multi-campus exercise that has been conducted twice in Canada – once in 2013 and once in 2016.  Today, I’m going to look at what the 2016 results say, which are interesting in and of themselves.  Tomorrow, I’m going to look at how the data has changed since 2013, and why I think some claims about worsening student health outcomes (particularly mental health) need to be viewed with some caution.  If I get really nerdy over the weekend, I might do some Canada-US comparisons, too.

Anyways.

The 2016 study was conducted at 41 public institutions across Canada.  Because it’s an American-based survey, it keeps referring to all institutions as “colleges”, which is annoying.  27 of the institutions are described as “4-year” institutions (which I think we can safely say are universities), 4 are described as “2-year” institutions (community colleges), and 10 are described as “other” (I’m not sure what to make of this, but my guess would be community colleges/polytechnics that offer mostly three-year programs).  In total, 43,780 surveys were filled out (a 19% response rate), with a roughly 70-30 female/male split.  That’s pretty common for campus surveys, but there’s no indication that responses have been re-weighted to match actual gender splits, which is a little odd, but whatever.
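Re-weighting, for what it’s worth, is straightforward.  Here’s a minimal sketch: the 70-30 sample split is from the report, while the 50-50 population split is purely an assumption for illustration, and the safety figures are the ones discussed below.

```python
# Post-stratification sketch: weight each respondent by
# (population share / sample share) for their group.
sample_share = {"female": 0.70, "male": 0.30}      # from the report
population_share = {"female": 0.50, "male": 0.50}  # assumed enrolment mix

weights = {g: population_share[g] / sample_share[g] for g in sample_share}
print(weights)  # female ~0.71, male ~1.67

# Applied to the "feel safe on campus at night" numbers discussed below:
pct_safe = {"female": 0.27, "male": 0.61}
unweighted = sum(sample_share[g] * pct_safe[g] for g in pct_safe)      # ~37%
reweighted = sum(population_share[g] * pct_safe[g] for g in pct_safe)  # ~44%
print(f"unweighted: {unweighted:.0%}, re-weighted: {reweighted:.0%}")
```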

 

There’s a lot of data here, so I’m mostly going to let the graphs do the talking.  First, the frequency of students with various disabilities.  I was a little bit surprised that psychiatric conditions and chronic illnesses were as high as they were.

Figure 1: Prevalence of Disabilities

Next, issues of physical safety.  Just over 87% of respondents reported feeling safe on campus during the daytime; however, only 37% (27% of women and 61% of men – and right away you can see how the gender re-weighting issue matters, since with the sample’s 70-30 split, 0.7 × 27% + 0.3 × 61% ≈ 37%) say that they feel safe on campus at night.  To be fair, this is not a specific worry about campuses: when asked about their feelings of personal safety in the surrounding community, the corresponding figures were 62% and 22%.  Students were also asked about their experiences with specific forms of violence over the past 12 months.  As one might imagine, most of the results were fairly highly gendered.

 

Figure 2: Experience of Specific Forms of Violence Over Past 12 Months, by Gender

Next, alcohol, tobacco, and marijuana.  This was an interesting section, as the survey not only asked students about their own use of these substances, but also about their perceptions of other students’ use of them.  It turns out students vastly overestimate the number of their peers who engage with these substances.  For instance, only 11% of students smoked cigarettes in the past 30 days (plus another 4% using e-cigarettes and 3% using hookahs), but students believed that nearly 80% of their peers had smoked in the past month.

 

Figure 3: Real and Perceived Incidence of Smoking, Drinking and Marijuana Use over Past 30 Days

Figure 4 shows the most common conditions students had been diagnosed with and/or received treatment for in the last twelve months.  Three of the top ten, and two of the top three, were mental health conditions.

Figure 4: Most Common Conditions Diagnosed/Treated in last 12 Months

Students were also asked separately about the kinds of things that had negatively affected their academics over the previous year (defined as something which had resulted in a lower mark than they would otherwise have received).  Mental health complaints are very high on this list; much higher, in fact, than actual diagnoses of such conditions.  Also of note here: internet gaming was sixth among factors causing poorer marks, and finances only barely snuck into the top 10 reasons, with 10.3% citing them (though elsewhere in the study over 40% said they had experienced stress or anxiety as a result of finances).

Figure 5: Most Common Conditions Cited as Having a Negative Impact on Academics

A final, disturbing point here: 8.7% of respondents said they had intentionally self-harmed over the past twelve months, 13% had seriously contemplated suicide and 2.1% said they had actually attempted suicide.  Sobering stuff.

April 05

Student/Graduate Survey Data

This is my last thought on data for a while, I promise.  But I want to talk a little bit today about the increasing misuse of student and graduate surveys, and what we can do about it.

Back about 15 years ago, the relevant technology for email surveys became sufficiently cheap and ubiquitous that everyone started using them.  I mean, everyone.  What has happened over the last decade and a half is a proliferation of surveys and, with it – surprise, surprise – a steady decline in survey response rates.  We know that these low-participation surveys (nearly all are below 50% response, and most are below 35%) are reliable, in the sense that they give us similar results year after year.  But we have no idea whether they are accurate, because we have no way of dealing with response bias.
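To see why reliability is no comfort here, consider a toy simulation – every number in it is invented; only the mechanism matters – in which the probability of responding rises with how satisfied a student is:

```python
# When the chance of responding is correlated with the outcome, a
# low-response survey gives stable-but-biased estimates year after year.
import random

random.seed(1)
population = [random.gauss(60, 15) for _ in range(50_000)]  # true mean ~60

def run_survey(pop):
    # Satisfied students respond more often: classic non-response bias.
    responses = [x for x in pop if random.random() < 0.10 + 0.004 * (x - 60)]
    return sum(responses) / len(responses)

for year in (1, 2, 3):
    print(f"year {year}: estimated mean {run_survey(population):.1f}")
# Prints nearly identical estimates every year (reliable), all several
# points above the true mean of ~60 (inaccurate).
```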

Now, every once in a while, you get someone with the cockamamie idea that the way to deal with low response rates is to expand the sample.  Remember how we all laughed at Tony Clement when he claimed the (voluntary) National Household Survey would be better than the (mandatory) Long-Form Census because the sample size would be larger?  Fun times.  But this is effectively what governments do when they decide – as the Ontario government did in the case of its sexual assault survey – to carry out what amounts to a (voluntary) student census.

So we have a problem: even as we want to make policy on a more data-informed basis, the quality of student data is decreasing (this also goes for graduate surveys, but I’ll come back to those in a second).  Fortunately, there is an answer: interview fewer students, but pay them.

What every institution should do – and frankly, what every government should do as well – is create a balanced, stratified panel of about 1,000 students, and pay them maybe $10/survey to complete surveys throughout the year.  That way, you’d have good response rates from a panel that actually represents the student body well, as opposed to the crapshoot which currently reigns.  Want accurate data on student satisfaction, library/IT usage, or the incidence of sexual assault/harassment?  This is the way to do it (a sketch of the sampling step follows below).  And you’d also be doing the rest of your student body a favour by not spamming them with questionnaires they don’t want.

(Costly?  Yes.  Good data ain’t free.  Institutions that care about good data will suck it up).
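Mechanically, drawing such a panel is not hard.  Here’s a minimal sketch using pandas, in which the strata and the layout of the student records file are my assumptions:

```python
# Sketch of building a balanced, stratified panel. The strata and column
# names are assumptions; any registrar extract with equivalent fields works.
import pandas as pd

def build_panel(students: pd.DataFrame, n: int = 1000) -> pd.DataFrame:
    """Draw a panel whose strata mirror the student body's composition."""
    return (students
            .groupby(["faculty", "year_of_study", "gender"])
            .sample(frac=n / len(students), random_state=42))

# Usage (hypothetical file):
#   panel = build_panel(pd.read_csv("student_records.csv"))
# Each panelist is then paid (say, $10 per questionnaire) for every survey
# completed during the year, instead of spamming the whole student body.
```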

It’s a slightly different story for graduate surveys.  Here, you also have a problem of response rates, but with the caveat that, at least as far as employment and income data are concerned, we aren’t going to have that problem for much longer.  You may be aware of Ross Finnie’s work linking student data to tax data to work out long-term income paths.  An increasing number of institutions are now doing this, as indeed is Statistics Canada for future versions of its National Graduate Survey (I give Statscan hell, deservedly, but for this they deserve kudos).

So now that we’re going to have excellent, up-to-date data about employment and income, we can re-orient our whole approach to graduate surveys.  We can move away from attempted censuses with a couple of not-totally-convincing questions about employment, and re-shape them into what they should be: much more qualitative explorations of graduate pathways.  Give me a stratified sample of 2,000 graduates explaining in detail how they went from being a student to having a career (or not) three years later, rather than 50,000 students answering a closed-ended question about whether their job is “related” to their education, any day of the week.  The latter is a boring box-checking exercise; the former offers the potential for real understanding and improvement.

(And yeah, again: pay your survey respondents for their time.  The American Department of Education does this on its surveys, and it gets great data.)

Bottom line: we need to get serious about ending the Tony Clement-icization of student/graduate data.  That means constructing better samples, incentivizing participation, and asking better questions (particularly of graduates).  And there’s no time like the present.  If anyone wants to take this discussion further, let me know: I’d be overjoyed to help.
