HESA

Higher Education Strategy Associates

Category Archives: Data

January 21

Marginal Costs, Marginal Revenue

Businesses have a pretty good way of knowing when to offer more or less of a good.  It’s encapsulated in the equation MC = MR, and shown in the graphic below.

[Figure: profit-maximisation diagram]

Briefly, in the production of any good, unit costs fall at first as economies of scale kick in.  Eventually, however, if production is expanded far enough, you run into diseconomies of scale and the marginal cost begins to rise.  Where the marginal cost of producing one more unit of a good rises above the marginal revenue you receive from selling it (Q1 in the diagram above), that's the point where you start losing money, and hence where you stop producing the good.
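For readers who like to see the arithmetic, here is a minimal sketch of the MC = MR rule in Python.  The cost curve and price are invented purely for illustration; the only point is that profit peaks at the last unit whose marginal cost is still below marginal revenue.

```python
# Minimal sketch of the MC = MR rule with invented numbers.
# Marginal cost is U-shaped (economies of scale first, diseconomies later);
# marginal revenue is a flat price per unit.

def marginal_cost(q):
    return 0.05 * (q - 30) ** 2 + 10   # falls to 10 at q = 30, then rises

MR = 40  # flat marginal revenue (price per unit)

def profit(q):
    """Cumulative profit from producing q units (fixed costs ignored)."""
    return sum(MR - marginal_cost(i) for i in range(1, q + 1))

# Profit peaks at the last unit whose marginal cost is still below MR.
best_q = max(range(0, 101), key=profit)
print(f"Q1 = {best_q}, MC at Q1 = {marginal_cost(best_q):.2f}, MR = {MR}")
```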

(This gets more complicated for products like software or apps where the marginal cost of production is pretty close to zero, but we’ll leave that aside for the moment.)

Anyway, when it comes to delivering educational programs, you'd ideally like to think you're not doing so at a loss (otherwise, you eventually have a bit of a problem paying employees).  You want each program, over time, to more or less pay for itself.  It's not the end of the world if some don't (cross-subsidization of programs is, after all, a core function of a university), but it would be nice if they did.  In other words, you really want each program to have a production function where the condition MC = MR is fulfilled.

But here’s the problem.  Marginal revenue’s relatively easy to understand: it’s pretty close to average revenue, after all, though it gets a bit more complicated in places where government grants are not provided on a formula basis, and there’s some trickiness when you start calculating domestic fees vs. international fees, etc.  But the number of universities that genuinely understand marginal cost at a program level is pretty small.

Marginal costs in universities are a bit lumpy.  Let's say you have a class of twenty-five students and a professor already paid to teach it.  The marginal cost of the twenty-sixth student is essentially zero – so grab that student!  Free money!  Maybe the twenty-seventh student, too.  But after a while, costs do start to build.  Maybe with the thirtieth student there's a collective bargaining provision that says the professor gets a TA, or assistance in marking.  Whoops!  Big spike in marginal costs.  Then, when you get to forty, the class overfills and you need to split the course in two, with a new classroom and a new instructor, too.  The marginal cost of that forty-first student is astronomical.  But the forty-second is once again almost costless.  And so on, and so on.
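To make the lumpiness concrete, here is a toy sketch using the hypothetical thresholds from the example above; the dollar figures are invented and don't come from any real collective agreement or budget.

```python
# Toy sketch of the "lumpy" marginal cost of the nth student in a course
# already staffed for 25 students.  All thresholds and dollar figures are
# hypothetical, for illustration only.

TA_COST = 5_000            # marking support kicks in at the 30th student
NEW_SECTION_COST = 60_000  # new instructor + classroom when the class splits at 41

def marginal_cost_of_student(n):
    """Cost of admitting the nth student to the course."""
    if n == 30:
        return TA_COST           # collective agreement triggers a TA
    if n == 41:
        return NEW_SECTION_COST  # class overfills and must be split in two
    return 0                     # otherwise the extra student is essentially free

for n in (26, 30, 35, 41, 42):
    print(f"student #{n}: ${marginal_cost_of_student(n):,}")
```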

Now obviously, no one should measure marginal costs quite this way; in practice, it would make more sense to work out averages across a large number of classes, and work to a rule of thumb at the level of a department or a faculty.  The problem is that very few universities even do that (my impression is that some colleges have a somewhat better record here, but the situation varies widely).  Partly, it's because of a legitimate difficulty in understanding direct and indirect costs: how should things like light, heat, and the costs of student services, admissions, etc., be apportioned – and then there is the incredible annoyance of working out how to deal with things like cross-listed courses.  But mostly, I would argue, it's because no one wants to know these numbers.  No one wants to make decisions based on the truth.  Easier to make decisions in the dark, and when something goes wrong, blame it on the Dean (or the Provost, or whoever).

Institutions that do not understand their own production functions are unlikely to be making optimal decisions about either admissions or hiring.  In an age of slow revenue growth, more institutions need to get a grip on these numbers, and use them in their planning.

December 10

Reports, Books, and CUDO

It’s getting close to that time of year when I need to sign off for the holidays (tomorrow will be the last blog until January 4th).  So before then, I thought it would be worth quickly catching up on a few things.

Some reports you may have missed.  A number of reports have come out recently that I have been meaning to review.  Two, I think, are of passing note:

i) The Alberta Auditor General devoted part of his annual report (see pages 21-28) to the subject of risk management of cost-recovery and for-profit enterprises in the province's post-secondary institutions, and concluded that the government really has no idea how much risk the province's universities and colleges have taken on in the form of investments, partnerships, joint ventures, etc.  And that's partly because the institutions themselves frequently don't do a great job of quantifying this risk.  This issue's a sleeper – my guess is it will increase in importance as time goes on.

ii) The Ontario Auditor General reviewed the issue of university intellectual property (unaccountably, this story was overlooked by the media in favour of reporting on the trifling fact that Ontarians have overpaid for energy by $37 billion over the last – wait, what?  How much?).  It was fairly scathing about the province's current activities in terms of ensuring the public gets value for money for its investments.  A lot of the recommendations to universities consisted of fairly nitpicky stuff about documentation of commercialization, but there were solid recommendations on the need to track the impact of technology transfer, and in particular its socio-economic impact.  Again, I suspect similar issues will crop up with increasing frequency for both governments and institutions across the country.

Higher Ed Books of the Year.  For best book, I’m going to go with Lauren Rivera’s Pedigree: How Elite Students Get Elite Jobs, which I reviewed back here.   I’ll give a runner-up to Kevin Carey’s The End of College, about which I wrote a three-part review in March (here, here, and here).  I think the thesis is wrong, and as others have pointed out there are some perspectives missing here, but it put a lot of valuable issues about the future of higher education on the table in a clear and accessible way.

Worst book?  I'm reluctantly having to nominate Mark Ferrara's Palace of Ashes: China and the Decline of American Higher Education.  I say reluctantly because the two chapters on the development of Chinese higher education are pretty good.  But the thesis as a whole is an utter train wreck.  Basically it amounts to: China is amazing because it is spending more money on higher education, and the US is terrible because it is spending less money on higher education (though he never bothers to actually check how much each is spending, say, as a proportion of GDP, which is a shame, as he would quickly see that US expenditure remains way above China's even after adjusting for the difference in GDP).  The most hilarious bits are the ones where he talks about the erosion of academic freedom due to budget cuts, whereas in China… (you see the problem?  The author unfortunately doesn't).  Dreck.

CUDO: You may recall I had some harsh things to say about the stuff that Common University Data Ontario was releasing on class sizes.  I offered a right of reply, and COU has kindly provided one, which I reproduce below, unedited:

We have looked into the anomalies that you reported in your blog concerning data in CUDO on class size.  Almost all data elements in CUDO derive from third party sources (for example, audited enrolment data reported to MTCU, NSSE survey responses) or from well-established processes that include data verification (for example, faculty data from the National Faculty Data Pool), and provide accurate and comparable data across universities. The class size data element in CUDO is an exception, however, where data is reported by universities and not validated across universities. We have determined that, over time, COU members have developed inconsistent approaches to reporting of the class size data in CUDO.

 COU will be working with universities towards more consistent reporting of class size for the next release of CUDO.

With respect to data concerning faculty workload:  COU published results of a study of faculty work in August 2014,  based on data collected concerning work performed by full-time tenured faculty, using data from 2010 to 2012. We recognize the need for further data concerning teaching done by contract teaching staff. As promised in the 2014 report, COU is in the process of updating the analysis based on 2014-15 data, and is expanding the data collection to include all teaching done in universities by both full-time tenured/tenure track faculty and contract teaching staff. We expect to release results of this study in 2016.

Buonissimo.  ‘Til tomorrow.

December 04

Defending Liberal Arts: Try Using Data

A few weeks back, I wrote about the Liberal Arts/humanities, and some really bad arguments both for and against them.  As usual when I write these, I got a lot of feedback to the effect of: “well, how would you defend the Liberal Arts, smart guy?”  Which, you know, fair enough.  So, here's my answer.

The humanities, at root, are about pattern recognition in the same way that the sciences and the social sciences are: they just seek patterns in different areas of human affairs – in music, in literature, and in the narrative of history.  And though the humanities cannot test hypotheses about patterns using the same kinds of experimental methods as elsewhere, they can nevertheless promote greater understanding through synthesis.  Or, to paraphrase William Cronon's famous essay, the humanities are about making connections, only connections.  In a networked world, that's a valuable skill.

None of this, to me, is in doubt.  What is in doubt is whether this promise made by the humanities and Liberal Arts is actually delivered upon.  Other disciplines synthesize and make connections, too.  They promote critical thinking (the idea that other disciplines, disciplines founded on the scientific method, don’t promote critical thinking is the most arrogant and stupid canard promoted by people in the humanities).  What the humanities desperately need is some proof that what they claim is true is, in fact, true.  They need some data.

In this context, it's worth taking a look at the Wabash National Study of Liberal Arts Education.  This was an elaborate, longitudinal, multi-institutional study looking at how students in liberal arts programs develop over time.  Students took a battery of tests – on critical reasoning, intercultural effectiveness, moral character, leadership, etc. – at various points in their academic careers to see the effects of Liberal Arts teaching, holding constant the effects of things like gender, age, race, prior GPA, etc.  You can read about the results here – and do read them, because it is an interesting study.

At one level, the results are pretty much what we always thought: students do better if they are in classes where the teaching is clear and well-organized, and they learn more where they are challenged to do things, like applying theories to practical problems in new contexts, or integrating ideas from different courses in a project, or engaging in reflective learning.  And as can be seen here in the summary of results, the biggest positive effects of liberal arts education are on moral reasoning, critical thinking, and leadership skills (academic motivation, unfortunately, actually seems to go down over time).

So: mostly good for the Liberal Arts/humanities, right?  Not quite.  Let me quote the most interesting bit: the research found that “even with controls for student pre-college characteristics and academic major, students attending liberal arts colleges (as compared to their peers at research universities and regional institutions) reported significantly higher levels of clarity and organization in the instruction they received, as well as a significantly higher frequency of experiences on all three of the deep-learning scales.”  In other words, the effects of a Liberal Arts education on students in Liberal Arts colleges are significantly greater than the effects on students studying similar programs in other, larger institutions.  That is to say, it's the teaching environment and teaching practices, not the subject matter itself, that seem to make the difference.

Now, this does not suggest that the Liberal Arts/humanities can't deliver those kinds of benefits at larger universities; it's just to say that, to deliver those benefits, the focus needs to be on providing the subject matter using quite specific teaching practices and – not to beat around the bush – keeping class sizes down (which may in turn have implications for teaching loads and research activity, but that's another story).

There are some good stories for the Liberal Arts in the Wabash data, and some not so good stories.  But the point is, there is data.  There are some actual facts and insights that can be used to improve programs, to make them better at producing well-rounded critical thinkers.  And at the end of the day, the inquiry itself is what's important.  The humanities' biggest problem isn't that they have nothing to sell; it's that too frequently they act as if they have nothing to learn.  If more institutions adopted Wabash-like approaches, and acted upon them, my guess is the Liberal Arts would get a lot more respect than they currently do.

November 25

The 2015 OECD Education at a Glance

So the OECD’s Education at a Glance was published yesterday.  It’s taken a couple of months longer than usual because of the need to convert  into the new International Standard Classification of Education (ISCED) system.  No, don’t ask; it’s better not to know.

I won’t say there’s a whole lot new in this issue that will be of interest to PSE-types.  One point of note is that Statscan has – for no obvious or stated reason – substantially restated Canadian expenditure on tertiary educational institutions, downwards.  In last year’s edition, you may recall that they claimed 2011 spending was 2.8% of GDP, which I thought was a tad high (I couldn’t get it to go over 2.43%).  They are now saying that last year was in fact 2.6% of GDP, and this year is 2.5%.  That still puts Canada well ahead of most countries, and more than 50% ahead of the OECD average.

Figure 1: Selected OECD Countries’ Spending on Tertiary Education as a Percentage of Gross Domestic Product

Next, the shift to the ISCED system has produced a slight change to the way attainment data is presented.  Basically, it makes it easier to tease out different levels of attainment above the bachelor's level; but this makes no difference for Canada, because we can't actually measure these things.  The problem is our Labour Force Survey, which has a very vague and sketchy set of responses on educational attainment (basically, you can only answer “college” or “university” on attainment, so our college numbers include all sorts of weird private short-course programs, and our university numbers make no distinction between types of degrees).  Still, for what it's worth, here's how attainment rates for young Canadians (age 25-34) stack up against other countries.

Figure 2: Selected OECD Countries’ Tertiary Attainment Rates, 2012

Those of you familiar with the “Canada’s number 1” rhetoric that accompanied previous EAG releases may do a double-take at this graph.  Yes, certainly, Canada is still close to the top if you include all of post-secondary education.  But it used to be that we were also at – or close to – the top on university education, as well; now, we’re actually below the OECD average.  What the heck is going on?

Well, it helps to look back a decade or so to see what the picture looked like then.

Figure 3: Selected OECD Countries’ Tertiary Attainment Rates, 2003

Much of what has changed is the way this data is presented.  First, the old 5A/5B classification excluded attainment at the doctoral level, which the new system does not.  Since European countries tend to have slightly higher doctoral degree award rates than we do, this cuts the difference a bit.  A bigger issue is the fact that, post-Bologna, a lot of European countries simply did away with short-cycle degrees from polytechnics and Fachhochschulen, and re-classified them as university degrees.  Finland thus went from a system with 23% attainment at the 5A (university) level and 17% at the 5B (college or polytechnic) level, to a system that is now simply 40% at degree level or above.  In other words, tertiary attainment rates are exactly the same in Finland as they were a decade ago, but the credentials have simply been re-labelled.  Something similar also happened in Germany.

While reclassification explains part of the change, it doesn’t explain it all.  Some countries are genuinely seeing much bigger increases in university attainment than we are.  There is South Korea, where attainment rates ballooned from 47% of all 25-34 year olds in 2003, to 68% in just a decade (30% to 45% at the university level alone), as well as Australia, where university attainment has gone from 25% to 38%.

Those are some quite amazing numbers.  Makes you wonder why we can’t do that, as well.

November 24

Class Size, Teaching Loads, and that Curious CUDO Data Redux

You may recall that last week I posted some curious data from CUDO, which suggested that the ratio of undergraduate “classes” (we're not entirely sure what this means) to full-time professors in Ontario was an amazingly low 2.4 to 1.  Three quick follow-ups to that piece.

1.  In the previous post, I offered space on the blog to anyone involved with CUDO who could clear up the mystery of why undergraduate teaching loads appeared to be so low.  No one has taken me up on this offer.  Poor show, but it's not too late; I hereby repeat the offer in the hope that someone will step forward with something convincing.

2.  I had a couple of people – both in Arts faculties at different medium-sized non-U15 Ontario universities – try to explain the 2.4 number as follows: teaching loads *are* in fact 4 courses per year (2/2), they said.  It’s just that once you count sabbaticals, maternity leaves, high enrolment (profs sometimes get a reduced load if one of their classes is particularly large), leaves for administrative duty, and “buyouts” (i.e. a prof pays to have a sessional teach the class so he/she can do research), you come down to around 2.5.

This is sort of fascinating.  I mean, if this were generally true, it essentially means that universities are managing their staff on the assumption that 35-40% of the staff resources theoretically available for teaching will not, in practice, be used for teaching.  Now, obviously all industries overstaff to some extent: sick leaves and maternity leaves happen everywhere.  But 40%?  That sounds extremely high.  It does not speak particularly well of an institution that gets its money primarily for the purpose of teaching.  Again, it would be useful if someone in an institution could confirm/deny, but it's a heck of a stat.

3.  Turns out there's actually a way to check this, because at least one university – give it up for Carleton, everyone – actually makes statistics about sessional professors public!  Like, on their website, for everyone to see.  Mirabile dictu.

Anyways, what Carleton says is that in 2014-15, 1,397 “course sections” were taught by contract or retired faculty, which translates into 756.3 “credits”.  At the same time, the university says it has 850 academic staff (actually, 878, but I’m excluding the librarians here).  Assuming they are all meant to teach 2/2, this would be 3,400 “classes” per year.  Now, it’s not entirely clear to me whether the definition of “classes” is closer to “credits” or “course sections”; I kind of think it is somewhere in between.  If it’s the former, then contract/retired faculty are teaching 22.2% of all undergraduate classes; if it’s the latter, then it’s 41.1%.  That’s a wide range, but probably about right.  And since Carleton is a pretty typical Canadian university, my guess is these numbers roughly hold throughout the system.
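For anyone who wants to check the arithmetic, here it is in a few lines of Python, using only the Carleton figures quoted above and the assumed 2/2 load:

```python
# Back-of-envelope check of the Carleton figures quoted above.
credits_by_contract = 756.3   # "credits" taught by contract or retired faculty
sections_by_contract = 1397   # "course sections" taught by contract or retired faculty
academic_staff = 850          # excluding librarians
assumed_load = 4              # 2/2, i.e. four classes per professor per year

total_classes = academic_staff * assumed_load   # 3,400 "classes"

print(f"if 'classes' ~ credits:  {credits_by_contract / total_classes:.1%}")   # ~22.2%
print(f"if 'classes' ~ sections: {sections_by_contract / total_classes:.1%}")  # ~41.1%
```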

However, what this doesn’t tell you is what percentage of credit hours are taught by sessionals – if the undergraduate classes taught by these academics are larger, on average, than those taught by full-timers, then the proportion will be even higher than this.  I’ve had numerous conversations with people in a position to know who indicate that in many Ontario Arts faculties, the percentage of undergraduate credit hours taught by sessional faculty is roughly 50%. Elsewhere, of course, mileage may vary, but my guess is that with the possible exception of the Atlantic, this is the case pretty much everywhere.

I could be wrong, of course.  As with my CUDO offer, anyone who wants to step forward with actual data to show how I am wrong is welcome to take over the blog for a couple of days to present the evidence.

November 17

Curious Data on Teaching Loads in Ontario

Back in 2006, university presidents got so mad at Maclean's that they stopped providing data to the publication.  Recognizing that this might create the impression that they had something to hide, they developed something called “Common University Data Ontario” (CUDO) to provide the public with a number of important quantitative descriptors of each university.  In theory, this data is of better quality and more reliable than the stuff they used to give Maclean's.

One of the data elements in CUDO has to do with teaching and class size.  There's a table for each university, which shows the distribution of class sizes in each “year” (1st, 2nd, 3rd, 4th): below 30, 31-60, 61-90, 91-150, 151-250, and over 250.  The table is done twice, once including just “classes”, and again with slightly different cut-points that include “subsections” as well (things like laboratories and course sections).  I was picking through this data when I realised it could be used to take a crude look at teaching loads, because the same CUDO data also provides a handy count of full-time professors at each institution.  Basically, instead of looking at the distribution of classes, all you have to do is add up the actual number of undergraduate classes offered, divide it by the number of professors, and you get the number of courses per professor.  That's not a teaching load per se, because many courses are taught by sessionals, and hell will freeze over before institutions release data on that subject.  Thus, any “courses per professor” figure derived from this exercise is going to overstate the amount of undergraduate teaching being done by full-time profs.

Below is a list of Ontario universities, arranged in ascending order of the number of undergraduate courses per full-time professor.  It also shows the number of courses per professor if all subsections are included.  Of course, at most institutions, subsections are not usually handled by full-time professors, but some are; so, assuming the underlying numbers are real, a “true” measure of courses per professor would be somewhere in between the two.  And remember, these are classes per year, not per term.
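The calculation itself is trivial.  Here is a sketch of it in Python; the class-size distribution and professor count below are invented for illustration, not any institution's actual CUDO figures:

```python
# Sketch of the "classes per professor" ratio derived from CUDO-style tables.
# The counts below are made up for illustration; CUDO reports the number of
# undergraduate classes in each size band.

classes_by_size_band = {        # hypothetical counts for one institution
    "1-30": 900, "31-60": 520, "61-90": 210,
    "91-150": 130, "151-250": 60, "250+": 25,
}
full_time_professors = 760      # also hypothetical

total_classes = sum(classes_by_size_band.values())
print(f"classes per professor: {total_classes / full_time_professors:.1f}")
# Note: sessionals also teach some of these classes, so this ratio overstates
# the undergraduate teaching actually done by full-time professors.
```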

Classes Per Professor, Ontario, 2013

Yes, you’re reading that right.  According to universities’ own data, on average, professors are teaching just under two and a half classes per year, or a little over one course per semester.  At Toronto, McMaster, and Windsor, the average is less than one course per semester.  If you include subsections, the figure rises to three courses per semester, but of course as we know subsections aren’t usually led by professors.   And, let me just say this again, because we are not accounting for classes taught by sessionals, these are all overstatements of course loads.

Now these would be pretty scandalous numbers if they were measuring something real.  But I think it’s pretty clear that they are not.  Teaching loads at Nipissing are not five times higher than they are at Windsor; they are not three and a half times higher at Guelph than at Toronto.  They’re just not.  And nor is the use of sessional faculty quite so different from one institution to another as to produce these anomalies.  The only other explanation is that there is something wrong with the data.

The problem is: this is a pretty simple ratio; it’s just professors and classes.  The numbers of professors reported by each institution look about right to me, so there must be something odd about the way that most institutions – Trent, Lakehead, Guelph, and Nipissing perhaps excepted – are counting classes.  To put that another way, although it’s labelled “common data”, it probably isn’t.  Certainly, I know of at least one university where the class-size data used within the institution explicitly rejects the CUDO definitions (that is, they produce one set of figures for CUDO and another for internal use because senior management thinks the CUDO definitions are nonsense).

Basically, you have to pick an interpretation here: either teaching loads are much, much lower than we thought, or there is something seriously wrong with the CUDO data used to show class sizes.  For what it’s worth, my money is on it being more column B than column A.  But that’s scarcely better: if there is a problem with this data, what other CUDO data might be similarly problematic?  What’s the point of CUDO if the data is not in fact common?

It would be good if someone associated with the CUDO project could clear this up.  If anyone wants to try, I can give them this space for a day to offer a response.  But it had better be good, because this data is deeply, deeply weird.

November 16

An Interesting but Irritating Report on Graduate Overqualification

On Thursday, the Office of the Parliamentary Budget Officer (PBO) released a report on the state of the Canadian labour market.  It's one of those things the PBO does because the state of the labour market drives the federal budget, to some extent.  But in this report, the PBO decided to do something different: it decided to look at the state of the labour market from the point of view of recent graduates, and specifically whether graduates are “overqualified” for their jobs.

The methodology was relatively simple: using the Labour Force Survey, determine the National Occupation Code (NOC) for every employed person between the ages of 25 and 34.  Since NOCs are classified according to the level of education they are deemed to require, it’s simple to compare each person’s level of education to the NOC of the job they are in, and on that basis decide whether someone is “overqualified”, “underqualified” or “rightly qualified”.
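In rough terms, the matching logic looks something like the sketch below.  This is a simplified reading of the method, not the PBO's actual code, and the occupation-to-education mapping is invented for illustration.

```python
# Simplified sketch of NOC-based qualification matching (illustrative only).
# Both the worker's education and the job's required education are ranked on
# one scale, then compared.

EDUCATION_RANK = {"high school": 1, "college": 2, "university": 3}

NOC_REQUIRED_RANK = {  # hypothetical occupations -> required education rank
    "retail salesperson": 1,
    "dental hygienist": 2,
    "civil engineer": 3,
}

def qualification_match(education, occupation):
    worker = EDUCATION_RANK[education]
    required = NOC_REQUIRED_RANK[occupation]
    if worker > required:
        return "overqualified"
    if worker < required:
        return "underqualified"
    return "rightly qualified"

print(qualification_match("university", "retail salesperson"))  # overqualified
print(qualification_match("college", "dental hygienist"))       # rightly qualified
```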

So here’s what the PBO found: over the past decade or so, among university graduates, the rate of overqualification is rising, and the rate of “rightly qualified” graduates is falling.  Among college graduates, the reverse is true.  Interesting, but as it turns out not quite the whole story.

Now, before I get into a read of the data, a small aside: take a look at the way the PBO chose to portray the data on university graduates.

Figure 1: Weaselly PBO Way of Presenting Data on Overqualification Among 25-34 Year Old University Graduates

Wow!  Startling reversal, right?  Wrong.  Take a look at the weaselly double Y-axis.  Here’s what the same data looks like if you plot it on a single axis:

Figure 2: Same Data on University Graduate Overqualification, Presented in Non-Weaselly Fashion

See?  A slightly less sensational story.  Clearly, someone in the PBO wanted to spice up the narrative a bit, and did so by making a pretty unforgivable graph, one designed to overstate the fundamental situation.  Very poor form from the PBO.
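For the record, the honest version takes no extra effort to produce.  A minimal matplotlib sketch – with placeholder numbers, not the PBO's actual series – looks like this:

```python
# Minimal sketch of the "non-weaselly" version: both series on one shared y-axis.
# The numbers here are invented placeholders, not the PBO data.
import matplotlib.pyplot as plt

years = list(range(1999, 2014))
rightly_qualified = [58 - 0.3 * i for i in range(len(years))]  # placeholder %
overqualified     = [32 + 0.3 * i for i in range(len(years))]  # placeholder %

fig, ax = plt.subplots()
ax.plot(years, rightly_qualified, label="Rightly qualified")
ax.plot(years, overqualified, label="Overqualified")
ax.set_ylim(0, 100)  # a single, honest axis from 0 to 100
ax.set_ylabel("Share of 25-34 year old university graduates (%)")
ax.legend()
plt.show()
```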

Anyways, what should we make of this change in university graduates’ fortunes?  Well, remember that there was a massive upswing in university access starting at the tail end of the 1990s.  This meant a huge change in attainment rates over the decade.

Figure 3: Attainment Rates Among 25-34 Year-Olds, Canada

What this upswing in the university attainment rate meant was that there were a heck of a lot more university graduates in the market in 2013 than there were, say, a decade earlier.  In fact, 540,000 more, on a base of just over a million – a 53% increase between 1999 and 2013.  Though the PBO doesn't mention it in the report, it's nevertheless an important background fact.  Indeed, it likely explains a lot of the pattern change we are seeing.

To see how important that is, let’s look at this in terms of numbers rather than percentages.

Figure 4: Numbers of Rightly-Qualified and Overqualified 25-34 Year Old University Graduates, 1999-2013

In fact, the number of rightly-qualified graduates is up substantially over the last decade, and they’ve been increasing at almost (though not quite) as fast a rate as the number of “overqualified” graduates.  For comparison, here’s the situation in colleges:

Figure 5: Numbers of Rightly-Qualified and Overqualified 25-34 Year Old College Graduates, 1999-2013

As advertised, there’s no question that the trend in college outcomes looks better than the one for universities.  Partly that’s because of improvements in colleges’ offerings, and partly it has to do with the run-up in commodity prices, which made college credentials more valuable (remember the Green-Foley papers? Good times).

What should you take from all of this?  If nothing else, don’t forget that comparing university outcomes over time is hard because of the changing size and composition of the student body.  Remember: the median student today wouldn’t have made it into university 25 years ago.  Average outcomes were always likely to fall somewhat, both because more graduates means more competition for the same jobs, and also because the average academic ability of new entrants is somewhat lower.

It would be interesting, for instance, to see these PBO results while holding high school grades constant – then you’d be able to tell whether falling rates of “rightly-qualified” graduates were due to changing economy/less relevant education, or a changing student body.  But since we can’t, all one can really say about the PBO report is: don’t jump to conclusions.

Especially on the basis of those godawful graphs.

November 04

How Canadian Universities Got Both Big and Rich

Earlier this week, I gave a speech in Shanghai on whether countries are choosing to focus higher education spending on top institutions as a response to the scarcity of funds since the start of the global financial crisis.  I thought some of you might be interested in this, so over the next two days I’ll be sharing some of the data from that presentation.  The story I want to tell today is about how exceptional the Canadian story has been among the top countries in higher education.

(A brief aside before I get started on this: there is nothing like a quick attempt to find financial information on universities in other countries to put our own gripes – Ok, my gripes – about institutional transparency into some perspective.  Seriously, you could fill the Louvre with what French universities don’t publish about their own activities.)

For the purpose of this exercise, I compare what is happening to universities generally in a country with what is happening at its “top” universities.  To keep things simple, I define as a “top” university any university that makes the Top 100 of the Shanghai Academic Ranking of World Universities (ARWU).  In Canada, that means UBC, Toronto, McGill, and McMaster (yes, it's an arbitrary criterion, but it happens to work internationally).  I use expenditures rather than income because fluctuations in endowment income make income numbers too noisy.  Figure 1 shows the evolution of funding at Canadian universities in real (i.e. inflation-adjusted) dollars.
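(For clarity, “real, indexed” here just means deflating nominal spending by CPI and setting the 2000-01 value to 100.  A minimal sketch of that calculation is below; the spending and CPI numbers are placeholders, not the actual CAUBO/Statistics Canada series.)

```python
# Sketch of the "real dollars, indexed to 2000-01" calculation used in the
# figures below.  Expenditure and CPI values are placeholders for illustration.

nominal_spending = {2000: 20.0, 2005: 28.0, 2010: 38.0, 2012: 41.0}  # $ billions
cpi              = {2000: 95.4, 2005: 107.0, 2010: 116.5, 2012: 121.7}

base_year = 2000
base_real = nominal_spending[base_year] / cpi[base_year]

for year in sorted(nominal_spending):
    real = nominal_spending[year] / cpi[year]   # deflate to constant dollars
    index = 100 * real / base_real              # index to 2000-01 = 100
    print(f"{year}: index {index:.0f}")
```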

Figure 1: Real Change in Expenditures, Canadian Universities 2000-01 to 2012-13, Indexed to 2000-01 (Source: Statistics Canada/CAUBO Financial Information of Universities and Colleges Survey)

So this is actually a big deal.  On aggregate, Canadian universities saw their expenditures grow by nearly 70% in real dollars between 2000 and 2010.  For “top” universities, the figure was a little over 80%  (the gap, for the most part, is explained by more research dollars).  Very few countries in the developed world saw this kind of growth.  It’s really quite extraordinary.

But a lot of that money went not to “improvement”, per se, but rather to expanding access.  Here are the same figures, adjusted for growth in student numbers.

Figure 2: Real Change in Per-Student Expenditures, Canadian Universities 2000-01 to 2012-13, Indexed to 2000-01

Once you account for the big increase in student numbers, the picture looks a little bit different.  At the “top” universities, real per-student income is up 20% since 2000, but about even since the start of the financial crisis; universities as a whole are up about 8% since 2000, but down by nearly 10% since the start of the financial crisis.

This tells us a couple of things.  First, Canadians have put a ton of money, both collectively and as individuals, into higher education over the past 15 years.  Anyone who says we under-invest in higher education deserves hours of ridicule.  But second, it’s also indicative of just how much Canadian universities – including the big prestigious ones – have grown over the past decade.  Figure 3 provides a quick look at changes in total enrolment at those top universities.

Figure 3: Changes in enrolments at highly-ranked Canadian universities, 2000-2001 to 2012-13, indexed to 2000-2001

In China, the top 40 or so universities were told not to grow during the country’s massive expansion of access, because they thought it would affect quality.  US private universities have mostly kept enrolment growth quite minimal.  But chez nous, McGill’s increase – the most modest of the bunch – is 30%.  Toronto’s increase is 65%, and McMaster’s is a mind-boggling 80%.

Michael Crow, the iconoclastic President of Arizona State University, often says that where American research universities get it wrong is in not growing more, and offering more spaces to more students – especially disadvantaged students.  Well, Canadian universities, even our research universities, have been doing exactly that.  What we’ve bought with our money is not just access, and not just excellence, but accessible excellence.

That’s pretty impressive. We might consider tooting our own horn a bit for things like that.

October 19

Canadian University Finances: An Update

Back in July, the Canadian Association of University Business Officers released the results of its survey of university finances for 2013-14.  The results underline the fact that institutions in Canada are facing some highly heterogeneous financial circumstances.

Let’s start with operating budgets.  Though universities are allegedly facing some kind of unprecedented austerity, total operating income rose by 4.17% in real dollars from the previous year  (inflation from September 2012 to September 2013 was a shade over 1%).  Income from government rose 0.9% in real dollars, from $11.1 billion to $11.2 billion.  But the big new source of money came from student fees, which rose 5.8% (again, after inflation) to $8.6 billion.  Remember, that’s not because tuition fees rose by 5.8%, but rather because both tuition fees and enrolment (notably, international enrolment) increased.

The surprise in 2013-14 was that although government grants and fees make up 91% of operating income, it was the remaining 9%, mostly endowment income, which actually accounted for 35% of all income growth, as shown below in Figure 1.  That’s probably not sustainable.
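(The “share of growth by source” calculation behind Figure 1 is simple enough to sketch.  The dollar figures below are placeholders roughly consistent with the numbers quoted above, not the actual CAUBO tables.)

```python
# Sketch of "share of operating income growth by source".
# Placeholder real $ billions: (previous year, current year).
income = {
    "government grants":       (11.1, 11.2),
    "student fees":            (8.1, 8.6),
    "other (incl. endowment)": (1.7, 2.0),
}

total_growth = sum(cur - prev for prev, cur in income.values())
for source, (prev, cur) in income.items():
    share = (cur - prev) / total_growth
    print(f"{source}: {share:.0%} of operating income growth")
```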

Figure 1: Source of Operating Income Growth

So, if operating income went up 4.17% after inflation, universities must be swimming in cash, right?  Well, no.  Because, in fact, spending went up to match income growth exactly at 4.17%.  Figure 2 shows the year-on-year increase in real spending.

Figure 2: Increases in Real Spending

Briefly: Compensation, which forms around 75% of the operating budget, is up by 4.6% after inflation (yes, after).  It’s not from hiring temps or sessionals, which gets classified as “other instructional wages” – that line item has actually shrunk slightly.  Total academic wages are up 3.6%; total non-academic wages are up 4.2%.  (Lest anyone get too excited about wage growth among non-academics, be aware that it’s primarily in student services and IT.  The smallest registered growth among all functional sectors was central administration, at 3.2%.)  But the real killer is what’s happening to benefits: up 9.2% in real terms.  Regardless of why this is happening (my guess: topping up barely solvent pension plans), the technical term for this situation is “bananas”.  But of course, it’s unfair to blame everything on the wage bill because universities aren’t excelling at restraining growth on their non-wage spending, either: that’s up 4.1%.  All of which is to suggest that the revenue theory of expenditure is alive and well in our universities; they raise all they can, and then spend all they raise.

Now of course, these trends aren’t spread equally across universities.  At some institutions, increases in operating budget came in at over 10% (UQTR, Trinity Western, and also UBC, but I think some of that is a reclassification issue).  Queen’s had an 8% real rise in income, and Toronto saw a 6% increase.  But at the other end of the table it’s pretty ugly.  UNB and PEI saw a fall of 3% in real terms, Mount Royal 4%, and NSCAD University a whopping 12%.

Similarly, outside  operating budgets, universities across the country have taken a hit.  Research income fell in nominal terms for the first time since 1995-1996; capital income fell by 8%, and now sits below $1 billion in real terms for the first time since 1998-99.

So, in short: operating costs are rising much faster than government funding, leaving institutions to fill the hole with larger student numbers and lots more international students.  Research and capital funding are down in some serious ways.  But these trends are playing out in different ways in different parts of the country.  Kind of like last year, only more so.

Happy Election Day, all; don’t forget to vote!

October 13

Statistics Canada is in the Wrong Century

If what you are looking for is agricultural statistics, Statistics Canada is a wondrous place.  See, Statscan even made a fabulous (if oddly truncated) little video about agricultural statistics.

Statscan can tell you *anything* about agriculture.  Monthly oilseed crushing statistics?  No problem (59,387 tonnes in August, in case you were wondering).  It can tell you on a weekly basis the weight of all eggs laid and processed in Canada (week of August 1st = 2.3 million kilograms); it can even break it down by “frozen” and “liquid”.  Want to know the annual value of ranch-raised pelts in Canada?  Statscan’s got you covered.

But let’s not stop here.  Wondering about barley, flaxseed, and canola deliveries for August, by province?  Check.  National stocks of sweetened concentrated whole milk, going back to 1970? Check (for comparison, GDP data only goes back to 1997).  Average farm prices for potatoes, per hundredweight, back to 1908?  Check.

There is even – and this one is my favourite – an annual Mushroom Growers' Survey.  (Technically, it's a census of mushroom growers – and yes, this means Statscan expends resources to maintain a register of Canadian mushroom growers; let that sink in for a moment.)  From this survey – the instrument is here – one can learn what percentage of mushrooms grown in Canada are of the Shiitake variety, whether said Shiitake mushrooms are grown on logs, in sawdust, or in pulp-mill waste fibres, and then compare whether the value per employee of mushroom operations is greater or less for Shiitake mushrooms than for Agaricus or Oyster mushrooms.

According to Statistics Canada, this is actually worth spending money on.  This stuff matters.

Also according to Statistics Canada: the combined value of agriculture, forestry, fishing, and hunting is $25 billion.  Or about $10 billion a year less than the country spends on universities alone.  Total value of educational services is $86 billion a year.

And yet, here are a few things Statscan doesn’t know about education in Canada: the number of first-year students in Canada, the number of part-time instructors at Canadian universities, the number of part-time professors at universities, anything at all about college instructors, access rates to post-secondary education by ethnic background or family income, actual drop-out and completion rates in secondary or post-secondary education, the number of new entrants each year to post-secondary education, the rate at which students transfer between universities and colleges, or within universities and colleges, time-to-completion, rates of student loan default, scientific outputs of PSE institutions, average college tuition, absolutely anything at all about private for-profit trainers… do I need to go on?  You can all list your pet peeves here.

Even on topics they do know, they often know them badly, or slowly.  We know about egg hatchings from two months ago, but have no idea about college and university enrolment from fall 2013.  We have statistics on international students, but they do not line up cleanly with statistics from Immigration & Citizenship.  We get totals on student debt at graduation from the National Graduates Survey, but they are self-reports and are invariably published four years after the student graduates.

What does it say about Canada's relationship to the knowledge economy when it is official policy to survey mushroom growers annually, but PSE graduates only every five years?  Who in their right mind thinks this is appropriate in this day and age?

Now, look, I get it: human capital statistics are more complicated than agricultural statistics, and they take more work, and you have to negotiate with provinces and institutions, and yadda yadda yadda.  Yes.  All true.  But it's a matter of priorities.  If you actually thought human capital mattered, it would be measured, just as agriculture is.

The fact that this data gap exists is a governmental problem rather than one resulting from Statscan specifically.  The agency is hamstrung by its legislation (which mandates a substantial focus on agriculture) and its funding.  Nevertheless, the result is that we have a national statistical system that is perfectly geared to the Edwardian era, but one that is not fit for purpose when it comes to the modern knowledge economy.  Not even close.
