HESA

Higher Education Strategy Associates

September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it’s pretty innocuous (“we need better/clearer data”) but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole “national approach” thing. Coates and Auld don’t make the argument directly, but they manage to slip a federal role in there. “Canada lacks a commitment to truly high-level educational accomplishment”, needs a “national strategy for higher education improvement” and so “the Government of Canada and its provincial and territorial partners should identify some useful outcomes”. To be blunt: no, they shouldn’t. I know there is a species of anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates & Auld’s point about the need for better data applies to provinces individually as well as collectively. They all need to get in the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here’s where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It’s an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronetcies united by grievances about parking and teaching loads that they really are). But when you look at things like scholarly output, graduation rates, and career outcomes, the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has medical programs, and large graduate programs in hard sciences, it will have greater scholarly output. It’s the palette of program offerings rather than their quality which makes the difference when making inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we’d find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find the Alberta schools looking rather more like the Maritime ones. Does that mean there’s been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada at least) location matters a lot more than educational quality when you’re dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his Education Policy Research Initiative (EPRI) showed big inter-institutional gaps in graduates’ incomes (I will get round to covering Ross’ excellent work on the blog soon, I promise). “Ah, interesting!” people said. “Look At The Inter-Institutional Differences Now We Can Talk Quality”. Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (and I know this because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).

Now, maybe Coates and Auld have thought all this through and I’m getting nervous for no reason, but their article’s focus on institutional performance when most relevant outcomes are driven by geography, program and selectivity suggests to me that there’s a desire here to impose some simple rough justice over some pretty complicated cause-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but “grading” institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.

September 20

Sessionals: Equal Pay for Equal Work?

Following up on yesterday’s piece about counting sessionals, I thought it would be a useful time to address how sessionals get paid.  Every so often, the Ontario Confederation of University Faculty Associations (OCUFA) issues a press release asking that contract faculty get “equal pay for work of equal value”.  And who could be against that?  But what they don’t say, because no one wants to say this out loud, is that in Canada, adjuncts and sessionals are far from being underpaid: for the most part they actually are compensated fairly.  At least according to the standards of the academy itself.

I know that’s an unpopular opinion, but hear me out.  Think about what the correct comparator to a sessional academic is: it is a junior-rank academic, one who has been given assistant professor status but is not yet tenured.  These days in Canada, average pay for such folks is in the $80,000 range (your mileage may vary based on an institution’s location and prestige).

How much of that $80,000 is specifically for teaching?  Well, within the Canadian academy, there is a rule of thumb that a professor’s time should be spent 40% on teaching, 40% on research and 20% on some hazily-defined notion of service.  So, multiply that out and what you find is that only $32,000 of a new tenure-track prof’s salary is devoted to teaching.

Now break that down per class.   Depending on the institution, a professor is (in theory at least) teaching either four or five semester-long classes per academic year (2/2 or 3/2, in the academic vernacular).  Divide that $32,000 payment for teaching by four and you get $8,000 per one-semester class; divide it by five and you get $6,400.  An “equal work for equal pay” number therefore needs to be somewhere in that range.
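(For anyone who wants the arithmetic spelled out, here is a minimal sketch of that calculation.  The salary, the 40-40-20 split and the course loads are just the illustrative figures from above, not data from any particular institution.)

```python
# Back-of-the-envelope estimate of the "teaching share" of a junior
# tenure-track salary, using the illustrative figures from the post.

ASSISTANT_PROF_SALARY = 80_000   # rough average for a pre-tenure assistant professor
TEACHING_SHARE = 0.40            # the 40-40-20 rule of thumb: 40% of the job is teaching

teaching_pay = ASSISTANT_PROF_SALARY * TEACHING_SHARE   # $32,000

for courses_per_year in (4, 5):  # 2/2 or 3/2 teaching loads
    per_course = teaching_pay / courses_per_year
    print(f"{courses_per_year} courses/year -> ${per_course:,.0f} per one-semester course")

# Output:
# 4 courses/year -> $8,000 per one-semester course
# 5 courses/year -> $6,400 per one-semester course
```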

Here’s what we know about adjuncts’ salaries: in 2014, the Higher Education Quality Council of Ontario published a study on salaries of “non-full-time instructors” in the province.  It showed that sessional instructors’ salaries in 2012/13 ranged from a little under $6,000 per course to a little over $8,000 per course (with inflation, it is likely slightly higher now), with most of the big universities clustered in the low-to-mid $7,000 range.  At a majority of institutions, sessionals also get health benefits and may participate in a pension plan.  In 2013, University Affairs, the in-house publication of Universities Canada, published results of a nine-institution survey of sessional lecturer compensation (see here).  This showed a slightly wider range of compensation rates: at Quebec schools they were comparable to or slightly higher than Ontario rates, while elsewhere they were somewhat below.

To re-cap: if you buy the 40-40-20 logic of professorial pay, most universities in Canada – at least in central Canada – are in fact paying sessionals roughly the same as they are paying early-career tenure-track academics.  In some cases the benefits are not the same, and there may be a case for boosting pay a bit to compensate for that.  But the complaint that sessionals are vastly underpaid for the work they are contracted for?  Hard to sustain.

Sessionals themselves would naturally argue that they do far more than what they are contracted for: they too are staying academically active, doing research, etc.  To which the university response is: fine, but that’s not what we’re paying you for – you’re doing that on your own time.  The fight thus isn’t really about “equal pay”, it’s a fight about the right to be paid for doing research.

And of course OCUFA knows all this.  The math involved is pretty elementary.  It can’t really think these staff are underpaid unless it believes a) that the 40-40-20 thing is wrong and teaching should be a higher % of time and salary (good luck getting that one past the membership), or b) that sessionals need to be paid not on the same scale as assistant profs but on the scale of associates or full profs (again, I would question the likelihood of OCUFA’s membership thinking this is a good idea).

But if neither of those things is true, why does OCUFA use such language?  It’s a mystery worth pondering.

September 19

Counting Sessionals

Much rejoicing last Thursday when Science Minister Kirsty Duncan announced that the federal government was re-instating the funding for the Universities and Colleges Academic Staff System (UCASS), which was last run in 2011.  But what caught most people’s attention was the coda to the announcement, which said that Statistics Canada was going to “test the feasibility” of expanding the survey to include “part-time and public college staff” (the “C” in UCASS stands for colleges in the Trinity College sense, not the community college sense, so despite the name public colleges have never been in the survey).

What to make of this?  It seems that by “part-time” Statscan meant sessionals/adjuncts/contract faculty.  That’s a bit off, because every university I know of makes a very sharp distinction between “part-time” (many of whom are tenured) and “sessionals”.  It makes one worry that Statistics Canada doesn’t understand universities well enough to use the correct terminology, which in turn bodes ill for their future negotiations with universities around definitions.

Because let’s be clear about this: universities will do almost anything they can to throw sand in the gears on this.  They do not want data on sessionals in the public eye, period.  Oh sure, in public the Presidents will welcome transparency, evidence-based decision-making, etc.  But institutional research shops – the ones who will actually be dealing with Statscan on this file – are Olympic champions in shutting down government attempts to liberate information.  In fact, that’s arguably their main purpose.  They won’t actually say no to anything – they’ll just argue relentlessly about definitions until Statscan agrees to a reduced program of data collection.  Statscan knows this is coming – they have apparently allocated four years (!!!) for negotiations with institutions, but the safest guess is that this simply isn’t going to happen.

And to be fair to universities, the kind of data UCASS would provide about sessionals would be pretty useless – a lot of work for almost nothing.  UCASS can count individuals, and track their average salaries.  But average salary data would be useless: it would conflate people teaching one course with people teaching five.  And since UCASS had no way to track workload (you’d actually need to blow up the survey and start again if you wanted to get at workload, and as interesting as that might be, good luck getting universities to green-light it), the data is meaningless.  Knowing the number of sessionals tells you nothing about what proportion of undergraduates are being taught by sessionals.  Are 200 sessionals teaching one course each worse than 100 teaching two courses apiece?  Of course not.  But if raw numbers are the only thing on offer then we’ll ascribe meanings to them where they arguably shouldn’t exist.

You see, “sessionals” are not really a single phenomenon.  Many are professionals who have full-time jobs and like teaching a class on the side, and they’re usually a huge boon to a department (especially in professional faculties like law, nursing and business) because they help expose students to a life beyond academia.  Others are PhD students teaching a class while another professor is away – and thus learning valuable skills.  The “bad” sessionals – the ones people claim to want to stamp out – are the ones who have a PhD and are teaching multiple classes the way professors do.  I suspect this is a pretty small percentage of total sessionals, but we don’t know for sure.  And adding sessionals to UCASS won’t get us any closer to finding out, because even if they wanted to, universities couldn’t collect data on which of their employees have other full-time jobs outside the institution.

Despite all the kumbayahs about how this UCASS expansion is going to promote “evidence-based decision-making”, I’m genuinely having trouble imagining a single policy problem where data from UCASS would make a difference.  Universities already know how many sessionals they employ and whether numbers are going up or down; UCASS might let them know how many sessionals other universities employ, but frankly, who cares?  It’s not going to make a difference to policy at an institutional level.

If you really wanted to know something about sessionals, you’d probably start with requiring institutions simply to provide contact information for every individual with teaching responsibilities who is not tenure-track, along with the amount paid to them in the previous academic year (note: Statscan couldn’t do this because it would never use the Statistics Act to compel data this way.  Provincial governments could do so, however).  Then you’d do a survey of the instructors themselves – number of classes taught, other jobs they have, income from other jobs, etc.  Now I know some of you are going to say: didn’t some folks at OISE do that just recently?  Well, almost.  Yes, they administered almost this kind of survey, but because they weren’t drawing their sample from an administrative database, there’s no way to tell how representative their sample is and hence how accurate their results are.  Which is kind of important.

So, anyways, two cheers for the return of UCASS.  More data is better than less data.  But the effort to get data on sessionals seems like a lot of work for very little practical return even if universities can be brought round to co-operate.

September 16

OECD data says still no underfunding

The OECD’s annual datapalooza-tastic publication Education at a Glance was released yesterday.  The pdf is available for free here.  Let me take you through a couple of the highlights around Higher Education.

For the following comparisons, I show Canada against the rest of the G7 (minus Italy because honestly, economically, who cares?), plus Australia because it’s practically our twin, Korea because it’s cool, Sweden because someone always asks about Scandinavia and the OECD average because hey that just makes sense.  First off, let’s look at attainment rates among inhabitants 25-34.  This is a standard measure to compare how countries have performed in the recent past in terms of providing access to education.

Figure 1: Attainment Rates, 25-34 year-olds, selected OECD countries


*Data for Master’s & above not provided separately for Korea and Japan, and is included in Bachelor’s

Education-fevered Korea is light-years ahead of everyone else on this measure, with 69% of its 25-34 yr old population attaining some kind of credential, but Canada is still close to the top at 59%.  In fact we’re right at the top if you look just at short-cycle (i.e. sub-baccalaureate) PSE (see previous comments here about Canada’s world-leading strengths in College education); in terms of university attainment alone, our 34% is slightly below the OECD average of 36%.

Now let’s turn to finances.  Figure 2 shows total public and private expenditure on Tertiary educational institutions.

Figure 2: Public and Private Expenditures on Tertiary Institutions, as a Percentage of GDP, Selected OECD Countries


Canada spends 2.5% of GDP on institutions, just below the US but ahead of pretty much everybody else, and more than 50% higher than the OECD average.  For those of you who have spent the last couple of years arguing about how great Germany is because of free tuition, and asking why Canadian governments can’t spend money the way Germany does, the answer is that clearly they can.  All they would need to do is cut spending by about 30%.

(If you’re wondering how the UK claims 58% of all money in higher ed comes from government when the latest data from Universities UK shows it to be 25%, the answer, I think, is that this is 2013 data, when only a third of the shift from a mainly state-based university funding system to a mainly student-based funding system had been completed.)

Turning now to the issue of how that money is split between different parts of the tertiary sector, here we see Canada’s college sector standing out again: by some distance, it receives more funding than any other comparable sector in the OECD (with 0.9% of GDP in funding).  The university sector, by contrast, gets only 1.6% of GDP, which is closer to the OECD average of 1.4%.

Figure 3: Expenditure on Tertiary Institutions, by sector, as a Percentage of GDP, selected OECD countries


*US data not available for short-cycle programs; 2.6% is the combined total

Now this is the point where some of you will jump up and say “see, Usher?  We’re only barely above the OECD average! Canadian universities aren’t as well-funded as you usually make out.”  But hold on.  We’re talking % of GDP here.  And Canada, within the OECD, is a relatively rich country.  And, recall from figure 1 that our university attainment rate is below the OECD average, which means those dollars are being spread over fewer students.  So when you look just at expenditures per student in degree-level programs, you get the following:

Figure 4: Annual Expenditures per Student in $US at PPP, Degree-level Programs only, Selected OECD Countries


Again, Canada is very close to the top of the OECD charts here: at just over $25,000 US per student we spend over 50% more per student than the OECD average (and Germany, incidentally – just sayin’).
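To make the mechanism explicit, here is a minimal sketch of the conversion from a GDP share to a per-student figure.  The two “countries” and all the numbers in it are purely hypothetical, chosen only to show why a rich country spreading a middling share of GDP over relatively few students lands near the top of the per-student table:

```python
# Why "% of GDP" and "$ per student" can tell different stories:
# per-student spending = (share of GDP spent) * GDP / number of students.
# All numbers below are hypothetical, for illustration only.

def per_student_spending(gdp_usd_ppp, share_of_gdp, students):
    return gdp_usd_ppp * share_of_gdp / students

# Two imaginary countries spending the same 1.5% of GDP on universities:
rich_few_students = per_student_spending(gdp_usd_ppp=1_600_000_000_000,
                                          share_of_gdp=0.015,
                                          students=1_000_000)
poorer_more_students = per_student_spending(gdp_usd_ppp=1_000_000_000_000,
                                            share_of_gdp=0.015,
                                            students=1_200_000)

print(f"Rich country, fewer students:  ${rich_few_students:,.0f} per student")
print(f"Poorer country, more students: ${poorer_more_students:,.0f} per student")
# ~$24,000 vs ~$12,500: same share of GDP, very different per-student figures.
```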

So, yeah, I’m going to give you my little sermon again: Canada’s university system is not underfunded by any metric that makes the remotest bit of sense.  If we’re underfunded, everyone’s underfunded, which kind of robs the term of meaning.

That doesn’t mean cuts are easy: our system is rigid and brittle, and even slowing down the rate of increase of funds causes problems.  But perhaps if we directed even a fraction of the attention we pay to “underfunding” to the problem of our universities’ brittleness, we might be on our way to a better system.

I won’t hold my breath.

September 15

Innovation Policy: Are Universities Part of the Problem?

We’re talking a lot about Innovation in Canada these days. Especially in universities, where innovation policy is seen as a new cash funnel. I would like to suggest that this attitude on the part of universities is precisely part of Canada’s problem when it comes to Innovation.

Here’s the basic issue: innovation – the kind that expands the economy – is something that firms do. They take ideas from here and there and put them together to create new processes or services that fill a market need in a way that creates value (there’s public sector innovation too but the “creating value” thing is a bit trickier, so we’ll leave that aside for now while acknowledging it exists and matters a lot).

Among the many places the ideas come from are higher education institutions (HEIs). Not necessarily local HEIs: ideas travel, so Toronto firms can grab ideas from universities in Texas, Tromso or Tianjin as well as U of T. The extent to which they will focus on ideas generated locally has to do not only with the quality of the local ideas, but also with the way the ideas get propagated locally. Institutions whose faculty are active and involved in local innovation networks will tend to see their ideas picked up more often than those whose faculty are not, partly because contact with local firms generates “better” scientific questions and partly because they will have more people paying attention to their work.

But ideas are really only a part of what matters in innovation. Does the business climate encourage firms to innovate? What’s the structure of business taxation? What kind of management and worker skill base exists? What regulations impede or encourage innovation? What barriers to competition and new entrants exist? What kind of venture capital is available? Does government procurement work in favour of or against new products or services? All of this matters in terms of helping to set firms’ priorities and put them on a more-innovative or less-innovative path.

The problem is, all this stuff is boring to politicians and, in some cases, requires directly attacking entrenched interests (in Canada, this specifically has to do with protectionism in agriculture, telecoms and banking). It requires years of discipline and trade-offs, and politicians hate discipline and trade-offs. If only there were some other way of talking about innovation that didn’t require such sacrifice.

And here’s where universities step in to enable bad policies. They write about how innovation is “really” about the scientific process. How it’s “really” about high tech industries of the future and hey, look at all these shiny labs we have in Canada, wouldn’t it be great if we had more? And then all of a sudden “innovation” isn’t about “innovation” anymore, it’s about spending money on STEM research at universities and writing cheques to tech companies (or possibly to real estate companies to provide a lot of co-working spaces for startups). Which as far as I can tell seems to be how Innovation Minister Navdeep Bains genuinely approaches his file.

Think I’m exaggerating? Check out this article from Universities Canada’s Paul Davidson about innovation, in which the role of firms is not mentioned at all except insofar as they are not handing enough money to universities. Now, I get it: Paul’s a lobbyist and he’s arguing his members’ case for public support, which is what he is paid to do. But what comes across from that article is a sense that, for universities, l’Innovation, c’est nous. Which, as statements of innovation policy go, is almost Nickelbackian in its levels of wrongness.

I don’t think this is a universal view among universities, by the way. I note SFU President Andrew Petter’s recent article in the same issue of Policy magazine, which I think is much clearer in noting that universities are only part of the solution and even then, universities have to get better at integrating with local innovation networks. And of course colleges, by putting themselves at the more applied end of the spectrum, are inherently aware that their role is as an adjunct to firms.

Universities are a part – a crucial part, even – of innovation systems. But they are a small crucial part. Innovation Policy is not (or should not be, anyway) code for “industrial policy in sci/tech things universities are good at”. It is (or should be) about firms, not universities. And we all need to remember that.

September 14

The Canadian Way of Study Abroad

A few years ago, I think around the time that HESA Towers ran a conference on internationalization, I realized there was something weird about the way Canadian higher education institutions talked about study abroad.  They talked about it as helping students “bridge the gap between theory and practice”, “increasing engagement”, and “hands-on learning”.

That’s odd, I thought.  That sounds like experiential learning, not study abroad.  Which is when it hit me: in Canada, unlike virtually everywhere else in the world, study abroad to a large degree is experiential learning.

In Europe, when they say study abroad, they mostly mean study at a foreign institution in the same field through the Erasmus program.  In the US, they may mean this, or they may mean studying in facilities owned by their home universities but located in different countries.  For instance, Wake Forest owns a campus in Venice, Webster University has a campus in Leiden, University of Dallas has one in Rome (have a browse through this list). Basically, if your students are paying megabucks to be at a US campus, the idea is that you can’t just give them exchange semesters at some foreign public school, because who knows about the quality, the safety, etc.

But look at how Canadian institutions showcase their study abroad: McGill talks up its science station in Barbados.  University of Alberta showcases its international internships.  University of Saskatchewan has a fabulous little nursing program which ties together practicums in East Saskatoon and Mozambique.  The stuff we like to talk about doesn’t seem to actually involve study in the sense of being in a classroom, per se.  That’s not to say our universities don’t have typical study-abroad programs: we’ve got thousands of those.  They’re just not where the sizzle is.  It’s a distinctly Canadian take on the subject.

This brings me to a point about measuring the benefits of study abroad.  Let’s take it for granted that being abroad for a while makes students more independent, outward-looking, able to problem-solve, etc.  What is it, exactly, about being abroad that actually makes you that way?  Is it sitting in classes in a foreign country?  Is it meeting foreign people in a foreign country?  Is it meeting people from your own culture in a foreign country (too often the main outcome of study abroad programs)?  What about if you actually get to work in a foreign country? And – crucially for the design of some programs – how long does it take for the benefits to kick in?  A week?  A month?  A year?  When do diminishing returns set in?

Despite study-abroad being a multi-billion dollar niche within higher education, we actually don’t know the answer to many of these questions.  There isn’t a lot of work done which picks apart the various elements of “study abroad” to look at relative impact.  There is some evidence from Elspeth Jones in the UK that many of the benefits actually kick in after as little as 2-4 weeks, which suggests there may be cheaper ways of achieving all these purported benefits.

Of course, one of the reasons we have no answers to this is that it’s pretty hard to unpack the “treatment” involved in study abroad.  You can’t, for instance, randomly assign people to a program that just sits in class, or force people to make friends among locals rather than among the study-abroad group.  But, for instance, it would be possible to look at impacts (using some of the techniques we talked about yesterday) based on length of study abroad period.  It would be possible to compare results of programs that have students mostly sit in class to ones where they do internships.  It would be possible to examine internships based on whether or not participants actually made friends among local students, a question not asked enough in study-abroad evaluation work.  It would also be possible to examine this based on destination country: are the benefits higher or lower depending on proficiency in the destination country’s language?

These questions aren’t easily answerable at the level of an individual institution – the sample size on these would simply be too small.  But one could easily imagine a consortium of institutions agreeing to a common impact assessment strategy, each collecting and sharing data about students and also collectively collecting data on non-mobile students for comparative purposes (again, see yesterday’s blog), perhaps through the Canadian Undergraduate Survey Consortium.  It would make a heck of a project.

If anyone’s interested in starting a consortium on this, let me know.  Not only would it be fun, but it might help us actually design study abroad experiences in a more thoughtful, conscious and impactful way.  And we’d find out if the “Canadian Way” is more effective than more traditional approaches.

Worth a try, I think.

September 13

Measuring the Effects of Study Abroad

In the higher education advocacy business, an unhappily large proportion of the research used is of the correlation = causation type.  For instance, many claim that higher education has lots of social benefits like lower crime rates and higher rates of community volunteering on the grounds that outcomes of graduates are better than outcomes of non-graduates in these areas.  But this is shaky.  There are very few studies which look at this carefully enough to eliminate selection bias – that is, that the people who go to higher education were less disposed to crime/more disposed to volunteering to begin with.  The independent “treatment” effect of higher education is much more difficult to discern.

This applies in spades to studying the question of the effects of study abroad.  For instance, one widely quoted study of the Erasmus program showed that five years after graduation, unemployment rates for graduates who had been in a study-abroad program were 23% lower than for those who had not.  But this is suspect.  First of all, “23% lower” actually isn’t all that much for a population where unemployment is about 5% (it means one group has unemployment of 4% and the other 5%, more or less).  Second of all, there is a selection bias here.  The study-abroad and non-study-abroad populations are not perfectly identical populations who differ only in that they have been given different “treatments”: they are different populations, one of which has enough drive and courage to pick up sticks and move to another country and (often) study in another language.  It’s quite possible they would have had better employment outcomes anyways.  You can try to limit bias by selecting a control group that mimics the study-abroad population in terms of field of study, GPA, etc., but it’s not perfect and very few studies do so anyway (a very honourable mention here to the GLOSSARI project from Georgia headed by Don Rubin).

(Before we go any further: no, I don’t think employability skills are the only reason to encourage study abroad.  I do however think that if universities and colleges are going to frame their claim for more study abroad in economic terms – either by suggesting students will be more employable or making more general claims of increasing economic competitiveness – then it is incumbent on them to actually demonstrate some impact.  Claiming money on an economic imperative and then turning around and saying “oh, that doesn’t matter because well-rounded citizens” doesn’t really wash.)

There are other ways of trying to prove this point about employability, of course.  One is to ask employers if they think study abroad matters.  They’ll usually say yes, but it’s a leap of faith to go from that to saying that study abroad is actually much of a help in landing a job.  Some studies have asked students themselves if they think their study abroad experience was helpful in getting a job.  The answer is usually yes, but it’s hard to interpret what that means, exactly.

Since it’s difficult to work out directly how well internationalization is helping students get jobs, some people try to look at whether or not students get the skills that employers want (self-discipline, creativity, working in teams, etc).  The problem with this approach, of course, is that the only real way to do this is through self-assessment, which not everybody accepts as a way to go (but in the absence of actual testing of specific skills, there aren’t a whole lot of other options).  Alternatively, if you use a pre-post evaluation mechanism, you can at least check on the difference in self-assessment of skills over time, which might then be attributed to time spent in study abroad.  If that’s still not enough to convince you (if, for instance, you suspect that all students’ self-assessments would go up over the space of a few months, because all students are to some degree improving skills all the time), try a pre-post method with a control group, too: if both groups’ self-assessments go up, you can still measure the difference in the rate at which the self-reported skills increase across the two groups.  If they go up more for study-abroad students than for stay-at-homes, then the difference in the rates of growth can, cautiously, be attributed to the study abroad period.
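Here is a minimal sketch of that last design, the pre-post comparison with a control group.  The skill scores are invented, and the arithmetic is just the difference-in-differences logic described above:

```python
# Minimal difference-in-differences sketch for self-assessed skills,
# using invented numbers purely to illustrate the logic described above.
# Scores are on some self-assessment scale (say, 1 to 10).

groups = {
    # group:         (mean score before, mean score after)
    "study_abroad":  (6.0, 7.4),
    "stayed_home":   (6.1, 6.6),   # control group: everyone improves a bit anyway
}

gains = {g: after - before for g, (before, after) in groups.items()}
did = gains["study_abroad"] - gains["stayed_home"]

for g, gain in gains.items():
    print(f"{g}: gain of {gain:+.1f}")
print(f"Difference-in-differences estimate: {did:+.1f}")

# Both groups improve, but the extra +0.9 for the study-abroad group is the
# part that can (cautiously) be attributed to the study-abroad period,
# subject to all the selection-bias caveats above.
```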

Basically: measuring impacts takes time, and is complicated.  And despite lots of people in Canada avowing how important outbound mobility is, we never seem to take the time, care and expense to do the measurement.  Easier, I suppose, to rely on correlations and hope no one notices.

It’s a shame really because I think there are some interesting and specifically Canadian stories to tell about study abroad.  More on that tomorrow.

September 12

How Many Canadian Students Study Abroad? How Many Should?

If you look at the current issue of Policy Options, there is a startling claim made in the sub-headline of an article by Universities Australia CEO Belinda Robinson; namely, that “five times as many Australian undergraduates are studying abroad as their Canadian counterparts”.  It’s not a claim Robinson herself makes – it seems likely that it’s been added by the editorial staff at Policy Options.  The problem is it’s not correct.

The Canadian numbers come from a periodic survey Universities Canada does of its members on internationalization (the last example of this is here).  The last time they did the survey they found that 2.6% of students did a “for-credit” international experience, and another 0.5% did a non-credit course: total, 3.1%.  That’s if you believe universities can actually keep track of this stuff; my guess is there’s a substantial number who can’t or don’t capture this data well – particularly in those cases where students are arranging foreign experiences on their own, so this number is likely at least a bit of an undercount.

Now, in the aforementioned article, Robinson noted that over 16% of Australian students had an overseas experience of some kind.  Someone, clearly, took that 16%, divided it by the Canadian 3% and voila!  Over 5 times more!  Except this is apples and oranges: the Canadian figure refers to students who go abroad in any given year, while the Australian figure is the percentage who go abroad at some point in their career.  We don’t actually know what percentage of Canadian undergraduates go abroad over the course of their degree.  Back in the days when we at HESA Towers used to run a national student panel, we found that 8% of current students (in the panel, at any rate, which was skewed to upper-year students and hence should come closer to the Australian picture) had had some kind of exchange experience.
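Just to show how different a per-year figure and an over-the-degree figure can be, here is a deliberately crude back-of-the-envelope conversion.  It assumes a constant annual rate and no repeat trips, which is obviously unrealistic (and it will not match the panel’s 8% figure above); the point is only that the two measures aren’t comparable:

```python
# Rough conversion from "% of students abroad in a given year" to
# "% of students who go abroad at some point in a degree".
# Assumes a constant annual rate and no repeat participation: a crude,
# illustrative calculation only, not a real estimate.

annual_rate = 0.031    # the ~3.1% per-year figure from the UC survey
degree_length = 4      # years

cumulative_rate = 1 - (1 - annual_rate) ** degree_length
print(f"Naive over-the-degree rate: {cumulative_rate:.1%}")   # ~11.8%

# Even on this crude arithmetic, comparing a per-year figure (3.1%) against an
# over-the-degree figure (16%) overstates the Australia/Canada gap considerably.
```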

(This is one of those things we could answer pretty easily if Statscan put a question about it in the NGS, or if CUSC put it in their triennial surveys of graduating students.  Hint hint.)

Wonky figures aside, Australia does seem to have been doing something right over the past few years, having quadrupled its out-bound student flow since 2000, and perhaps Canada should be emulating it.  But there is a genuine question here about what the “right” number of students going abroad would be.  What should our target be?  I’ve seen serious commentators say we shouldn’t be looking at Australia, but rather Germany, where something like 30% of students go abroad at some point (actually, it’s that 30% of “upper-year” students have gone abroad, based on a survey of students – on which measure Canada is, as noted earlier, at about 8%).

This strikes me as a bit pie-in-the-sky.  For a German student, the marginal cost of studying elsewhere in the EU is fairly low; over two-thirds of undergraduates in Germany already live alone (source – the excellent Eurostudent website), and they have a host of potentially awesome international destinations within a couple of hundred dollars’ transportation fare by budget airline.  In Canada, a greater percentage of students live with their parents, so the average marginal cost is going to be higher, not to mention the fact that most of the international destinations we care about (we’re inconsistent about whether or not to call the US an “international experience”; mostly we mean Europe and Asia) are a couple of thousand dollars away, and unless you live in Toronto or Vancouver, the costs of living abroad on one’s own are likely to be somewhat higher than they are back here.  So it’s overall a much more expensive proposition for Canadians than for Germans.

And what are the benefits of study abroad?  Anything that might justify the extra expense?  Well, I’ll get into this in some length over the next couple of days, but ask yourself: when’s the last time you heard about a recent graduate getting or losing a job because of having/not having an international experience?  Exactly.  Whatever you might be able to get a corporate exec to say re: the need for global competencies blah blah blah, Canadian employers, be they in the private or public sector, simply don’t seem to care that much about international experiences (lest you think I am being harsh on Canada and its complacency in international affairs, I urge everyone to read Andrea Mandell-Campbell’s book Why Mexicans Don’t Drink Molson.  Tl;dr: too often Canadians believe the hokey “The World Needs More Canada” line when in fact the reverse is usually true).

I think it will be hard to make the financial case for raising our rate of outbound mobility, simply because neither students nor governments will put money into this kind of project if there aren’t clear signals from the labour market that the return will balance the expense.  Instead, study abroad will remain what it too often is now: a holiday somewhere nice.  For all the talk of study abroad as an inter-cultural experience, it is striking how many students take their study abroad in the US, the UK, Australia or France: in cultures not dissimilar from their own.

So how can we measure and sell the benefits of study abroad?  Tune in tomorrow.

September 09

Some Intriguing New UK Access Data

The UK’s Higher Education Statistics Agency (also known in these parts as “the other HESA”) put out an interesting report recently on participation in higher education in England (available here).  England is of course of great interest to access researchers everywhere because its massive tuition hike in 2012 is a major natural policy experiment: if there is no clear evidence of changes in access after a tuition hike of that magnitude, then we can be more confident that tuition hikes elsewhere won’t have much of an effect either (assuming students are all given loans to cover the fees, as they are in England).  I’ve written previously about some of the evidence that has come out to date back here, here, here and here: mostly the evidence has shown little to no effect on low-income students making a direct transition to university, but some effects on older students.

The new (other) HESA report is interesting.  You may have seen the Guardian headline on this, which was that since the change in fees, the percentage of state school students who proceeded to higher education by the age of 19 fell from 66% to 62% in the years either side of the policy change (note: regular state-school students make up a little over 83% of those enrolled in A-level or equivalent courses, with the rest split about equally between selective state schools and independent schools).  On the face of it, that’s a pretty bad result for those concerned about access.

But there are three other little nuggets in the report which the Guardian chose to ignore.  The first was that if you looked simply at those who took A-levels, the drop was much smaller (from 74% to 72%).  Thus the biggest drop was from those taking what are known as “A-level equivalents” (basically, applied A-levels).  The second is that among the very poorest students – that is, those who receive free school meals, essentially all of whom are in the main state sector – enrolment rates essentially didn’t move at all.  They were 21% in 2011/12, 23% in 2012/13 and 22% in 2013/14. All of this is a long way up from 13% observed in 2005, the year before students from families with incomes below £20,000 had to start paying tuition.  Third and last, the progression rate of state school students to the most selective institutions didn’t change at all, either.

So what this means is that the decline was concentrated not on the poor in state schools but on the middle class, and landed more on students with “alternative” credentials.  That doesn’t make a loss of access any more acceptable, but it does put a crimp in the theory that the drop was *caused* by higher tuition fees.  If “affordability” (or perceived affordability) were the issue, why would it hit middle-income students more than lower-income students?  If affordability were the issue, why would it be differentially affecting those taking alternative credentials?  There are some deeper questions to answer here.

 

September 08

Trends in Canadian University Finance

New income and expenditure data on Canadian universities came out over the summer courtesy of StatsCan and our friends over at the Canadian Association of University Business Officers (CAUBO), so today it’s time to check in on the latest financial trends.

In 2014-15, income at Canadian universities was, overall, a record $35.5 billion (just above 2% of GDP, if you’re counting).  That’s up 1% in real terms over the previous year and up 5% on five years ago (2009-10).  But the composition of that income is changing.  Total government income is down 2% in real terms from last year and down 8% from 2009-10 (the latter being somewhat exaggerated because the base year included a lot of money from the 2009 budget stimulus via the Knowledge Infrastructure Program (KIP)).  Income from student fees, on the other hand, was up 5% on the previous year and up 32% from 2009-10, again taking inflation into account.  That doesn’t mean that fee levels increased that much; this is aggregate income, so part (maybe even most) of this change comes from changes in domestic and (more pertinently) international enrollment.
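For anyone wondering what “in real terms” means mechanically, here is a minimal sketch of the inflation adjustment.  The income and CPI figures in it are placeholders chosen to roughly mimic a 1% real increase, not the actual CAUBO/Statistics Canada values:

```python
# How a "change in real terms" is computed: deflate nominal dollars by CPI
# before comparing. The figures below are placeholders for illustration,
# not the actual CAUBO/Statistics Canada values.

def real_change(nominal_then, nominal_now, cpi_then, cpi_now):
    """Percent change after expressing both years in the same (base-year) dollars."""
    real_then = nominal_then                        # base year, already in base-year dollars
    real_now = nominal_now * (cpi_then / cpi_now)   # deflate current-year dollars
    return (real_now / real_then - 1) * 100

# e.g. income grows from $33.0B to $35.5B while prices rise ~6.5% over the period
print(f"{real_change(33.0, 35.5, cpi_then=100.0, cpi_now=106.5):.1f}% real change")
# -> roughly +1.0%: nominal growth of ~7.6% minus ~6.5% inflation.
```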

Figure 1: Change in Real Income by Source, Canadian Universities, 2014-15 vs 2013-14 and 2009-10


Let’s turn now to expenditures by type.  Salary mass for academic staff actually fell slightly last year after inflation, but over five years the overall salary budget for academics is up by 10%, after inflation. Again, this isn’t what’s happening to average salaries, it’s what’s happening to aggregate salaries, so it’s partially a function of average pay and partially a function of staff numbers.  For non-academic salaries, it’s an 11% increase over five years.  And yes, you’re reading that right: labour costs have risen 10% while income has risen only 5%.  Again, that’s exaggerated a bit by fluctuations in incoming funds for capital expenditures, but it’s probably not sustainable in the long term.  Because other elements of the budget are increasing quickly too: for instance, scholarship expenditures rose by 21% over that period to stand at over $1.87 billion.

 Figure 2: Change in Real Expenditures by Type, Canadian Universities, 2014-15 vs 2013-14 and 2009-10


Finally, let’s take a look at expenditures by function within the operating budget.  Operating budgets as a whole are actually up quite a bit – 14% (this is partially offset by falls in the capital and research budgets).  Here’s how the money gets used:

 Figure 3 – Division of Canadian Universities’ Operating Budgets by Expenditure Function, 2014-15


As you’d expect and hope, the lion’s share (57%) of the operating budget goes to instruction and non-sponsored research.  Most of the rest goes on three categories: administration, student services, and physical plant.  Figure 4 shows how growth in each of these areas has differed.

Figure 4: Change in Real Expenditures by Function, Canadian Universities, 2014-15 vs 2013-14 and 2009-10


If you look at the “big four” spending areas, instructional and admin costs rose at roughly the same rate over five years (14% vs. 15%), while student services rose more quickly (21%) and physical plant less so (7%, with a 4% drop in the last year).  Non-credit instruction is up very strongly for reasons I cannot quite fathom.  But look at computing costs (up 31%) and “External Relations” (which includes government relations, alumni relations/fundraising and other marketing costs – up 27%).

In sum: i) government funding is down in real dollars but student income has replaced that income and more besides, so that institutional budgets are still increasing at inflation +1% per year; ii) compensation budgets (academic and non-academic alike) are rising faster than income, which is a problem for the medium-term and iii) there are a lot of small-ish budget items that are growing much more quickly than salaries (scholarships, computing, student services etc.) but given that compensation is 60% of the total budget, that’s still where the majority of the restraint needs to happen.
