Higher Education Strategy Associates

Category Archives: Colleges and Polytechnics

February 01

Loving It

Back in the summer you may have heard a bit of a brouhaha about a deal signed between Colleges Ontario and McDonald's, allowing McDonald's management trainees to receive advanced standing in business programs at Ontario colleges.  If you read the papers, what you probably saw was a he-said/she-said story in which someone from Colleges Ontario said something like "Ontario colleges are providing advanced credit for people who have been through a McDonald's management training program and that's a good thing for access" and someone from the Ontario Public Service Employees Union (OPSEU) said something like "Corporate education McDonald's bad!"

This should have been an unequivocally good news story.  It is a travesty that it was not.  Here’s the real story.

McDonald's, a company which employs around 400,000 people directly and whose franchisees employ another 1.5 million, runs one of the largest internal corporate training programs in the world.  That's not just the famous training center known as Hamburger University in Illinois, which is mainly for mid-management and executive development: the company also has training centers in various locations around the world providing training programs for restaurant managers and crews.  While not many young employees stay at McDonald's very long (turnover is something like 150% per year), a small fraction do stick with it to become managers.  And those who do receive a substantial education through the company in how to run a business.

Now, if you believe in the principles of prior learning recognition, you'll recognize that this situation is a slam-dunk case for creating a standardized system of assessment to award credit.  Assessing prior knowledge can be a right mess; assessing knowledge gained through work experience (paid or unpaid) or through other forms of informal or non-formal learning in a way that maps onto some kind of credit or credential system is time-consuming and inexact.  But this situation is different.  With McDonald's, there's an actual written-down curriculum that can be used to do the curriculum mapping.  This is – comparatively – easy-peasy.

So what happened prior to last summer was that McDonald's approached Colleges Ontario to try to work out such an arrangement.  Both sides had previous experience in doing something similar: McDonald's had worked out a similar agreement in British Columbia with BCIT, and Fanshawe College had led a national process to do an analogous type of curriculum mapping with the Canadian military, allowing soldiers and veterans to count various parts of their training programs towards college credentials.  Faculty and admin representatives from all 24 colleges agreed on the parameters of the deal, then allowed a smaller technical group to work on mapping all the elements of McDonald's coursework up to the Second Assistant Manager level of training onto the common (Ontario) college standard outcomes for the Business Administration diploma.  At the end of it, it was decided that one was more or less equivalent to the other, and so individuals who had reached Second Assistant Manager could automatically get a year's worth of credit (there's no partial credit for having completed some McDonald's training: this is an all-or-nothing deal).

So what are the criticisms?  Basically, they amount to:

  1. College-level courses need to be taught by college teachers in a college atmosphere
  2. McDonald's is a big evil corporation.  And anyway, why McDonald's?  Why not others?
  3. Why isn’t mapping available publicly?

The first argument, taken to its logical conclusion, essentially says that PLAR is illegitimate because no knowledge derived from outside the classroom can possibly count.  Presumably people who believe this also think the mapping arrangements for Armed Forces training are a complete scandal.

The second…well, if that's your belief, I suppose there is no shaking it.  As for why McDonald's – it's because they asked.  And they had a hell of a well-documented curriculum to present to Colleges Ontario.  Presumably similar deals are open to other businesses, but no one (to my knowledge) has asked.  As for the third, it's clear why the mapping isn't public: McDonald's treats the curriculum of its courses as corporate intelligence – as it has every right to do – and doesn't want it published for the world to see.  One could make the argument that a decision involving credits at public institutions needs to be fully in the public domain.  But, one, that would mean that virtually every program at an Ontario university is suspect (just try finding curriculum maps or un-redacted program evaluations online and see how many are publicly available) and, two, faculty co-ordinators responsible for Business Administration from all 24 institutions (all of whom are OPSEU members, incidentally) all saw the detailed curriculum in confidence and signed off on the deal, which seems like a reasonable saw-off.

In short, this is a good deal.  If we want to promote life-long learning and increase prior learning recognition, we need more of these deals, not fewer.  Bravo to everyone involved.

January 18

More Bleak Data, But This Time on Colleges

Everyone seems to be enjoying data on graduate outcomes, so I thought I’d keep the party going by looking at similar data from Ontario colleges. But first, some of you have written to me suggesting I should throw some caveats on what’s been covered so far. So let me get a few things out of the way.

First, I goofed when saying that there was no data on response rates from these surveys. Apparently there is and I just missed it. The rate this year was 40.1%, a figure which will make all the economists roll their eyes and start muttering about response bias, but which anyone with field experience in surveys will tell you is a pretty good response for a mail survey these days (and since the NGS response rate is now down around the 50% mark, it’s not that far off the national “gold standard”).

Second: all this data on incomes I've been giving you is a little less precise than it sounds. Technically, the Ontario surveys do not ask for income; they ask for income ranges (e.g. $0-20K, $20-40K, etc). When data is published either by universities or the colleges, this is turned into more precise-looking figures by assigning the mid-point value of each range and then averaging those points. Yes, yes, kinda dreadful. Why can't we just link this stuff to tax data like EPRI does? Anyways, that means you should probably take the point values with a pinch of salt: but the trend lines are likely still meaningful.
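For the curious, here's roughly what that mid-point calculation looks like – a minimal sketch in Python, using the bracket boundaries mentioned above but purely hypothetical respondent counts:

```python
# A minimal sketch of the mid-point method described above.
# Bracket counts are hypothetical, for illustration only.
brackets = {
    (0, 20_000): 120,       # respondents reporting $0-20K
    (20_000, 40_000): 340,  # $20-40K
    (40_000, 60_000): 210,  # $40-60K
    (60_000, 80_000): 55,   # $60-80K
}

# Assign every respondent the mid-point of their range, then average.
total = sum((lo + hi) / 2 * n for (lo, hi), n in brackets.items())
respondents = sum(brackets.values())
print(f"estimated mean income: ${total / respondents:,.0f}")  # ~$35,517
```

The point being: every respondent in the $20-40K bracket counts as exactly $30K, no matter where in the range they actually sit, which is why the point estimates wobble but the trends hold up.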

Ok, with all that out of the way, let's turn to the issue of colleges. Unfortunately, Ontario does not collect or display data on college graduates' outcomes the way it does for universities. There is no data on income, for instance. And no data on employment 2 years after graduation, either. The only real point of comparison is employment 6 months after graduation, and even this is kind of painful: for universities the data is available only by field of study; for colleges, it is only available by institution. (I know, right?) And even then it is not calculated on quite the same basis: the university figure includes graduates with job offers while the college one does not. So you can't quite do an apples-to-apples comparison, even at the level of the sector as a whole. But if you ignore that last small difference in calculation and focus not on the point-estimates but on the trends, you can still see something interesting. Here we go:

Figure 1: Employment Rates 6 months after Graduation, Ontario Universities vs. Ontario Colleges, by Graduating Cohort, 1999-2015


So, like I said, ignore the actual values in Figure 1 because they're calculated in two slightly different ways; instead, focus on the trends. And if you do that, what you see is that (a blip in 2015 apart) the relationship between employment rates in the college and university sectors looks pretty much the same throughout the period. Both had a wobble in the early 2000s, and then both took a big hit in the 2008 recession. Indeed, on the basis of this data, it's hard to make a case that one sector has done better than the other through the latest recession: both got creamed, and neither has yet recovered.

(side point: why does the university line stop at 2013 while the college one goes out to 2015? Because Ontario doesn’t interview university grads until 2 years after grad and then asks them retroactively what they were doing 18 months earlier. So the 2014 cohort was just interviewed last fall and it’ll be a few months until their data is released. College grads *only* get interviewed at 6 months, so data is out much more quickly)

What this actually does is put a big dent in the argument that the problem for youth employment is out-of-touch educators, changing skill profiles, sociologists vs. welders and all that other tosh people were talking about a few years ago. We're just having more trouble than we used to integrating graduates into the labour market. And I'll be taking a broader look at that using Labour Force Survey data tomorrow.

November 24

Who’s More International?

We sometimes think about international higher education as being “a market”. This is not quite true: it’s actually several markets.

Back in the day, international education was mostly about graduate students; specifically, at the doctoral level. Students did their "basic" education at home and then went abroad to get research experience or simply to emigrate and become part of the host country's scientific structure. Nobody sought these students for their money; to the contrary, these students were usually getting paid in some way by their host institution. They were not cash cows, but they did (and still do) contribute significantly to their institutions in other ways, primarily as laboratory workhorses.

In this market, the United States was long the champion, since its institutions were the world's best and could attract top students from all over the world. In absolute terms, it is still the largest importer of doctoral students. But in percentage terms, many other countries have surpassed it. Most of them, like Switzerland, are pretty small, so modest absolute numbers of international students nevertheless make up a huge proportion of the student body (in this case, 55%). The UK and France, however, are both relatively large markets, and despite their size they now lead the US in terms of the percentage of doctoral students who are international (42% and 40%, vs. 35%). Canada, at 27%, is right around the OECD average.

Figure 1: International Students at Doctoral Level as Percentage of Total

Let's turn now to Master's students, who most definitely *are* cash cows. Master's programs are short degrees, mainly acquired for professional purposes, and thus people are prepared to pay a premium for good ones. The biggest markets here are in fields like business, engineering and some social sciences. Education could be a very big market for international Master's students but tends not to be, because few countries (or institutions, for that matter) seem to have worked out the secret for international programs in what is, after all, a highly regulated profession. In any case, this market segment is where Australia and the UK absolutely dominate, with 40% and 37% of their students being international. Again, Canada is a little bit better than the OECD average (14% vs. 12%).

Figure 2: International Students at Master’s Level as Percentage of Total

Figure 3 turns to the market which is largest in absolute terms: undergraduate students. Percentages here tend to be smaller because domestic undergraduate numbers are so large, but we're still talking about international student numbers in the millions. The leader here is – no, that's not a misprint – Austria at 19% (roughly half of them come from Germany – for a brief explainer see here). Other countries at the top will look familiar (Great Britain, New Zealand, Australia), and Canada doesn't look too bad at 8% (which strikes me as a little low) compared to an OECD average of 5%. What's most interesting to me is the US number: just 3%. That's a country which – in better days anyway – has an enormous amount of room to grow its international enrollment, and if it hadn't just committed an act of immense self-harm it would have been a formidable competitor for Canada for years to come.

Figure 3: International Students at Bachelor’s Level as Percentage of Total


Finally, let's look at sub-baccalaureate credentials, or as the OECD calls them, "short-cycle" programs. These are always a little bit complicated to compare because countries' non-university higher education institutions and credentials are so different. Many countries (e.g. Germany) do not even have short-cycle higher education (they have non-university institutions, but these still give out Bachelor's degrees). In Canada, obviously, the term refers to diplomas and certificates given out by community colleges. And Canada does reasonably well here: 9% of students are international, compared to 5% across the OECD as a whole. But look at New Zealand: 24% of their college-equivalent enrollments are made up of international students. Some of those will be going to their Institutes of Technology (which in general are really quite excellent), but some will also be students from various Polynesian nations coming to attend one of the Maori Wānanga.

Figure 4: International Students in Short-Cycle Programs as Percentage of Total


Now if you look across all these categories, two countries stand out as doing really well without being either of the “usual suspects” like Australia or the UK. One is Switzerland, which is quite understandable. It’s a small nation with a few really highly-ranked universities (especially ETH Zurich), is bordered by three of the biggest countries in the EU (Germany, France, Italy), and it provides higher education in each of their national languages. The more surprising one is New Zealand, which is small, has good higher education but no world-leading institutions, and is located in the middle of nowhere (or, at least, 5000 miles from the nearest country which is a net exporter of students). Yet they seem to be able to attract very significant (for them, anyway) numbers of international students in all the main higher education niches. That’s impressive. Canadians have traditionally focused on what countries like Australia and the UK are doing in international higher education because of their past track record. But on present evidence, it’s the Kiwis we should all be watching, and in particular their very savvy export promotion agency Education New Zealand.

Wellington, anyone?

June 15

A Canadian Accomplishment

Often, I think, I am seen as a bit of a downer on Canada.  It goes with the territory: my role in Canadian higher education is i) "the guy who knows what's going on in other countries" and ii) "the guy who pokes the bear".  So I frequently end up writing blogs asking why isn't Canada doing X, or wouldn't it be great if we were more like Y, and people get the impression I'm down on the North.

Not true.  I think we have a pretty good system, one most of the world would envy if we could ever stop admiring our minute inter-provincial differences and explain our system properly.  Among OECD countries, we’re always in the top third of pretty much any higher education metric you want to use.  Never at the very top, but reasonably close.  It’s just that it’s not cheap, is all.  We’re never going to win any prizes for efficiency; countries like Israel, the Netherlands and Australia perform far better on those metrics.

But there is one area in which Canada does a fantastic job and doesn’t even realise it.  And that is the extent to which it has a strong culture of work-oriented higher education which is matched by few other countries.

Let's start with our colleges and polytechnics, which for the most part deliver labour market-oriented professional education at a level known by UNESCO and the OECD as "Type 5B" (bachelor's degree programs are called "Type 5A").  Among OECD countries, only in Japan and Korea does a greater proportion of young people have this kind of education.

Figure 1: Level 5 (post-secondary education) Attainment Rates of 25-34 year olds, Select OECD Countries


We sometimes hear complaints from colleges and polytechnics about not getting enough respect, but the fact is, Canada has arguably the best-funded and most successful non-university post-secondary education system in the world.  We should say it, and celebrate it.

What about the university system, you say?  Well, the University of Cincinnati may have invented co-op education, but I don’t think there’s much doubt that the University of Waterloo perfected it.  Last time I checked, they were arranging over 17,000 co-op experiences for students every year.  And institutions across the country have adopted the idea as well.  Personally, I think that’s a result of competition from our excellent college sector: it keeps universities on their toes.

 

And OK, it's easy to scoff at university claims that 40% of students get some kind of work-integrated learning experience, because so many of these experiences are short-term and of not particularly high quality, and because at least a few universities seem to care more about classifying as many things as possible as "experiential" than about actually creating more such experiences: but so what?  The fact that we're having the debate at all suggests we are on the right track.  And that's a sight better than most other countries I could name.

Now, I know some of you are going to say "but Germany! Switzerland! Apprentices!".  And there are some admirable things about those systems (though, as I have said before, Canadians deeply misunderstand what it is that apprenticeships in Germany actually do).  Namely, they aren't post-secondary in nature (note how low Germany's Type B score is in the figure above); rather, they're part of the secondary system and in many ways are designed to keep people out of the post-secondary system.  It's hard to compare our system to theirs.

So, in sum: could we do more on experiential and work-integrated learning?  Of course we could (and should).  But stop and smell the roses: compared to most places, we do a pretty good job on this stuff.  And we should acknowledge that to ourselves even if, in true Canadian fashion, we’re a little reluctant to say so to anyone else.

June 01

Early Results from the Tennessee “Free Tuition” Experiment

You may remember a blog I wrote last year concerning something called the Tennessee Promise.  Described by some as a "free tuition" program, what it essentially did was ensure that every Tennessee student enrolled in a Tennessee community college received student aid at least equal to tuition.  In the fall, the state touted that first-year, direct-from-high-school enrollments in Tennessee colleges had increased by fourteen percent.  Now, however, some more complete data is available in the form of the state's annual higher education factbook, which allows us to look a little more deeply at what happened.

What the numbers show is something a little bit weird.  If we look just at direct from-public-high-school-to-community-college/college-of-technology transitions, the numbers are actually much better than initially advertised.  In 2014, this number was 13,527; in 2015 it was 17,550, an increase of nearly 30%.  That's quite astonishing.

However, not all of this jump in enrollment at colleges came from “new” students.  To a considerable degree, the jump in the number of community-college bound students came from cannibalizing students who would otherwise have attended 4-year colleges, as shown in the figure below.

Figure 1: In-State Public High School Graduate Enrolment by System, Fall 2011-Fall 2015


So, Community College and College of Applied Technology enrollment rose by about 4,000, but enrollments in 4-year colleges fell by 2,000, meaning effectively that half the growth came from people switching from other types of higher education. Still, net growth in enrollments at all levels was about 2,000, or 6.8%, which is pretty impressive given that growth in the three previous years combined was only about 4%.  It sure seems like there is something positive going on here.  But what?

Well, free tuition promoters would have you believe that what's happening here is a rush of previously-excluded poor students suddenly attending because education is more affordable.  Unfortunately, we can't directly check students' socio-economic backgrounds, so we can't know for sure who's responding to these lower net prices.  However, because the factbook shows transition rates by county, we can look at different enrollment responses by county median household income.  Figure 2 plots the percentage increase in enrollments in each of Tennessee's 95 counties against their median household incomes.

Figure 2: Percentage increase in college-going rate, Tennessee 2015 over 2014 by County, vs County Median Household Income


Pretty clearly, there's no relationship here, which at face value suggests that participation rates of students from poor counties did not increase any faster than the rates of students from richer counties.  But that's not quite right.  Remember we are looking at percentage increases, and poorer counties tend to have lower participation rates.  Therefore, in order for the percentage increase to be the same in richer and poorer counties, the percentage-point increase actually has to be larger in richer counties.  (Think about it: a 10% rise for a county with a 30% participation rate is 3 percentage points; for a county with a 60% participation rate, a 10% rise requires a jump of 6 percentage points.)
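If the percentage-vs-percentage-point distinction is easier to see in code, here's the arithmetic from that parenthetical, using the two participation rates quoted above:

```python
# Equal *percentage* growth implies bigger *percentage-point* jumps
# in counties that start with higher participation rates.
for base_rate in (0.30, 0.60):       # the two participation rates used above
    new_rate = base_rate * 1.10      # a uniform 10% relative increase
    jump = (new_rate - base_rate) * 100
    print(f"{base_rate:.0%} county: +{jump:.0f} percentage points")
# 30% county: +3 percentage points
# 60% county: +6 percentage points
```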

So, a pure, unsophisticated, simple-stupid pre-post analysis of the Tennessee data suggests that the Tennessee Promise appears to have i) caused a 30% increase in 2-year college-going rates among high school graduates, half of which was diverted from other types of higher education, and ii) caused a 6.8% overall increase in transitions to all forms of college, though this increase did not come primarily from increases in the college-going rate of students from poorer counties.

Make no mistake, this is still a very good outcome for a program that only costs $14 million per cohort per academic year; it works out to $7,000 per new student added to the post-secondary system, which is pretty cheap.  Nevertheless, it's worth noting that those benefits don't necessarily seem to accrue to youth from poorer backgrounds.
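(That cost-per-student figure is just the program cost divided by the net new students; a quick back-of-envelope check, using the figures cited above:

```python
# Back-of-envelope check on the cost-effectiveness claim above.
program_cost = 14_000_000    # ~$14M per cohort per academic year
net_new_students = 2_000     # net increase in transitions, per Figure 1 discussion
print(f"${program_cost / net_new_students:,.0f} per new student")  # $7,000
```

Note this is the cost per *net* new student; the gross number of students receiving aid is of course much larger.)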

 

May 27

Three Unconnected Thoughts on PSE and Aboriginal Peoples

1) Changing Disciplines

In the last five years or so, I’ve seen a real change in the way Aboriginal students are moving through the country’s PSE system.  For a whole number of reasons, aboriginal students were traditionally concentrated either in humanities disciplines like history and sociology, or they were in disciplines which led to careers in social services or direct band employment (child care, police foundations, education, nursing).  STEM and Business fields simply weren’t in the picture.  That’s changed substantially over the past few years.  Aboriginally-focussed business programs are popping up all over the place.  Increasingly, we are seeing enrolments in STEM (though there is still a long way to go).  So what’s changed?

A couple of things, I think.  First, the demographics of First Nations students are changing.   Time was, a very high proportion of aboriginal students came in after quite a period out of school, typically in their mid-20s.  Nowadays, we are seeing a lot more students transition at an earlier age, direct from high school (and more often than not from urban, mainstream high schools).  On average, this background prepares them better for PSE than graduation from on-reserve schools. Hence, they tend to be applying for and getting access to more selective courses.

But this raises the question: what's behind this shift at the secondary level?  A lot of it is demographics.  A greater proportion of First Nations youth are living in urban areas, and so on average they have better access to better schooling.  Drop-out rates are still high and there is much to be done to improve inner-city schools, but, conditional on completing high school, First Nations graduates seem about as prepared as mainstream students to deal with the rigors of PSE.

Another important factor here is the aging of the last generation to have experienced residential schools.  Parents pass on their views of education to their children; unsurprisingly, those who had been through residential schools weren’t always inclined to encourage their children to invest a lot of their identity in schooling. On top of poverty, racism, etc., this probably had a lot to do with low aboriginal participation rates until fairly recently.  But most residential schools closed in the 1970s; so most of the kids now coming through the system are the grandchildren of the last residential schools generation.  Soon it will be the great-grandchildren.  The bad memories of residential schools are by no means gone, but they are of less relevance in terms of pre-disposition to invest in schooling, and that matters.

Finally, there’s the money issue.  Institutions are finding it a whole lot easier now to raise funds for aboriginal scholarships or other focussed initiatives than they used to.  And that certainly improves the quality of the aboriginal student experience, which probably contributes to improved completion rates.

2) Money

People are rightly getting peeved at the federal government for not having come through on its promise to add $50 million in funding for First Nations education through the Post-Secondary Student Support Program (PSSSP).  I expect that's a promise the Liberals will try to fulfill next year or the year after (there may be a delay as the Feds ponder the implications of the Daniels decision, which puts Metis Canadians on the same legal footing as First Nations vis-à-vis the federal government).

But what people haven't remarked on is the huge boost in funding that First Nations students could receive should they sign up for federal and provincial student aid.  In Ontario, virtually all on-reserve students will be eligible for $9,000 in grants through the new Ontario Student Grant; elsewhere, they will be eligible for at least $3,000 through the improved Canada Student Grant.  If First Nations make their students apply for this aid before applying to their bands for PSSSP, then all their students will have at least some base amount of funding.  That would mean bands wouldn't need to give as much to each individual student, and could use the freed-up funds to provide aid to more students, thus alleviating the well-known waiting-list problem.  But that would take a bit of organization to make sure band educational counselors know how to help their students navigate the federal/provincial aid system.  Something our friends at Aboriginal Affairs might want to think about.
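To see why the stacking matters, here's an illustrative sketch.  The $9,000 grant is the Ontario figure cited above, but the band budget and per-student cost are hypothetical numbers picked purely to show the mechanics:

```python
# Illustrative only: how stacking federal/provincial grants could stretch
# a band's PSSSP budget across more students.
band_budget = 300_000       # hypothetical annual PSSSP allocation
cost_per_student = 15_000   # hypothetical full cost of supporting one student
grant = 9_000               # Ontario Student Grant amount cited above

students_without_stacking = band_budget // cost_per_student
students_with_stacking = band_budget // (cost_per_student - grant)
print(students_without_stacking, "students funded without stacking")  # 20
print(students_with_stacking, "students funded with stacking")        # 50
```

Under these (made-up) assumptions, the same budget supports more than twice as many students – which is the whole waiting-list argument in miniature.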

3) Truth & Reconciliation

Since the release of Justice (now Senator) Murray Sinclair's report last year, some Canadian post-secondary institutions have made some extremely useful gestures towards reconciliation, like requiring all students to take at least one class in aboriginal culture or history.  Which is great, except it's not actually what Sinclair asked for.  Rather, he asked that students in specific professional programs (i.e. health and law) be required to take courses in Aboriginal health and law, respectively.  As I said at the time, I thought this was a stretch and that prestigious law programs would resist it (quietly and passive-aggressively, of course).

It’s been a year now – and to my knowledge (everybody please correct me if I am wrong) – no university law or medical school has adopted this proposal.  I wonder how long before this becomes an issue?

May 02

What’s Going On With College Graduates in Ontario?

I see that Ken Coates and Bill Morrison have just written a new book called Dream Factories: Why Universities Won't Solve The Youth Jobs Crisis.  I haven't read it yet, but judging by the title I'd assume it makes pretty much the same argument Coates made back in this 2015 paper for the Canadian Council of Chief Executives, which in effect was "fewer university students, more tradespeople!" (my critique of that paper is here).

With the fall in commodity prices, it's an odd time to be making claims like this (remember when we had a Skills Gap?  When's the last time you heard that phrase?).  There's no evidence based on wage data that trades-related occupations are experiencing greater growth than those in the rest of the economy – since 2007, wages in these occupations have grown at exactly the same rate as the overall economy.  True, occupations in the natural resource sector did experience higher-than-average growth between 2010 and 2014, but unsurprisingly they underperformed the rest of the economy in 2015 (see Figure 1).  More to the point, perhaps, these jobs aren't a particularly large sector of the economy – if you exclude the mostly seasonal agricultural harvesting category, Canada only has about 265,000 workers in this field.  That's less than 1.5% of total employment.

Figure 1: Real Wage Increases by Occupation, Canada, 2007-2015, 2007=100


Source: CANSIM

More generally, though, the assumption of Coates and those like him is that in the "new" post-crisis economy, college graduates have qualitatively different (and better) outcomes than university graduates.  But a quick look at the actual data suggests this isn't the case.  Figure 2 shows employment rates 6 months after graduation for college graduates in Ontario over the past decade.  It turns out college graduates have experienced more or less the same labour market as university graduates: an almighty fall post-Lehman Brothers, and no improvement thereafter.

Figure 2: Employment Rates of College Graduates, Ontario, 2005-2015


Source: Colleges Ontario Key Performance Indicators

The decline in employment rates can’t really be described as a regional phenomenon, either.  There is not a single college which can boast better employment rates today than it had in 2008: most have seen their rates fall by between 4 and 7 percentage points.  The worst performer is Centennial College, where employment rates have fallen by 13 percentage points; one wonders whether Centennial’s performance has something to do with the very rapid growth in the number of international students it has started accepting in the last decade.

Figure 3: Change in Employment Rates 2008-2015


Source: Colleges Ontario Key Performance Indicators

So what's going on here?  Is there something that's changed in college teaching?  Is it falling behind the times?  Well, not according to employers.  Satisfaction rates among employers stayed rock-solid over the period when employment rates fell; and although there has been a slight decline in the last couple of years, the percentage saying they are satisfied or very satisfied remains over 90%.  Graduate satisfaction fell a bit during the late 00s when employment rates fell, but it too remains very close to where it was pre-crisis.

Figure 4: Employer & Student Satisfaction Rates for College Graduates, Ontario, 2005-2015


Source: Colleges Ontario Key Performance Indicators

My point here is not that colleges are "bad" or universities are "better".  Rather, my point is that if you measure the success of any part of the post-secondary system exclusively by employment rates, then you're basically hostage to economic cycles.  Some parts of the cycle might make you look good and others might make you look bad; regardless, it's largely out of your hands.  So maybe we should stop focusing so much on this.  And we should definitely stop pretending colleges and universities are different in this respect.

April 27

Comparing Per-Student University Expenditures by Category (2)

This is part 2 of a two-parter on how Canadian universities spend their money.  All the stuff about what data I'm using, caveats thereto, etc., is available in yesterday's post.  If you missed yesterday, go catch up here.

Before going further, two small mea culpas from yesterday.  First, due to a cut/paste error, part of the data on student services that went out yesterday was slightly off; it has now been corrected on the website.  Second, I neglected to mention that the student services figures include money from operating budgets for grants and bursaries, which accounts for some of the wide differences between institutions.  Sorry.

OK, onwards.  Let's focus first on the two spending categories we didn't take a look at yesterday; namely, "Administration" (meaning, mostly, central administration) and "External Relations" (meaning mostly government relations and fundraising).  This is shown below in Table 1.

Table 1: Per-Student Expenditure, Selected Categories of Non-Academic Activity


A couple of obvious points here:

  • Compared to the spending categories we looked at yesterday, the gaps between the 75th and 25th percentiles are smaller (in other areas, the gap was usually 2:1; in these categories it is closer to 3:2).  This suggests that, on the whole, institutional spending patterns vary less in these central admin functions than they do in areas like libraries and ICT.
  • On the other hand, the institutions at the top and bottom of the range are much bigger outliers.  At the high-cost end, there are probably two things going on.  First, some tasks have to be done no matter what the size of the university, so small institutions tend to look expensive on a per-student basis (for example: a $400,000 p.a. President at a school with 40,000 students is $10/student; a $200,000 p.a. President at a university with 2,000 students is $100/student – see the sketch after this list).  Second, recall that the "central administration" category does vary a bit from school to school, and so some of this may be about oddities in reporting.
  • Most of the schools that spend the least on "external relations" are part of the UQ system.  Basically, when you're that close to being 100% government-funded and controlled, you don't lobby or look for external money, so your costs go down.
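Here's the fixed-cost arithmetic from the first bullet as a quick sketch (the salary figures are the examples used above; nothing is implied about any particular institution):

```python
# Identical fixed costs loom much larger per student at small schools.
def per_student(fixed_cost: float, enrolment: int) -> float:
    """Spread a fixed cost over an institution's student body."""
    return fixed_cost / enrolment

print(per_student(400_000, 40_000))  # 10.0  -> $10/student at a big school
print(per_student(200_000, 2_000))   # 100.0 -> $100/student at a small one
```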

Table 2 puts together all the data from the different expenditure categories.

Table 2: Per-Student Expenditure, all non-academic categories


Three major points here:

  • The per-student costs at very small universities are really stratospheric.  Universities clearly have some fixed base costs that require large student numbers in order to make them bearable.  From a public policy perspective, that makes it important either to ensure institutions are a minimum size, or to have funding formulas provide a base amount for fixed costs in addition to per-student funding.
  • Keeping a rein on non-academic costs matters.  The difference in costs between an institution at the 75th percentile of overall non-academic costs and a 25th percentile institution is $2,950, or pretty close to half a year's worth of average tuition at a Canadian university.  That's a lot of money which could be used for other purposes (or cut in order to provide cheaper education, though that wouldn't be my choice).
  • Actually, it's even more than that.  If an institution could emulate the spending of the 25th percentile institution in each individual category – that is, a library cost like UQAM's ($509/student), an ICT cost like Carleton's ($508/student), physical plant costs like Laval's ($1,331/student), student services costs like Winnipeg's ($958/student), administration costs like St. Thomas' ($1,604/student) and external relations costs like Manitoba's ($285/student) – it would have total non-academic costs of just $5,195: that is, $3,800 less than the 75th percentile institution and $2,200 less than the median one (the sketch below checks the addition).
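For anyone who wants to check my addition, the category minimums cited in that last bullet really do sum to $5,195:

```python
# Re-adding the best-in-category (25th percentile) figures cited above.
best_in_category = {
    "Library (UQAM)": 509,
    "ICT (Carleton)": 508,
    "Physical plant (Laval)": 1_331,
    "Student services (Winnipeg)": 958,
    "Administration (St. Thomas)": 1_604,
    "External relations (Manitoba)": 285,
}
total = sum(best_in_category.values())
print(f"total non-academic spend: ${total:,}/student")  # $5,195/student
```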

But of course, one might protest: does anyone really want to be in the 25th percentile of spending on this stuff?  Don't great universities spend a lot of money on these functions?  Isn't spending more on things like Libraries and ICT a sign of quality?

Well, maybe.  To some extent, you get what you pay for.  But welcome to the central paradox of university management: you can’t simultaneously demand prudence and excellence if the only indicator of greatness is how much money you spend.  It’s why outcome metrics matter; and why those who oppose them, in the end, simply promote waste.

April 26

Comparing Per-Student University Expenditures by Category (1)

Just for giggles the other day, I took a look at Canadian university expenditures in 2013-14 using (as usual) the CAUBO/Statscan Financial Information of Universities and Colleges Survey.  I looked at operating expenditures by category.  Then I normalized them per FTE student.  And I got some very weird results which I thought I would share with y’all.

What I am going to do in this series is show you the results for the main categories of expenditure which are “non-academic”.  I am not going to look at the categories known as “instruction and non-sponsored research” or “non-credit instruction”, because those vary significantly according to the mix of disciplines offered at an institution.  Instead, today I am going to restrict myself just to the categories “Library”, “Computing”, “Physical plant”, and “Student Services”; tomorrow I will  look at the more complicated cases of “Administration” (meaning central administration), and “External Relations” (meaning both government/public affairs and fundraising/alumni relations).

(btw – the enrolment data used for the per-FTE normalization is from 2011-12, because we haven't updated our PSIS file lately.  The numbers presented here are therefore a bit dated, but the basic picture hasn't changed.)

The following table shows the key elements of the comparison.  The intriguing thing here is that institutions actually seem to have very different patterns of spending.  In all four categories, per-student spending at an institution at the 75th percentile is twice what it is at the 25th percentile.  I'm not sure I would go so far as to say that institutions are using different strategies of non-academic spending to meet their missions – it's not clear that these spending variations are occurring in a conscious manner – but it is certainly true that institutions are exhibiting quite different patterns of spending.

Table: Per-Student Expenditure by Category (Library, Computing, Physical Plant, Student Services)
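For those who want to replicate the comparison, the computation is just a per-FTE normalization followed by percentiles.  Here's a sketch with entirely made-up numbers (the real inputs are the FIUC expenditure lines divided by PSIS FTE counts):

```python
# Sketch of the 75th/25th percentile comparison. All numbers are hypothetical.
import pandas as pd

# rows = institutions; values = $ per FTE student, by category
df = pd.DataFrame({
    "library":          [500, 600, 700, 750, 900, 1_100, 1_400],
    "computing":        [300, 420, 508, 650, 800, 900, 1_200],
    "physical_plant":   [1_331, 1_500, 1_700, 2_000, 2_400, 2_800, 3_500],
    "student_services": [958, 1_100, 1_300, 1_500, 1_800, 2_200, 2_600],
})

quartiles = df.quantile([0.25, 0.75])
print(quartiles)
print("75th/25th ratio:")
print(quartiles.loc[0.75] / quartiles.loc[0.25])
```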

So, a variety of thoughts here:

  • The universities with the lowest spend on Libraries are all small-ish, new-ish (post-1992) institutions; those with the highest spending are more of a mix.
  • Athabasca and RRU being near the top of the ICT spending charts is not a surprise; what is a little weird is seeing Université du Québec en Outaouais in top spot.  Also, why is U of T near the bottom?  ICT is one of those fields in the FIUC survey which is prone to bad comparisons (some institutions stick a lot of ICT-related salary costs in their central admin numbers, or occasionally in their faculty expenditures if staff are based in the faculty – a quirk of the way the data is compiled), so it might be that.  On the other hand, expenditures on ICT might just scale a lot better.
  • The student services numbers are fascinating: 4 of the 5 top-spending institutions are in Nova Scotia; 4 of the 5 lowest-spending institutions (and 7 of the top 10) are in Quebec.
  • The physical plant numbers are the hardest to interpret, because these are in some ways legacy costs.  NSCAD owns some historic properties with high upkeep and doesn't have a lot of students, and so has high per-student costs.  Kwantlen is a relatively new institution and therefore doesn't need to spend a lot on upkeep (or heat – institutions in the Lower Mainland have a big cost advantage because of the climate).

There are a couple of ways to look at these numbers overall.  There's the competitive-bidding aspect: some will look at these numbers and say "why isn't our institution spending that much?  Gotta keep up with the Joneses!"  But there's an efficiency angle, too.  Those institutions spending at the 75th percentile and above – what are they getting for their money that other institutions are not?

Maybe the most interesting case is Libraries.  A lot of big Ontario universities have very low library costs: Guelph $473/student, Waterloo $591, McMaster $688, Ottawa $723, Western $749, all of which are below the national average.  You might think the big difference is in the collections budget – and it’s true those are lower, in part because there is a lot more collections-sharing between institutions in Ontario than is possible in places like Saskatoon or St. John’s, which don’t have nearby neighbours.  But the biggest single cost in Libraries is salaries, which makes up 45-65% of any university library’s budget (higher in Quebec).  The real difference between these institutions is therefore staffing.  So do users notice the difference?  If so, which users and how is the difference felt?

More tomorrow.

 

January 11

Why Class Size Matters (Up to a Point)

At the outset of the MOOC debate about four years ago, there was a line of argument that went something like this:

MOOC Enthusiast:  These MOOCs are great.  Now the classroom is not a barrier.  Now we can teach hundreds of thousands of students at a time!  Quel efficiency!

Not MOOC Enthusiast:  They’re just videos.  They can’t give you the same human touch as an in-class experience with a professor.

MOOC Enthusiast: How’s that human touch going for you in the 1,000-person intro class?

To which there was never really a particularly good reply, just a lot of sputtering about underfunding, etc. The fact is, from a student's point of view, there probably isn't a lot of difference between a 1,000-person classroom and an online course, at least as far as personal touch from a professor is concerned.  There are some other differences, of course, mainly in terms of the kinds of study supports available, but if your argument is that direct exposure to tenured faculty is what matters, then this is kind of beside the point.

There was a period of time during which it was fashionable to say that class size didn't matter, that it was what happened in the class, not how big it was, etc., etc.  I am ever less convinced by these arguments.  Small classes matter for two reasons.  One is the ability – in science, health, and engineering disciplines in any case – to be in contact with advanced equipment.  If classes are too large, students don't get enough time with the top equipment and hence aren't as prepared for careers in their fields as they might be.  Obviously this matters more in places like Africa than in North America, but you'd be surprised at how often this issue pops up here.  I know of at least one "world-class" university in Canada that, faced with budget cuts in the late 1990s, instituted a policy of not offering lab courses to science majors until third year (yes, really).

The second reason is perhaps more universal: the larger the class, the less interaction there is, not just between professors and students but also between students.  And this interaction matters because it is the key to developing many of the soft skills required for employability.  Work that is presented in class and argued among colleagues – whether assigned to teams or individuals – is pretty much the only place where students actually come to understand in real time how arguments are made and broken, how to interact with colleagues and experts, how to deal with (hopefully constructive) criticism, among other skills. When I go to developing countries (where I am currently doing a lot of work) and I hear about how students don’t have labour force skills, this is exactly what employers are talking about, and there’s simply no way to provide them those skills at the scale of classes currently being offered.  So, small classes are good, but not primarily for disciplinary reasons (though those may benefit as well).  It’s mostly about employability.

Canadian polytechnics actually worked this out a while ago.  One of the most notable differences between degree programs at polytechnics and universities is that class sizes are relatively constant over four years at polytechnics, whereas universities (apart from the smallest of liberal arts colleges) employ a pyramid model, with huge classes in first year and many more smaller ones in upper years (CUDO data – flawed as it is – suggests that there are more classes with 30 students or fewer for fourth-year students than there are classes of all sizes for first-year students).  Students at polytechnics are getting the benefits of smaller classes all the way through, while for most university students, these benefits aren't seen until third year at the earliest.

By this, I don’t mean to suggest that class size is destiny.  The point that what happens in a class is a function of more than its size is a relevant one (although a slightly trickier one to make today than in pre-MOOC times).  But interaction matters.  If institutions are going to increase class sizes (as they have done repeatedly over the past two decades, both through admitting more students and reducing professors’ undergraduate course loads), there needs to be a strategy to work out how interaction can be maintained or improved.  Otherwise, it’s very hard to say that quality isn’t being impaired.
