HESA

Higher Education Strategy Associates


October 13

Pedagogical Change: Why Waterloo and not McMaster?

In the field of higher education, Canada has two genuine claims to having been (at least at one time) at the forefront of innovation: co-op education, which primarily stems from Waterloo's Faculty of Engineering, and Problem-based Learning as practiced at McMaster's School of Medicine.  This is a big deal: most universities never pioneer innovative pedagogical techniques, and here Canada has two of them.  Yet only one of these universities really gets credit for it.  Waterloo is known nationally (and to some degree internationally) for its pedagogy, and McMaster…isn't.  Not really.  And understanding why is key to understanding how innovation spreads (or doesn't) in higher education.

So, let's start with McMaster.  Shortly after the School of Medicine was founded in the mid-1960s, the staff there decided to adopt a pedagogy that had been experimented with at Case Western in the 1950s: switching from a more or less rote system of learning information to one with a much greater emphasis on problem-solving skills.  What McMaster added to the Case Western system was a focus on tutorials and small-group learning.  Within the world of medical education this method was a smash success, spreading over the space of three decades to fifty-odd medical schools in North America, Europe and Australia.  In a couple of instances, it even jumped the disciplinary boundary into fields like business education and architecture.  Significantly, it never made the jump to any other part of the institution at McMaster itself.  And though PBL still exists, McMaster is no longer really thought of as the leader.

Now, compare that to Waterloo, an institution that began life as a satellite campus of the (then) University of Western Ontario, teaching engineering to serve tire-making factories in the region.  The professors at Waterloo were intrigued by the model of co-op education that had been developed at the University of Cincinnati in the United States and wanted to introduce it at Waterloo College.  Western's Engineering faculty thought this was simply too déclassé an idea for a real university and said no.  Since the Government of Ontario was in the business of setting up new universities at that time, Waterloo College essentially flipped Western off and started its own university, with co-op as a kind of founding mission.  Within Waterloo, co-op spread to all its faculties, including Arts, and last I heard it was placing over 17,000 students per year in co-op programs.  Co-op is now the norm in Canadian Engineering schools, and people all over the world recognize Waterloo as the pre-eminent institution in co-op education.

So what’s the difference?  Why did co-op at Waterloo turn out one way and PBL at McMaster another?

I think the simplest take on it is this: Waterloo had co-op embedded in its DNA.  The school's trimester schedule, which was a necessary complement to co-op, was adopted across the institution.  Professors are hired based on their willingness to work in the trimester system and to update classes frequently, based on student feedback about how in touch curricula really are with industry practices.  It's not isolated to one faculty, or to the senior administration: the whole institution really is invested in it.  Compare that to McMaster, where no other faculty (so far as I can tell) ever really took the idea of PBL seriously.  Even the School of Medicine itself wasn't founded on PBL principles.  It was a success, for a time.  But it wasn't in the DNA.

There is an important lesson here.  Universities, even when presented with fabulous ideas for reform, are very reluctant to change on a systematic basis.  It's not that individual professors never do anything new; it's that systemic change requires everyone going in a more or less similar direction at the same time, and that is very difficult for institutions to achieve.  It's why real university reform often happens not by getting universities to change but by setting up new institutions.  Napoleon knew this: it was why he shut down universities and created the Grandes Écoles.  It's why in the United States it took a new institution like Johns Hopkins to pioneer the PhD, or why it took the arrival of ANU in Australia to really make universities take research and graduate work seriously.

Left to themselves, universities will always tend to be conservative, fearful and change-averse.  History shows that new institutions pursuing new missions with all their might and leading by example can, eventually, drive real change.

October 12

The Fractured Chinese Higher Education Market

We often casually refer to China as being a single higher education market, but that’s really not true.  It’s probably more accurate to say that it is 32 different markets (34 if you want to include Macau and Hong Kong), one for each of the 23 provinces, 5 autonomous regions, and 4 major municipalities (Beijing, Shanghai, Chongqing and Tianjin).  That’s not just because most higher education funding is local rather than national; it’s also because student mobility is significantly restricted, especially among top universities.

Let's start with economics.  Broadly speaking, the coastal provinces and Inner Mongolia are fairly rich (on par with Greece and the Baltics by GDP per capita), the middle provinces such as Hubei, Henan and Shanxi have GDP/capita roughly half that of coastal ones, and then further west GDP per capita drops by another 50% when you get to Yunnan, Sichuan, and Gansu.  It's not quite as simple as that – Anhui and Jiangxi are close to the coast but relatively poor, Xinjiang is as far west as you can go and yet is part of the middle band, Shanghai and Beijing are relatively wealthy, etc.  But the rule of thumb is: coastal provinces are rich, inner provinces are poor.

This matters to higher education for a couple of reasons, the main one being that for the most part, higher education is funded provincially.  There are, however, a few dozen universities which are primarily  administered and  funded from Beijing, most of which report to the Ministry of Education but some to other ministries (e.g. the Telecoms Ministry or the Army).  The 38 research-intensive “985” universities (so named because the policy which governs them dates from May 1998) receive massive amounts of central government funding. The 110-odd “211” universities (apparently a reference to having 100 21st Century universities…21-1(00)…no, I don’t get it either) also get some central funding despite being largely dependent on local funding.

The second reason this matters is that these "top-tier" universities (especially the 985s) are unevenly distributed around the country.  Beijing has seven of them, Shanghai four, and Tianjin and Chongqing another three between them – meaning that nearly half of the top universities are in just four cities.  Indeed, fourteen provinces and regions have no 985 universities at all, and Tibet has neither a 211 nor a 985 university.  That wouldn't be a major issue if it weren't for a second important factor: student mobility in China is strictly limited.

Because universities are mostly funded locally, the local government gets to determine the number of spots at each university.  Unsurprisingly, poorer provinces have fewer spots than richer ones.  Which means cut-offs have to be higher; and since every province has control of its own gaokao exam, it's become the case that different provinces have different levels of gaokao difficulty (there is a helpful service which compares them all and ranks them on difficulty – for the last couple of years the hardest has been Jiangsu).

But despite regional differences in the difficulty  of the gaokao, universities treat all the provincial scores as equal.  This matters enormously because each province reserves spaces for local students – and limits spaces for out-of-province students.  Pretty much everyone in the country wants to get into one of the top Beijing universities, and yet these policies keep these institutions largely the preserve of locals.  Which is absolutely fine for the privileged few who live in Beijing (mainly public servants and party apparatchiks) but not so good for anyone else.  A student from Jiangsu not only takes a tougher gaokao than one from Beijing, but s/he has to obtain a much higher score in order to get into Tsinghua or Peking.  It’s considered a truism in those schools that the students from outside Beijing are of a much higher calibre than the locals.

This problem isn't going away any time soon.  As Damien Ma and William Adams say in their excellent book In Line Behind a Billion People, this policy of reserving little educational plums for Beijing parents is one of those things that keeps the elite population behind the regime: in a democratic system there is simply no way that benefits would be concentrated this tightly (their chapter on education is called "Give me Equality: But Not Until After My Son Gets Into Tsinghua").  So even as the central government tries to open spaces at top-tier (i.e. 985 or 211) universities for people from provinces where top-tier universities are scarce or non-existent, they are doing their level best to put that burden anywhere other than Beijing or Shanghai.  This year, most of the growth in spaces for students from the western provinces fell on Hubei and Jiangsu, much to the anger of local residents who feared their own children would lose out as a result.

There is a lesson here for people interested in recruiting students in China, and it is this: ignore the coastal provinces.  Find the provinces with the hardest gaokaos and the fewest 985/211 institutions (Jiangxi is not a bad place to start).  There are a lot of frustrated families there.  Go talk to them.  They will be more price-conscious than the students on the coasts, but they will also probably be of higher quality.

October 06

Does the Canada Student Loans Program Make Money?

You'll remember a couple of weeks ago I took the Ontario NDP to task for an absurd meme about the provincial government "profiting" from student loans. But it occurred to me later that, though there is no way the charge sticks against the provincial government, it arguably might against the federal government's Canada Student Loans Program (CSLP), which both borrows more cheaply and lends more dearly than the provincial government. So I decided to find out.

The data I am using in this blog comes from the latest CSLP Actuarial Report, which was published in 2012 (and hence presumably written in 2011). This is done periodically by the Chief Actuary of Canada (the same guy who makes sure the Canada Pension Plan is solvent). I suspect a lot of his data after 2011-12 is off because of the large jump in loan program usage after Ontario introduced the 30% tuition rebate midway through that year. The Actuary also assumed interest rates were going to rise throughout the decade (they haven't), and more controversially, assumed enrollments would fall substantially over the same period (which they have in certain regions but not nationally). So to avoid these and other issues, I am simply going to use the 2011-12 projections, which are the least contaminated by dubious assumptions.

Here’s a quick summary of the estimated cost of the program: In-school (Class A) interest – that is, the interest government pays on student loans while students are in school and hence paying no interest – is $128 Million (which is *tiny* considering that there are 400,000 borrowers per year – credit here to prolonged slow growth and the lowest interest rates in living memory). The Repayment Assistance Program, which subsidizes repayments for low-income borrowers in repayment, is another $169 Million. Then on top of that is the provision for bad debt. Based on long-term trends, the government puts aside 12.4% of every dollar lent on the assumption some people will default. That, plus the interest on the loans left outstanding comes to $376.2 million. Grand total: $673.2 million.

(There are also $650-odd million in grants plus $280 or so million in alternative payments to Quebec, Nunavut and NWT and $140 M in administration fees, which brings the total cost to a little over $1.7 billion or so, but put that aside for the moment.)

So to go back to our example from last week, the question is whether or not CSLP meets the Elizabeth Warren test for "profiting from students": that is, does net income from the interest paid by students more than cover the cost of interest subsidies and defaults? Income from loans comes from the spread between the rate at which the government of Canada borrows (currently hovering around 1% on ten-year bonds) and the rate at which it lends to students (prime +2.5%, or currently 5.2%). The rates were slightly different in 2011-12 but the 420 basis point spread has stayed pretty consistent. Which is a whole lotta basis points – it's over three times the spread Ontario gets on its loans – and quite a lot of room in which to "make money".

A lot, but not quite enough. The projection for revenue on interest paid for 2011-12 was $521.4 million. The cost of borrowing was $166 million, meaning that “net” revenue – that is, earning on the spread between loan costs and loan revenues – was $355 million. So the huge spread the federal government has on student loans more or less covers the cost of defaults, but still leaves the government’s Consolidated Revenue Fund to pay nearly $300 million for loan costs such as Class A interest and RAP, not to speak of another billion or so for the Canada Student Grants, the alternative payments and administration.
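
For anyone who wants to retrace that arithmetic, here is a minimal back-of-envelope sketch (in Python) using only the figures quoted above; the variable names and groupings are mine, not the Actuary's.

```python
# Back-of-envelope check of the CSLP figures quoted above, all in millions of
# dollars (2011-12 projections). Variable names are mine, not the Actuary's.

# Program costs
class_a_interest = 128.0      # interest paid while borrowers are in school
rap = 169.0                   # Repayment Assistance Program subsidies
bad_debt_provision = 376.2    # provision for defaults plus interest on loans left outstanding
program_costs = class_a_interest + rap + bad_debt_provision   # = 673.2

# Lending spread: borrow at ~1%, lend at prime + 2.5% (~5.2%), i.e. roughly 420 basis points
interest_revenue = 521.4      # projected interest paid by borrowers
cost_of_borrowing = 166.0     # government's own cost of funds
net_revenue = interest_revenue - cost_of_borrowing            # = 355.4

# Net revenue more or less covers the bad debt provision...
print(f"Net revenue on the spread: ${net_revenue:.1f}M vs bad debt of ${bad_debt_provision:.1f}M")
# ...leaving roughly the Class A interest and RAP bill (plus the small shortfall
# on defaults) for the Consolidated Revenue Fund to cover.
print(f"Left for the CRF to cover: ${program_costs - net_revenue:.1f}M")
```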

The lesson to be learned from all this is that student loan programs are expensive. Even if you charge stonkingly high rates of interest with huge spreads, loan losses from defaults and interest subsidies will eat those up and more. There are no profits to be seen here.

September 30

Athletics Scholarships in Canada

Time was, about twenty years ago, Canadian universities didn't spend money on university athletic scholarships.  Then things changed and universities turned on the taps.  Today we ask the question: how's that going for everyone?

Well, it’s not going too badly, if you’re an athlete.  Just under 5,830 students received athletic scholarships totalling $15,981,189 in 2013-14 – that’s a little over $3,000 a pop.  CIS officially recognizes twenty-one sports, nine of which have teams for both genders (eighteen total), plus football which is male-only and rugby and field-hockey which are female-only.  However, roughly 85% of the scholarship dollars are concentrated in just nine sports, as shown below in Figure 1.  Some have almost no scholarships at all: inter-collegiate curling, for instance, has only 16 scholarships nationally for both sexes.

Figure 1: Top Sports by Scholarship Expenditure, 2013-14


What's interesting here is that over time, the amount of money spent on athletics scholarships has been rising quickly and steadily.  Even after accounting for inflation, Canadian universities spent nearly three times as much on athletics scholarships ($16 million vs. $5.8 million) in 2013-14 as they did ten years earlier.  It's an interesting choice of expenditure by allegedly cash-strapped institutions.

Figure 2: Total Athletics Scholarships by Gender, 2003-4 vs 2013-4, in constant $2014


I suspect most institutions would probably defend it as a kind of strategic enrolment investment, much the way they defend other kinds of tuition discounting.  I mean, does it really matter if you give someone a $5,000 academic entrance scholarship or a $5,000 athletic scholarship?  They're both forms of tuition discounting.  And of course, the absolute amounts are trivial.  $16 million is only 1% of the total amount of funding given by universities to students (if you include funding to graduate students).  And if you want to get into truly ludicrous comparisons, it's less than what the University of Michigan spends on salaries for its football coaching staff.

A final point to make here is around gender equity.  Male and female athletes receive awards at roughly the same rate (45% of athletes of each gender receive an award), which is good.  However, imbalances remain in the number of athletic team spots for men versus women (53% of all athletic team spots are male, compared to men's roughly 41% share of undergraduates as a whole), and in the size of the average award ($3,286 for men vs. $2,737 for women).  Those results are better than they were a decade ago, and they appear to be slightly better than in the US, where actual legislation exists in the form of Title IX to enforce equity in sports, but they are still some ways from equal.

September 28

International Rankings Round-Up

So, the international rankings season is now more or less at an end.  What should everyone take away from it?  Well, here's how Canadian universities did in the three main rankings (the Shanghai Academic Ranking of World Universities, the QS Rankings and the Times Higher Rankings).


Basically, you can paint any picture you want out of that.  Two rankings say UBC is better than last year and one says it is worse.  At McGill and Toronto, it's 2-1 the other way.  Universities in the top 200?  One says we dropped from 8 to 7, another says we grew from 8 to 9 and a third says we stayed stable at 6.  All three agree we have fewer universities in the top 500, but they disagree as to which ones are out (ARWU figures it's Carleton, QS says it's UQ and Guelph, and for the Times Higher it's Concordia).

Do any of these changes mean anything?  No.  Not a damn thing.  Most year-to-year changes in these rankings are statistical noise: but this year, with all three rankings making small methodological changes to their bibliometric measures, the year-to-year comparisons are especially fraught.

I know rankings sometimes get accused of tinkering with methodology in order to get new results and hence generate new headlines, but in all cases, this year's changes made the rankings better: more difficult to game, more reflective of the breadth of academia, or better at handling outlier publications and genuine challenges in bibliometrics.  Yes, the THE rankings threw up some pretty big year-to-year changes and the odd goofy result (do read my colleague Richard Holmes' comments on the subject here) but I think on the whole the enterprise is moving in the right direction.

The basic picture is the same across all of them.  Canada has three serious world-class universities (Toronto, UBC, McGill), and another handful which are pretty good (McMaster, Alberta, Montreal and then possibly Waterloo and Calgary).  16 institutions make everyone's top 500 (the U-15 plus Victoria and Simon Fraser but minus Manitoba, which doesn't quite make the grade on QS), and then there's another handful on the bubble, making it into some rankings' top 500 but not others (York, Concordia, Quebec, Guelph, Manitoba).  In other words, pretty much exactly what you'd expect in a global ranking.  It's also almost exactly what we here at HESA Towers found when doing our domestic research rankings four years ago.  So: no surprises, no blown calls.

Which is as it should be: universities are gargantuan, slow-moving, predictable organizations.  Relative levels of research output and prestige change very slowly; the most obvious sign of a bad university ranking is rapid changes in position from year to year.  Paradoxically, of course, this makes better rankings less newsworthy.

More globally, most of the rankings are showing rises for Chinese universities, which is not surprising given the extent to which their research budgets have expanded in the past decade.  The Times threw up two big surprises: first, by declaring Oxford the top university in the world when no other ranker, international or domestic, has it in first place in the UK; and second, by excluding Trinity College Dublin from the rankings altogether because it had submitted some dodgy data.

The next big date on the rankings calendar is the Times Higher Education's attempt to break into the US market.  It's partnering with the Wall Street Journal to create an alternative to the US News and World Report rankings.  The secret sauce of these rankings appears to be a national student survey, which has never been used in the US before.  However, in order to get a statistically significant sample (say, the 210-student-per-institution minimum we used to use in the annual Globe and Mail Canadian University Report) at every institution currently covered by USNWR would imply an astronomically large sample size – likely north of a million students.  I can pretty much guarantee THE does not have this kind of sample.  So I doubt that we're going to see students reviewing their own institution; rather, I suspect the survey is simply going to ask students which institutions they think are "the best", which amounts to an enormous pooling of ignorance.  But I'll be back with a more detailed review once this one is released.
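
On the sample-size point above: to see why the numbers get so big so quickly, here is a rough illustration.  The institution count and the response rate below are assumptions of mine for the sake of the arithmetic, not figures from THE, the WSJ or USNWR; only the 210-completes minimum comes from the text.

```python
# Rough sketch of the sample-size problem described above. The institution count
# and response rate are ASSUMED for illustration; only the 210-completes minimum
# (the old Globe and Mail benchmark) comes from the text.
institutions = 1500        # assumed number of institutions covered by USNWR
min_completes = 210        # minimum completed surveys per institution
response_rate = 0.25       # assumed share of invited students who respond

completed_needed = institutions * min_completes
invited_needed = completed_needed / response_rate
print(f"Completed surveys needed: {completed_needed:,}")      # 315,000
print(f"Students to invite:       {invited_needed:,.0f}")     # 1,260,000 -- "north of a million"
```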

September 27

Lying With Statistics, BC Edition

A couple of weeks ago I came across a story in the Vancouver Sun quoting a Federation of Post-Secondary Educators of BC (FPSE) “report” (actually more of a backgrounder) which contained two eye-catching claims:

  1.  “per-student operating grants have declined by 20 per cent since 2001 when adjusted for inflation.”
  2.  “government revenues from tuition fees have increased by almost 400 per cent since 2001”

The subtext here is clear.  20% down vs. 400% up?  How heinous!  How awful can the Government of British Columbia be?

Well now.  What to make of this dog’s breakfast?

Let’s start with the second point.  First of all, it’s institutional income, and not government income.  But leaving that aside, there was indeed a very big rise in tuition fees back in 2001-2 and 2002-3 (presumably why the authors chose 2001 as a base…if one used 2003 as a base, it would be a very different and much less dramatic story).   But if you simply look at average university tuition (college tuition is untracked) the increase since 2001 is only 110% (in nominal dollars).  Assume the increase for colleges was a bit higher because they were working from a lower base and perhaps we can nudge that up to 125%.  Still: how does one get from there to 400%?

First, remember that the authors (whoever they may be) are talking about aggregate tuition, not average tuition.  So some of this simply reflects an increase in enrollments.  In 2001-2, there were 196,000 students in BC.  In 2013-14, the last year for which we currently have data, there were 277,515 – an increase of 41%.  Back of the envelope, multiply that by the 110% nominal tuition increase and that gets you to a 176% increase.  Still a ways to go to 400% though.

Second, a lot of students are moving from lower-cost programs to higher-cost programs.  Some of that is happening within universities (e.g., from Arts to Engineering), but in BC it's mostly a function of colleges turning themselves into universities and charging more tuition.  University enrollment rose from 80,388 to 179,917 while college enrolments fell from 116,007 to 97,698.  That's a lot of extra fees.

Third, BC has a lot more international students than it used to, and they pay more in fees on an individual basis than domestic students do.  Add those two factors together and you get another 19% or so increase in aggregate fees, which brings us to a 210% total increase.

That’s still nowhere near 400%.  So, I went and checked the source data – Statistics Canada’s Financial Information of Universities and Colleges (FIUC) for Universities (cansim 477-0058 if you’re a nerd) and the Financial Information of Community Colleges and Vocational Schools (cansim 477-0060) to try to find an answer.  Here’s what I found:


Yeah, so actually not 400%, more like 207% – reasonably close to the 210% from our back-of-the-envelope exercise.  The best excuse I can come up with for FPSE's number is that if you extend the universities number out another year (to 2014-15), you get to $1.258B, which is almost four times (actually 3.74x) the 2001-02 figure (which is still only a 274% increase).  But you have to a) torque the living daylights out of the numbers and b) actively confuse percentage increases and multiples to get there.
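
Since the whole dispute turns on confusing multiples with percentage increases, here is a tiny worked example using the figures just quoted; the 2001-02 base is simply backed out of the ratio, so treat it as approximate.

```python
# Multiples vs. percentage increases, using the figures quoted above.
fees_2014_15 = 1.258e9          # university fee income extended out to 2014-15
multiple = 3.74                 # "almost four times" the 2001-02 figure
implied_base_2001_02 = fees_2014_15 / multiple     # backed out of the ratio, approximate

pct_increase = (multiple - 1) * 100
print(f"Implied 2001-02 base: ${implied_base_2001_02/1e6:.0f}M")            # ~$336M
print(f"A {multiple}x multiple is a {pct_increase:.0f}% increase, not 374% (let alone 400%)")
```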

But now let’s move over to the other side of the ledger, where the Federation notes a 20% drop in government support per student, adjusted for inflation.  Let’s note straight off the first inconsistency: they’re adjusting the government grants for inflation and not doing the same for tuition.  Second inconsistency: they’re adjusting the government grants for the size of the student population and not doing the same for tuition.

It's easy to see why FPSE does this.  As we've already noted, student numbers were up by 41% between 2001-2 and 2013-14.  Just do the math: a 20% per-student cut while student numbers are rising by 41% actually means that government support has risen by 13%.  In real dollars.  (I went back to the source data myself and came up with 14% – close enough).  Chew on that for a second: FPSE is ragging on a government which has increased funding for post-secondary education by – on average – 1% over and above inflation every year since 2001-2.
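
For the record, that back-of-the-envelope math is easy to check in a couple of lines:

```python
# A 20% real cut per student combined with 41% enrolment growth still means
# total real government support went up.
per_student_change = -0.20   # real per-student operating grants, 2001-02 to 2013-14
enrolment_change = 0.41      # growth in student numbers over the same period

total_change = (1 + per_student_change) * (1 + enrolment_change) - 1
print(f"Change in total real government support: {total_change:+.1%}")   # about +12.8%, i.e. ~13%
```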

So quite apart from any problems with the 400% number, FPSE is making a deliberate apples-to-oranges comparison by adjusting only one set of figures for student growth and inflation.  Here's how those numbers compare on a number of different apples-to-apples bases (and I'm being nice to FPSE here and allowing different end dates for fees and grants based on different data availability):


Now, it seems to me there's enough in the Clark government's record to mount a decent attack without resorting to this kind of nonsense.  It certainly under-invests relative to what it could be doing given the province's growing population.  It carries a faux-populist pro-extraction-industry line to the detriment of investing in expanding knowledge industries.  It has stayed out of step with most of the rest of the country in the last ten years by not improving student assistance.  And a fair, non-torqued comparison between student fees and government grants still shows students are bearing an increasing share of the cost.

So why stoop to using transparently false figures?  One might expect that kind of chicanery from the Canadian Federation of Students, which has form in this area.  But this is from an organization which represents professors: people who actually use statistics in the search for the truth.  So why is the organization which represents them using statistics in a way that wouldn’t pass muster in an undergraduate course?

I’m quite sure most profs wouldn’t be OK with this.  So why do FPSE’s member locals tolerate it?

September 22

MOOCs at Five

It was five years ago last month that Stanford set up the first MOOC.  MOOCs were supposed to change the world: Udacity, Coursera and EdX were going to utterly transform education, putting many universities out of business.  Time to see how that’s going.

(Ok, ok: the term MOOC was actually first applied to a 2008 University of Manitoba course led by George Siemens and Stephen Downes.  Technically, using Downes' taxonomy, the 2008 MOOC was a "cMOOC" – the "c" standing for connectivist, if I am not mistaken – while the versions that became popular through Coursera, Udacity and EdX, etc. are "xMOOCs", the difference being essentially that learning in the former is more participative and collaborative while the latter has more in common with textbooks, only with video.  But the Stanford MOOC is what usually gets the attention, so I'm going to date things from there.)

In the interests of schadenfreude if nothing else, allow me to take you back to 2012/3 to look at some of the ludicrous things people said about the likely effects of MOOCs.

  • “In 50 years there will only be 10 institutions in the world delivering higher education” (Sebastian Thrun, former CEO of Udacity)
  • “Higher Education is now being disrupted; our MP3 is the massive open online course (or mooc), and our Napster is Udacity” – Clay Shirky.
  • "Higher education is just on the edge of a crevisse (sic)…five years from now these enterprises (i.e. universities) are going to be in real trouble" – Clayton Christensen

And of course who can forget that breathless cliché-ridden gem of an op-ed from Don Tapscott, about the week that higher education changed forever in January 2013 (i.e. the week he sat in on a couple of seminars on the subject in Davos).  Classic.

So, five years on, where are the MOOC pioneers now?  Well, Sebastian Thrun of Udacity got out of the disrupting-higher-education business early after coming to the realization that his company "didn't have a good product"; the company pivoted to providing corporate training.  Over at Coursera, the most hyped of the early pioneers, founders Andrew Ng and Daphne Koller have both left the company (Ng left two years ago for Baidu, Koller left last month for one of Alphabet's biotech enterprises).  Shortly after Koller's departure, Coursera released this announcement, which was widely interpreted as the company throwing in the towel on the higher education market and following Udacity down the corporate training route.

EdX, the platform owned jointly by MIT and Harvard, thus seems to be the last MOOC provider standing.  Perhaps not coincidentally, it is also the one which has (arguably) been most successful in helping students translate MOOC work into actual credits.  It has partnered with Arizona State University in its "Global Freshman Academy" and even allows conversion of some credits towards a specific MIT MBA (conditional on actually spending a semester on campus and paying normal fees to finish the program).  These "micro-MBAs" seem to be catching on, but precisely because they are "micro", they haven't made a big impact on EdX's overall numbers: their user base is still less than half Coursera's.

So what's gone wrong?  It isn't a lack of sign-ups.  The number of people taking MOOCs continues to grow at a healthy clip, with global enrolments to date now over 35 million.  The problem is there's no revenue model here.  Depending on whose numbers you're using, the number of users paying for some kind of certification (a fee which is usually priced in double digits) is at best around 3%.  So, work that out: 35 million users, with a 3% conversion rate, at $50 per user, and you've got a grand total of $52.5 million in total revenue.  Over five years.  Using content that existing institutions produce at a cost of anywhere between $50,000 and $250,000 per course and hand over essentially for free.
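
Here is that revenue arithmetic made explicit; the $50 certificate fee is just a round-number stand-in for the "double digits" price mentioned above.

```python
# The MOOC revenue arithmetic from the paragraph above, spelled out.
users = 35_000_000         # cumulative enrolments to date
conversion_rate = 0.03     # share of users who pay for certification (the optimistic end)
certificate_fee = 50       # assumed round-number fee, "priced in double digits"

total_revenue = users * conversion_rate * certificate_fee
print(f"Total certification revenue over five years: ${total_revenue:,.0f}")   # $52,500,000

# Meanwhile, each course costs its host institution real money to produce.
cost_per_course = (50_000, 250_000)
print(f"Production cost per course: ${cost_per_course[0]:,} to ${cost_per_course[1]:,}")
```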

This is not sustainable and never was.  Whatever valid points MOOC boosters had about the current state of education (and I think they had more than a few), the proposed solution wasn’t one that met the market test.  The basic problem is (and always has been) that higher education is fundamentally a prestige market.  Distance education is low prestige; distance education which doesn’t give out actual course credit doubly so.  You can disguise this by making delivery the domain of top universities, as EdX and to a lesser extent Coursera did – but top institutions don’t want to diminish their prestige by handing out actual credit to the hoi polloi over the internet.   So what you get is this unsatisfying compromise which in the end not enough people want to pay for.

Some of us said this five years ago (here, here and here) when MOOC-mania was in full flow and critical faculties were widely suspended.  Which just goes to show: higher education is the world’s most conservative industry and the rate of successful innovation is tiny.  Your best bet for imagining what higher education looks like in the future is what it looks like today, only more expensive.

 

September 21

Unit of Analysis

The Globe carried an op-ed last week from Ken Coates and Douglas Auld, who are writing a paper for the Macdonald-Laurier Institute on the evaluation of Canadian post-secondary institutions. At one level, it's pretty innocuous ("we need better/clearer data") but at another level I worry this approach is going to take us all down a rabbit hole. Or rather, two of them.

The first rabbit hole is the whole "national approach" thing. Coates and Auld don't make the argument directly, but they manage to slip a federal role in there: Canada, they write, "lacks a commitment to truly high-level educational accomplishment", needs a "national strategy for higher education improvement", and so "the Government of Canada and its provincial and territorial partners should identify some useful outcomes". To be blunt: no, they shouldn't. I know there is a species of anglo-Canadian that genuinely believes the feds have a role in education because reasons, but Section 93 of the constitution is clear about this for a reason. Banging on about national strategies and federal involvement just gets in the way of actual work getting done.

Coates & Auld’s point about the need for better data applies to provinces individually as well as collectively. They all need to get in the habit of using more and better data to improve higher education outcomes. I also think Coates and Auld are on the right track about the kinds of indicators most people would care about: scholarly output, graduation rates, career outcomes, that sort of thing. But here’s where they fall into the second rabbit hole: they assume that the institution is the right unit of analysis for these indicators. On this, they are almost certainly mistaken.

It's an understandable mistake to make. Institutions are a unit of higher education management. Data comes from institutions. And they certainly sell themselves as unified institutions carrying out a concerted mission (as opposed to the collections of feuding academic baronetcies united by grievances about parking and teaching loads that they really are). But when you look at things like scholarly output, graduation rates, and career outcomes the institution is simply the wrong unit of analysis.

Think about it: the more professional programs a school has, the lower the drop-out rate and the higher the eventual incomes. If a school has medical programs, and large graduate programs in hard sciences, it will have greater scholarly output. It's the palette of program offerings rather than their quality which makes the difference when making inter-institutional comparisons. A bad university with lots of professional programs will always beat a good small liberal arts school on these measures.

Geography plays a role, too. If we were comparing short-term graduate employment rates across Canada for most of the last ten years, we'd find Calgary and Alberta at the top – and most Maritime schools (plus some of the Northern Ontario schools) at the bottom. If we were comparing them today, we might find them looking rather similar. Does that mean there's been a massive fall-off in the quality of Albertan universities? Of course not. It just means that (in Canada at least) location matters a lot more than educational quality when you're dealing with career outcomes.

You also need to understand something about the populations entering each institution. Lots of people got very excited when Ross Finnie and his EPRI showed big inter-institutional gaps in graduates' incomes (I will get round to covering Ross' excellent work on the blog soon, I promise). "Ah, interesting!" people said. "Look At The Inter-Institutional Differences Now We Can Talk Quality". Well, no. Institutional selectivity kind of matters here. Looking at outputs alone, without taking into account inputs, tells you squat about quality. And Ross would be the first to agree with me on this (and I know this because he and I co-authored a damn good paper on quality measurement a decade ago which made exactly this point).

Now, maybe Coates and Auld have thought all this through and I’m getting nervous for no reason, but their article’s focus on institutional performance when most relevant outcomes are driven by geography, program and selectivity suggests to me that there’s a desire here to impose some simple rough justice over some pretty complicated cause-effect issues. I think you can use some of these simple outcome metrics to classify institutions – as HEQCO has been doing with some success over the past couple of years – but  “grading” institutions that way is too simplistic.

A focus on better data is great. But good data needs good analytical frameworks, too.

September 19

Counting Sessionals

Much rejoicing last Thursday when Science Minister Kirsty Duncan announced that the federal government was re-instating funding for the University and College Academic Staff System (UCASS), which was last run in 2011.  But what caught most people's attention was the coda to the announcement, which said that Statistics Canada was going to "test the feasibility" of expanding the survey to include "part-time and public college staff" (the "C" in UCASS stands for colleges in the Trinity College sense, not the community college sense, so despite the name public colleges have never been in the survey).

What to make of this?  It seems that by "part-time" Statscan meant sessionals/adjuncts/contract faculty.  That's a bit off, because every university I know of makes a very sharp distinction between "part-time" faculty (many of whom are tenured) and "sessionals".  It makes one worry that Statistics Canada doesn't understand universities well enough to use the correct terminology, which in turn bodes ill for their future negotiations with universities around definitions.

Because let’s be clear about this: universities will do almost anything they can to throw sand in the gears on this.  They do not want data on sessionals in the public eye, period.  Oh sure, in public the Presidents will welcome transparency, evidence-based decision-making, etc.  But institutional research shops – the ones who will actually be dealing with Statscan on this file – are Olympic champions in shutting down government attempts to liberate information.  In fact, that’s arguably their main purpose.  They won’t actually say no to anything – they’ll just argue relentlessly about definitions until Statscan agrees to a reduced program of data collection.  Statscan knows this is coming – they have apparently allocated four years (!!!) for negotiations with institutions, but the safest guess is that this simply isn’t going to happen.

And to be fair to universities, the kind of data UCASS would provide about sessionals would be pretty useless – a lot of work for almost nothing.  UCASS can count individuals, and track their average salaries.  But average salary data would be useless: it would conflate people teaching one course with people teaching five.  And since UCASS had no way to track workload (you’d actually need to blow up the survey and start again if you wanted to get at workload, and as interesting as that might be, good luck getting universities to green-light it), the data is meaningless.  Knowing the number of sessionals tells you nothing about what proportion of undergraduates are being taught by sessionals.  Are 200 sessionals teaching one course each worse than 100 teaching two courses apiece?  Of course not.  But if raw numbers are the only thing on offer then we’ll ascribe meanings to them where they arguably shouldn’t exist.

You see, "sessionals" are not really a single phenomenon.  Many are professionals who have full-time jobs and like teaching a class on the side, and they're usually a huge boon to a department (especially in professional faculties like law, nursing and business) because they help expose students to a life beyond academia.  Others are PhD students teaching a class while another professor is away – and thus learning valuable skills.  The "bad" sessionals – the ones people claim to want to stamp out – are the ones who have a PhD and are teaching multiple classes the way professors do.  I suspect this is a pretty small percentage of total sessionals, but we don't know for sure.  And adding sessionals to UCASS won't get us any closer to finding out, because even if they wanted to, universities couldn't collect data on which of their employees have other full-time jobs outside the institution.

Despite all the kumbayahs on Tuesday about how this UCASS expansion is going to promote “evidence-based decision-making”, I’m genuinely having trouble imagining a single policy problem where data from UCASS would make a difference.  Universities already know how many sessionals they employ and whether numbers are going up or down; UCASS might let them know how many sessionals other universities employ but frankly who cares?  It’s not going to make a difference to policy at an institutional level.

If you really wanted to know something about sessionals, you'd probably start by requiring institutions simply to provide contact information for every individual with teaching responsibilities who is not tenure-track, along with the amount paid to them in the previous academic year (note: Statscan couldn't do this because it would never use the Stats Act to compel data this way.  Provincial governments could do so, however).  Then you'd do a survey of the instructors themselves – number of classes taught, other jobs they have, income from other jobs, etc.  Now I know some of you are going to say: didn't some folks at OISE do that just recently?  Well, almost.  Yes, they administered almost this kind of survey, but because they weren't drawing their sample from an administrative database, there's no way to tell how representative their sample is and hence how accurate their results are.  Which is kind of important.

So, anyways, two cheers for the return of UCASS.  More data is better than less data.  But the effort to get data on sessionals seems like a lot of work for very little practical return even if universities can be brought round to co-operate.

September 07

Unpleasantness at Brock

So, everybody is talking about the kerfuffle at Brock: yet another presidential hire gone wrong, though this time the slamming-on-the-brakes happened before the hire actually started working, which I suppose is progress.

What actually happened?  At the moment, here's what we know for sure: Wendy Cukier, a former VP at Ryerson, was offered the President's job at Brock in December 2015 with a start date of September 1.  She was undergoing what seemed to be a normal transition, starting to meet with faculty, up until a few weeks ago when meetings suddenly ceased.  On Monday August 29th, news emerged that Cukier and the Board had mutually agreed to suspend the appointment, and to look for a new President.  Cukier returned to her professorial position at the Ted Rogers business school at Ryerson.

Now, no one has yet actually asserted in print that the reason for the "mutual" change of heart is a report about Cukier's alleged bullying of staff while at Ryerson, but many news outlets have reported that an inquiry into such allegations took place, and by putting the two facts side by side the journalists clearly expect the reading public to make that leap.  The inquiry into those allegations is said to have occurred in late 2015 (i.e. around the time Cukier's appointment at Brock occurred), and an investigative report into the allegations is reported to have been received by Ryerson in January 2016 (i.e. after the appointment).  We don't know what the inquiry's report said, and news outlets have been careful to avoid directly stating that there was any connection between the two.

Brock, obviously, is a bit screwed now.  Their interim President is the VP Finance & Administration (not an academic and a former CFL player to boot, which has made the faculty union extremely sniffy in an oh-my-God-what-will-other-universities-think-of-us kind of way, which is frankly juvenile).  The some-say acting, some-say interim Provost is an outsider: Martin Singer, the Arts Dean from York who is best remembered for deciding to allow Saudi males to not study with girls in the name of religious accommodation. The VP Research is also interim.  It’s going to be a tough two years working to sort this out.

To the extent anyone is talking about the general implications, there seem to be three.  First, some people have posited that gender is an issue in the affair.  On the facts of this particular case that seems a stretch.  It is however undeniable that recent university President "departures" (let's call them that) have been disproportionately female (Leavitt at King's, Ghazzali at UQTR, Lovett-Doust at Nipissing, Scherf at Thompson Rivers, Woodsworth at Concordia, Busch-Vishniac at Saskatchewan, Hitchcock at Queen's and now Cukier), at least compared to the mostly male population of university presidents.  I'd argue that – contra Jennifer Berdahl and the view that only alpha male behaviour is rewarded in universities – there's a disproportionate number of individuals in that group who were let go precisely because they were too alpha.  If there's a gender case to be made here, it might be about what kinds of leadership styles get women promoted to decanal and vice-presidential positions in the first place.

Second is the role of non-disclosure agreements (NDAs), which again are getting in the way of Everyone’s Right to Know Every Last Detail (though to be honest, Brock’s Board of Governors has 27 members and I’m willing to bet that that’s too many to keep a secret for long).  NDAs don’t get a lot of favourable press and some say they should be done away with, but it’s hard to see how that’s possible. If someone is being let go for some reason that reflects badly on them but which is short of being “with cause”, you can either pay them a small amount of money now and have them leave quietly (I’m actually a bit surprised no one has yet commented publicly on whether there was a payout and if so how big it was), or you can trash them publicly and pay a lot of money after the inevitable lawsuit.  As public institutions, I don’t think universities and colleges have a lot of flexibility on that point.

The third implication people are drawing from this is that here again we have Another Failed Board Search, Why Can't Boards Get Things Right, Need for Immediate Governance Overhaul, etc.  But I think this is overdone.  The Brock University Board Chair has gone on record saying his university "did not know" about the Ryerson report (there is no word about when Brock became aware of it).  But unlike one or two Presidential searches I've heard of, Brock actually *did* its homework and interviewed quite a few people about Cukier.  It's just that, as far as we know, no one at Ryerson told them about the results of the inquiry, presumably because it was a "personnel matter" and hence confidential.  If staff at Ryerson knew about the issue and withheld information from the Brock search committee, that's hardly something the Brock board can be blamed for.  Sometimes bad things happen even if you do everything by the book.

Finally, let me stress that we don’t yet know the full story.  Maybe we never will.  The staff allegations at Ryerson might only be a small part of the issues involved.  Keep an open mind.  There’s probably more to come.
