HESA

Higher Education Strategy Associates

November 19

Stories Arts Faculties Tell Themselves

Here at HESA towers, we’ve been doing some work on how students make decisions about choosing a university (if you’re interested: the Student Decisions Project was a multi-wave, qualitative, year-long longitudinal study that tracked several hundred Grade 12 students as they went through the PSE research, application, and enrolment process.  We also took a more targeted qualitative look, specifically at Arts, with the national Prospective Arts Students Survey).  We’ve been trying to do the same for colleges, but it’s a much trickier demographic to survey.

In both studies, one of the questions we asked is what students really want from their education.

Now at one level, this question is kind of trite.  We know from 15 years of surveys from the Canadian Undergraduate Survey Consortium that students go to university: i) to get better jobs; ii) because they like learning about a particular field; and also, iii) to make friends, and enjoy the “university experience”.

Where it gets a little trickier, however, is when you break this down by particular fields of study.  With most faculties, there tends to be a positive reason to attend.  However, when it comes to Arts, enrolment is often seen as a fall-back option – it’s something you do if you don’t have concrete goals, or if you can’t do anything else.  Now, Arts faculties tend to take the positive here, and spin this as students wanting to “find themselves”. But in deploying this bit of spin, Arts faculties often end up heading in the wrong direction.

One of the problems here is that the notion of students “finding themselves” (not a term students themselves use) is not as straightforward as many think. Broadly, there are three possible definitions.  The first situates “finding yourself” in academic terms: by exploring a lot of different academic options, a student finds something that interests her/him, and becomes academically engaged.  This is one of the reasons that Arts faculties are built around a smorgasbord model, which lets students “taste” as many things as possible, and hence “discover” themselves.

But that’s not the only possible definition of “finding oneself”.  There is another option, in which students essentially view PSE as a cooling out period where they can “find” what they want to do, in a vocational sense.  Yes, they are taking courses, but since they recognize that Arts courses don’t lead directly to employment, they are more or less marking time while they discover how to make their way in the employment world, and think about how and where they want to live.  Then there is a third, slightly different take, in which students view “finding themselves” as the process by which they acquire transversal skills, and the skills of personal effectiveness needed to be successful adults.  School is something they do while they are learning these skills, often for little reason other than that going to school is something they have always done, and in many cases are expected to do.

Though all of these interpretations of “finding yourself” have some currency among students, it probably shouldn’t come as a surprise to learn that the one about “finding yourself” being a voyage of academic discovery is, in fact, the least frequently mentioned by incoming students.  Now, maybe they come around to this view later on, but it is not high on the list of reasons they attend in the first place.  To the extent that they have specific academic interests as a reason for enrolling in Arts, they tend to be just that: specific – they want to study Drama, or History, or whatever.

Which raises two questions.  First: if this is true, what’s the benefit of Arts faculties maintaining such a wide breadth of requirements?  And second: why aren’t Arts faculties explicitly building more transversal-skills elements into their programs?  Presumably, there would be a significant recruitment advantage in doing so.  Someone should give it a whirl.

November 18

The Radical Implications of David Turpin’s Installation Speech

David Turpin was installed as President at the University of Alberta earlier this week.  His inaugural speech was good.  Very good.  Read a shortened version of it here.

(Full disclosure: I spoke at a leadership function at the University of Alberta in August, for which I received a fee.  The University has also recently purchased two of our syndicated research products.  Make of that what you wish.)

The speech starts out with what I would call some standard defences of the university, which any president would give: we seek truth and knowledge, we innovate, and we create jobs, yadda yadda.  Where it gets interesting is where he starts his appeal to the provincial government.  Let me quote what I think are the key bits:

“Our task continues to be to ask unexpected questions, seek truth and knowledge, and help society define, understand and frame its challenges. Our goal for the future is to find new and innovative ways to mobilize our excellence in research and teaching to help municipal, provincial, national and international communities address these challenges.”

Note: the truth/knowledge tasks “continue”, but now we’re adding a “goal” of mobilizing the university’s talents to address “challenges”.  And these are not just abstract challenges.  Turpin gets very, very specific here:

To our municipal partners: We will work with you to address your major goals on poverty reduction, homelessness, downtown revitalization, infrastructure renewal and transportation.

To our provincial partners: We will work with you to strengthen a post-secondary education system that serves the needs of all Alberta’s learners. We will provide our students the educational experience they need to seed, fuel and drive social, cultural and economic diversification. We will advance social justice, leading reconciliation with our First Nations and protection for minorities. We will conduct research to sustainably develop Alberta’s wealth of natural resources and improve Albertans’ health and wellness.

These are really specific promises.  If I’m a municipal or provincial official, what I hear from this is “Cool! U of A is going to be my think tank!  It’s going to put expertise at my disposal in areas like poverty reduction and economic diversification”.  That may or may not be Turpin’s intent, but it’s what they will hear.  And that’s well beyond the traditional role of a university in Canada, and in some ways beyond even some of the “state service” commitments that exist in US Land Grant institutions.  Sure, ever since von Humboldt, universities have been there to serve and strengthen the state, but I think the way Turpin is articulating this is genuinely new.

Now, no doubt the University has enormous resources to help achieve all of these things.  But those resources are mostly faculty members and grad students.  And while the university can ask them nicely to help folks at city hall/the legislature when they come calling, the question is: what’s in it for the profs and grad students to drop what they’re doing and go help the city/province (especially if they feel they have better things to do)?  Is the expectation that staff will do this out of a collective desire to contribute to their communities, or will incentives be put in place?

This goes deep to the heart of a university’s research mission.  At research universities like U of A, tenure and promotion are based mostly on publication records, and time is supposed to be spent 40-40-20 on teaching, research, and service.  But if your provost walks down the hall and says “hey, I just met with a couple of MLAs, and they’re hoping they can borrow your expertise for a couple of weeks”, do those expectations now change?  Will tenure/promotion committees actually take into account work done for government as equivalent to work done for an academic publication?

(For those of you not native to academe, it may seem amazing that research done for public policy, something that changes the way government makes decisions in a certain area, is not rated as highly for tenure/promotion as publishing things in journals that on average are read by a handful of people.  It is amazing, yes.  But true more often than not.)

If the answer to those questions is no, then I don’t think this initiative will go far.  But if the answer is yes, then Turpin is literally talking about a new kind of university, one that is prepared to sacrifice at least some of the prestige associated with being a “world-class university” with a laser-like focus on publication outputs, in order to contribute to its community in very concrete ways.  It’s not a reduction in research intensity, but it is a different type of research intensity.

The risk, of course, is that this new type of intensity won’t come with as many dollars attached.  I hope that’s not the case.  But in any event, this could be quite an exciting experiment.  One definitely worth keeping an eye on.

November 17

Curious Data on Teaching Loads in Ontario

Back in 2006, university Presidents got so mad at Maclean’s that they stopped providing data to the publication.  Recognizing that this might create the impression that they had something to hide, they developed something called “Common University Data Ontario” (CUDO) to provide the public with a number of important quantitative descriptors of each university.  In theory, this data is of better quality and more reliable than the stuff they used to give Maclean’s.

One of the data elements in CUDO has to do with teaching and class size.  There’s a table for each university, which shows the distribution of class sizes in each “year” (1st, 2nd, 3rd, 4th): below 30, 31-60, 61-90, 91-150, 151-250, and over 250.  The table is done twice, once including just “classes”, and again with slightly different cut-points that include “subsections” as well (things like laboratories and course sections).  I was picking through this data when I realised it could be used to take a crude look at teaching loads, because the same CUDO data also provides the number of full-time professors at each institution.  Basically, instead of looking at the distribution of classes, all you have to do is add up the actual number of undergraduate classes offered, divide it by the number of professors, and you get the number of courses per professor.  That’s not a teaching load per se, because many courses are taught by sessionals, and hell will freeze over before institutions release data on that subject.  Thus, any “courses per professor” figure derived from this exercise is going to overstate the amount of undergraduate teaching being done by full-time profs.
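
For the mechanically minded, here’s a minimal sketch of that arithmetic in Python.  The class counts and professor count below are made up for illustration; the real figures come from each university’s CUDO class-size tables.

```python
# A minimal sketch of the courses-per-professor arithmetic described above.
# All numbers here are hypothetical stand-ins for actual CUDO figures.

# Hypothetical counts of undergraduate classes by size band for one institution
classes_by_size_band = {
    "under_30": 900,
    "31_60": 450,
    "61_90": 200,
    "91_150": 120,
    "151_250": 60,
    "over_250": 30,
}

full_time_professors = 700  # also reported in CUDO (hypothetical here)

total_classes = sum(classes_by_size_band.values())
courses_per_professor = total_classes / full_time_professors

print(f"Total undergraduate classes: {total_classes}")
print(f"Classes per full-time professor (per year): {courses_per_professor:.2f}")
# Note: this overstates full-time professors' loads, because some of these
# classes are actually taught by sessional instructors.
```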

Below is a list of Ontario universities, arranged in ascending order of the number of undergraduate courses per full-time professor.  It also shows the number of courses per professor if all subsections are included.  At most institutions, most subsections are not handled by full-time professors, but some are; so, assuming the underlying numbers are real, a “true” measure of courses per professor would lie somewhere in between the two.  And remember, these are classes per year, not per term.

[Table: Classes Per Professor, Ontario, 2013]

Yes, you’re reading that right.  According to universities’ own data, on average, professors are teaching just under two and a half classes per year, or a little over one course per semester.  At Toronto, McMaster, and Windsor, the average is less than one course per semester.  If you include subsections, the figure rises to three courses per semester, but of course as we know subsections aren’t usually led by professors.   And, let me just say this again, because we are not accounting for classes taught by sessionals, these are all overstatements of course loads.

Now these would be pretty scandalous numbers if they were measuring something real.  But I think it’s pretty clear that they are not.  Teaching loads at Nipissing are not five times higher than they are at Windsor; they are not three and a half times higher at Guelph than at Toronto.  They’re just not.  And nor is the use of sessional faculty quite so different from one institution to another as to produce these anomalies.  The only other explanation is that there is something wrong with the data.

The problem is: this is a pretty simple ratio; it’s just professors and classes.  The numbers of professors reported by each institution look about right to me, so there must be something odd about the way that most institutions – Trent, Lakehead, Guelph, and Nipissing perhaps excepted – are counting classes.  To put that another way, although it’s labelled “common data”, it probably isn’t.  Certainly, I know of at least one university where the class-size data used within the institution explicitly rejects the CUDO definitions (that is, they produce one set of figures for CUDO and another for internal use because senior management thinks the CUDO definitions are nonsense).

Basically, you have to pick an interpretation here: either teaching loads are much, much lower than we thought, or there is something seriously wrong with the CUDO data used to show class sizes.  For what it’s worth, my money is on it being more column B than column A.  But that’s scarcely better: if there is a problem with this data, what other CUDO data might be similarly problematic?  What’s the point of CUDO if the data is not in fact common?

It would be good if someone associated with the CUDO project could clear this up.  If anyone wants to try, I can give them this space for a day to offer a response.  But it had better be good, because this data is deeply, deeply weird.

November 16

An Interesting but Irritating Report on Graduate Overqualification

On Thursday, the Office of the Parliamentary Budget Officer (PBO) released a report on the state of the Canadian labour market.  It’s one of those things the PBO does because the state of the labour market drives the federal budget, to some extent.  But in this report, the PBO decided to do something different: it decided to look at the state of the labour market from the point of view of recent graduates, and specifically at whether graduates are “overqualified” for their jobs.

The methodology was relatively simple: using the Labour Force Survey, determine the National Occupation Code (NOC) for every employed person between the ages of 25 and 34.  Since NOCs are classified according to the level of education they are deemed to require, it’s simple to compare each person’s level of education to the NOC of the job they are in, and on that basis decide whether someone is “overqualified”, “underqualified” or “rightly qualified”.
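
To make the matching logic concrete, here is a stripped-down sketch in Python.  The education categories, NOC skill-level codes, and example records are simplified assumptions made for illustration; the PBO’s actual analysis works from Labour Force Survey microdata and the full NOC classification.

```python
# A simplified illustration of education-to-occupation matching.
# The rankings and example records below are assumptions, not the PBO's data.

# Rank education levels and NOC-required levels on a common ordinal scale
EDUCATION_RANK = {"high_school": 1, "college": 2, "university": 3}
NOC_REQUIRED_RANK = {"C": 1, "B": 2, "A": 3}  # simplified NOC skill levels

def qualification_status(education: str, noc_skill_level: str) -> str:
    """Compare a worker's education to the level their occupation is deemed to require."""
    has = EDUCATION_RANK[education]
    needs = NOC_REQUIRED_RANK[noc_skill_level]
    if has > needs:
        return "overqualified"
    if has < needs:
        return "underqualified"
    return "rightly qualified"

# Illustrative records for employed 25-34 year-olds
workers = [
    {"education": "university", "noc_skill_level": "B"},
    {"education": "college", "noc_skill_level": "B"},
    {"education": "university", "noc_skill_level": "A"},
]

for w in workers:
    print(w, "->", qualification_status(w["education"], w["noc_skill_level"]))
```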

So here’s what the PBO found: over the past decade or so, among university graduates, the rate of overqualification is rising, and the rate of “rightly qualified” graduates is falling.  Among college graduates, the reverse is true.  Interesting, but as it turns out not quite the whole story.

Now, before I get into a read of the data, a small aside: take a look at the way the PBO chose to portray the data on university graduates.

[Figure 1: Weaselly PBO Way of Presenting Data on Overqualification Among 25-34 Year Old University Graduates]

Wow!  Startling reversal, right?  Wrong.  Take a look at the weaselly double Y-axis.  Here’s what the same data looks like if you plot it on a single axis:

[Figure 2: Same Data on University Graduate Overqualification, Presented in Non-Weaselly Fashion]

See?  A slightly less sensational story.  Clearly, someone in PBO wanted to spice up the narrative a bit, and did so by making a pretty unforgivable graph, one designed to overstate the fundamental situation.  Very poor form from the PBO.
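
If you want to see the trick for yourself, here’s a small Python/matplotlib sketch – with made-up numbers, not the PBO’s data – showing how twin Y-axes can make the same two gently diverging series look like a dramatic crossover.

```python
# Illustration of how a double Y-axis exaggerates a trend.
# The two series below are hypothetical, not the PBO's figures.
import matplotlib.pyplot as plt

years = list(range(1999, 2014))
rightly_qualified = [52 - 0.2 * i for i in range(len(years))]  # hypothetical %
overqualified = [38 + 0.2 * i for i in range(len(years))]      # hypothetical %

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Left: twin axes; each series gets its own scale, so small changes look dramatic
ax1.plot(years, rightly_qualified, color="tab:blue")
ax1_twin = ax1.twinx()
ax1_twin.plot(years, overqualified, color="tab:red")
ax1.set_title("Double Y-axis (exaggerated)")

# Right: one shared axis from 0 to 100 keeps the changes in proportion
ax2.plot(years, rightly_qualified, color="tab:blue", label="rightly qualified")
ax2.plot(years, overqualified, color="tab:red", label="overqualified")
ax2.set_ylim(0, 100)
ax2.set_title("Single Y-axis (in proportion)")
ax2.legend()

plt.tight_layout()
plt.show()
```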

Anyways, what should we make of this change in university graduates’ fortunes?  Well, remember that there was a massive upswing in university access starting at the tail end of the 1990s.  This meant a huge change in attainment rates over the decade.

[Figure 3: Attainment Rates Among 25-34 Year-Olds, Canada]

What this upswing in the university attainment rate meant was that there were a heck of a lot more university graduates in the market in 2013 than there were, say, a decade earlier.  In fact, 540,000 more, on a base of just over a million – a 53% increase between 1999 and 2013.  Though the PBO doesn’t mention it in the report, it’s nevertheless an important background fact.  Indeed, it likely explains a lot of the pattern change we are seeing.

To see how important that is, let’s look at this in terms of numbers rather than percentages.

[Figure 4: Numbers of Rightly-Qualified and Overqualified 25-34 Year Old University Graduates, 1999-2013]

In fact, the number of rightly-qualified graduates is up substantially over the last decade, and it has been increasing at almost (though not quite) as fast a rate as the number of “overqualified” graduates.  For comparison, here’s the situation in colleges:

[Figure 5: Numbers of Rightly-Qualified and Overqualified 25-34 Year Old College Graduates, 1999-2013]

As advertised, there’s no question that the trend in college outcomes looks better than the one for universities.  Partly that’s because of improvements in colleges’ offerings, and partly it has to do with the run-up in commodity prices, which made college credentials more valuable (remember the Green-Foley papers? Good times).

What should you take from all of this?  If nothing else, don’t forget that comparing university outcomes over time is hard because of the changing size and composition of the student body.  Remember: the median student today wouldn’t have made it into university 25 years ago.  Average outcomes were always likely to fall somewhat, both because more graduates means more competition for the same jobs, and also because the average academic ability of new entrants is somewhat lower.

It would be interesting, for instance, to see these PBO results while holding high school grades constant – then you’d be able to tell whether falling rates of “rightly-qualified” graduates were due to a changing economy/less relevant education, or to a changing student body.  But since we can’t, all one can really say about the PBO report is: don’t jump to conclusions.

Especially on the basis of those godawful graphs.

November 13

A Ministry of Talent

My friend and colleague, Jamie Merisotis of the Lumina Foundation in the United States, recently wrote a book called America Needs Talent.  It’s a short, popularly-oriented account of how human capital drives the economy, and what countries and cities can do to acquire it.  One of the suggestions he makes is for a federal “Department of Talent”, which is intriguing as a thought experiment, if nothing else.  So let’s explore that idea for a moment.

To begin, let’s be clear about what Merisotis means by “talent”.  He’s painting with a very broad brush here.  Yes, he means “top talent” in the sense of highly qualified scientists, entrepreneurs, etc.,  but he’s also talking about the talents one acquires through post-secondary education: everything from car repair to microbiology.  So when he says “Department of Talent”, what he’s really talking about is a “Department of Skills”, just with a much less vocational sheen.

Most of Merisotis’ argument could be transported to Canada (albeit in somewhat muted form, given the differing nature of the two countries’ federal responsibilities).  The primary goal of any government is (or should be) to raise levels of productivity.  No increase in productivity, no increase in standard of living (or standard of public services).  So there’s always a need to come up with better ways to think through how a country can raise its overall skills profile.

Yet at the federal level, we split up that responsibility among four ministries (warning: I have not yet bothered to learn the snappy new names the Trudeau government has bestowed upon departments).  In Canada, we have an Industry/Science Ministry, which plays a huge role in developing scientific careers, and fostering skills/knowledge through science-business relationships.  We have a Human Resources Ministry (now on its sixth name in twelve years), which supports training and learning in a variety of guises.  We have an Immigration Ministry in charge of bringing people into the country, which sort of co-ordinates with Human Resources in working out labour market needs, but not necessarily in ways that maximize skill acquisition.  And finally, we have a Foreign Ministry, which is increasingly interested in attracting foreign students, and that has knock-on effects for immigration down the road.

And so, genuine question: why not hive off the bits of all of those ministries that focus on talent development and acquisition, and stick them into a single ministry?  Why not create an organization with a singular focus on making our talent pool better, across all industries?  It’s not impossible; in fact, Saskatchewan already tried it for several years with a Ministry of Advanced Education, Employment and Immigration (an experiment recently unwound for reasons that remain obscure).

I think the quick and simple answer here is that changing bureaucratic chairs is not the same as changing actual policy orientation, and still less the same as changing results.  In reality, you might create a lot of churn, with very few actual results.  Still, the thought experiment is a useful one: who is in charge of ensuring that Canada always has the best talent at its disposal, in every field of endeavour?  Who makes sure that our education and immigration policies are complementary?  Given the nature of our federation, this is always going to be a distributed function, but it seems to me that we don’t actually pose questions this way.  We talk about programs (how can we improve student loans?) or institutions (how can we increase the number of community college seats?), but only rarely do we talk about what the final talent pool will look like.

So, while a Ministry of Talent might not make a lot of sense, there’s still much to be gained by talking about ends rather than means.  A “Talent Agenda”, maybe?  Something to think about, anyway.

November 12

Explaining the #FeesMustFall Movement

One of the more interesting policy debacles in higher education this year has been the fracas over tuition fees in South Africa, which has led to what some are calling the biggest set of anti-government protests since the end of apartheid.  Here’s what you need to know:

The protests began when universities announced fee hikes for the coming year.  On average, the fee hikes were in the 6% range, which was relatively modest given a persistent inflation rate of just under 5% and additional cost pressures from a falling rand (the currency now trades at about 14 to the US dollar, compared to 8 three years ago).  This kind of increase is not unusual in South Africa, but for a variety of reasons, this year the increases brought students out into the streets in very large numbers.

There were, near as I can tell, three factors at work.  The first is generalized discontent with the ANC government (animosity that is by no means restricted to students).  Though the party can still win over 50% of the vote in elections, a lot of that support is residual loyalty for its fight against apartheid rather than approval of current policies; and since today’s students were mostly born after Mandela was released from prison, they feel less loyalty to the party than do older South Africans.  Economic growth is fading (partly due to falling commodity prices, partly due to government incompetence, particularly on energy and power generation), which means no progress on persistently high unemployment among blacks.  And if there is one file where the government has underperformed the most over the past twenty years, it’s education.  The problem is worse in K-12 than in universities (though colleges are a right mess), but the repeated failure to increase higher education expenditure sufficiently has been a persistent sore point.

The second issue relates to student aid.  Though the government has massively increased outlays, it has also massively increased loan losses.  Up until about seven years ago, the National Student Financial Aid Scheme (NSFAS) had the continent’s best record of loan repayment (about 60%).  Then, the government decided – on what many regard as quite spurious grounds – to make it harder for NSFAS to collect the loans, and repayment plummeted to about 20%.  This was good news for graduates, of course: more money for them; but it effectively raised the price of increasing access.  One of the casualties was the inability to expand middle-class families’ access to loans, and that group now feels very squeezed.

The third factor was an uptick in student militancy this past March with the #RhodesMustFall campaign.  This started at the University of Cape Town, where students wanted to remove a statue of the arch-colonialist Cecil Rhodes (they succeeded).  It morphed into a wider set of protests about how far universities have come in transforming themselves since 1994, in particular with respect to the advancement of black academics.

So with all this kindling, the relatively small spark of what vice-chancellors thought was a run-of-the-mill tuition increase turned into a major conflagration, which went under the heading #FeesMustFall (a play on the earlier Rhodes campaign).  At first the government tried to straight-arm the students, with the Higher Education minister (and Communist party chief) Blade Nzimande maladroitly claiming that he would start his own #StudentsMustFall campaign.  When that didn’t work, the ANC began trying to co-opt the protest, claiming students’ views as its own.  Eventually the protests grew so large that President Zuma froze all fees for a year, and compensated institutions to the tune of 80% of the cost of the freeze.  But the ANC has also taken steps to give itself unprecedented authority to intrude massively on universities’ autonomy, so that it can more directly control costs and remove inconvenient administrators.

The fee freeze took some of the sting out of the protests, but it also emboldened some protestors who want to see South Africa move to a free-fee system.  Given that participation rates for whites are between three and four times higher than for blacks, this is a curiously regressive idea (and may explain why whites were seemingly so much more prominent in the #feesmustfall protests than in those for #rhodesmustfall).  The head of South Africa’s Centre for Higher Education Trust, Nico Cloete, skewered the idea in a University World News column this weekend (read it here; it’s long but very good), saying rightly that in a society as unequal as South Africa, “affordable higher education for all” is a necessary goal, but “free higher education for all” is morally wrong.

Which is dead on, frankly.  Fix student aid so the poor get more grant aid and the middle-class get more loan aid, sure.  More money for universities to maintain quality?  Sure (South Africa has an amazing set of universities for a middle-income country, but that’s at risk over the long-term).  But spending more money to make it free for the already highly privileged?  South Africa can and should do better than that.

November 11

Times You Wish There Was a Word Other Than Research

There is something about research in modern languages (or English, as we used to call it) that sets many people’s teeth on edge, but usually for the wrong reasons.

Let’s go back a few months to Congress, specifically to an article Margaret Wente wrote in which she teed off on a paper called “Sexed-up Paratext: The Moral Function of Breasts in 1940s Canadian Pulp Science Fiction”.  Her point was mostly “whatever happened to the great texts?”  Which, you know: who cares?  The canon is overrated, and the transversal skills that matter can be taught through many different types of materials.

But she hit a nerve by articulating a point about research in the humanities, and why the public feels uneasy about funding them.  Part of it is optics, and what looks to outsiders like childish delight in mildly titillating or “transgressive” titles.  But mostly, it just doesn’t “look like” what most people think of as research.  It’s not advancing our understanding of the universe, and it’s not making people healthier, so what’s it doing other than helping fuel career progression within academia?  And that’s not a judgement at all on what’s in the paper itself (I haven’t read the paper, and I can’t imagine Wente did either); even if it were the best paper at Congress, people who defend the humanities wouldn’t likely point to a paper whose title contains the words “the Moral Function of Breasts” as a way to showcase the value of humanities research.  The title just screams self-indulgence.

And yet – as a Twitter colleague pointed out at the time – whoever wrote this piece is probably a great teacher.  With this kind of work, they can show the historical roots of things like sexuality in comics, which is highly relevant to modern issues like Gamergate.  If we want teachers to focus on material that is relevant and can engage students, and if we really want scholarly activity to inform teaching, surely this is exactly the kind of thing that should be encouraged.  As scholarly activity, this is in a completely different – and much, much better – category than, say, the colonoscopic post-modernist theorizing that was so memorably skewered during the Sokal Affair, because here you can clearly see the benefits for teaching and learning.

But is it “research”?

Academe doesn’t like to talk about this much because, you know, you stick to your discipline and I’ll stick to mine.  You can push the point if you want and claim that all research is similar because, regardless of discipline, research is an exercise in pattern recognition.  There are, however, some fundamental differences between what the sciences call research and what the humanities call research.  In the sciences, people work to uncover laws of nature; in the social sciences, people (on a good day) are working on laws (or at least patterns) of human behaviour and interaction.  In the humanities, especially English/Modern Languages, what’s essentially going on is narrative-building.  That’s not to say that narratives are unimportant, nor that the construction of good narrative is easier than other forms of scholarly work.  But it is not “discovery” in the way that research is in other disciplines.

And here’s the thing: when the public pays for research, it thinks it’s paying for discovery, not narrative-building.  In this sense, Wente taps into something genuine in the zeitgeist; namely, the public sense that “we’re being duped into paying for something to which we didn’t agree”.  And as a result, all research comes under suspicion.  This is unfortunate: we’re judging two separate concepts of scholarly work by a single standard, and both end up being found suspect because one of them is mislabeled.

To be clear: I am not at all suggesting that one of these activities is superior to the other.  I am suggesting that they are different in nature and impact.  For one thing, the most advanced scientific research is mostly unintelligible to lower-year undergraduates, whereas some of the best narrative work is actually – much like Sexed-up Paratext – intended precisely to render some key academic concepts more accessible to a broader audience.

It is precisely for this reason that we really ought to have two separate words to describe the two sets of activities.  The problem is finding one that doesn’t create an implicit hierarchy between the two.  I think we might be stuck with the status quo.  But I wish we weren’t.

November 10

An Update on England’s Teaching Excellence Framework

Last week, the UK Department for Business, Innovation and Skills (which is responsible for higher education) released a green paper on higher ed.  It covered a lot of ground, most of which need not detain us here; I think I have a reasonable grasp of my readers’ interests, and my guess is that the number of you who have serious views about whether the Office for Fair Access should be merged into a new Office for Students, along with the Higher Education Funding Council for England, is vanishingly small (hi, Andrew!).  But it’s worth a quick peek into this document because it puts a bit more meat on the bones of that intriguing notion of a Teaching Excellence Framework.

You may remember that back in the summer I reviewed the announcement of a “Teaching Excellence Framework” wherein institutions that did well on a series of teaching metrics would be rewarded with the ability to charge higher tuition fees.  The question at the time was: what metrics would be used?  Well, the green paper is meant to be a basis for consultation, so we shouldn’t take this as final, but for the moment the leading candidates for metrics seem to be: i) post-graduation employment; ii) retention rates; and, iii) student satisfaction indicators.

Ludicrous?  Well, maybe.  At the undergraduate level, satisfaction tends to correlate with engagement, which at some vague level correlates with retention, so there’s sort of a case here – or there would be, if they weren’t already measuring retention.  Retention is not a silly outcome measure either, provided you can: a) control for entering grades (otherwise retention is simply a function of selectivity); and b) figure out how to handle transfer students.  Unfortunately, it’s not clear from the document that either of these things has been thought through in any detail.

And as for using post-graduation employment?  Again, it’s not necessarily a terrible idea.  However, first: the regional distribution of graduate destinations matters a lot in a country where the capital city is so much richer than the rest of the country.  Second: the mantra that “what you study matters more than where you study” applies in the UK, too – measuring success by graduate incomes only makes sense if you control for the types of degrees offered by each institution.  Third: the UK only looks at graduate incomes six months after graduation.  Presumably, a longer survey window is possible (Canada surveys graduates at three years, for instance), but the only thing on the table at the moment is the current laughably short period.

So, there’s clearly a host of problems with the measures.  But perhaps even more troubling is what is on offer to institutions that do “well” on these measures.  The idea was that institutions would pay attention to “teaching” (or whatever the aforementioned load of indicators actually measures) if doing so allowed them to raise tuition above the current cap of £9,000.  However, according to the green paper, the maximum by which an institution will be allowed to increase fees each year is the rate of inflation.  Yet at the moment CPI is negative, which suggests this might not be much of an incentive.  Even if inflation returns to 1% or so, one has a hard time imagining this being enough of a carrot for all institutions to play along.

In sum, this is not a genuine attempt to find ways to encourage better teaching; rather, it uses a grab-bag of indicators to try to differentiate the sector into “better” and “worse” actors, and in so doing to create more “signals of quality” to influence student decision-making.  Why does the government want to do this?  Because it desperately wants higher education to work like a “normal” market, and this kind of differentiation helps it rationalize some of its weirder ideas about how the system should be run (the green paper also devotes quite a bit of space to market entry, which is code for letting private providers become universities with less oversight, as well as to market exit, which is code for letting universities fail).

Though the idea of putting carrots in place to encourage better teaching has value, an effective policy would require a lot more hard thinking about metrics than the UK government appears willing to do.  As it stands, this policy is a dud.

November 09

Yukon College’s Difficult Path to University Status

Last week, Yukon Education Minister Doug Graham announced that the territory was going to change the name of Yukon College to Yukon University.  The College then proceeded to state that it would launch new degree programs and seek membership in Universities Canada in 2017.

Well, now.  How is that going to work exactly?

Universities Canada has some pretty clear guidelines about membership.  Point 4 says that a prospective member must have “… as its core teaching mission the provision of education of university standard with the majority of its programs at that level”.  At the moment, only five of Yukon College’s fifty-odd programs are at degree level (Social Work, Public Administration, Environmental and Conservation Sciences, Education – Yukon Native Teacher, and Circumpolar Studies). Now, presumably it could upgrade some of its career & tech programming into full degree programs to change the balance a bit, but it’s not clear that would be to students’ benefit: there are good reasons – cost among them – to keep programs like business administration, early childhood education, and various technologist programs at two years rather than four.

One possible solution would be to split the administration of the college and the university, so that the two could share physical space, infrastructure, and back-office functions, while at the same time having separate management and programming.  This might get Yukon College off one hook, but it would quickly get snagged by another.  Universities Canada also requires prospective members to have 500 FTEs for at least two years before joining.

In terms of student numbers, Yukon College looks like this:

[Table: Yukon College FTE Students by Stream, 2012-13 (source: Yukon College Annual Report 2013-14; FTE = FT + PT/3.5)]

Degree and transfer programs together only make up 253 FTEs, or about a third of the college’s 748 FTE students in credit programming.  One would need some really big increases in student numbers to change this.  But Yukon College would likely have trouble moving the needle much.  Here’s the history of Yukon College enrolments:

[Chart: Total FTE Enrolments at Yukon College, 2007-08 to 2014-15]

Numbers at the college have never gone over 850, and the highest two-year average is about 820.  Even assuming you could nudge university numbers from a third to a half of total enrolments, that would still leave a newly-separated university nearly 100 students shy of the required 500.
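
For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python using the figures quoted above.  The FTE formula comes from the table note; the “half of enrolments” scenario is, of course, hypothetical.

```python
# Back-of-the-envelope check of the Yukon College numbers discussed above.

def fte(full_time: int, part_time: int) -> float:
    """Yukon College's full-time-equivalent formula: FTE = FT + PT/3.5."""
    return full_time + part_time / 3.5

# e.g. a hypothetical 600 full-time and 525 part-time students would count as:
print(f"Example FTE count: {fte(600, 525):.0f}")  # 750

degree_and_transfer_ftes = 253   # from the table above
credit_program_ftes = 748        # from the table above
print(f"Degree/transfer share of credit FTEs: {degree_and_transfer_ftes / credit_program_ftes:.0%}")

highest_two_year_average = 820   # from the enrolment history above
required_for_membership = 500    # Universities Canada threshold cited above

# Even if university-level programs grew to half of total enrolments...
hypothetical_university_ftes = highest_two_year_average / 2
shortfall = required_for_membership - hypothetical_university_ftes
print(f"Hypothetical university FTEs: {hypothetical_university_ftes:.0f}")
print(f"Still shy of the 500-FTE requirement by about {shortfall:.0f} students")
```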

Now, there’s nothing stopping a Yukon institution from using the name “university”, even if Universities Canada doesn’t agree.  Quest University seems to be doing fine without Universities Canada membership, for instance.  But Universities Canada is still the closest thing Canada has to an accreditation system, and so being on the outside would hurt.

But here’s a wild-and-crazy suggestion:  Yukon wants a university; Nunavut wants a university; presumably, at some point, the Northwest Territories will glom onto this idea and want a university, too.  Any one of them, individually, would have a hard go of making a serious university work.  But together, they might have a shot.

Of course, universities are to some extent local vanity projects.  I’m sure each territory would prefer its own university rather than share one.  But these things cost money, especially at low volume.   We could have three weak northern universities, or we could have one serious University of the North.  Territories need to choose carefully, here.

November 06

What Canadians Think About Universities, and Where Canadian Universities Want To Go

A couple of quick notes about two interesting things from Universities Canada this week.

The first is the release of some public opinion polling, which they commissioned in the spring, regarding universities and other forms of higher education.  You can see the whole thing here, but I want to highlight a couple of slides, in particular.

The first slide is this one:

[Chart: Canadians’ impressions of universities, colleges, and other post-secondary institutions]

It seems Canadians are overwhelmingly positive about most post-secondary institutions (though Quebecers clearly have a few doubts about CEGEPs).  Somewhat perplexingly, UnivCan also felt the need to test Canadians’ opinions about universities in Europe (do Canadians really have deep feelings about French grands écoles, German fachhochschulen, and Romanian politehnici?).  Mostly, though, this is all to the good.

But the more interesting set of answers is this one:

[Chart: Canadians’ views of universities and the need for change]

Turns out Canadians think their universities are world-class and practical, and produce valuable research… but that they also really need to change.  Which seems about right to me.  However, one wishes there might have been a follow-up: what kind of change is needed, exactly?

Oftentimes, these kinds of dissonant results (you’re great/please change) give the poll-reader a lot of room to cherry-pick.  Is UnivCan doing this?  Well, maybe.  Take a look at the new “Commitments to Canadians” the Presidents collectively issued this week.  They commit themselves to:

  • Equip all students with the skills and knowledge they need to flourish in work and life, empowering them to contribute to Canada’s economic, social, and intellectual success.
  • Pursue excellence in all aspects of learning, discovery, and community engagement.
  • Deliver a broad range of enriched learning experiences.
  • Put our best minds to the most pressing problems – whether global, national, regional, or local.
  • Help build a stronger Canada through collaboration and partnerships with the private sector, communities, government, and other educational institutions in Canada and around the world.

OK, so some of this is yadda yadda, whatever kind of stuff (“pursue excellence in everything we do” is utterly devoid of meaning).  But an emphasis on partnerships is good, as is the commitment to preparing students for work and life – in that order.  Something stronger on internships and co-ops would have been better: both UC Chair Elizabeth Cannon and UC President Paul Davidson have spoken a lot about co-ops in recent speeches, but a specific commitment to them is missing from the actual statement.  That’s too bad: co-ops and internships have the potential to be a genuine and unique value proposition for Canadian higher education; our universities do a lot more of this than those in other developed countries.  And pretty much everyone loves them, bar the sniffy types who disdain them as “mere training”.

The issue is follow-through, of course, and Lord knows shifting institutional cultures ain’t easy.  But one gets the sense that Canadian universities are absorbing the change message, and acting upon it.  That’s good news.

Have a good weekend.
