HESA

Higher Education Strategy Associates


March 13

The Alternative to the End of College (Part 3)

So, if Kevin Carey is pretty much dead on about the weaknesses of current universities, and mostly wrong about where things go from here, how else might universities change over the next couple of decades?

Let’s start with the key points:

  • Money pressures aren’t going to ease up.  The cost disease will always be with us;
  • Professors want to research, and they don’t want to do it in Soviet-style academies, divorced from teaching.  They’ll fight hard for the present system;
  • Higher education is, to a significant extent, a Veblen Good.  It is thus, to a considerable degree, impervious to disruption;
  • Students don’t go to school just for the teaching.  They go for the experience.  And the networks.  And the personal contact.  And the occasional piece of praise.  Some of this can be had online; but it tends to be more meaningful and lasting if accompanied by something face-to-face;
  • The value of an established credential is that it relieves employers of the need to think too hard about the worth of an applicant.  For this reason, it’s really hard for a new credential to displace an established credential;
  • Employers are looking for universities to produce graduates who have more soft skills – mainly relating to teamwork and customer-facing skills.  Students know this – and they want an education that will help provide this.

Any future one can imagine will need to meet these parameters.  So, let’s extrapolate a little bit from here.

  • Students will pay more for university if asked.  They may not like it, but they will do it.  This will eventually ease some of the cost pressure.  As a result, the status quo re: day-to-day practices will be easier to maintain;
  • That said, absent a frontal assault by government (which I think unlikely), tenured research track faculty are likely to hang around and get more expensive.  So there will still be cost-pressure for change;
  • Professional pressures around research output mean that professors will, by and large, abandon lower-year courses (to the extent they haven’t already).  Something has to replace them;
  • MOOCs – or something like them – are an obvious way to cut costs here.  Carey notes that although there are hundreds of thousands of different courses offered across the United States, the majority of credits actually awarded come from just 5,000 or so courses, which are pretty standard across institutions (e.g. American History 100, Accounting 200, etc.).  To some significant degree, these can be standardized.  That’s not to say there need only be a single course in each of these 5,000 areas: monocultures are bad.  But in the words of one Harvard professor Carey interviewed, there probably don’t need to be more than half a dozen, either.  Delivered at sufficient volume, these future-MOOCs will not just feature top lecturers, but also will have massively better support packages and learning design.  Institutions could still localize and personalize them by offering their own tutorial support and testing of the material covered in these future-MOOCs, and then award their own credit for them.  It’s not obvious the outcomes of this kind of arrangement would be worse than they are now: the lectures would likely be better, the scope for improvement in inter-institutional mobility and credit transfer is enormous, and the more nightmarish scenarios around MOOCs could be avoided;
  • Pressure from students and employers is going to lead to significant re-designs of programs around learning outcomes – and specifically around issues of teamwork and problem-solving.  The key change will be in how to integrate credible assessments of these qualities into the existing structure of courses and degrees.  There will likely be a lot of experimentation; certainly, I think we’re on the verge of the most serious re-think of the structure of credits and degrees since the 1960s;
  • In tandem, various forms of work-based learning are going to keep expanding.  Co-ops and internships will grow.  Practical upper-year courses where students get to tackle real-world problems will become much more common.  Some new types of validation – maybe not badges but something different from a simple diploma – will arise to help document achievement in these areas.

In other words, there will likely be some big changes in undergraduate programming, some due to technology, some due to cost pressures, and some due to demands from students and employers.  These changes will weaken the importance of the credit hour and reduce the centrality of academic discipline in academic life.  They will make university-based learning less reliant on classroom teaching as we currently know it.

But it will not be the End of College.

*Note: I’ll be in South Africa next week, and to keep myself sane, I’ll be taking a one-week hiatus from the blog.  See you all again on March 23rd.

March 12

The End of College? (Part 2)

As discussed yesterday, Kevin Carey’s The End of College pinpoints higher education’s key ill: its inability (or unwillingness) to provide students with any real signal about the quality of their work.  This serves students badly in a number of ways.  First, it makes finding job matches harder; second, it means institutions can mis-sell themselves by investing in the accoutrements of excellence (ivy, quads, expensive residences) without the substance.

Essentially, Carey believes that technology will solve these problems.  He’s not a blind MOOC-hypester; in fact, his chapter on Coursera is reasonably astute about the reasons the current generation of MOOCs has yet to set the world alight.  But he is utterly certain that the forces of technology will eventually provide high-quality, low-price solutions, which will overwhelm the current model.  The ability to learn without the need for physical classrooms or libraries, the ability to get tutorial and peer assistance online, and the ability to test and certify at a distance will largely do away with the need for today’s (expensive) physical universities, and usher in the age of “The University of Everywhere”.  Cue the usual stuff about “disruption”.

Carey provides readers with a useful overview of some of the ed-tech companies whose products are trying to provide the basis of this revolution, with a particular emphasis on technologies that can capture and measure learning progress, and use that information both to immediately improve student performance, and to provide feedback to instructors and institutions to improve courses.  He also spends a chapter looking at the issue of credentials.  He correctly recognizes that the main reason universities have been able to maintain their position for so long is the strength of the Bachelor’s degree, a credential over which they maintain a near-monopoly.  And yet, he notes, credentials don’t actually say much about a graduate’s capabilities.  And so he spends an entire chapter talking about alternatives to Bachelor’s degrees, such as digital “badges” – open-source, machine-readable, competency-based credentials which, in theory at least, are better at communicating actual skills to potential employers.

The problem is that this argument somewhat misses the mark.  To measure learning in the way techno-optimists wish, the “learning” has to be machine-readable.  That is to say, student capabilities at a point in time have to be captured via clicks or keystrokes, and those keystrokes have to be interpretable as capabilities.  The first is trivially easy (although implementing it in a classroom setting in a disciplined way may end up being a form of torture); the second will vary from easy to unimaginably difficult depending on the discipline.

A lot of the promise people see in machine-measured learning is based on things like Sebastian Thrun’s early MOOCs, which were in some ways quite intriguing.  But these were in computer science, where answering a question rightly or wrongly is a pretty good indication of mastery of underlying concepts, which in turn is probably a reasonable measure of “competence” in a field.  Extrapolating from computer science is less helpful, however; most disciplines – and indeed, all of business and the social sciences – are not susceptible to capture this way.  The fact that a history student might not know a “correct” answer to a question (e.g. “in what year was the Magna Carta signed?”) doesn’t tell you how well that student has mastered skills like how to interpret sources.  In the humanities and social sciences (here including Law, Education, and Business), you can capture information, but it tells you very little about underlying skills.

With badges, the problem is roughly the same.  Provided you are in a field of study where discrete skills are what matters, badges make sense.  But by and large, those fields of study aren’t where the problem is in higher education.  What problems do badges solve in humanities and social sciences?  If the skills you want to signal to employers are integrative thinking or teamwork (i.e. skills the majority of employers say they most desperately need), how do badges solve any of the problems associated with the current Bachelor’s degree?

Two final points.  First, I think Carey is too optimistic about learners, and insufficiently mindful that universities have roles beyond teaching.  One justified criticism of much of the “disruption” crowd is that their alternative vision implies a high degree of autodidacticism among learners: if you put all these resources online for people, they will take advantage of them on their own.  But in fact, that’s likely the case only for a minority of learners: a University of Everywhere will – in the early years at least, and quite possibly much longer – likely impose significant penalties on learners who need a bit more assistance.  They need a level of human contact and interaction higher than that which can be provided over the internet.

Finally, one of the main reasons people go to universities is the social aspect.  They meet people who will remain friends, and with whom they’ll associate for the rest of their lives.  They learn many skills from each other via extra-curricular activities.  Basically, they learn to become adults – and that’s a hugely important function.  And sure, most universities do a half-assed job (at best) of communicating and executing this function, but Carey’s alternative is not an improvement on this.  It is why I’m fairly sure that even if most students could go to the University of Everywhere, they would still choose not to.  Even if it were practical, I’m not sure it passes the market test.

So if Carey’s diagnosis of universities’ weaknesses is accurate but his predictions are incorrect, what are the real alternatives?  I’ll tackle that tomorrow.

March 11

The End of College? (Part 1)

Over the next couple of days, I want to talk a bit about a new book called The End of College, written by the New America Foundation’s Kevin Carey.  It’s an important book, not because it’s been excerpted repeatedly in some major publications, nor because its conclusions are correct (in my view, they’re not), but because it offers an unerringly precise diagnosis of how higher education came to its present malaise, and of the economic and institutional factors that impede change in higher education.

Carey’s narrative starts by tracing the origins of universities’ current problems back to the 19th century, when America had three competing types of universities.  First were the small liberal arts colleges devoted to Cardinal Newman’s ideals, or to training clergy, or both; second were the Land Grant institutions, created by the Morrill Act and devoted to the “practical arts”; third were those that wanted to emulate German universities and become what we now call “research universities”.  Faced with three different types of institutions from which to choose, America chose not to choose at all – in effect, it asked universities to embody all three ideals at once.

On top of that, American universities made another fateful decision, which was to adopt what is known as the Elective model (I prefer the term “Smorgasbord model”, and wrote about it back here).  Starting at Harvard under President Charles Eliot, this move did away with programs consisting of a standardized set of courses in a standard curriculum, and replaced them with professors teaching more or less what they felt like, and students choosing the courses they liked.  This mix of specialization and scholarly freedom was one of the things that allowed institutions to accommodate both liberal and practical arts within the same faculties.  In Carey’s words: “the American university emerged as an institution that was designed like a research university, charged with practical training and immersed in the spirit of liberal education”.

The problem is that this hybrid university simply didn’t work very well as far as teaching was concerned.  The research end of the university began demanding PhDs – research degrees – as minimum criteria for hiring.  So hiring came to center on research expertise, even though this was no guarantee of either teaching quality or ability in the practical arts.  And over time, universities largely handed responsibility for teaching to people who were experts in research but amateurs at teaching.  No one checked up on teaching effectiveness or learning outcomes.  Degrees came to be a function of time spent in seats rather than actual measures of competence, proficiency, or mastery of a subject.

Because no one could check up on actual outputs or outcomes – not only are our research-crazy institutions remarkably incurious about applying their talents to the actual process of learning, they actively resist outsiders’ attempts to measure, too (see: AHELO) – competition between universities was fought solely on prestige.  Older universities had a head start on prestige; unless lavishly funded by the public (as the University of California was, for a time), the only way to compete with age was with money – often students’ money.  Hence, George Washington University, New York University, the University of Southern California, and (to a lesser extent) Washington University in St. Louis all rose in the rankings by charging students exorbitant fees and ploughing that money into the areas that bring prestige: research, ivy, nicer quads, etc.  (Similarly, Canadian institutions have devoted an unholy percentage of all the extra billions they have received in tuition and government grants since the late 90s to becoming more research-intensive; in Australia, Go8 universities are shameless in saying that the proceeds of deregulated tuition will be ploughed into research.)  The idea that all those student dollars might actually be used to – you know – improve instruction rarely gets much of a look-in.

Maybe if we were cruising along at full employment, no one would care much about all this.  But the last six years have seen slow growth and (in the US at least) unprecedented declines in disposable middle-class incomes, as well as in graduates’ post-school incomes.  So now you’ve got a system that is increasingly expensive (again, more so in the US than in Canada), doesn’t attempt to set outcome standards or impose standards on its professors, and doesn’t do much in terms of working out “what works”.

Carey – rightly, I think – sees this as unsustainable: something has to give.  The question is, what? Tomorrow, I’ll discuss Carey’s views on the subject, and on Friday I’ll provide some thoughts of my own.

March 10

Maritime Problems

A couple of weeks ago, Leo Charbonneau over at University Affairs wrote a nice little piece on Maritime universities and the trouble they’re having.  The basic message is that universities out there aren’t doomed – part of the “Don’t Panic” line that AUCC seems to be putting out these days.  The argument was essentially: hey, just nudge the participation rate a point or two, and improve retention a little bit, and those plucky little eastern universities will do just fine.

Allow me to demur a bit.  Once you break down the numbers to the provincial or institutional level, you realize that the picture out east is, in fact, by no means uniform; while the system as a whole is mostly holding steady, there are a few institutions that are in real trouble.

Let’s start by looking at the numbers by province.  In Prince Edward Island, UPEI has done a good job growing its enrolments.  Numbers are off slightly in the last couple of years, but overall they remain about 10% (or about 350 students) higher than they were a decade ago.  Nova Scotia and New Brunswick, on the other hand, both saw falling enrolments from roughly 2004 to 2008.  What’s interesting since that time is the divergence in fortunes between those two provinces.

Figure 1: Total Enrolments by Province, 2004/05-2013/14, Indexed to 2004

In Nova Scotia, the fall during the 04-08 period was concentrated at four universities: Acadia, Cape Breton, Mount Saint Vincent, and Saint Mary’s, all of which lost about 12% of their student body in those four years.  However, during the subsequent rebound, only Acadia actually recovered to any significant extent: most of the growth happened at Dalhousie, which was never hurting for students in the first place.  Nova Scotia as a whole has stayed constant, but what’s actually happened over the last decade is that Dal has grown from being 34% of the provincial system to being over 40%. Meanwhile, Cape Breton, the Mount, and SMU are all a lot more precarious than they used to be.

In New Brunswick, the biggest absolute loser has been the University of New Brunswick’s Fredericton campus, which now has roughly 16% fewer students than it did a decade ago (to all those folks who wonder, along with AUNBT, why UNB keeps getting rid of tenure lines: that’s why).  But in percentage terms, the real disasters are Moncton’s satellite campuses in Shippagan and Edmundston, where enrolments are down 37% and 49%, respectively, over a decade.

Figure 2: Total Enrolments at Université de Moncton Satellite Campuses, 2004/05 to 2013/14

Some of you may have noticed last week that Sweet Briar College, a small women’s college in Virginia with an endowment of $100 million, and annual fees of $34,000 US, announced it would be closing because its enrolments and finances were unsustainable.  How many students did it have?  550 – about the same as Edmundston, and 100 more than Shippagan. Somehow, neither campus is losing too much money yet – both are losing about $150K on budgets in the $12 million range – but we’re getting close to the point where the viability of both has to come into question.  That will be hugely traumatic for both communities: having a post-secondary institution in town is a major part of both their survival plans.  But it’s hard to see how the provincial government and the Acadian community as a whole can avoid this discussion.

So, is post-secondary education as a whole in trouble in the Maritimes?  No.   But I count four institutions whose enrolments are already down over 10% from where they were a decade ago, plus the two catastrophic cases of Shippagan and Edmundston.  And there are further youth population declines to come.  Yeah, some of this can be offset by international students (though in the case of Saint Mary’s, they’re already at 30% international students, and *still* their overall numbers are down 10%), but I wouldn’t bet they all can.

In other words, don’t be distracted by the aggregate numbers.  There are some very tough decisions to be made at some of these schools.  My guess is one or two of them won’t be here a decade from now.

March 09

Sessionals, Nursing Degrees, and the Meaning of University

Be forewarned: I am going to be very mean about universities today. 

One thing the labour disputes in Ontario highlight is the amount of undergraduate teaching done by non-tenure-track professors.  Numbers on this are hard to come by, and poorly defined when they are.  York sessionals claim to be teaching 42% of all undergraduate classes – but how do you define a class?  But from what I’ve gathered by talking to people across the province who are in a position to know, it is not uncommon at larger universities to see between 40% and 50% of all undergraduate credit hours (which is the correct unit of analysis) taught by sessionals.

Think about that for a minute: half of all credit hours at major Ontario universities are taught by staff who are off the tenure track.  People with no research profile to speak of.  Yet aren’t we always told that the combination of research and teaching is essential in universities?  Aren’t we told that without research, universities would be nothing more than – God forbid – community colleges?  So what does it mean when half of all undergraduate credit hours are taught by these sessionals?  Are students only getting the essential university experience half the time?  And the other half of the time, are they effectively at community college?  If so, why are students and taxpayers paying so much more per credit hour?
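Why are credit hours, rather than classes, the correct unit?  Because sessionals often teach smaller sections, the two measures can diverge sharply.  A toy calculation (every number here is invented for illustration; it is not actual York or Ontario data):

```python
# Hypothetical illustration: sessionals' share of *classes* vs their share of
# *credit hours*.  All figures below are invented.

# (sections taught, students per section, credit hours per student)
sessional = (42, 30, 3)   # many small sections
tenured = (58, 90, 3)     # fewer, larger sections

def credit_hours(sections, students, hours):
    """Total credit hours generated by a group of sections."""
    return sections * students * hours

s_ch = credit_hours(*sessional)   # 3,780 credit hours
t_ch = credit_hours(*tenured)     # 15,660 credit hours

class_share = sessional[0] / (sessional[0] + tenured[0])
ch_share = s_ch / (s_ch + t_ch)

print(f"Sessional share of classes:      {class_share:.0%}")  # → 42%
print(f"Sessional share of credit hours: {ch_share:.0%}")     # → 19%
```

On these invented numbers, sessionals teach 42% of classes but under 20% of credit hours, which is why the choice of unit matters so much when interpreting claims like York’s.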

These are important questions at any time, but I think their importance is underlined by the stramash currently going on between Ontario universities and colleges over the possibility of colleges offering stand-alone nursing programs.  You see, Ontario has none of these.  Universities can have stand-alone nursing programs; colleges can have nursing programs, but require a university partner to oversee the curriculum.  This partnership has nothing to do with sharing of physical resources or anything – Humber College’s partner is the University of New Brunswick (which is how UNB became Ontario’s third/fourth-largest supplier of nurses a few years ago).  No, it’s just a purely protectionist measure, which Ontario universities justify on the grounds that “patient care [has] become so complex that nurses needed research, theory, critical thinking, and practice in order to be prepared [for work]”.  Subtext being: obviously you can’t get that just from a community college.

But why is this obvious?  Clearly, universities themselves don’t believe that theory and critical thinking are tied to research, because they’re allowing non-research staff to provide half the instruction.  Indeed, maybe – horror upon horrors – nearly all undergraduate instruction in nursing can be delivered by halfway-competent practitioners who are reasonably familiar with developments in nursing research, and actually having one’s own research practice is neither here nor there.  In which case, the argument for stand-alone nursing schools – with appropriate quality oversight from professional bodies – is pretty much unanswerable.

Too much of universities’ power and authority rests on their near-monopoly on degree-granting.  And too much of that monopoly on degree-granting rests on hand-waving about “but research and teaching!”  Yet, as sessionals’ strikes always remind us, Ontario universities are nowhere close to living up to this in practice.  I wonder how long it will be before some government decides to impose some costs on them for this failure.

March 06

Some Thoughts on TA Strikes

At the time of writing (Thursday PM), Teaching Assistant Unions at both the University of Toronto and York University are on strike, as is the union representing sessionals at York.  Since Toronto is indeed “The Centre of the Universe”, I’m sure everyone across the country is just riveted by this news.  At the risk of irritating those readers still further, I thought I’d jot down a few thoughts on the matter.

1)      A lot of people seem to be wondering “why are we relying so much on adjunct labour these days?”  The quick answer is “because profs are spending more time researching and less time teaching than they used to”; sessionals are an emergent property of a system that gets paid to teach, but prefers to spend money on research.  See also this recent piece on the economics of sessionals.

2)      It’s for this reason that I find the OCUFA campaign on sessionals – “WeTeachOntario” – mind-bogglingly un-self-aware.  It’s great to support sessionals, of course, but the utter lack of any recognition that full-time faculty’s well-above-inflation pay settlements, and their perennial push to research more and teach less, are significant contributing factors to the problem is simply amazing.

3)      The University of Waterloo’s Emmett Macfarlane wrote a very good piece on the TA strike on the Policy Options blog, which summed up a lot of my feelings about the strikes.  The issue pretty clearly isn’t about what students get paid for their labour as TAs (which at over $40/hr is pretty good), but what they receive overall (i.e. labour plus scholarship), which they feel is inadequate.  And yet it’s the labour tool they are using to address the problem, which is… problematic.

4)      On the issue of whether U of T grad students are, as they frequently claim, “living below the poverty level”:  the union keeps using a figure of $23,000 as the Toronto poverty level, which is in fact the pre-tax low-income cut-off for large cities.  The post-tax figure – which is the more accurate comparison, since TA labour income is below the level at which income gets taxed, and scholarships are tax-free up to $10K – is $19,000, or $1,583/month.  The base TA/grad package is $15K for 8 months, or $1,875/month.  So the veracity of the claim seems to rest on the assumption that grad students get no outside income in the other 4 months.  My guess is that, for the most part, this isn’t true – they’ll either take on extra work or have an outside scholarship.
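The monthly figures above can be checked directly; a minimal sketch using only the dollar amounts quoted in this post:

```python
# Check the poverty-line comparison using the figures cited above.

pre_tax_lico = 23_000    # pre-tax low-income cut-off, large cities (union's figure)
post_tax_lico = 19_000   # post-tax cut-off, the more apt comparison

monthly_lico = post_tax_lico / 12     # ≈ $1,583/month, year-round
base_package = 15_000                 # base TA/grad package, paid over 8 months
monthly_package = base_package / 8    # $1,875/month during the academic year

print(f"Post-tax cut-off:  ${monthly_lico:,.0f}/month")    # → $1,583/month
print(f"TA/grad package:   ${monthly_package:,.0f}/month")  # → $1,875/month
```

$1,875 a month in term time exceeds the $1,583 post-tax cut-off, which is why the “below the poverty line” claim hinges entirely on the four unfunded months.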

5)      What doctoral students are really asking for is that they be treated as employees, not just for their teaching duties but also for the entirety of their academic labour.  And that’s not crazy: in much of Europe, doctoral students are in fact university employees, and reasonably well paid.  There’s nothing to stop a university doing that here: in fact, some might argue that it would substantially improve a university’s ability to recruit graduate students.

The problem – as always – is money: universities don’t want to make the sacrifices to other aspects of the university budget (including, obviously, academic and staff pay) to make this work.  One possible compromise would be to turn PhD students into employees, but accept far fewer of them; but here you’d run into the problem of Arts professors having to backfill by doing more teaching themselves, and Science professors going bananas because now who’s going to run the labs?

To which, with some justification, doctoral students might simply say: Exactly. We’re worth more than you think.   And I’d have a fair bit of sympathy with that.

Have a good weekend.

February 25

Rankings in the Middle East

If you follow rankings at all, you’ll have noticed that there is a fair bit of activity going on in the Middle East these days.  US News & World Report and Quacquarelli Symonds (QS) both published “Best Arab Universities” rankings last year; this week, the Times Higher Education (THE) produced a MENA (Middle East and North Africa) ranking at a glitzy conference in Doha.

The reason for this sudden flurry of Middle East-oriented rankings is pretty clear: Gulf universities have a lot of money they’d like to use on advertising to bolster their global status, and this is one way to do it.  Both THE and QS tried to tap this market by making up “developing world” or “BRICs” rankings, but frankly most Arab universities didn’t do too well on those metrics, so there was a niche market for something more focused.

The problem is that rankings make considerably less sense in MENA than they do elsewhere. In order to come up with useful indicators, you need accurate and comparable data, and there simply isn’t very much of this in the region.  Let’s take some of the obvious candidates for indicators:

Research:  This is an easy metric, and one which doesn’t rely on local universities’ ability to provide data.  And, no surprise, both US News and the Times Higher Ed have based 100% of their rankings on this measure.  But that’s ludicrous for a couple of reasons.  The first is that most MENA universities have literally no interest in research.  Outside the Gulf (i.e. Oman, Kuwait, Qatar, Bahrain, UAE, and Saudi Arabia), there’s no money available for it.  Within the Gulf, most universities are staffed by expats teaching 4 or even 5 classes per term, with no time or mandate for research.  The only places where serious research is happening are one or two of the foreign universities that are part of Education City in Doha, and some of the larger Saudi universities.  Of course, the problem with Saudi universities, as we know, is that at least some of the big ones are furiously gaming publication metrics precisely in order to climb the rankings, without actually changing university cultures very much (see, for example, this eyebrow-raising piece).

Expenditures:  This is a classic input variable used in many rankings.  However, an awful lot of Gulf universities are private, and won’t want to talk about their expenditures for commercial reasons.  Additionally, some are personal creations of local rulers who spend lavishly on them (for example, Sharjah and Khalifa Universities in the UAE); they’d be mortified if the data showed them to be spending less than the Sheikh next door.  Even in public universities, the issue isn’t straightforward.  Transparency in government spending isn’t universal in the area, either; I suspect that getting financial data out of an Egyptian university would be a pretty unrewarding task.  Finally, for many Gulf universities, cost data will be massively wonky from one year to the next because of the way compensation works.  Expat teaching staff (the majority at most Gulf unis) are paid partly in cash and partly through free housing, the cost of which swings enormously from one year to the next based on changes in the rental market.

Student Quality:  In Canada, the US, and Japan, rankings often focus on how smart the students are, based on average entering grades, SAT scores, etc.  But these measures simply don’t work in a multi-national ranking, so they’re out.

Student Surveys: In Europe and North America, student surveys are one way to gauge quality.  However, if you are under the impression that there is a lot of appetite among Arab elites to allow public institutions to be rated by public opinion then I have some lakeside property in the Sahara I’d like to sell you.

Graduate Outcomes:  This is a tough one.  Some MENA universities do have graduate surveys, but what do you measure?  Employment?  How do you account for the fact that female labour market participation varies so much from country to country, and that many female graduates are either discouraged or forbidden by their families from working? 

What’s left?  Not much.  You could try class size data, but my guess is most universities outside the Gulf wouldn’t have an easy way of working this out.  Percent of professors with PhDs might be a possibility, as would the size of the institution’s graduate programs.  But after that it gets pretty thin.

To sum up: it’s easy to understand commercial rankers chasing money in the Gulf.  But given the lack of usable metrics, it’s unlikely their efforts will amount to anything useful, even by the relatively low standards of the rankings industry.

February 13

Meetings vs. Management

It’s always difficult to make accurate observations about differences in national higher education cultures.  But one thing I can tell you is absolutely not true: the perception that Canadian universities are suffering under some kind of unprecedented managerialist regime.  If anything, Canadian academics are among the least-managed employees in the entire world.

When academics complain of over-management, they aren’t using that term in a way that workers in other fields would recognize.  They are not, for instance, required to be in any one place other than the six to nine hours per week they are teaching: it is simply understood that they are working, and working efficiently, at a place of their choosing.  The content of their work largely escapes scrutiny: no one checks in on their classes to see what is being taught (though Queen’s University may be wishing it had a bit more hands-on management after revelations of anti-vaxxing material in a health class last week).  Research topics are largely left to the individual researchers’ interests.  In other words, subject to contractual obligations around teaching, they mostly do what they want.  In most respects, they resemble a loosely connected set of independent contractors rather than actual employees.

Rather, what academics in Canada are actually complaining about when they talk about managerialism is three things:

1)      The existence (and growth) at universities of a class of managers who are almost as well paid as senior academics.  The fact that these people rarely impact the working life of academics is irrelevant; their mere presence is evidence of “managerialism”.

2)      The existence of apparently pointless bureaucracy around purchasing, reimbursement, and travel.  This annoyance is easy to understand, but it’s not clear to me that this problem is any worse at universities than it is at other organizations of similar size.

3)      Meetings.  Lots and lots of meetings.  Yet the thing about meetings in universities is that they are rarely decision-making affairs.  More often than not, in fact, they are decision-retarding events (or possibly even decision-preventing events), whose purpose is more about consultation than administration.

In a real managerial university, courses would be ruthlessly overseen, if for no other reason than to ensure that classes met minimum enrolment counts.  In a real managerial university, individual professors’ research programs would be reviewed continuously in order to ensure that they were attracting maximum funding.  In a real managerial university, the managers would know where employees were from 9 to 5 every day.  But almost none of that exists in Canada.  To really see that stuff you need to go to the UK or – to a lesser extent – Australia.

Professors are, of course, right to worry about managerialism, because UK universities sound pretty horrid.  But a dose of actual managerialism (as opposed to just having more meetings) probably wouldn’t hurt in Canadian universities – particularly when it comes to ensuring curriculum coherence and enforcing class-size minima.

February 09

Funding Universities’ Research Role

A couple of weeks ago, I wrote a series of pieces looking at the economics of teaching loads; specifically, I was focussed on the relationship between per-student funding and the teaching loads required to make universities self-sustaining.  I had a number of people write to me saying, in effect, “what about research?”

Good question.

The quick answer is that in provinces with explicit enrolment-driven funding formulae (e.g. Ontario, Quebec, Nova Scotia), governments are not in fact paying universities to do any research, and neither are students.  They are paying simply for teaching.  There is nothing in these funding formulae, or in the tuition agreements with students, that says, “this portion of the money is for research”.

Now that doesn’t mean governments don’t want faculty to conduct research.  It could mean that government just wants any research to occur after a certain number of credits are completed.  But I’m not sure this is, in fact, the correct interpretation.  In Ontario, for instance, universities sign multi-year agreements with governments.  Not a word can be found in these agreements about research – they are entirely about enrolments and teaching.  Admittedly that’s just Ontario, but I don’t think it’s substantially different elsewhere.  British Columbia’s institutional mandate letters, for instance, do not mention research, and while Alberta’s do, they really only ask that the institution’s priorities align at least somewhat with those of the Alberta Research and Innovation Plan – a commitment so loose that any half-way competent government relations person could make it appear to be true without ever asking any actual academics to alter their programs of research.

So I might go further and say it’s not that provincial governments want research to occur after a certain number of credits have been offered; rather, I would suggest that provincial governments do not actually care what institutions do with their operating grants, provided they teach a certain number of credits.  Certainly, to my knowledge, there is not a single provincial government in Canada that has ever endorsed the formula by which professors spend their time 40-40-20 teaching/research/service.  That’s an internal convention of universities, not a public policy objective.

There’s a case to be made that the research component of provincial funding needs to be made more transparent – a case, for instance, made by George Fallis in a recent book.  But universities will resist this; if research subsidies are made transparent, there will inevitably be a push to make institutions accountable for the research they produce.  That way lies assessment systems, such as the UK’s Research Excellence Framework (formerly known as the Research Assessment Exercise), or the Excellence in Research for Australia.  Both of these have driven differentiation among universities, in that institutions have tended to narrow research foci in response to external evaluations.  This, of course, is something universities hate: no one wants to have to tell the chemistry department (or wherever) that their research output is sufficiently weak that from now on they’re a teaching-only unit.

To put this another way: sometimes, ambiguity benefits universities.  Where research is concerned, it’s probably not in universities’ interest to make things too transparent.  Whether this opacity is actually in students’ and taxpayers’ interests is a different question.

February 05

It’s Not Just Demographics

The Council of Ontario Universities (COU) released an amusingly defensive press release last month, just after the high school applications deadline.  After a glancing acknowledgment that applications to university are down in the province for the second year in a row, we are earnestly told: DEMOGRAPHICS!  APPLICATIONS WAY UP IF YOU USE 2000 AS A BASE YEAR!  JOBS!  DEMOGRAPHICS!  MORE JOBS!  DID WE MENTION DEMOGRAPHICS?

I guess COU views the release as a prophylactic against negative press coverage, given that the number of secondary school applicants to university is now down more than 5% over the past two years (actual applications are down only 2%, but that’s because students, on average, are applying to more schools than before).  And maybe that’s fair enough, because most of the decrease is due to demographics rather than a fall in the actual application rate.  But this attitude is only semi-productive, because while the overall decline in applications may not reflect a change in the public’s view of universities, the changing application numbers are going to produce some pretty dramatic alterations in the province’s higher ed landscape.

Let’s start at the institutional level.  The only institution where first-choice applications are definitively above where they were two years ago is Nipissing.  Which, you know, thank God, because after the way the province screwed them on funding for Education, they could use a break.  On the other side, seven institutions in Ontario are looking at declines in first-choice applications from Ontario secondary schools of 10% or more: Brock, Guelph, Laurentian, Western, Ottawa, Lakehead, and Windsor.  Ottawa can perhaps afford this, since it has recently had an offsetting surge in Quebec applications, but elsewhere those declines are going to directly impact the bottom line, and result in cuts.  At Lakehead and Windsor, where application drops are 18% and 19%, respectively, the scope of impending cuts looks positively savage.

The numbers are perhaps even more portentous if you look at them on a faculty basis.  In most fields of study, numbers are relatively flat.  Science, Business, and Nursing are all fluctuating within a 2% band.  Fine and Applied Arts are up a little over 4%, and Engineering is up by over 13%.  But Arts.  Oh my Lord, Arts: down nearly 16% in two years.

No, that’s not a typo.  Sixteen.  One-six.  In two years.

The implications of this are huge.  In the very short term it’s good news, because class sizes will decrease.  But in the medium term, institutions simply will not be putting money into units where revenue is falling.  So Arts faculties should expect hiring freezes, loss of positions through attrition, reductions in budgets for sessionals, etc.

And remember, this won’t be because Visigothic neo-liberal governments don’t respect Social Sciences and Humanities; it will be because young people simply aren’t interested in studying in these fields.  And the reason they aren’t interested is that starting wages are down 20% or more over the past six years.  Decry their utilitarian approach to education if you must, but the simple fact is that unless Arts faculties get serious about changing program offerings to respond to students’ shifting interests, there are going to be deep program cuts ahead.

That’s something worth talking about, and soon.  The longer we tell ourselves this is just about demographics, the worse things are going to get.
