HESA

Higher Education Strategy Associates

February 01

Questions and Answers about UBC

So, what happened last week?  On Monday, pursuant to a freedom-of-information request submitted last fall, UBC finally released documents – mainly emails – related to the events surrounding the departure of Arvind Gupta.  Much of it was redacted, including a flurry of fairly long exchanges from May and June.  On Wednesday, somebody figured out how to un-redact the document in Adobe, and all of a sudden everyone could see the crucial exchanges.  Then on Thursday, since the leak effectively voided the privacy clause of his non-disclosure agreement with UBC, Gupta himself decided to give a couple of interviews to the press.

What did we actually learn from the documents? Apart from the fact that folks at UBC are really bad at electronically redacting documents?  Less than you’d think. 

We do have a better understanding of the timeline of where things went wrong.  A discussion about a proposed strategic plan stemming from the February Board meeting seems to have been the start of the deteriorating relationship between Gupta and at least a portion of the Board.  Clear-the-air talks about weaknesses in Gupta’s performance were held following the April Board meeting.  And it went downhill from there.  The documents make clear there were a lot of complaints within the Board about Gupta’s leadership: in particular, his relationship with his own leadership team and his handling of relationships with the Board.  Read the May 18th letter from Montalbano to Gupta: it’s rough.

Some of the specifics were new, but frankly there isn’t much surprising in there.  You didn’t need to know the details to realize that the heart of the whole affair was that Gupta lost the backing of the Board, and that this was something that probably happened gradually over time.

What has Gupta said in his interviews?  He has said, first, that the released documents provided a one-sided representation of the events of the spring, which is true enough.  Second, that despite having resigned because he had lost the confidence of the full Board, he now regrets not having pushed back harder, which is puzzling (if you’ve lost the confidence of a body, how would fighting back have helped?).  Third, that he doesn’t understand why the Board didn’t support him because he had lots of support from professors, which seems to be a major instance of point-missing.  Fourth, that the whole push against him on the Board came from an ad-hoc, possibly self-selected sub-committee of the executive committee.

Wait, what?  There’s a lot of hand-wringing about the fact that most of the Board were bystanders to the interplay between Montalbano, a few other key Board members, and Gupta – look, it’s a cabal, they had it in for him, they hid it from the Board, etc.  But some of this is overwrought.  Generally speaking, a CEO’s performance review is handled by the Chair of the Board and a few others, rather than by the full Board.  The unanswered process question here is: what was the relationship of this group to the executive committee?  Was it duly constituted, or was it just a few people the Board Chair thought were “sound”?  In the grand scheme of things, this is kind of beside the point.  The fact that not a single other person on the Board has stepped forward and said “yeah, we were wrong about Gupta” suggests substantial unanimity on the key point: even if something was amiss procedurally, any other procedure would have led to the same result.

(Similarly for the argument that there wasn’t “due process” for Gupta because he didn’t get the job performance evaluation that was in his contract: once the person/people responsible for evaluating a CEO decide the CEO needs to be replaced, what’s the point of a formal job evaluation?  If you were the CEO in question, wouldn’t you resign rather than go through a formal review where a negative outcome is certain?)

Is any of this going to change anyone’s mind about what happened?  I doubt it.  Gupta’s backers will say “it shows the Board had it in for him from the start”; any evidence that could be read as saying “gosh, maybe relations weren’t going so well” is simply regarded as “a pretext” so the mean old Board could stitch Gupta up.  A new set of rhetorical battle-lines seems to be forming: Gupta as champion of faculty (a point he himself seems keen to make) and the Board as the enemy of faculty.  There is little-to-no evidence this was actually the reason for Gupta’s dismissal, but it’s nevertheless the hill upon which a lot of other people want to believe he died.

That’s unfortunate, because it entirely misses the point of this affair.  Whether Gupta was popular with faculty, or whether he was a good listener and communicator with them, is irrelevant.  Presidents have to run a university to the satisfaction of a Board of Governors – some members directly elected, some appointed by an elected government – whose job is to ensure that the public interest is being served.  Presidents have to do a large number of other things as well, but this is the really basic bit.  Whatever other beneficial things Gupta did or might have accomplished – and I think he might have done quite a lot – this wasn’t something he managed to achieve.  However nice or progressive a guy he may have seemed in the other aspects of his job, that doesn’t change this fact.  And so he and the Board parted company.  End of story.

January 29

Asleep at the Switch…

… is the name of a new(ish) book by Bruce Smardon of York University, which looks at the history of federal research & development policies over the last half-century.  It is a book in equal measures fascinating and infuriating, but given that our recent change of government seems to be a time for re-thinking innovation policies, it’s a timely read if nothing else.

Let’s start with the irritating.  It’s fairly clear that Smardon is an unreconstructed Marxist (I suppose structuralist is the preferred term nowadays, but this is York, so anything’s possible), which means he has an annoying habit of dropping words like “Taylorism” and “Fordism” like crazy, until you frankly want to hurl the book through a window.  And it also means that there are certain aspects of Canadian history that don’t get questioned.  In Smardon’s telling, Canada is a branch-plant economy, always was a branch-plant economy, and ever shall be one until the moment when the state (and I’m paraphrasing a bit here) has the cojones to stand up to international capital and throw its weight around, after which it can intervene to decisively and effectively restructure the economy, making it more amenable to being knowledge-intensive and export-oriented.

To put it mildly, this thesis suffers from the lack of a serious counterfactual.  How exactly could the state decisively rearrange the economy so as to make us all more high-tech?  The best examples he gives are the United States (which achieved this feat through massive defense spending) and Korea (which achieved it by handing over effective control of the economy to a half-dozen chaebol).  Since Canada is not going to become a military superpower and is extremely unlikely to warm to the notion of chaebol, even if something like that could be transplanted here (it can’t), it’s not entirely clear to me how Smardon expects something like this to happen, in practice.  Occasionally, you get a glimpse of other solutions (why didn’t we subsidize the bejesus out of the A.V. Roe corporation back in the 1960s?  Surely we’d be an avionics superpower by now if we had!), but most of these seem to rely on some deeply unrealistic notions about the efficiency of government funding and procurement as a way to stimulate growth.  Anyone remember Fast Ferries?  Or Bricklin?

Also – just from the perspective of a higher education guy – Smardon’s near-exclusive focus on industrial research and development is puzzling.  In a 50-year discussion of R&D, Smardon essentially ignores universities until the mid-1990s, which seems to miss quite a bit of relevant policy.  Minor point.  I digress.

But now on to the fascinating bit: whatever you think of Smardon’s views about economic restructuring, his recounting of what successive Canadian governments have done over the past 50 years to make the Canadian economy more innovative and knowledge-intensive is really quite astounding.  Starting with the Glassco Commission in the early 1960s, literally every government drive to make the country more “knowledge-intensive” or “innovative” (the buzzwords change every decade or two) has taken the same view: if only publicly-funded researchers (originally this meant the NRC, now it means researchers in universities) could get their acts together, talk to industry, and see what their problems are, we’d be in high-tech heaven in no time.  But the fact of the matter is, apart from a few years in the 1990s when Nortel was rampant, Canadian industry has never seemed particularly interested in becoming more innovative – which is why we perennially lag the rest of the G7 on business investment in R&D.

You don’t need to buy Smardon’s views about the potentially transformative role of the state to recognize that he’s on to something pretty big here.  One is reminded of the dictum about how the definition of insanity is doing the same thing over and over, and expecting a different result.  Clearly, even if better co-ordination of public and private research efforts is a necessary condition for swifter economic growth, it’s not a sufficient one.  Maybe there are other things we need to be doing that don’t fit into the Glassco framework.

At the very least, it seems to me that if we’re going to re-cast our R&D policies any time soon, this is a point worth examining quite thoroughly, and Smardon has done us all a favour by drawing attention to it.

Bon weekend.

January 28

The Future of Work (and What it Means for Higher Education), Part 2

Yesterday we looked at a few of the hypotheses out there about how IT is destroying jobs (particularly: good jobs).  Today we look at how institutions should react to these changes.

If I were running an institution, here’s what I’d do:

First, I’d ask every faculty to come up with a “jobs of the future report”.  This isn’t the kind of analysis that makes sense to do at an institutional level: trends are going to differ from one part of the economy (and hence, one set of fields of study) to another.  More to the point, curriculum gets managed at the faculty level, so it’s best to align the analysis there.

In their reports, all faculties would need to spell out: i) who currently employs their grads, and in what kinds of occupations (an answer of “we don’t know” is unacceptable – go find out); ii) what the long-term economic outlook is for those industries and occupations; iii) how susceptible those occupations are to having their tasks computerized (there are various places to look for this information, but this from two scholars at the University of Oxford is a pretty useful guide); and iv) how senior people in those industries and occupations see technology affecting employment in their field – which means going out and talking to them.

This last point is important: although universities and colleges keep in touch with labour market trends through various types of advisory boards, the questions that tend to get asked are “how are our grads doing now?  What improvements could we make so that our next set of grads is better than the current one?”  The emphasis is clearly on the very short term; rarely if ever are questions posed about medium-range changes in the economy and what those might bring.  (Not that this is always front and centre in employers’ minds either – you might be doing them a favour by asking the question.)

The point of this exercise is not to “predict” jobs of the future.  If you could do that you probably wouldn’t be working in a university or college.  The point, rather, is to try to highlight certain trends with respect to how information technology is re-aligning work in different fields over the long-term.  It would be useful for each faculty to present their findings to others in the institution for critical feedback – what has been left out?  What other trends might be considered? Etc.

Then the real work begins: how should curriculum change in order to help graduates prepare for these shifts?  The answer in most fields of study would likely be “not much” in terms of mastery of content – a history program is going to be a history program, no matter what.  But what probably should change are the kinds of knowledge gathering and knowledge presentation activities that occur, and perhaps also the methods of assessment.

For instance, if you believe (as economist Tyler Cowen suggests in his book Average is Over) that employment advantage is going to come to those who can most effectively mix human creativity with IT, then in a statistics course (for instance), maybe put more emphasis on imaginative presentation of data, rather than on the data itself.  If health records are going to be electronic, shouldn’t your nursing faculty be developing a lot of new coursework involving the manipulation of information in databases?  If more and more work is being done in teams, shouldn’t every course have at least one group-based component?  If more work is going to happen across multinational teams, wouldn’t it be advantageous to increase language requirements in many different majors?

There are no “right” answers here.  In fact, some of the conclusions people will come to will almost certainly be dead wrong.  That’s fine.  Don’t sweat it.  Because if we don’t look forward at all, if we don’t change, then we’ll definitely be wrong.  And that won’t serve students at all.

January 27

The Future of Work (and What it Means for Higher Education), Part 1

Back in the 1990s when we were in a recession, Jeremy Rifkin wrote a book called The End of Work, which argued that unemployment would remain high forever because of robots, information technology, yadda yadda, whatever.  Cue the longest peacetime economic expansion of the century.

Now, we have a seemingly endless parade of books prattling on about how work is going to disappear: Brynjolfsson and McAfee’s The Second Machine Age, Martin Ford’s Rise of the Robots, Jerry Kaplan’s Humans Need Not Apply, Susskind and Susskind’s The Future of the Professions: How Technology Will Transform the Work of Human Experts (which deals specifically with how info tech and robotics will affect occupations such as law, medicine, architecture, etc.), and, from the Davos Foundation, Klaus Schwab’s The Fourth Industrial Revolution.  Some of these are insightful (such as the Susskinds’ effort, though their style leaves a bit to be desired); others are hysterical (Ford); while others are simply dreadful (Schwab: seriously, if this is what rich people find insightful, we are all in deep trouble).

So how should we evaluate claims about the imminent implosion of the labour market?  Well first, as Martin Wolf says in this quite sober little piece in Foreign Affairs, we shouldn’t buy into the hype that “everything is different this time”.  Technology has been changing the shape of the labour market for centuries, sometimes quite rapidly, and it will go on doing so.  The pace may accelerate a bit, but the idea that things are suddenly going to “go exponential” is simply wrong.  Just because we can imagine technology creating loads of radical disruption doesn’t mean it’s going to happen.  Remember the MOOC revolution, which was going to wipe out universities?  Exactly.

But just because the wilder versions of these stories are wrong doesn’t mean important things aren’t happening.  The key is to be able to see past the hype.  And to my mind, the surest way to do that is to clear your mind of the idea that advances in robotics or information technology “replace jobs”.  This is simply wrong; what they replace are tasks.

We get a bit confused by this because we remember all the jobs that were lost to technology in manufacturing.  But what we forget is that the century-old technology of the assembly line had long turned jobs into tasks, with each individual performing a single task, repetitively.  So in manufacturing, replacing tasks looked like replacing jobs.  But the same is not true of the service sector (which covers everything from shop assistants to lawyers).  This work is not, for the most part, systematic and routinized, and so while IT can replace tasks, it cannot replace “jobs”  per se.  Jobs will change as certain tasks get automated, but they don’t necessarily get wiped out.  Recall, for instance, the story I told about ATMs a few months ago: that although ATMs had become ubiquitous over the previous forty years, the number of bank tellers not only hadn’t decreased, but had actually increased slightly.  It’s just that, mainly, they were now doing a different set of tasks.

Where I think there are some real reasons for concern is that a lot of the tasks that are being routinized are precisely the ones we used to give to new employees.  Take law, for instance, where automation is really taking over document analysis – that is, precisely the stuff they used to get articling students to do.  So now what do we do for an apprenticeship path?

Working conditions always change over time in every industry, of course, but it seems reasonable to argue that jobs in white-collar industries – that is, the ones for which a university education is effectively an entry requirement – are going to change substantially over the next couple of decades.  Again, it’s not job losses; rather, it is job change.  And the question is: how are universities thinking through what this will mean for the way students are taught?  Too often, the answer is some variation on “well, we’ll muddle through the way we always do”.  Which is a pretty crap answer, if you ask me.  A lot more thought needs to go into this.  Tomorrow, I’ll talk about how to do that.

January 26

Tenure and Aboriginal Culture

You may or may not have noticed a story in the National Post over the weekend relating to a scholar at the University of British Columbia named Lorna June McCue, who has brought a human rights tribunal case against UBC for denying her tenure.  The basics of the story are that UBC didn’t think she’d produced enough – or indeed, any – peer-reviewed research to be awarded tenure in the Faculty of Law; Ms. McCue argues that since she adheres to an indigenous oral tradition (she is also a hereditary chief of the Ned’u’ten at Lake Babine, a few hundred kilometres northeast of Vancouver), she needs to be judged by a different standard.

Actually, Ms. McCue brought the case in the fall of 2012; UBC moved to have it dismissed; the hearing last week was on the motion to dismiss, which failed.  So now, 39 months later, the hearing can proceed (justice in Canada, Ladies and Gentlemen!  A big round of applause!).  Anyways, I have a feeling this story is going to run and run (and not just because of the glacial pace of the legal system), so I thought I would get some thoughts in early.

A couple of obvious points:

The spread of the university around the world, mainly in the 19th century, eliminated a lot of different knowledge preservation and communication traditions.  It basically wiped out the academy tradition in East Asia, and did a serious number on the madrassas of the Indian subcontinent and the Middle East (though as we have seen, these are making a comeback in recent years in some fairly unfortunate ways).  And though universities around the world exhibit a lot of differences in terms of finance and management, and to some extent mission, there is no question that, thanks to the strength of the disciplines it houses, the university has had some extraordinarily isomorphic effects on the way we think and talk about knowledge.  So it’s not crazy for non-western cultures to once in a while say: look, there are other ways to construct and transmit knowledge, and we’d like a bit of space for them.  Maoris have done this successfully with their Wānanga, or Maori Polytechnics as they’re sometimes called.  Why not in Canada?

And there’s nothing immutable about the need for research as a professor.  Hell, 40 years ago in the humanities, research certainly wasn’t a hard pre-requisite for tenure; even today in the newer professional schools (I’m thinking journalism, specifically), people often get a pass on publication if they are sufficiently distinguished prior to arriving at the university.  Different strokes, etc.

But of course, all that said, the fact is that accommodation for different knowledge paradigms is the kind of thing you work out with your employer before you start the tenure process, not afterwards.  It’s not as though McCue’s views render her incapable of writing; the university hired her on the basis of her 1998 L.L.M. dissertation, which was a good 250 pages long, and presumably expected it would get more work of similar quality.  And yes, it’s probably a good idea to have and fund institutions that more fully value Aboriginal ways of knowing, and are prepared to take a broader view of what scholarship means (the relevant tenure criterion at First Nations University, for instance, is “consistently high achievement in research and scholarship useful to First Nations’ communities”).  But even if it is located on unceded Musqueam land, UBC ain’t that institution.

I have a hard time imagining this will go anywhere, but Human Rights cases are funny things.  Keep an eye on this case, anyway.

January 25

One In, One Out

I had a discussion a few months ago with a government official who was convinced she knew what was wrong with universities.  “They have no discipline,” she said.  “They just go out and create new programs all the time with no thought as to what the cost implications are or what the labour market implications are, and so costs just keep going up and up.”

I told her she was only half right.  It’s absolutely true that universities have no discipline when it comes to academic programs, but the problem really isn’t on the creation side.  When universities start a new program, it has to go through a process where enrolment is projected, labour market uptake estimated, and all that jazz.  And yes, there is a certain amount of creativity and outright bullshit in these numbers since no one really knows how to estimate this stuff in a cost-effective manner.  But basically, these things have a decent track record: they hit their enrolment targets often enough that they haven’t fallen into disrepute.

The problem is that these enrolment targets aren’t hit exclusively by attracting new students to the institution; there is always some cannibalization of students from existing programs involved.  So while each new program might be successful on its own terms, it often succeeds only by making every other program in the faculty slightly weaker.

And here’s where the lack of discipline comes in.  At some point, institutions need to sit back and take a look at existing programs, and be able to prune them judiciously.  When resources – particularly staffing resources – are static, if you keep trying to pile on new programs without getting rid of the old ones, all you get are a lot of weak programs (not to mention more courses staffed by sessionals).

And here’s one of the biggest, dirtiest secrets of academics: they suck at letting things go.  They are hoarders; nothing, once approved by Senate, must ever be taken away.  Prioritization exercises?  Never!  After all, something might be found not to be a priority.

Getting rid of academic programs is one of the purest examples of Mancur Olson’s collective action problem.  Getting rid of any given program will hurt a few people a lot, while the majority will barely feel the benefits.  The advantage in terms of political mobilization always goes to the side that perceives itself to have the most at stake, and so it is very often able to mobilize support and stop the cuts (this point is made very well in Peter Eckel’s excellent book Changing Course: Making the Hard Decisions to Eliminate Academic Programs).  But over time, if you can never cut any programs, the collective does start to hurt, because of the cumulative effect of wasted resources.

Of course, Olson’s theory also gives us a clue as to how to solve this problem: there need to be stronger incentives within institutions for people to support program closures.  One way to do that would be to introduce a one in, one out rule.  That is, every time Senate endorses a new program, it has to cut one somewhere else.  Such a rule would mean that pretty much anyone in the university who has an ambition to open a program at some point would have an incentive, if not to support specific program closures, then at least to support an effective process for identifying weak programs.

Might be worth a try, anyway.  Because this hoarding habit really needs to stop.

January 22

Higher Education in Developing Countries is Getting Harder

Here’s the thing about universities in developing countries: they were designed for a past age.  In Latin America, the dominant model was that of Napoleon’s Universite de France – a single university for an entire country, which was all the rage among progressives for the first half of the nineteenth century.  In Africa (and parts of Asia), it was a colonial model – whatever the University of London was doing in the late 1950s, that’s basically what universities (the bigger ones, anyway) in Anglophone Africa are set up to do now.  We think of universities as being about teaching and research; by and large, in the global south, universities were about training the future governing elite and transmitting ideology.

Of course, for a long time now, governments and foreign donors have been trying to nudge institutions in the direction of modernization.  By and large, the preference seems to be for something like a 1990s Anglo-American model: market-focused undergraduate studies, more of an emphasis on knowledge creation, etc.  This has been a tough shift, and not just because of the usual academic foot-dragging.

The problems are manifold.  If you want research, you need PhDs.  In much of Africa and Latin America, fewer than half of full-time academics have them.  And because only PhDs can train PhDs, that’s a pretty serious bottleneck.  A few years ago, South Africa announced that it wanted to triple the number of PhDs in the country.  Great, said the universities.  Who’s going to train them?

And of course you need money, but that’s in exceedingly short supply.  Money for equipment, for instance (quick, how many electron microscopes are there in sub-Saharan African universities?  Take out South Africa, and I’m pretty sure the answer is zero).  But also money for materials, dissemination, conferences, etc.  In some African flagship universities, close to 80% of money for research comes from foreign donors.  That money is welcome, of course, but it means your research programs are totally at the whim of changing fads in international aid programs.

As for being market-focused: how does that work in countries where 80% of the formal economy is dominated by government and parastatals?  What’s even the point of building up a good reputation for graduating employable students when public sector HR managers aren’t allowed to discriminate between universities when hiring?

Now, making things worse are some fairly worrying macro-economic trends.  Not the commodities collapse, though that doesn’t help.  No, it’s the secular change in the way development is actually happening; specifically, that countries are starting to de-industrialize at ever lower levels of manufacturing intensity (a phenomenon that economist Dani Rodrik explains very well here).  To put it bluntly, countries are no longer going to be able to get rich through export-driven manufacturing.  There aren’t going to be any more Taiwans or Koreas.  In future, if countries are going to get rich, it’s going to be through some kind of services and knowledge-intensive products.

This, to put it mildly, places enormous pressure on countries to have institutions that are knowledge-intensive and market-oriented.  When human capital trained for service industries becomes the only route to development, universities become vital to national success in a way they simply are not in a society that already has a major manufacturing base.  Simply put, no good universities, no development.  And that’s a world first, because the developed world – including China – got rich before it got good universities.  It’s simply an unprecedented position for higher education anywhere.

But it’s a job for which these universities are simply not ready.  In Africa at least, even when the nature of the challenge is fully understood, universities are neither funded nor staffed adequately for the task; not only are their own internal cultures insufficiently entrepreneurial, but also they simply lack entrepreneurial partners with whom to work on knowledge and commercialization projects.

Getting a whole new set of challenges when you’ve barely got to grips with the old ones is a tall order. It’s a structural issue that international development and co-operation agencies need to think about, and invest in more than they currently do.

January 21

Marginal Costs, Marginal Revenue

Businesses have a pretty good way of knowing when to offer more or less of a good.  It’s encapsulated in the equation MC = MR, and shown in the graphic below.

[Figure: profit-maximisation – marginal cost (MC) and marginal revenue (MR) curves, with the crossover output marked as Q1]

Briefly, in the production of any good, unit costs fall to start with as economies of scale kick in.  Eventually, however, if production is expanded far enough you get diseconomies of scale, and the marginal cost begins to rise.  Where the marginal cost of producing one more unit of a good rises above the marginal revenue you receive from selling it (in the above diagram, the output level Q1), each additional unit loses you money, and hence that’s where you stop expanding production.
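For anyone who wants the algebra, this is just the standard textbook profit-maximization condition (nothing university-specific is assumed here):

```latex
% Profit at output Q, with revenue R(Q) and cost C(Q)
\pi(Q) = R(Q) - C(Q)

% A maximum requires the derivative to be zero, i.e. marginal revenue
% equals marginal cost at the profit-maximizing output Q*
\frac{d\pi}{dQ} = \frac{dR}{dQ} - \frac{dC}{dQ} = 0
\quad\Longrightarrow\quad MR(Q^{*}) = MC(Q^{*})
```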

(This gets more complicated for products like software or apps where the marginal cost of production is pretty close to zero, but we’ll leave that aside for the moment.)

Anyway, when it comes to delivering educational programs, you’d ideally like to think you’re not doing so at a loss (otherwise, you eventually have a bit of a problem paying employees).  You want each program, more or less, over time, to come close to paying for itself.  It’s not the end of the world if some don’t – cross-subsidization of programs is a core function of a university, after all – but it would be nice if they did.  In other words, you really want each program to have a production function where the condition MC = MR is fulfilled.

But here’s the problem.  Marginal revenue’s relatively easy to understand: it’s pretty close to average revenue, after all, though it gets a bit more complicated in places where government grants are not provided on a formula basis, and there’s some trickiness when you start calculating domestic fees vs. international fees, etc.  But the number of universities that genuinely understand marginal cost at a program level is pretty small.

Marginal costs in universities are a bit lumpy.  Let’s say you have a class of twenty-five students and a professor already paid to teach it.  The marginal cost of the twenty-sixth student is essentially zero – so grab that student!  Free money!  Maybe the twenty-seventh student, too.  But after a while, costs do start to build.  Maybe at the thirtieth student there’s a collective bargaining provision that says the professor gets a TA, or assistance in marking.  Whoops!  Big spike in marginal costs.  Then when you get to forty, the class overfills and you need to split the course into two, get a new classroom, and a new instructor, too.  The marginal cost of that forty-first student is astronomical.  But the forty-second is once again almost costless.  And so on, and so on.
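To make the lumpiness concrete, here is a minimal sketch in Python of what such a per-student marginal cost function looks like.  The dollar figures are entirely made up, and the thresholds simply follow the hypothetical class described above:

```python
# A toy, per-student marginal cost function illustrating the "lumpy" costs
# described above.  All dollar figures are invented; the thresholds (a TA at
# student 30, a section split at student 41) come from the example in the text.

TA_THRESHOLD = 30          # assumed collective-agreement trigger for TA support
SPLIT_THRESHOLD = 41       # assumed point at which the course must be split in two
TA_COST = 8_000            # hypothetical cost of a marking TA for the term
NEW_SECTION_COST = 25_000  # hypothetical cost of a new instructor plus classroom

def marginal_cost(nth_student: int) -> int:
    """Cost of admitting the nth student to an already-staffed class."""
    if nth_student == SPLIT_THRESHOLD:
        return NEW_SECTION_COST  # the "astronomical" student who forces a split
    if nth_student == TA_THRESHOLD:
        return TA_COST           # the spike when TA support kicks in
    return 0                     # otherwise, essentially costless

if __name__ == "__main__":
    for n in (26, 27, 30, 40, 41, 42):
        print(f"student #{n}: marginal cost = ${marginal_cost(n):,}")
```

Averaged over many sections, a step function like this is what a department- or faculty-level rule of thumb would be trying to approximate.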

Now obviously, no one should measure marginal costs quite this way; in practice, it would make more sense to work out averages across a large number of classes, and work to a rule of thumb at the level of a department or a faculty.  The problem is that very few universities even do that (my impression is that some colleges have a somewhat better record here, but the situation varies widely).  Partly, it’s because of a legitimate difficulty in understanding direct and indirect costs: how should things like light, heat, and the costs of student services, admissions, etc., be apportioned – and then there is the incredible annoyance of working out how to deal with things like cross-listed courses.  But mostly, I would argue, it’s because no one wants to know these numbers.  No one wants to make decisions based on the truth.  Easier to make decisions in the dark, and, when something goes wrong, blame it on the Dean (or the Provost, or whoever).

Institutions that do not understand their own production functions are unlikely to be making optimal decisions about either admissions or hiring.  In an age of slow revenue growth, more institutions need to get a grip on these numbers, and use them in their planning.

January 20

The Inter-Generational Equity Thing

I see that one of my favourite student groups, the Ontario Undergraduate Student Association (OUSA), has come out in favour of a tuition freeze.  Fair enough; not many students endorse fee increases, after all.  But the stated rationale for wanting one is a bit disappointing – mixing, as it does, poor historical analysis with poor generational politics.

Here’s their thinking:

In 1980, student contributions to university operating budgets in Ontario, which include tuition and fees, were only 18 per cent. In 2014, accounting for inflation, that number reached 51 per cent. I’m no financial planner, but I do believe that if I invest 33 per cent more into something—I should probably receive a comparable amount in return, or at the very least, expect to.

So let me ask: are there more jobs available for university graduates? More co-op and paid internship opportunities? Are students being taught to articulate their soft skills to employers? Has the ratio of students to faculty in the classroom improved? Most importantly, are university degrees more valuable now than they were in 1980? If the answer to these questions, particularly the last one, is no, then why are students paying more than ever for a university education?

(You can read the complete document here.)

There are a number of errors here.  Are there more jobs for graduates?  Yes, of course there are.  Maybe not relative to the number of graduates, but even so, graduate unemployment rates are a lot lower than they were in the early 80s and early 90s (though of course that has more to do with the state of the economy than anything else).  More co-ops and paid internships?  Incomparably more.  In 1980, Waterloo was still about the only place doing co-op; today, the practice is widespread (and at Waterloo itself, the number of co-op students per year is at least three times what it was back then).  The only piece that’s unambiguously true here is the bit about student-teacher ratios.

If we really want to understand why students are paying more for their education, we need look no further than two facts: a) enrolments tripled, and b) the per-student cost of education rose (not always for good reasons, but true nonetheless).  Governments paid for part of this – admittedly less so in Ontario than elsewhere in the country – and students paid for the rest.

And this is why we have to be careful when making comparisons over time.  Of course, we could bring student contributions back down to 18% of total costs: but remember, part of what that increased contribution bought was vastly increased access.  Anyone want to make that trade and return to a time of cheaper education for a luckier few?  No, thought not.

So that’s the analytical error.  The political error – and it’s a seductive one, I’ll admit – is to claim that every time a new generation doesn’t get something that the old generation got, it’s “unfair” and a basis to lay a claim on state resources.  But this way madness lies.  Where PSE is concerned, it’s tantamount to saying “our parents were oversubsidized and we demand the same treatment”.  Or maybe, “we’re getting a pretty good deal on PSE, but we demand that our deals be ludicrously good like they were in the 70s”.

For a whole bunch of very long-term demographic and economic reasons, today’s students are going to find it harder than the boomers, and even the Gen Xers did (also harder than the generation that passed through university between 2000 and 2005, who did pretty well).  There’s not a whole lot anyone can do about that: some cohorts just have it easier than others, and progress isn’t always linear.  Policy shouldn’t be totally insensitive to these shifts, but neither should our goal be to preserve certain benefits in amber just because “that’s what our parents got”.

None of this is to say there aren’t decent arguments in favour of tuition freezes, or even that the “universities need to show value for money” argument is wrong.  (If it were me arguing the case, I’d push for limiting increases in student fees to whatever the increase in public funding is.)  But arguing on the basis of changes that have occurred over 35 years is a mistake; too much of the money spent over that period did too much good to be criticized.  Inter-generational arguments are trickier than they look, both analytically and politically.

January 19

The Allure of the (G)Olden Days

Among the many things that drive me completely crazy about discourse in higher education is the mythologizing about “the olden days”.  You know, before “neoliberalism” came along, and research was non-instrumental, people “valued knowledge for its own sake”, classes were tiny, and managers were things that happened to other people.

Whenever I hear this kind of thinking, part of me wants to say “and when was this again?” But that’s a bit flip: there is some truth to each of these claims of former idyll.  However, each needs to come with a caveat because it either wasn’t quite as good as it seems in retrospect, or it was abandoned for some pretty good reasons.

Start with research.  Yes, there was a time when research came with a lot fewer forms (major paperwork really began in the 1990s, so far as I can tell) and demands to demonstrate short-term relevance were not quite so prominent.  But back then there was also a *lot* less money for research, and tenure standards didn’t demand quite so much of it.  Less money, but less research needed for career requirements.  And, I might add, significantly higher expectations with respect to teaching loads.

As for the days when people “valued knowledge for its own sake”?  This is a favourite of people who disdain – not entirely without reason – the continuing drift towards professional fields of study, and prefer classic (usually liberal) education.  And yes, there were olden days – say, up until the early 1970s – where this was true.

But the professionalization of curriculum was, for the most part, a reaction to massification.  It became pretty clear in the early 1970s that there weren’t unlimited jobs for arts graduates.  And since the whole point of massification was to provide more routes into the upper middle class, a lot of people – including students themselves – began demanding programs that were more applied.  In a very real way the “valuing knowledge for its own sake” thing was only possible because rates of access were about a quarter of what they are now.  If we had the chance to make that decision again, does anyone really think we should make it differently?

Same thing – to a degree – about managerialism.  If you read books like Peter Kent’s Inventing Academic Freedom, which provides an interesting picture of a Canadian campus in the late 1960s (and which I reviewed back here), you’ll see that in the late ’60s there was virtually none of the managerial infrastructure that now exists.  Of particular note is the degree to which academics themselves took on pastoral roles on campus.  But precisely because of: a) massification, and b) increased research load, profs simply opted out of these types of roles in the 1970s and 1980s, and loaded the work onto a new, largely professional student services system.

A final point to make about the halcyon days: professors really didn’t get paid anywhere near as well as they do today.  If you go back to the 1950s or 1960s, academics’ pay was much closer to the national median.  Compare that to now, when making associate prof pretty much automatically puts someone in the top 5% of the individual income distribution.  Ironically, part of the reason for this is that the arrival of all those widely-derided “administrators” relieved professors of their “non-academic” duties, which made the professoriate itself more of a “profession” – and that was key to achieving higher pay.

So yes, there were some halcyon days – for some, at least.  But they existed partially because access was restricted, pay was low, and teaching loads were high.  Now, hands up: who would trade yesterday for today?
