HESA

Higher Education Strategy Associates

Author Archives: Alex Usher

July 21

University of Saskatchewan Detritus

We all remember this spring’s controversy at the University of Saskatchewan over the firing of Robert Buckingham, which resulted in the resignation of the University’s Provost, Brett Fairbairn, and the firing of the President, Ilene Busch-Vishniac.  Despite all the coverage, a number of key questions were never answered, like “how could anyone possibly think firing a tenured professor was a good idea?”  And, “whose idea was it to fire him anyway – the Provost’s or the President’s?”

We now have more insight, as Fairbairn recently released a five-page letter providing his perspective on events.  Two key points from his account:

  • The decision to fire Buckingham as Dean was a group decision.  The Provost, “leaders responsible for Faculty Relations, HR internal legal expertise and Communications”, and the President (by phone) were all present.  But the key question of whether to dismiss him from the university altogether was referred to HR for further study.  At this point Busch-Vishniac told Fairbairn: “I will stand behind any actions you deem necessary and will not second-guess”.
  • The decision to fire him from both jobs was the HR department’s recommendation.

How HR came to this conclusion isn’t clear; Fairbairn notes that it had happened before at U of S in a case where there had been an irreparable breakdown in relations between employer and employee. Without knowing the case to which he’s referring, it’s hard to know what to make of this.  Certainly, the employer-employee relationship with Buckingham as a dean was irreparably damaged (which is why they were correct to fire him); it’s not at all clear that he couldn’t have remained as a faculty member since he wouldn’t have had any real contact with any of the superiors whose trust he had abused as Dean.  For whatever reason, Fairbairn decided to take the “expert advice” from HR, and did so without looping back to the communications people to get their input (which might have been valuable) or checking with Busch-Vishniac.

Far from backing up Fairbairn as promised, Busch-Vishniac threw him under the bus and asked for his resignation three days later.  That was emphatically the wrong call.  From the moment she gave the go-ahead for Buckingham’s dismissal, it was clear that either both of them would stay, or neither would.  Fairbairn decided to go, astutely noting that “the only thing worse than blame and recrimination among senior leaders is mutual recrimination among senior leaders”.

Fairbairn’s letter is a valuable peek into how crises get managed at universities.  I think it shows him as a manager with mostly the right instincts, but who erred in accepting some terrible advice from professionals who should have known better.  Others – mostly people who genuinely have no insight into how major organizations function – will probably see this distinction as irrelevant since the real crime was firing Buckingham as a Dean in the first place.  Former CAUT director James Turk, in particular, has made the “managers should have a right to criticise each other publicly” case – to which the correct response is: “and how much freedom did Turk allow his staff and executive to criticise his management as CAUT director?”.

If I were at the University of Saskatchewan, though, my main question after reading Fairbairn’s letter would be: “how is it that the HR department got off comparatively lightly?”  Food for thought.

July 14

Paul Cappon, Again

You may have noticed that Paul Cappon – former President of the Canadian Council of Learning – had a paper out last week about how what the country really needs is more federal leadership in education.  It is desperately thin.

Cappon starts by dubiously claiming that Canada is in some sort of education crisis.  Canada’s position as a world leader in PSE attainment is waved away thusly: “this assertion holds little practical value when the competencies of those participants are at the low end compared with other countries”.  In fact, PIAAC data shows that graduates of Canadian universities, on average, have literacy skills above the OECD average.  Cappon backs up his claim with a footnote referencing this Statscan piece, but said document does not reference post-secondary graduates.  Hmm.

And on it goes.  He cites a Conference Board paper putting a $24 billion price tag on skills shortages as evidence of “decline”, even though there’s no evidence that this figure is any worse than it has ever been (Bank of Canada data suggests that skills shortages are always with us, and were worse than at present for most of the aughts), or that a similar methodology would not lead to even bigger figures in other countries.  He cites a low proportion of STEM graduates in engineering as cause for concern, despite a total lack of evidence that this percentage has anything at all to do with economic growth.

Ridiculously, he proclaims that the Canadian education system (not higher education – all education from kindergarten onwards) is less effective than the Swiss because they beat us on some per capita measures of research output (which are of course largely funded through pockets entirely separate from the education budget).  Yes, really.  For someone so in favour of evidence-based policy, Cappon’s grasp on what actually constitutes evidence is shockingly tenuous.

Having cobbled together a pseudo-crisis, Cappon concludes that “the principal cause of our relative regression is that other countries are coordinating national efforts and establishing national goals… Canada, with no national office or ministry for education, is mired in inertia, each province and territory doing its best in relative isolation.” Now there is no evidence at all that the existence of national systems or national goals makes a damn bit of difference to PIAAC outcomes.  Germany comes in for all sorts of praise because of its co-operation between national and state governments, but their PIAAC and PISA outcomes are actually worse than Canada’s.  Yet Cappon believes – without a single shred of evidence – that if only there were some entity based in Ottawa that could exercise leadership in education that everything would be better.

Two points: first, our nation rests on a very simple compromise.  Back in 1864, the *only* way in which Catholic, Francophone Lower Canada could be tempted into agreeing to a federal government with representation by population was if that new level of government never, ever, ever got its hands on education.  That was, and is, the deal.  It’s not going to change.  Period.

Second, everyone needs to remember that there was a time not long ago that the Director-General of CMEC suggested exactly such a body to some deputy ministers in Ottawa, and Ottawa created a body not dissimilar to what Cappon is describing.  But said CMEC DG didn’t tell the provinces he was negotiating this deal with the feds.  When he was offered the leadership of the new organization, he jumped, taking several CMEC senior staff with him.  The result was that provinces, feeling betrayed by the DG’s chicanery, refused to work with the new organization.  As a result, it achieved little before being shut down in 2011.

That DG, of course, was Paul Cappon.

This report is essentially a self-justification for a dishonest act performed nearly a decade ago, rather than a genuine contribution to public policy.  If Cappon actually cared about repairing federal-provincial relations around learning, a mea culpa would have been more appropriate.

July 07

How to Measure Teaching Quality

One of the main struggles with measuring performance in higher education – whether of departments, faculties, or institutions – is how to measure the quality of teaching.

Teaching does not go entirely unmeasured in higher education.  Individual courses are rated by students through course evaluation surveys, which occur at the end of each semester.  The results of these evaluations do have some bearing on hiring, pay, and promotion (though how much bearing varies significantly from place to place), but these data are never aggregated to allow comparisons of quality of instruction across departments or institutions.  That’s partly because faculty unions are wary about using individual professors’ performance data as an input for anything other than pay and promotion decisions, but it also suits the interests of the research-intensive universities who do not wish to see the creation of a metric that would put them at a disadvantage vis-a-vis their less-research-intensive brethren (which is also why course evaluations differ from one institution to the next).

Some people try to get around the comparability issue by asking students about teaching generally at their institution.  In European rankings (and Canada’s old Globe and Mail rankings), many of which have a survey component, students are simply asked questions about the quality of courses they are in.  This gets around the issue of using course evaluation data, but it doesn’t address a more fundamental problem, which is that a large proportion of academic staff essentially believes the whole process is inherently flawed because students are incapable of knowing quality teaching when they see it.  There is a bit of truth here: it has been established, for instance, that teachers who grade more leniently tend to get better course satisfaction scores.  But this is hardly a lethal argument.  Just control for average class grade before reporting the score.
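That adjustment is easy to sketch.  With invented data (both the grades and the scores below are illustrative, not drawn from any real survey), regressing satisfaction scores on average class grade and reporting the residuals strips out the leniency effect:

```python
# Sketch: adjust course satisfaction scores for grading leniency by
# regressing scores on average class grade and reporting residuals.
# All data below is invented for illustration.

courses = [
    # (average class grade in %, mean satisfaction score out of 5)
    (78, 4.4), (72, 4.1), (65, 3.6), (81, 4.6), (69, 3.9), (74, 4.0),
]

n = len(courses)
mean_g = sum(g for g, _ in courses) / n
mean_s = sum(s for _, s in courses) / n

# Ordinary least squares slope: how much scores rise per grade point
slope = sum((g - mean_g) * (s - mean_s) for g, s in courses) / \
        sum((g - mean_g) ** 2 for g, _ in courses)

# Leniency-adjusted score = raw score minus what the grade "predicts"
for grade, score in courses:
    adjusted = score - slope * (grade - mean_g)
    print(f"grade {grade}%: raw {score:.2f}, adjusted {adjusted:.2f}")
```

The adjusted scores have the same mean as the raw ones, so nothing is lost in aggregate; lenient graders just stop getting a free ride.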

It’s not as though there isn’t a broad consensus on what makes for good teaching.  Is the teacher clear about goals and expectations?  Does she/he communicate ideas effectively?  Is he or she available to students when needed?  Are students challenged to learn new material and apply this knowledge effectively?  Ask students those kinds of questions and you can get valid, comparable responses.  The results are more complicated to report than a simple satisfaction score, sure – but it’s not impossible to do so.  And because of that, it’s worth doing.

And even the simple questions like “was this a good course” might be more indicative than we think.  The typical push-back is “but you can’t really judge effectiveness until years later”.  Well, OK – let’s test that proposition.  Why not just ask students about a course they took a few years ago, and compare their answers with the ones they gave in the course evaluation at the time?  If they’re completely different, we can indeed start ignoring satisfaction-type questions.  But we might find that a good result today is in fact a pretty good proxy for results in a few years, and therefore we would be perfectly justified in using it as a measure of teaching quality.
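That proposed test is cheap to run once the paired data exists.  A minimal sketch, computing a Pearson correlation on invented then-versus-now ratings:

```python
# Sketch: do end-of-term course ratings predict how graduates rate the
# same course years later?  Pearson correlation; the data is invented.

paired = [
    # (rating at the time, rating of the same course years later), out of 5
    (4.5, 4.2), (3.0, 3.4), (4.8, 4.6), (2.5, 2.9), (3.9, 3.7), (4.1, 4.3),
]

n = len(paired)
mx = sum(x for x, _ in paired) / n
my = sum(y for _, y in paired) / n
cov = sum((x - mx) * (y - my) for x, y in paired)
sx = sum((x - mx) ** 2 for x, _ in paired) ** 0.5
sy = sum((y - my) ** 2 for _, y in paired) ** 0.5
r = cov / (sx * sy)

# A high r would suggest contemporaneous scores are a usable proxy for
# longer-run judgments; a low r would justify discounting them.
print(f"Pearson r = {r:.2f}")
```

With real data, of course, the correlation could come out anywhere; the point is that this is an empirical question, not a matter of faith.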

Students may be inexperienced, but they’re not dumb.  We should keep that in mind when dismissing the results of teaching quality surveys.

June 23

The Effects of Tuition Fees (Part 1)

For the last eighteen months or so, I’ve been working on a project with colleagues Dominic Orr and Johannes Wespel of the Deutsche Zentrum für Hochschul- und Wissenschaftsforschung (DZHW) for the European Commission, looking at the effects of changes in tuition fees and fee policies on institutions and students.  The Commission published the results on Friday, and I want to tell you a little bit about them – this week I’ll be telling you about the effects on institutions, and next week I’ll summarize the results with respect to students.

The first question we answered had to do with whether or not a rise in tuition ultimately benefits higher education institutions.  Critics of fees sometimes suggest that extra fees do not in fact result in institutions receiving more money, because governments simply pull a fast one on the public and withdraw public money from the system, thus leaving institutions no better off.  Our examination of nine case studies revealed there were certainly some occasions where this was the case – Canada in the mid-90s, Austria in 2001, and the UK in 2012 – but that in the majority of cases fee increases were accompanied by stable or increased government funding.  Moreover, in all the cases where there was an accompanying decrease in public funding, it was signalled well in advance by governments, and indeed the increase in fees was deliberately designed to be a replacement for public funds. We did not find a case where a government “pulled a fast one”.

The second question we asked was how universities reacted to the introduction of fees: did they suddenly start chasing money and becoming much more sensitive to the demands of students and donors?  The answer, by and large, was no, for three reasons.  First, tuition isn’t the only financial incentive on offer to institutions; particularly if they are already funded on a per-student basis, the introduction or increase of fees isn’t likely to change behaviour.  Second, institutions won’t go after fees in ways that they think will negatively affect their prestige.  In Germany for instance, many universities have considerable latitude to raise income via teaching through continuing-education-like programs, but effectively they don’t do this, because they believe that engaging in that sort of activity isn’t prestige-enhancing.  And third, institutions often delay altering their behaviour too much because they don’t believe government policy will “stick”.  In Germany, specifically, the feeling was that the introduction of fees was unlikely to last and so there was no point in getting too invested in attracting new students to take advantage of it.

In fact, although fees in public institutions are often touted as a way to make universities more flexible and more responsive to business, the labour force, etc., this never actually works in reality, because universities are saddled with enormous legacy costs (you can close a program, but you still have to pay the profs), and have a particular self-image that keeps them closely tied to traditional ways.  What does seem to work – at least to some degree – is to allow the emergence of new types of higher education institutions altogether.  In Poland, it was only the emergence of private universities that allowed the system to take on the explosion of demand in the 1990s.  In Finland, an entirely new type of higher education institution (ammattikorkeakoulu or “Polytechnics”) was developed to take care of applied education, and accounted for 80% of all enrolment growth since 1995.

Next week: the effects on students.  See you then.

June 16

Summer Reading

Hi all.  Enjoying summer yet?

Three recent works that I think are worth a peek at over the summer:

1. George Fallis’ Rethinking Higher Education: Participation, Research and Differentiation.  The thing you need to know about George Fallis is that the size of the books he writes is all out of proportion to the point he is trying to make.  They’re good books, substantial books, useful books, but the actual point he makes could probably be made in an article of 15 pages or so.  And so it is here: this is a pretty good all-around 250-page look at the Ontario university system.  And at the end of the day he makes two original points: 1) demographic change means that there probably isn’t too much growth left in the Ontario system, so we should stop framing policy in terms of access; and, 2) what Ontario really needs is a policy on university research and graduate studies, rather than allowing them to continue to grow haphazardly as afterthoughts to undergraduate enrolments.  I’m not entirely sold on the first proposition (there’s scope for growth from people switching from colleges to universities, and from continued international enrolments), but he’s absolutely bang-on for the second one.  As a recommendation for policy it’s so far from current government practice that it’s basically in another time zone, but it’s a point that needed to be made.

2. Just released last week by The Council of Ministers of Education, Canada was a fascinating report entitled The Role of Education Agents in Canada’s Education Systems.  Authored by Robert Coffey and Leanne Perry of Michigan State University, the report provides everything from a survey of education providers about their use of education agents (less enlightening than you’d think), to an international comparison of rules and codes of conduct regarding the use of agents (Canada is apparently more of a wild west than, say, Australia or the UK), to a really top-notch discussion of why and how institutions use agents in the first place, and how misconduct occurs and is dealt with.  There’s little that’s earth-shattering in here, but as a primer on an increasingly important topic, it’s well worth a read.

3. An edited volume of works on Australian higher education called The Dawkins Revolution 25 Years On (not available in Canada, but can be ordered directly through Melbourne University Press).  Back in 1988, the country’s Education Minister, John Dawkins, brought in a series of changes (converting colleges into universities, introducing a limited amount of competition into funding, and introducing student charges via the Higher Education Contribution Scheme) that fundamentally restructured higher education in a way rarely seen anywhere in the world, let alone Australia.  This excellent book of essays from people like Simon Marginson, Bruce Chapman, Gavin Moodie, Julie Wells, and Andrew Norton both traces the development of the Dawkins agenda and cogently explains its impacts over the subsequent 25 years.  It’s the kind of book you read and wonder “why can’t we write this kind of stuff in Canada”?  Part of the answer to that, of course, is that our system is much more fractured nationally and less amenable to single narratives than is Australia’s; part of it too is that we’ve never had anything as interesting as Dawkins’s reforms to write about.  But even so, our attempts at similar essay-collections (for example, Higher Education in Canada and A Challenge for Higher Education in Ontario) fall short of the standard set here.  This book deserves a wide readership.

June 12

Arigato, Sayonara

With election night, the World Cup getting under way, and tomorrow being Friday the 13th, it seems like as good a time as any to shut down the blog for the summer – I apologize to those of you who were hoping for one last rant on election results (though you can probably catch my thoughts over on Twitter, where my handle is @AlexUsherHESA).  Starting Monday, this blog will be on a once-a-week schedule until August 25th, when normal daily service will resume.

We’ve got an interesting program of research going on at HESA Towers over the summer, and we’ll have some very interesting material for you come the Fall, especially around internationalization.  I’ll be spending a bit of time Down Under to get the skinny on some of the big changes currently happening there in higher education, and will report the findings back to you.  And we’ll have some new work on affordability that I think you’ll rather enjoy (or, at least those of you who don’t hate-follow me will enjoy it).

But before I sign off, two very quick requests:

1) My summer reading project is to read as many institutional histories as possible (thrill a minute at HESA Towers, never a dull moment).  If you know the name and author of your university or college’s history, could you send them my way so I can find a used copy online?

2) You guys read my stuff every day (well, most days, anyway).  Are there subjects you want to hear more about?  Less?  If you can take a couple of minutes to let me know, I’d really appreciate the feedback.

Have a great summer, everyone.

June 11

Tremors in China

I wanted to point everyone’s attention to a small article in the Chinese People’s Daily last Wednesday, which is potentially of enormous significance.

Apparently, of the country’s 31 Provinces, Municipalities, and Autonomous Regions, only seven have disclosed their figures with respect to higher education recruitment.  Every single one of them missed their targets, some by over 10%.  And these seven provinces represent a mix of economic backgrounds: Anhui and Qinghai are relatively poor interior provinces; Shandong and Fujian are richer coastal ones, and the balance are somewhere in between.  It’s a broad, broad swathe of the country – which makes it unlikely either that it’s a one-off fluke, or that the trends are much different in other non-reporting provinces.

Some are suggesting this is a demographic thing – but this is frankly nonsense.  Youth cohorts have been shrinking for several years now, and that hasn’t stopped the flood of students heading to higher education.  This is different.  This is a change in the participation rate.  It’s a change in the proportion of people who want to go to higher education.  It’s families finally starting to react to the high level of graduate under-employment.

This was the kind of thing the Chinese government was trying to forestall when it announced plans to convert 600 universities (out of 2,400 in total) into polytechnics.  Indeed, given that the data was for 2013, it might actually have been the cause of the Party’s decision to transform these institutions.  But there’s no guarantee that students want that kind of education either; as I explained back here, a major demand-driver for education in Confucian societies is the perception of moral goodness attached to higher studies, which may not be present in more technologically-oriented programs.  The Party’s assumption that families skeptical about university education will head to polytechnics is unproven: it may be university or nothing.

What are the knock-on effects of this?  Remember that Chinese public universities took on $41 billion in debt to expand.  If they don’t have fee-paying students filling those seats, the chances of some universities defaulting are going to rise.  Ultimately, none are likely to fail – the prestige hit on local government would be too big – but you can see it leading to a general reining-in of university finance.

And the effect on Chinese students heading abroad?  Well, the era of scarcity in Chinese universities is already well and truly over – even before this drop, over 76% of gaokao-takers now get a place in universities.  Foreign universities don’t fulfill a demand-absorption function anymore – they are very clearly simply competing on quality with domestic institutions.  So far, there is no indication that this demand is slackening, which implies a great hunger in China for quality education, which not all local universities can yet provide.

But take it as a warning.  Youth numbers are declining.  Demand for university education even within the youth cohort is declining.  Eventually, this may translate into lower demand for foreign education as well.  Institutions who depend too heavily on this market may get burned.

June 10

Crazy Managerial Imperatives Around International Students

One of the weirdest – and I mean totally bat-guano-crazy – things in Canadian higher education is the way recruitment of international students is managed.  Although international student recruitment is often seen simply as a money-spinner for institutions, the fact of the matter is that most institutions aren’t coming close to maximizing revenue from this source.  And that’s not because of any high-minded motives of institutions turning away students they don’t think are suitable for their university experience, either.  It’s simply because of the way institutions’ internal budget controls work.

In a regular business, sales forces get the budget they need to hit revenue targets.  But if sales are going well, and management thinks they can get more money by investing money in sales, then the sales force will get more money.  Simple as that.

Compare this to what is happening at many institutions in Canada around international recruitment budgets, where the discussion is more along these lines:

Senior Admin (to international office):  Can you get us more international students?  We could really use the cash.  Budget cuts, you know.

International Office: Um, sure.  But can I have some extra money for that?  Recruitment actually costs money.  Not to mention support once we get them here.

Senior Admin: What?  More money?  Didn’t I just tell you we don’t have any money?

International Office: But… we’ll get our money back and more (NB. There are circumstances where this isn’t true, as described back here, but for the moment let’s assume it is).  You need to spend money to make money.

Senior Admin: Look, there’s a budget freeze on.  If I give non-academic units more money, there’ll be hell to pay.  You know how envious everyone already is that you guys get to fly all over the place?

International Office: (Desperately suppressing the urge to start listing decanal and vice-presidential visits abroad) But you’re not “giving us money”. We generate revenue!

Senior Admin: Yes, but there’s nothing in our budget process that allows us to reflect that.

International Office: (repeatedly bangs head against wall.)

Seriously, this happens.  The idea of investing money to make money later on isn’t entirely foreign (sorry) to universities – but doing so via the international office often isn’t possible.  To a large extent that’s because of their historical roots.  Most of them weren’t set up as revenue-generating units – until a decade ago they were mostly busy doing things like checking students’ health insurance, and helping profs on sabbatical deal with accommodation and paperwork.  As a result, they tend to get tied up with other administrative units like Student Services or Human Resources (which tend to take a hit when times are bad), rather than with revenue units like Advancement (which usually doesn’t).

(As an aside, I’m pretty sure this is one of the reasons international offices turn to agents, rather than building up their own networks abroad; the latter requires upfront investment, while the former just requires paying for students once they arrive – which, as you can imagine, is a lot easier to sell within the bureaucracy.)

If institutions are serious about playing the international game, they need to get serious about how they fund and manage it.  Too many haven’t bothered to do that.

June 09

Teaching Load Versus Workload

I often get into discussions that go like this:

Me: Over time, the number of classes each professor teaches has gone down.  Places where people used to teach 3/2 (three classes one term, two the other) now teach 2/1.  Places where 4/3 or even 4/4 were common are now 3/2.   This has been one of the main things making higher education more expensive in Canada.

Someone else (usually a prof): Yeah, but classes are so much larger now than they used to be.

Me: Do you not think that teaching fewer classes may be the cause of higher average class size?  Do you think that if everyone taught more classes average class size would fall?

(nota bene: This isn’t the whole story, obviously.  Student-staff ratios have gone up to such a degree that even if profs were teaching the same number of courses, numbers would still be up a bit.  Though how much is hard to say, because of the changing use of sessional lecturers.)

Someone else: Does it matter?  Same number of students, same amount of work.

Me: Is it?  Are three classes of fifty students actually the same amount of work as five classes of thirty students?  Doesn’t less class prep time more than make up for the increase in marking?

Someone else: Um, well, yeah.  Probably.  But we’re still doing lots of committee work!  And tenure requirements have become much more punishing than they used to be!  And those teaching loads don’t count graduate student supervisions.

Me: No doubt, committee work can take up a lot of time – though much of it exists simply to make the university less effective.  But that research one – that’s not distributed equally across the university, is it? I mean, we know that the pace of publication falls pretty quickly after tenure is granted (see figure 3 of this PPP article by Herb Emery).  And not all university research is of the same quality: Well over 10% of all Canadian faculty (24% in the humanities) have never had a publication cited by anyone else (HESA research, which we demonstrated back here).

Someone else:  And graduate supervision?

Me: Fair point.  But graduate supervision is all over the place.  Supervising a PhD in Science tends to be more intensive than in Arts.  And course-based Master’s students are increasingly more like undergraduates than doctoral students in the loads they bring.  Hard to measure.

Someone else: But shouldn’t all this be measured?

Me: Of course.  But notice how Canadian university Collective Bargaining Agreements avoid the question of overall workload, even though they often get really specific about teaching loads.  Universities don’t want to measure this stuff because it would expose how many profs are working way too hard, and unions don’t want to measure this stuff because it would expose how many profs aren’t.  Look how hard both sides worked to discredit the HEQCO paper on professorial productivity, which posed exactly that question.

Someone else: is this ever going to change?

Me: Governments could put pressure on institutions to actually enforce the bits of the CBAs that require faculty to actually do the hard-to-measure stuff (committee work, research).  Junior staff could make more of a fuss within the unions to start ensuring equal treatment of workloads within the bargaining unit.  Short of that, no.

Someone else: Aren’t you a bit cynical?

Me: Around here, hard not to be.
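The prep-versus-marking tradeoff in the exchange above can be put in rough numbers.  The hours per class and per student below are invented for illustration, but the structure of the arithmetic is the point:

```python
# Rough arithmetic for the question above: are three classes of fifty
# students the same amount of work as five classes of thirty?  Assumes
# prep is a fixed cost per class while marking scales with headcount.

def term_hours(n_classes, class_size, prep_per_class=100, marking_per_student=2.5):
    """Total teaching hours: fixed prep per class + marking that scales with heads."""
    return n_classes * prep_per_class + n_classes * class_size * marking_per_student

few_big = term_hours(3, 50)     # 3 * 100 + 150 * 2.5
many_small = term_hours(5, 30)  # 5 * 100 + 150 * 2.5
print(few_big, many_small)
```

At equal total headcount, marking hours are identical, so the difference is entirely the fixed prep cost of the extra classes – which is why fewer, bigger classes mean less total work, whatever values you plug in for the two parameters.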

June 06

Governance, Stress-Tests, and Preparing for the Worst

It’s the little things that worry me.  The slowdown in China.  The continuing failure of the Euro-zone to grow.  The fact that the ratio of the US Stock Market Cap to GDP is approaching the levels seen right before the crashes of 2001 and 2008.  Our economy might muddle through, or it might not.

Now add on to economic uncertainty the clear evidence that governments are showing decreasing enthusiasm about supporting higher education – nationally, there’s been a real decline in provincial higher ed funding over the past four years to the tune of about three percent.  Better than some sectors, certainly, but also very problematic, given that our universities essentially seize up if their budgets don’t grow at least 3.5% per year.  Oh, and throw in the clear reluctance of most governments to let tuition rise to compensate for any funding cuts.

Given all this, I’d say there’s a reasonable chance that universities in more than one province are heading for budget cuts on the order of 10% or so.  It’s likeliest in Ontario, but it could happen pretty much anywhere.
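The arithmetic behind that estimate is simple: costs that need 3.5% annual growth set against frozen funding open a double-digit gap within a few years.  A back-of-envelope sketch, with the index values invented:

```python
# Back-of-envelope sketch of the structural gap: costs growing at the
# ~3.5% a year the text says universities need, against grant revenue
# that is flat.  Starting values are indexed to 100 for illustration.

budget = 100.0          # year-0 cost base
revenue = 100.0
cost_growth = 0.035
revenue_growth = 0.0    # frozen government funding

for year in range(1, 5):
    budget *= 1 + cost_growth
    revenue *= 1 + revenue_growth
    gap_pct = (budget - revenue) / budget * 100
    print(f"year {year}: required cut ~ {gap_pct:.1f}%")
```

Flat funding alone forces a cut on the order of 10% within three or four years; an actual funding decline, like the one described above, gets there faster.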

Is anyone ready for that?  Does anyone have a plan in their back-pocket that would help them get through that kind of restructuring?

I can hear all of you rolling your eyes.  Of course not - who does that?

Well, almost everyone, really.  Any business worth its salt has some pretty clear contingency plans if revenue drops.  Colleges don’t have exact contingency plans per se, but they pretty much all measure break-even points on a per-program basis; if required to cut, they would be able to produce plans very quickly.
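The per-program break-even measure colleges run is simple to sketch.  All the dollar figures below are illustrative:

```python
# Sketch of a per-program break-even check of the kind colleges run:
# how many students does a program need before revenue covers costs?
import math

def break_even_enrolment(fixed_cost, revenue_per_student, variable_cost_per_student):
    """Smallest enrolment at which the program stops losing money."""
    margin = revenue_per_student - variable_cost_per_student
    if margin <= 0:
        raise ValueError("program can never break even at this price")
    return math.ceil(fixed_cost / margin)

# e.g. $600k of fixed costs (salaries, space), $9,500 in revenue per
# student (tuition plus grant), $2,000 in per-student variable costs
print(break_even_enrolment(600_000, 9_500, 2_000))
```

With those numbers in hand for every program, producing a cut list on short notice is a sorting exercise, which is exactly why colleges can move quickly when required.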

But universities?  It is to laugh.  They’ll plan for growth until the cows come home.  But plans to shrink?  Never.

Yet, it’s not as though they can claim blindness to the danger.  It’s not as though universities don’t remember the 1990s, when double-digit cuts occurred.  It’s not as though cuts on a limited scale aren’t already happening.  Despite the dangers, universities continue to merrily sign agreements with faculty that commit them to large expenditure increases in the future (Hey!  U of Ottawa!  Yeah, I’m looking at you!) instead of focussing on contingency plans.

I can sort of understand the reluctance of administrators to take this step, given the predictable faculty backlash.  What’s more puzzling is the absence of any pressure on institutions from their Boards of Governors on this score.  Our whole system of university governance is based on spheres of competence: academics run academic affairs through Senate, while Boards – supposedly filled with men and women with a modicum of business nous – are supposed to take care of the money.  And yet, more often than not, “taking care of the money” means doing fundraising or small-ball stuff like advising on endowment strategies.  It doesn’t seem to involve asking hard questions about medium-to-long-term solvency or stress-testing institutions to see how they’d fare if things go south.

Yet it should.  The risks institutions face are getting bigger each year.  A crash may not happen; but if it does, we’d all be better off if our responses were based on thoughtful long-term plans rather than the usual beheaded chicken routine that universities seem to prefer.  Boards of Governors are the ones best-placed to make it happen.  They need to step up and do so.
