HESA

Higher Education Strategy Associates

January 30

Senior Management Salaries and Titles

A couple of weeks ago we had a fun look at academic salaries. And I know some of you were thinking: “Why pick on profs? What about skyrocketing administrative salaries?” Fair enough – let’s look at what happened to administrative pay in the last decade.

To stay consistent with the earlier data on professors’ salaries, I use 2001 and 2009 as reference years. This being a free email, I stick to easily accessible, cost-free data – namely, salary disclosure information from the Government of Ontario. Since Ontario is one-third of the country, and previous work has shown its academic salary increases to be very close to the national average, I believe it to be reasonably representative of the country as a whole.

Across the 18 universities for which there are data for both years, average presidential salaries rose by 56%, from $214,563 to $335,617. That’s higher than the 43% average pay raise for full professors over the same period, but it’s a matter of degrees rather than orders of magnitude. It’s also marginally less than the 63% average increase in Ontario college presidents’ salaries. There were, however, big variations within this group – presidential salaries more than doubled at Ottawa, Laurentian and McMaster, but rose only 5% at the University of Toronto.

[Chart: Comparative Salary Rises in Ontario, 2001-2009]

Another common complaint about administrative bloat is the increase in the number of vice-presidential positions. By my count, in Ontario in 2001 there were 93 vice-presidential positions, including AVPs of various stripes. By 2009, that number had increased to – brace yourself – 193.

That’s not a misprint: a 108% increase. It looks a little better when one remembers that Ontario academic staff numbers also grew by 28% over the same period – but not much.

That said, these jobs aren’t springing out of thin air. Usually, they existed previously in some form or other, but under lesser names. Basically, having the letters “V” and “P” in a title is just something universities do when a management position starts costing over $135,000/year. Thus, a newly created AVP, government relations, with pay of $150,000 isn’t $150,000 in new salary; it’s a title upgrade from “director” and a $50,000 bump in pay.

So, is there runaway growth in presidential salaries? They’ve certainly grown faster than professors’ salaries, but it’s a matter of degrees: roughly 6% per year compared to 4.6%. Is there runaway growth in administration? Harder to tell. There’s definitely been runaway inflation in titles – if there were a Central Bank for job titles in universities, the governor would be looking for work in Zimbabwe – but the net effect on total salary mass is unclear.
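For anyone who wants to check that annualization, here’s a quick back-of-envelope sketch in Python, using only the figures cited above:

```python
# Back-of-envelope check on the annualized growth rates cited above.
# Dollar figures are the Ontario presidential averages quoted earlier.

def annualized_growth(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

years = 2009 - 2001  # eight years between the two reference points

presidents = annualized_growth(214_563, 335_617, years)  # 56% total rise
professors = annualized_growth(1.00, 1.43, years)        # 43% total rise

print(f"Presidents:      {presidents:.1%} per year")  # ~5.7%, i.e., roughly 6%
print(f"Full professors: {professors:.1%} per year")  # ~4.6%
```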

A One Thought for another day, perhaps.

January 27

What We’re Reading Now: Higher Education?

With a title like Higher Education? How Colleges are Wasting Our Money and Failing Our Kids – and What We Can Do About It, you might assume this is another screed by a thirty-something with an axe to grind. But the authors – Andrew Hacker of Queens College and New York Times writer Claudia Dreifus – are anything but the usual suspects for ritual denunciations of higher learning.

What Hacker and Dreifus have managed to do is fuse a number of very different critiques of higher ed into a single, relatively slim volume. There are what some might call the “left-wing” (in the U.S., anyway) complaints about the academy: education is “too vocational” (basically a John Dewey critique, which holds that the Liberal Arts are “real” education and that everything else would, in an ideal world, be banished to “mere” trade schools), too much is spent on intercollegiate athletics, etc. Then there is the “right-wing” critique: tenure, over-generous professorial compensation, and the sacrifice of teaching at the altar of research. Finally, there are the critiques that don’t have a political home: too many non-academic staff, skyrocketing presidential salaries and the ludicrousness of the university committee system. Hacker and Dreifus combine the three fairly seamlessly into a single, pointed indictment.

It’s a nice book in that the authors manage to make solid points without the over-the-top accusatory style that American publishers seem to like so much these days. The section on administrative sprawl is particularly deft, respecting the good intentions behind the creation of positions like “residential communications co-ordinator,” “co-ordinator for lifetime fitness”, etc., while calmly pointing out that these are, essentially, frills that substantially raise the cost of education.

The book’s back-flap, which contains rather effusive praise from darlings of the American left such as Barbara Ehrenreich and Joseph Stiglitz, is also a bit of an eyebrow-raiser. In Canada, making similar points about research vs. teaching or professorial pay tends to provoke (from faculty unions, mostly) accusations of being some kind of foaming reactionary. Apparently, in the U.S., the debate has at least progressed to the point where there is cross-party agreement that there are reasonable conversations to be had about institutional priorities and spending habits.

Large chunks of this book aren’t particularly relevant to Canadian higher education, and the Dewey fundamentalism is a bit irritating. But the basics of the argument about pay and tenure will definitely be reverberating here over the next few years, and the book is probably worth reading for that alone. You may not agree with the authors, but we can only hope that future debates on higher ed policy are conducted with as much tact and good grace as Hacker and Dreifus have mustered here.

January 26

Consensual Hallucinations

Imagine you’re running a Canadian university or college in 2008 or 2009. All signs point to a nasty recession, but your provincial government is still in spending mode and keeps giving you more money. It can’t last; the provincial deficit is completely unsustainable and cuts are inevitable within a couple of years. How should you use that extra money the finance minister just slipped you?

Anyone with a modicum of financial sense would tell you to save it. Put it in a rainy-day fund. Use it to offset the cuts that are so clearly just around the corner.

So why did almost nobody do this? Why did institutions collectively continue to spend in a way that was so foreseeably unsustainable? And why are so many institutions now in desperate straits when it’s been so obvious for so long that restraint was coming?

There are a number of plausible short-term excuses, I suppose. The crisis in many schools’ pension funds took up a lot of budget-cutting energy. Generous multi-year salary agreements take a while to work their way through the system.

(Though this raises a question of its own: what exactly were institutions thinking when they signed deals tying them to effective salary increases of 4-5%/year in 2009 and 2010, after the crisis was perfectly apparent? Can’t blame a union for asking, of course, but what’s management for if not to say “no” occasionally?)

The real answer, though, boils down to two things. First is force of habit: the good times have been rolling for so long that most institutions have forgotten how to “do” austerity. But second – and more importantly – politics forms a serious barrier to sensible behaviour. Say you set up a rainy-day fund: how would you explain it to politicians? They’d want to know why you weren’t spending all that money the legislature had voted for the purpose of educating the province’s young people. The perfectly sensible answer – that we’re making preparations for when the legislature returns from outer space and admits spending trends are totally unsustainable – probably wouldn’t go over too well.

Governments, at the end of the day, account for 90% of institutional revenues through their control of both grants and tuition policy. So even though Canadian institutions have very high levels of financial autonomy, penalties for acting rationally remain. In order to keep their benefactor happy, institutions are required to enter into a consensual hallucination in which funds that are clearly going to disappear in a couple of years must be treated as permanent. They must use these funds to hire new staff and set up new centres and programs… and then act shocked and disappointed when the money disappears.

Quite a way to run a railroad.

January 25

Distinct Missions

Why are Canadian universities so scared of acting differently from one another?  Why does no one want a niche? I’m not just talking about their cookie-cutter mission statements here, which seem to involve adding the words “research” and “excellence” to the output of a random word generator. I’m talking about the cookie-cutter ways they go about their daily business. In marketing-speak: they have little or no brand personality.

It’s not as though cool niche missions are that hard to dream up. Here are two:

The “Best Jobs” University. Giving employment guarantees and talking up graduate employment rates are so ’90s; with a shrinking labour force, the issue isn’t going to be whether graduates get jobs, it’s what kind of jobs they’ll get. So why shouldn’t some university take it on as a mission to ensure that its graduates get the best jobs?

Think about it: if that were your mission, your professors would be bound to spend a lot more time talking to employers not just about “what they want,” but really working with them on a day-to-day basis to understand what skills students need to be more effective in the workplace. It would mean spending lots of money on placement officers and on career services. And it would mean a serious commitment to tracking and measuring how students do, and taking their feedback about what helped them and what didn’t to heart. It wouldn’t be easy, but at the end of the day, you’d be able to tell a real, quantifiable story about how your graduates succeed.

The Character University. One’s university experience is to a very large degree conditioned by one’s classmates. That’s why selective U.S. universities subject prospective students to a rigorous interview process: to make sure they are getting not just good students, but the “right” ones. Canadian universities instead choose to make their admissions decisions based entirely on grades, in part because they feel it’s an easy place to cut costs.

Two words: false economy.

So why not reverse the process and put character at the heart of admissions? It’s quite possible: the Loran Awards do it year after year, and manage to pick a future Rhodes Scholar annually in the process. Sure, it would mean spending more on admissions, but involving alumni as American schools do would keep costs down. And the benefits are enormous: your students would be a lot more interesting and rewarding to teach (filtering out the whiners and grade-grubbers would be central to the process), and moral fibre would be your selling point.

These ideas aren’t cost- or difficulty-free, of course. But they’d pay for themselves pretty quickly by attracting better students, producing happier alumni and raising public profile.

Any takers?

January 24

Cult Militias in the Quad

For those student affairs professionals among you who think you have it bad, consider the state of universities in Nigeria.

Prior to independence, future Nobel prize-winner Wole Soyinka and some friends started an anti-colonial political confraternity known as the “Pyrates” at University College, Ibadan. During the late 1970s and early 1980s, confraternities began to spread rapidly, adopting names like the “Black Axes,” the “Supreme Vikings” and (I’m not making this up) the “Klansmen Konfraternity.” Female counterparts also emerged, like the “Barracudas” and the “Black Bras.” The groups also began to adopt much more cult-like practices, such as initiation rituals involving blood-drinking, torture and even sexual assault.

During the post-1983 period of military rule, confraternities were seen as useful counterweights to pro-democracy student unions and were given official access to weaponry to keep campuses tame. Though democracy has returned, the cults still have a major and menacing physical presence on campuses. They are responsible for dozens of deaths – both on campus and off – and for kidnappings (including of university officials), as well as significant violence, and threats of violence, towards academic staff (this despite the fact that a few lecturers are themselves cult members) – not to mention involvement with ethnic militias.

Those of you who think I’m making this up – most of you, I’m sure – need to check out links here, here, here, here and especially here. It’s all true. Imagine Animal House with John Belushi replaced by Marlo Stanfield and co. from The Wire – that’s the state of many Nigerian campuses.

Why am I telling you this? Well, when reading up on it, two things occurred to me. The first was what a great April Fool’s blog it would make (“Vince Tinto uncovers new African campus movement promoting belonging, closer teacher-student interaction”); the second was that, at the end of the day, it’s students who make a campus culture.

You can have the greatest researchers in the world, but people know a university by its graduates and its students. Universities can reject in loco parentis all they want, but there’s no getting away from the fact that whatever students get up to among themselves reflects on the institution – as anyone who’s been at York or Concordia over the last decade knows all too well.

So why do Canadian universities leave so much to chance in student affairs? Why do we elevate grades over character in admissions? Why does university-led student programming essentially end after frosh week? Even absent a threat of campus death cults, why don’t reputation-concerned universities make a bigger effort on student life issues?

It seems to me that a university looking for a genuine niche could find one here.

January 23

A Very Odd Employment “Study”

Many of you will have seen a Toronto Star story last week about a new report from the Ontario government indicating that in the wake of the recession, the provincial economy was creating more new jobs for college grads and apprentices than for university graduates.  Cue blather about how technology and the recession are radically changing the labour market, etc.

Did anyone else find it odd that the Star never named the study? I still haven’t been able to locate an actual document, just a TCU webpage with the following pie chart presented without back-up documentation:

[Pie chart from the TCU webpage, not reproduced here]

It looks like one of those job projections that HRSDC puts out through its Canadian Occupational Projection System (COPS). Except that the latest COPS projections show something very different: the most recent report, which covers the period 2008-2017, suggests that new jobs normally requiring a university education will grow 66% faster than those normally requiring college.

Another oddity was the use of the word “projection” for a period (2008-13) that is already two-thirds over. Forget projections – why not just check how employment has actually evolved for those two groups? Well, I did, and it turns out that total employment among college graduates has actually dropped 7.2% since January 2008, while rising by about 8.5% for university graduates.

[Chart: Employment, Ontario, by Highest Education Level, in Thousands]

What’s going on here? I think TCU simply mislabeled its graphs. Its figures represent “total job openings,” not “employment growth.” “Total job openings” are composed of “replacement jobs” and “new jobs.” The replacement-jobs data massively favour college grads, because that’s the skill level already required for most jobs in Canada. The new-jobs data look better for university-educated students. But since 75% of all job openings are replacement jobs, the “total job openings” figures end up looking best for college grads. (Page 57 of the COPS report shows exactly this; the proportions line up very closely with the Ontario graph.)

In other words, college graduates will in future fill a greater number of positions than university graduates because they already have more jobs in absolute (but not relative) terms, not because the economy is tilting in their direction.
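If that decomposition seems abstract, here’s a toy illustration in Python. Every stock and growth number below is invented purely to show the arithmetic; only the rough three-to-one split between replacement and new jobs echoes the COPS report:

```python
# Toy illustration of the composition effect described above. All stock and
# growth figures are made up; only the roughly 3:1 replacement-to-new-jobs
# split reflects the COPS report cited in the post.

existing_jobs = {"college": 3_000_000, "university": 2_000_000}
new_job_growth = {"college": 0.06, "university": 0.10}  # university grows faster
replacement_rate = 0.18  # retirements etc., applied to the existing stock

for group, stock in existing_jobs.items():
    replacement = stock * replacement_rate   # openings from turnover
    new = stock * new_job_growth[group]      # genuinely new positions
    total = replacement + new
    print(f"{group}: replacement={replacement:,.0f}, new={new:,.0f}, "
          f"total={total:,.0f}")

# University wins on new jobs (10% vs. 6% growth, and 200,000 vs. 180,000
# positions), but college's larger existing stock means its *total* openings
# are bigger -- no "sea change" in the labour market required.
```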

There’s probably some other stuff going on here – the population with college credentials is a bit older and retiring faster; using “highest level of education” obscures what’s going on with university graduates who later attend college; etc. And there’s the fact that the Star writer completely left out the “management” category of jobs – which is overwhelmingly populated by university graduates – when writing her story.

But forget about there being a sea change in the labour market.  This just looks like a case of bad graph labeling and weak reporting.

January 20

Graduate Surveys We’d Like to See

If there’s one type of Canadian educational survey where complete and utter stasis has set in, it’s graduate surveys. Questions like “are you employed,” “what are your earnings,” and “were you satisfied with your education” aren’t just boring; I think they’re actively making us stupider. There seems to be a general view that because the answers to these questions don’t change very much from year to year, we’re doing as good a job as we ever have.

But labour market results aren’t achieved in a vacuum. Economic conditions (both global and local) play a role, as do demographics. Canada’s labour force, which has been increasing in size since WWII, is predicted to plateau in the next couple of years and then decline slightly thereafter. As employers get desperate for workers, they’ll take anyone (think about Alberta fast-food workers making $17/hour in the boom years); in those conditions, low levels of graduate unemployment can’t be taken as evidence of educational excellence.

In future, universities and colleges are going to be judged on how well they make students more productive, not on whether those students are employed. That means institutions will need to dig a lot deeper into how students acquire competencies and then put them to use. Surveys can help in working out which elements of a student’s education proved useful and which didn’t. Graduates – even those from disciplines which aren’t vocationally oriented (i.e., the humanities) – have a pretty good sense of which of their courses were useful and which were decorative. Identifying the courses (and professors!) that graduates in the labour force rate highly can be an enormously powerful tool in curriculum revision.

So, here’s a suggestion for graduate surveys: let’s ease up on the strictly quantitative stuff. The next time you do a survey of graduates, don’t ask them if they were satisfied with their education – ask them which class contributed most to their success in the job market. Don’t ask whether they’d recommend a university to a friend – ask them what missing skills they most wished they’d got before leaving school. Trust me, the answers will be revealing.

Finally: stop wasting information, and link individual graduates’ surveys to their student records. It’s not as time-consuming and expensive as you think, and it vastly increases the explanatory power of the available data.
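Mechanically, that linkage is a single join on a student identifier, provided the survey collects one. Here’s a minimal sketch with pandas – the file and column names are hypothetical, since every registrar’s schema differs:

```python
import pandas as pd

# Minimal sketch of linking graduate-survey responses to student records.
# File names and columns are hypothetical stand-ins.

records = pd.read_csv("student_records.csv")  # student_id, program, gpa, ...
survey = pd.read_csv("grad_survey.csv")       # student_id, best_course, missing_skill, ...

linked = survey.merge(records, on="student_id", how="inner")

# Now every survey answer can be cut by anything in the record -- e.g.,
# which courses do graduates of each program credit most for their
# labour-market success?
top_courses = (linked.groupby(["program", "best_course"])
                     .size()
                     .sort_values(ascending=False))
print(top_courses.head(10))
```

Once joined, the survey stops being a stand-alone satisfaction poll and starts being an analytical dataset – which is exactly where the extra explanatory power comes from.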

As I mentioned Monday, Canadian institutions underwent a data revolution in the late ’90s and early ’00s, but unfortunately a benchmarking agenda took over and the discovery agenda was put to the side. As we’ve shown over the past three days, though, it doesn’t have to be that way. Better surveys are possible; we just need to design and execute them.

Let’s do it!

January 19

Faculty Workload Surveys We’d Like to See

While we’re on a roll about surveys, let me muse about one that I think many people in this country would like to see on academic staff and their workloads.

There is a lot of talk about teaching loads, particularly in comparison to professors in other countries where pay is lower (notably the United States). The problem is that we’re dealing in anecdotes; institutions unfortunately don’t publish hard data on teaching loads, either of the notional or the real variety (and there is a considerable difference between the two). Time was, you could sort of work this out by hand with a course calendar, but those aren’t published anymore, so we need different ways of getting at this information.

One solution would be for Statistics Canada to add one or two fields to its annual survey of full-time staff. Right now, all it asks is field of study, age, rank and salary. It wouldn’t be an enormous stretch to add the number of courses or total credit hours taught.

But why wait for Statscan? Here’s a simple little survey that VP Academics, or Provosts (or whatever they’re called in your neck of the woods) could fill out in a few minutes that would tell us everything we need to know:

[Table: Proportion of Tenured and Tenure-Track Professors Teaching Various Course Loads (rows to add to 100%)]

No big database searches or anything – just “what’s your best guess about proportions of faculty in each category?” It doesn’t need to be 100% accurate – just give a rough idea.

The results would tell us a lot, wouldn’t they? Especially if you could do it every couple of years and create a time series. Yet, if I were to actually send out that survey, and ask people to fill it in, it’s a dead certainty that almost none would do so, even if given a guarantee of institutional anonymity. Most would find reasonable-sounding rationales for refusing, but really they’d just be afraid of the backlash if the data were ever published.

That, I would suggest, is indicative of a much larger problem: universities are not confident of their ability to explain their labour practices to the general public. But that’s no reason someone shouldn’t try out what would definitely be a very cool survey.

January 18

Student Surveys We’d Like to See

Surveys of current students tend to focus on just a few areas. Apart from questions about demographics and time use, they ask a lot of specific questions about satisfaction with student services along with a few general questions about overall satisfaction.

This is odd, because at the end of the day students don’t actually think student services are central to the overall quality of their PSE experience. What they care about first and foremost is the quality of the teaching they experience. Yet institutional surveys seem determined to avoid asking all but the most banal questions about teaching.

Sure, we have course evaluations. But these are used exclusively (if sparingly – but that’s another story) by departments. The data are never used to learn which kinds of teaching methods work better than others, and never linked to demographic data to see whether there are patterns tying satisfaction or reported learning to student background, the amount of paid work a student does, and so on. They are, in short, a massive lost opportunity.

What about the National Survey of Student Engagement (NSSE)? Well, despite allegedly being outcome-related, NSSE insists on treating a student’s entire class schedule as a single observation. It does ask how often students “work in teams” or “make presentations in class,” but most students have a mix of classes, some of which have these elements and some of which don’t. If you’re trying to understand how different teaching styles affect students, this stuff needs to be broken out class by class. NSSE, for all its cost, is essentially useless for this purpose.
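For the record, “broken out class by class” just means collecting the data in long format: one row per student per class, not one row per student. A minimal sketch of what that might look like (the columns and values are hypothetical):

```python
import pandas as pd

# Sketch of the class-by-class (long-format) structure argued for above:
# one row per student per class. Columns and values are illustrative only.

rows = pd.DataFrame({
    "student_id": [101, 101, 101, 102, 102],
    "course":     ["ECON101", "HIST240", "BIOL110", "ECON101", "PHIL200"],
    "class_size": [400, 35, 250, 400, 20],
    "group_work": [False, True, False, False, True],
    "rating":     [2, 5, 3, 3, 4],  # student's rating of the class, 1-5
})

# With class-level observations you can actually ask how a teaching feature
# relates to satisfaction -- impossible when the whole schedule is one blob.
print(rows.groupby("group_work")["rating"].mean())
```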

Getting this data isn’t rocket science. Here at HESA, we regularly do surveys which ask questions about details of each of a student’s classes. That’s how we look at class size, it’s how we found out about the characteristics of students’ favourite and least-favourite classes and it’s how we learned about the effectiveness of e-learning resources in Canadian universities. If we can do it, so can universities and colleges.

From a public policy perspective, the reluctance to look harder at what makes for good teaching looks deeply suspicious. Teaching is what actually matters. It’s why institutions receive tax dollars. Finding out what kinds of pedagogical strategies are most effective should be a moral imperative for anyone who benefits from that amount of public support. But even if that weren’t true, having better knowledge about what works and what doesn’t in teaching is potentially an enormous source of strategic advantage for institutions.

But you can’t act on that advantage until you start collecting the data. So, enough with the student services questions; let’s start asking students real questions about teaching and learning.

January 17

Applicant Surveys We’d Like to See

I’ve always been a bit intrigued by the continuing popularity of Applicant Surveys. What is it that people expect to see in this year’s results that wasn’t there last year?

There are basically three sets of research questions at the heart of current applicant surveys: who is applying (i.e., the social/ethnic composition of the applicant pool), what tools students are using to acquire information about institutions, and what students say they are looking for in an institution.

The “who applies” question is an important one, but it’s not one that needs to get asked more than about once every three or four years. The data simply doesn’t change that fast.

At the margin, the “information tools” question does vary a bit with changes in technology. But in essence, the answer to this question is always the same: parents and friends at the top, schools and guidance counselors in the middle, followed by institutional websites, Maclean’s and the Globe and Mail, in more or less that order.

(Which is a disaster, of course; some of our recent research on the related topic of student financial aid found that students who relied on information from parents and friends were actually less knowledgeable about the topics at hand than people who had not sought any information at all. Yikes.)

The important question isn’t “where did students get their information?” but “what is it that students think they know about institutions?” Institutions need to have a much better sense of their own brand’s state of play if they are going to do anything useful about it. Sure, you can ask applicants whether they think particular universities have “prestige researchers” or offer “a safe environment,” but their answers are mostly ex-ante rationalizations, so why bother?

Bluntly, we know virtually nothing about the process of choice-formation. How do students develop interests in particular fields of study? How do they go about forming a choice set? How much do they understand about the differences between particular institutions? When do they become aware of institutional stereotypes (e.g., Western = “party school,” Trent = tree-huggers, etc.), and to what extent do these affect choice?

Admittedly, none of this is easy to get at through a survey. Some of these questions are inevitably qualitative (though analytical software is making post-survey coding ever easier), and even the material that lends itself to quantitative analysis would require a lot of focus-group work to make sense of.
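On the coding point: even something as crude as keyword bucketing can turn open-ended answers into countable data. A toy sketch – the coding scheme and the sample answers are entirely invented:

```python
# Toy sketch of automated post-survey coding: bucketing open-ended answers
# about institutional image by keyword. Real coding schemes are far richer,
# but even a crude pass like this makes qualitative answers countable.

from collections import Counter

codes = {
    "reputation": ["prestige", "ranking", "famous"],
    "social":     ["party", "fun", "friends"],
    "teaching":   ["professor", "class size", "teaching"],
}

answers = [
    "I heard the professors are great and class size is small",
    "mostly its ranking, and my friends are going there",
]

counts = Counter()
for text in answers:
    lowered = text.lower()
    for code, keywords in codes.items():
        if any(kw in lowered for kw in keywords):
            counts[code] += 1  # count each code at most once per answer

print(counts)  # e.g., Counter({'teaching': 1, 'reputation': 1, 'social': 1})
```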

But good things require work. When it comes to recruiting better students, getting a jump on competitors by gathering data that gets inside the decision-making process is a lot more productive than checking whether the proportion of applicants saying they want a university with “an emphasis on teaching” has moved another percentage point.
