Higher Education Strategy Associates

Author Archives: Alex Usher

September 10

Improving the Discourse on Skills and Education

Recently, I did a fascinating set of roundtable discussions with employers and employer associations, and it brought home to me how one-dimensional much of our talk about skills really is.

Broadly speaking, there are four sets of skills employers care about.  The first are job- or occupation-related skills: can a mechanic actually fix a car? Can an architect design buildings? And so on.  By and large, if you ask employers whether universities and colleges are successfully providing their graduates with this skill set, they say yes (in some fields, in some parts of the country, there are complaints that there aren’t enough graduates, but that’s a different story).  And that’s true more or less across blue-collar and white-collar occupations.

Then there’s a set of skills that, in Canada, go by the names “essential skills” or “foundational skills”.  Most of this is basic literacy/numeracy stuff, but with communication, basic teamwork, and (increasingly) IT skills in there as well.  Here, Canada has a problem, and employers are not shy when it comes to talking about this.  Secondary school dropouts and recent immigrants who have yet to fully master one of our official languages tend to have the most problems with these skills, and the issue is concentrated in certain industries and occupations.  This tends to affect blue-collar jobs more than white-collar ones, but it’s also an issue in lower-level health and social service occupations (especially IT skills).

The third set of skills often gets called “soft skills” or “integrative skills”.  This involves workplace savvy, primarily in white-collar industries: knowing how to act with clients, basic business and financial skills, how to operate in a multi-disciplinary/multi-functional team, and those somewhat nebulous qualities of critical thinking and problem-solving.  Basically, this is the stuff that Arts faculties claim to give you: the integrative thinking skills that keep businesses running.  They’re not the skills that get you hired, but they’re the skills that get you promoted.  Again, this is an area where employers who need these skills voice frustration with new graduates.

Finally, there are what get termed “leadership skills”.  It’s not always 100% clear what employers mean by this, but it is usually thought of as something different from (and of a higher order than) the integrative thinking skills.  Again, this isn’t desired across the board: companies are hierarchies, and not everyone is at the top, so it’s actually a set of traits necessary in only a few.  But again, companies see these as lacking in young people, though, to be honest, young grads don’t have the experience to be put in leadership roles, so it’s actually something they’ll need to develop a few years into their careers.

Now, when someone in business starts talking about a skills crisis, we mostly assume they mean that their new hires are lacking some set of skills, and lots of people (some of them inside the system itself) therefore use this as a stick with which to beat educational institutions for not doing their jobs.  But usually, when business says it needs skills, what it actually means is that it needs experienced workers with lots of on-the-job skills.  As such, what educational institutions can contribute in the short-term is pretty marginal.

But even to the extent that institutions can contribute – say, over the medium-to-long term – a simple desire for “more skills” doesn’t help very much.  Skill profiles vary enormously from occupation to occupation, and so too do perceptions of which skills are missing in each one.  Even within a single company, needs may vary substantially from one job to the next.  Getting business to be more specific about its needs is a huge and urgent task.

Exacerbating this problem is our insistence that programs at different levels of education have to be of common length (mostly 4-year Bachelor’s degrees outside Quebec, mostly 2-year college programs outside Ontario).  For some occupations, this might be too much time in class; for others it might not be enough.  If you’re running a 2-year program and someone tells you that grads need “more skills”, then the biggest question is: what should be dropped from the existing curriculum?  Forget competency-based education; we’d be a lot better off if we could just get competency-adjusted curriculum lengths.  But here, governments tend to (unhelpfully) prefer standardized solutions.

Anyways, this is stuff that institutions – particularly community colleges – deal with all the time.  It’s a thankless but necessary job; getting it right is literally the foundation of the nation’s prosperity.

September 09

The Growth of Administration (Part 2)

In yesterday’s blog, I ended on the observation that over the period 2000-2012 at the 12 major universities where we have data (UBC, SFU, Alberta, Calgary, USask, Manitoba, Carleton, York, Toronto, Waterloo, Western, and Memorial) the rate of growth of support staff and administration was 16% faster than the rate of growth of academic staff.  To wit:

Figure 1: Growth in Support/Admin Positions vs Faculty Positions, 12 Large Institutions, 2000-2012

But that’s a 12-institution average.  In fact, very few individual institutions exhibit anything like this pattern.

Figure 2 shows increases in faculty and staff complements at each of the 12 institutions.  Some caution is required with the numbers: notably, while the definitions of “faculty” and “staff” are consistent over time at every institution, they differ across institutions in a number of ways.  So what you want to focus on here, above all, is the inter-institutional differences in the gap between staff and faculty hires.

Figure 2: Patterns in Institutional Staff Growth, 12 Institutions, 2000-2012

(To be clear: “staff” here includes any position that is not an academic post; the term does not pertain only to professional staff.  I’ll get to why this distinction is important later in the post.)

To start at the left side of the graph: Saskatchewan, Toronto, and Alberta are three institutions where administrative/support hiring massively outstripped faculty hiring.  At Saskatchewan, staff numbers went up 68% over the period 2000-2012, compared to faculty growth of just 19%.  At Toronto, the comparable figures were 49% and 2%; at Alberta, it was 53% and 18%.  These are places where claims of administrative bloat seem pretty clear cut.

But move along to the right, and one realizes the problem (if indeed it is one) isn’t universal.  At places like UBC, York, and Carleton, growth in admin/support is only slightly higher than growth in faculty numbers (note: our time period misses some of the recent growth from 2013 & 2014, to which Gary Mason’s article referred).  At Waterloo and Calgary, faculty numbers increased more quickly than admin/support numbers from 2000-2012.

(If you’re wondering why SFU and MUN are off to one side, it’s because, over the course of the past decade, these two seem to have had some kind of change in how support staff were counted.  In MUN’s case, it seems to have resulted in a one-time loss of about 300 staff; at SFU, it led to a gain of 300 staff.  As a result of these shifts, SFU’s growth in support staff looks titanic, while Memorial appears to have shed staff.  Neither scenario is likely.  The grey bars for those two institutions are my very crude “best-guess” attempts to adjust for these definitional changes.)

The lesson here is that there really isn’t a single pattern prevalent across all institutions.  At some places, admin/support numbers are clearly growing wildly; at others, they are pretty stable.  It’s therefore probably better if people stopped making generalizations on this topic.

In the next post on this subject, which I’ll do sometime next week, I’ll try to answer the question: what are all these new staff doing, anyway?


September 08

The Growth of Administration (Part 1)

So, last week we talked about growth in non-academic staff; however, due to data limitations, we could only talk about dollars rather than numbers.  This is because no one actually collects non-academic staff numbers in Canada, and so most of the data (and anecdotes) around “administrative bloat” comes from the US.  Last winter, I became sufficiently frustrated with fact-free arguments about “bloated administration” that I devoted part of my holiday to gathering data on this phenomenon.  I never quite finished the project back then, but last week gave me the nudge to try to put all of this data on the table.

You might wonder how I did this, given that no one keeps track of data, nationally.  Well, individual institutions *do* keep track of these numbers.  And some of them even put data up on the web, in things called “factbooks” or “databooks”.  And while the data definitions are sufficiently diverse that you can’t really make a national database from this information, it is possible to track changes over time at individual campuses, and then aggregate those changes across institutions.  So that’s what I did.

For my sample, I took 25 universities, which collectively comprise about 75% of the national system: the U-15, plus SFU, Victoria, Carleton, York, Ryerson, Guelph, Concordia, UQAM, UNB, and Memorial.  Of these, only about half had usable public institutional data on staff. UQAM, Concordia, Laval, and Montreal have very little institutional data online; apparently, Quebec schools don’t seem to think making data public is particularly important.  McGill’s website indicates that an institutional factbook containing such data exists; unfortunately, it is password-protected, because obviously the public can’t be trusted with such things (UNB’s data is also password-protected).  Dalhousie publishes a little bit of data (mostly about students), and very little else.  Queen’s has a “fast facts” page that touches on faculty numbers, but only back to about 2009.  Finally, Victoria, Guelph, and Ryerson all publish loads of institutional data online, but nothing on non-academic staff.

That leaves us with 14 institutions.  York, Carleton, and Calgary are all officially awesome, and have staff data on their websites going all the way back to 1990.  UBC, SFU, Alberta, Saskatchewan, Manitoba, Western, Waterloo, Toronto, and Memorial all have data back to 2000 (though in some cases, a trip to the Wayback Machine is required to get at it all). McMaster and Ottawa at least have data back to 2005.

So, what patterns do we see when we look at data from these institutions?  Well, if we just look at the national picture from 2005 to 2012 at the 14 institutions (remember: I did this last Christmas – there would probably be another year’s worth of data available if I did this again), we see that support and admin personnel numbers grew by a little over 17%, compared to a rate of faculty growth of about 11%.

Figure 1: Growth in Support/Admin positions vs Faculty positions, 14 large institutions, 2005-2012

However, if we pull Ottawa and McMaster out of the picture (because their data doesn’t go further back than this) and take the long view, back to 2000, we get a more striking picture.  At the 12 institutions where the best data is available, the rate of growth of admin and support staff outstripped academic staff growth by about 16% over twelve years.

Figure 2: Growth in Support/Admin positions vs Faculty positions, 12 large institutions, 2000-2012

We unfortunately cannot tell whether the pattern at our 12 universities is representative of trends across all institutions.  But these 12 collectively account for about 36% of the system by enrolments, which is a pretty big sample, so it’s unlikely that full national trends differ too much from this.

But there are more interesting stories to tell once you drill down into the data at a little more depth.  Tune in tomorrow.

Intrigued by this data so far?  Want to add your institution’s data to this list?  Send us a note at info@higheredstrategy.com.

September 04

Costing an Inuit University

There is an interesting initiative afoot to create something called the Inuit Nunangat University.  A workshop report on the concept is here.  Today, I thought I would contribute to the debate by looking at what such an initiative might cost.

Some background: the idea of an Arctic university is not new.  Many people have noted that Canada is the only member of the Arctic Council that does not have a university north of the Arctic Circle.  This largely has to do with a lack of major population centres, but no matter.  The Gordon Foundation wrote about this problem a few years ago.  At the time, my take on it was that the Arctic could probably support a small university on the model of the University of Greenland – roughly a dozen faculty working mainly in language and culture, with a bit of professional programming (i.e. BEds) thrown in.

Now, this new proposed university is somewhat hazy regarding scope (not surprising given that, at the moment, it’s just a workshop report).  It’s clear given that the proposal is for an Inuit university, rather than a University of Nunavut, that culture and language are going to be at the centre of the institutional mission: this proposal is less a University of the Arctic than it is an Inuit version of First Nations University.  Clearly, the authors have some big hopes for the future – programs in Science, Medicine, and Engineering are proposed – but equally clearly, any northern university is going to be fairly small for a long time.  The Inuit population of Canada is about 72,000; the population of Nunavut is about 35,000.  The territory only churns out about 240 high school graduates each year, and the local college (Nunavut Arctic College) already enrols about 1,300 students per year.  And some university-bound students will choose a southern university regardless of local options.  Put all that together and you’re very unlikely to see enrolments at such a university reach 1,000 for a long time, and 500 is probably a more realistic upper bound.

In Canada, there are a number of similarly-sized stand-alone universities.  For instance, there is Université Ste. Anne (370 FT students), Canadian Mennonite University (480 FT students), and The King’s University, Alberta (670 FT students).  And while these universities are usually pretty tight for money, they are all viable.  But they don’t have research programs to speak of, and they definitely don’t have Engineering or Medical schools attached to them.  These sorts of professional schools simply aren’t feasible without much larger student numbers.

For argument’s sake, let’s say a future Inuit Nunangat University ends up at about 600 students.  That’s close to the size of King’s University in Alberta, which somehow (honestly not sure how they do it) manages to staff faculties of Arts, Social Science, Science, and Business with about 45 full-time professors, on an annual operating budget (in 2013-14) that was just shy of $14 million.  That’s about $21,500 per student – but it doesn’t include any programs that might be considered “high-cost”.  It also assumes you can do all your programming in a single spot, rather than via distance education and community delivery; but that’s anathema in a territory that spans 2 million square kilometres and three time zones.  And there’s also the fact that staff costs are higher in the north.

To get a sense of what kind of adjustment factor you’d need to make to translate the $21,500 into a Nunavut context, consider the case of Nunavut Arctic College.  It provides programming in something like 25 locations across the territory, and does so at a cost of about $41,000 per student (excluding free services provided to the college by the Government of Nunavut, which would add another $7,700 or so).  That’s roughly two and a half times the per-student cost of college in the rest of the country.  So it seems fair to assume that a King’s-like institution would cost about $21,500 x 2.5 = $53,750 per student.  And that’s just for low-cost programs: no medicine, or engineering, or anything like that.  Total annual cost?  About $32 million.  And that’s before you get to any capital expenditures, or any of the other things on the workshop wish-list, like low tuition, grants, student housing, etc.
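For those who want to check the arithmetic, the estimate above can be laid out as a quick sketch (the benchmark figure and the 2.5x multiplier are the ones quoted in the text; the enrolment of 600 is, as above, purely an assumption):

```python
# Back-of-envelope operating-cost sketch for a hypothetical ~600-student
# northern university.  Figures are the ones quoted in the text; this is
# a rough illustration, not an official costing.

per_student_south = 21_500    # King's University benchmark ($/student/year)
northern_multiplier = 2.5     # Nunavut Arctic College vs. southern colleges
students = 600                # assumed steady-state enrolment

per_student_north = per_student_south * northern_multiplier  # 53,750
annual_operating = per_student_north * students              # 32,250,000

print(f"Per-student cost: ${per_student_north:,.0f}")
print(f"Annual operating: ${annual_operating / 1e6:.2f} million")
```

Which lands at roughly the $32-million figure cited above, before capital costs or any of the wish-list items.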

Now $32 million is a mind-bogglingly huge amount in the context of Nunavut’s own-source tax revenues, which are only about $180 million per year.  But since close to 90% of the Nunavut budget comes from Ottawa, it is actually only equal to a little under 2% of the entire territorial budget.  That’s still not a small ask, but it is in the realm of the financially possible, provided ambitions around program offerings remain modest.

September 03

One Lens for Viewing “Administrative Bloat”

The Globe’s Gary Mason wrote an interesting article yesterday about the Gupta resignation.  Actually, let me qualify: he wrote a very odd article, which ignored basically everything his Globe colleagues Simona Chiose and Frances Bula had reported the previous week, in order to peddle a tale in which the UBC Board fired Gupta for wanting to reduce administrative costs.  This, frankly, sounds insane.  But Mason’s article did include some very striking statistics on the increase of administrative staff at UBC over the past few years – such as the fact that, between 2009-10 and 2014-15, professional administrative staff numbers increased by 737, while academic staff numbers increased by only 28.  Eye-opening stuff.

And so, this seems as good a time as any to start sharing some of the institution-by-institution statistics on administrative & support (A&S) staff I’ve been putting together, which I think you will find kind of interesting.  But before I do that, I want to show you some national-level data that is of interest.  Not on actual staff numbers, mind you – that data doesn’t exist nationally.  However, through the annual CAUBO/Statscan Financial Information of Universities and Colleges (FIUC) survey, we can track how much we pay staff in various university functions.  And that gives us a way to look at where, within the university, administrative growth is occurring.

FIUC tracks both “academic” salaries and “other” (i.e. A&S) salaries across seven categories: “Instruction & Non-Sponsored Research” (i.e. at the faculty level); “Non-Credit Instruction” (i.e. cont. ed); “Library, Computing, and Communications”; “Physical Plant”; “Student Services”; “External Relations” (i.e. Government Relations plus Advancement); and, “Administration” (i.e. central administration).  Figure 1 shows the distribution of A&S salary expenditures across these different categories for 2013-14.  A little over 32% of the total is spent at the faculty level, while another 23% is spent in central administration.  Physical plant and student services account for about 11% apiece, while the remaining three areas account for 18% combined.

Figure 1: Distribution of A&S Salaries by Function, in 000s of Dollars, Canada, 2013-14

A zoom-in on the figures for central administration is warranted, as there has been some definitional change over time, which makes time-series analyses a bit tricky.  Back in 1998, the reporting rules were changed in a way that increased reported costs by about 30%.  Then, in 2003, about 15% of this category was hacked off to create a new category: “external relations” – presumably because institutions wanted to draw a distinction between bits of central administration that increased revenues, and those that consumed them.  Figure 2 shows how that looks, over time.

Figure 2: Expenditure on Administrative & Support Salaries in Central Administration, in 000s of 2014 Real Dollars, Canada

Long story short: from the 80s through to the mid-90s, administrative & support salaries in central administration rose by a little over 3% per year in real terms.  Then, briefly, they fell for a couple of years, before resuming an upward trend.  Ignoring the one-time upward re-adjustment, aggregate A&S salaries in these two areas (central administration plus external relations) combined have been rising at 5.3% per year, after inflation, since 1999.  Which is, you know, a lot.
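To get a sense of what 5.3% per year compounds to, here is a quick sketch (I’m assuming the window runs from 1999 to the figure’s 2014 endpoint):

```python
# Cumulative effect of 5.3% annual real growth over 1999-2014 (15 years).
annual_real_growth = 0.053
years = 2014 - 1999
cumulative = (1 + annual_real_growth) ** years
print(f"{cumulative:.2f}x")  # real spending more than doubles over the period
```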

Now, let’s look at what’s been going on across the university as a whole.  Figure 3 shows changes in total A&S salary paid over time, relative to a 1979 base.  For this graph, I dropped the “non-credit” category (because it’s trivial); for central admin, I’ve both combined it with “external relations”, and corrected for the 1998 definitional change.  Also, for reference, I’ve included two dotted lines, which represent change in student numbers (in red), and change in total academic salary mass (in yellow).

Figure 3: Change in Real Total Academic & Support Salary Spending (1979-80 = 100) by Function, Canada

Since 1979, student FTEs rose 120%, while academic salary mass doubled, after inflation.  A&S spending in libraries and physical plant rose by considerably less than this, by 27% and 57%, respectively.  A&S spending on “instruction” (that is, faculty & departmental offices) rose almost exactly in tandem with student numbers.  Spending on A&S salaries in central admin and in ICT rose about twice as fast as that, ending the 35-year period at three-and-a-half times their original level.  But the really huge increases occurred in student services, where expenditures on A&S salaries are now six times as high as they were in 1979.

Over the next couple of weeks, I’ll be able to supplement this picture with institutional data, but the key take-aways for now are as follows: i) “central administration” salaries are growing substantially faster than enrolment and academic salary mass, but they represent less than a quarter of total A&S spending; ii) the largest component of A&S spending – that is, those reporting to academic Deans – is actually growing exactly on pace with enrolment; and, iii) the fastest-growing component of A&S spending is student services.  So, there has been a shift in A&S spending, but it’s not entirely to the bad, unless you’ve got a thing against student services.

More next week.

September 02

Some Basically Awful Graduate Outcomes Data

Yesterday, the Council of Ontario Universities released the results of the Ontario Graduates’ Survey for the class of 2012.  This document is a major source of information regarding employment and income for the province’s university graduates.  And despite the chipperness of the news release (“the best path to a job is still a university degree”), it actually tells a pretty awful story when you do things like, you know, place it in historical context, and adjust the results to account for inflation.

On the employment side, there’s very little to tell here.  Graduates got hit with a baseball bat at the start of the recession, and despite modest improvements in the overall economy, their employment rates have yet to resume anything like their former heights.

Figure 1: Employment Rates at 6-Months and 2-Years After Graduation, by Year of Graduating Class, Ontario

Now those numbers aren’t good, but they basically still say that the overwhelming majority of graduates get some kind of job after graduation.  The numbers vary by program, of course: in health professions, employment rates at both 6-months and 2-years out are close to 100%; in most other fields (Engineering, Humanities, Computer Science), it’s in the high 80s after six months – it’s lowest in the Physical Sciences (85%) and Agriculture/Biological Sciences (82%).

But changes in employment rates are mild compared to what’s been happening with income.  Six months after graduation, the graduating class of 2012 had average incomes 7% below those of the class of 2005 (the last class to have been entirely surveyed before the 2008 recession).  Two years after graduation, its incomes were 14% below those of the 2005 class.

Figure 2: Average Income of Graduates at 6-Months and 2-Years Out, by Graduating Class, in Real 2013/4* Dollars, Ontario

*For comparability, the 6-month figures are converted into real Jan 2013 dollars in order to match the timing of the survey; similarly, the 2-year figures are converted into June 2014 dollars.
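For readers curious about the mechanics of that footnote, the adjustment is just the standard nominal-to-real conversion; a minimal sketch, with illustrative CPI values rather than the actual series used:

```python
def to_real(nominal: float, cpi_then: float, cpi_base: float) -> float:
    """Restate a nominal dollar amount in base-period (real) dollars."""
    return nominal * cpi_base / cpi_then

# Illustrative only: a $40,000 income reported when CPI stood at 110,
# restated in dollars of a period when CPI stood at 122.
print(round(to_real(40_000, 110, 122)))  # 44364
```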

This is not simply a case of incomes stagnating after the recession: incomes have continued to deteriorate long after the return to economic growth.  And it’s not restricted to just a few fields of study, either.  Of the 25 fields of study this survey tracks, only one (Computer Science) has seen recent graduates’ incomes rise in real terms since 2005.  Elsewhere, it’s absolute carnage: education graduates’ incomes are down 20%; Humanities and Physical Sciences down 19%; Agriculture/Biology down 18% (proving once again that, in Canada, the “S” in “STEM” doesn’t really belong, labour market-wise).  Even Engineers have seen a real pay cut (albeit by only a modest 3%).

Figure 3: Change in Real Income of Graduates, Class of 2012 vs. Class of 2005, by Time Since Graduation, for Selected Fields of Study

Now, we need to be careful about interpreting this.  Certainly, part of this is about the recession having hit Ontario particularly harshly – other provinces may not see the same pattern.  And in some fields of study – Education for instance – there are demographic factors at work, too (fewer kids, less need for teachers, etc.).  And it’s worth remembering that there has been a huge increase in the number of graduates since 2005, as the double cohort – and later, larger cohorts – moved through the system.  This, as I noted back here, was always likely to affect graduate incomes, because it increased competition for graduate jobs (conceivably, it’s also a product of the new, wider intake, which resulted in a small drop in average academic ability).

But whatever the explanation, this is the story universities need to care about.  Forget tuition or student debt, neither of which is rising in any significant way.  Worry about employment rates.  Worry about income.  The number one reason students go to university, and the number one reason governments fund universities to the extent they do, is because, traditionally, universities have been the best path to career success.  Staying silent about long-term trends, as COU did in yesterday’s release, isn’t helpful, especially if it contributes to a persistent head-in-the-sand unwillingness to proactively tackle the problem.  If the positive career narrative disappears, the whole sector is in deep, deep trouble.

September 01

The Tennessee Promise

So, yesterday I talked about a big increase in access in the UK, which seems to have little to do with tuition fees.  Today, let’s talk about a developing story in the United States, where a lowering of net prices seems to have had a big impact on access.

You may recall that in the US over the last couple of years, there has been a growing movement for free community college, something that President Obama picked up on earlier this year.  But before Obama picked up this baton, free community college had already been introduced in Republican Tennessee, where Governor Bill Haslam had turned something called “the Tennessee Promise” into law in 2014.

Technically, the Tennessee Promise is not “free tuition”.  It’s only available to students entering straight from high school (which is a bit weird in terms of design, but whatever).  Students have to be full-time, maintain a 2.0 average, meet regularly with a mentor, and perform eight hours of community service per term.  And technically, what it does is reduce your tuition to zero after all other forms of aid and scholarship are taken care of (this is what is known in the business as a “last dollar” scholarship).  If you apply for the award and meet the terms, government will cover your tuition to the point where your net price is zero.  For a good number of people, this means free tuition with minimal strings attached, so let’s just call it free tuition.

Now, you might expect that with this kind of incentive, enrolment might rise a bit.  And you’d be right.  According to very early results, the number of freshmen is up 29.6% over last year.  Obviously this is a pretty impressive result, but before we get too excited, we should probably find out a little more about where these new students are coming from.  Are they “new” students, or are they mostly students who would have gone to a 4-year college, but have chosen 2-year instead?  And what about students’ financial background?  If you’re poor enough to be anywhere near the maximum Pell grant ($5,775), the Tennessee Promise provides no additional aid, because tuition at Tennessee community colleges is about $4,000.  So it may well be that what the Tennessee Promise is doing is providing aid to people higher up the income ladder.  This is a little inefficient, but since (as I noted back here) community college students tend to come from poorer backgrounds anyway, this is not as regressive as it would be if it were implemented at 4-year colleges.
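The “last dollar” design described above can be sketched as a simple function (the tuition and Pell figures are the ones quoted in this post; the function itself is an illustration, not the program’s official formula):

```python
def last_dollar_award(tuition: float, other_aid: float) -> float:
    """Pay only whatever tuition remains after all other aid is applied."""
    return max(0.0, tuition - other_aid)

# No other aid: the Promise covers full tuition.
print(last_dollar_award(4_000.0, 0.0))      # 4000.0
# A near-maximum Pell grant already exceeds tuition, so the Promise adds nothing.
print(last_dollar_award(4_000.0, 5_775.0))  # 0.0
```

This is why the benefit flows mainly to students too well-off to qualify for much Pell aid: for the poorest students, the top-up is zero.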

We should be able to answer these questions in a few weeks (yes, Canadians, in some places data is available in weeks, rather than years).  Even though Tennessee does not track applicants by income the way the UK does, the state’s excellent annual Higher Education Fact Book does contain two pieces of data that will help us track this.  The first is college-going rates by county, which will help us understand whether the jump in participation is concentrated in higher- or lower-income counties, and the second is the percentage of students who are Pell-eligible.  I’ll keep you up-to-date on this when the data is out.

The most intriguing possibility here is that rates of attendance for Pell-eligible students might be rising, even though the Tennessee Promise provides no actual added benefit for many of them.  It may well be that how we package higher education costs (“it’s free!”) matters more than how we actually fund them (“your tuition is $4,000, and you also have a grant for $4,500”).

This would have significant policy ramifications for us in Canada.  As we noted last year in our publication, The Many Prices of Knowledge, many students at Canadian community colleges face an all-inclusive net price that is negative, or very close to it.  Similarly, poor first-year university students in both Ontario and Quebec have negative net prices.  No one knows it, because we package aid in such a ludicrously opaque fashion, but it’s true.  And if the Tennessee data provides evidence that the packaging of aid matters as much as the content, then it will be time for Canadian governments to re-evaluate that packaging, tout de suite.

August 31

An Interesting Story about Access in the U.K.

Remember how, in 2012, tuition in England rose by about $10,000-$12,000 (depending on the currency exchange rate you care to use) for everyone, all at once?  Remember how the increase was only offset by an increase in loans, with no increase in means-tested grants?  Remember how everyone said how awful this was going to be for access?

Well, let me show you some interesting data.  The following comes from UCAS, which, at this time of year, does daily (yes, daily!) reports on “accepted applicants” (that is, applicants who have been offered a place at universities for the term commencing in a couple of weeks).  Figure 1 shows what’s happened to student numbers from families in the lowest income quintile since 2011, which was the year before the tuition increase.

Figure 1: Number of Accepted Applicants from the Lowest Income Quintile, England, 2011-15

Big increase, right?  Over three years, it amounts to 19.8%.

“Oh well”, say the zero-tuition true believers, “this doesn’t prove anything.  What really matters is what happened to students from higher income backgrounds.  Surely, being less bound by financial constraints, their numbers grew even more”.

In a word: nope.  The number of accepted applicants from the bottom quintile (quintile 1) grew more than three times as fast as the number from the top (quintile 5).  That’s partly because the bottom quintile has a lot more room to grow: there are still about three times as many accepted applicants from the top quintile as from the bottom.  But the point is: contrary to expectations, the gap is closing.

Figure 2: Change in Number of Accepted Applicants by Income Quintile, England, 2011-2015, Indexed to 2011

“Ok”, say the skeptics, “let’s look at counterfactuals: what’s going on in neighbouring countries, where policy didn’t involve a massive tuition fee increase?  What about Wales, where tuition stayed at a little over £3,000, or Scotland, where tuition is free (for Scots; English kids still have to pay the £9,000)?”

Fair question.  Figure 3 shows what happened to students from the lowest income quintile in all three countries: in Scotland, rates of accepted applicants are up by 28%, in Wales by 21%, and in England by 17%.

Figure 3: Change in Rate of Accepted Applicants, England, Scotland, and Wales, 2011-15, Indexed to 2011

“A-HA!” say the usual suspects.  “Clear evidence that free is better!”  Well, maybe.  But before declaring victory, why not look at rates of accepted applicants for low-income youth across these three countries?   That is: what percentage of all youth from the bottom income quintile actually reach the stage of being “accepted applicants”?

Figure 4: Accepted Applicants from Bottom Quintile Families as a Percentage of All Bottom Quintile Youth, England, Scotland, and Wales, 2011-2015

Quite a different story, isn’t it?  Turns out that in horrible, vicious, neo-liberal, £9,000-tuition England, 18% of lowest-income-quintile youth apply and are admitted to university.  In idyllic, equality-loving, £0-tuition Scotland, the figure is not much more than half that, at 10%.  So let’s just say that the evidence that fees explain participation rates, and changes in them, is pretty limited.
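The distinction at work here, between indexed growth and absolute participation rates, is easy to confuse.  A toy calculation with made-up numbers (chosen only to mimic the pattern in Figures 3 and 4, not actual UCAS counts) shows how a country can post faster growth from a small base while still having a lower participation rate:

```python
# Toy illustration with hypothetical numbers, not UCAS data: fast indexed
# growth and a low participation rate can coexist, because the index
# hides the base level.

def index_to_base(series):
    """Express each year's count relative to the first year (base = 100)."""
    base = series[0]
    return [round(100 * x / base, 1) for x in series]

# Accepted applicants from the bottom quintile, 2011-15 (hypothetical)
england = [18000, 19000, 20000, 21000, 21100]   # large base, slower growth
scotland = [2500, 2700, 2900, 3100, 3200]       # small base, faster growth

print(index_to_base(england))   # ends at 117.2, i.e. ~17% growth
print(index_to_base(scotland))  # ends at 128.0, i.e. 28% growth

# But participation *rates* depend on the size of the youth cohort
# (cohort sizes below are also hypothetical):
print(round(21100 / 117000, 2))  # ~0.18: England-style rate
print(round(3200 / 32000, 2))    # ~0.1: Scotland-style rate
```

Indexing everything to a base year of 100 makes growth comparable across countries of very different sizes, but it deliberately hides the base level, which is why Figure 4 tells such a different story from Figure 3.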

But getting beyond the issue of fees, I think there’s a bigger story here.  Right across the UK, regardless of tuition fee regime, there has been a massive uptick in participation from low-income students over the last couple of years.  Clearly, something is going right.  Is it a change in aspirations?  Expectations?  Academic preparation?  As far as I know, no one has published on this – I have a feeling everyone was so focused on explaining expected declines in participation that no one was set up to explain the opposite.  But whatever is going on, it’s a success, and other countries would do well to learn from it.

August 28

Boards, Senates, and Myths of University Exceptionalism

If there is one thing that the departure of Arvind Gupta has demonstrated, it’s that there are a large number of faculty (and others) who either misunderstand or dispute the role of Boards of Governors at universities.

Here’s the deal.  Regardless of whether an organization is for-profit or not-for-profit, there is some kind of committee at the top, which usually has the word “Board” in its title – Board of Trustees, Board of Governors, whatever.  The job of this board is threefold.  First, make sure the organization meets its strategic goals.  Second, make sure it meets its financial goals (in for-profits, these two are pretty much identical, but in non-profits they’re different).  Third, hire and hold accountable a chief executive for getting those things done.

At this point, I hear the objections: “universities aren’t corporations, how dare you compare us to a for-profit company, etc.”  The first of these is wrong: universities most definitely are corporations.  Corporate status is key to providing the legal framework for pretty much everything universities do.  True, they aren’t for-profit entities (in our country, anyway) but for-profit/not-for-profit is irrelevant with respect to governance: you still need a body at the top of the organizational hierarchy performing those three functions.

What makes universities unique is the degree to which staff are involved in developing strategic goals.  For both statutory and practical reasons, this job is more or less left to Senates (or their equivalents) and their committees.  Boards formally ratify these strategy documents, and thus “own” them, but compared to other types of organizations, they are very hands-off about this part of the job.  Senates, in effect, are the source of university exceptionalism.  But there is nothing – literally nothing – that makes universities exceptional with respect to the jobs of maintaining healthy finances, and selection/oversight of the chief executive.  The Board of a university executes those functions exactly the way the board of any other organization does.

When it comes to hiring, people kind of get this.  When new Presidents are hired, no one questions the prerogative of the Board to make the decision.  And while there is sometimes grumbling about who got chosen or who didn’t get chosen, no one parades around demanding “transparency” about why candidate X got picked instead of candidate Y.  But apparently when a President leaves, many people think that the Board owes the faculty all the gory details.  Because transparency.  Because “universities are different”.

Transparency is usually to the good, of course.  But sometimes, if you’re dealing with a personnel matter, the correct way to deal with it is to say goodbye as quickly and as amicably as possible.  By and large, you don’t do that by broadcasting the circumstances of the departure to the world.  Transparency sometimes comes second to expediency, tact, and judgement.  Yet, what a lot of people at UBC seem to be saying is that Boards owe them explanations.  Because “universities are different”.

To keep this short: universities are different – but not in that way.  Regardless of the organization they serve, boards don’t owe anybody explanations about personnel decisions.  They have a responsibility to make sure the organization is fulfilling its mandate (in managerial terms: making sure it has a strategic plan, and is fulfilling it), and providing a public good.  That’s it.   What they have to make clear in a university context is whether or not a dismissal/resignation affects the strategic plan, or (especially) if there was a dispute between Board and CEO regarding the nature or direction of the strategic plan.  And the reason they have an obligation in this scenario is because of Senate’s role in creating the strategy in the first place.

Sure, faculty might want to know details.  They’re curious.  They’d like to know (or impute) the politics of the whole thing.  But there is no right to know, and saying “universities are different” – when in this respect they clearly are not – doesn’t change anything.

August 27

Theories of Change

One of the easiest things to do in policy is to advocate for policy X, so as to change effect Y.  One of the hardest things to do is to get people to explain clearly their theory of change.  That is, what are the steps by which changing X actually affects Y?

Take performance-based funding.  It’s easy to warm to the idea that organizations can be steered by offering incentives: if you pay schools for students, they’ll raise enrolment.  If you pay them for graduates, they might spend a bit more effort and money on academic support services.  And so on.  By this theory, all you need to do to get universities to change their behaviour is to offer the right financial incentives.

But here’s the problem: that theory works a lot better for individuals than for organizations.  If what you are trying to do is force a change in organizational culture (e.g. get them to shift to a more student-centred focus), you have to remember that individuals inside an organization aren’t necessarily going to face the same incentives as the institution.  Just because an organization is incentivized doesn’t mean everyone in it is incentivized.

In extremely hierarchical organizations, it’s possible for management to pass incentives on to staff in various ways.  But universities are not particularly hierarchical institutions.  Outside of terrorist cells, universities are about the most loosely-coupled organizations on earth.  Some of the larger among them, to quote Kevin Carey, are more like holding companies for a group of departments, which are themselves holding companies for professors’ research interests.

So let’s get back to the example of a government that hopes to get universities to pay more attention to student success.  Say the government comes up with a funding formula that potentially allows an institution to access a couple million dollars more if it increases its graduation rate.  What happens?

Well, it’s certain that university leadership will try to grab the money.  That’s their job.  Then they’ll think about how to achieve the goal.  Pretty much every authority on retention will tell you that it is an institution-wide exercise.  The key is identifying students who are having trouble, and then making sure they get appropriate assistance, either from instructors or from some kind of centralized suite of academic services.  But while it’s easy enough to invest money in new centralized services, the key to such an approach still rests on professors (some more than others) altering the way they behave in class, so as to spend more time and effort identifying strugglers early, and then doing something about it (talking to the students themselves, sending their names to a counsellor who can then contact the student and offer assistance, etc.).

The question is: how do you get the professor to make those changes?  The promise of more money to the institution is a pretty weak one.  First, while many people’s behaviour will need to change in order to get the money, not everyone’s does, so there’s a rational reason to try to free ride on the process.  Second, even if the institution does get the money, it doesn’t follow that the money will be distributed in such a way that all individual profs benefit.  A prof’s behaviour is not incentivized in the same way as the institution’s.  And if that’s so, why would we expect the prof to alter his or her behaviour?

I’m not saying it’s impossible to steer universities by using money as an incentive; I’m saying that success in doing so requires the incentives to be aligned in such a way that everyone’s behaviour down the chain is incentivized.  And in a university, where every professor is, to an extent, a free agent, that’s really hard to do.  It works where the incentive aligns with career goals or professional norms (e.g. do more research).  But when it pushes against professional norms, it’s a lot more difficult.

Fundamentally, people trying to steer system reforms need to ask themselves: how will this incentive alter what individuals on the ground actually do on a day-to-day basis?  If there’s no good answer to that question, chances are the incentive isn’t likely to work.
