HESA

Higher Education Strategy Associates

October 17

Innovation Literature Fail

So, I’ve been reading Mariana Mazzucato’s The Entrepreneurial State.  It’s brilliant and irritating in equal measure.  Brilliant because of the way it skewers certain free-market riffs about the role of risk and entrepreneurialism in the innovation process, and irritating because it’s maddeningly cavalier about applying business terms to government processes (in particular, the term “risk”, which Mazzucato doesn’t seem to understand means something entirely different in government, where losses can be made whole through taxation).

Anyways, one thing that occurred to me while reading was just how America-specific much of the literature on innovation is.  Take the Defense Advanced Research Projects Agency (DARPA).  In innovation policy circles, it’s generally considered a wicked-cool way of organizing Big Science: it’s project-based, it brings teams together from both academia and business, and it has substantial independence.  And, of course, its basic research has produced things like GPS and the Internet (still the core anecdotes used to back the “government-should-be-involved-in-research” argument).

Brilliant, right?  So why doesn’t everyone have a DARPA?  Why doesn’t Canada?

The answer is that DARPA wouldn’t make any sense here.  Our government agencies don’t have enough of the “big problems” that DARPA is designed to solve – or, at least, that could be solved at a price we can afford.  And frankly, we don’t have enough private-sector research scientists to make headway on these kinds of projects, anyway.

More broadly, the American system of funding science works because of a particular combination of factors: the problems needing to be solved, the presence of major private sector research efforts, a particular type of venture capital industry, and scale.  Canada – like most countries in the world – would, at most, get part-marks on any of those four criteria.  So why do we think that policies based on American examples work for us?

Take questions of “applied” vs. “basic” science.  Maybe the classic Vannevar Bush formulation – “government funds universities to do basic research, and companies do the applied stuff” – only makes sense in the US context.  Maybe without the VC culture, or the private-sector research culture, the idea that government should only be playing on the “basic” side of the continuum doesn’t make any sense.  Maybe countries that aren’t quite at the technological frontier don’t get as much bang for their buck in basic research as America does.

This is just speculation on my part, of course.  But I’m tired of the innovation literature assuming that US-inspired solutions will work here.  Just for once, I’d like to see some literature and policy prescriptions based on what works in Korea, the Netherlands, and Scandinavia.  There’s probably a whole other set of policy lessons to be learned, if only we looked for them in the right places.

October 16

A Simple Solution for Statistics on Doctoral Education

Higher education statistics in Canada are notoriously bad.  But if you think general stats on higher ed are hard to come by, try looking at our statistical systems for doctoral education and its outcomes.

Time-to-completion statistics are a joke.  Almost no one releases this data; when it is released, it often appears to be subject to significant “interpretation” (there’s a big difference between time-to-completion and “registered” time-to-completion: if you want to keep the latter down, just tell students entering sixth year to de-register until they’re ready to submit a thesis).  Employment statistics are even scarcer – and as for statistics on PhDs getting jobs in academia?  Ha!

It’s that last piece of data that students really want published; it’s also the one viewed with the most trepidation by directors of graduate programs, who are a bit worried about what the stats might reveal.  They would probably argue that this isn’t a fair measure of their programs’ success since, after all, they don’t control the hiring market.  Fair enough, though a reasonable person might ask in return why, if that’s the case, PhD intakes are staying high, or even increasing.

Personally, I think the idea that everyone in a PhD program should aspire to an academic career is demented, and professors who peddle that idea to their students (and there are thousands of them) should be ashamed of themselves.  But sadly, a lot of students do buy into this myth, and when it doesn’t come true, they’re fairly upset.

There is, however, a simple way to address this problem.  Every department in the country should be required to maintain a webpage with statistics on graduation rates, on times-to-completion (real ones, from time of enrolment to degree), and on the last five years’ worth of graduates.  How many are in tenure-track positions?  How many are in post-docs?  How many are temping?  Etc.  No big survey, no nightmare of involving StatsCan and the provinces, and whatnot.  Just every department, publishing its own stats.  And no guff about response burdens: between Academia.edu and LinkedIn, this shouldn’t take more than a day to do.
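
In case that “one day” claim sounds glib, here’s a minimal sketch of the tally involved – the field names, sample records, and outcome categories below are all invented for illustration, not any department’s actual data:

```python
from datetime import date

# Hypothetical records an administrator might assemble by hand from
# departmental files plus LinkedIn/Academia.edu lookups. Field names
# and sample values are invented for illustration only.
graduates = [
    {"enrolled": date(2005, 9, 1), "degree": date(2012, 6, 1), "now": "tenure-track"},
    {"enrolled": date(2006, 9, 1), "degree": date(2013, 6, 1), "now": "post-doc"},
    {"enrolled": date(2007, 9, 1), "degree": date(2013, 6, 1), "now": "sessional"},
    {"enrolled": date(2006, 9, 1), "degree": date(2011, 6, 1), "now": "industry"},
]

# "Real" time-to-completion: enrolment to degree, no de-registration games.
years = [(g["degree"] - g["enrolled"]).days / 365.25 for g in graduates]
print(f"Mean time-to-completion: {sum(years) / len(years):.1f} years")

# Outcome counts for the webpage.
outcomes = {}
for g in graduates:
    outcomes[g["now"]] = outcomes.get(g["now"], 0) + 1
for status, n in sorted(outcomes.items()):
    print(f"{status}: {n}")
```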

“Sure”, I hear you saying, “and which departments will volunteer for this, exactly?”  But there’s actually a very simple way to get departments to fall into line.  The granting councils – which, as much as anyone, should be concerned about these issues – should simply make publication of such information a prerequisite for obtaining any funding for graduate students.  Period.

Faced with this, I’m fairly certain that departmental objections would melt like butter.  So how about it, SSHRC and NSERC? Want to land a blow for good data?  Give this plan a try.  You’ll be heroes to grad students from coast to coast.

October 15

The Problem with Cutback Narratives

Let’s discuss how we talk about cutbacks.  And let’s talk about the University of Alberta.

U of A has been rather radically affected by the recent cutbacks imposed by the Alberta government.  But here’s the weird thing: apparently it’s not enough to say “we’ve had a cut of 7% in one year”.  Instead, people feel the need to enhance that figure in various ways.  It’s not just a 7% cut, they say – “we were told by government to budget based on 2% growth, so it’s actually 9%”.  Or, “11% over two years”, etc.

If that’s not enough to evoke sympathy, people turn to statements like: “well, we have to absorb a 7% cut, but that’s on top of the 18% we’ve already taken in the last four years.”  This, apparently, is a quote from the Dean of Science.  I have to hope it’s a misquote, because it’s not even vaguely true.

First, although the government grant fell 7%, that doesn’t mean revenue will fall 7%, because the University can always find new revenue – even with a ridiculous and unconscionable tuition fee freeze – by changing its enrolment mix to favour more expensive programs (e.g. professional master’s degrees) and students (i.e. foreign ones).  In fact, according to a planning document released last month, the 2013-14 operating budget for faculties is actually 0.2% higher than it was last year (Science did take a 2% hit, but hey, someone’s got to pay for that 6% bump Medicine received).

Second, it’s preposterous to say that cuts have been sustained over a four-year period.  Here’s the actual budget of the Faculty of Science over the past four years:

University of Alberta Science Budget, 2009-10 to 2013-14, in Millions

… and for faculty operating budgets in general…

University of Alberta Faculties’ Budget, 2009-10 to 2013-14, in Millions

… and total operating budgets…

University of Alberta Operating Budget, 2009-10 to 2013-14, in Millions

The problem, as you can see, isn’t primarily about income.  The problem is that when your business is 60% labour costs, and you can’t fire people, and you hardwire in annual 4% increases in labour costs, there isn’t a lot of flexibility in the system.  When revenue growth slows to below 4%, cuts – sometimes quite painful ones – do occur… in non-salary areas.  Indeed, between 2009-10 and 2012-13, the Faculty of Science cut its materials budget by 25%, from $2 million to $1.5 million… but the salary and benefits budget rose from $69.2 million to $76 million over the same period.  Hmm.
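
To see how brittle that arithmetic is, here’s a toy projection – the starting figures are invented, but the mechanics (a 60% labour share growing at a hardwired 4%, with revenue growing at only 2%) are the ones described above:

```python
# Toy model of the squeeze. Starting figures are invented; the
# mechanics (60% labour share, hardwired 4% salary growth, revenue
# growth of only 2%) are the ones described in the post.
revenue = 100.0      # total operating revenue, $M
salaries = 60.0      # labour is 60% of the budget

for year in range(1, 5):
    revenue *= 1.02   # revenue grows at 2%...
    salaries *= 1.04  # ...but salary costs are hardwired at 4%
    non_salary = revenue - salaries
    print(f"Year {year}: salaries {salaries:.1f}, everything else {non_salary:.1f}")

# "Everything else" shrinks every single year even though total revenue
# keeps rising - the same pattern as the Science materials budget above.
```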

Universities didn’t have to build brittle systems that would shatter if revenue growth fell below 4%.  The academic community, through a thousand little decisions, made this bed for itself, and now has to lie in it.  Yet the dominant narrative is one of universities as passive victims of outsiders’ (i.e. government) actions.  A more thoughtful response – one befitting institutions devoted to dispassionate analysis – might be a bit more introspective.

October 11

PIAAC: The Results for Aboriginal and Immigrant Canadians

One of the unbelievably cool things about this week’s PIAAC release is the degree to which StatsCan and CMEC have gone the extra mile, oversampling not only every province but also every territory (a first, to my knowledge), as well as Aboriginal populations – although they were not able to include on-reserve populations in their sample.  This allows us to take some truly interesting looks at several vulnerable sub-segments of the population.

Let’s start with the Aboriginal population.  Where numbers permit, we have province-by-province stats on this, albeit only for off-reserve populations.  Check out Figure 1:

Figure 1: PIAAC Literacy Scores for Aboriginal and Mainstream Canadians, Selected Provinces.

So, good news first: in BC and Ontario, the gap between Aboriginal and mainstream Canadians is down to single digits – great news, even if it doesn’t include the on-reserve population.  But given the differences in educational attainment, you have to think that a lot of this gap is down to attainment rates: if one were to control for education, my guess is that the remaining difference would be smaller still.

The bad news, of course, is: WHAT THE HELL, NUNAVUT?  Jumpin’ Jehoshaphat, those numbers for the Inuit are awful.  The reason, again, comes down to education: high-school completion rates for the territory’s population as a whole are below 50%.  The other territories are better, but not by much.  It’s a reminder of how much work is still needed in Canada’s north.

The immigration numbers are a bit more complicated.  The gap in literacy scores between non-immigrants and immigrants is about 25 points, and this gap is consistent at all levels of education.  That’s not because immigrants are less capable; it’s because, for the most part, they’re taking the test in their second – or possibly third – language (breaking down the data by test-takers’ native language confirms this).  As someone pointed out to me on Twitter, the consequence is that PIAAC literacy isn’t pure literacy, per se – it’s a test of how well one functions in society’s dominant language.  Conversely, though, since facility in the dominant language clearly has an effect on remuneration, one wonders how much of the oft-discussed gap in salaries between immigrants and native-born Canadians – which seems illogical when looking just at educational levels – might be understood in light of this gap in “literacy”.

A larger point to remember, though, is that the presence of immigrants makes it difficult to use overall PIAAC scores as a commentary on educational systems.  Over 20% of Canadians aged 16-65 are immigrants; most of them did their schooling outside Canada and, bluntly, they bring down the scores.  Provinces with high proportions of immigrants will naturally see lower averages.  Policymakers should be careful not to let such confounding variables affect their interpretation of the results.
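
A toy composition calculation makes the point – the scores below are invented, with only the 25-point gap taken from the data above:

```python
# Toy composition effect: two provinces with identical native-born and
# immigrant scores, differing only in immigrant share. The scores are
# invented; the 25-point gap is the one cited above.
native_score, immigrant_score = 280.0, 255.0

for name, imm_share in [("Low-immigration province", 0.05),
                        ("High-immigration province", 0.30)]:
    avg = (1 - imm_share) * native_score + imm_share * immigrant_score
    print(f"{name}: average {avg:.2f}")

# The ~6-point difference between the two averages reflects population
# mix alone, not anything about school quality.
```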

October 10

More PIAAC: The Canadian Story

Yesterday I offered my thoughts on some of the highlights from the international portion of the PIAAC release; today I want to focus on the Canadian results. 

Figure 1 shows the overall literacy scores, by province.

Figure 1: Literacy Scores by Province, PIAAC

At first glance, PIAAC doesn’t seem to be telling us anything we didn’t already know from years of PISA & TIMSS surveys.  Alberta comes first, the Atlantic is mostly a mess, and everybody else is kind of in-between.  But look a little more closely at the data, and a different story emerges.  Remember that PISA and TIMSS are single-cohort snapshots of kids with identical amounts of education; PIAAC is a mashup of multiple cohorts, each with quite different educational patterns.  Because they are measuring such different things, similarities may simply be coincidental.

So let’s see what happens when we try to standardize for age and education.  Figure 2 shows PIAAC literacy scores, by province, for the 25-34 age cohort who possess a university degree:

Figure 2: Literacy Scores by Province, University Graduates Aged 25-34

At face value, Figure 2 is pretty exciting if you’re from the Atlantic.  I mean, hey, the OECD says one year of schooling is equal to seven points on the PIAAC scale – which implies that Islanders with university degrees have, on average, literacy scores equivalent to about three years of extra education over the left-coasters.  But because of sample sizes, these numbers come with pretty big confidence intervals: PEI and Nova Scotia are outside the margin of error for BC and Saskatchewan, but not for anyone else.  The other six are all essentially equal.
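
The back-of-envelope conversion, for anyone checking my arithmetic (the score gap is read off the chart, so treat it as approximate):

```python
# Converting a PIAAC score gap into "years of schooling" using the
# OECD's rule of thumb of 7 points per year. The ~21-point PEI-vs-BC
# gap is read off the chart, so treat it as approximate.
points_per_year = 7
gap = 21  # approximate PEI-BC difference, university grads aged 25-34
print(f"{gap} points / {points_per_year} points per year "
      f"= {gap / points_per_year:.0f} years of schooling")
```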

Now take a look at the result for college graduates, aged 25-34:

Figure 3: Literacy Scores by Province, College Graduates Aged 25-34

There’s a similar pattern here, but the gaps at either end are bigger, and confidence intervals don’t help quite as much.  BC through Manitoba are all within each other’s margin of error.  But PEI and Alberta are genuinely ahead of everyone else except BC; Newfoundland and Saskatchewan come out looking bad no matter what.

Here’s what you should take from this:

1)   Alberta’s overall high PIAAC scores are due less to its own education system and more to its ability to attract talent from elsewhere.  That’s the only way to reconcile its overall scores with what we know about its PSE access rates, and with the performance shown in the second and third figures above.

2)   Saskatchewan needs to ask some hard questions.  Really hard.

3)   PEI is… odd.  This doesn’t look like a fluke.  But then, if they’ve got all these great skills, why is their economy such a basket case?

4)   Newfoundland is Newfoundland.  Decades of relative poverty will take their toll.

5)   Don’t get fooled by small differences – the other six provinces are essentially indistinguishable from one another.

More tomorrow.

October 09

Some Bombshells from the Programme for the International Assessment of Adult Competencies (PIAAC)

So, yesterday saw the release of the first results from the Survey of Adult Skills, a product of the OECD’s Programme for the International Assessment of Adult Competencies.  The survey examines how adults in different countries fare on a set of tests measuring cognitive and workplace skills, such as literacy, numeracy, and ICT skills; perhaps somewhat controversially, some of the OECD’s own employees are referring to it as a “ranking” (though, honestly, that does it a grave disservice).  Additionally, Statistics Canada did a seriously massive oversample, which allows us to make comparisons not only between provinces and territories (to my knowledge, this is the first time anyone’s gone to the trouble of getting representative data in Nunavut), but also between immigrants and non-immigrants, and between Aboriginal and mainstream Canadians.

Fun, huh?  So much fun it’s going to take me the rest of the week to walk you through all the goodies in here.  Let’s begin.

Most of the media coverage is going to be on the “horse-race” aspects of the survey – who came top, across the entire population – so that’s a good place to start.  The answer is: Japan for literacy and numeracy, Sweden for ICT skills.  Canada is middle of the pack on numeracy and literacy, and slightly above average on ICT.  These Canadian results also hold even when we narrow the age-range down to 16-24 year-olds, which is more than passing strange, since these are the same youth who’ve been getting such fantastic PISA scores for the last few years.  Most national PIAAC and PISA scores correlate pretty well, so why the difference for Canada?  Differences in sampling, maybe?  It’s a mystery which deserves to be resolved quickly.

But here’s the stuff that grabbed my attention: national literacy scores among young graduates, by level of education.  The scores for secondary school graduates are for 16-19 year-olds only; the scores for university graduates are for 16-29 year-olds.  Note that scores are out of 500, with 7 points being equivalent (allegedly) to one extra year of schooling.

Figure 1: PIAAC scores by Country and Level of Education

Eyeball that carefully.  Japanese high school graduates (red bars) have higher literacy levels than university graduates (blue bars) in England, Denmark, Poland, Italy, and Spain.  Think about that.  If you were a university rector in one of those countries, what do you think you’d be saying to your higher education minister right about now?

Another way to look at this data is to examine the “value added” by higher education systems: the difference between the scores of recent university or college graduates (technically, “tertiary non-university” – always a tricky category), and those of secondary graduates.  Figure 2 shows the differentials for universities:

Figure 2: Difference in PIAAC Scores Between University (Tertiary A) and High School Graduates

And Figure 3 shows them for tertiary non-universities (Tertiary B):

Figure 3: Difference in PIAAC Scores Between Tertiary Non-University (Tertiary B) and High School Graduates

There’s an awful lot one could say about all this.  But for me it boils down to: 1) the fact that so many countries’ Tertiary B value-adds are negative is a bit scary; 2) the Americans, Finns, and Belgians (the Flemish ones, anyway) all have really good value-add results across their whole tertiary systems; 3) the Australians and Koreans appear to have absolutely god-awful value-add results across their whole tertiary systems; and, 4) Canada is just… middle-of-the-pack.  Not bad, but probably not where we should be, given how much higher-than-average our expenditures on higher education are.
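
In case the “value-add” arithmetic above isn’t obvious, it’s just a difference of group means – here’s a sketch with invented numbers:

```python
# "Value added" as used above is simply the difference in mean scores
# between recent tertiary graduates and secondary graduates. The
# numbers below are invented placeholders, not actual PIAAC results.
mean_scores = {
    "secondary": 270.0,
    "tertiary_a": 296.0,  # university
    "tertiary_b": 265.0,  # tertiary non-university
}

for level in ("tertiary_a", "tertiary_b"):
    value_add = mean_scores[level] - mean_scores["secondary"]
    print(f"{level} value-add: {value_add:+.0f} points")

# A negative number - as for tertiary_b here - is the scary case
# flagged in point 1 above.
```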

More tomorrow.

October 08

Where Responsibility for Financial Sustainability Lies

I often write about the unsustainability of university finances, the lunacy of their cost base, the fact that Canadian profs are better paid than those in any public system of higher education in the world, etc.  Some people have concluded from this that I am hostile to labour, or to academic unions in particular.

But that’s not true.  Though I do call BS on some of the sanctimonious nonsense that comes out of academic unions on the beleaguered state of their (let’s face it) quite privileged members, I cannot, and do not, blame people for banding together and bargaining in their best interests.  If you want to blame anyone for our current financial predicament, try looking at the administrators who keep saying yes to labour’s demands.

Although things are slowly changing, we’ve had a pretty serious agency problem on the management side of the bargaining table for most of the last twenty years.  While senior administrators are sometimes demonized as aliens feasting parasitically on the body of the academy, the fact is that most of them are academics themselves, and a fair number intend to go back to teaching and research at some point.  That puts them in a bit of a conflict of interest, since there’s a good chance they will eventually end up in the very bargaining unit with which they’re negotiating.  And it’s not as if they’re negotiating with their own money – if they give away too much, they can always lobby government for more money, or for higher permissible fee increases, right?

It’s also astonishing how long it’s taken for management to get serious about bargaining.  Though not all members are equally appreciative of its methods, CAUT’s great success over the years has been to help its members bargain, and – perhaps more crucially – to provide big stonking cheques for strike pay on the eve of work stoppages (the million-dollar cheque Jim Turk sent to St. FX’s union on the eve of last year’s strike played an enormous role in the shape of the final settlement).  But university administrations haven’t kept pace, and so they get picked off one by one, and keep agreeing to contracts which, in the long run, are unsustainable without significant hikes in public funding, or tuition fees, or both – neither of which, you may have heard, is imminent anywhere right now.

Long story short: university finances are largely a mess, and rising labour costs are a significant part of the problem.  But no one made institutions sign those deals.  Going to government and pleading for special treatment now because they made dumb deals in the past?  That dog just won’t hunt.

October 07

Superior Strategy

You may recall that, a few weeks ago, I was somewhat harsh about Western’s new strategic plan, calling it a kind of Stepford-like strategy: generic, and utterly lacking in anything that suggested Western had its own strengths and personality.  If you follow me on Twitter, you may have seen me make similar remarks about Waterloo’s strategic plan.  Waterloo is one of the country’s few universities that genuinely has a unique value proposition, and deep strengths on which to build – which is why it’s disappointing to see its current strategic plan downplay (though not eliminate) those things in favour of the generic “research-research-hire-great-faculty-we-need-world-class-research” nonsense that passes for strategy in some quarters.

But having shown you some examples of bad strategy, I thought I should make note of good strategy when it happens.  And so, Ladies and Gentlemen, I give you Humber’s Strategic Plan.

(Some disclosure here: in the past twelve months, I have had two paid speaking engagements at Humber, one in connection with the making of this strategic plan.  I also bid for the contract to help them develop the strategy as a whole – unsuccessfully.  Make of that what you will.)

Why is Humber’s strategy so good?  A few reasons.

A Clear, Concise Mission: “To be leaders in polytechnic education”.  What’s great about this statement is that, unlike some generic statement about excellence, it gives everyone in the organization guidance about what not to do.  They’re not there to become a university.  They’re not there to be a college.  They’re going to stick to their knitting, and try to carve out something distinctive in the emerging field of polytechnic education.

A Limited Number of Strategic Priorities.  In order to appease various internal constituencies, too many strategies adopt an unwieldy laundry-list of priorities. Humber kept it to three: partnerships, teaching and learning, and “strengthening the polytechnic identity” (which might sound flaky but, in practice, is a fairly tightly-defined process of institutional differentiation and program development).

Success Metrics: It’s simple: if you commit to goals (three per strategy, nine in total), you should have some idea of whether or not you’ve achieved them.  Yet, in most academic strategic plans, you will search in vain for simple, clear-language statements of intended outcomes.  Humber’s are beautifully clear.

It’s not all brilliant; in particular, I think it contains too much about online education, which, given Humber’s profile and other strategic priorities, seems more like a distraction than anything else.  But that’s more a quibble than a real problem.  Humber has actually thought hard here about its unique strengths, and is building on them in a focused manner.

That’s strategy, folks.  Read it.

October 04

Those Times Higher Education World Rankings (2013-14 Edition)

So, yesterday saw the release of the latest round of the THE rankings.  They were a bit of a damp squib, and not just for Canadian schools (which really didn’t move all that much).  The problem with actually having a stable methodology, as the Times is finding, is that there isn’t much movement at the top from year-to-year. So, for instance, this year’s top ten is unchanged from last year’s, with only some minor swapping of places.

(On the subject of the THE’s top ten: am I the only one who finds it completely and utterly preposterous that Harvard’s score for industry funding per professor is less than half of what it is at Beijing and Basel?  Certainly, it leads to suspicions that not everyone is filling out their forms using the same definitions.)

The narrative of last year’s THE rankings was “the rise of Asia” because of some good results from places like Korea and China.  This year, though, that story wasn’t tenable.  Yes, places like NUS did well (up 5 places to 25th); but the University of Hong Kong was down 8 spots to 43rd, and Korea’s Postech was down 10 spots to 60th.  And no other region obviously “won”, either.  But that didn’t stop the THE from imposing a geographic narrative on the results, with Phil Baty claiming that European flagship universities were “listing” – which is only true if you ignore Scandinavia and the UK, and see things like Leuven finishing 61st, as opposed to 58th, as significant rather than as statistical noise.

This brings us to the University of Basel story.  The THE doesn’t make a big deal out of it, but a university jumping from 134th to 68th says a lot.  And not about the University of Basel.  That the entirety of its jump can be attributed to changes in its scores on teaching and research – both of which are largely based on survey results – suggests that there’s been some weirdness in the pattern of survey responses.  All the other big movers in the top 100 (i.e. Paris Sud, U Zurich, and Lund, which fell 22, 31, and 41 places, respectively) also had huge changes in exactly these two categories.

So what’s going on here?  The obvious suspicion is that there were fewer French and Swiss respondents this year, thus leading to fewer positive responses for those schools.  But since the THE is cagey about disclosing details on survey response patterns, it’s hard to tell if this was the case.

And this brings us to what should really be the lead story about these rankings: for an outfit that bleats about transparency, too much of what drives the final score is opaque.  That needs to change.

October 03

Better Know a Higher Ed System – Poland

So, you’re a new, post-communist country.  You have an undereducated population; your universities are filled with discredited Marxists; you’re broke, and your constitution says you can’t charge tuition fees.  What do you do?

Well, if it’s 1990, and you’re Poland, you do two things:

1)      Let the private sector rip.  Sure, private universities are low-prestige, and they only do cheap subjects like business, law, and the social sciences.  But since those were precisely the areas where the traditionally high-prestige public universities were most discredited, the move actually worked out pretty well.  Close to 300 private institutions popped up over the following 15 years, often teaching part-time and on weekends to a mainly older clientele who had missed out on education during their “traditional” school years under the communists.  At their peak in 2008, privates educated a third of all the students in the country.

2)       Set up a dual-track tuition system.  This was a bit trickier.  The government didn’t feel it could change the constitution, but it did feel it could re-interpret it a bit.  So while it continued to fully fund a little over half a million students per year (i.e. they attended tuition-free), it permitted institutions to enrol hundreds of thousands more students on a private basis, call them “part-time students”, and claim that the constitutional restrictions on fees didn’t apply to this new type of student.  By 2000, the public system had over half a million students studying in this mode.

These revenue-generating solutions weren’t unique to Poland – most post-socialist countries tried one or the other of these policies.  But the combination of the two had particularly positive outcomes where Poland was concerned.  Enrolments more than tripled, from about 500,000 in the early 1990s to 1.9 million in 2007, and the system became significantly more diverse.

But post-2007, a more negative trend took hold.  The “deferred demand” – older students who had missed out during the socialist period – was finally sated, and the post-1989 baby-bust meant that demographic trends turned sharply negative.  Put those together and you have a system in which enrolment levels – and hence funding – are now in serious decline.

And that makes Poland a country worth watching.  Dealing with demographic and financial decline is something many university systems – including parts of our own – will face over the next decade.  So far, much of the adjustment has fallen on private universities – it turns out that the demand-absorbing institutions that do so well when enrolments are on the way up are also the first to get hit when enrolments reverse.  But as the crisis starts to hit public institutions, there will be a lot of lessons from which institutions right across the OECD can learn.
