Higher Education Strategy Associates

May 08

Naylor Report, Part II

Morning all.  Sorry about the service interruption.  Nice to be back.

So, I promised you some more thoughts about the Fundamental Science Review.  Now that I’ve had a lot of time to think about it, I’m actually surprised by what it doesn’t say, by what it does say, and by how many questions remain open.

What’s best about the report?  The history and most of the analysis are pretty good.  I think a few specific recommendations (if adopted) might actually be a pretty big deal – in particular the one saying that the granting councils should stop any programs forcing researchers to come up with matching funding, mainly because it’s a waste of everyone’s time.

What’s so-so about it?  The money stuff, for a start.  As I noted in my last blog post, I don’t really think you can justify a claim to more money based on the “proportion of higher education research investment coming from the federal government”.  I’m more sympathetic to the argument that there need to be more funds, especially for early career researchers, but as noted back here, it’s hard to argue simultaneously that institutions should have unfettered rights to hire researchers but that the federal government should pick up responsibility for their career progression.

The report doesn’t even bother, really, to make the case that more money on basic research means more innovation and economic growth.  Rather, it simply states it, as if it were a fact (it’s not).  This is the research community trying to annex the term “innovation” rather than co-exist with it.  Maybe that works in today’s political environment; I’m not sure it improves overall policy-making.  In some ways, I think it would have been preferable to just say: we need so many millions because that’s what it takes to do the kind of first-class science we’re capable of.  It might not have been politic, but it would have had the advantage of clarity.

…and the Governance stuff?  The report backs two big changes in governance.  One is a Four Agency Co-ordinating Board for the three councils plus the Canada Foundation for Innovation (which we might as well now call the fourth council, provided it gets an annual budget as recommended here), to ensure greater cross-council coherence in policy and programs.  The second is the creation of a National Advisory Committee on Research and Innovation (NACRI) to replace the current Science, Technology and Innovation Council and do a great deal else besides.

The Co-ordinating Board idea makes sense: there are some areas where there would be clear benefits to greater policy coherence.  But setting up a forum to reconcile interests is not the same thing as actually bridging differences.  There are reasons – not very good ones, perhaps, but reasons nonetheless – why councils don’t spontaneously co-ordinate their actions; setting up a committee is a step towards getting them to do so, but success in this endeavour requires sustained good will, which will not necessarily be forthcoming.

NACRI is a different story.  Two points here.  The first is that it is pretty clear NACRI is designed to insulate the councils, and the investigator-driven research they fund, from politicians’ bright ideas about how to run scientific research.  Inshallah, but if politicians want to meddle – and the last two decades seem to show they want to do it a lot – then they’re going to meddle, NACRI or no.  Second, NACRI as designed here is somewhat heavier on the “R” than on the “I”.  My impression is that, as with some of the funding arguments, this is an attempt to hijack the Innovation agenda in Research’s favour.  I think a lot of people are OK with this because they’d prefer the emphasis to be on science and research rather than innovation, but I’m not sure we’re doing long-term policy-making in the area any favours by not being explicit about this rationale.

What’s missing?  The report somewhat surprisingly punted on what I expected to be a major issue: namely, the government’s increasing tendency over time to fund science outside the framework of the councils, in programs such as the Canada Excellence Research Chairs (CERC) and the Canada First Research Excellence Fund (CFREF).  While the text of the report makes clear the authors have some reservations about these programs, the recommendations are limited to a “you should review that, sometime soon”.  This is too bad, because phasing out these kinds of programs would be an obvious way to pay for increased investigator-driven funding (though as Nassif Ghoussoub points out here, it’s not necessarily a quick solution, because funds are already committed several years in advance).  The report thus seems to suggest that though it deplores past trends away from investigator-driven funding, it doesn’t want to see these recent initiatives defunded – a position that might be seen in government as “having your cake and eating it too”.

What will the long-term impact of the report be?  Hard to say: much depends on how much of this the government actually takes up, and it will be some months before we know that.  But I think the way the report was commissioned may have some unintended adverse consequences.  Specifically, the fact that this review was set up in such a way as to exclude consideration of applied research – while perfectly understandable – is going to contribute to the latter being something of a political orphan for the foreseeable future.  Similarly, the fact that the report was done in isolation from the broader development of Innovation policy might seem like a blessing, given the general ham-fistedness surrounding the Innovation file; but in the end I wonder if the result won’t be an effective division of policy, with research being something the feds pay universities to do and innovation something they pay firms to do.  That’s basically the right division, of course, but what goes missing are vital questions about how to make the two mutually reinforcing.

Bottom line: it’s a good report.  But even if the government fully embraces the recommendations, there are still years of messy but important work ahead.

April 18

Naylor Report, Take 1

People are asking why I haven’t talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is: i) I’m swamped, ii) there’s a lot to talk about in there, and iii) I wanted some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it’s the latter, I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways the panel argues this point is that while Canada has among the world’s highest levels of Research and Development spending in the higher education sector – known as HERD if you’re in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous, and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:

[Figure: chart from the Naylor Report showing sources of HERD funding]

Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries, for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I’ll come back to it in a moment), plus ii) some imputation for unrecovered research overhead.  In the US, it’s just the latter.  Why?  Because of the way the US collects data on HERD, the only faculty costs it captures are the chunks taken out of federal research grants.  Remember, in the US profs are only paid nine months per year and, at least in the R&D accounts, that time is *all* teaching.  Only the pieces of research grants they take out as summer salary get recorded as R&D expenditure (and hence as a government-sponsored cost rather than a higher education-sponsored one).
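To make the accounting difference concrete, here’s a toy calculation – all the dollar figures are invented for illustration, not taken from the actual HERD accounts:

```python
# Illustrative only: every dollar figure here is made up, purely to show
# how the two national accounting conventions diverge.

faculty_salaries = 10_000e6      # aggregate faculty salaries (hypothetical)
research_share = 0.42            # survey-based share of prof time spent on research
unrecovered_overhead = 1_500e6   # imputed unrecovered research overhead (hypothetical)

# Canadian convention: salaries x research-time share, plus the overhead imputation
herd_he_canada = faculty_salaries * research_share + unrecovered_overhead

# US-style convention: overhead only; faculty research time paid out of federal
# grants gets booked as government-sponsored, not higher-education-sponsored
herd_he_us = unrecovered_overhead

print(f"Canadian convention: ${herd_he_canada / 1e9:.1f}B")  # $5.7B
print(f"US-style convention: ${herd_he_us / 1e9:.1f}B")      # $1.5B
```

Same university system, wildly different “higher education sponsored” totals.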

But there’s a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion, then it’s worth remembering what’s going on in the denominator.  Aggregate salaries are the first component.  The second component is research intensity, as measured through surveys.  This appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30% and 35% of their time on research.  A more recent survey shows that this has risen to 42%.  I am not sure whether this latest coefficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education “spending” (or “investment”, if you prefer) on research, despite nothing really having changed at all (possibly it already has been, and that is what explains the bump in expenditures in 2012-13).
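The mechanical effect of that coefficient change is easy to see (again, the salary base below is invented and held constant):

```python
# Same formula as the sketch above, varying only the survey-based
# research-time coefficient while salaries stay frozen.
faculty_salaries = 10_000e6  # hypothetical, unchanged between surveys

for share in (0.33, 0.42):
    spending = faculty_salaries * share
    print(f"research share {share:.0%}: measured HE-sponsored R&D = ${spending / 1e9:.2f}B")

# 33% -> 42% is a ~27% jump in measured "spending", with no new dollars anywhere.
```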

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding of inquiry-driven research shouldn’t rise.  Just that using data on university-funded HERD might not be a super-solid base from which to argue that point.

April 17

British Columbia: Provincial Manifesto Analysis

On May 9th, our left-coasters go to the polls.  What are their options as far as post-secondary education is concerned?

Let’s start with the governing Liberals.  As is often the case with ruling parties, some of their promises are things that are both baked into the fiscal framework and will take longer than one term to complete (e.g. “complete re-alignment of $3 billion in training funds by 2024”), or are simply re-announcements of previous commitments (pages 85-86 of the manifesto appear to simply be a list of all the SIF projects the province already agreed to co-fund), or take credit for things that will almost certainly happen anyway (“create 1000 new STEM places”… in a province which already has 55,000 STEM seats and where STEM spots have been growing at a rate of about 1,700/year anyway… interestingly, the Liberals didn’t even bother to cost that one).

When you throw those kinds of promises away, what you are left with is a boatload of micro-promises, including: i) making permanent the current BC Training Tax Credit for employers, ii) creating a new Truck Logger training credit (yes, really), iii) spending $10M on open textbooks over the next 4 years, iv) reducing interest rates on BC student loans to prime, v) making minor improvements to student aid need assessment, vi) providing a 50% tuition rebate to Armed Forces veterans, vii) creating a centralized province-wide admission system, and viii) allowing institutions to build more student housing (currently they are restricted from doing so because any institutional debt is considered provincial debt, and provincial debt is more or less verboten… so this is a $0 promise just to relax some rules).  There’s nothing wrong with any of those, of course, but only the last one is going to make any kind of impact, and as a whole it certainly doesn’t add up to a vision.  And not all of this appears to be new money: neither the student loan changes nor the centralized application system promises are costed, which suggests funds for these will be cannibalized from elsewhere within the system.  The incremental cost of the remaining promises?  $6.5 million a year.  Whoop-de-do.  Oh, and they’re leaving the 2% cap on tuition increases untouched.

What about the New Democrats?  Well, they make two main batches of promises.  One is about affordability, and consists of matching the Liberal pledge on a tuition cap, slightly outdoing them on provincial student loan interest (eliminating it on future and past loans, which is pretty much the textbook definition of “windfall gains”), and getting rid of fees for Adult Basic Education and English as a Second Language Program (which, you know, GOOD).  There’s also an oddly-worded pledge to provide a $1,000 completion grant “for graduates of university, college and skilled trades programs to help pay down their debt when their program finishes”: based on the costing and wording, I think that means the grant is restricted to those who have provincial student loans.

The NDP also has a second batch of policies around research – $50M over two years to create a graduate scholarship fund and $100M (over an unspecified period, but based on the costing, it’s more than two years) to fund expansion of technology-related programs in BC PSE institutions.  There is also an unspecified (and apparently uncosted) promise to expand tech-sector co-op programs.  Finally, they are also promising to match the Liberals on the issue of allowing universities to build student housing outside of provincial controls on capital spending.

Finally, there are the Greens, presently running at over 20% in the polls and with a real shot at achieving a significant presence in the legislature for the first time.  They have essentially two money promises: one, “to create a need-based grant system” (no further details) and two, an ungodly bad idea to create in BC the same graduate tax credit rebate that New Brunswick, Nova Scotia and now Manitoba all have had a shot at (at least those provinces had the excuse that they were trying to combat out-migration; what problem are the BC Greens trying to solve?).

Hilariously, the Greens’ price tag for these two items together is… $10 million.  Over three years.  Just to get a sense of how ludicrous that is, the Manitoba tax credit program cost $55 million a year in a province a quarter the size.  And within BC, the feds already give out about $75M a year in up-front grants.  So I think we need to credit the Greens with being more realistic than their federal cousins (remember the federal Green manifesto?  Oy.), but they have a ways to go on realistic budgeting.

(I am not doing a manifesto analysis for the BC Conservatives because a) they haven’t got one and b) I’ve been advised that if they do release one it will probably be printed in comic sans.)

What to make of all this?  Under Gordon Campbell, the Liberals were a party that “got” post-secondary education and did reasonably well by it; under Christy Clark it’s pretty clear PSE can at best expect benign neglect.  The Greens’ policies focus on price rather than quality, one of their two signature policies is inane and regressive, and their costing is off by miles.

That leaves the NDP.  I wouldn’t say this is a great manifesto, but it beats the other two.  Yeah, their student aid policies are sub-optimally targeted (they’re all for people who’ve already finished their programs, so not much access potential), but to their credit they’ve avoided going into a “tuition freezes are magic!” pose.  Alone among the parties, they are putting money into expansion and graduate studies and even if you don’t like the tech focus, that’s still something.

But on the whole, this is a weak set of manifestos.  I used to say that if I were going to run a university anywhere, I’d want it to be in British Columbia.  It’s the least-indebted jurisdiction in Canada, has mostly favourable demographics, and has easy access from both Asia (and its students) and the well-off American northwest.  And it’s got a diversified set of institutions which are mostly pretty good at what they do.  Why any province would want to neglect a set of institutions like that is baffling; but based on these manifestos, it seems clear that BC’s PSE sector isn’t getting a whole lot of love from any of the parties.  And that’s worrying for the province’s long-term future.

April 12

Access: A Canadian Success Story

Statscan put out a very important little paper on access to post-secondary education on Monday.  It got almost zero coverage despite conclusively putting to bed a number of myths about fees and participation, so I’m going to rectify that by explaining it to y’all in minute detail.

To understand this piece, you need to know something about a neat little Statscan tool called the Longitudinal Administrative Database (LAD).  LAD randomly selects one in five people filing an income tax form for the first time and follows them for their entire lifetime.  If, at the time someone first files a tax return, they have the same address as someone who is already in the LAD (and who is the right age to have a kid submitting a tax form for the first time), a link can be made between parent and child.  In other words, for roughly 4% of the population (a one-in-five chance the child is selected, times a one-in-five chance the parent is already in the file), LAD has data on both the individual and the parent, which allows some intergenerational analysis.  Now, because we have tax credits for post-secondary education (PSE), tax data allows us to know who went to post-secondary education and who did not (it can’t tell us what type of institution they attended, but we know that they did attend PSE).  And with LAD’s backward link to parents, that means we can measure attendance by parental income.

Got that?  Good.  Let’s begin.

The paper starts by looking at national trends in PSE participation (i.e. university and college combined) amongst 19-year-olds since 2001, by family income quintile.  Nationally, participation rates rose by just over 20%, from 52.6% to 63.8%.  They also rose for every quintile.  Even for youth in the lowest income quintile, participation is now very close to 50%.

Figure 1: PSE enrolment rates by Income Quintile, Canada, 2001-2014

This positive national story about rates by income quintile is somewhat offset by a more complex set of results for participation rates by region.  In the six eastern provinces, participation rates rose on average by 13.6 percentage points; in the four western provinces, they rose by just 2.8 percentage points (and in Saskatchewan they actually fell slightly).  The easy answer here is that it’s about the resource boom, but if that were the case, you’d expect to see a similar pattern in Newfoundland, and a difference within the west between Manitoba and the others.  In fact, neither is true: Manitoba is slightly below the western average, and Newfoundland had the country’s highest PSE participation growth rate.

Figure 2: PSE Participation rates by region, 2002-2014

(Actually, my favourite part of Figure 2 is the data showing that 19-year-old Quebecers – who mostly attend free CEGEPs – have a lower participation rate than 19-year-old Ontarians, who pay significant fees, albeit with the benefit of a good student aid system.)

But maybe the most interesting data here is with respect to the closing of the gap between the top and bottom income quintile.  Figure 3 shows the ratio of participation rates of students from the bottom quintile (Q1) to those from the top quintile (Q5), indexed to the ratio as it existed in 2001, for Canada and selected provinces.  So a larger number means Q1 students are becoming more likely to attend PSE relative to Q5s and a smaller number means they are becoming less likely.  Nationally, the gap has narrowed by about 15%, but the interesting story is actually at the provincial level.

Figure 3: Ratio of Q1 participation rates to Q5 participation rates, Canada and selected provinces, 2001-2014
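For anyone who wants the mechanics of Figure 3: the index is just the Q1/Q5 ratio in a given year divided by the same ratio in 2001.  A toy version, with participation rates I’ve made up purely for illustration:

```python
# Made-up participation rates, only to show how the Figure 3 index is built.
q1 = {2001: 0.38, 2014: 0.47}  # bottom-quintile participation rate
q5 = {2001: 0.73, 2014: 0.79}  # top-quintile participation rate

ratio = {yr: q1[yr] / q5[yr] for yr in q1}              # Q1 relative to Q5
index = {yr: 100 * ratio[yr] / ratio[2001] for yr in ratio}

# A value above 100 means the Q1-Q5 gap has narrowed relative to 2001.
print({yr: round(v, 1) for yr, v in index.items()})     # {2001: 100.0, 2014: 114.3}
```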

At the top end, what we find is that Newfoundland and Ontario are the provinces where the gap between rich and poor has narrowed the most.  Given that one of these provinces has the country’s highest tuition and the other the lowest, I think we can safely rule out tuition, on its own, as a plausible independent variable (especially as Quebec, the country’s other low-tuition province, posted no change over the period in question).  At the bottom end, we have the very puzzling case of Saskatchewan, where inequality appears to have got drastically worse over the past decade or so.  And again, though it’s tempting to reach for a resource boom explanation, nothing similar happened in Alberta so that’s not an obvious culprit.

Anyways, here’s why this work is important.  For decades, the usual suspects (the Canadian Federation of Students, the Canadian Centre for Policy Alternatives) have blazed with self-righteousness about the effects of higher tuition and higher debts (debt actually hasn’t increased that much in real terms since 2000, but whatever).  But it turns out there are no such effects.  After more than a decade in which tuition continued to rise slowly and average debt among those who borrow topped $25,000, not only did participation rates increase, but the participation rates of the poorest quintile rose fastest of all.

And – here’s the kicker – different provincial strategies on tuition appear to have had diddly-squat to do with it.  So the entire argument the so-called progressives make in favour of lower tuition is simply out the window.  That doesn’t mean they will change their position, of course.  They will continue to talk about the need to eliminate student debt because it is creating inequality (it’s actually the reverse, but whatever).  But of course, this makes the free-tuition position even sillier.  If the problem is simply student debt, then why advocate a policy in which over half your dollars go to people who have no debt?

It’s the Ontario result in particular that matters: it proves that a high-tuition/high-aid policy is compatible with a substantial widening of access.  And that’s good news for anyone who wants smart funding policies in higher education.

April 11

Populists and Universities, Round Two

There is a lot of talk these days about populists and universities.  There are all kinds of thinkpieces about “universities and Trump”, “universities and Brexit”, etc.  Just the other day, Sir Peter Scott delivered a lecture on “Populism and the Academy” at OISE, saying that over the past twelve months it has sometimes felt like universities were “on the wrong side of history”.

Speaking of history, one of the things I find a bit odd about this whole discussion is how little it is informed by the last time this happened – namely, the populist wave of the 1890s in the United States.  Though the populists never took power nationally, they did capture statehouses in many southern and western states, most of which had relatively recently taken advantage of the Morrill Act to establish important state universities.  And so we do have at least some historical record to work from – one that was very ably summarized by Scott Gelber in his book The University and the People.

The turn-of-the-20th-century populists wanted three things from universities.  First, they wanted them to be accessible to farmers’ children – by which they meant both laxer admissions standards and “cheap”.  That didn’t necessarily mean they wanted to increase university budgets substantially (though in practice universities did OK under populist governors and legislators); what it meant was they wanted tuition to remain low, and if that entailed universities having to tighten their belts, so be it.  And the legacy of the populists lives on today: average state tuition in the US still has a remarkable correlation to William Jennings Bryan’s share of the vote in the 1896 Presidential election.

 

Fig 1: 2014-15 In-State Tuition Versus William Jennings Bryan’s Vote Share in 1896

The second thing populists wanted was more “practical” education.  They were not into learning for the sake of learning, they were into learning for the sake of material progress and making life easier for workers and farmers; in many ways, one could argue that their attitude about the purpose of higher education was pretty close to that of Deng/Jiang-era China.  And to some extent they were pushing on an open door because the land-grant universities – particularly the A&Ms – were already supposed to have that mandate.

But there was a tension in the populists’ views on curriculum.  They weren’t crazy about law and humanities programs at state universities (too much useless high culture that divided the masses from the classes), but they did grasp that an awful lot of people who were successful in politics had gone through law and humanities programs and – so to speak – learned the tricks of the trade there (recall that rhetoric was one of the seven liberal arts, which still played a role in 19th-century curricula).  And so there was also concern that if public higher education were made too vocational, its beneficiaries would still be at a disadvantage politically.  There were various solutions to this problem, not all of which were to the benefit of humanities subjects, but the key point was this: universities should remain places where leaders are made.  If that meant reading some Marcus Aurelius, so be it: universities were a ladder into the ruling class, and the populists wanted to make sure their kids were on it.

And here, I think is where times have really changed. The new populists are, in a sense, more Gramscian than their predecessors.  They get that universities are ladders to power for individuals, but they also understand that the cultural function of universities goes well beyond that.  Universities are – perhaps even more so than the entertainment industry – arbiters of acceptable political discourse.  They are where the hegemonic culture is made.  And however much they may want their own kids to get a good education, today’s populists really want to smash those sources of cultural hegemony.

This is, obviously, not good for universities.  We can – as Peter Scott suggested – spend more time trying to make universities “relevant” to the communities that surround them.  Nothing wrong with that.  We can keep plugging away at access: that’s a given no matter who is in power.  But on the core issue of the culture of universities, there is no compromise.  Truth and open debate matter.  A commitment to the scientific method and free inquiry matter.  Sure, universities can exist without these things: see China, or Saudi Arabia.  But not here.  That’s what makes our universities different and, frankly, better.

No compromise, no pasarán.

April 10

Evaluating Teaching

The Ontario Confederation of University Faculty Associations (OCUFA) put out an interesting little piece the week before last summarizing the problems with student evaluations of teaching.  It contains a reasonable summary of the literature, and I thought some of it would be worth looking at here.

We’ve known for a while now that the results of student evaluations are statistically biased in various ways.  Perhaps the most important is that professors who mark more leniently get higher ratings from their students.  There is also the issue of what appears to be discrimination: female professors and visible-minority professors tend to get lower ratings than white men.  And then there’s the point OCUFA makes about the comments sections of these evaluations being a hotbed of statements which amount to harassment.  These points are all well worth making.

One might well ask: given that we all know about the problems with teaching evaluations, why in God’s name do institutions still use them?  Fair question.  Three hypotheses:

  1. Despite flaws in the statistical measurement of teaching, the comments actually do provide helpful feedback, which professors use to improve their teaching.
  2. When it comes to pay and promotion, research is weighted far more highly than teaching, so unless someone completely tanks their teaching evals – and by tanking I mean doing so much below par that it can’t reasonably be attributed to one of the biases listed above – they don’t really matter all that much (note: while this probably holds for tenured and tenure-track profs, I suspect the stakes are higher for sessionals).
  3. No matter how bad a measurement instrument they are, the idea that one wouldn’t take student opinions seriously is totally untenable, politically.

In other words, there are benefits despite the flaws, the consequences of flaws might not be as great as you think, and to put it bluntly, it’s not clear what the alternative is.  At least with student evaluations you can maintain the pretense that teaching matters to pay and promotion.  Kill those, and what have you got?  People already think professors don’t care enough about teaching.  Removing the one piece of measurement and accountability for teaching that exists in the system – no matter how flawed – is simply not on.

That’s not to say there aren’t alternative ways of measuring teaching.  One could imagine a system of peer evaluation, where professors rate one another.  Or one could imagine a system where the act of teaching and the act of marking are separated – and teachers are rated on how well their students perform.  It’s not obvious to me that professors would prefer such a system.

Besides, it’s not as though the current system can’t be redeemed.  Solutions exist.  If we know that easy markers get systematically better ratings, then normalize ratings based on the class average mark.  Same thing for gender and race: if you know what the systematic bias looks like, you can correct for it.  And as for ugly stuff in the comments section, it’s hardly rocket science to have someone edit the material for demeaning comments prior to handing it to the prof in question.
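As a sketch of what the leniency correction might look like – my illustration, not something from the OCUFA piece, and with invented course-level data – fit the relationship between class average mark and average rating across many courses, then judge each instructor on the residual:

```python
import numpy as np

# Made-up course-level data: class average mark vs. average student rating (/5).
avg_mark = np.array([62.0, 68.0, 71.0, 74.0, 79.0, 83.0])
avg_rating = np.array([3.4, 3.6, 3.8, 3.9, 4.2, 4.5])

# Estimate the leniency gradient: how much ratings rise per extra mark awarded.
slope, intercept = np.polyfit(avg_mark, avg_rating, deg=1)

# An instructor's adjusted score is the residual: how they rate relative to
# what their marking leniency alone would predict.
expected = intercept + slope * avg_mark
adjusted = avg_rating - expected
print(np.round(adjusted, 2))  # positive = rated better than leniency predicts
```

The same residual logic extends to any systematic bias you can measure, which is the point of the paragraph above.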

There’s one area where the OCUFA commentary goes beyond the evidence, however, and that’s in trying to translate the findings about student teaching evaluations (i.e. how did Professor X do in Class Y) to surveys of institutional satisfaction.  The argument they make here is that because the one is known to have certain biases, the other should never be used to make funding decisions.  Now, without necessarily endorsing the idea of using student satisfaction as a funding metric, this is terrible logic.  The two types of questionnaires are entirely different, ask different questions, and simply are not subject to the same kinds of biases.  It is deeply misleading to imply otherwise.

Still, all that said, it’s good that this topic is being brought into the spotlight.   Teaching is the most important thing universities do.  We should have better ways of measuring its impact.  If OCUFA can get us moving along that path, more power to them.

April 07

CEU and Academic Freedom

Let me tell you about this university in Europe.  It’s a small, private institution which specializes in the humanities and social sciences.  It’s run on western lines, and is one of the best institutions in the country for research.  And now the Government is trying to shut it down, mainly because it finds the institution politically troublesome.

Think I’m talking about Central European University (CEU) in Budapest? Well, I’m not. I’m talking about the European University of Saint Petersburg (EUSP), which has had its license to operate revoked mainly because of its program of studies on gender and LGBTQ issues. And I’m kind of interested in why we focus on one and not the other.

First, let’s get down to brass tacks about what’s going on at Central European University (CEU). This Budapest-based institution, founded by George Soros 25 years ago during the transition away from socialism, is a gem in the region. No fields of study were more corrupted by four decades of communist rule than the Social Sciences, and CEU has done a stellar job not just in becoming a top-notch institution in its own right, but in becoming a bastion of free thought in the region.

The Hungarian government, which not to put too fine a point on it is run by a bunch of nationalist ruffians, has decided to try to restrict CEU’s operations by legislating a set of provisions which in theory apply to all universities but in practice apply only to CEU. The most important of these provisions basically says that institutions which offer foreign-accredited degrees (CEU is accredited by the Middle States Commission, which handles most accreditation of overseas institutions) have to have a campus in their “home country” in order to be able to operate in Hungary and be subject to a formal bilateral agreement between the “home” government and the Hungarian one (CEU does business on the basis of an international agreement, but it’s between Hungary and the State of New York, not the USA). There is, as CEU’s President Michael Ignatieff (yes, him) says, simply no benefit to CEU to do this: it is simply a tactic to raise CEU’s cost of doing business.

So, as you’ve probably gathered by now, this is not an attack on academic freedom the way we would use that term in the west. We’re not talking about chilling individual scholars here. The ruling Hungarian coalition couldn’t care less what gets taught at CEU: what bothers them is that the institution exists to support liberalism and pluralism. What we’re talking about is something much broader than just academic freedom; it’s about weakening independent institutions in an illiberal state. It’s also about anti-semitism (the right wing in Hungary routinely refer to CEU as “Soros University” so as to remind everyone of the institution’s Jewish founder). Yet somehow, the rallying cry is “academic freedom”, when plain old freedom and liberalism would be much more accurate.

I wonder why we don’t hear cries for academic freedom for EUSP, where in fact the academic angle – the university’s research program in gender and queer studies being targeted by a homophobic state – is much more clear-cut.  Is it because we reckon Russia is beyond salvation and Hungary is not?  That would certainly explain our anemic reaction to increasing restrictions on academic freedom in China (where criticism of government is fine, but criticism of the Communist Party is likely to end extremely badly).  It would explain why Turkey has faced essentially no academic consequences (boycotts, etc.) for its ongoing purge of academic leaders.

I don’t mean to play the whole “why-do-we-grieve-bombings-in-Paris-but-not-Beirut” game.  I get it: some places matter more in the collective imagination than others.  But I actually think that CEU’s decision to portray this as an academic freedom issue rather than one of freedom tout court plays a role here.  We can get behind calls for academic freedom (particularly when they are articulated by English-speaking academics) because academic freedom is something that is everywhere and always being tested around the edges (yeah, McGill, I’m looking at you).  But calls for just plain old “freedom”?  Or “liberalism”?  The academy seems to get po-mo ickies about those.

Frankly, we need to get less squeamish about this. Academic freedom as we know it in the west does not exist in a vacuum. It exists because of underlying societal commitment to pluralism and liberalism. If we only try to defend the niche freedom without defending the underlying values, we will fail.

So, by all means, let’s support CEU. But let’s not do it just for academic freedom. Let’s do it for better reasons.

April 06

Lessons from Mid-Century Soviet Higher Education

I’ve been reading Benjamin Tromly’s excellent book Making the Soviet Intelligentsia: Universities and Intellectual Life under Stalin and Khrushchev. It’s full of fascinating tidbits with surprising relevance to higher education dilemmas of the here and now. To wit:

1) Access is mostly about cultural capital.

There were times and places where communists waged war on the educated, because the educated were by definition bourgeois. In China during the cultural revolution, or in places like Poland and East Germany after WWII, admission to higher education was effectively restricted to the children of “politically reliable classes”, meaning workers and peasants (if you wondered why urban Chinese parents are so OK with the punishing gaokao system, it’s because however insane and sadistic it seems, it’s better than what came before it).

But in the postwar Soviet Union, things were very different.  Because of the purges of the 1930s, a whole class of replacement white-collar functionaries had emerged, loyal to Stalin, and he wanted to reward them.  This he did by going in entirely the opposite direction to his east European satellite regimes and making access to higher education purely about academic “merit” as measured by exams and the like.  The result?  By 1952, in a regime with free tuition and universal stipends for students, roughly 80% of students had social origins in the professional classes (i.e. party employees, engineers, scientists, teachers and doctors).  The children of workers and farmers, who made up the overwhelming majority of the country’s population, had to make do with the other 20%.

2) The middle class will pull whatever strings are necessary to maintain their kids’ class position.

Khrushchev was not especially happy about the development of a hereditary intelligentsia, which made itself out to be morally superior because of its extra years of education.  Basically, he felt students were putting on airs and needed to be reminded that all the training they were receiving was in order to serve the working class, not to stand above it.  And so, in 1958, he tried to shake things up by slapping a requirement on university admissions that reserved 80 per cent of places for individuals who had spent two years in gainful employment.  This, he felt, would transform the student body and make it more at one with the toiling masses.

This had some predictably disastrous effects on admissions, as making people spend two years out of school before taking entrance exams tends to have fairly calamitous effects on exam results.  But while the measure did give a big leg up to the children of workers and peasants (their numbers at universities doubled after the change, though many dropped out soon afterwards due to inadequate preparation), what was interesting was how far the Moscow/Leningrad elites would go to try to rig the system in their children’s favour.  Some would try to get their children into two-year “mental labor” jobs, such as working as a lab assistant; others would find ways to falsify their children’s “production records”.  Eventually the policy was reversed, because the hard science disciplines argued the new system was undermining their ability to recruit the best and brightest.  But in the meantime, the intelligentsia managed to keep their share of enrolments above 50%, which was definitely not what Khrushchev wanted.

3) Institutional prestige is not a function of neo-liberalism.

We sometimes hear about how rankings and institutional prestige are all a product of induced competition, neo-liberalism, yadda yadda. Take one look at the accounts of Soviet students and you’ll know that’s nonsense. Prestige hierarchies exist everywhere, and in the mid-century Soviet Union, everyone knew that the place to study was Lomonosov Moscow State University, end of story.

Remember Joseph Fiennes’ final monologue in Enemy at the Gates?  “In this world, even a Soviet one, there will always be rich and poor. Rich in gifts, poor in gifts…”. It’s true of universities too. Pecking orders exist regardless of regime type.

4) The graduate labour market is about self-actualization.

One of the big selling points of the Soviet higher education system was the claim that “all graduates received a job at the end of their studies”. To the ears of western students from the 1970s onwards, who faced the potential of unemployment or underemployment after graduation, that sounded pretty good.

Except that it didn’t to Soviet students.  A lot of those “guaranteed” jobs took students either a long way from the studies they loved (“I trained to be a nuclear scientist and now you want me to teach secondary school?”), or a long way from the big cities they loved (“I’m being sent to which Siberian oblast?”), or both.  And failure to accept the job that was assigned was – in theory at least – punishable by imprisonment.

Yet despite the threat of punishment, Soviet students found a way to evade the rules. Getting married (preferably to someone from Moscow) was a good way to avoid being sent to the provinces. Many simply deserted their posts and found work elsewhere. And some – get this – enrolled in grad school to avoid a job they didn’t want (would never happen here of course).

The point here being: people have dreams for themselves, and these rarely match up neatly with the labour market, whether that market is free or planned. There’s no system in the world that can satisfy everyone; at some point, all systems have to disappoint at least some people. But that doesn’t mean they will take their disappointment lying down. Dreams are tough to kill.

 

April 05

Student/Graduate Survey Data

This is my last thought on data for a while, I promise.  But I want to talk a little bit today about the increasing misuse of student and graduate surveys.

Back about 15 years ago, the relevant technology for email surveys became sufficiently cheap and ubiquitous that everyone started using them.  I mean, everyone.  So what has happened over the last decade and a half has been a proliferation of surveys and with it – surprise, surprise – a steady decline in survey response rates.  We know that these low-participation surveys (nearly all are below 50%, and most are below 35%) are reliable, in the sense that they give us similar results year after year.  But we have no idea whether they are accurate, because we have no way of dealing with response bias.

Now, every once in a while you get someone with the cockamamie idea that the way to deal with low response rates is to expand the sample.  Remember how we all laughed at Tony Clement when he claimed the (voluntary) National Household Survey would be better than the (mandatory) Long-Form Census because the sample size would be larger?  Fun times.  But this is effectively what governments do when they decide – as the Ontario government did in the case of its sexual assault survey – to carry out what amounts to a (voluntary) student census.

So we have a problem: even as we want to make policy on a more data-informed basis, we face the problem that the quality of student data is decreasing (this also goes for graduate surveys, but I’ll come back to those in a second).  Fortunately, there is an answer to this problem: interview fewer students, but pay them.

What every institution should do – and frankly what every government should do as well – is create a balanced, stratified panel of about 1000 students.   And it should pay them maybe $10/survey to complete surveys throughout the year.  That way, you’d have good response rates from a panel that actually represented the student body well, as opposed to the crapshoot which currently reigns.  Want accurate data on student satisfaction, library/IT usage, incidence of sexual assault/harassment?  This is the way to do it.  And you’d also be doing the rest of your student body a favour by not spamming them with questionnaires they don’t want.
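A minimal sketch of how such a panel might be drawn.  The roster columns and strata here are hypothetical stand-ins for whatever registrarial data an institution actually holds, and a real design would also handle refusals, attrition and re-weighting:

```python
import pandas as pd

def build_panel(roster: pd.DataFrame, panel_size: int = 1000,
                strata: tuple = ("faculty", "year_of_study")) -> pd.DataFrame:
    """Draw a random panel whose strata mirror the student body proportionally."""
    cols = list(strata)
    shares = roster.groupby(cols).size() / len(roster)   # each stratum's share of enrolment
    quotas = (shares * panel_size).round().astype(int)   # panel seats per stratum
    # Sample each stratum's quota at random from within that stratum.
    # (Rounding means the total can land slightly off panel_size; fine for a sketch.)
    parts = [grp.sample(n=min(quotas[key], len(grp)), random_state=1)
             for key, grp in roster.groupby(cols)]
    return pd.concat(parts)

# Usage: panel = build_panel(registrar_df); then survey the same ~1,000
# students all year, paying them per completed questionnaire.
```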

(Costly?  Yes.  Good data ain’t free.  Institutions that care about good data will suck it up).

It’s a slightly different story for graduate surveys.  Here, you also have a problem of response rates, but with the caveat that, at least as far as employment and income data are concerned, we aren’t going to have that problem for much longer.  You may be aware of Ross Finnie’s work linking student data to tax data to work out long-term income paths.  An increasing number of institutions are now doing this, as indeed is Statistics Canada for future versions of its National Graduate Survey (I give Statscan hell, deservedly, but for this they deserve kudos).

So now that we’re going to have excellent, up-to-date data on employment and income, we can re-orient our whole approach to graduate surveys.  We can move away from attempted censuses with a couple of not totally convincing questions about employment and re-shape them into what they should be: much more qualitative explorations of graduate pathways.  Give me a stratified sample of 2,000 graduates explaining in detail how they went from being a student to having a career (or not) three years later, rather than asking 50,000 students a closed-ended question about whether their job is “related” to their education, any day of the week.  The latter is a boring box-checking exercise; the former offers the potential for real understanding and improvement.

(And yeah, again: pay your survey respondents for their time.  The American Department of Education does it on their surveys and they get great data.)

Bottom line: We need to get serious about ending the Tony Clement-icization of student/graduate data. That means getting serious about constructing better samples, incentivizing participation, and asking better questions (particularly of graduates).  And there’s no time like the present. If anyone wants to get serious about this discussion, let me know: I’d be overjoyed to help.

April 04

How to Think about “Better Higher Education Data”

Like many people, I am in favour of better data on the higher education sector.  But while this call unites a lot of people, there is remarkably little thinking that goes into the question of how to achieve it.  This is a problem, because unless we arrive at a better common understanding of both the cost and the utility of different kinds of data, we are going to remain stuck in our current position.

First, we need to ask ourselves what data we need to know versus what kinds of data it would be nice to know.  This is, of course, not a value-free debate: people can have legitimate differences about what data is needed and what is not.  But I think a simple way to at least address this problem is to ask of any proposed data collection: i) what questions does this data answer?  And ii) what would we do differently if we knew the answer to that question?  If the answer to either question is vague, maybe we should put less emphasis on the data.

In fact, I’d argue that most of the data that institutions and government are keen to push out are pretty low on the “what would we do differently” scale.  Enrolments (broken down in various ways), funding, etc – those are all inputs.  We have all that data, and they’re important: but they don’t tell you much about what’s going right or going wrong in the system.  They tell you what kind of car you’re driving, but not your speed or direction.

What’s the data we need?  Outputs.  Completion rates.  Transition rates.  At the program, institutional and system level.  Also outcomes: what happens to graduates?  How quickly do they transition to permanent careers?  And do they feel their educational career was a help or a hindrance in getting the career they wanted?  And yeah, by institution and (within reason) by program.  We have some of this data in some parts of the country (BC is by far the best at this), but even there we rely far too heavily on some fairly clumsy quantitative indicators and not enough on qualitative information like: “what do graduates three years out think the most/least beneficial part of their program was?”

Same thing on research.  We need better data on PhD outcomes.  We need a better sense of the pros and cons of more/smaller grants versus fewer/larger ones.  We need a better sense of how knowledge is actually transferred from institutions to firms, and what firms do with that knowledge in terms of turning it into product or process innovations.  Or, arguably, on community impact (though there I’m not completely convinced we even know yet what the right questions are).

Very few of these questions can be answered through big national statistical datasets about higher education.  Even when it comes to questions like access to education, it’s probably far more important that we have more data on people who do not go to PSE than better or speedier access to enrolment data.  And yet we have nothing on this, and haven’t had since the Youth in Transition Survey ended.  But that kind of survey is expensive, and could cost upwards of ten million dollars.  Smaller-scale, local studies could be done for a fraction of the cost – if someone were willing to fund them.

There are actually enormous data resources available at the provincial and institutional level to work from.  Want to find individuals who finished high school but didn’t attend PSE?  Most provinces now have individual student numbers which could be used to identify these individuals and bring them into a study.  Want to look at program completion rates?  All institutions have the necessary data: they just choose not to release it.

All of which is to say: we could have a data revolution in this country.  But it’s not going to come primarily from better national data sets run by Statistics Canada.  It’s going to come from lots of little studies creating a more data-rich environment for decision-making.  It’s going to come from governments and institutions changing their data mindsets from one where hoarding data is the default to one where publishing is the default.  It’s going to come from switching focus from inputs to outputs.

Even more succinctly: we are all part of the problem.  We are all part of the solution.  Stop waiting for someone else to come along and fix it.
