HESA

Higher Education Strategy Associates

Author Archives: Alex Usher

April 18

Naylor Report, Take 1


People are asking why I haven’t talked about the Naylor Report (aka the Review of Fundamental Science) yet.  The answer, briefly, is i) I’m swamped ii) there’s a lot to talk about in there and iii) I want to have some time to think it over.  But I did have some thoughts about chapter 3, where I think there is either an inadvertent error or the authors are trying to pull a fast one (and if it’s the latter I apologize for narking on them).  So I thought I would start there.

The main message of chapter 3 is that the government of Canada is not spending enough on inquiry-driven research in universities (this was not, incidentally, a question the Government of Canada asked of the review panel, but the panel answered it anyway).  One of the ways that the panel argues this point is that while Canada has among the world’s highest levels of Research and Development spending in the higher education sector – known as HERD if you’re in the R&D policy nerdocracy – most of the money for this comes from higher education institutions themselves and not the federal government.  This, they say, is internationally anomalous and a reason why the federal government should spend more money.

Here’s the graph they use to make this point:

Naylor Report

Hmm.  Hmmmmm.

So, there are really two problems here.  The first is that HERD can be calculated differently in different countries for completely rational reasons.  Let me give you the example of Canada vs. the US.  In Canada, the higher education portion of the contribution to HERD is composed of two things: i) aggregate faculty salaries times the proportion of time profs spend on research (Statscan occasionally does surveys on this – I’ll come back to it in a moment) plus ii) some imputation about unrecovered research overhead.  In the US, it’s just the latter.  Why?  Because the way the US collects data on HERD, the only faculty costs they capture are the chunks taken out of federal research grants.  Remember, in the US, profs are only paid 9 months per year and at least in the R&D accounts, that’s *all* teaching.  Only the pieces of research grants they take out as summer salary get recorded as R&D expenditure (and also hence as a government-sponsored cost rather than a higher education-sponsored one).

But there’s a bigger issue here.  If one wants to argue that what matters is the ratio of the federal portion of HERD to the higher-education portion of HERD, then it’s worth remembering what’s going on in the denominator.  Aggregate salaries are the first component.  The second component is research intensity, as measured through surveys.  This appears to be going up over time.  In 2000, Statscan did a survey which seemed to show the average prof spending somewhere between 30% and 35% of their time on research.  A more recent survey shows that this has risen to 42%.  I am not sure whether this latest coefficient has been factored into the most recent HERD data, but when it is, it will show a major jump in higher education “spending” (or “investment”, if you prefer) on research, despite nothing really having changed at all (possibly it already has been, and that is what explains the bump in expenditures in 2012-13).
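To make the denominator mechanics concrete, here is a minimal sketch of the calculation described above.  All dollar figures and the function name are invented for illustration; only the structure (aggregate salaries times the research-time coefficient, plus an overhead imputation) follows the description.

```python
# Hypothetical illustration of the higher-education-sponsored HERD component.
# Figures are invented; the structure follows the Canadian method described:
# (aggregate faculty salaries x research-time share) + overhead imputation.

def he_sponsored_herd(aggregate_salaries, research_share, overhead_imputation):
    """Higher-education-sponsored HERD, per the method described above."""
    return aggregate_salaries * research_share + overhead_imputation

salaries = 10_000_000_000   # aggregate faculty salaries (hypothetical)
overhead = 2_000_000_000    # unrecovered overhead imputation (hypothetical)

old = he_sponsored_herd(salaries, 0.33, overhead)  # 2000-era survey coefficient
new = he_sponsored_herd(salaries, 0.42, overhead)  # more recent coefficient

print(f"Old coefficient: ${old/1e9:.1f}B; new: ${new/1e9:.1f}B "
      f"({(new - old)/old:+.1%} with no change in actual activity)")
```

With these made-up numbers, simply swapping in the newer research-time coefficient inflates measured higher-education “spending” by about 17%, which is the point: the denominator moves when the survey moves, not when anything real changes.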

What the panel ends up arguing is for federal funding to run more closely in tune with higher education’s own “spending”.  But in practice what this means is: every time profs get a raise, federal funding would have to rise to keep pace.  Every time profs decide – for whatever reasons – to spend more time on research, federal funds should rise to keep pace.  And no doubt that would be awesome for all concerned, but come on.  Treasury Board would have conniptions if someone tried to sell that as a funding mechanism.

None of which is to say federal funding on inquiry-driven research shouldn’t rise.  Just to say that using data on university-funded HERD might not be a super-solid base from which to argue that point.

April 12

Access: A Canadian Success Story

Statscan put out a very important little paper on access to post-secondary education on Monday.  It got almost zero coverage despite conclusively putting to bed a number of myths about fees and participation, so I’m going to rectify that by explaining it to y’all in minute detail.

To understand this piece, you need to know something about a neat little Statscan tool called the Longitudinal Administrative Database (LAD).  Every time someone files an income tax form for the first time, LAD randomly selects one in five of them and follows them for their entire lifetime.  If at the time someone first files a tax return they have the same address as someone who is already in the LAD (and who is the right age to have a kid submitting a tax form for the first time), one can make a link between a parent and child.  In other words, for roughly 4% of the population, LAD has data on both the individual and the parent, which allows some intergenerational analysis.  Now, because we have tax credits for post-secondary education (PSE), tax data allows us to know who went to post-secondary education and who did not (it can’t tell us what type of institution they attended, but we know that they did attend PSE).  And with LAD’s backward link to parents, it means we can measure attendance by parental income.
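The “roughly 4%” figure falls straight out of the one-in-five sampling applied independently to both generations.  A quick simulation, with an invented population size, just to illustrate the arithmetic:

```python
import random

random.seed(42)
N = 1_000_000  # hypothetical number of parent-child pairs
SAMPLE = 0.2   # LAD's one-in-five selection rate

# A pair supports intergenerational analysis only if BOTH the parent and
# the child were independently selected into LAD when they first filed.
linked = sum(1 for _ in range(N)
             if random.random() < SAMPLE and random.random() < SAMPLE)
print(f"Linked share: {linked / N:.1%}")  # ~4%, i.e. 0.2 x 0.2
```

This is the mechanical reason the intergenerational subsample is so much smaller than the 20% base sample.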

Got that?  Good.  Let’s begin.

The paper starts by looking at national trends in PSE participation (i.e. university and college combined) amongst 19 year-olds since 2001, by family income quintile.  Nationally, participation rates rose by just over 20% in relative terms, from 52.6% to 63.8%.  They also rose for every quintile.  Even for youth in the lowest income quintile, participation is now very close to 50%.

 Figure 1: PSE enrolment rates by Income Quintile, Canada 2001-2014

PSE by Income Quintile

This positive national story about rates by income quintile is somewhat offset by a more complex set of results for participation rates by region.  In the six eastern provinces, participation rates rose on average by 13.6 percentage points; in the four western provinces, they rose by just 2.8 percentage points (and in Saskatchewan they actually fell slightly).  The easy answer here is that it’s about the resource boom, but if that were the case, you’d expect to see a similar pattern in Newfoundland, and a difference within the west between Manitoba and the others.  In fact, neither is true: Manitoba is slightly below the western average and Newfoundland had the country’s highest PSE participation growth rate.

 Figure 2: PSE Participation rates by region, 2002-2014

PSE by region

(Actually, my favourite part of figure 2 is the data showing that 19 year-old Quebecers – who mostly attend free CEGEPs – have a lower participation rate than 19 year-old Ontarians, who pay significant fees, albeit with the benefit of a good student aid system.)

But maybe the most interesting data here is with respect to the closing of the gap between the top and bottom income quintile.  Figure 3 shows the ratio of participation rates of students from the bottom quintile (Q1) to those from the top quintile (Q5), indexed to the ratio as it existed in 2001, for Canada and selected provinces.  So a larger number means Q1 students are becoming more likely to attend PSE relative to Q5s and a smaller number means they are becoming less likely.  Nationally, the gap has narrowed by about 15%, but the interesting story is actually at the provincial level.

Figure 3: Ratio of Q1 participation rates to Q5 participation rates, Canada and selected provinces, 2001-2014

Q1 to Q5 participation rates

At the top end, what we find is that Newfoundland and Ontario are the provinces where the gap between rich and poor has narrowed the most.  Given that one of these provinces has the country’s highest tuition and the other the lowest, I think we can safely rule out tuition, on its own, as a plausible independent variable (especially as Quebec, the country’s other low-tuition province, posted no change over the period in question).  At the bottom end, we have the very puzzling case of Saskatchewan, where inequality appears to have got drastically worse over the past decade or so.  And again, though it’s tempting to reach for a resource boom explanation, nothing similar happened in Alberta so that’s not an obvious culprit.
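The construction of that indexed ratio is simple enough to sketch.  The rates below are entirely hypothetical; the point is only to show how “larger number = gap narrowing” works when the Q1/Q5 ratio is indexed to its 2001 value:

```python
def indexed_ratio(q1_rate, q5_rate, q1_base, q5_base):
    """Q1/Q5 participation ratio, indexed so the 2001 value equals 1.0."""
    return (q1_rate / q5_rate) / (q1_base / q5_base)

# Hypothetical: Q1 goes from 30% to 40%, Q5 from 75% to 80%, 2001 to 2014.
print(indexed_ratio(0.40, 0.80, 0.30, 0.75))  # 1.25: the gap narrowed ~25%
```

A value above 1 means the bottom quintile gained ground relative to the top quintile; below 1 means it lost ground, as in the Saskatchewan case.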

Anyways, here’s why this work is important.  For decades, the usual suspects (the Canadian Federation of Students, the Canadian Centre for Policy Alternatives) have blazed with self-righteousness about the effects of higher tuition and higher debts (debt actually hasn’t increased that much in real terms since 2000, but whatever).  But it turns out there are no such effects.  After more than a decade in which tuition continued to rise slowly and average debt among those who borrow exceeded $25,000, not only did participation rates increase, but participation rates of the poorest quintile rose fastest of all.

And – here’s the kicker – different provincial strategies on tuition appear to have had diddly-squat to do with it.  So the entire argument the so-called progressives make in favour of lower tuition is simply out the window.  That doesn’t mean they will change their position, of course.  They will continue to talk about the need to eliminate student debt because it is creating inequality (it’s actually the reverse, but whatever).  But of course, this makes the free-tuition position even sillier.  If the problem is simply student debt, then why advocate a policy in which over half your dollars go to people who have no debt?

It’s the Ontario result in particular that matters: it proves that a high-tuition/high-aid policy is compatible with a substantial widening of access.  And that’s good news for anyone who wants smart funding policies in higher education.

April 11

Populists and Universities, Round Two

There is a lot of talk these days about populists and universities.  There are all kinds of thinkpieces about “universities and Trump”, “universities and Brexit”, etc.  Just the other day, Sir Peter Scott delivered a lecture on “Populism and the Academy” at OISE, saying that over the past twelve months it has sometimes felt like universities were “on the wrong side of history”.

Speaking of history, one of the things that I find a bit odd about this whole discussion is how little the present discussion is informed by the last time this happened – namely, the populist wave of the 1890s in the United States.  Though the populists never took power nationally, they did capture statehouses in many southern and western states, most of which had relatively recently taken advantage of the Morrill Act to establish important state universities.  And so we do have at least some historical record to work from – one that was very ably summarized by Scott Gelber in his book The University and the People.

The turn-of-the-20th-century populists wanted three things from universities. First, they wanted them to be accessible to farmers’ children – by which they meant both laxer admissions standards and “cheap”.  That didn’t necessarily mean they wanted to increase expenditures on university budgets substantially (though in practice universities did OK under populist governors and legislators); what it meant was they wanted tuition to remain low and if that entailed universities having to tighten their belts, so be it.  And the legacy of the populists lives on today: average state tuition in the US still has a remarkable correlation to William Jennings Bryan’s share of the vote in the 1896 Presidential election.

 

Fig 1: 2014-15 In-State Tuition Versus William Jennings Bryan’s Vote Share in 1896

Populism Graph

 

The second thing populists wanted was more “practical” education.  They were not into learning for the sake of learning, they were into learning for the sake of material progress and making life easier for workers and farmers; in many ways, one could argue that their attitude about the purpose of higher education was pretty close to that of Deng/Jiang-era China.  And to some extent they were pushing on an open door because the land-grant universities – particularly the A&Ms – were already supposed to have that mandate.

But there was a tension in the populists’ views on curriculum.  They weren’t crazy about law and humanities programs at state universities (too much useless high culture that divided the masses from the classes), but they did grasp that an awful lot of people who were successful in politics had gone through law and humanities programs and – so to speak – learned the tricks of the trade there (recall that rhetoric was one of the seven liberal arts, which still played a role in 19th century curricula).  And so, there was also concern that if public higher education were made too vocational, its beneficiaries would still be at a disadvantage politically.  There were various solutions to this problem, not all of which were to the benefit of humanities subjects, but the key point was this: universities should remain places where leaders are made.  If that meant reading some Marcus Aurelius, so be it: universities were a ladder into the ruling class, and the populists wanted to make sure their kids were on it.

And here, I think is where times have really changed. The new populists are, in a sense, more Gramscian than their predecessors.  They get that universities are ladders to power for individuals, but they also understand that the cultural function of universities goes well beyond that.  Universities are – perhaps even more so than the entertainment industry – arbiters of acceptable political discourse.  They are where the hegemonic culture is made.  And however much they may want their own kids to get a good education, today’s populists really want to smash those sources of cultural hegemony.

This is, obviously, not good for universities.  We can – as Peter Scott suggested – spend more time trying to make universities “relevant” to the communities that surround them.  Nothing wrong with that.  We can keep plugging away at access: that’s a given no matter who is in power.  But on the core issue of the culture of universities, there is no compromise.  Truth and open debate matter.  A commitment to the scientific method and free inquiry matter.  Sure, universities can exist without these things: see China, or Saudi Arabia.  But not here.  That’s what makes our universities different and, frankly, better.

No compromise, no pasarán.

April 07

CEU and Academic Freedom

Let me tell you about this university in Europe. It’s a small, private institution which specializes in the humanities and social sciences. It’s run on western lines, and is one of the best institutions in the country for research. And now the Government is trying to shut it down, mainly because it finds the institution politically troublesome.

Think I’m talking about Central European University (CEU) in Budapest? Well, I’m not. I’m talking about the European University of Saint Petersburg (EUSP), which has had its license to operate revoked mainly because of its program of studies on gender and LGBTQ issues. And I’m kind of interested in why we focus on one and not the other.

First, let’s get down to brass tacks about what’s going on at Central European University (CEU). This Budapest-based institution, founded by George Soros 25 years ago during the transition away from socialism, is a gem in the region. No fields of study were more corrupted by four decades of communist rule than the Social Sciences, and CEU has done a stellar job not just in becoming a top-notch institution in its own right, but in becoming a bastion of free thought in the region.

The Hungarian government, which not to put too fine a point on it is run by a bunch of nationalist ruffians, has decided to try to restrict CEU’s operations by legislating a set of provisions which in theory apply to all universities but in practice apply only to CEU. The most important of these provisions basically says that institutions which offer foreign-accredited degrees (CEU is accredited by the Middle States Commission, which handles most accreditation of overseas institutions) have to have a campus in their “home country” in order to be able to operate in Hungary and be subject to a formal bilateral agreement between the “home” government and the Hungarian one (CEU does business on the basis of an international agreement, but it’s between Hungary and the State of New York, not the USA). There is, as CEU’s President Michael Ignatieff (yes, him) says, simply no benefit to CEU to do this: it is simply a tactic to raise CEU’s cost of doing business.

So, as you’ve probably gathered by now, this is not an attack on academic freedom the way we would use that term in the west. We’re not talking about chilling individual scholars here. The ruling Hungarian coalition couldn’t care less what gets taught at CEU: what bothers them is that the institution exists to support liberalism and pluralism. What we’re talking about is something much broader than just academic freedom; it’s about weakening independent institutions in an illiberal state. It’s also about anti-semitism (the right wing in Hungary routinely refer to CEU as “Soros University” so as to remind everyone of the institution’s Jewish founder). Yet somehow, the rallying cry is “academic freedom”, when plain old freedom and liberalism would be much more accurate.

I wonder why we don’t hear cries for academic freedom for EUSP, where in fact the academic angle – the university’s research program in gender and queer studies being targeted by a homophobic state – is much more clear cut. Is it because we reckon Russia is beyond salvation and Hungary is not? That would certainly explain our anemic reaction to increasing restrictions on academic freedom in China (where criticism of government is fine, but criticism of the Communist Party is likely to end extremely badly). It would explain why Turkey has faced essentially no academic consequences (boycotts, etc.) for its ongoing purge of academic leaders.

I don’t mean to play the whole “why-do-we-grieve-bombings-in-Paris-but-not-Beirut” game. I get it, some places matter more in the collective imagination than others. But I actually think that CEU’s decision to portray this as an academic freedom issue rather than one of freedom tout court plays a role here. We can get behind calls for academic freedom (particularly when they are articulated by English-speaking academics) because academic freedom is something that is everywhere and always being tested around the edges (yeah, McGill, I’m looking at you). But calls for just plain old “freedom”? Or “liberalism”? The academy seems to get po-mo ickies about those.

Frankly, we need to get less squeamish about this. Academic freedom as we know it in the west does not exist in a vacuum. It exists because of underlying societal commitment to pluralism and liberalism. If we only try to defend the niche freedom without defending the underlying values, we will fail.

So, by all means, let’s support CEU. But let’s not do it just for academic freedom. Let’s do it for better reasons.

April 06

Lessons from Mid-Century Soviet Higher Education

I’ve been reading Benjamin Tromly’s excellent book Making the Soviet Intelligentsia: Universities and Intellectual Life under Stalin and Khrushchev. It’s full of fascinating tidbits with surprising relevance to higher education dilemmas of the here and now. To wit:

1) Access is mostly about cultural capital.

There were times and places where communists waged war on the educated, because the educated were by definition bourgeois. In China during the cultural revolution, or in places like Poland and East Germany after WWII, admission to higher education was effectively restricted to the children of “politically reliable classes”, meaning workers and peasants (if you wondered why urban Chinese parents are so OK with the punishing gaokao system, it’s because however insane and sadistic it seems, it’s better than what came before it).

But in the postwar Soviet Union, things were very different. Because of the purges of the 1930s, a whole class of replacement white-collar functionaries had emerged, loyal to Stalin, and he wanted to reward them. This he did by going entirely the opposite direction to his east European satellite regimes and making access to higher education purely about academic “merit” as measured by exams and the like. The result? By 1952, in a regime with free tuition and universal stipends for students, roughly 80% of students had social origin in the professional classes (i.e. party employees, engineers, scientists, teachers and doctors). The children of workers and farmers, who made up the overwhelming majority of the country’s population, had to make do with just the other 20%.

2)  The middle class will pull whatever strings necessary to maintain their kids’ class position.

Khrushchev was not especially happy about the development of a hereditary intelligentsia, which made itself out to be morally superior because of its extra years of education. Basically, he felt students were putting on airs and needed to be reminded that all that training they were receiving was in order to serve the working class, not to stand above it. And so, in 1958, he tried to shake things up by slapping a requirement on university admissions that reserved 80 per cent of places for individuals who had spent two years in gainful employment. This, he felt, would transform the student body and make it more at one with the toiling masses.

This had some predictably disastrous effects on admissions, as making people spend two years out of school before taking entrance exams tends to have fairly calamitous effects on exam results. But while the measure did give a big leg up to the children of workers and peasants (their numbers at universities doubled after the change, though many dropped out soon afterwards due to inadequate preparation), what was interesting was how far the Moscow/Leningrad elites would go to try to rig the system in their children’s favour. Some would try to get their children into two-year “mental labor” jobs such as working as a lab assistant; others would find ways to falsify their children’s “production records”. Eventually the policy was reversed because the hard science disciplines argued the new system was undermining their ability to recruit the best and brightest. But in the meantime, the intelligentsia managed to keep their share of enrolments above 50%, which was definitely not what Khrushchev wanted.

3) Institutional prestige is not a function of neo-liberalism.

We sometimes hear about how rankings and institutional prestige are all a product of induced competition, neo-liberalism, yadda yadda. Take one look at the accounts of Soviet students and you’ll know that’s nonsense. Prestige hierarchies exist everywhere, and in the mid-century Soviet Union, everyone knew that the place to study was Lomonosov Moscow State University, end of story.

Remember Joseph Fiennes’ final monologue in Enemy at the Gates?  “In this world, even a Soviet one, there will always be rich and poor. Rich in gifts, poor in gifts…”. It’s true of universities too. Pecking orders exist regardless of regime type.

4) The graduate labour market is about self-actualization

One of the big selling points of the Soviet higher education system was the claim that “all graduates received a job at the end of their studies”. To the ears of western students from the 1970s onwards, who faced the potential of unemployment or underemployment after graduation, that sounded pretty good.

Except that it didn’t to Soviet students. A lot of those “guaranteed” jobs took students a long way from the subjects they loved (“I trained to be a nuclear scientist and now you want me to teach secondary school?”) or from the big cities they loved (“I’m being sent to which Siberian oblast?”) or both. And failure to accept the job that was assigned was – in theory at least – punishable by imprisonment.

Yet despite the threat of punishment, Soviet students found a way to evade the rules. Getting married (preferably to someone from Moscow) was a good way to avoid being sent to the provinces. Many simply deserted their posts and found work elsewhere. And some – get this – enrolled in grad school to avoid a job they didn’t want (would never happen here of course).

The point here being: people have dreams for themselves, and these rarely match up neatly with the labour market, whether that market is free or planned. There’s no system in the world that can satisfy everyone; at some point, all systems have to disappoint at least some people. But that doesn’t mean they will take their disappointment lying down. Dreams are tough to kill.

 

April 05

Student/Graduate Survey Data

This is my last thought on data for a while, I promise.  But I want to talk a little bit today about the increasing misuse of student and graduate surveys, and what we’re doing wrong.

Back about 15 years ago, the relevant technology for email surveys became sufficiently cheap and ubiquitous that everyone started using them.  I mean, everyone.  So what has happened over the last decade and a half has been a proliferation of surveys and with it – surprise, surprise – a steady decline in survey response rates.  We know that these low-participation surveys (nearly all are below 50%, and most are below 35%) are reliable, in the sense that they give us similar results year after year.  But we have no idea whether they are accurate, because we have no way of dealing with response bias.

Now, every once in a while you get someone with the cockamamie idea that the way to deal with low response rates is to expand the sample.  Remember how we all laughed at Tony Clement when he claimed the (voluntary) National Household Survey would be better than the (mandatory) Long-Form Census because the sample size would be larger?  Fun times.  But this is effectively what governments do when they decide – as the Ontario government did in the case of its sexual assault survey – to carry out what amounts to a (voluntary) student census.

So we have a problem: even as we want to make policy on a more data-informed basis, we face the problem that the quality of student data is decreasing (this also goes for graduate surveys, but I’ll come back to those in a second).  Fortunately, there is an answer to this problem: interview fewer students, but pay them.

What every institution should do – and frankly what every government should do as well – is create a balanced, stratified panel of about 1000 students.   And it should pay them maybe $10/survey to complete surveys throughout the year.  That way, you’d have good response rates from a panel that actually represented the student body well, as opposed to the crapshoot which currently reigns.  Want accurate data on student satisfaction, library/IT usage, incidence of sexual assault/harassment?  This is the way to do it.  And you’d also be doing the rest of your student body a favour by not spamming them with questionnaires they don’t want.

(Costly?  Yes.  Good data ain’t free.  Institutions that care about good data will suck it up).
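A back-of-envelope sketch of how such a panel might be drawn, with each stratum represented in proportion to its share of the student body.  The strata, counts, and function name here are entirely hypothetical:

```python
import random

def stratified_panel(students_by_stratum, panel_size=1000, seed=1):
    """Draw a panel with each stratum represented proportionally.
    `students_by_stratum` maps a stratum label to a list of student IDs."""
    rng = random.Random(seed)
    total = sum(len(s) for s in students_by_stratum.values())
    panel = []
    for label, students in students_by_stratum.items():
        # Allocate seats to this stratum in proportion to its size.
        k = round(panel_size * len(students) / total)
        panel.extend(rng.sample(students, k))
    return panel

# Hypothetical student body of 20,000 across three strata.
body = {"full-time UG": [f"ug{i}" for i in range(14_000)],
        "part-time UG": [f"pt{i}" for i in range(4_000)],
        "graduate":     [f"gr{i}" for i in range(2_000)]}

panel = stratified_panel(body)
print(len(panel))  # 1000 panellists: 700 + 200 + 100
```

A real design would also stratify on things like faculty, year of study, and campus, and would need a top-up rule when proportional rounding doesn’t sum exactly to the panel size; this just shows the basic shape.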

It’s a slightly different story for graduate surveys.  Here, you also have a problem of response rates, but with the caveat that at least as far as employment and income data are concerned, we aren’t going to have that problem for much longer.  You may be aware of Ross Finnie’s work linking student data to tax data to work out long-term income paths.  An increasing number of institutions are now doing this, as indeed is Statistics Canada for future versions of its National Graduate Survey (I give Statscan hell, deservedly, but for this they deserve kudos).

So now that we’re going to have excellent, up-to-date data on employment and income, we can re-orient our whole approach to graduate surveys.  We can move away from attempted censuses with a couple of not totally convincing questions about employment and re-shape them into what they should be: much more qualitative explorations of graduate pathways.  Give me a stratified sample of 2,000 graduates explaining in detail how they went from being a student to having a career (or not) three years later, rather than asking 50,000 students a closed-ended question about whether their job is “related” to their education, any day of the week.  The latter is a boring box-checking exercise; the former offers the potential for real understanding and improvement.

(And yeah, again: pay your survey respondents for their time.  The American Department of Education does it on their surveys and they get great data.)

Bottom line: We need to get serious about ending the Tony Clement-icization of student/graduate data. That means getting serious about constructing better samples, incentivizing participation, and asking better questions (particularly of graduates).  And there’s no time like the present. If anyone wants to get serious about this discussion, let me know: I’d be overjoyed to help.

April 04

How to Think about “Better Higher Education Data”

Like many people, I am in favour of better data on the higher education sector.  But while this call unites a lot of people, there is remarkably little thinking that goes into the question of how to achieve it.  This is a problem, because unless we arrive at a better common understanding of both the cost and the utility of different kinds of data, we are going to remain stuck in our current position.

First, we need to ask ourselves what data we need to know versus what kinds of data it would be nice to know.  This is, of course, not a value-free debate: people can have legitimate differences about what data is needed and what is not.  But I think a simple way to at least address this problem is to ask of any proposed data collection: i) what questions does this data answer?  And ii) what would we do differently if we knew the answer to that question?  If the answer to either question is vague, maybe we should put less emphasis on the data.

In fact, I’d argue that most of the data that institutions and government are keen to push out are pretty low on the “what would we do differently” scale.  Enrolments (broken down in various ways), funding, etc – those are all inputs.  We have all that data, and they’re important: but they don’t tell you much about what’s going right or going wrong in the system.  They tell you what kind of car you’re driving, but not your speed or direction.

What’s the data we need?  Outputs.  Completion rates.  Transition rates.  At the program, institutional and system level.  Also outcomes: what happens to graduates?  How quickly do they transition to permanent careers?  And do they feel their educational career was a help or a hindrance to getting the career they wanted?  And yeah, by institution and (within reason) by program.  We have some of this data, in some parts of the country (BC is by far the best at this), but even there we rely far too heavily on some fairly clumsy quantitative indicators and not enough on qualitative information like: “what do graduates three years out think the most/least beneficial part of their program was?”

Same thing on research.  We need better data on PhD outcomes.  We need a better sense of the pros and cons of more/smaller grants versus fewer/larger ones.  We need a better sense of how knowledge is actually transferred from institutions to firms, and what firms do with that knowledge in terms of turning it into product or process innovations.  Or, arguably, on community impact (though there I’m not completely convinced we even know yet what the right questions are).

Very few of these questions can be answered through big national statistical datasets about higher education.  Even when it comes to questions like access to education, it’s probably far more important that we have more data on people who do not go to PSE than to have better or speedier access to enrolment data.  And yet we have nothing on this, and haven’t had since the Youth in Transition Survey ended.  But that kind of survey is expensive and could cost upwards of ten million dollars.  Smaller scale, local studies could be done for a fraction of the cost – if someone were willing to fund them.

There are actually enormous data resources available at the provincial and institutional level to work from.  Want to find individuals who finished high school but didn’t attend PSE?  Most provinces now have individual student numbers which could be used to identify these individuals and bring them into a study.  Want to look at program completion rates?  All institutions have the necessary data: they just choose not to release it.

All of which is to say: we could have a data revolution in this country.  But it’s not going to come primarily from better national data sets run by Statistics Canada.  It’s going to come from lots of little studies creating a more data-rich environment for decision-making.  It’s going to come from governments and institutions changing their data mindsets from one where hoarding data is the default to one where publishing is the default.  It’s going to come from switching focus from inputs to outputs.

Even more succinctly: we are all part of the problem.  We are all part of the solution.  Stop waiting for someone else to come along and fix it.

April 03

Data on Race/Ethnicity

A couple of weeks ago, CBC decided to make a big deal about how terrible Canadian universities were for not collecting data on race (see Why so many Canadian universities know so little about their own racial diversity). As you all know, I’m a big proponent of better data in higher education. But the effort involved in getting new data has to be in some way proportional to the benefit derived from that data. And I’m pretty sure this doesn’t meet that test.

In higher education, there are only two points where it is easy to collect data from students: at the point of application, and at the point of enrolment. But here’s what the Ontario Human Rights Code has to say about collecting data on race/ethnicity in application forms:

Section 23(2) of the Code prohibits the use of any application form or written or oral inquiry that directly or indirectly classifies an applicant as being a member of a group that is protected from discrimination. Application forms should not have questions that ask directly or indirectly about race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability.

In other words, it’s 100% verboten. Somehow, CBC seems to have missed this bit. Similar provisions apply to data collected at the time of enrolment – a school still needs to prove that there is a bona fide reason related to one’s schooling in order to require a student to answer the question. So generally speaking, no one asks a question at that point either.

Now, if institutions can’t collect relevant data via administrative means, what they have to do to get data on race/ethnicity is move to a voluntary survey. Which in fact they do, regularly. Some do a voluntary follow-up survey of applicants through Academica, others attach race/ethnicity questions to the Canadian Undergraduate Survey Consortium (CUSC) surveys, others attach them to NSSE. Response rates on these surveys are not great: NSSE sometimes gets 50%, but that’s the highest rate available. And, broadly speaking, they get high-level data about their student body. The data isn’t great quality because the response rate isn’t fabulous and the small numbers mean that you can’t really subdivide ethnicity very much (don’t expect good numbers on Sikhs v. Tamils), but one can know at a rough order of magnitude what percentage of the student body is visible minority, what percentage self-identifies as aboriginal, etc. I showed this data at a national level back here.
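To see why small numbers make subdividing ethnicity hopeless, here is a minimal sketch of the normal-approximation margin of error for a survey proportion. The figures (300 respondents at one institution, a 30% group versus a 2% subgroup) are hypothetical, purely for illustration:

```python
import math

def moe(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A large group (say 30% of respondents) is pinned down reasonably well...
print(round(moe(0.30, 300), 3))  # -> 0.052, i.e. 30% +/- ~5 points

# ...but a 2% subgroup has a margin of error nearly as big as the estimate itself.
print(round(moe(0.02, 300), 3))  # -> 0.016, i.e. 2% +/- ~1.6 points
```

So order-of-magnitude figures for broad categories are attainable, but fine-grained subgroup estimates from the same sample are mostly noise.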

Is it possible to get better data? It’s hard to imagine, frankly. On the whole, students aren’t crazy about being surveyed all the time. NSSE has the highest response rate of any survey out there, and CUSC isn’t terrible either (though it tends to work with a smaller sample size). Maybe we could ask slightly better questions about ethnicities, maybe we could harmonize the questions across the two surveys. That could get you data covering 90% of institutions in English Canada (at least).

Why would we want more than that? We already put so much effort into these surveys: why go to all kinds of trouble to do a separate data collection activity which in all likelihood would have worse response rates than what we already have?

It would be one thing, I think, if we thought Canadian universities had a real problem in not admitting minority students. But the evidence at the moment suggests the opposite: that visible minority students in fact attend at a rate substantially higher than their share of the population. It’s possible of course that some sub-sections of the population are not doing as well (the last time I looked at this data closely was a decade ago, but youth from the Caribbean were not doing well at the time). But spending untold dollars and effort to get at that problem in institutions across the country when really the Caribbean community in Canada is clustered in just two cities (three, if you count the African Nova Scotians in Halifax)? I can’t see it.

Basically, this is one of those cases where people are playing data “gotcha”. We actually do know (more or less) where we are doing well or poorly at a national level. On the whole, where visible minorities are concerned, we are doing well. Indigenous students? Caribbean students? That’s a different story. But we probably don’t need detailed institutional data collection to tell us that. If that’s really what the issue is, let’s just deal with it. Whinging about data collection is just a distraction.

March 31

The Meaning of Zero

I’ve had a lot of time over the past week to think about the federal budget. And the more I think about it, the more baffled I am about the decision to completely stuff the granting councils. I think it is either a sign of real political ineptness, or that something pretty awful is in the pipeline.

It’s not as though the Liberals are averse to spending on Science, per se. The budget dropped hundreds of millions of dollars on Artificial Intelligence, Cleantech, Superclusters, what have you. And it’s not as though they have a problem with that money going to colleges and universities: the AI money was clearly headed to McGill, Toronto and Alberta, and winning supercluster applications are going to need universities as partners (in a rational world they should all also include polytechnics/colleges to provide technical skills training, but I’m not totally convinced Industry Canada understands this yet).

So why not the granting councils?

Yeah, yeah, don’t say it: the Naylor Report. Because they are waiting for the Naylor Report (which has mysteriously disappeared) and they don’t want to spend any money until it’s out because there might be a big shake-up.

(Related note: the Science Minister, Kirsty Duncan, was on my Ottawa-Toronto flight this week. I asked her when the Naylor Report would be published. She said read page 88 of the budget [which says the report will be released “in the coming months”]. I asked what was taking so long. She said they had just had so many consultations, it took time to read them all. I said yeah, but Naylor submitted the report on time in December, right? She said – and I quote – “well, that’s a position”. Make of this what you will, but for me at least it did not dispel the impression that games are being played.)

The problem with this thesis is that imminent future program change wasn’t a barrier to spending in some other program areas. Youth Employment Services and the Post-Secondary Student Support Program both got very significant increases in their budgets despite the fact that the budget indicated that both would be subject to change in the near future. In those cases, the budget was written so as to show a budget bump for two years and two years only, to indicate that the government didn’t think the old structures would still be around.

So why did the government push for temporary budget boosts in other areas but not the councils? I am not sure, but I don’t see a credible answer that says “once Naylor is published the taps will flow”. I think a more likely answer is this: maybe this government doesn’t actually like granting councils as a policy tool any more than the last one did. No, there’s no “war on science” – though frankly, if it were a Conservative government that had hidden the Naylor Report and given the councils 0%, I’m pretty sure we’d be hearing that phrase 24/7.   

But I think it’s dawning on people that federal disenchantment with granting councils is not a partisan thing. The Chretien/Martin government may eventually have been good to councils (1995 budget excepted), but they also set up and funded a whole bunch of different science agencies (Brain Canada, Genome Canada, etc) precisely because they thought they knew better than the councils where science money should be spent. The Harper government wasn’t much into creating new agencies, yet was pretty consistent in funding big science projects every year outside the council structure.

One last piece of data: Universities Canada couldn’t even muster up a word on the councils’ behalf on budget night – it was all “yay MITACS and yay future Naylor report”. Seriously, their press release was embarrassing. Possibly someone in government leaned on them to give positive publicity “or else” (this has been known to happen), but possibly also that in the grand scheme of things, as long as money is coming in via clusters or AI or whatever, university administrations don’t give two hoots about the councils either. And if they don’t, why would the government?

From all of this I draw three conclusions.

One, even if the Naylor Report does result in more money for Science (and I’m not sure we can take that for granted), it’s not obvious that the councils will be the recipients of the money. The belief in Ottawa that granting councils “don’t get the job done” is deep; there is a bipartisan consensus that politicians and senior public servants, collectively, can manage the science enterprise better than scientists.

Two, Universities Canada is apparently deeply comfortable with this situation, even if not all its members are. For there to be a change in policy direction, someone is going to need to challenge the prevailing science discourse directly in Ottawa. And if Universities Canada isn’t going to do it, it will have to be done by scientists themselves, organizing and representing themselves independently in Ottawa. Sure, CAUT claims to do this, but ask a random sample of active scientists whether they think this is the right vehicle for Science representation and you’d probably struggle to get into double digits. Scientists themselves have to organize this fight, and quickly.

Three, it’s possible I’m entirely mistaken about this. Maybe the government just goofed in its messaging and there really is a pot of gold at the end of the Naylor rainbow, and Universities Canada’s behind-the-scenes work (of which I assume there is a great deal) will pay off handsomely. But honestly, at this point: would you bet on that?

March 30

What’s Next for Student Aid

A few months ago, someone asked me what I wanted to see in the budget.  I said i) investment in aboriginal PSE, ii) system changes for the benefit of mature students and iii) changes to loan repayment (specifically, a reduction of the maximum loan payment from 20% of disposable income to 15%).  To my great pleasure, the government came through on two of those wishes.  But there is still a lot of work to do.

Let’s start with the Post-Secondary Student Support Program, which the Government of Canada gives to individual First Nations to support band members’ education costs.  The Budget provides a $45 million (14%) bump to this program but also said the Government would “undertake a comprehensive and collaborative review with Indigenous partners of all current federal programs that support Indigenous students who wish to pursue post-secondary education”, which I think is code for “we’d prefer a new mechanism which is somewhat more transparent than PSSSP”.

Let’s just say I have my doubts about how easy this collaborative review will be.  Indigenous peoples – young ones especially – have a lot of issues with the federal government at the moment, and it will be difficult to manage a focussed review of this one subject without a lot of other agenda items intruding.  I’ve written on this subject before, and there certainly are ways in which the funding could be managed more efficiently.  That said, some of these ways involve taking management away from band councils and giving it to some other aboriginal organization operating at a larger scale, and not all bands are going to find that appealing.

Anyways, the takeaway is: if the feds are expecting a replacement to PSSSP to be in place by fall 2019, they’d better get to work yesterday.

Now, what about the new measures for mature students/adults returning to school?  This was a welcome budget initiative, because the policy discussion has perhaps been focussed too heavily on traditional-aged students for the past few years.  There are, however, two cautions I would put on the initiative and how it will roll out.

The first is the budget’s description of the $287M over three years for programs benefitting these students as a “pilot project”.  I am fairly certain that is PMO-speak, not ESDC-speak.  First of all, I’m moderately certain the law doesn’t allow pilots; second, the idea that provinces are willingly going to spend time and money re-jigging all their program systems to accommodate program changes that are inherently temporary in nature is kind of fanciful.  So I suspect what’s going to happen here is that over the next few months CSLP is going to come up with a bunch of different ways to help this population (change cost allowances for older students, and maybe for dependents too; re-jig how prior-year income is calculated; raise loan limits for this population; raise grant eligibility; etc.) and then roll them out in roughly ascending order of how irritating they are for provinces to program.  It’s not going to be a big bang, which may limit how well the policy is communicated to its intended targets.

But there’s a bigger issue at play here which the government missed in its haste to get a budget out the door.  One of the biggest problems in funding re-training is the artificial break in funding and jurisdiction that occurs at the 12-month mark.  If your program is shorter than that, you’re covered by various provincial labour market initiatives and on the whole your compensation is decent.  Longer than that, you’re on fed/prov student aid, which in general is not as generous (and, more to the point, is repayable).  It would be useful for the two levels of government to work together to provide a more seamless set of benefits.  Perhaps, regardless of program length, learners could benefit from 8 months of the more generous treatment and then move on to a slightly less generous mixed loan/grant system.  This wouldn’t be a quick shift: my guess is that even if you started talking now about how to achieve this, it would still take four or five years for a solid, specific solution to come into view (if you think universities are slow, try federalism).  But still, now’s as good a time as any to start, and perhaps the dollars attached to the mature students programs may be a good conversation starter.

My third wish – the one that didn’t get any traction in this budget – was for improvement in student loan repayment.  I’m not that disappointed, in the sense that I’m not greedy (no budget would ever have given me 3-for-3), but I do think there is work to be done here.  Perhaps this gets enacted as part of the follow-up to the Expert Panel on Youth being chaired by Vass Bednar and due for release at some point this spring (although who knows – if the Naylor Report is anything to go by, we could be waiting into 2019).  Or perhaps not: it’s not like CSLP hasn’t already been given a huge whack of work for the next couple of years.

But if that’s the worst problem we have in student aid in Canada, I’d say we are in pretty good shape.

 

(As a coda here, I’d just like to pay tribute to the Canada Student Loans Program’s Director-General, Mary Pichette, who is leaving the public service shortly.  Mary’s been involved in two big rounds of CSLP reform: the one in 2004/05, which first created the grants for low-income students, and the one around the 2016 budget (not just the increase in grants but the many smaller but still important changes to need assessment as well).

I won’t say – I’m sure she wouldn’t want me to – that those two reforms were down to her.  But they were down to teams that she led.  She did a lot over her two stints in the program to make the policy shop more evidence-based, and her legacy is that she’s made life easier for literally hundreds of thousands of students across the country.  They can’t thank her, but I can.  Mary, you will be missed.)
