HESA

Higher Education Strategy Associates

October 24

How Domestic Students Experience Internationalization on Campus

So today, my colleague Jaqueline Lambert and I released a paper on how Canadian students view the process of internationalization (you can download the paper here).  It’s a mixed bag, frankly.

On the one hand, we find pretty clearly that students buy into the principles of internationalization.  They are very positive about the goals internationalization is meant to foster (diversity, more global awareness), and they’re even enthused about how an increased presence of foreign students improves their schools’ prestige.  Over forty percent of students say they’ve made a close friendship with a student from another country.  Eleven percent of students have already had a period of study abroad, and another seven percent say they have definite plans to do so.  All good.

It starts getting trickier, though, where the interests of international and domestic students are not perceived to be aligned.  Half of students agree, to some extent, with the statement that "the presence of international students in a classroom enriches a learning experience".  But fully a third disagree.  Similarly, a third of students say that the presence of international students has actually hindered their classroom experience.

Figure 1: Perspectives on International Students in the Classroom.

There are also concerns about competition for resources.  Thirty-eight percent of Canadian students say the presence of foreign students means greater competition for on-campus jobs, while forty-two percent say it increases competition for scholarship dollars.

Figure 2: "The Increasing Number of International Students Attending my Institution has led to____".

But perhaps the most eye-opening consequence of internationalization has less to do with students from abroad, and more to do with professors.  Instructors with a less-than-perfect command of the language of instruction (French or English) are known to be common on campuses, and seven out of ten students in our survey said they've had such an experience (more in STEM fields, fewer in the humanities).  But when asked a more specific question – has a difficult-to-comprehend teacher significantly hindered your ability to perform or successfully complete a course? – fully one-third of all students said yes.

Figure 3: "Has a Difficult-to-Comprehend Instructor Significantly and Negatively Hindered your Ability to Perform or Successfully Complete a Course?"

Figure 3 shouldn’t be taken to mean that all hard-to-understand professors are foreign, or that foreign professors are hard-to-understand.  But there is a significant proportion of instructors – often foreign, often hired for their research ability – who are having a hard time getting their message across to students.  And that’s not really OK: students pay a lot of money for their education, and it shouldn’t be too much to ask that their instructors have the necessary language skills to teach them.

What to take from all this?  Students remain relatively positive about internationalization, and that’s good.  But there are some warning signs here.  Not everyone perceives internationalization as an unalloyed good; more attention needs to be paid to the dislocations it causes if internationalization is to continue to be seen in a positive light.

 

October 23

The Best CFS Chair Ever

I see Brad Lavigne has a new book out about his years as Jack Layton’s campaign strategist.  Time perhaps to mention his other big accomplishment: namely, being the best Chairperson the Canadian Federation of Students (CFS) ever had.

The mid-1990s were an ugly time in Canadian PSE.  Federal and provincial governments were broke, and cutting back everywhere.  Partly as a result, the student movement polarized – a more left-wing leadership took over the organization and purged the moderates, who returned the favour by leaving CFS and joining up with a previously unaligned group of schools to create the more moderate Canadian Alliance of Student Associations (CASA) (full disclosure: I was CASA's first National Director).  In 1996, Lavigne became CFS chair.  He was seen as one of the hardliners, but when he got to Ottawa, it was clear that he didn't want to just rant and rage at the government.  He wanted some wins for students, and he was prepared to compromise to get them.

And so, with this intention, Lavigne took CFS into a seven-member coalition to improve student aid, which included not only its nemeses at CASA, but also the university presidents represented by AUCC.  The coalition kept clear of divisive issues like tuition, and focused solely on student debt, which everyone agreed was bad.  It held together on a common platform for over a year, which reassured the federal government that it would get approval, not opprobrium, for agreeing to invest in this file (less than three years on from an infamous macaroni-throwing incident involving then-HRDC Minister Lloyd Axworthy, this kind of reassurance was still necessary).

The result was the 1998 budget, the single biggest investment in student aid ever made in Canada.  It expanded interest relief considerably, making life easier for hundreds of thousands of borrowers.  It created a set of grants for students with dependents (a key CFS aim at the time), as well as the Canada Education Savings Grants.  And it injected what would become $3.6 billion worth of grants into the student aid system through the creation of the Canada Millennium Scholarship Foundation.  Had Lavigne not brought CFS to the table, it's quite possible none of it would have happened.

Unfortunately, his own members turned on him, thinking he’d gone too far in terms of co-operation.  In the end, he was made to criticize the deal he’d done so much to facilitate – on the lunatic grounds that it was all unacceptable if the 1995 transfer cuts weren’t restored.  As a result, CFS never took ownership of what was clearly its greatest-ever success, and Lavigne’s work was never recognized for what it was: a real act of statesmanship.

CFS – heck, the whole country – could use more student leaders like him.

October 22

Faculty Salary Data You Should Probably Ignore

Recently, the Ontario Confederation of University Faculty Associations (OCUFA) published a comparison of American and Canadian academics' salaries.  Using Canada's National Household Survey (NHS) and the US Occupational Employment Statistics (OES) survey (which they described as being not quite apples-to-apples, but at least Macintosh-to-Granny Smith), they noted that average salaries for the combined college-and-university instructor population (the OES cannot disaggregate below that level) were $76,000 in the US; in Canada, the figure was $65,000.  Hence, according to them, with the dollar at par, there is a 17% gap in academic pay in favour of the Americans… and much more of a gap if PPP is taken into account.

There are three reasons why this conclusion is deeply suspect.

First, OES and NHS are not even vaguely comparable.  One is a world-class instrument, based on administrative data collected at over 200,000 places of employment; the other is a self-report from a nonrandom sample of Canadians, which has been widely panned as a steaming pile of horse manure.

Second, the actual numbers seem to be slightly off.  When I go to the OES and look at the category for 2- and 4-year post-secondary teachers (25-1000), I get $77,600.  The Canadian NHS files show that "university professors and lecturers" (category 4011) earn $87,978 and "college and other vocational instructors" (category 4021) earn $57,275.  Together, weighted, that's an average of $70,033.  So, a 10% gap, not a 17% one.
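
For what it's worth, here is a minimal sketch of that arithmetic in Python.  The two NHS category averages and the OES figure come from the paragraph above; the 42% university share used to weight them is my own assumption (the true weighting depends on instructor headcounts, which aren't given here), so treat the output as illustrative.

```python
# Rough re-check of the salary comparison above.  NHS and OES averages are
# from the post; the university/college headcount split is an assumption.

oes_us_average = 77_600   # OES category 25-1000, 2- and 4-year teachers
nhs_university = 87_978   # NHS category 4011, university professors and lecturers
nhs_college    = 57_275   # NHS category 4021, college and vocational instructors

university_share = 0.42   # assumed share of university instructors in the pool

canada_weighted = (university_share * nhs_university
                   + (1 - university_share) * nhs_college)
gap = (oes_us_average - canada_weighted) / canada_weighted

print(f"Weighted Canadian average: ${canada_weighted:,.0f}")  # about $70,000
print(f"US advantage at par: {gap:.1%}")                       # roughly 10-11%, not 17%
```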

Third, since the two countries don’t have identical proportions of instructors in the 2- and 4-year sectors, it’s hard to tell how well these numbers reflect differences among university professors.  Neither do we have any sense of the proportion of part-timers and sessionals in the count, on either side of the border.  In other words, this comparison is based on a hodgepodge of non-comparable data, and proves absolutely nothing with respect to relative salaries of professors on either side of the 49th parallel.

More direct comparisons are possible.  Oklahoma State University has been doing an annual survey of salaries at Public and Land-grant Universities – the grouping of US institutions that looks most similar to Canadian universities – for 40 years.  The figure below compares the 2012-13 OSU data with that of Canadian profs from Statistics Canada's last UCASS study (2010-11), as published by CAUT.

Canada vs US Professors' Salaries

One can quibble with this graph, of course.  The Canadian numbers have probably gone up another 6-7% in the intervening two years.  The US numbers don’t include the income professors get from summer research grants, which would probably add another 10% or so to their averages (see here for that calculation).  But effectively, there’s about a 15% pay gap in Canada’s favour, if dollars are counted at par, not a 17% gap the other way.

Naturally, one could get into arguments about purchasing power parity, living standards, and the like – that’s all fair game.  What’s not fair game is using a set of bad statistics when better ones are available, just because the bad data happens to better serve your cause. You’d think an association representing academics, of all people, would know that.

October 21

“Academic Freedom” or “Freedom from Evaluation”?

So, you may have heard that the University of Manitoba Faculty Association (UMFA) is threatening a strike, starting tomorrow.  What you may not have grasped is just how thin the grounds for the strike are.

You can see the university's full bargaining position here; UMFA, in contrast, has publicly issued only a single note (responding to a missive from the administration, which it felt was misleading) and an open letter to students published in the Free Press.  Frankly, for a group threatening to disrupt the lives of tens of thousands of students, this is pretty poor form (St. FX's faculty union was admirably communicative during its strike last year).  But we'll let that pass for the moment.

Refreshingly, UMFA says the strike is not about money.  Instead, they say it’s about academic freedom.  One key issue is the desire to enshrine the right to criticize the administration in the collective bargaining agreement – which, you know, is fair enough, though it’s not obvious that there are extant cases of intimidation or retaliation that would make this grounds for a walkout.

The more important file, according to UMFA, is performance evaluation.  What UMFA wants is – and I quote – with respect to tenure, pay, and promotion, “no prescribed journals, venues, enumeration of publications or dollar amounts of research funding may be established or taken into consideration.”

Now, I do understand the objection to prescribed lists of journals, but there are easy solutions here that would still ensure that professors are publishing in rigorous journals.  External reviewers could assess individual cases (I understand Waterloo does this), journal impact factors could be used, or, if that’s too passé, one could use citation counts as evidence that at least other scholars find your work useful.  I can also see that prescribing dollar amounts of grants might be problematic, though in many fields it’s not too much to ask for a number greater than zero.

And the administration apparently sees this, too – they've already conceded both of those points (see page 9 of the admin response).  What the admin hasn't given ground on is the bit about enumeration.  Read the passage above again… UMFA does not want enumeration of publications to count for pay, promotion, or apparently even tenure, for God's sake.

Mind-blowing, huh? I understand the arguments against “publish or perish”, but this is bananas.

From UMFA’s public communications, it’s difficult to escape the impression that this strike is really about redefining “academic freedom” as “freedom from evaluation”.  That’s not something any reputable university can accept, and it’s a terrible reason to disrupt students’ semesters.  Hopefully, everybody will return to their senses before tomorrow’s strike deadline.

October 18

Better Know a Higher Ed System – Scandinavian Labour Market Edition

A bit of a different tack for this week’s Better Know a Higher Ed System.  I’m not actually going to bore you by explaining the intricacies of four different systems of higher ed, or drone on about the ever-trendy Finnish polytechnics, or anything like that.  I am, however, going to tell you some nifty things about the way education and the labour market interact in these Scandinavian countries, and why, as a result, one should be quite careful when interpreting higher education statistics from this region.

There are three notable and interconnected facts about Scandinavia that you need to know:

  • The average age of Scandinavian students is much higher than it is elsewhere.  In Canada (indeed, throughout the Anglosphere), the four-year age band with the highest participation rates is 18-21.  In Scandinavia, it's usually 21-24.
  • Scandinavian countries are usually considered to have the highest participation rates in adult education in the world.
  • Scandinavian countries are usually considered to have among the highest dropout rates from higher education in Europe.

Now, you’re probably thinking: how exactly are these things interconnected?  Well, it has to do with these countries’ labour markets working completely differently than anywhere else.  As explained to me by a few different sources (including some senior Scandinavian civil servants), Scandinavian employers actually tend to hire based on skills rather than credentials.  It’s not entirely clear how or why this happens – it’s certainly not because of newfangled “badges” or any such thing.  It’s just their culture.

As a result, it’s quite common for students to go to school, study for a few modules, get the desired skills, and then move into the labour market.  Later, they can simply slip back into the system and finish their studies. You can see how this would distort the statistics from everyone else’s point of view: the first transition causes a lot of – what, to the outside world, looks like – drop-outs; the second creates the illusion of a fabulous system of lifelong learning. But they are, in fact, two sides of the same coin: students are just taking a leisurely path through studies, mixing periods of study with periods of work. Because they can.  Which, let’s face it, is pretty cool (as is much else about Scandinavia).

But there's a cautionary tale here, as well.  We're accustomed, in the age of publications like the OECD's Education at a Glance, to comparing countries based on international statistics, and to thinking that there's something we can learn from "leaders" in particular categories.  But it's not always true.  Scandinavian "success" at lifelong learning is ultimately a byproduct of a unique set of attitudes amongst employers with respect to hiring young people.  And you just can't import that.

October 17

Innovation Literature Fail

So, I've been reading Mariana Mazzucato's The Entrepreneurial State.  It's brilliant and irritating in equal measures.  Brilliant because of the way it skewers certain free-market riffs about the role of risk and entrepreneurialism in the innovation process, and irritating because it's maddeningly cavalier about applying business terms to government processes (in particular, the term "risk", which Mazzucato doesn't seem to understand means something entirely different in government, where losses can be made whole through taxation).

Anyways, one thing that occurred to me while reading was just how America-specific much of the literature on innovation is.  Take the Defence Advanced Research Projects Agency (DARPA).  In innovation policy circles it’s generally considered a wicked-cool way of organizing Big Science: it’s project-based, it brings teams together from both academia and business, and it has substantial independence.  And, of course, the basic research has produced things like GPS and the Internet (still the core anecdotes used to back the “government-should-be-involved-in-research” argument). 

Brilliant, right?  So why doesn’t everyone have a DARPA?  Why doesn’t Canada?

The answer is that DARPA wouldn’t make any sense here.  Our government agencies don’t have enough of the “big problems” that DARPA is designed to solve – or, at least, that could be solved at a price we can afford.  And frankly, we don’t have enough private-sector research scientists to make headway into these kinds of projects, anyway.

More broadly, the American system of funding science works because of a particular combination of factors: the problems needing to be solved, the presence of major private sector research efforts, a particular type of venture capital industry, and scale.  Canada – like most countries in the world – would, at most, get part-marks on any of those four criteria.  So why do we think that policies based on American examples work for us?

Take questions of "applied" vs. "basic" science.  Maybe the classic Vannevar Bush formulation – "government funds universities to do basic research, and companies do the applied stuff" – only makes sense in the US context.  Maybe without the VC culture, or the private sector research culture, the idea that government should only be playing on the "basic" side of the continuum doesn't make any sense.  Maybe countries that aren't quite at the technological frontier don't get as much bang for their buck in basic research as America does.

This is just speculation on my part, of course.  But I’m tired of the innovation literature assuming that US-inspired solutions will work here.  Just for once, I’d like to see some literature and policy prescriptions based on what works in Korea, the Netherlands, and Scandinavia.  There’s probably a whole other set of policy lessons to be learned, if only we looked for them in the right places.

October 16

A Simple Solution for Statistics on Doctoral Education

Higher Education statistics in Canada are notoriously bad.  But if you think general stats on higher ed are hard to come by, try looking at our statistical systems with respect to doctoral education and its outcomes.

Time-to-completion statistics are a joke.  Almost no one releases this data; when it is released, it often appears to be subject to significant "interpretation" (there's a big difference between time-to-completion and "registered" time-to-completion: if you want to keep the latter down, just tell students entering their sixth year to de-register until they're ready to submit a thesis).  Employment statistics are even scarcer – and as for statistics on PhDs getting jobs in academia?  Ha!
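
To make the distinction concrete, here is a minimal sketch using an invented enrolment record (the dates and term counts below are hypothetical, not drawn from any real program): the student takes nearly seven elapsed years, but by de-registering for a stretch before submitting, the "registered" figure comes out more than a year shorter.

```python
from datetime import date

# Hypothetical record for one PhD student (all dates and counts are invented).
enrolment_start = date(2006, 9, 1)
degree_awarded = date(2013, 6, 1)
registered_terms = 16        # de-registered for several terms in years six and seven

TERMS_PER_YEAR = 3

# Real time-to-completion: elapsed time from enrolment to degree.
elapsed_years = (degree_awarded - enrolment_start).days / 365.25

# "Registered" time-to-completion: counts only terms on the books.
registered_years = registered_terms / TERMS_PER_YEAR

print(f"Time-to-completion:            {elapsed_years:.1f} years")    # ~6.7
print(f"Registered time-to-completion: {registered_years:.1f} years")  # ~5.3
```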

It's that last piece of data that students really want published; it's also the one viewed with the most trepidation by directors of graduate programs, who are a bit worried about what the stats might reveal.  They would probably argue that this isn't a fair measure of their programs' success since, after all, they don't control the hiring market.  This is a fair enough point, though a reasonable person might ask in return why, if this is the case, PhD intakes are staying high, or even increasing.

Personally, I think the idea that everyone in a PhD program should aspire to an academic career is demented, and professors who peddle that idea to their students (and there are thousands of them) should be ashamed of themselves.  But sadly, a lot of students do buy into this myth, and when it doesn’t come true, they’re fairly upset.

There is, however, a simple way to address this problem.  Every department in the country should be required to maintain a webpage with statistics on graduation rates, times-to-completion (real ones, from time of enrolment to degree), and the last five years' worth of graduates.  How many are in tenure-track positions?  How many are in post-docs?  How many are temping?  Etc.  No big survey, no nightmare of involving StatsCan and the provinces, and whatnot.  Just every department, publishing its own stats.  And no guff about response burdens.  Between Academia.edu and LinkedIn, this shouldn't take more than a day to do.
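
As a sketch of what such a page might contain, here is the kind of minimal record a department could publish.  Every field name and figure below is invented for illustration; the point is simply that the whole thing fits in a dozen lines of structured data.

```python
# Illustrative only: a hypothetical department's outcomes page as structured data.
department_outcomes = {
    "department": "Example Department of History",
    "cohorts_covered": "2008-2013 entrants",
    "graduation_rate": 0.61,
    "median_time_to_completion_years": 6.4,   # from enrolment to degree, not "registered" time
    "graduates_last_five_years": 48,
    "current_positions": {                    # where those 48 graduates are now
        "tenure_track": 9,
        "postdoc": 11,
        "sessional_or_temp": 14,
        "outside_academia": 12,
        "unknown": 2,
    },
}

# Sanity check: the position counts should account for every graduate.
assert sum(department_outcomes["current_positions"].values()) == \
    department_outcomes["graduates_last_five_years"]
```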

“Sure”, I hear you saying, “and which departments will volunteer for this, exactly?”  But there’s actually a very simple way to get departments to fall into line.  The granting councils – who, as much as anyone, should be concerned about these issues – should simply make publication of such information a pre-requisite to obtaining any funding for graduate students.  Period.

Faced with this, I’m fairly certain that departmental objections would melt like butter.  So how about it, SSHRC and NSERC? Want to land a blow for good data?  Give this plan a try.  You’ll be heroes to grad students from coast to coast.

 

October 15

The Problem with Cutback Narratives

Let’s discuss how we talk about cutbacks.  And let’s talk about the University of Alberta.

U of A has been rather radically affected by the recent cutbacks imposed by the Alberta government.  But here's the weird thing: apparently it's not enough to say "we've had cuts of 7% in one year".  Instead, people feel the need to inflate that figure in various ways.  It's not just a 7% cut, they say – "we were told by government to budget based on 2% growth, so it's actually 9%".  Or, "11% over two years", etc.

If that’s not enough to invoke sympathy, people turn to statements like: “well we have to absorb a 7% cut, but that’s on top of the 18% we’ve already taken in the last four years.”  This, apparently, is a quote from the Dean of Science.  I have to hope it’s a misquote, because it’s not even vaguely true.

First, although the government grant fell 7%, that doesn't mean revenue will fall 7%, because the university can always generate new revenue – even with a ridiculous and unconscionable tuition fee freeze – by changing its enrolment mix to favour more expensive programs (professional master's degrees) and students (i.e., foreign ones).  In fact, according to a planning document which came out last month, the 2013-14 operating budget for faculties is actually 0.2% higher than it was last year (Science did take a 2% hit, but hey, someone's got to pay for that 6% bump that Medicine received).

Second, it’s preposterous to say that cuts have been sustained over a four-year period.  Here’s the actual budget of the faculty of Science over the past four years:

University of Alberta Science Budget, 2009-10 to 2013-14, in Millions

… and for faculty operating budgets in general…

University of Alberta Faculties' Budget, 2009-10 to 2013-14, in Millions

… and total operating budgets…

University of Alberta Operating Budget, 2009-10 to 2013-14, in Millions

The problem, as you can see, isn't primarily about income.  The problem is that when your business is 60% labour costs, and you can't fire people, and you hardwire in annual 4% rises in labour costs, there isn't a lot of flexibility in the system.  When revenue growth slows to below 4%, cuts – sometimes quite painful ones – do occur… in non-salary areas.  Indeed, between 2009-10 and 2012-13, the Faculty of Science did cut its materials budget by 25%, from $2 million to $1.5 million… but the salary and benefits budget went from $69.2 million to $76 million over that same period.  Hmm.
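
Here is a minimal sketch of that arithmetic with illustrative numbers (these are not U of A's actual figures): even with revenue still growing at 2%, hardwired 4% salary growth forces a cut on the non-salary side.

```python
# Illustrative numbers only: what 4% salary growth does when revenue grows more slowly.
budget = 100.0           # total operating budget, $M
labour_share = 0.60      # share of the budget that is salaries and benefits
labour_growth = 0.04     # hardwired annual increase in labour costs
revenue_growth = 0.02    # assumed revenue growth this year

labour = budget * labour_share               # 60.0
non_salary = budget * (1 - labour_share)     # 40.0

new_revenue = budget * (1 + revenue_growth)  # 102.0
new_labour = labour * (1 + labour_growth)    # 62.4
new_non_salary = new_revenue - new_labour    # 39.6

cut = (non_salary - new_non_salary) / non_salary
print(f"Non-salary budget must shrink by {cut:.0%}")  # 1% in a single year
```

Rerun the same sketch with revenue down 7% instead of up 2%, and the non-salary side has to absorb a cut of nearly a quarter in a single year.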

Universities didn’t have to build brittle systems that would shatter if revenue growth fell below 4%.  The academic community, through a thousand little decisions, made this bed for itself and now has to lie in it.  Yet the dominant narrative is one of universities being passive victims of outsiders’ (i.e. government) actions.  A more thoughtful response – one befitting an institution devoted to dispassionate analysis – might be a bit more of an introspective one.

October 11

PIAAC: The Results for Aboriginal and Immigrant Canadians

One of the unbelievably cool things about this week's PIAAC release is the degree to which StatsCan and CMEC have gone the extra mile to oversample not only for every province, but also for every territory (a first, to my knowledge) and for Aboriginal populations – although they were not able to include on-reserve populations in their sample.  This allows us to take some truly interesting looks at several vulnerable sub-segments of the population.

Let’s start with the Aboriginal population.  Where numbers permit, we have province-by-province stats on this, albeit only for off-reserve populations.  Check out figure 1:

Figure 1: PIAAC Literacy Scores for Aboriginal and Mainstream Canadians, Selected Provinces.

So, good news first: in BC and Ontario, the gap between Aboriginal and mainstream Canadians is down to single digits – this is great news, even if it doesn't include the on-reserve population.  But given the differences in educational attainment, you have to think that a lot of this is down to attainment rates: if one were to control for education, my guess is the difference would be negligible.

The bad news, of course, is: WHAT THE HELL, NUNAVUT?  Jumpin’ Jehosophat, those numbers for the Inuit are awful.  The reason, of course, again comes down to education, with high-school completion rates for the population as a whole being below 50%.  Other territories are better, but not by much.  It’s a reminder of how much work is still needed in Canada’s north.

The immigration numbers are a bit more complicated.  The gap in literacy scores between non-immigrants and immigrants is about 25 points, and this gap is consistent at all levels of education.  That's not because immigrants are less capable; it's because, for the most part, they're taking the test in their second – or possibly third – language (breaking down the data by test-takers' native language confirms this).  As someone pointed out to me on Twitter, the consequence of this is that PIAAC literacy isn't pure literacy, per se – it's a test of how well one functions in society's dominant language.  Conversely, though, since facility in the dominant language clearly has an effect on remuneration, one wonders how much of the oft-discussed gap in salaries between immigrants and native-born Canadians – which seems illogical when looking only at educational levels – might be understood in light of this gap in "literacy".

A larger point to remember, though, is that the presence of immigrants makes it difficult to use overall PIAAC scores as a commentary on educational systems. Over 20% of Canadians aged 16-65 are immigrants, and most of these people did their schooling outside of Canada, and, bluntly, they bring down the scores.  Provinces with high proportions of immigrants will naturally see lower scores.  Policymakers should be careful not to let such confounding variables affect their interpretation of the results.

October 10

More PIAAC: The Canadian Story

Yesterday I offered my thoughts on some of the highlights from the international portion of the PIAAC release; today I want to focus on the Canadian results. 

Figure 1 shows the overall literacy scores, by province.

Figure 1: Literacy Scores by Province, PIAAC

At first glance, PIAAC doesn’t seem to be telling us anything we didn’t already know from years of PISA & TIMSS surveys.  Alberta comes first, the Atlantic is mostly a mess, and everybody else is kind of in-between.  But look a little more closely at the data, and a different story emerges.  Remember that PISA and TIMSS are single-cohort snapshots of kids with identical amounts of education; PIAAC is a mashup of multiple cohorts, each with quite different educational patterns.  Because they are measuring such different things, similarities may simply be coincidental.

So let’s see what happens when we try to standardize for age and education.  Figure 2 shows PIAAC literacy scores, by province, for the 25-34 age cohort who possess a university degree:

Figure 2: Literacy Scores by Province, University Graduates Aged 25-34

At face value, Figure 2 is pretty exciting if you're from the Atlantic.  I mean, hey, the OECD says one year of schooling is equal to seven points on the PIAAC scale – which implies that Islanders with university degrees, on average, have literacy scores equivalent to about three years of extra education over the left-coasters.  But because of sample sizes, these numbers come with pretty big confidence intervals: PEI and Nova Scotia are outside the margin of error for BC and Saskatchewan, but not for anyone else.  The other six are all essentially equal.
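
The conversion behind that claim is just a division.  A quick sketch, with the score gap treated as an illustrative ~21 points, since the exact provincial figures come from the chart:

```python
# OECD rule of thumb cited above: roughly 7 PIAAC points per year of schooling.
points_per_year = 7
score_gap = 21   # illustrative gap in PIAAC points (e.g., PEI vs. BC university grads)

print(f"{score_gap}-point gap = about {score_gap / points_per_year:.0f} years of extra schooling")
```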

Now take a look at the result for college graduates, aged 25-34:

Figure 3: Literacy Scores by Province, College Graduates Aged 25-34

There's a similar pattern here, but the gaps at either end are bigger, and confidence intervals don't help quite as much.  BC through Manitoba are all within each other's margin of error.  But PEI and Alberta are genuinely ahead of everyone else, except BC; Newfoundland and Saskatchewan come out looking bad no matter what.

Here’s what you should take from this:

1)   Alberta's overall high PIAAC scores are due less to its own education system and more to its ability to attract talent from elsewhere.  That's the only way to reconcile its overall scores with what we know about its PSE access rates, and with the performance shown in the second and third figures above.

2)   Saskatchewan needs to ask some hard questions.  Really hard.

3)   PEI is… odd.  This doesn’t look like a fluke.  But then, if they’ve got all these great skills, why is their economy such a basket case?

4)   Newfoundland is Newfoundland.  Decades of relative poverty will take their toll.

5)   Don’t get fooled by small differences – the other six provinces are essentially indistinguishable from one another.

More tomorrow.
