HESA

Higher Education Strategy Associates

November 24

Who’s More International?

We sometimes think about international higher education as being “a market”. This is not quite true: it’s actually several markets.

Back in the day, international education was mostly about graduate students; specifically, at the doctoral level. Students did their “basic” education at home and then went abroad to get research experience or simply to emigrate and become part of the host country’s scientific structure. Nobody sought these students for their money; to the contrary, these students were usually getting paid in some way by their host institution. They were not cash cows, but they did (and still do) contribute significantly to their institutions in other ways, primarily as laboratory workhorses.

In this market, the United States was long the champion, since its institutions were the world’s best and could attract top students from all over the world. In absolute terms, it is still the largest importer of doctoral students. But in percentage terms, many other countries have surpassed it. Most of them, like Switzerland, are pretty small, so even modest absolute numbers of international students make up a huge proportion of the student body (in this case, 55%). The UK and France, however, are both relatively large markets, and despite their size they now lead the US in terms of the percentage of doctoral students who are international (42% and 40% vs. 35%). Canada, at 27%, is right about at the OECD average.

Figure 1: International Students at Doctoral Level as Percentage of Total

Let’s turn now to Master’s students, who most definitely *are* cash cows. Master’s programs are short degrees, mainly acquired for professional purposes, and thus people are prepared to pay a premium for good ones. The biggest markets here are fields like business, engineering and some social sciences. Education could be a very big market for international Master’s, but tends not to be, because few countries (or institutions, for that matter) seem to have worked out the secret of international programs in what is, after all, a highly regulated profession. In any case, this market segment is where Australia and the UK absolutely dominate, with 40% and 37% of their Master’s students being international. Again, Canada is a little bit better than the OECD average (14% vs. 12%).

Figure 2: International Students at Master’s Level as Percentage of Total

Figure 3 turns to the market which is largest in absolute terms: undergraduate students. Percentages here tend to be smaller because domestic undergraduate numbers are so large, but we’re still talking about international student numbers in the millions. The leader here is – no, that’s not a misprint – Austria, at 19% (roughly half of them come from Germany – for a brief explainer see here). Other countries at the top will look familiar (Great Britain, New Zealand, Australia), and Canada doesn’t look too bad at 8% (which strikes me as a little low) compared to an OECD average of 5%. What’s most interesting to me is the US number: just 3%. That’s a country which – in better days anyway – has an enormous amount of room to grow its international enrollment, and if it hadn’t just committed an act of immense self-harm it would have been a formidable competitor for Canada for years to come.

Figure 3: International Students at Bachelor’s Level as Percentage of Total

Finally, let’s look at sub-baccalaureate credentials, or as the OECD calls them, “short-cycle” programs. These are always a little bit complicated to compare because countries’ non-university higher education institutions and credentials are so different. Many countries (e.g. Germany) do not even have short-cycle higher education (they have non-university institutions, but those still give out Bachelor’s degrees). In Canada, obviously, the term refers to diplomas and certificates given out by community colleges. And Canada does reasonably well here: 9% of students are international, compared to 5% across the OECD as a whole. But look at New Zealand: 24% of their college-equivalent enrollments are made up of international students. Some of those will be going to their Institutes of Technology (which in general are really quite excellent), but some will also be students from various Polynesian nations coming to attend one of the Māori Wānanga.

Figure 4: International Students in Short-Cycle Programs as Percentage of Total

Now if you look across all these categories, two countries stand out as doing really well without being either of the “usual suspects” like Australia or the UK. One is Switzerland, which is quite understandable. It’s a small nation with a few really highly-ranked universities (especially ETH Zurich), is bordered by three of the biggest countries in the EU (Germany, France, Italy), and it provides higher education in each of their national languages. The more surprising one is New Zealand, which is small, has good higher education but no world-leading institutions, and is located in the middle of nowhere (or, at least, 5000 miles from the nearest country which is a net exporter of students). Yet they seem to be able to attract very significant (for them, anyway) numbers of international students in all the main higher education niches. That’s impressive. Canadians have traditionally focused on what countries like Australia and the UK are doing in international higher education because of their past track record. But on present evidence, it’s the Kiwis we should all be watching, and in particular their very savvy export promotion agency Education New Zealand.

Wellington, anyone?

November 21

The “Poorly Educated” and the US Election

Morning all.  Hope you’ve been well.

During the US election and its aftermath, a lot of the discussion has focused on the issue of education.  Specifically, many pollsters noted large shifts in favour of the Democrats among college-educated whites, and even larger shifts rightward among less-educated whites.  Trump’s statement that he “love(d) the poorly educated” was in retrospect quite significant.  From this, many on the left have deduced that “education is more important than ever”, a statement which is almost perfectly calculated to feed every piece of Republican paranoia about leftist indoctrination (and hence likely to make higher education a real target in the 25 states where the Republicans control both houses and the Governor’s mansion).

But I think there’s an important caveat to the education story.  Check out the map from the NYT’s excellent The Upshot, which shows in green the counties with the biggest Democrat-to-Republican shift between 2012 and 2016.

If there was an education issue at work here, it was a pretty particular one – one which seems to have manifested itself mostly around the Great Lakes.  It’s the rust belt, more or less: places where the economy has been in either relative or absolute decline for 50 years or more.  Think back to movies set in this area in the 1970s, like Slap Shot or Breaking Away.  When this area complains about the economy, it has little to do with Obama’s policies or even trade deals; these places were already in deep trouble long before anyone even dreamed of NAFTA.  This is about communities that have been falling apart for half a century.

In the region’s glory days, it was one of the wealthiest parts of the entire globe.  And during that period, it had an unbelievably low education-to-wealth ratio; maybe the lowest of any place in the world, at any time.  Higher education?  Who needed it, when well-paying manufacturing jobs were a dime a dozen?  Human capital was for suckers.  And that, unfortunately, is an attitude which has endured.

Times changed, of course.  Underinvestment in capital put paid to the steel industry, titanic incompetence did the same to the auto industry, and energy costs reduced the competitiveness of pretty much every other sector.  Result: de-industrialization.  There’s nothing unique about this process.  It’s happened in many places around the world: Flanders, Lancashire, Eastern Germany.  What’s unique about the US rust belt is mostly how rich it was before the fall.

There was a lot of post-election commentary about how the Rust Belt’s swing to Trump was really a way of saying “we’re upset because no one listens to us”, but that’s not strictly speaking true.  People listen.  It’s just that literally no one knows how to effectively reverse de-industrialization.  Americans have it worse because, for a variety of reasons (most of them rooted in racism), they lack much in the way of a social safety net.  And as the American scholar Tony Carnevale is fond of saying, when a country lacks social programs, education becomes the safety net.

But there’s a problem with that.  In declining economies, education is no guarantee of a job, because hiring is low.  So the ones with education leave (to California, say, or New York), leaving the remaining population with lower average skill levels and thus making economic regeneration even harder.  People come to see education as a vehicle for personal salvation, but also potentially as an agent of community destruction, because while they are creating human capital, they are also priming it for export.  I’m sure many readers in Atlantic Canada will know that feeling.

And to top it off, you have to remember that what people in the area really liked about the old days wasn’t just the middle-class jobs, but the fact that mental toil wasn’t necessary.  JD Vance, in his recent book Hillbilly Elegy (this fall’s de rigueur read for those wanting to “get” Trump voters), notes with some sorrow that the people of his home communities in Ohio and Kentucky lack much in the way of desire for self-improvement through education.  “We don’t study as children and we don’t make our kids study as parents,” he says. “We hillbillies need to wake the hell up.”  That awakening may happen, but it didn’t on November 8th.  In the key states that swung the election, Trump’s promise to “Make America Great Again” was taken up by people with the subtext “We Want Jobs That Don’t Require College Again”.

In short: more education, on its own, won’t solve the problems of de-industrialization.  And even if that weren’t true, it’s not clear that more education is a medicine everyone wants to take.

November 11

The New WSJ/Times Higher Education Rankings

Almost the moment I hit send on my last post about rankings, the inaugural Wall Street Journal/Times Higher Education ranking of US universities hit the stands.  It didn’t make a huge splash, mainly because the WSJ inexplicably decided to put the results behind their paywall (which is, you know, BANANAS), but it’s worth looking at because I think it points the way to the future of rankings in many countries.

So the main idea behind these rankings is to do something different from the US News & World Report (USNWR) rankings, which are a lot like Maclean’s rankings (hardly a surprise, since the latter was explicitly modelled on the former back in 1991).  In part, the WSJ/THE went down the same road as Money Magazine in looking at output data – graduate outcomes like earnings and indebtedness – except that they were able to exploit the huge new database of institutional-level data on these things that the Obama administration made public.  In addition to that, they went a little bit further and created their own student survey to get evidence about student satisfaction and engagement.

Now this last thing may seem like old hat in Canada: after all, the Globe and Mail ran a ranking based on student surveys from 2003 to 2012 (we at HESA were involved from 2006 onwards and ran the survey directly for the last couple of years).  It’s also old hat in Europe, where a high proportion of rankings depend at least in part on student surveys.  But in the US, it’s an absolute novelty.  Surveys usually require institutional co-operation, and organizing this among more than a thousand institutions simply isn’t easy: “top” institutions would refuse to participate, just as they won’t do CLA, NSSE, AHELO or any measurement system which doesn’t privilege money.

So what the Times Higher team did was effectively what the Globe did in Canada thirteen years ago: find students online, independent of their institutions, and survey them there.  The downside is that the minimum number of responses per institution is quite low (50, compared with the 210 we used to use at the Globe); the very big upside is that students’ voices are being heard and we get some data about engagement.  The result was more or less what you’d expect from the Canadian data: smaller colleges and religious institutions tend to do extremely well on engagement measures (the top three for Engagement were Dordt College, Brigham Young and Texas Christian).

So, I give the THE/WSJ high marks for effort here.  Sure, there are problems with the data.  The “n” is low and the resulting numbers have big error margins.  The income figures are only for those who have student loans, and include both those who graduated and those who did not.  But it’s still a genuine attempt to shift rankings away from inputs and towards processes and outputs.
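
For a rough sense of what those error margins look like, here is a minimal back-of-the-envelope sketch (mine, not the rankers’; it assumes a simple random sample, which an opt-in online survey certainly is not) of the 95% margin of error on a reported proportion at the two minimum sample sizes mentioned above:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a survey proportion, assuming simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) at the two minimum per-institution sample sizes cited above.
for n in (50, 210):
    print(f"n = {n:>3}: +/- {margin_of_error(0.5, n) * 100:.1f} points")
# n =  50: +/- 13.9 points
# n = 210: +/-  6.8 points
```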

The problem?  It’s still the same institutions coming in at the top.  Stanford, MIT, Columbia, Penn, Yale…heck, you don’t even hit a public institution (Michigan) until 24th position.  Even when you add all this process and outcome stuff, it’s still the rich schools that dominate.  And the reason for this is pretty simple: rich universities can stay relatively small (giving them an advantage on engagement) and take their pick of students, who then tend to have better outcomes.  Just because you’re not weighting resources at 100% of the ranking doesn’t mean you’re not weighting items strongly correlated with resources at 100%.

Is there a way around this?  Yes, two, but neither is particularly easy.  The first is to use some seriously contrarian indicators.  The annual Washington Monthly rankings do this, measuring things like the percentage of students receiving Pell Grants, student participation in community service, etc.  The other way is to use indicators similar to those used by THE/WSJ, but to normalize them based on inputs like family income and incoming SAT scores.  The latter is relatively easy to do in the sense that the data already (mostly) exists in the public domain, but frankly there’s no market for it.  Sure, wonks might like to know which institutions perform best on some kind of value-added measure, but parents are profoundly uninterested in this.  Given a choice between sending their kids to a school that efficiently gets students from the 25th percentile up to the 75th percentile and sending them to a school with top students and lots of resources, finances permitting they’re going to take the latter every time.  In other words, this is a problem, but it’s a problem much bigger than these particular rankings.
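
To make the normalization idea concrete, here is a minimal sketch of one way it could be done: regress an outcome on input measures and treat the residual as a crude value-added score.  The data and column names below are purely illustrative; they are not drawn from the WSJ/THE ranking or any real dataset.

```python
import numpy as np
import pandas as pd

# Hypothetical institution-level data; the numbers are made up for illustration only.
df = pd.DataFrame({
    "median_earnings":       [52000, 61000, 48000, 70000, 45000],
    "incoming_sat":          [1150, 1280, 1090, 1400, 1050],
    "median_family_income":  [68000, 85000, 60000, 110000, 55000],
})

# Regress the outcome on the input measures (with an intercept)...
X = np.column_stack([np.ones(len(df)), df["incoming_sat"], df["median_family_income"]])
y = df["median_earnings"].to_numpy()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# ...and treat the residual as a crude "value-added" score: how much better (or worse)
# graduates do than their inputs would predict.
df["value_added"] = y - X @ beta
print(df.sort_values("value_added", ascending=False))
```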

My biggest quibble with these rankings?  WSJ inexplicably put them behind a paywall, which did much to kill the buzz.  After a lag of three weeks, THE made them public too, but it was too little, too late.  A missed opportunity.  But still, they point the way to the future, because a growing number of national-level rankings are starting to pay attention to outcomes (American rankings, remarkably, are not pioneers here: the Bulgarian National Rankings got there several years ago, and with much better data).  Unfortunately, because these kinds of outcomes data are not available everywhere, and are not entirely compatible even where they are, we aren’t going to see these data sources inform international rankings any time soon.  Which is why, mark my words, literally all the interesting work in rankings over the next couple of years is going to happen in national rankings, not international ones.

November 10

Measuring Innovation

Yesterday, I described how the key sources of institutional prestige are beginning to shift away from pure research & publication towards research & collaboration with industry.  Or, to put it another way, the kudos now come not solely from doing research, but rather from participating in the process of turning discoveries into meaningful and commercially viable products.  Innovation, in other words (though that term is not unproblematic).  But while we all have a pretty good grasp of the various ways to measure research output, figuring out how to measure an institution’s performance in terms of innovation is a bit trickier.  So today I want to look at a couple of emerging attempts to do just that.

First out of the gate in this area is Reuters, which has already published two editions of a “top 100 innovative universities” list.  The top three won’t surprise anyone (Stanford, MIT, Harvard) but the next three – Texas, Washington and the Korea Advanced Institute of Science and Technology – might:  it’s a sign at least that some non-traditional indicators are being put in the mix. (Obligatory CanCon section: UBC 50th, Toronto 57th and that’s all she wrote.)

So what is Reuters actually measuring?  Mostly, it’s patents.  Patents filed, success rates of patents filed, percentage of patents for which coverage was sought in all three of the main patent offices (US, Europe, Japan), patent citations, patent citation impact…you get the idea.  It’s a pretty one-dimensional view of innovation.  The bibliometric bits are slightly more interesting – percent of articles co-written with industry partners, citations in articles originating in industry – but that maybe gets you to one and a half dimensions, tops.
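
Just to make the flavour of these bibliometric indicators concrete, here is a toy sketch of how one of them – the share of articles co-written with industry partners – might be computed.  The data structure and affiliation labels are mine, purely for illustration; a real calculation would run over a bibliometric database such as Scopus.

```python
from dataclasses import dataclass

@dataclass
class Paper:
    title: str
    author_affiliations: list[str]  # e.g. "university" or "industry" for each author

# Toy data standing in for one institution's publication record.
papers = [
    Paper("Nanotube coatings", ["university", "industry"]),
    Paper("Graphene synthesis", ["university", "university"]),
    Paper("Battery chemistry", ["university", "industry", "industry"]),
]

# Share of the institution's articles co-written with at least one industry partner.
industry_share = sum(
    any(a == "industry" for a in p.author_affiliations) for p in papers
) / len(papers)
print(f"Share of articles co-authored with industry: {industry_share:.0%}")  # 67%
```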

Meanwhile, the THE may be inching towards an innovation ranking.  Last year, it released a set of four “innovation indicators”, but only published the top 15 on each indicator (and included some institutions not usually thought of as universities, such as Wright-Patterson Air Force Base, the Scripps Research Institute and the Danish Cancer Society), which suggests this was a pretty quick rip-and-grab from the Scopus database rather than a long, thoughtful inquiry into the subject.  Two of the four indicators, “resources from industry” and “industry contribution” (i.e. resources from industry as a percentage of total research budget), are based on data from the THE’s annual survey of institutions, and while they may be reasonable indicators of innovation, for reasons I pointed out back here, you should intensely distrust the data.  The other two indicators are both bibliometric: “patent citations” and “industry collaboration” (i.e. co-authorships).  On the whole, THE’s effort is slightly better than Reuters’, but it is still quite narrow.

The problem is that the ways in which universities support innovation in an economic sense are really tough to measure.  One might think that counting spin-offs would be possible, but the definition of a spin-off varies quite a bit from place to place (and it’s tough to know if you’ve caught 100% of said activity).  Co-working space (that is, space where firms and institutions interact) would be another way to measure things, but it’s also very difficult to capture.  Economic activity in university tech parks is another, but not all economic activity in tech parks is necessarily university- or even science-based (this is an issue in China and many developing countries as well).  The number of students engaged in firm-based work-integrated learning (WIL) activities would be great, but a) there is no common international definition of WIL and b) almost no one measures this anyway.  Income from patent licensing is easily obtainable in some countries but not others.

What you’d really want, frankly, is a summary of unvarnished opinions about the quality of industry partnerships with the businesses themselves, perhaps weighted by the size of the businesses involved (an 8 out of 10 at Yale probably means more than a 9 out of 10 at Bowling Green State).  We can get these at a national level through the World Economic Forum’s annual competitiveness survey, but not at an institutional level, which is presumably more important.  And that’s to say nothing of the value of finding ways to measure the various ways in which institutions support innovation in ways other than through industry collaboration.

Anyways, these problems are not insoluble.  They just take imagination and work.  If I were in charge of metrics in Ontario, say, I could think of many ways – some quantitative, some qualitative – that we might use to evaluate this.  Not many of them would translate easily into international comparisons.  For that to happen, a genuine international common data set would have to emerge.  That’s unlikely to happen any time soon, but it’s no reason to throw up our hands.  It would be unimaginably bad if, at the outset of an era in which institutions are judged on their ability to be economic collaborators, we allowed patent counts to become the standard way of measuring success.  It’s vitally important that thoughtful people in higher education put some thought into this topic.

November 09

A Second Thought About Half-Way Through A Pretty Awful Day

Forgive the intrusion.  But our neighbour to the South electing a quasi-fascist narcissist isn’t an everyday occurrence.  There are some significant short-term consequences for Canadian higher education, and I thought I would quickly enumerate them so that debate and preparation can begin.

First, the chances of a recession in the next couple of years just shot up quite a bit.  Tearing up NAFTA also means tearing up the FTA: there will be a pause in business investment while everyone works out what on earth the new rules are going to be.  Other forms of protectionist legislation, even if not aimed at us, have the potential to wreak serious havoc as well.  Unlike in previous recessions, interest rate cuts cannot be part of our policy arsenal, as rates are already near zero.  To some people’s minds, that calls for massive Keynesian borrowing-and-spending.  But as we’ve already seen with the first round of Trudeau spending, it’s not at all clear that the intended multiplier effects work very well in a small open economy.  Long story short: provincial governments were never likely to be flush enough to grant serious relief to universities and colleges any time soon, but yesterday’s vote made such prospects even more remote.

Second, the forecast demand for Canada as an international education destination just went Through. The. Roof.  Already earlier this week, the annual i-barometer global survey of education agents named Canada the #1 “hot” destination for students.  But now, with a President-elect who degrades women, despises Hispanics and Muslims and openly consorts with anti-Semites, there’s going to be a huge diversion of interest away from the United States, and (since the UK has already hung a huge “Sod Off” sign in its window) this diversion will be headed towards exactly three places: New Zealand, Australia, and Canada.  One recent study suggested fully 65% of international students would be less likely to study in the US if Trump were elected.  Even if that overstates the case by a factor of two, we’re talking about a couple of hundred thousand internationally mobile students up for grabs.  Not to mention the almost-certain increase in the number of Americans heading north.

That has a couple of implications.  The main one is that Canadian universities are about to get more pricing power: no more being the discount end of North American higher education.  But we have to up our game significantly.  We have to have real presence – not just agents – in major export markets.  And we have to improve the student experience international students receive as well.  There are significant opportunities here, but also some potentially significant costs.  There’s no time like now to have a really thorough debate about internationalization on our campuses.

Third, while I have been impressed by how some prominent Americans (Jonathan Chait, Lin-Manuel Miranda) are coming out strongly this AM saying (correctly) “Screw moving to Canada, we need to stay and fight”, the fact of the matter is there are going to be a lot of faculty wanting to head north and a lot fewer of our own professors wanting to head south.  Universities will have a much better set of potential hires in front of them for the next couple of years.  This is great news, but we should try not to squander this opportunity the way we squandered the post-2008 rush north.  We can and should use the opportunity to poach selectively, but perhaps not break the bank on salaries while doing so.

(Also: I’m pretty sure we’re not going to be hearing about brain drain and the loss of talent to the US for a while, so it’s an opportunity as well to re-calibrate some of our arguments about education and the labour market.)

So the net effect here for Canadian institutions over the medium term: less government money, more opportunities in international education, and thicker academic labour markets.  On balance, it’s probably more good news than bad, provided we act deliberately and rapidly while ensuring that these moves have wide buy-in on our campuses.

But beyond the simple dollars and cents of it all, there are deeper issues.  A monster has become President of the United States.  Misery is going to fall upon the American people for the next two years if not four: on Blacks, immigrants, women, LGBTQs.  We all know people down there, know what they must be feeling today, and our hearts ache for them.  We need to show solidarity with them whenever we can.  But we also need to be vigilant here in Canada.  We are not immune to nativism and intolerance.

Last night around 11 PM, Dalhousie President Richard Florizone tweeted: “When voices of intolerance are loudest don’t be despondent – be emboldened, and even more committed to values of diversity & inclusion.”  And that’s exactly right.  We have to work – and work hard – at these things, and for fairness, every day.  In the end, that kind of hard work is all that ever makes a difference.

November 09

Shifting Sources of Prestige

The currency of academia is prestige.  Professors try to increase theirs by publishing better and better papers, giving talks at conferences and so on.  Becoming more prestigious means offers to co-author with a more illustrious class of academics, increasing the chance of book deals at better university presses, etc.  And at the institutional level, universities become more prestigious by being able to attract and nurture a more prestigious group of professors, something which is done by lavishing them with higher salaries, more research funds, better equipment, better graduate students (and to a lesser degree undergraduate students too).  All this has been clear for a long time.

In any given field, we might know which ten or twenty people are at the top globally – Nobel Prize winners, for instance (speaking of which, this Freakonomics podcast on How to Win a Nobel Prize is hugely informative and entertaining on how the Swedish committees decide who really is “top of the field”).  But after that it is pretty hazy: one’s list of the top twenty health researchers who have yet to win the Nobel for Medicine probably depends a lot on what sub-field you’re in and how you evaluate the last decade’s relative progress in various other subfields.  Same with universities before rankings came along.  It doesn’t take a genius to work out that Toronto, McGill and UBC are the top three in Canada.  But after that it gets fuzzy.  If you were in Medicine, you might think number four was McMaster; in Engineering, Waterloo; and in Arts, Montreal or Alberta.

Then along came large bibliometric databases, and shortly thereafter, rankings.  And then we knew how to measure prestige.  We did it by measuring publications, citations, and whatnot: the more, the better.  Universities began managing towards this metric, which built on longstanding trends in most disciplines towards more demanding publication requirements for tenure (the first known use of the phrase “publish or perish” dates from 1942).  Want prestige?  Research. Publish.  Repeat.

But I get the real sense that this is starting to change, for universities if not for individual professors.  I can’t provide much strong evidence here: you won’t see the change in the usual rankings, because they are hardwired for old definitions of prestige.  Nevertheless, if you look around at which universities are “hot” and receive the acclaim, it’s not necessarily the ones doing the most publishing; rather, it’s the ones that are actively contributing to the dynamism of their local economies.  MIT’s gradual overtaking of Harvard is one example of this.  But so too is the fuss over institutions like SUNY Albany and its associated nanotech cluster, or Akron and its advanced materials cluster.  In Canada, the obvious example is Waterloo, but even here in Toronto, Ryerson has become a “hot” university in part because of its focus on interacting with business in a couple of key areas such as tech (albeit in quite a different way from Waterloo).

To be clear, it’s not a case of publishing v. working with industry.  Generally speaking, companies like to know that the people they are working with are in fact at the front of their fields; no publishing, no partnership.  But it’s more of a general orientation: increasingly, the prestigious universities are the ones who not only have a concentration of science and engineering talent, but also have a sufficiently outward focus to act as an anchoring institution to one or more industrial clusters.

What’s interesting about this trend is that it has some clear winners and losers.  To even have a hope of anchoring an industrial cluster, you need to be in a mid-size city which already has some industry (even if, as in Akron’s case, it’s down in the dumps).  That works for Canada, because (Queen’s excepted) nearly all our big and prestigious universities are in mid-sized or large cities.  In the US, however, it’s more difficult.  Their universities are often older, built in a time when people believed universities were better off situated away from the “sinful” cities.  And so you have big, huge research institutions in places like Champaign, Illinois or Columbia, Missouri which are going to struggle in this new environment (even places like Madison and Ann Arbor are far enough away from big cities to make things problematic).  Basically, the Morrill Act is now imposing some pretty serious legacy costs on American higher education.

Part of the reason this shift hasn’t been more widely acknowledged is that bibliometrics are a whole lot easier to measure than economic value (and are valued more in tenure discussions).  But some people are starting to have a go at this problem, too.  More on this tomorrow.

November 08

Why I Do This Stuff

It’s Election Day in America.  It’s a day that always makes me think about how I got into this business.

Back in 1992, I was trying to stay out of a godawful job market by doing a Q-year in Economics at McGill (ended disastrously: don’t ask).  On November 2nd, I was sitting with some friends in the Shatner Building reading a New York Times story about the celebrations being planned in Little Rock for the next evening.  It was clearly going to be the biggest party in North America that week.  So we decided to go.

Some local car rental company had unwisely signed a deal with the students’ union offering any club a 3-day rental for $150, so seven of us (mostly from The Tribune) jumped into a Chevy Astro van at 5 o’clock Monday afternoon and drove 23 hours straight from Montreal to Little Rock.  We had a good time while there (I managed to blag my way into a temporary press pass which – with a little help from a local laminating shop – became my ticket to hang out in the basement of the Excelsior Hotel, following Wolf Blitzer around and listening to him bullshit with other reporters).  But for me, the real event of that trip happened the next morning.

We left Little Rock at midnight, needing to get home so some of us could take midterms on Thursday. At about 7AM, we pulled into a McDonald’s near Champaign, Illinois.  By this time, none of us had really bathed or slept properly in about 48 hours and six of the seven of us were smokers (a fairly representative percentage in Montreal in the early 90s), so there was a kind of blue haze that followed us out of the van as we trudged into a nearly-empty restaurant for some coffee and McMuffins.

“You all look like crap,” said the woman behind the counter (note: she did not say “crap”, she used a different word).  We gave her the back story on our trek, and told her that we were coming back from Little Rock.

Somewhat to our bemusement, the woman began to cry.  “You saw the President?”  At first I thought this was just an exaggerated expression of the royal-like deference Americans sometimes display towards the Commander-in-Chief.  But no.  She recovered slightly and said, “Mr. Clinton is our President and my boy is going to go to college.”

So that was it.  Amidst all the back and forth of the campaign – Gennifer Flowers, The Comeback Kid, Ross Perot jumping in, Sister Souljah, Double Bubba, Ross Perot leaving, Murphy Brown, Dan Quayle’s spelling ability, Ross Perot coming back in again, “it’s the economy, stupid” – the main thing this woman had keyed in on was that Clinton was determined to expand access to higher education through a “Domestic G.I. Bill”, by letting all students either a) borrow via an income-contingent loan or b) do national service (that never quite happened, though Clinton did manage to introduce both an income-contingent loan repayment option and AmeriCorps).

The fact that we’d been near the man who was going to make this happen was just a bit overwhelming for her.  And that made a big impression on me.  Among those who have degrees, there’s often a world-weary, cynical pose about higher education “not being worth it” – devalued degrees, crippling student debt, etc.  But for a family that’s never had someone attend post-secondary, the moment when they realise they can go is something magical.  They’re not foolish enough to think that going to university or college guarantees success, but they do know for sure – rightly – that higher education is by far the best way to get a step up to the middle class.  And if you aren’t already in the middle class, that’s a Big Deal.  Something worth devoting a career to, anyway.

And that – in part – is how a visit to an Illinois McDonald’s 24 years ago got me studying student aid and from there, higher education generally.  And why every four years I think about that morning, and that woman, and wonder whether her son succeeded in college or not.  I really wonder.

November 07

Student Living Standards

Last month, a group called Meal Exchange, an inter-university student anti-hunger group, in collaboration with the Ryerson School of Social Work, published an interesting paper called Hungry for Knowledge: Assessing the Prevalence of Student Food Insecurity on Five Canadian Campuses.  People are mostly drawing the wrong conclusions from it, but it’s worth examining nonetheless.

Meal Exchange surveyed 4,500 students at five campuses across Canada using a battery of questions on food purchase and consumption identical to those used in Statistics Canada’s Canadian Community Health Survey.  These questions, and the percentage answering “yes” to each, are shown below.

 

| Question | % Yes |
|---|---|
| I/we worried whether my/our food would run out before I got money to buy more | 37.7% |
| The food that I/we bought just didn’t last and I didn’t have money to buy more | 28.4% |
| I/we couldn’t afford to eat balanced meals | 44.4% |
| I/we regularly relied on a few low-cost foods in order to avoid running out of money to buy food | 58.0% |
| I skipped meals because there wasn’t enough money to buy food | 27.4% |
| I/we did not eat for the whole day because there was not enough money to buy food | 11.0% |

 

On the basis of these questions, the authors assign each respondent a food security score.  If they answered positively to one or fewer of the six questions, they are considered “food secure”; from 2-4, they are “moderately food insecure”; and 5 or more means “severely food insecure”.  This is similar to the method Statistics Canada uses for its calculations of food security.  Based on these results, Meal Exchange determined that 61% of students were food secure, 8.3% had severe food insecurity and 30.7% had moderate food insecurity.
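
As a rough sketch of the scoring rule just described (the thresholds come from the paragraph above; the implementation details are mine):

```python
def food_security_status(answers: list[bool]) -> str:
    """Classify a respondent from their yes/no answers to the six questions above.

    Thresholds as described in the text: 0-1 affirmative answers = food secure,
    2-4 = moderately food insecure, 5-6 = severely food insecure.
    """
    score = sum(answers)
    if score <= 1:
        return "food secure"
    elif score <= 4:
        return "moderately food insecure"
    return "severely food insecure"

# Example: a student who can't afford balanced meals and relies on low-cost foods,
# but answers "no" to the other four questions, counts as moderately food insecure.
print(food_security_status([False, False, True, True, False, False]))
```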

Now, the most important thing to note here is that the survey sample isn’t even vaguely scientific.  Students were recruited “via social media advertising, institutional survey committees, student associations, university health promotion departments and paper fliers distributed across Canada” – or, in simpler terms, “anyone they could find”.  Even the authors concede this may overstate the number of food insecure students. But leave that aside for a moment. Assume the numbers are right.  What do they mean?

Well, in some ways they arguably understate the issue.  Something like 40% of undergraduates live at home with their parents.  It is unlikely that very many of them are experiencing food insecurity.  The problem, such as it is, is concentrated in the 60% or so of students who live away from home.  That implies that something close to 2/3 of students living away from home are, broadly-speaking, food-insecure.

If that sounds a bit off to you, it’s probably because the definition of “moderate” food insecurity is in fact pretty expansive.  If you answer yes to the questions about not eating balanced meals and about relying on low-cost food (e.g. pasta, ramen), that makes you “food insecure” according to the study.  And while that might make sense for the majority of Canadians, it’s trickier for students.  For most of them, simply leaving home means they have less access to nutritious food because mom and dad aren’t doing the shopping anymore.  That doesn’t actually mean they are malnourished or “can barely afford to eat” (as this quite disastrous Vice headline implied).  “Moderate food insecurity”, as Statistics Canada defines it, is basically just another way of measuring low-income status, and doesn’t indicate hunger in a sense most people would recognize.

The fact that most students are low-income isn’t (or shouldn’t be) news.  Check out the Canada Student Loans Program’s monthly living allowances: these differ by province, but range between $968 (New Brunswick) and $1,408 (British Columbia) per month.  Now compare these amounts to the various Statscan measures of low-income status.  There’s the Low-Income Cut-Off (LICO), which is $1,420 per month in cities of 100-500K population and $1,680 for larger cities; the Low-Income Measure (LIM), which is about $1,737 per month; and the Market Basket Measure (MBM), which varies by city but tends to run between about $1,600 and $1,800.  Take any measure of student expenditure you want, there simply aren’t that many students living on their own who are spending more than that.  They’re low-income.  Period.
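
Putting those figures side by side (the numbers are the ones cited above; the code is just an illustrative comparison), even the most generous living allowance falls below every one of the low-income thresholds:

```python
# Monthly amounts cited above, in dollars.
csls_living_allowance = {"New Brunswick (lowest)": 968, "British Columbia (highest)": 1408}
low_income_thresholds = {"LICO (city of 100-500K)": 1420, "LICO (500K+)": 1680,
                         "LIM": 1737, "MBM (low end of typical range)": 1600}

for label, allowance in csls_living_allowance.items():
    below_all = all(allowance < threshold for threshold in low_income_thresholds.values())
    print(f"{label}: ${allowance}/month -> below every low-income threshold: {below_all}")
```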

And Canadians for the most part are OK with that.  Our income-support programs are not designed to take people out of low-income status.  And with respect to students specifically (traditional-aged ones anyway), we don’t really believe they deserve to be particularly comfortable.  Call it “paying your dues” or whatever, but our view of how students are supposed to live is pretty ingrained.  They’re supposed to be studying so they should live simply; they’ll make money (good money, for the most part) soon enough.  And for a significant proportion of students – not all by any means, but a healthy percentage – there is a certain “Common People” aspect to their poverty: “if you called your dad he could stop it all” – J. Cocker.  Though they are living away from home, many have the option of improving their standard of living and food security simply by moving back in with their parents.  For these students, government intervention beyond what is already available is for the most part simply not required.

Now, predictably, the usual suspects are pushing the idea that 39% of students are “food insecure” (that is, they are combining the “severe” and “moderate” categories and calling them all “insecure”).  But that’s just the usual chicanery one expects from lobby groups: saying 40% are food insecure makes it sound like there’s some kind of major public health crisis, rather than the same old story about students having low incomes.  And that’s a shame, because the real story isn’t the re-packaging of low income as food insecurity – it’s the 8% of respondents who have severe food insecurity.

Even if that number is on the high side (and my guess, based on previous research we’ve conducted at HESA on this subject, is that it’s not wildly out of line), it’s still very worrying.  There really are students – mostly older, many with families – who don’t eat for 24 hours at a time, who really do have to choose between spending on food and spending on medicine or shelter, and who don’t have other family resources to fall back on.  We don’t do a good job of identifying these students, and we clearly don’t do a very good job of supporting them.  These are the ones we really need to help, and urgently so.

Bref, ignore the sensational headlines suggesting widespread hunger.  Focus on the smaller but more important numbers which really do indicate that we have a problem.

November 03

The European Way of Student Services

One of the delights of working in international higher education is that while higher education is pretty much isomorphic the world over, it’s not entirely so.  There’s not so much variation that expertise isn’t transferable, but not so little that you can’t learn something new by appreciating another country’s system.  One area of particular interest is student accommodations and student services.

In North America we take it for granted that student services and residences are a responsibility of institutions – who else would do it?  But there are at least two other answers to that: the private sector could do it, or a public corporation not associated with any particular institution could do it.

If you hang out near universities for any length of time in Australia, for instance, you’ll see plenty of private-sector solutions for student housing. Companies build cheap, small-ish dorms (50-100 occupants) near universities, and rent them to students.  Which university?  Doesn’t matter.  So long as you’re a student, they’ll rent you a small single apartment.  It’s nothing special – IKEA to the max – but for someone looking for something cheap and full of fellow students, it makes a lot of sense.

(Some – but by no means all – of the companies building and operating student residences seem to be international in scope.  I met and chatted with representatives of one such company at NAFSA in Denver earlier this year and for the life of me I cannot figure out how this makes sense.  Every country has its own laws and building codes so where would economies of scale accrue? Still, apparently someone thinks there’s money to be made in this business.)

In Europe, however, there is a different approach: national student services organizations.  In Germany, Deutsches Studentenwerk (DSW), a government-funded non-profit, is responsible for a network of student residences scattered across the country, in addition to canteens and counseling services at a number of universities.  In France, the similarly organized CNOUS is responsible for residences alongside things like student exchanges.  DSW also provides certain other services for students, like legal aid and assistance in obtaining student financial assistance (DSW in some ways seems to think of itself at least partially as a protector/champion of student rights, somewhat in opposition to the universities and government).  The important thing here is that residence is not tied to enrollment in a particular university.  At a given residence in Munich or Berlin you might find students from any of a number of local institutions.

Now, you can see some real advantages to this.  First, an organization which specializes in student services might be more effective at delivering them than one that sometimes views them as tangential to the “main” mission (say, at your average research university).  Second, it might be cheaper and more efficient.  In Montreal, why duplicate housing services across UQAM, McGill and Concordia, all of which are within five stops of each other on the city’s Green Line?  Why not stick them all together?

But what’s possible in Europe isn’t always possible in North America.  Until fairly recently in Europe, universities were much more creatures of government than North American ones ever were – having a different state agency take over part of the system was no big deal (hell, Max Planck and CNRS took over most of the research mission so why not student services?) In North America, institutions (some of them, anyway) think of student services as part of their value proposition, an element on which they compete with other institutions.  Until very recently, that notion of inter-institutional “competition” was mostly absent from the European way of thinking.

In truth, the real reason most universities in North America would be loath to take the European route is that residences matter for alumni donations.  Research on this is pretty clear: the shine people take to their university is closely related to the strength of the attachments they form while there.  And though residences aren’t the only place students form attachments, they’re high up the list.  Take residences away from the institutional experience, and your appeal to alumni is going to be a lot weaker.

But it’s an interesting model to ponder nonetheless.  Makes you think about what the true “boundaries” of a university really are.

November 02

Shifts in Credentialling

As Colin Mathews, President of the technology company Merit, remarked in an excellent little article in Inside Higher Ed a few weeks ago, credentials are a language.  One with limited vocabulary, sure, but a language nonetheless.  Specifically, it is a form of communication from educational institutions to (primarily) the labour market to convey information about their possessors.  There has been a lot of talk in the last couple of years, however, to the effect that the current vocabulary of credentials is inadequate and that change is needed to promote better labour market outcomes for both firms and graduates.  What to make of this?

The complaints about credentials basically come in two categories.  The first has to do with what I call the “chunking” of credentials.  At the moment, we essentially have two sets of building blocks: “credits” (or “courses”) and degrees.  One is very short, the other often quite long.  Why not something in between, which indicates mastery over a body of material that might be of interest to employers but is less than a full degree?  Shorter, more focused credentials like Coursera’s “specializations” or Udacity’s “nanodegrees” are meant to supply labour market skills more quickly than traditional degrees.

Then there is the issue of how to interpret the information conferred by a credential.  A bachelor’s degree mainly tells an employer that the bearer has the stick-to-it-iveness to complete a four-year project (commitment means a lot to employers).  If the employer has hired graduates from a particular university or program before, then they might have a sense of an individual’s technical capabilities, too.  But beyond that, it’s blank.  A transcript might tell an employer what courses a student has taken, but unless the employer is going to take an inordinate amount of time to scrutinize the curriculum, that doesn’t really help them understand what the student has been exposed to.  Marks give an employer a sense of what a student has achieved at school, but increasingly, employers are finding that this is irrelevant.  What matters in many fields are the “soft skills” or “fuzzy skills”: on these, nearly all credentials are silent.

Enter the idea of “badges”, digital or otherwise: a solution half-inspired by competency-based education principles and half by Girl Guides/Boy Scouts.  The idea here is to give learners certificates based on particular skills they have demonstrated, just as the Guides and Scouts do.  The problems with this are manifold.  First of all, unless a particular skill can be demonstrated through standardized testing, certifying skills is actually a fairly time-consuming and therefore costly activity.  This is why many of the emerging badging systems actually measure achievements and activities rather than skills (one badging system recently profiled in Inside Higher Education, for instance, hands out badges for attending certain types of meetings.  Your typical employer could not care less).

But even if you buy the Guide/Scout analogy, badges quickly run into the same problem as transcripts.  Say you have a knot-tying badge.  Unless an employer is intimately familiar with the Guide/Scout curriculum, s/he will have no idea what the knot badge actually signifies in terms of practical skills.  Can they do Zeppelin bends?  Constrictor knots?  Do they understand ambient isotopy?  Or can they just do a slip knot?  And badges for soft skills are still pretty sketchy, so that part of the equation remains a blank.

All these moves towards what might be termed “microcredentials” are well-meaning.  Degrees are a pretty blunt and opaque way to express achievement and ability; it would be better if we could find ways of making these things more transparent.  But the problem is that all these solutions are being tested largely without talking to employers.  Badges, or whatever new microcredential solution we are talking about, are all new languages.  Employers understand the language of degrees.  They do not understand the language of badges, and see little point in learning new languages which seem to bring little additional benefit.  For most, the old language of degrees is good enough – for now.  And so the spread of badges and various types of skill-based certification is so far pretty limited.

But it’s early days yet.  My guess is we’re in the first stages of what will likely be a 20- to 30-year shift in credentialing.  As Sean Gallagher notes in his excellent new book The Future of University Credentials, the general trend is going to be towards demanding that individuals be able to demonstrate mastery of particular skills and competencies.  Partly, that can be done through changes to assessment and reporting within existing degree systems; but it may also come through the regularization of certain new credentials, some of which may be issued by existing institutions and others by new providers.  I don’t think the final form of these credentials is going to look anything like the current fashion of badges; they are frankly too clumsy to be up to much.  But the push in this direction is too strong to ignore.  Expect a lot of very interesting experimentation in this field over the next few years.
