HESA

Higher Education Strategy Associates

June 03

A Better Way to Track Graduates

The real problem Canada has with respect to the whole “does-education-pay” debate is data. It’s not that we don’t have people collecting data – we do, lots of them. The problem is that they’re all collecting data over time frames so short as to be largely meaningless.

The gold standard used to be the National Graduate Survey, which surveyed every fifth graduating class two and five years out. Now the 2-year survey is a year behind schedule and the 5-year follow-up has been discontinued. That's right, folks – at the start of the recession, when Statscan took a look at their suite of surveys and decided which ones to can and which ones to keep, they decided that the one on medium-term educational outcomes was among the least policy-relevant, and canned it. You know, so they could keep funding their monthly poultry storage reports.

For about a decade now, a number of provinces (all except MB, SK, and NL) have been collecting graduate data of their own; indeed, they survey every second graduating class, which is more than Statscan ever managed. However, most only track graduates out to 24 months, so the issue of long-term outcomes is still unaddressed. BC is the only province which does 5-year reports, and they're quite interesting (more about them tomorrow).

The long-term outcomes of degrees and programs clearly matter a great deal. So why can't we measure them? Cost, mainly. Anything further out than about 24 months is expensive to do well (BC's 5-year response rates are disappointing, for instance), and so – penny-wise, pound-foolish nation that we are – we don't do it.

But there actually is a very cost-effective way to do this; namely, to link student records to tax records. Virginia, Tennessee, and Arkansas have already linked their grads' data to unemployment insurance wage records, and other states seem poised to follow. In Canada, we could quite easily do the same thing by having Statistics Canada link its Post-Secondary Student Information System (PSIS) to the T1 family file. Instantly, with no new data collection expenses, you'd have income data by institution, program of study – what have you – as many years out as you like. As always with Big Data, there are some privacy concerns, but frankly none of them are very convincing, certainly not compared with the major public policy gains available.
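
To make the idea concrete, here is a minimal sketch of what such a linkage might look like, written in Python with pandas. Every file name, column name, and identifier below is hypothetical (in practice the match would happen inside Statistics Canada on anonymized keys), but the mechanics are just a merge and a group-by.

```python
import pandas as pd

# Graduate records (e.g., drawn from PSIS): one row per graduate.
# Hypothetical columns: person_id, institution, program, grad_year
grads = pd.read_csv("psis_graduates.csv")

# Tax records (e.g., drawn from the T1 family file): one row per person per tax year.
# Hypothetical columns: person_id, tax_year, total_income
tax = pd.read_csv("t1ff_income.csv")

# Link the two files and keep only tax years after graduation
linked = grads.merge(tax, on="person_id", how="inner")
linked["years_out"] = linked["tax_year"] - linked["grad_year"]
linked = linked[linked["years_out"] > 0]

# Median income by institution, program, and years since graduation --
# the table this post is asking for, as many years out as the data allow
outcomes = (linked.groupby(["institution", "program", "years_out"])["total_income"]
                  .median()
                  .reset_index())
print(outcomes.head())
```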

Linking administrative databases is cheaper, faster and more accurate than what we do now. Why we haven’t moved to this system already is one of the biggest mysteries in Canadian higher education policy.

May 31

Coursera Jumps the Shark

Remember when Coursera – the world's largest purveyor of Massive Open Online Courses (MOOCs) – was going to disrupt higher education, and put hundreds if not thousands of public institutions out of business? I know it's hard to cast your mind back all of eighteen months, but try.

Actually, don't. Because it's all over. Yesterday, Coursera did a weird strategic about-face by announcing that, rather than competing with public colleges, it's going to start competing with Blackboard instead.

We've been heading this way for a while. Last summer, the all-conquering Coursera, armed with $22M or so in venture capital (VC) money, and getting free content from major educational institutions around the world (including McGill and the University of Toronto), was seemingly poised to dominate education everywhere, forever, because… well… OK, this part was never clear. There seemed to be some idea that if you stuck "great professors" (i.e. big research names at big research universities) in front of a camera, eyeballs would follow. This was always preposterous – if it weren't, University of the Air would be prime time. But, of course, nobody ever got rich telling people that the revolution wasn't coming.

Coursera has simply never had a coherent plan to generate revenue.  Oh sure, it had a bunch of ideas about how to do it, which were outlined in this leaked MOU with the University of Michigan, but few seem to have panned out.  The only thing we’ve heard from Coursera is that their idea for charging people for certificates of completion netted $220,000 in Q1 of this year.  Given that Coursera’s annual burn rate seems to be in the neighbourhood of $10M (that’s on top of their partners spending $50K/course to place it on the Coursera platform), this is peanuts.  Allegedly, they were going to try to make money on a bunch of other things, like being scouts for businesses on the lookout for bright young talent, but there have been no announcements of revenue from these sources.  Given how the tech news industry works, it’s a safe bet that means the figure is close to zero.

So now, with no money coming in, and no new round of venture financing announced since last year (attention, education journalists: go interview some Coursera investors – they're key to this story), Coursera announced this week that it would be working with partners like West Virginia University and the University of New Mexico – places which Coursera swore in writing to its AAU/U-15/Russell Group partners that it would never allow to offer MOOCs, because it would taint the brand. Together with these institutions, Coursera will be developing something called "campus-based MOOCs", which, upon closer inspection, is completely indistinguishable from what we've called "blended learning" for roughly a decade now.

And so the revolution ends with a whimper, not with a roar.

May 30

The Economics of Merit Scholarships

There is a wonderful moment in Philip Delves Broughton's Ahead of the Curve in which he describes a fight between a student and an administrator at Harvard Business School. During the altercation, the student asks why he is being jerked around, since, after all, he is "the customer". To this, the administrator calmly replies: "no, you're not; you're the product".

For serious institutions, this is exactly right.  People judge a school based on its alumni and their accomplishments.  Students are just inputs in the making of alumni.  And since the easiest way to improve your outputs is to improve your inputs, it’s usually worth paying for better raw materials.  So the way to think about undergraduate merit scholarships is as an institutional attempt to purchase better inputs.

Think about it: what would it be worth to new-ish universities (say, MacEwan or Mount Royal) to have one of their students win a Rhodes Scholarship? What benefit would they get in terms of recruiting and reputation for something like that? My guess would be easily half a million dollars or so. So if you were President of one of those schools, and you could somehow foresee which teenagers were most likely to become Rhodes Scholars, what would you stump up to convince such a student to attend your institution? $100,000? $200,000?

That sounds like a ridiculous question because we're conditioned to think about the size of institutional scholarships as being a function of tuition (or, in the case of truly exceptional scholarships, like McGill's Greville-Smiths, a function of tuition plus cost of living). Yet we have no problem thinking about merit awards much larger than tuition at the graduate level; so why are we squeamish about it for undergraduate students?

I think one reason is that people see too much waste in the current scholarships.  The number of genuinely outstanding people who could shift an institution’s reputation is pretty small; yet, on average, Canadian institutions pass out entrance awards to nearly two-thirds of their entering students in sums so small one wonders what possible purpose they could be achieving.  Fewer, bigger scholarships – or an outright diversion of money from merit to need – might bring greater results, but people are wary about potentially handing even more money to a system which, at present, achieves very little.

It would be interesting if one Canadian institution broke from the herd and started paying for talent as if it mattered, instead of dropping seven figures a year on masses of one-time $1000 and $1500 scholarships. Among other things, we might find out just how many great alumni it takes to shift public perception of an institution. My guess is that it's fewer than you'd think – which is precisely why this is worth a try.

May 29

Rise Through the Ranks (RTR)

So, here’s the little budget secret that everyone in higher education tries to hide: it’s called Rise Through the Ranks (RTR).  That’s the name given to the automatic raise professors and librarians get every year, simply based on seniority.  And over the next few years, as we head into genuine zero-budget-increase territory, it’s going to significantly erode institutional purchasing capacity.

Come collective bargaining time, unions and administrations seem to go at it hammer-and-tongs. "We demand a 2.5% annual pay increase!" "No, we can't possibly give more than 1.5%", etc. But what neither side tells the public is that these numbers are on top of the raise that every professor gets for "passing Go" every year. You see, there is the "headline" wage increase, and then there is RTR. The fights you see in public are almost always about the former, and almost never about the latter. The size of RTR varies from place to place; in most recent agreements I've seen, it's between 2 and 3%. So when you hear that a faculty union has settled for 2%, keep in mind that individual faculty members are actually getting increases of 4-5% per year for the length of the agreement.

Some people argue that RTR doesn't matter, because it gets paid for by the savings an institution reaps when a full professor retires and a cheaper assistant professor is hired to replace him/her. It is true, of course, that an institution books a one-time salary saving of about $60K each time a professor retires. But given that the average salary of professors in Canada is about $113,000, an RTR of 2.5% means the cost of RTR per professor is a shade over $2,800 a year. Thus, for retirements to cover the cost of RTR, the retirement rate – that is, the percentage of all professors retiring in a given year – would need to be about 4.7%. In reality, at most Canadian universities, that figure is between 2 and 3%, so retirements only cover about half the cost of RTR. In practice, all those three-year agreements allegedly worth 2%/year actually cost the institution about 3.5% per year.
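
For anyone who wants to check the arithmetic, here it is laid out as a short script. The figures are the illustrative ones used in this post, not data from any particular institution.

```python
# Back-of-the-envelope check of the retirement-offset argument (illustrative figures only)
avg_salary = 113_000        # approximate average professorial salary in Canada
rtr_rate = 0.025            # typical RTR increase, somewhere between 2 and 3%
retirement_saving = 60_000  # rough one-time saving when a retiree is replaced by an assistant professor

rtr_cost_per_prof = avg_salary * rtr_rate                          # ~$2,825 per professor, per year
breakeven_retirement_rate = rtr_cost_per_prof / retirement_saving  # ~4.7% of faculty would need to retire annually

actual_retirement_rate = 0.025                                     # typical Canadian figure, between 2 and 3%
share_covered = actual_retirement_rate / breakeven_retirement_rate # ~53%: retirements cover about half of RTR

print(f"RTR cost per professor: ${rtr_cost_per_prof:,.0f}")
print(f"Break-even retirement rate: {breakeven_retirement_rate:.1%}")
print(f"Share of RTR covered by retirements: {share_covered:.0%}")
```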

You can kind of see why no one wants to talk about RTR. It's in neither the unions' nor the administrations' interest to make it obvious that staff are getting annual salary bumps at twice the rate of inflation; both know there would be a backlash.

Unfortunately, that doesn't cut it anymore, because "2%" agreements are no longer affordable. A three-year "2%" agreement, once RTR is factored in, means labour costs roughly 10% higher at the end of the contract. In most provinces, total income growth over the next three years from operating grants and tuition fees will be 5%, at best. How that gap gets closed is anyone's guess.
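
A quick sketch of that gap, using the rough 3.5% effective annual cost estimated above against 5% total revenue growth (both are approximations from this post, not forecasts):

```python
# Compound the ~3.5% effective annual labour-cost increase over a three-year agreement
effective_annual_cost = 0.035
years = 3
labour_cost_growth = (1 + effective_annual_cost) ** years - 1   # ~10.9%

revenue_growth = 0.05   # assumed total growth in operating grants plus tuition over the same three years

print(f"Labour cost growth over {years} years: {labour_cost_growth:.1%}")
print(f"Revenue growth over the same period:  {revenue_growth:.1%}")
print(f"Gap to be made up:                    {labour_cost_growth - revenue_growth:.1%}")
```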

May 28

Some Developments in Rankings

I was in Warsaw the week before last for the International Rankings Expert Group (IREG) Forum. The forum is designed both for those interested in rankings and for rankers themselves – the principals behind the U.S. News & World Report rankings, the Shanghai Jiao Tong rankings, Germany's CHE rankings, and the Quacquarelli Symonds (QS) rankings are all regular participants. It's always been an interesting place to hear firsthand how rankings are evolving. When it first started, nearly a decade ago, there was a certain degree of rivalry between the main rankers; everyone was eager to prove the superiority of their own methodology. There is much less of that now. Among those who rank, there is an acceptance that there are many different ways to rank, and that the desirability of any given system depends largely on the intended audience and the availability of data (which institutions themselves tend to control). Nowadays, the Forum is interesting as a way to see how people are trying to refine indicators of institutional activity.

The main item of interest in Warsaw, though, was the inauguration of IREG's system of quality certificates. Seven years ago, the IREG group (including me) put together something called the Berlin Principles, a statement of good practice in rankings. About three years ago, IREG moved to turn the Berlin Principles into the basis of a quality assurance system; that is, it would offer to certify rankings systems as being Berlin-compliant. In Asia and Eastern Europe, where rankings have become a de facto method of quality assurance, there seems to be great demand for this kind of external check on rankers.

In any event, the rankings done by the Perspektywy Education Foundation and by QS (its subject rankings) were the first two to volunteer for the process, and their audit groups were chaired by well-known higher education experts, such as Jamil Salmi (ex-head of the Tertiary Education Group at the World Bank) and Tom Parker (ex-Executive Director of the Institute for Higher Education Policy in Washington). The process was essentially about compliance with the Berlin Principles – most notably, the bits involving data integrity. Passing an audit is meant to be the equivalent of an ISO 9000 certification, and both QS and Perspektywy passed, becoming the first to qualify for these certificates.

It remains to be seen, of course, whether this certification will actually mean anything in terms of how people view rankings (will people be more drawn to the QS rankings now that they have an external stamp of approval?). Regardless, this is a reasonably big step forward for rankings generally. Rankers are starting to show the kind of transparency they demand of others, and their attention to quality and sound methodology is being rewarded. That's something everyone should applaud.

May 27

Higher Education Management, Hermit Kingdom-Style

Frabjous day! I have just read one of the great higher education management tracts of all time. I'm speaking, of course, about On Improving Higher Education, by Kim Il Sung (Pyongyang: Foreign Languages Publishing House, 1974).

Don't let Kim's "communist" label fool you – what this guy cared most about was the concept of Juche (self-reliance), which continues to be the underlying ideology of the North's nationalist, quasi-fascist state. As you can imagine, this meant a lot of belt-tightening. As such, Kim's thoughts have mostly to do with things like optimization, efficiency, and austerity. Really, a perfect book for our times.

Pay raises, for instance, are Right Out. "As long as you make an issue out of remuneration, you cannot be a revolutionary," says Kim, righteously noting that nobody paid Marx to write Das Kapital (the fact that Marx died before completing it might have had something to do with that, but no matter). North Korean intellectuals had the privilege of giving lectures and writing books, "and yet they insist on receiving money for this wonderful task," Kim splutters.

Work rules, too, come under serious scrutiny.  Responding to complaints that “university and college professors lecture a thousand hours a year”, which some consider to be too much, Kim is clear: “You are wrong!  Fundamentally speaking, calculating lecture hours is not the attitude of a revolutionary.  If you are true revolutionaries who serve the people, you would never calculate the hours; you try hard by all means to work as much as you can”.

(I make the following offer to university administrations across Canada: if any of you decide to try to outflank your faculty union to the left by telling them their views are evidence of being captive to bourgeois ideology, I'm buying the first round.)

Times were tough in North Korea in the 60s. In order to rebuild the country after the war, there was a need to get engineers into the field quickly, leaving little time for things like, say, a final year of studies. "They say our technicians' qualifications are low, but in fact they are not so low", says Kim. "Our engineers may have graduated a year earlier, but since they have had more training at the actual places of production, they have many merits". Glen Murray couldn't have put it any better.

The same applies to student enrolment, generally.  “Providing so many students with stipends is causing a heavy burden on the state”, he notes.  The solution?  Send them to work, and have them study while working.  If MOOCs had been around in the 60s, you know he’d have been all over them.  Juche, dontcha know.

Really, I can’t recommend this book enough.  A text for our times.

May 24

The Best Idea I’ve Seen All Year

I travel around a fair bit, and I get to see a lot of interesting stuff going on at universities in Canada and abroad. People often ask me: what's the best thing you've seen recently? The answer this year, hands down, is UBC's Start-up Services Voucher.

Now, UBC's been a leader in commercialization and spin-off companies for at least twenty years. They attracted a lot of attention when they created a $10 million Seed Fund, capitalized by donations from alumni and the BC Innovation Council, which was designed to promote entrepreneurship by making early-stage, pre-seed investments in start-ups founded by students or recent alumni.

But more quietly, the university has done something else which I think is much more interesting: about two years ago, it created the Start-up Services Voucher.  If you’re a UBC student, staff, or faculty member, and want to start a business, you’re eligible for up to $5000 worth of business services (though, in practice, most use far less).  And unlike virtually every other entrepreneurship system in Canadian PSE, there are no requirements whatsoever with respect to using UBC technology, nor is there any stipulation that the business be some kind of technology enterprise.  Want to open a flower shop?  This fund’s for you.

There’s no catch.  UBC certainly isn’t interested in equity, for instance.  All they want is recognition.  All companies that move through the program must display a logo declaring themselves as “UBC-affiliated companies” for a period of five years.

How brilliant is that?

First, it creates a great, dense network between an institution and the small businesses in its community (which will no doubt pay off philanthropically, down the road). Second, it allows the institution to get a much better handle on the post-graduation activities of its entrepreneurs, and hence allows UBC to highlight its larger role in job creation and innovation in British Columbia. Frankly, UBC could pay for this out of the Government Relations budget, and it would make complete sense – how great will it be to be able to walk into an MLA's office and rattle off the names of all the new "UBC-affiliated" businesses that have started up in his/her riding?

Students learn a lot in PSE, and not just inside the classroom.  When they start their own businesses, it’s the ultimate expression of the mix of hard, soft, and creative skills that they’ve gained at school, and are now applying in innovative ways.  It’s a huge, practical impact that universities and colleges have on their communities that no one’s ever been able to quantify or publicize.

Until now.  Bravo, UBC.  A great idea that deserves more attention – and some imitators.

May 23

A Rare Piece of Good Policy in Quebec

So, although it wasn't widely noticed at the time, one really excellent piece of policy came out of the crap-fest that was the Quebec Education Summit a couple of weeks ago; it's a policy that deserves a great deal of wider study and emulation. For the first time in Canadian history, a government managed to get rid of a crappy tax credit and use the savings to improve targeted, needs-based subsidies.

Here's what happened. The PQ, during its naked bid to win the affections of students in the run-up to the 2012 election, promised that not only would it rescind the tuition hike imposed by the Liberal government, it would also uphold the generous new student aid package the Liberals had offered as a sweetener. But of course, that meant spending double – so the PQ needed a new source of revenue, at a time when the provincial budget was under pressure.

Enter the tax credits.

Now, if you've been following student aid at all over the last decade, you'll know that Canada went tax-credit crazy around 1998. Mostly, it was a federal thing – a way for the feds to get money to parents for education without the tedious mucking about of negotiating deals with provinces. But some provinces went along for the ride, too. In any case, the value of education tax credits rapidly surpassed the combined value of grants and loan remission.

Total Value of Education Tax Credits vs. Grants & Remission, Canada, 1990-91 to 2009-10, in Billions of 2010 Dollars

Why does this matter? Because tax credits are given out without regard to income or need. And since kids from better-off backgrounds are more likely to go to PSE, tax credit expenditure, in aggregate, mostly ends up in the hands of people from the top two income quartiles. Grants, on the other hand, are more likely to end up in the hands of people from the bottom two quartiles (the reason they don't end up there entirely is that a lot of kids from richer backgrounds get quite a lot of aid once they turn 22 and become "independent" students).

Distribution of Benefits by Income Quartile, for Selected Student Assistance Measures

Source: Usher, A. (2004). Who Gets What? The Distribution of Government Subsidies for Postsecondary Education in Canada. Toronto: Educational Policy Institute.
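
The mechanism behind this skew is easy to illustrate. In the sketch below, the participation rates by quartile are invented round numbers, purely for illustration (the real distribution is what the Usher 2004 figures above show); the point is simply that a flat, universal credit ends up concentrated at the top.

```python
# Purely illustrative: if every student attracts the same flat credit, but PSE participation
# rises with family income, aggregate tax-credit spending skews toward the top quartiles.
# The participation rates below are invented round numbers, not real data.
participation_by_quartile = {"Q1 (lowest)": 0.30, "Q2": 0.40, "Q3": 0.50, "Q4 (highest)": 0.60}
credit_per_student = 1.0   # same flat credit for everyone; units don't matter

spending = {q: rate * credit_per_student for q, rate in participation_by_quartile.items()}
total = sum(spending.values())

for quartile, amount in spending.items():
    print(f"{quartile}: {amount / total:.0%} of tax-credit dollars")
# Under these assumptions, the top two quartiles capture roughly 61% of the spending.
```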

Now, over the past decade, a number of groups have recommended replacing tax credits with grants of some kind – even the CFS, which bizarrely denounces tax credits as regressive despite their having EXACTLY the same redistributive consequences as the tuition cuts the CFS backs so fervently (consistency is not its strong suit). Almost everyone – bar Michael Ignatieff – ignored these calls, essentially on the grounds that Canadians wouldn't stand for what would amount to a tax hike.

Well, now the PQ has proved them wrong.  A government has converted a regressive universal program into a targeted progressive one, to no opposition whatsoever, even in a highly-taxed province.  Policy-makers in the rest of the country should take note.

May 22

Bad Arguments for Basic Research

Last week's announcement that the NRC was "open for business" has, if nothing else, revealed how shockingly weak most of the arguments in favour of "basic" research are.

Opponents of the NRC move have basically taken one of two rhetorical tacks.  The first is to present the switch in NRC mandate as the equivalent of the government abandoning basic science.  This is a bit off, frankly, considering that the government spends billions of dollars on SSHRC, NSERC, CIHR, etc.  Even if you’re passionate about basic research, there are still valid questions to be answered about why we should be paying billions of dollars a year to government departments doing basic research when the granting councils fund universities to ostensibly do the same thing.

The second argument is that government shouldn't support applied science because: a) it's corporate welfare; and b) all breakthroughs ultimately rely on basic science, so we should fund that exclusively. It seems as though those who take this line have never heard of Germany's Fraunhofer-Gesellschaft, a publicly funded network of institutes which does nothing but conduct applied research of direct utility to private enterprises. It's generally seen as a successful and useful complement to the government's investments in basic science through the Max Planck Society, and to my knowledge, Germany has never been accused of being anti-science for creating and funding Fraunhofer.

Another point here: the benefits of “basic” research leak across national borders. Very little of the upstream basic research that drives our economy is Canadian in origin.  So while it’s vitally important that someone, somewhere, puts a lot of money down on risky, non-applied research, individual countries can – and probably should – make some different decisions on basic vs. applied research based on local conditions.

The relative benefit of a marginal dollar investment in applied research vs. basic research depends on the kind of economy a country has, the pattern of firm size, and receptor capacity for research.  It’s not an easy thing to measure accurately – and I’m not suggesting that the current government has based its decision on anything so empirical – but it’s simply not intellectually honest to claim that one is always a better investment than the other.

Opposition to the NRC change is clearly – and probably justifiably – coloured by a more general irritation at a host of this government’s other policies on science and knowledge (Experimental Lakes, long-form census, etc).  But that’s still no excuse for this farrago of flimsy argumentation.  Rational policy-making requires us to engage in something more than juvenile, binary discussions about what kind of research is “best”.

May 21

Post-Graduation Employment

The meme on “underperforming universities” these days revolves around the idea that specific fields of study – usually Bachelor’s degrees in the humanities – do not lead to good jobs.  But this depends in no small measure on what one means by a “good job”, and over what time frame one chooses to measure success.

The graph below shows data from Ontario, six months after graduation. Between 2003 and 2007, the employment rate of graduates in the labour market (i.e. excluding those who chose to continue studying) bounced around between 92 and 94%. In 2009, the rate fell by about seven percentage points, to roughly 86%, more or less equally across all disciplines. Some fields of study were consistently below the average – specifically, fine arts, physical sciences (which seems to include the biological sciences), and engineering. Some fields of study were well above the average, notably education and nursing. Humanities and social sciences ended up halfway between the two.

Employment Rates of Ontario Graduates Six Months After Graduation in Selected Fields of Study, 2003-2009

The science figure is especially interesting, isn’t it?  Makes you wonder why there’s an S in STEM.

Now, some of you will surely be scratching your heads at this point. Aren't STEM graduates supposed to be in high demand? How are science and engineering grads both getting beaten by Arts grads? Three quick answers. The first is that these figures exclude people who have gone back to school (unhelpfully, the Ontario data doesn't tell you how big a number this is). The second is that engineers may take longer over the job search because they are secure in the knowledge that their eventual job will pay pretty well (see below) – the pattern we see after six months is also, broadly, the pattern after twenty-four, as the next chart shows. And the third is that the picture does change a bit after two years.

Employment Rates of Ontario Graduates Two Years After Graduation in Selected Fields of Study, 2003-2009

The classes of 2003 and 2005 had 2-year employment rates of about 96.5%.  That fell to about 95% for the class of 2007, and 93% for the class of 2009.  The fall was concentrated in education, humanities, social science, fine arts, and physical sciences; other disciplines saw less change.

Finally, there is the issue of income. Here you see the real knock on studying in the humanities: it's not that humanities graduates don't get jobs – it's that some of them end up in jobs that don't pay well. Now, their incomes do increase about twice as fast as everyone else's between six months and two years (in the midst of a recession, they jump, on average, by 21%), but they start from a lower base. An interesting point here, which I have made before, is that the difference in outcomes between students in the sciences and the social sciences is negligible.

Income of Ontario University Graduates Six Months and Two Years Out, Selected Fields of Study, Class of 2009

Clearly, jobs aren’t the issue – students of all stripes find work soon enough.  The issue is the rate of return.  We should focus on that.
