HESA

Higher Education Strategy Associates

Category Archives: international

September 30

The Problem with Global Reputation Rankings

I was in Athens this past June, at an EU-sponsored conference on rankings, which included a very intriguing discussion about the use of reputation indicators that I thought I would share with you.

Not all rankings have reputational indicators; the Shanghai (ARWU) rankings, for instance, eschew them completely.  But the QS and Times Higher Education (THE) rankings both weight them pretty heavily (50% for QS, 35% for THE).  And this data isn’t entirely transparent.  THE, which releases its World University Rankings tomorrow, hides the actual reputational survey results for teaching and research by combining each of them with other indicators (THE has 13 indicators, but it only shows 5 composite scores).  The reasons for doing this are largely commercial: if, each September, THE actually showed all the results individually, it wouldn’t be able to reassemble the indicators in a different way and publish an entirely separate “Reputation Rankings” release six months later (with concomitant advertising and event sales) using exactly the same data.  Nor would its data collection partner, Thomson Reuters, be able to sell the data back to institutions as part of its Global Institutional Profiles Project.

Now, I get it: rankers have to cover their (often substantial) costs somehow, and this re-sale of hidden data is one way to do it (disclosure: we at HESA did this with our Measuring Academic Research in Canada ranking).  But given the impact that rankings have on universities, there is an obligation to get this data right.  And the problem is that neither QS nor THE publishes enough information about its reputation survey to allow a real judgement about the quality of the data – and in particular about the reliability of the “reputation” voting.

We know that the THE allows survey recipients to nominate up to 30 institutions as being “the best in the world” for research and teaching, respectively (15 from one’s home continent, and 15 worldwide); the QS allows 40 (20 from one’s own country, 20 worldwide).  But we have no real idea how many people are actually ticking the box for any given university.

In any case, an analyst at an English university recently reverse-engineered the published data for UK universities to work out voting totals.  The resulting estimate is that, among institutions in the 150-200 range of the THE rankings, the average number of votes obtained for either research or teaching is in the range of 30 to 40, at best.  Which is astonishing, really.  Given that reputation counts for a third of an institution’s total score, it means there is enormous scope for year-to-year variation – get 40 votes one year and 30 the next, and significant swings in ordinal rankings could result.  It also makes a complete mockery of the “100 Under 50” rankings, where 85% of institutions rank well below the top 200 in the main rankings, and are therefore likely garnering only a couple of votes apiece.  If true, this is a serious methodological problem.
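To illustrate the scale of the problem, here is a minimal simulation sketch in Python. The expected vote count of 35 is an assumption on my part for illustration, not THE’s actual data; it just shows how much a tally that small bounces around from pure sampling noise.

```python
import numpy as np

# Hypothetical illustration, not THE's actual data: if an institution's
# "true" expected vote count is ~35 per survey round, how much do the
# observed counts vary purely by chance?
rng = np.random.default_rng(42)
true_mean_votes = 35                       # assumed expected votes per round
counts = rng.poisson(true_mean_votes, 10)  # ten simulated survey rounds

print("Simulated yearly vote counts:", counts)
print("Typical year-to-year noise: +/- {:.0f}%".format(
    100 * counts.std() / counts.mean()))
# With means this small, double-digit percentage swings in the raw
# reputation count are routine even when nothing real has changed.
```

Nothing rigorous, but it shows why ordinal positions built on a few dozen votes are inherently jumpy.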

For commercial reasons, it’s unrealistic to expect the THE to completely open the kimono on its data.  But given the ridiculous amount of influence its rankings have, it would be irresponsible of it – especially since it is allegedly a journalistic enterprise – not to at least allow some third party to inspect its data and give users a better sense of its reliability.  To do otherwise reduces the THE’s ranking exercise to sham social science.

September 19

Better Know a Higher Ed System: France

France is one of the original homelands of the university: the University of Paris was the first real university outside the Mediterranean basin, and the country was home to six universities by 1500 – only Italy and Spain had more at the time.  But while its roots are ancient, France also has, in many respects, one of the youngest systems of higher education in Europe, because the entire university system was wiped out during the Revolution, and then rebuilt from scratch during the Napoleonic period that followed.

Unlike virtually every other system on earth, the French do not put universities at the top of the higher education hierarchy.  Instead, there are what are called “les Grandes Écoles”: peak, specialized institutions that operate in only a limited number of fields – the École des Mines and Polytechnique for engineering, l‘École Normale Supérieure for education, and l‘École Nationale d’Administration (ENA) to train the masters of the universe.  Most of these go back two centuries – Polytechnique was an excellent spot for Napoleon to train his gunners – but ENA actually only dates from the 1940s.

One step down in the hierarchy are the big “Instituts”, which serve as the training ground for professions, mainly in technology (IUT), but also in fields like nursing.  Universities, for the most part (medical studies excepted), are widely viewed as the dregs of the system, the catch-all for people not smart enough to make the grandes écoles, or driven enough to do professional studies.  That’s partly because they are bereft of many prestige disciplines, but it’s also because, historically, they are not centres of research.  As with many other European countries (notably Germany and Spain), the public research mission was largely the responsibility of the Centre National de Recherche Scientifique (CNRS), which was not attached to the universities.

Another historical feature of French universities is the degree to which they have been under state control.  Legally, all faculties were part of a single “Université de France” for most of the 19th century.  Universities as we know them – autonomous institutions that pursue their own plans and goals – are fairly recent.  If you’re being generous, they date back to 1968; in practice, the real shift happened in the late 1980s, and they didn’t reach North American levels of autonomy until the loi Pécresse of 2007.  Prior to that, hiring and promotion were essentially all done through the Ministry; curricula, too, were laid down along national lines by expert committees run from Paris.

Recently, international rankings have been a major spur to change.  When the Academic Ranking of World Universities first appeared in 2003, it created the “choc de Shanghai” – the country was genuinely shocked at how weak its institutions were perceived to be.  Much of this was down to system design, of course.  The Grandes Écoles couldn’t compete with American multiversities because they were small, single-discipline institutions, and the universities couldn’t compete because the research was all tied up at the CNRS.  But the French government, instead of standing up and saying “this ranking is irrelevant because our structures are different, and frankly our system of research and innovation works pretty well anyway”, decided to engage in a wild bout of policy-making: excellence initiatives, institutional mergers, etc.  Implicitly, it is all designed to make the system look more American – though, to keep up pretences, the official line is that it’s about becoming “world-class”.

Maybe the most interesting development to watch is what’s going on at Paris Saclay – a campus that brings together roughly two dozen universities and scientific institutions in a single spot.  It’s both a federation of universities and a new independent institution.  The governance arrangements look like a nightmare, but the potential is certainly there for it to become a genuinely European super-university.  It’s not the only new university in the world whose founders dream of hitting the Shanghai Top Ten, but it’s probably the one with the best chance of doing so.

September 05

Better Know a Higher Ed System: New Zealand

We don’t hear much up here about New Zealand higher education, mainly because the country’s tiny, and literally located at the end of the earth.  But that’s a pity, because it’s an interesting system with a lot to like about it.

The country’s university system is pretty ordinary: eight universities, three founded in the 19th century and the rest after WWII.  All of them are built pretty much on English lines, with just one – Auckland – generally considered to be “world-class”.  Rather, what makes New Zealand an interesting higher education system is what happens outside the universities.

About 30 years ago, New Zealand came close to bankruptcy; in response, the government moved to sharply liberalize the economy.  In education, this meant eliminating established educational monopolies, and widening the ability to provide education: anyone who wanted to deliver a degree or a diploma could do so, provided they could meet an independent quality standard.  Polytechnics – equivalent to our colleges – started offering degrees (in the process becoming an inspiration to our own colleges, some of whom proceeded to push for their own degree-granting status, and labelled themselves “polytechnics”), and hundreds of private providers started offering diplomas.  Despite this liberalization, the system is still able to enforce a qualifications framework, which allows people to stack lower-level qualifications towards higher-level ones – and that’s down to having a serious high-quality regulator in the New Zealand Qualifications Authority.

Another major feature of the system is the “wānanga”.  The term is a Maori word indicating traditional knowledge, but in practice it has come to mean “Maori polytechnic” (the country’s universities all use the term “Whare Wānanga” – meaning “place of learning” – to translate their names into Maori).  There are three of these, two of which are tiny (fewer than 500 students), and one of which is freaking massive (38,000 students today, down from a peak of 65,000 ten years ago). I’ll tell you the story of Te Wānanga o Aotearoa another time, because it deserves its own blog post.  But for the moment just keep in mind that in New Zealand, wānangas are considered the fourth “pillar” of higher education (along with universities, polytechnics, and privates), and that these institutions, entirely run by Maori, have had an enormously positive impact on Maori educational attainment rates (see this previous blog for stats on that).

A last point to note about NZ is its international strategy.  Like our government’s, New Zealand’s aims in this area are pretty mercantilist: students in = money in = good.  It could not possibly care less about outward mobility or other touchy-feely stuff.  What distinguishes their strategy from ours, however, is that theirs is smart.  Brilliant, actually.  Take a couple of minutes to compare Canada’s laughably thin and one-dimensional policy with Education New Zealand’s unbelievably detailed set of strategies, goals, and tactics, laid out not just for the country as a whole, but for each of six key sub-sectors: universities, colleges, privates, primary/secondary schools, the English language sector, and educational service/product providers.  That, my friends, is a strategy.  Now ask yourself: why can’t we produce something that good?

In short, there’s a lot Canadians could learn from New Zealand – if only we paid more attention.

June 10

Crazy Managerial Imperatives Around International Students

One of the weirdest – and I mean totally bat-guano-crazy – things in Canadian higher education is the way recruitment of international students is managed.  Although international student recruitment is often seen simply as a money-spinner for institutions, the fact of the matter is that most institutions aren’t coming close to maximizing revenue from this source.  And that’s not because of high-minded motives, either – institutions turning away students they don’t think are suited to their university experience.  It’s simply because of the way institutions’ internal budget controls work.

In a regular business, the sales force gets the budget it needs to hit revenue targets.  And if sales are going well, and management thinks further investment in sales will bring in further revenue, then the sales force gets a bigger budget.  Simple as that.

Compare this to what is happening at many institutions in Canada around international recruitment budgets, where the discussion is more along these lines:

Senior Admin (to international office):  Can you get us more international students?  We could really use the cash.  Budget cuts, you know.

International Office: Um, sure.  But can I have some extra money for that?  Recruitment actually costs money.  Not to mention support once we get them here.

Senior Admin: What?  More money?  Didn’t I just tell you we don’t have any money?

International Office: But… we’ll get our money back and more (NB. There are circumstances where this isn’t true, as described back here, but for the moment let’s assume it is).  You need to spend money to make money.

Senior Admin: Look, there’s a budget freeze on.  If I give non-academic units more money, there’ll be hell to pay.  You know how envious everyone already is that you guys get to fly all over the place?

International Office: (Desperately suppressing the urge to start listing decanal and vice-presidential visits abroad) But you’re not “giving us money”. We generate revenue!

Senior Admin: Yes, but there’s nothing in our budget process that allows us to reflect that.

International Office: (repeatedly bangs head against wall.)

Seriously, this happens.  The idea of investing money to make money later on isn’t entirely foreign (sorry) to universities – but doing so via the international office often isn’t possible.  To a large extent, that’s because of these offices’ historical roots.  Most of them weren’t set up as revenue-generating units – until a decade ago they were mostly busy doing things like checking students’ health insurance, and helping profs on sabbatical deal with accommodation and paperwork.  As a result, they tend to get lumped in with other administrative units like Student Services or Human Resources (which tend to take a hit when times are bad), rather than with revenue units like Advancement (which usually doesn’t).

(As an aside, I’m pretty sure this is one of the reasons international offices turn to agents, rather than building up their own networks abroad; the latter requires upfront investment, while the former just requires paying for students once they arrive – which, as you can imagine, is a lot easier to sell within the bureaucracy.)

If institutions are serious about playing the international game, they need to get serious about how they fund and manage it.  Too many haven’t bothered to do that.

June 05

Articles of Faith

Further to Tuesday’s blog about STEM panics, I note a new report out from Canada 2020 (a young-ish organization with pretensions to be the Liberals’ pet think tank) called Skills and Higher Education in Canada: Towards Excellence and Equity.  Authored by the Conference Board’s Daniel Munro, it covers most of the ground you’d expect in a “touch-all-the-bases” report.  And while the section on equity is pretty good, when it comes to “excellence” this paper – like many before it – draws conclusions based more on faith than on facts.

Take for example, this passage:

Differences in average literacy skills explain 55 per cent of the variation in economic growth among OECD countries since 1960. With very high skills and higher education attainment rates, it is not surprising to find Canada among the most developed and prosperous countries in the world. But with fewer advanced degree-holders (e.g., Masters and PhDs), and weak performance on workplace education and training, it is also not surprising to find that Canada has been lagging key international peers in innovation and productivity growth for many years.

The first sentence is empirically correct, but things head south rapidly from there.  The link between average literacy and growth does not imply, as the second sentence suggests, that higher education attainment rates are a cause of prosperity: Germany and Switzerland do OK with low attainment rates, and Korea’s massive higher education expansion was a consequence, rather than a cause, of economic growth.  The final sentence goes even further, implying specifically that the percentage of the population with advanced degrees is a determinant of productivity growth.  This is flat-out untrue, as the figure below shows.

Figure 1: Productivity vs. PhDs per 100K of population, select OECD countries

Countries in this graph: US, UK, NL, NO, BE, CH, D, SE, FI, O, DK, UK, US, IE, FR, CA, JP

The pattern is one you see in a lot of reports: find a stat linking growth to one particular educational attainment metric, then infer from this that any increase on any educational metric must produce growth.  It sounds convincing, but it usually isn’t true.

It’s the same with STEM.  Munro tells us Canada has a higher share of STEM graduates among its university graduates than the OECD average (true – and something we rarely hear), but also intones gravely that Canada lags “key international competitors” like Finland and Germany – which is simply nonsensical.  Our competitive economic position is in absolutely no way affected by the proportion of STEM grads in Finland (it’s Finland, for God’s sake.  Who cares?).  As for Germany, it has a substantially lower overall university attainment rate than Canada, so our number of STEM grads per capita is still higher than theirs (surely a more plausible metric as far as the economy is concerned).
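A quick back-of-the-envelope illustration of that per-capita point – the numbers below are invented placeholders, not Munro’s or the OECD’s figures:

```python
# Hypothetical numbers, purely for illustration: a country can have a LOWER
# share of STEM among its graduates yet still produce MORE STEM graduates
# per capita if its overall attainment rate is higher.
countries = {
    "Country A (high attainment)": {"grads_per_100_adults": 30, "stem_share": 0.20},
    "Country B (low attainment)":  {"grads_per_100_adults": 18, "stem_share": 0.28},
}

for name, c in countries.items():
    stem_per_100 = c["grads_per_100_adults"] * c["stem_share"]
    print(f"{name}: {stem_per_100:.1f} STEM grads per 100 adults")
# Country A: 6.0 vs Country B: 5.0 -- the per-capita comparison can flip the story.
```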

I don’t want to come across here as picking on Munro, because he’s hardly the only person who makes these kinds of arguments; dubious assumptions like these underpin a lot of Canadian reports on education.  We attempt – for the most part admirably – to benchmark our performance internationally, but then use the results to gee up politicians for action by drawing largely unwarranted conclusions about threats to our competitive position if we aren’t in the top few spots on any given metric.  I don’t doubt these tactics are well-meant (if occasionally a bit cynical), but that doesn’t make them accurate.

The fact is, there are very few proven correlations between attainment metrics in education and economic performance, and even fewer where the arrow of causality runs from education to growth (rather than vice-versa).  If we have productivity problems, are they really related to STEM?  If they are related to STEM, which matters more – increasing STEM PhDs, or improving STEM comprehension among secondary school students?

We have literally no idea.  We have faith that more is better, but little evidence.  And we should be able to do better than that.

May 08

Why (Almost) Everyone Loves International Students (Part 2)

Yesterday, I showed how good international students were for universities’ bottom lines.  But it’s not quite as simple as I made it out to be.  Whether admitting international students makes sense or not depends on four factors:

1)      How much of the income do you get to keep?  In Quebec, international students in “regulated” programs (which include Arts) are worth essentially nothing to institutions because the government claws it all back.  On the other hand, in block-grant provinces (and in Saskatchewan, which is part-formula, part block), international students are basically pure profit.  The only reason to not take international students is if the provincial government might punish you for it, because of fears of crowding out local demand (cf. Alberta).  In most formula-funding provinces, and for Quebec’s unregulated programs, the return is somewhere in-between – institutions can charge what they want for international students, but get zero subsidy for them from the province.

2)      What’s the marginal cost per student?  Remember: marginal, not average.  There is a tendency to think that international students are more financially beneficial in Arts or Business because average costs are lower there than in Science and Engineering.  And while, to some extent, that’s  true, what really matters is how close to capacity each program is.  An extra Engineering student in a class of 29 with a capacity of 30 is actually going to be cheaper than an extra Arts student in a class of 30 with the same capacity, because being the 31st student means starting a new class section, hiring a new instructor, occupying more classroom space, etc.  The problem for most institutions is that they have only the barest notion of what marginal costs are across the institution at any given time.

3)      What’s the cost of recruitment?  At most mid-sized institutions these days, recruitment costs per international student are – all told – in the $6-7K range, once you take agent fees, overhead, and everything else into consideration.  Assuming the student comes for four years and generates $60-80K in fees, that’s pretty good (less so if your school has a problem with international student retention).  But it’s even better if you’re McGill, Toronto, or UBC; with that much brand prestige, you don’t need to spend nearly as much.

4)      What’s the opportunity cost?  Once you know your income and expenses from international students, you can work out your net income by field of study.  But opportunity costs matter, too; your potential earnings from domestic students need to be taken into account.  For most institutions outside the big cities, the answer is “nothing”, because the alternative to an international student is no student at all.  In these cases, the decision to admit international students is obvious.  It gets less obvious where you can gain income from a domestic student.  At that point, you need to work out how net (not gross) income from an international student compares with net income (government grant plus tuition) from a domestic one – a rough sketch of that arithmetic follows below.  At some institutions, in some fields of study, it will sometimes make more sense to enroll a domestic student over an international one.  But it’s close.
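To make the four factors concrete, here is a minimal sketch of the arithmetic in Python.  Every dollar figure is an invented placeholder – real tuition, grants, marginal costs, and recruitment spend vary enormously by province, institution, and program:

```python
# Hypothetical sketch of the net-income comparison described above.
def net_income(tuition, govt_grant, marginal_cost, recruitment_cost=0.0):
    """Net income from one student over the full length of the program."""
    return tuition + govt_grant - marginal_cost - recruitment_cost

# International student: four years at an assumed $20K/year, no provincial
# grant, a one-time ~$6.5K recruitment cost, and an assumed marginal cost.
intl = net_income(tuition=4 * 20_000, govt_grant=0,
                  marginal_cost=4 * 7_000, recruitment_cost=6_500)

# Domestic student in the same (hypothetical) program: regulated tuition
# plus a per-student government grant, same marginal cost.
dom = net_income(tuition=4 * 7_000, govt_grant=4 * 9_000,
                 marginal_cost=4 * 7_000)

print(f"International: ${intl:,.0f}   Domestic: ${dom:,.0f}")
# Which student "wins" depends entirely on the assumed marginal cost, the
# recruitment spend, and what the domestic alternative actually brings in.
```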

Got all that?  Good.  Now go build your strategic enrolment plans.

May 07

Why Everyone Loves International Students (Part 1)

A nice simple post today: why universities are going bananas for international students.

The first figure shows undergraduate tuition fees for international students in each province.  They range from a little under $10,000 in Newfoundland, to just over $25,000 in PEI.  The national average for this period is $18,840; in Ontario it is $23,000.

International Undergraduate Tuition Fees by Province, 2012, in $2013

What’s more, fees for international students have been going up quite steadily for two decades.  Over the last 21 years, fees for international students have risen annually by an average of 4% in real terms (i.e. over and above inflation).
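For a sense of what that rate compounds to over the period, a quick arithmetic check (only the 4% figure comes from the paragraph above; the rest is just compounding):

```python
# Compound 4% real annual growth over 21 years (a hypothetical check, not a
# reproduction of the chart's underlying data).
growth_rate = 0.04
years = 21
multiplier = (1 + growth_rate) ** years
print(f"After {years} years at {growth_rate:.0%} real growth: x{multiplier:.2f}")
# -> roughly x2.28, i.e. international fees a bit more than doubled in
#    constant dollars over the period.
```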

Average International Undergraduate Tuition Fees by Province, 1990-2012, in $2013

And these fee rises seem to have no effect on demand.  Check out the rise in the number of international students.  Is that great or what?  High fees?  Lots of international students.  Raise fees?  MORE international students!

International Student Enrolments, 1992-2011

Does anyone expect universities to turn down that kind of money, from an apparently inexhaustible source?  Especially when the amount they get from government is flat, and tuition is tightly regulated?

OK, yes, the decision to take in international students is, in fact, marginally more complicated than I’m making it out to be here.  I’ll get to that tomorrow.  But the basic case for international students is right there in those three graphs.

Money talks, you know.  Gotta pay the bills.

February 28

Better Know a Higher Ed System: Senegal

Hi all.  I’ve been in Dakar, Senegal this past week, developing a student program here.  Here’s a quick snapshot of the place:

Senegal is home to francophone Africa’s oldest university, l’Université Cheikh Anta Diop (UCAD), sometimes known simply as the University of Dakar.  It’s one of the few institutions on the continent that predates independence.  For a very long time, it was the country’s only university – francophone African countries were slower than anglophone ones to expand higher education opportunities, for reasons I’ll get into shortly – and, in fact, it still accounts for about 90% of enrolments in the public system, and essentially 100% of its prestige programs.

As in most of Africa, Senegal started allowing private universities to operate in the early 1990s.  For a long time, these were few and small.  But then, in the past decade, their numbers shot up, from about 30 in 2000 to around 110 in 2010.  A handful of these – mostly management schools – had the scale to offer quality education, but with an average enrolment of 200 students, the sector as a whole struggles.

The reason francophone Africa was so slow to expand higher education is that national governments couldn’t afford it.  That’s not just because they were poor, but also because the prevailing model involved zero tuition, bursaries for (nearly) all, plus free/subsidized meals and accommodations.  The only way to keep costs down was to keep a tight lid on student numbers.  That was workable until the 80s baby boom started hitting universities fifteen years ago (hence the surge in private university numbers).  Keeping the lid on made even less sense once the effects of universal primary and universal secondary education began to be felt, and the number of university-eligible students grew.

At this point, some bright light in government decided that the way to deal with this problem was to guarantee a university place to everyone with a baccalauréat (the French kind, the one you get after high school).  Financially, this made so little sense that a series of hasty moves followed: tuition fees were introduced, with undergraduates now asked to pay $60/year, master’s students $120, and doctoral students $180.  (For comparison, the privates tend to charge between $1,750 and $2,250/year in fees.)  This provoked a couple of weeks of riots and some burned mini-buses, but the government held firm, and eventually the students paid up and went back to class – which created a new problem because, with the bacc guarantee, there were now far too many students.  UCAD, bursting at the seams, could accept only about 70% of the students it was required to take.

This led the Senegalese government to an innovative policy solution: taking 6,600 first-year students and paying the better private schools to educate them.  In the short term, this works for everyone: UCAD gets some relief in student numbers, the privates get some extra money, and the government gets to keep its promise.  But with first-year student numbers projected to increase by 10-15% per year as far as the eye can see, it’s at best a temporary solution.

The Senegalese government has finally discovered that la gratuité n’est pas rentable – free tuition doesn’t pay.  Future expansion is going to mean more students paying more money, in both the public and private sectors.  And given Senegal’s status as a regional leader in higher education, that could herald the start of major change in higher education policy right across francophone Africa.

February 06

When the Times Higher Education Rankings Fail The Fall-Down-Laughing Test

You may have noted the gradual proliferation of rankings at the Times Higher Education over the last few years.  First the World University Rankings, then the World Reputation Rankings (a recycling of reputation survey data from the World Rankings), then the “100 under 50” (World Rankings, restricted to institutions founded since the early 60s, with a methodological twist to make the results less ridiculous), then the “BRICS Rankings” (World Rankings results, with developed countries excluded, and similar methodological twists).

Between actual rankings, the Times Higher staff can pull stuff out of the database, and turn small bits of analysis into stories.   For instance, last week, the THE came out with a list of the “100 most international” universities in the world.  You can see the results here.  Harmless stuff, in a sense – all they’ve done is take the data from the World University Rankings on international students, foreign faculty, and international research collaborations, and turned it into its own standalone list.  And of course, using those kinds of metrics, geographic and political realities mean that European universities – especially those from the really tiny countries – always come out first (Singapore and Hong Kong do okay, too, for similar reasons).

But when its editors start tweeting – presumably as clickbait – about how shocking it is that only ONE American university (MIT, if it matters to you) makes the top 100, you have to wonder if they’ve started drinking their own Kool-Aid.  Read that list of 100 again: take a look at who’s on it, and think about who’s not.  Taken literally, the THE is saying that places like the National University of Ireland, Maynooth, the University of Tasmania, and King Abdulaziz University are more international than Harvard, Yale, and Stanford.

Here’s the thing about rankings: there’s no way to do validity testing other than what I call the “fall-down-laughing test”.  Like all indicator systems, rankings are meant to proxy reality, not represent it absolutely.  But since there’s no independent standard of “excellence” or “internationalization” in universities, the only way you can determine whether the indicators and their associated weights actually “work” is by testing them in the real world, and seeing if they look “mostly right” to the people who will use them.  In most international ranking systems (including the THE’s), this means ensuring that either Harvard or Stanford comes first: if your ranking comes up with, say, Tufts, or Oslo, or somewhere like that as #1, it fails the fall-down-laughing test, because “everybody knows” Harvard and Stanford are 1-2.

The THE’s ranking of “most international” universities comprehensively fails the fall-down-laughing test. In no world would sane academics agree that Abdulaziz and Maynooth are more international than Harvard.  The only way one could possibly believe this is if one has reached the point of believing that specifically chosen indicators actually *are* reality, rather than proxies for it.  The Times Higher has apparently now gone down that particular rabbit hole.

January 24

Canada’s International Education Strategy – How Did It Get So Bad?

When our Department of Foreign Affairs, Trade and Development (DFATD – not DFAIT as I said a few days ago; sorry) delivers something as bad as our new International Education Strategy, an inquest is in order.  But since self-reflection isn’t exactly an abundant resource in Ottawa at the best of times, it’s an inquest we’re going to have to undertake ourselves.

Let’s start with the document’s basic failures:

  • It talks about increasing enrolment without assessing capacity constraints;
  • It shows no obvious signs of being conversant with international education markets, how students choose their destination countries, or how students subsequently choose a country of residence;
  • It spends an inordinate amount of time talking about discussions with the rarely-before-heard-of “Canadian Consortium for International Education”, which is made up mostly of Ottawa-based industry groups (e.g. AUCC, ACCC, CBIE) who – surprise, surprise – reciprocated by praising the document to the skies, despite its evident thinness.

What, you might wonder, links these points?

It seems clear that the document’s authors valued pleasing the Minister and Ottawa-based education groups more than they valued functioning relationships with provinces and institutions.  That’s a fairly common Ottawa problem.  It’s much easier to work with tame, de-fanged Ottawa interest groups, who will always say “thank you” for a new government policy no matter how silly it is, than to deal with provinces who keep rudely reminding you that education is in fact their jurisdiction.

But that’s too easy an “out”.  Lots of federal departments still talk to their provincial counterparts in a constructive way over areas of shared jurisdiction.  The Canada Student Loans Program, for instance, manages to do this reasonably well – why can’t DFATD do so?

I see three possible reasons.  The first is that the people asked to run with this file were junior, and didn’t know any better.  The second, more likely reason is that Foreign Affairs is too sniffy to talk to mere provinces (“I joined the service to go to Rome, not Regina!”).  But most likely of all is simply that the government doesn’t care enough about this file to do a good job on it.  Partly, that’s down to the current regime, but the culture at DFATD is a culprit, too.  My sense is that international education is a bit of a backwater there; people on the rise don’t stay very long.  Actually doing a good job would require lots of tedious consultation with provinces and institutions, and by the time the file achieved anything that could go on your CV, you’d already have moved on to your next rotation – so why bother?  Better to dash off something quick for an “announceable” than to do the hard work for which someone else will inevitably take credit.

If that’s true, then the problem runs deeper than a single, deeply flawed report; there’s a whole institutional culture standing between us and good policy-making.  And the Ottawa NGOs’ habit of thanking the government any time it announces something, no matter how inane, doesn’t make things better – it just enables the dysfunction.  We need to deal with this.  Soon.
