HESA

Higher Education Strategy Associates

Category Archives: international

November 27

Better Know a Higher Ed System: India (Part 1)

India is a big, crazy, multi-faceted, barely-functioning-but-still-impressive-it’s-functioning-at-all kind of country.  So it shouldn’t come as any surprise that its higher education system is a big, crazy, multi-faceted, barely-functioning-but-still-impressive-it’s-functioning-at-all kind of system.

The indigenous tradition of higher education stretches back to the 6th century AD.  Back then, Nalanda University was a world-centre of (mostly) Buddhist learning, which attracted students from Nepal, China, Southeast Asia, and Tibet.  Nalanda was also the first university with student dorms, and (allegedly) developed the first library cataloguing system.  But since Nalanda was destroyed by the Mamluks in the 12th century, its influence on modern Indian higher education has been zero.  Rather, the roots of the current education system can be traced to a very small number of institutions founded in the mid-nineteenth century by the British.

As in many colonial systems, these universities both bred nationalist revolutionaries, and gave those same revolutionaries an unshakeable belief that the English education system at the time of independence was the pinnacle of human achievement and should never be altered on any account whatsoever.  Which was a bit of a problem, since those institutions were almost entirely Humanities-based, with little by way of Social Sciences, let alone the hard Sciences and Engineering.  That was (almost) OK if you thought of universities primarily as a place for civil service training; it wasn’t even vaguely OK if you wanted to build an advanced economy.

The basic institutional division in contemporary Indian higher education is between “universities”, which can grant degrees, and “colleges”, which cannot; the colleges are all affiliated to universities, meaning that college students take the exams of the affiliated university and receive their credential from there (remember BC’s university colleges in the late-80s/early 90s? Like that).  Colleges don’t get to choose their affiliate university; rather, each university has a geographic “catchment area” in which it has an effective monopoly.

Today, there are roughly 550 universities and 33,000 colleges.  (In case you’re wondering, that works out to an average enrolment per college of about 500, which from an efficiency point of view is madness.)  Most universities are funded by state governments, but the central government directly funds about 40 universities (mainly prestigious ones like Delhi U.).  It also funds another 110 or so “degree-awarding institutions”, which are not technically universities – the world-famous Indian Institutes of Technology (IIT) and Management (IIM) come under this heading.  There are also another 12,000 or so diploma institutions, which, if you squint hard enough, are analogous to our community colleges.

Though India is often thought of as quite statist, its higher education system has a very large private sector – in fact, pretty much the largest in the entire world.  Of those 550 universities, roughly 200 are private, as are about 19,000 of the 33,000 colleges, and 55% of the student body is enrolled in private institutions.  Complicating things still further is the fact that some private universities (mainly ones that were founded before the 1970s) receive quite substantial grants, while others receive nothing; on the flip side, cash-strapped public universities now run a large number of full-cost-recovery programs, and therefore are themselves substantially privately funded.

Managing a system like this is pretty chaotic – all the more so when you have an insane regulatory system, plus conflicting and insistent demands both to focus on access and to improve quality.  But more about that next time.

November 26

Who Owns Internationalization?

One of the first things you realize when studying how institutions deal with the process of internationalization is how fragmented authority actually is in Canadian universities – to the point where you sometimes have to wonder whether anyone’s actually in charge of the whole operation.

Part of the reason for this fragmentation is that internationalization isn’t a single activity, but rather a process that affects a whole range of other activities in which universities normally engage.  To the extent that internationalization is about research connections, it tends to get run through a VP Research office.  To the extent it’s about recruiting students, it’s typically a purpose-built unit reporting to a Provost, but functionally linked (often uncomfortably) to the Admissions office.  To the extent that it’s about attracting foreign faculty, it’s completely ad hoc, and run by departments according to their own needs.

To the extent that internationalization is about creating agreements/MOUs with institutions all over the world, well, that’s a dog’s breakfast, because these agreements don’t all deal with the same issues.  Some are about exchanges, some are about one-way student mobility (e.g., 2+2 agreements), others are about research collaboration, etc. etc.  And because these agreements are a dog’s breakfast, it’s not always clear which bit of the university is in charge.  Sometimes it’s bottom-up: faculty members can propose agreements based around their own research interests; other times it comes from a purpose-built office that may or may not take any account of researchers’ interests.

Now, it’s not quite true to say that “no one’s in charge” of internationalization, because every one of these processes has someone in charge, at least nominally.  Operationally, identifiable people are in charge of recruiting international students, dealing with international student services, etc.  But it’s very rare to see anyone knitting the work of these various processes together into a coherent whole.  That is to say, there is lots of operational authority in internationalization, but very little in the way of strategic authority over internationalization.

In many places, this – remarkably – is seen as a plus.  A lot of people in international policy think “decentralization” is a good thing per se, because operational authority should lie closer to centres of real expertise, rather than being bottled up in a single office somewhere, so that institutions can be nimble in responding to opportunities.  That’s certainly true from the perspective of operational effectiveness, but what has largely been lost is the ability of institutions to steer internationalization policy across the various areas in a common way.  Too often there is no one making sure that what’s being done in international recruitment ties in with what is being done in research collaborations, or international mobility agreements, and so forth.

Where institutional coherence is abandoned, “internationalization” can thus look a lot like an excuse for administrators to swan around the world to no discernible purpose.  That, in turn, pushes faculty cynicism about internationalization well above the general level of skepticism about administration.

All of which is to say: high-quality internationalization requires someone to steer all the various activities in a common, self-reinforcing manner.  Institutions don’t need to create a VP of Internationalization to achieve this; in many cases, a Provost or Vice-Provost could do just as good a job, depending on institutional culture and current priorities (the occasional support of an engaged President doesn’t hurt, either). But what is needed is sustained attention from someone who has the clout to demand some policy coherence.  Unfortunately, this is precisely what’s lacking on many campuses.

September 30

The Problem with Global Reputation Rankings

I was in Athens this past June, at an EU-sponsored conference on rankings, which included a very intriguing discussion about the use of reputation indicators that I thought I would share with you.

Not all rankings have reputational indicators; the Shanghai (ARWU) rankings, for instance, eschew them completely.  But the QS and Times Higher Education (THE) rankings both weight them pretty heavily (50% for QS, 35% for THE), and this data isn’t entirely transparent.  THE, which releases its World University Rankings tomorrow, hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).  The reasons for doing this are largely commercial; if, each September, THE actually showed all the results individually, it wouldn’t be able to reassemble the indicators in a different way to have an entirely separate “Reputation Rankings” release six months later (with concomitant advertising and event sales) using exactly the same data.  Also, its data collection partner, Thomson Reuters, wouldn’t be able to sell the data back to institutions as part of its Global Institutional Profiles Project.

Now, I get it: rankers have to cover their (often substantial) costs somehow, and this re-sale of hidden data is one way to do it (disclosure: we at HESA did this with our Measuring Academic Research in Canada ranking).  But given the impact that rankings have for universities, there is an obligation to get this data right.  And the problem is that neither QS nor THE publishes enough information about its reputation survey to make a real judgement about the quality of the data – and in particular about the reliability of the “reputation” voting.

We know that the THE allows survey recipients to nominate up to 30 institutions as being “the best in the world” for research and teaching, respectively (15 from one’s home continent, and 15 worldwide); the QS allows 40 (20 from one’s own country, 20 world-wide).  But we have no real idea about how many people are actually ticking the boxes on each university.

In any case, an analyst at an English university recently reverse-engineered the published data for UK universities to work out voting totals.  The resulting estimate is that, among institutions in the 150-200 range of the THE rankings, the average number of votes obtained for either research or teaching is in the range of 30-to-40, at best.  Which is astonishing, really.  Given that reputation counts for one-third of an institution’s total score, it means there is enormous scope for year-to-year variation – get 40 one year and 30 the next, and significant swings in ordinal rankings could result.  It also makes a complete mockery of the “Top Under 50” rankings, where 85% of institutions rank well below the top 200 in the main rankings, and therefore are likely only garnering a couple of votes apiece.  If true, this is a serious methodological problem.
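To see why vote counts that small are a problem, here is a toy simulation.  All the numbers are hypothetical, and the normal approximation to Poisson-style sampling noise is my own assumption – nothing here comes from THE’s actual data:

```python
import math
import random

random.seed(42)
MEAN = 35.0            # the analyst's estimated average vote count, ranks 150-200
SD = math.sqrt(MEAN)   # Poisson-like sampling noise, normal approximation
TRIALS = 10_000

# With no real change in reputation at all, how often would a single
# institution's vote count move by 10 or more between two survey years?
swings = sum(
    abs(random.gauss(MEAN, SD) - random.gauss(MEAN, SD)) >= 10
    for _ in range(TRIALS)
)
print(f"{swings / TRIALS:.0%} of year-to-year draws differ by 10+ votes")
```

Under these assumptions, pure sampling noise alone produces a 10-vote swing – the 40-one-year, 30-the-next scenario – in roughly a fifth to a quarter of cases, with no underlying change in reputation whatsoever.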

For commercial reasons, it’s probably too much to expect the THE to completely open the kimono on its data.  But given the ridiculous amount of influence its rankings have, it would be irresponsible of it – especially since it is allegedly a journalistic enterprise – not to at least allow some third party to inspect its data and give users a better sense of its reliability.  To do otherwise reduces the THE’s ranking exercise to sham social science.

September 19

Better Know a Higher Ed System: France

France is one of the original homelands of the university: the University of Paris was the first real university outside the Mediterranean basin, and the country was home to six universities by 1500 – only Italy and Spain had more at the time.  But while it has quite ancient roots, France is also, in many respects, home to one of the youngest systems of higher education in Europe, because the entire university system was wiped out during the Revolution, and then developed again from scratch during the Napoleonic period that followed.

Unlike virtually every other system on earth, the French do not put universities at the top of the higher education hierarchy.  Instead, there are what are called “les Grandes Écoles”: peak, specialized institutions that only operate in a limited number of fields – the École des Mines and Polytechnique for Engineering, l’École Normale Supérieure for Education, and l’École Nationale d’Administration to train the masters of the universe.  Most of these go back two centuries – Polytechnique was an excellent spot for Napoleon to train his gunners – but the ENA actually only dates from the 1940s.

One step down in the hierarchy are the big “Instituts”, which serve as the training ground for the professions, mainly in technology (the IUTs), but also in fields like nursing.  Universities, for the most part (medical studies excepted), are widely viewed as the dregs of the system: the catch-all for people not smart enough to make the grandes écoles, or driven enough to do professional studies.  That’s partly because they are bereft of many prestige disciplines, but it’s also because, historically, they are not centres of research.  As in many other European countries (notably Germany and Spain), the public research mission was largely the responsibility of the Centre National de la Recherche Scientifique (CNRS), which was not attached to the universities.

Another historical feature of French universities is the degree to which they have been under state control.  Legally, all faculties were part of a single “Université de France” for most of the 19th century.  Universities as we know them – autonomous institutions that pursue their own plans and goals – are fairly recent.  If you’re being generous, they date back to 1968; in fact, they didn’t reach North American levels of autonomy until the loi Pécresse in 2007 – in practice, though, the shift happened in the late 1980s.  Prior to that, hiring and promotion were essentially all done through the Ministry; curricula were also laid down on national lines by expert committees run from Paris.

Recently, international rankings have been a major spur to change.  When the Academic Ranking of World Universities first appeared in 2003, it created the “choc de Shanghai” – the country was genuinely shocked at how weak its institutions were seen to be.  Much of this was down to system design, of course.  The Grandes Écoles couldn’t compete with American multiversities because they were small, single-discipline institutions, and the universities couldn’t compete because the research was all tied up at the CNRS.  But the French government, instead of standing up and saying “this ranking is irrelevant because our structures are different, and frankly our system of research and innovation works pretty well anyway”, decided to engage in a wild bout of policy-making: excellence initiatives, institutional mergers, etc.  It’s all implicitly designed to make the system look more American; though to keep up pretences, if anyone asks, it’s actually about being “world-class”.

Maybe the most interesting development to watch is what’s going on at Paris Saclay – a campus that brings together roughly two dozen universities and scientific institutions in a single spot.  It’s both a federation of universities and a new independent institution.  The governance arrangements look like a nightmare, but the potential is certainly there for it to become a genuinely European super-university.  It’s not the only new university in the world whose founders dream of hitting the Shanghai Top Ten, but it’s probably the one with the best chance of doing so.

September 05

Better Know a Higher Ed System: New Zealand

We don’t hear much up here about New Zealand higher education, mainly because the country’s tiny, and literally located at the end of the earth.  But that’s a pity, because it’s an interesting system with a lot to like about it.

The country’s university system is pretty ordinary: eight universities, three of which were founded in the 19th century, and the rest founded after WWII. All of them are pretty much based on English lines, with just one – Auckland – generally considered to be “world-class”.  Rather, what makes New Zealand an interesting higher education system is what happens outside the universities.

About 30 years ago, New Zealand came close to bankruptcy; in response, the government moved to sharply liberalize the economy.  In education, this meant eliminating established educational monopolies, and widening the ability to provide education: anyone who wanted to deliver a degree or a diploma could do so, provided they could meet an independent quality standard.  Polytechnics – equivalent to our colleges – started offering degrees (in the process becoming an inspiration to our own colleges, some of whom proceeded to push for their own degree-granting status, and labelled themselves “polytechnics”), and hundreds of private providers started offering diplomas.  Despite this liberalization, the system is still able to enforce a qualifications framework, which allows people to stack lower-level qualifications towards higher-level ones – and that’s down to having a serious high-quality regulator in the New Zealand Qualifications Authority.

Another major feature of the system is the “wānangas”.  The term is a Maori word indicating traditional knowledge, but in practice it has come to mean “Maori polytechnic” (the country’s universities all use the term “Whare Wānanga” – meaning “place of learning” – to translate their names into Maori).  There are three of these, two of which are tiny (fewer than 500 students), and one of which is freaking massive (38,000 today, down from a peak of 65,000 ten years ago).  I’ll tell you the story of Te Wānanga o Aotearoa another time, because it deserves its own blog post.  But for the moment, just keep in mind that in New Zealand, wānangas are considered the fourth “pillar” of higher education (along with universities, polytechnics, and privates), and that these institutions, entirely run by Maori, have had an enormously positive impact on Maori educational attainment rates (see this previous blog for stats on that).

A last point to note about NZ is its international strategy.  Like our government’s, New Zealand’s aims in this area are pretty mercantilist: students in = money in = good.  It could not possibly care less about outward mobility or other touchy-feely stuff.  What distinguishes their strategy from ours, however, is that theirs is smart.  Brilliant, actually.  Take a couple of minutes to compare Canada’s laughably thin and one-dimensional policy with Education New Zealand’s unbelievably detailed set of strategies, goals, and tactics, laid out not just for the country as a whole, but for each of six key sub-sectors: universities, colleges, privates, primary/secondary schools, the English-language sector, and educational service/product providers.  That, my friends, is a strategy.  Now ask yourself: why can’t we produce something that good?

In short, there’s a lot Canadians could learn from New Zealand – if only we paid more attention.

June 10

Crazy Managerial Imperatives Around International Students

One of the weirdest – and I mean totally bat-guano-crazy – things in Canadian higher education is the way recruitment of international students is managed.  Although international student recruitment is often seen simply as a money-spinner for institutions, the fact of the matter is that most institutions aren’t coming close to maximizing revenue from this source.  And that’s not because of any high-minded motive, like institutions turning away students they don’t think are suited to their university experience, either.  It’s simply because of the way institutions’ internal budget controls work.

In a regular business, sales forces get the budget they need to hit revenue targets.  But if sales are going well, and management thinks they can get more money by investing money in sales, then the sales force will get more money.  Simple as that.

Compare this to what is happening at many institutions in Canada around international recruitment budgets, where the discussion is more along these lines:

Senior Admin (to international office):  Can you get us more international students?  We could really use the cash.  Budget cuts, you know.

International Office: Um, sure.  But can I have some extra money for that?  Recruitment actually costs money.  Not to mention support once we get them here.

Senior Admin: What?  More money?  Didn’t I just tell you we don’t have any money?

International Office: But… we’ll get our money back and more (NB. There are circumstances where this isn’t true, as described back here, but for the moment let’s assume it is).  You need to spend money to make money.

Senior Admin: Look, there’s a budget freeze on.  If I give non-academic units more money, there’ll be hell to pay.  You know how envious everyone already is that you guys get to fly all over the place?

International Office: (Desperately suppressing the urge to start listing decanal and vice-presidential visits abroad) But you’re not “giving us money”. We generate revenue!

Senior Admin: Yes, but there’s nothing in our budget process that allows us to reflect that.

International Office: (repeatedly bangs head against wall.)

Seriously, this happens.  The idea of investing money to make money later on isn’t entirely foreign (sorry) to universities – but doing so via the international office often isn’t possible.  To a large extent, that’s because of these offices’ historical roots.  Most of them weren’t set up as revenue-generating units – until a decade ago they were mostly busy doing things like checking students’ health insurance, and helping profs on sabbatical deal with accommodation and paperwork.  As a result, they tend to get lumped in with other administrative units like Student Services or Human Resources (which tend to take a hit when times are bad), rather than with revenue units like Advancement (which usually doesn’t).

(As an aside, I’m pretty sure this is one of the reasons international offices turn to agents, rather than building up their own networks abroad; the latter requires upfront investment, while the former just requires paying for students once they arrive – which, as you can imagine, is a lot easier to sell within the bureaucracy.)

If institutions are serious about playing the international game, they need to get serious about how they fund and manage it.  Too many haven’t bothered to do that.

June 05

Articles of Faith

Further to Tuesday’s blog about STEM panics, I note a new report from Canada 2020 (a young-ish organization with pretensions to be the Liberals’ pet think tank) called Skills and Higher Education in Canada: Towards Excellence and Equity.  Authored by the Conference Board’s Daniel Munro, it covers most of the ground you’d expect in a “touch-all-the-bases” report.  And while the section on equity is pretty good, when it comes to “excellence” this paper – like many before it – draws some conclusions based more on faith than facts.

Take for example, this passage:

Differences in average literacy skills explain 55 per cent of the variation in economic growth among OECD countries since 1960. With very high skills and higher education attainment rates, it is not surprising to find Canada among the most developed and prosperous countries in the world. But with fewer advanced degree-holders (e.g., Masters and PhDs), and weak performance on workplace education and training, it is also not surprising to find that Canada has been lagging key international peers in innovation and productivity growth for many years.

The first sentence is empirically correct, but things head south rapidly from there.  High average literacy does not imply, as the second sentence suggests, that higher education attainment rates are a cause of prosperity; Germany and Switzerland do OK with low attainment rates, and Korea’s massive higher education expansion was a consequence rather than a cause of economic growth.  The final sentence goes even further, implying specifically that the percentage of the population with advanced degrees is a determinant of productivity growth.  This is flat-out untrue, as the figure below shows.

Figure 1: Productivity vs. PhDs per 100K of population, select OECD countries


Countries in this graph: US, UK, NL, NO, BE, CH, D, SE, FI, O, DK, IE, FR, CA, JP

The pattern is one you see in a lot of reports: find a stat linking growth to one particular educational attainment metric, then infer from this that any increase on any educational metric must produce growth.  It sounds convincing, but it usually isn’t true.

It’s the same with STEM.  Munro tells us Canada has a higher share of STEM graduates among its university graduates than the OECD average (true – and something we rarely hear), but also intones gravely that Canada lags “key international competitors” like Finland and Germany – which is simply nonsensical.  Our competitive economic position is in absolutely no way affected by the proportion of STEM grads in Finland (it’s Finland, for God’s sake.  Who cares?); as for Germany, it has a substantially lower overall university attainment rate than Canada, so our number of STEM grads per capita is still higher than theirs (which is surely a more plausible metric as far as the economy is concerned).

I don’t want to come across here as picking on Munro, because he’s hardly the only person who makes these kinds of arguments; such dubious assumptions underpin a lot of Canadian reports on education.  We attempt – for the most part admirably – to benchmark our performance internationally, but then use the results to gee up politicians for action by drawing largely unwarranted conclusions about threats to our competitive position if we aren’t in the top few spots of any given metric.  I don’t doubt these tactics are well-meant (if occasionally a bit cynical), but that doesn’t make them accurate.

The fact is, there are very few proven correlations between attainment metrics in education and economic performance, and even fewer where the arrow of causality runs from education to growth (rather than vice-versa).  If we have productivity problems, are they really related to STEM?  If they are related to STEM, which matters more – increasing STEM PhDs, or improving STEM comprehension among secondary school students?

We have literally no idea.  We have faith that more is better, but little evidence.  And we should be able to do better than that.

May 08

Why (Almost) Everyone Loves International Students (Part 2)

Yesterday, I showed how good international students were for universities’ bottom lines.  But it’s not quite as simple as I made it out to be.  Whether admitting international students makes sense or not depends on four factors:

1) How much of the income do you get to keep?  In Quebec, international students in “regulated” programs (which include Arts) are worth essentially nothing to institutions, because the government claws it all back.  On the other hand, in block-grant provinces (and in Saskatchewan, which is part-formula, part block-grant), international students are basically pure profit.  The only reason not to take international students is if the provincial government might punish you for it, because of fears of crowding out local demand (cf. Alberta).  In most formula-funding provinces, and for Quebec’s unregulated programs, the return is somewhere in-between – institutions can charge what they want for international students, but get zero subsidy for them from the province.

2) What’s the marginal cost per student?  Remember: marginal, not average.  There is a tendency to think that international students are more financially beneficial in Arts or Business, because average costs are lower there than in Science and Engineering.  And while, to some extent, that’s true, what really matters is how close to capacity each program is.  An extra Engineering student in a class of 29 with a capacity of 30 is actually going to be cheaper than an extra Arts student in a class of 30 with the same capacity, because being the 31st student means starting a new class section, hiring a new instructor, occupying more classroom space, etc.  The problem for most institutions is that they have only the barest notion of what marginal costs are across the institution at any given time.

3) What’s the cost of recruitment?  At most mid-sized institutions these days, recruitment costs per international student are – all told – in the $6-7K range, once you take agent fees, overhead, and everything else into consideration.  Assuming the student is coming for four years and is going to generate $60-80K in fees, that’s pretty good (less so if your school has a problem with international student retention).  But it’s even better if you’re McGill, Toronto, or UBC; with so much brand prestige, you don’t need to spend as much.

4) What’s the opportunity cost?  Now that you know your income and expenses from international students, you can work out what your net income is by field of study.  But opportunity costs matter, too; your potential earnings from domestic students need to be taken into account.  For most institutions outside the big cities, the answer is “nothing”, because the alternative to an international student is no student at all.  In these cases, the decision to admit international students is obvious.  It gets less obvious where you can gain income from a domestic student.  At that point, you need to work out how net (not gross) income from an international student compares with net income from a domestic one (government grant plus tuition fees).  At some institutions, in some fields of study, it will sometimes make more sense to enroll a domestic student over an international one.  But it’s close.
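For what it’s worth, the four factors above can be sketched as a single back-of-envelope calculation.  Every number below is hypothetical, invented purely for illustration, and the function is my own sketch, not any institution’s actual budget model:

```python
def net_income_per_intl_student(
    gross_fees=20_000,        # annual international tuition (hypothetical)
    clawback_rate=0.0,        # factor 1: share recaptured by the province
                              # (effectively 1.0 in Quebec's regulated programs)
    marginal_cost=2_000,      # factor 2: cost of one more seat *if* capacity exists
    new_section_cost=15_000,  # factor 2: cost if the student triggers a new section
    at_capacity=False,
    recruitment_cost=1_600,   # factor 3: ~$6.5K total, spread over four years
    domestic_alternative=0,   # factor 4: net income forgone from a domestic student
):
    income = gross_fees * (1 - clawback_rate)
    cost = (new_section_cost if at_capacity else marginal_cost) + recruitment_cost
    return income - cost - domestic_alternative

# Spare capacity, block-grant province, no domestic student waiting: clearly worth it
print(net_income_per_intl_student())  # 16400

# Program at capacity, and a domestic student was available instead: a net loss
print(net_income_per_intl_student(at_capacity=True, domestic_alternative=8_000))  # -4600
```

The point of the sketch is simply that the sign of the answer flips depending on capacity and the domestic alternative – which is why the averages in yesterday’s post don’t settle the question on their own.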

Got all that?  Good.  Now go build your strategic enrolment plans.

May 07

Why Everyone Loves International Students (Part 1)

A nice simple post today: why universities are going bananas for international students.

The first figure shows undergraduate tuition fees for international students in each province.  They range from a little under $10,000 in Newfoundland to just over $25,000 in PEI.  The national average for this period is $18,840; in Ontario it is $23,000.

International Undergraduate Tuition Fees by Province, 2012, in $2013


What’s more, fees for international students have been going up quite steadily for two decades.  Over the last 21 years, fees for international students have risen annually by an average of 4% in real terms (i.e. over and above inflation).
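That 4% figure compounds quickly; a one-line check, using the post’s own numbers, shows what it implies over the full period:

```python
# 4% real annual growth, compounded over the 21-year period:
growth = 1.04 ** 21
print(f"multiplier over 21 years: {growth:.2f}")  # ≈ 2.28
```

In other words, international fees have more than doubled in real terms over those two decades.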

Average International Undergraduate Tuition Fees by Province, 1990-2012, in $2013


And these fee rises seem to have no effect on demand.  Check out the rise in the number of international students.  Is that great or what?  High fees?  Lots of international students.  Raise fees?  MORE international students!

International Student Enrolments, 1992-2011


Does anyone expect universities to turn down that kind of money, from an apparently inexhaustible source?  Especially when the amount they get from government is flat, and tuition is tightly regulated?

OK, yes, the decision to take in international students is, in fact, marginally more complicated than I’m making it out to be here.  I’ll get to that tomorrow.  But the basic case for international students is right there in those three graphs.

Money talks, you know.  Gotta pay the bills.

February 28

Better Know a Higher Ed System: Senegal

Hi all.  I’ve been in Dakar, Senegal this past week, developing a student program here.  Here’s a quick snapshot of the place:

Senegal is home to francophone Africa’s oldest university, l’Université Cheikh Anta Diop (UCAD), sometimes known simply as the University of Dakar.  It’s one of the few institutions on the continent that predate independence.  For a very long time, it was the country’s only university – francophone African countries were slower to expand higher education opportunities than anglophone ones, for reasons I’ll get into shortly – and, in fact, it still accounts for about 90% of enrolments in the public system, and essentially 100% of its prestige programs.

As in most of Africa, Senegal started allowing private universities to operate in the early 1990s.  For a long time, these were few and small.  But then, in the past decade, their numbers shot up, from about 30 in 2000 to around 110 in 2010.  A handful of these – mostly management schools – had the scale to offer quality education, but with an average enrolment of 200 students, the sector as a whole struggles.

The reason francophone Africa was so slow to expand higher education is that national governments couldn’t afford it.  That’s not just because they were poor, but also because the prevailing model involved zero tuition, bursaries for (nearly) all, plus free/subsidized meals and accommodations.  The only way to keep costs down was to slam tight the lid on student numbers.  That was workable until the 80s baby boom started hitting universities fifteen years ago (hence the surge in private university numbers).  It made even less sense once the effects of universal primary and universal secondary education began to be felt, and the number of university-eligible students grew.

At this point, some bright light in government decided that the way to deal with this problem was to guarantee a university education to everyone with a baccalauréat (the French kind, the one you get after high school).  Financially, this made so little sense that a series of hasty moves followed: tuition fees were implemented, with undergraduates now asked to pay $60/year, master’s students $120, and doctoral students $180.  (For comparison, the privates tend to charge between $1,750 and $2,250/year in fees.)  This provoked a couple of weeks of riots and some burned mini-buses, but the government held firm, and eventually the students paid up and went back to class – although this turned out to be a problem because, with the bacc guarantee, there were now far too many students.  UCAD, bursting at the seams, could accept only about 70% of the students it was required to take.

This led the Senegalese government to an innovative policy solution: namely, taking 6,600 first-year students, and paying the better private schools to educate them.  In the short-term, this works for everyone: UCAD gets some relief in student numbers, privates get some extra money, and government gets to keep its promise.  But with first year student numbers projected to increase by 10-15% per year as far as the eye can see, it’s at best a temporary solution.

The Senegalese government has finally discovered that la gratuité n’est pas rentable – free education doesn’t pay.  Future expansion is going to mean more students paying more money, in both the public and private sectors.  Given Senegal’s status as a regional leader in higher education, this could herald the start of major change in higher education policy right across francophone Africa.
