HESA

Higher Education Strategy Associates

October 02

Too Big to Fail?

Here’s a serious question: are universities too big to fail?  And if so, what are the consequences of that?

If we had a fully public system, with tight government oversight on budgets, and no deficit spending – sort of like what much of continental Europe has – this wouldn’t be an issue.  By definition, public institutions couldn’t fail (though presumably a government would be free to close an institution should it wish to do so).   But the existence of institutional financial autonomy changes things.  Who, ultimately, bears the responsibility for an institution’s finances?  What happens if things go wrong?

One of the interesting things about the very market-driven reforms in both Australia and England is the assumption that universities are, at the end of the day, on their own.  They can have their freedom – especially to raise fees – but the flip side is that no one’s going to bail them out, either.  That might sound a little crazy to Canadians, who tend to think of campuses as next to immortal, but the fact is they’re not.  If people don’t attend universities, they go under.  It happens all the time with international campuses (think of Waterloo’s venture in the Emirates), and there’s a long history of private universities going under in the US (think Antioch College’s 2008 flame-out).  Even in Canada, we’ve had closures (or rather, closures thinly disguised as mergers – Augustana College in Alberta being the most obvious).

The Canadian view here reflects a very deep ambivalence on the part of governments about the role of markets in higher education.  By and large, governments love it – LOVE IT! – when entrepreneurial universities make money through licensing, or get foreign students to cough up cash for courses, because that means institutions are less likely to come banging on government’s door for money.  They are significantly less enthusiastic about universities getting local students to cough up money for courses, because that means parents come banging on their doors.  And they are positively allergic to the idea that a campus might lose serious money doing something entrepreneurial – or worse, try to cut services in order to make ends meet.

As a result, Canadian governments laud entrepreneurial universities in public, but stifle them in private.  And the reason for this is that governments are unwilling to risk the public opprobrium that would come from a major campus closure.  At the end of the day, governments know they would be forced to step in because universities really are too big to fail.

Only one province is actually honest about the implication of this: British Columbia.  There, precisely because the government feels ultimately responsible for institutional survival, institutions do not have the freedom either to build their own buildings (even if they raise their own cash) or to negotiate independently with staff (the province insists that settlements meet specific public-sector-wide guidelines).  My guess is that over time, most Canadian governments will converge on the BC model because institutions are, in fact, too big to fail, and as a people, we’re scared of failure. But that policy is not without cost: making universities safe can also make it more difficult for them to excel.

It would be great if one day we had a debate where we talked honestly about the pros and cons of institutional autonomy and markets in higher education, instead of fumbling around addressing symptoms rather than causes, and dealing in slogans rather than empirics.  But that wouldn’t be very Canadian, would it?

October 01

A Venn Diagram About Skills Gaps

Short and sweet today, folks, as I know you’re all busy.

We’ve done a lot of research over the years at HESA Towers.  We read up on what employers want – and we also do studies that look at how recent graduates fare in the labour market, and what they wish they’d had more of while in university.  And pretty much without exception, regardless of field of study, those two sources agree on what students need to be better prepared for the labour market.

[Figure: Venn diagram (image not recovered) of what employers want vs. what graduates wish they’d had – the overlap being the basic business skills discussed below]

So, want to give your grads a boost in the labour market?  Figure out how to give them those basic business skills.  Experiential learning is probably the most effective way to do it, but there are other ways, as well, both inside and outside the classroom.

It’s that simple.  Well, not simple at all really.  But at least the problem is well-defined.

September 30

The Problem with Global Reputation Rankings

I was in Athens this past June, at an EU-sponsored conference on rankings, which included a very intriguing discussion about the use of reputation indicators that I thought I would share with you.

Not all rankings have reputational indicators; the Shanghai (ARWU) rankings, for instance, eschew them completely.  But the QS and Times Higher Education (THE) rankings both weight them pretty heavily (50% for QS, 35% for THE).  Yet this data isn’t entirely transparent.  THE, which releases its World University Rankings tomorrow, hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).  The reasons for doing this are largely commercial: if, each September, THE actually showed all the results individually, it wouldn’t be able to reassemble the indicators in a different way to produce an entirely separate “Reputation Rankings” release six months later (with concomitant advertising and event sales) using exactly the same data.  Also, its data collection partner, Thomson Reuters, wouldn’t be able to sell the data back to institutions as part of its Global Institutional Profiles Project.

Now, I get it: rankers have to cover their (often substantial) costs somehow, and this re-sale of hidden data is one way to do it (disclosure: we at HESA did this with our Measuring Academic Research in Canada ranking).  But given the impact that rankings have on universities, there is an obligation to get this data right.  And the problem is that neither QS nor THE publishes enough information about its reputation survey to permit a real judgement about the quality of the data – and in particular, about the reliability of the “reputation” voting.

We know that the THE allows survey recipients to nominate up to 30 institutions as being “the best in the world” for research and teaching, respectively (15 from one’s home continent, and 15 worldwide); the QS allows 40 (20 from one’s own country, 20 worldwide).  But we have no real idea how many people are actually ticking the boxes for each university.

In any case, an analyst at an English university recently reverse-engineered the published data for UK universities to work out voting totals.  The resulting estimate is that, among institutions in the 150-200 range of the THE rankings, the average number of votes obtained for either research or teaching is in the range of 30 to 40, at best.  Which is astonishing, really.  Given that reputation counts for one third of an institution’s total score, it means there is enormous scope for year-to-year variation – get 40 one year and 30 the next, and significant swings in ordinal rankings could result.  It also makes a complete mockery of the “Top Under 50” rankings, where 85% of institutions rank well below the top 200 in the main rankings, and are therefore likely garnering only a couple of votes apiece.  If true, this is a serious methodological problem.
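To get a feel for how much noise vote counts that small can generate, here is a minimal simulation.  It is purely illustrative – THE’s actual scoring formula is not public – and simply assumes an institution’s “true” level of support averages 35 votes a year, with each year’s tally an independent random draw around that mean:

```python
import random

# Purely illustrative: assume an institution's "true" support averages
# 35 votes per year, and each year's tally is a random draw around that
# mean (approximately Poisson). This is NOT THE's actual methodology.
random.seed(1)

def simulate_votes(mean=35, years=10, voters=10000):
    """One binomial draw per year: `voters` respondents, each with a
    small probability of ticking this institution's box."""
    p = mean / voters
    return [sum(random.random() < p for _ in range(voters))
            for _ in range(years)]

votes = simulate_votes()
print("votes by year:", votes)
print("min:", min(votes), " max:", max(votes))
```

Even with nothing changing on the ground, the yearly tallies wander well away from 35 in both directions – exactly the kind of movement the rankings would register as genuine reputational change.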

For commercial reasons, no one can expect the THE to completely open the kimono on its data.  But given the ridiculous amount of influence its rankings have, it would be irresponsible of it – especially since it is allegedly a journalistic enterprise – not to at least allow some third party to inspect its data and give users a better sense of its reliability.  To do otherwise reduces the THE’s ranking exercise to sham social science.

September 29

Differentiation and Branding From the Student Perspective

One question that always comes up (or should come up, anyway) in discussions of university branding and positioning is: “how different is our institution, really?”  Well, for a few years, when we ran the Globe and Mail Canadian University Report survey, we used to ask students questions that allowed us to see how distinctive students thought their university was.  The results were… interesting.

We asked students to locate their institution on an 11-point double-ended scale.  Did they think their institution was more focussed on graduate students or undergraduates?  Was it focussed on a couple of key fields of study, or more spread out?  Focussed on global issues or local ones?  Was it open to new ideas, or cautious about them? Did students have to be self-sufficient, or was the institution a nurturing one?  Was the curriculum more theoretical or applied? Was the student body homogeneous or diverse?  And, broadly, was it professor-centred or student-centred?  We could then plot their answers on a spidergram to show how institutions differed from one another.  What we found was an interesting degree of clustering, which allowed us to make specific categorizations of institutions.
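For anyone who wants to draw this kind of chart themselves, here is a minimal matplotlib sketch.  The eight axes follow the survey dimensions described above, but the scores are invented purely for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

# The eight survey dimensions described above. Scores are invented,
# on an 11-point scale (10 = first word of the pair, 0 = second).
axes = ["Graduate", "Focussed", "Global", "Open to new ideas",
        "Self-sufficient", "Theoretical", "Homogeneous",
        "Professor-centred"]
scores = [7, 4, 6, 5, 8, 6, 3, 5]   # one hypothetical institution

angles = np.linspace(0, 2 * np.pi, len(axes), endpoint=False).tolist()
angles += angles[:1]                 # repeat first point to close the shape
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(axes, fontsize=8)
ax.set_ylim(0, 10)
plt.show()
```

Overlay several institutions on the same axes, and clusters of the kind described below become visible at a glance.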

The most definable group of institutions was what we called the Liberal Arts schools – mostly small schools where students described their institution as undergrad-focussed, very nurturing, and not focussed on a particular field of study.  This includes the usual suspects (Acadia, Mt. Allison, St. FX), but also the religious schools (e.g. Redeemer, Trinity Western), the Western Colleges (Brescia, Huron, King’s), and Guelph (but not Trent).  The other big cluster of schools was what we called the “graduate-focussed schools”.  This was basically the U-15, minus McGill, U of T St. George (though the other U of T campuses qualified), Waterloo, Manitoba, and Saskatchewan, but including Carleton, SFU, and Concordia.  These two groups position themselves on the spidergram as follows:

Figure 1: Positioning Results for “Graduate” and “Liberal Arts” Schools


Note: The further a line is to the edge of the spidergram, the more the value tends towards the first word in the axis description (e.g. “graduate” rather than “undergraduate”).

There were a number of individual schools that had very distinct profiles.  Waterloo, for instance, looks entirely unlike every other institution.  Students there are much more likely to describe their institution as open to change (a trait which is significantly correlated with student satisfaction), and much more likely to describe their institution as being “focussed” on a few key areas.  OCAD University was by far the most likely to be described as having an applied curriculum and a student-centred program.  McGill and Toronto (St. George) had weirdly identical results – their students described them in *exactly* the same terms: much more global, much more graduate-focussed, and much more sink-or-swim (i.e. NOT nurturing) than other schools.

Figure 2: Positioning Results for OCAD University, University of Waterloo and McGill/University of Toronto (St. George)


Now, here’s the important bit.  The results from every other school in the country (from Trent to Saint Mary’s, to Fraser Valley, to Saskatchewan and Manitoba) were basically one big lump.  None of these institutions really stood out in any way; basically, students just described these places as “school”.  None had any real distinguishing characteristics on which you could build a sensible brand.  If they had a colour, it would be beige.

If you view higher education as one more social service, where the goal is to provide a uniform product to students everywhere, this is a good thing.  But if you think universities should be distinct entities, catering to varying niche tastes that evolve over time, it’s a pretty depressing picture.

September 26

Te Wānanga o Aotearoa

A couple of weeks ago, I promised I would tell you the story of Te Wānanga o Aotearoa, the entirely Maori-run polytechnic with over 35,000 students.  So here it is.

The 1970s saw significant Aboriginal cultural revivals in many parts of the world.  Aboriginal higher education – or at least the access of aboriginal peoples to mainstream higher education – was a significant part of that.  In Canada, the struggle was mostly about gaining a foothold in mainstream institutions; in the United States, the focus was much more on creating aboriginal-controlled institutions, known as tribal colleges and universities.  In New Zealand, the Maori journey in higher education was similar to the US in that it involved creating their own separate institutions, and then later seeking recognition for them.  The result was a class of institutions called “Wānangas” (a term that roughly equates with “knowledge”).

The first Wānanga (Te Wānanga o Raukawa) focussed mostly on language and culture, and had relatively small enrolments (still under 1,000).  The second, Te Wānanga o Aotearoa, was more focussed on skill acquisition – mainly Maori arts and crafts, but also courses in tourism and computer skills.  It, too, was a fairly marginal institution until it caught a big break in the late 1980s, when the Education Act was being re-written.  At the time, New Zealand’s governing Labour Party was putting the country through a major free-market revolution.  In education, that meant caring more about outcomes and outputs than about the provider’s pedigree.  It was also a time when the government was making concerted efforts to improve relations between Maori and Pakeha, and to treat the Treaty of Waitangi with some respect.  And so, when it came time to write the Act, the government decided to give Wānanga status as a fourth official type of tertiary education, alongside universities, polytechnics, and privates.  That guaranteed them some annual funding, but because the government was in financial straits, there was no money available for capital.

That’s where things got interesting.  Part of the whole return to the Treaty of Waitangi involved creating a Treaty Tribunal to adjudicate cases where Maori felt that public policy was not in keeping with the terms of the treaty.  Te Wānanga o Aotearoa decided to challenge the capital funding policy, arguing that they were a recognized form of education but had been unable to benefit, as others had, from public capital spending.  The tribunal agreed with them, handing the Wānanga what looked to be a whopping cash settlement.  Cannily, however, they played a long game and negotiated a reduced settlement on capital in exchange for a straight per-student funding agreement with – and this is crucial – no cap on numbers.

It was at this point that all manner of fun broke loose.  The funding deal allowed Te Wānanga o Aotearoa to indulge all its most entrepreneurial instincts, and the school went from having 1,000 students in 1998 to having 65,000 students in 2002.  This involved a lot of institutional change in terms of widening the scope of the types of programs it offered, and it involved a lot of community delivery – at one point they had over 300 teaching sites.  Its ability to attract significant numbers of non-Maori students – particularly recent immigrants – was another important factor.

But it also involved cutting some corners.  In 2005, the Auditor-General came down hard on the school, suggesting (basically) that while Te Wānanga o Aotearoa may have done wonders in expanding access, it would have been nice if they had kept some actual receipts for their spending.  The resultant tighter enforcement rules drove enrolments down to roughly 30,000, where they remain today.  In the short term, that caused a bit of a financial crisis, as well as layoffs.  But in the longer term it probably made the organization stronger, and it remains by far the world’s largest aboriginal-controlled institution of higher education, delivering thousands of recognized tertiary credentials each year, including some at the Bachelor’s level.

The Wānanga model is not one we’ve adopted in Canada, but for those leaders in government and aboriginal organizations seeking to expand educational opportunities for aboriginal Canadians, there are a lot of important lessons to be drawn from Te Wānanga o Aotearoa’s experience.  We should pay heed to them.

September 25

Ending the Merit Scholarship Arms Race

Here’s a way the new Ontario Minister of Training, Colleges and Universities, Reza Moridi, could do everyone an enormous service and win political capital at the same time: force institutions to cut back radically on automatic merit-based entrance scholarships.

Here’s the background: at some point in the 1990s, Canadian institutions latched onto the idea of giving out entrance awards as a way of managing enrolment.  It was a nice trick to help lock students in early in the admissions process – you register with me by a certain date, and I give you $1,000 (or whatever).  To keep costs down, “merit” was redefined exclusively in terms of high school grades, and awarded according to a grid – $1,000 if you had over a 90, $750 if you had an 85, etc.  These were dubbed “Automatic Academic” awards by Franca Gucciardi in her 2004 paper Recognizing Excellence.
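Just to underline how mechanical these awards are, here is a sketch of the grid logic.  The cut-offs and dollar values are the hypothetical ones from the example above, not any real university’s schedule:

```python
def automatic_award(average: float) -> int:
    """Return the entrance award for a given high-school average.

    Grid values are hypothetical, echoing the example in the text;
    real schedules vary by institution and year.
    """
    grid = [(90, 1000), (85, 750), (80, 500)]  # (minimum average, award $)
    for cutoff, award in grid:
        if average >= cutoff:
            return award
    return 0

for avg in (92, 86, 81, 78):
    print(f"average {avg}: ${automatic_award(avg)}")
```

No essays, no interviews, no judgement – just a lookup against the admission average.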

In the aughts, institutions in Ontario started getting truly stupid about this “merit” money, and an arms race broke out.  Here’s roughly how it worked: one year, school X would decide it was going to try to steal some smart science and engineering students from school Y, and would raise its offer to students with grades over 85% from (say) $1,500 to $2,000.  This worked, and school Y would see its yields go down.  But the following year, school Y would decide to up its offer by $1,000, and the pendulum would swing back!  And so on, and so forth.  By 2007, Ontario institutions were handing out something on the order of $70 million in this fashion, and there is every reason to think the amount has increased since then.

There is no durable way for an institution to get out in front of the pack in such an arms race – others will always match the awards, and equilibrium will be restored.  All institutions are making themselves worse off, because their net tuitions are all declining, yet no one can disengage because they are afraid they would lose share to others.  And into the bargain we get a complete devaluation of the word “merit”, because institutions are giving these awards to about 70% of incoming freshmen.
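The strategic structure here is a textbook prisoner’s dilemma, which a toy payoff sketch makes explicit.  The numbers are invented net-tuition payoffs; only their ordering matters:

```python
# Toy payoff matrix for two schools deciding whether to escalate awards.
# Payoffs are invented; only their ordering matters. Escalating is a
# dominant strategy, yet (escalate, escalate) leaves both schools worse
# off than (hold, hold): the defining feature of a prisoner's dilemma.
payoffs = {
    ("hold", "hold"):         (10, 10),  # status quo net tuition
    ("escalate", "hold"):     (12, 6),   # escalator steals enrolment share
    ("hold", "escalate"):     (6, 12),
    ("escalate", "escalate"): (8, 8),    # share restored, net tuition down
}

for my_move in ("hold", "escalate"):
    vs_hold = payoffs[(my_move, "hold")][0]
    vs_esc = payoffs[(my_move, "escalate")][0]
    print(f"{my_move:>8}: {vs_hold} if rival holds, {vs_esc} if rival escalates")
```

Whatever the rival does, escalating pays more, so both schools end up at (8, 8) when (10, 10) was available.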

This is a classic collective action problem.  And collective action problems have solutions: imposed government rule-making.  In this case, the Government of Ontario should restrict the ability of universities to give away money based on grades alone to, say, 1% of what they take in via tuition.  If they want to run more complicated merit schemes – things that take into account service, innovation, and what have you – great.  Let ‘em knock themselves out (they won’t of course, because those take actual effort to organize, but whatever).  Or, if they want to give out more money based on need, that’d be OK too.  Just end the insanity around academic awards.

You think universities would oppose this on grounds of competitiveness or institutional autonomy?  Think again.  They’d love for someone to get them out of this arms race, because they’re clearly incapable of doing it themselves.  And of course the Ontario Undergraduate Student Alliance has already called for essentially the same thing, so the student constituency is already covered off.

So, how about it Dr. Moridi? Are you up for a quick win?  Just end the merit arms race.

September 24

The Math at Windsor

Not only is there strike talk at Laurentian, but there is also a strike in the air at Windsor (a one-day strike was held last week, and a full strike is promised for October 1st if no deal is reached).  Bargaining there began earlier this year but, for whatever reason, no progress was made in negotiations over the spring.  After a conciliator was unable to nudge the two sides closer together, the university was in a legal position to impose its offer on the faculty, which it did in early July.  This was a canny piece of timing: by imposing in the summer, the university deprived the union of an immediate strike threat (because who cares if profs go on strike in summer?).

This was a rare case of a university playing hardball on timing; and though this may have wrong-footed the union, they’ve responded by making great rhetorical hay out of having a contract imposed on them.  Now the strike is no longer about petty monetary demands; it’s about the right to collective bargaining.  Yay, righteousness!  And that’s a big bonus for the union, because if the strike were just about financial proposals, their position would be almost indefensible.

The union’s position is that the university’s offer – 0%, 0%, and 3% over three years – is inadequate because staff can’t be expected to accept wage increases below inflation.  While that’s one way of framing the institution’s offer, it glosses over the stonking amount of money the university is offering faculty through its Progression Through the Ranks (PTR) system (for a refresher course on PTR, see here).  Under the university’s offer, every single professor (other than those with over 30 years’ experience) gets an annual pay rise of $2,550.  This isn’t based on merit or anything, the way it is at Alberta or UBC or Waterloo; it’s just for sticking around another year.  On top of that, they get the 0%, 0%, 3%.

Now that doesn’t translate easily into a percentage figure because $2,550 represents a different percentage for each professor, depending on their current pay. But let’s take a stab at it based on known average pay by rank.  Current pay figures are unavailable (thanks for cutting the UCASS faculty salary survey, StatsCan!), but I do have them from 4 years ago – they’ve probably gone up slightly since then, but for giggles let’s use them to take a look at what the university offer means if the PTR is included.
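Mechanically, the calculation for any one professor looks like this.  It is a sketch only: the salary figures are hypothetical stand-ins for the real rank averages, and it assumes the across-the-board raise is applied before the flat PTR increment each year:

```python
def three_year_increase(salary: float, ptr: float = 2550.0,
                        atb=(0.00, 0.00, 0.03)) -> float:
    """Cumulative % increase over the deal: across-the-board (ATB)
    raises of 0%, 0%, 3%, plus a flat $2,550 PTR step each year."""
    start = salary
    for raise_pct in atb:
        salary = salary * (1 + raise_pct) + ptr
    return (salary - start) / start

# Hypothetical average salaries by rank (illustrative only; the real
# figures came from the now-discontinued UCASS survey).
for rank, salary in [("Assistant", 90000), ("Associate", 110000),
                     ("Full", 140000)]:
    print(f"{rank:>9}: {three_year_increase(salary):.1%} over three years")
```

On a hypothetical $90,000 assistant professor’s salary, for instance, the deal compounds to an increase of a bit over 11% across the three years – before the Windsor Salary Scale (below) is even considered.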

[Table: estimated three-year salary increases by rank under the imposed offer, PTR included – data not recovered]

On top of that, there’s something called the “Windsor Salary Scale” – a Windsor-only deal, in place for decades, which guarantees that the salary scale always rises at the Ontario median.  Based on the past few years’ deals, this would mean average pay rises of another 4% or so (15.4% now for assistant profs, if you’re counting) – though it’s hard to predict exactly, since we don’t know what future salary settlements across Ontario will look like.  On the other hand, professors will also be asked to pay more into their pensions, what with returns being so meagre in our low-interest-rate environment.  So let’s call these two a wash, and stick with the figures in the table above.

To summarize, this deal – which, recall, had to be imposed – will see nearly all faculty salaries rise at rates well above inflation (in the case of assistant professors, by a factor of two).  The only ones who will not see a rise at least equal to inflation are the tiny minority (the 30+ years crew) already at the top of the pay grid, most of whom will be earning over $150,000.

Remember, this is at a university in a region where first-year enrolment fell by 10% this year, and where the regional youth cohort (Windsor-Essex-Chatham-Kent) is set to shrink by 15% or so over the next six years – meaning the institution will receive even fewer tuition dollars, along with a declining share of a total government grants budget that, if we’re lucky, will decline by only 3% in real terms over the next three years.

And still the union said no.

September 23

Another Reason to Get Serious About Measuring Workloads

So I see the Laurentian faculty union is threatening to strike.  The main issues are “workload” (they’d like to have lower undergraduate teaching loads to deal with an influx of graduate students) and pay (they’d like to “close the gap” with the rest of Ontario).

This is where the entire system would be well served by having some understanding of what, exactly, everybody is getting paid for.  Obviously, if you’re doing the same amount and type of work as someone else, you’ve got a pretty good claim to parity.  The problem is that what professors do – that is, their expected workload and outputs – can vary significantly from one place to another.

Let’s take the issue of graduate supervision.  Laurentian profs are doing more of it than they used to: overall, 6% of full-time enrolments at Laurentian were at the graduate level in 2012, up from 4% five years earlier.  But if we’re going to use “the Ontario average” as a goal, it’s worth noting that, across the province, 12% of full-time students are graduate students.  So, on average, Laurentian professors do only about half as much graduate supervision as other professors across the province – and probably less, if we were to weight doctoral supervision more highly.

Well, what about undergraduate teaching – maybe they do more of it than others?  On paper, they teach 3/2 (except in Science and Engineering, where it’s 2/2).  That’s the same as at most smaller Ontario institutions, and somewhat more than you’d see at larger institutions, where 2/2 or even 2/1 is the norm.  But that’s not the whole story: class sizes are smaller at Laurentian.  Sixty-seven per cent of all undergraduate classes at Laurentian have under 30 students, compared to just 51% at York (though, surprisingly, the figure at Queen’s is almost the same as Laurentian’s – 65%).  But ask yourself: which takes more work, a 2/1 load with average class sizes of 60, or a 3/2 load with an average class size of 30?  Hard to tell.  But how can you make arguments about “equal pay for equal work” unless you know?  (A crude first cut at the arithmetic is sketched below.)
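One back-of-envelope way to frame it is total students taught per year: course load times average class size.  The figures are the hypothetical ones from the question above:

```python
# Back-of-envelope comparison: total students taught per year under the
# two hypothetical load/class-size combinations from the question above.
loads = {
    "2/1 load, average class of 60": (2 + 1) * 60,   # 180 students/year
    "3/2 load, average class of 30": (3 + 2) * 30,   # 150 students/year
}
for description, students in loads.items():
    print(f"{description}: {students} students taught per year")
```

By that crude count, the 2/1 professor teaches more students, but the 3/2 professor has more course preparations; without some agreed weighting of the two, “equal work” claims can’t really be tested.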

Then there’s research output.  If you use tri-council funding as a metric, and normalize for field of study, Laurentian profs in Science and Engineering are winning about 55% of the national average – higher than Ryerson, but less than half of what Carleton gets.  That’s not too bad.  In humanities and social sciences, however, Laurentian wins only 21% of the national average – about a fifth of what they get at Ottawa, and a third of what they get at Laurier (all data from our Measuring Academic Research in Canada paper, available here).  I could go on with data about publications and citations, but you get the idea: Laurentian professors’ research output isn’t all that close to the provincial average.

To recap: Laurentian is a school where (on average) professors have lower graduate teaching responsibilities and research output than the Ontario average, and an undergraduate teaching load that is higher than average in terms of number of classes, but is arguably lower in terms of total students taught.  So where should their pay be, relative to the provincial average?  Probably somewhere below the average, which indeed is where it is.

But the question for this dispute is: how far below?  Better comparative data, combined with some agreement about the relative weight of different parts of the professorial job, would take a lot of heat out of this debate.

September 22

Where Do Students Want to Live?

Today, we at HESA released a paper called Moving On?  How Students Think About Choosing a Place to Live After Graduation, which is based on a 2011 survey of 1,859 students from across the country.  Obviously, you should go read the whole thing, but for the time-pressed, here are the highlights:

1) Part of the paper’s purpose is to examine the qualities students look for in a place to live.  Turns out Richard Florida’s whole shtick about young educated types looking for cities that are “hip” or “creative” may be somewhat wide of the mark; in fact, students’ main priorities in finding a place of residence are access to good jobs, healthcare services, and a low crime rate.  Access to cultural events and foodie cultures rank way, way down the list.  To put that another way: what young educated people look for in a place to live is pretty much what everyone else looks for.

2) A solid majority of students intend to stay in the province in which they graduated.  That said, just over 40% of students are at least open to the idea of moving.  However, these students are not evenly distributed.  Students in the prairie provinces (including Alberta) are much more open to moving away than are students in British Columbia.  And, equally, students are not open to moving just anywhere – of the people open to a move, most have three or fewer potential destination provinces in mind, and are not open to any others (the most commonly-sought destinations are Ontario and British Columbia; Manitoba and Saskatchewan are the least sought).  Only 7% are genuinely open to a move to pretty much anywhere in the country.

3) Here’s perhaps the most important piece of news: financial incentives for graduates, such as the tax credits used by Saskatchewan, Manitoba, and New Brunswick, have almost no effect.  We asked students what they expected to earn in their first job in the province they were most likely to call home.  Then we asked them how much it would take to get them to move to each of the other provinces.  For most provinces (BC was the outlier), about a quarter said “nothing could get me to go there”, and another 25% said “I’d go for an extra $25,000 or more” (which is really just a polite way of saying “never”).  But, intriguingly, between 13% (Manitoba) and 25% (British Columbia) of all students say they’d move to that province for either no extra money, or even a cut in pay – just give them a job and they’d go.  The percentage who say they’d move for an extra $2,000 (roughly the value of the tax credits in SK, MB, and NB)?  About 1%.  Move the financial incentive up to $5,000 and you get another 1%.  And that’s perfectly consistent, right across the country.

The fact is, students are going to move where they’re going to move.  They are either tied to their present spot by networks of friends and family, or they are lured by money, jobs, and prosperity.  A couple of thousand bucks, in the grand scheme of things, just doesn’t seem to matter that much.

All of which raises the question: how come more provinces aren’t making like Nova Scotia, and ditching these tax rebate programs?

September 19

Better Know a Higher Ed System: France

France is one of the original homelands of the university: the University of Paris was the first real university outside the Mediterranean basin, and the country was home to six universities by 1500 – only Italy and Spain had more at the time.  But while French higher education has quite ancient roots, it is also, in many respects, one of the youngest systems in Europe, because the entire university system was wiped out during the Revolution, and then developed again from scratch during the Napoleonic period that followed.

Unlike virtually every other system on earth, the French do not put universities at the top of the higher education hierarchy.  Instead, there are what are called “les Grandes Écoles”: peak, specialized institutions that operate in only a limited number of fields – the École des Mines and Polytechnique for Engineering, l‘École Normale Supérieure for Education, and l‘École Nationale d’Administration to train the masters of the universe.  Most of these go back two centuries – Polytechnique was an excellent spot for Napoleon to train his gunners – but the ENA actually dates only from the 1940s.

One step down in the hierarchy are the big “Instituts”, which serve as the training ground for the professions, mainly in technology (the IUTs), but also in fields like nursing.  Universities, for the most part (medical studies excepted), are widely viewed as the dregs of the system: the catch-all for people not smart enough to make the grandes écoles, or not driven enough to do professional studies.  That’s partly because they are bereft of many prestige disciplines, but it’s also because, historically, they have not been centres of research.  As in many other European countries (notably Germany and Spain), the public research mission was largely the responsibility of the Centre National de la Recherche Scientifique (CNRS), which was not attached to the universities.

Another historical feature of French universities is the degree to which they have been under state control.  Legally, all faculties were part of a single “Université de France” for most of the 19th century.  Universities as we know them – autonomous institutions that pursue their own plans and goals – are fairly recent.  If you’re being generous, they date back to 1968; in fact, they didn’t reach North American levels of autonomy until the loi Pécresse in 2007, though in practice the shift happened in the late 1980s.  Prior to that, hiring and promotion were essentially all done through the Ministry; curricula were also laid down on national lines by expert committees run from Paris.

Recently, international rankings have been a major spur to change.  When the Academic Ranking of World Universities first appeared in 2003, it created the “choc de Shanghai” – the country was genuinely shocked at how weak its institutions were seen to be.  Much of it was down to system design, of course.  The Grandes Écoles couldn’t compete with American multiversities because they were small, single-discipline institutions, and the universities couldn’t compete because the research was all tied up at the CNRS.  But the French government, instead of standing up and saying “this ranking is irrelevant because our structures are different, and frankly our system of research and innovation works pretty well anyway”, decided to engage in a wild bout of policy-making: excellence initiatives, institutional mergers, etc.  It’s all designed implicitly to make the system look more American; though to keep up pretences, if anyone asks, it’s actually about being “world-class”.

Maybe the most interesting development to watch is what’s going on at Paris-Saclay – a campus that brings together roughly two dozen universities and scientific institutions in a single spot.  It’s both a federation of universities and a new independent institution.  The governance arrangements look like a nightmare, but the potential is certainly there for it to become a genuinely European super-university.  It’s not the only new university in the world whose founders dream of hitting the Shanghai Top Ten, but it’s probably the one with the best chance of doing so.
