Higher Education Strategy Associates

January 04

Innovation Buzzword Bingo

Morning all.  Regular service will be pushed back one week to January 10th, but I couldn’t let the Globe op-ed “Southern Ontario Should be an Innovation Cluster, Not a Farm Team” by three Ontario university presidents (McMaster’s Patrick Deane, Toronto’s Meric Gertler, and Waterloo’s Feridun Hamdullahpur) go without comment.

The article reads like someone set out to fill a buzzword bingo card.  Words like “supercluster”, “resilient”, “enhancing interaction”, “external connectivity”, “cluster-building infrastructure”, and “entrepreneurship ecosystem” all duly make an appearance; hell, there’s even a reference to Michael Porter.  And while none of it is wrong, exactly – clusters are good, infrastructure never hurts, etc. – the six actual policy proposals the presidents lay out for creating an innovation cluster are mighty thin.

1)  Invest in organizations that drive local economic development and quality of life, from civically minded governance bodies to cultural institutions.

In what way does this proposal differ from what governments already do?  Do governments everywhere not invest in cultural institutions and things that drive local economic development?  Are there things governments should stop doing in order to prioritize these things?  And how might one distinguish good from bad investments in cultural institutions?  When should the spending stop?

2)  Co-ordinate investments in research areas with both the highest success rates and strongest growth potential, from regenerative medicine to quantum science; from advanced materials to environmental technologies.

Co-ordinate how?  How is “success rate” measured?  Or “growth potential”?  Is this actually a plea to prioritize CFREF-type programs over granting council funding?  Or perhaps it’s a plea for granting councils to become more focussed in their funding?  A lot more detail should be here before anyone signs on to this principle.

3)  Ensure that our immigration rules make us a destination of choice for high-potential individuals.

I think this is a plea for government to streamline immigration procedures, awkwardly phrased.  And, yeah, streamline away.  Can’t hurt.  But generally speaking, being a destination of choice has more to do with economic opportunities in a country than the state of its entry visa system.

4)  Turn taxpayers into equity partners and give the public a share of the upside.

Equity partners in what?  New companies?  Like Mariana Mazzucato wants?  Can anyone name a single successful innovation cluster where this happens?  Try to imagine how the public sector would behave if it had an equity stake in companies.  Imagine what it would do to pick or favour winners in order to maximize share value.  Imagine the pressure to “save” or “bail out” losers.  Imagine the chaos that would surround the decision to ever try to sell shares.  This is a half-baked nightmare of an idea, one which would effectively impose a form of Peronism on any emerging tech sector.  Does anyone truly believe this would make tech companies more successful?  Please.

5)  Support firms that can scale up by connecting them to successful mentors, addressing gaps in our venture financing systems, and leveraging public procurement strategically.

Can anyone point me to a single study that links firm size to quality or quantity of mentorship?  No?  OK, then.

6)  Inculcate a culture of risk-taking that rewards rather than penalizes failure, which fosters adaptability and learning from mistakes.

Can anyone point me to a single instance anywhere of a government successfully inculcating a culture of risk-taking in business?  No?  OK, then.

Like I said before, it’s not so much that these ideas are wrong (well, apart from the taxpayer equity stuff) as that they are painfully unspecific.  It’s great that universities are now at least couching their requests for more research funding in the context of an acknowledgement of innovation ecosystems, and not simply relying on the absurd formula of: $ for university research –> Black Box where miracles occur –> Innovation!

But in practice, most of these recommendations are either not particularly workable or vague to the point of being unhelpful.  Better innovation policy is going to require a lot more than this.

December 11

Signing Off (For Now)

A short one today, the last of term.  Normal service will resume January 4th, 2016.

It’s been an interesting semester.  Progress on some fronts, I think.  I was certainly very pleased that in this election, for the first time ever, all three major national parties released platforms that included investments in students and, in one way or another, all targeted lower-income families.  That’s progress.  The incandescently idiotic Green platform suggests there’s still a market for dumb PSE ideas, but you can’t have everything.

But progress in one area, eternal return to bad ideas in another.  Take for example the “students and universities suck these days” piece published in the LA Review of Books last week.  Written by UPEI religious studies prof Ron Srigley, it’s almost a cartoon of every knee-jerk “kids these days!   Why isn’t the academy the way it was in the 1960s” piece written in the last two decades.  Unengaged students! Too many administrators!  E-learning is an abomination! Etc., etc.  Snore.

(If you’re curious why he wrote this piece for an American publication instead of a Canadian one, it’s possibly because he calls UPEI – and by extension his former haunts at Laurentian and Nipissing – “3rd and 4th tier Canadian institutions”.  Wonder what the Arts faculty Christmas party is going to be like in Charlottetown this year.)

From a personal point of view, of course, the cockroach-like staying power of bad ideas in PSE is depressing.  Why can’t the sector move on?  Why must there be so many dinosaurs?  But of course from a consulting point of view it’s heartening.  The perpetual re-emergence of bad ideas means I get to re-do the same projects every decade or so, which really helps with margins.  Plus I get to re-cycle blog posts.

And speaking of blog posts, it’s time for my twice-yearly bleg for feedback:  How are you enjoying these blog posts?  Is there stuff you like?  Don’t like?  If there are things I can be doing better, I’d love to know.

And finally, thanks to all for reading all year.  It’s awfully nice to have an audience that is as engaged and committed to education as you folks are.  Happy holidays to all.

December 10

Reports, Books, and CUDO

It’s getting close to that time of year when I need to sign off for the holidays (tomorrow will be the last blog until January 4th).  So before then, I thought it would be worth quickly catching up on a few things.

Some reports you may have missed.  A number of reports have come out recently that I have been meaning to review.  Two, I think, are of passing note:

i) The Alberta Auditor-General devoted part of his annual report (see pages 21-28) to the subject of risk-management of cost-recovery and for-profit enterprises in the province’s post-secondary institutions, and concluded that the government really has no idea how much risk the province’s universities and colleges have taken on in the form of investments, partnerships, joint ventures, etc.  And that’s partly because the institutions themselves frequently don’t do a great job of quantifying this risk.  This issue’s a sleeper – my guess is it will increase in importance as time goes on.

ii) The Ontario auditor general reviewed the issue of University Intellectual Property (unaccountably, this story was overlooked by the media in favour of reporting on the trifling fact that Ontarians have overpaid for energy by $37 billion over the last – wait, what?  How much?).  It was fairly scathing about the province’s current activities in terms of ensuring the public gets value for money for its investments. A lot of the recommendations to universities consisted of fairly nitpicky stuff about documentation of commercialization, but there were solid recommendations on the need to track the impact of technology transfer, and in particular the socio-economic impact.  Again, I suspect similar issues will crop up with increasing frequency for both governments and institutions across the country.

Higher Ed Books of the Year.  For best book, I’m going to go with Lauren Rivera’s Pedigree: How Elite Students Get Elite Jobs, which I reviewed back here.   I’ll give a runner-up to Kevin Carey’s The End of College, about which I wrote a three-part review in March (here, here, and here).  I think the thesis is wrong, and as others have pointed out there are some perspectives missing here, but it put a lot of valuable issues about the future of higher education on the table in a clear and accessible way.

Worst book?  I’m reluctantly having to nominate Mark Ferrara’s Palace of Ashes: China and the Decline of American Higher Education.  I say reluctantly because the two chapters on the development of Chinese higher education are pretty good.  But the thesis as a whole is an utter train wreck.  Basically it amounts to: China is amazing because it is spending more money on higher education, and the US is terrible because it is spending less money on higher education (though he never bothers to actually check how much each is spending, say, as a proportion of GDP, which is a shame, as he would quickly see that US expenditure remains way above China’s even after adjusting for the difference in GDP).  The most hilarious bits are the ones where he talks about the erosion of academic freedom due to budget cuts, whereas in China… (you see the problem?  The author unfortunately doesn’t).  Dreck.

CUDO: You may recall I had some harsh things to say about the stuff that Common University Data Ontario (CUDO) was releasing on class sizes.  I offered a right of reply, and COU has kindly provided one, which I reproduce below, unedited:

We have looked into the anomalies that you reported in your blog concerning data in CUDO on class size.  Almost all data elements in CUDO derive from third party sources (for example, audited enrolment data reported to MTCU, NSSE survey responses) or from well-established processes that include data verification (for example, faculty data from the National Faculty Data Pool), and provide accurate and comparable data across universities. The class size data element in CUDO is an exception, however, where data is reported by universities and not validated across universities. We have determined that, over time, COU members have developed inconsistent approaches to reporting of the class size data in CUDO.

 COU will be working with universities towards more consistent reporting of class size for the next release of CUDO.

With respect to data concerning faculty workload:  COU published results of a study of faculty work in August 2014,  based on data collected concerning work performed by full-time tenured faculty, using data from 2010 to 2012. We recognize the need for further data concerning teaching done by contract teaching staff. As promised in the 2014 report, COU is in the process of updating the analysis based on 2014-15 data, and is expanding the data collection to include all teaching done in universities by both full-time tenured/tenure track faculty and contract teaching staff. We expect to release results of this study in 2016.

Buonissimo.  ‘Til tomorrow.

December 09

Who Are the U-15, Exactly?

Over the last few years, two new players have been introduced into the Ottawa higher education lobbying ecosystem.  One is Polytechnics Canada – essentially, the country’s largest, most technologically sophisticated, and most research intensive colleges; the other is the U-15 – essentially the country’s largest, most technologically sophisticated, and most research intensive universities.  For a variety of reasons, both of these new players have had a pretty good run in Ottawa, lately. Certainly, in the U-15’s case, it’s often seemed like the tail wagging the AUCC dog (see CF-REF stories, passim).  But for an organization with such clout, its membership criteria are fairly opaque.

The U-15 had its origins in a caucus of five research-intensive Ontario institutions (McMaster, Waterloo, Toronto, Queen’s, and Western) in the late 1980s.  In the early 1990s, the group expanded to include the big three Quebec universities (McGill, Montreal, and Laval), plus UBC and Alberta, and was renamed the Group of Ten, or G-10.  Then, in the mid-2000s, Ottawa, Dalhousie, and Calgary were added, and more recently the inclusion of Manitoba and Saskatchewan has added some geographical balance.  The “G” changed to a “U” and voila!  U-15.

That the U-15 is a coalition of research intensive universities is clear.  But the decisions about which institutions get to join, and which do not, seem remarkably arbitrary.  Are they the 15 biggest universities?  Obviously not, or York would be there.  Are they the 15 universities with the largest granting council incomes?  No; if that were the criterion, then Guelph would be in and Dal would be out.  How about the 15 universities with the largest research incomes per faculty?  Nope; you’d have to include INRS, ETS, and Victoria. Research impact?  Depends on the measure used, but you could make a pretty good case for Simon Fraser on some of these.

Put it this way: the original G-10, plus Calgary and Ottawa, have a pretty good case to be a U-12.  Dal, Saskatchewan, and Manitoba are not manifestly different on most performance indicators from Simon Fraser, Guelph, Victoria, and York.  So why are the former members, and the latter not?  The answer, in a word, is politics.  Meteorological conditions in the underworld would need to be unseasonably frosty before UBC lets another BC institution into the club (Toronto and York have a similar issue).  Meanwhile, the Dal/Man/Sask trio give the organization a more pan-Canadian patina – without them, the club would be confined to the country’s four largest provinces.  This extra breadth goes a long way in Ottawa to convincing people you’re a truly “national” organization.

Now, of course the U-15 isn’t really alone in being a self-appointed elite, with fuzzy boundaries between itself and the rest of the pack.  Just last month, an article in the Oxford Review of Education came to the same conclusion regarding the UK’s “Russell Group”, saying that on most measures, only Oxford and Cambridge really stand apart from the rest of the country’s pre-92 universities (in the UK, a pre-92 university is one that was purpose-built as such; a post-92 university is one that was originally a polytechnic – think of the dividing line between “research” and “regional” universities in BC, and you’ve more or less got it).

But it’s not always the case.  In Australia, the G-8 really are the top 8 in research almost any way you slice it.  And in the United States, the Association of American Universities (which, by a weird historic fluke, also includes McGill and U of T) actually has quite a detailed set of membership criteria: research expenditures normalized by number of faculty, number of National Academy members, National Research Council faculty quality indicators, faculty honors, and scholarly citations.  In fact, it’s strict enough that four years ago the organization actually kicked out Nebraska (Syracuse subsequently withdrew voluntarily so as not to suffer the same fate).

However, this is Canada: asking elites to meet some established criteria for proving eliteness isn’t really a thing here.  Because, you know, someone might fail.  Embarrassment would be caused.  And that wouldn’t do, not for someone already “in the club”.  So the likelihood of any U-15 members being asked to leave is pretty slight.  And while the U-15 would probably prefer that the top tier of non-members go and form their own club, as they have done in other Anglophone countries (e.g. the Innovative Research Universities group in Australia, or the [now-defunct] 1994 Group in the UK), the likelier outcome is a gradual process of letting in new members – eventually.  Guelph will come first, because it causes the least political friction with an existing member.  SFU and Vic would be the next obvious candidates.

Of course the problem is that if the group is insufficiently exclusive, the top tier may walk away themselves and form their own, even more exclusive group.  U-5, anyone?

December 08

Innovation Ecosystems: Promise and Opportunism

We sometimes think of innovation policy as being about generating better ideas through things like sponsored research.  And that’s certainly one part of it.  But if those ideas are generated in a vacuum, they go nowhere – making ideas spread faster is the second pillar of innovation policy (a third pillar – to the extent that innovation is about new product-generation – has to do with venture capital and regulatory environments, but we’ll leave those aside for now).

Yesterday, I discussed why the key to speeding up innovation was the density of the medium through which new ideas travel: basically, ideas about IT travel faster in Waterloo than in Tuktoyaktuk; ideas about marine biology travel faster in Halifax than in Prince Albert.  And the faster ideas travel and collide (or “have sex” in Matt Ridley’s phrase), the more innovation is produced, ceteris paribus.

Now, although they don’t quite use this terminology, the proponents of big universities and big cities alike find this logic pretty congenial.  You want density of knowledge industries?  Toronto/Montreal/Vancouver have that.  You want density of superstar researchers?  U of T, McGill, and UBC have that (especially if you throw in allied medical institutes).  That makes these places the natural spot to invest money for innovation, say the usual suspects.  All you need to do is invest in “urban innovation ecosystems” (whatever those are – I get the impression it’s largely a real estate play to bring scientists, entrepreneurs, and VCs into closer spatial proximity), and voila!  Innovation!

This is where sensible people need to get off the bus.

It’s absolutely true that innovation requires a certain ecosystem of researchers, and entrepreneurs, and money.  And on average productive ecosystems are likelier to occur in larger cities, and around more research-intensive universities.  But it’s not a slam dunk.  Silicon Valley was essentially an exurb of San Francisco when it started its journey to being a tech hub.  This is super-inconvenient to the “cool downtowns” argument made by the Richard Floridas of this world; as Joel Kotkin has repeatedly pointed out, innovative companies and hubs are as likely (or likelier) to be located in the ‘burbs as they are in funky urban spaces, mainly because it’s usually cheaper to live and rent space there.  Heck, Canada’s Silicon Valley was born in the heart of Ontario Mennonite country.

We actually don’t have a particularly good theory of how innovation clusters start or improve.  Richard Florida, for instance, waxes eloquent about trendy co-working spaces in Miami as a reason for its sudden emergence as a tech hub. American observers tend to attribute success to the state’s low tax rate, and presumably there are a host of other possible catalysts.  Who’s right?  Dunno.  But I’m willing to bet it’s not Florida.

We have plenty of examples of smaller communities hitting tech take-off without having a lot of creative amenities or “urban innovation strategies”. Somehow, despite the lack of population density, some small communities manage to get their ideas out in the world in ways that get smart investors’ attention.  No one has a freaking clue how this happens: research on “why some cities grow faster than others” is methodologically no more evolved than research on “why some universities become more research intensive than others”, which is to say it’s all pretty suspect.  Equally, some big cities never get particularly good at innovation (Montreal, for instance, is living proof that cheap rent, lots of universities, and bountiful cultural amenities aren’t a guarantee of start-up/innovation success).

Moreover, the nature of the ecosystem is likely to differ somewhat in different fields of endeavor.  The kinds of relationships required to make IT projects work are quite different from the kinds required to make (for example) biotech work.  The former are quick and transactional; the latter require considerably more patience, and hence are probably less apt to depend on chance meetings over triple espressos in a shared-work-environment incubator.  Raleigh-Durham and Geneva are both major biotech hubs that are neither large nor particularly hip (nor, in Raleigh’s case, particularly dense).

It’s good that governments are getting beyond one-dimensional policy instruments like “more money in granting councils” or “tax credits”, each of which is unlikely on its own to kickstart innovation.  It’s good that we are starting to think in terms of complex inter-relations between actors (some, but not all, of which involve spatial proximity), and using “ecosystem” metaphors.  Complexity matters.

But to jump from “we need to think in terms of ecosystems” to “an innovation agenda is a cities agenda” is simply policy opportunism.   The real solutions are more complex. We can and should be smarter than this.

December 07

H > A > H

I am a big fan of the economist Paul Romer, who is most famous for putting knowledge and the generation thereof at the centre of discussions on growth.  Recently, on (roughly) the 25th anniversary of the publication of his paper on Endogenous Technological Change, he wrote a series of blog posts looking back on some of the issues related to this theory.  The most interesting of these was one called “Human Capital and Knowledge”.

The post is long-ish, and I recommend you read it all, but the upshot is this: human capital (H) is something stored within our neurons, which is perfectly excludable.  Knowledge (A) – that is, human capital codified in some way, such as writing – is nonexcludable.  And people can use knowledge to generate more human capital (once I read a book or watch a video about how to use SQL, I too can use SQL).  In Romer’s words:

Speech. Printing. Digital communications. There is a lot of human history tied up in our successful efforts at scaling up the H -> A -> H round trip.

And this is absolutely right.  The way we turn patterns of thought in one person’s head into thoughts in many people’s heads is the single most important question in growth and innovation, which in turn is the single most important question in human development.  It’s the whole ballgame.

It also happens to be what higher education is about.  The teaching function of universities is partially about getting certain facts to go H > A > H (that is, subject matter mastery), and partially about getting certain modes of thought to go H > A > H (that is, ways of pattern-seeking, sense-making, meta-cognition, call it what you will). The entire fight about MOOCs, for instance, is a question of whether they are a more efficient method of making H > A > H happen than traditional lectures (to which I think the emerging answer is they are competitive if the H you are talking about is “fact-based”, and not so much if you are looking at the meta-cognitive stuff).  But generally, “getting better” at H > A > H in this way is about getting more efficient at the transfer of knowledge and skills, which means we can do more of it for the same price, which means that economy-wide we will have a more educated and productive society.

But with a slight amendment it’s also about the research function of universities.  Imagine now that we are not talking H > A > H, but rather H > A > H1.  That is, I have a certain thought pattern, I put it into symbols of some sort (words, equations, musical notation, whatever) and when it is absorbed by others, it generates new ideas (H1). This is a little bit different than what we were talking about before.  The first is about whether we can pass information or modes of thought quickly and efficiently; this one is about whether we can generate new ideas faster.

I find it helpful to think of new ideas as waves: they emanate outwards from the source and lose intensity as they move further from the source.  But the speed of a wave is not constant: it depends on the density of the medium through which the ideas move (sound travels faster through solids than water, and faster through water than air, for instance).

And this is the central truth of innovation policy: for H > A > H1 to work, there has to be a certain density of receptor capacity for the initial “A”.  A welder who makes a big leap forward in marine welding will see her ideas spread more quickly if she is in Saint John or Esquimalt than if she is in Regina.  To borrow Matt Ridley’s metaphor of innovation being about “ideas having sex”, ideas will multiply more if they have more potential mates.

This is how tech clusters work: they create denser media through which idea-waves can pass; hence, they speed up the propagation of new ideas, and hence, under the right circumstances, they speed up the propagation of new products as well.

This has major consequences for innovation policy and the funding of research in universities.  I’ll explain that tomorrow.

December 04

Defending Liberal Arts: Try Using Data

A few weeks back, I wrote about the Liberal Arts/humanities, and some really bad arguments both for and against them.  As usual when I write these, I got a lot of feedback to the effect of: “well, how would you defend the Liberal Arts, smart guy?”  Which, you know, fair enough.  So, here’s my answer.

The humanities, at root, are about pattern recognition in the same way that the sciences and the social sciences are: they just seek patterns in different areas of human affairs – in music, in literature, and in the narrative of history.  And though the humanities cannot test hypotheses about patterns using the same kinds of experimental methods as elsewhere, they can nevertheless promote greater understanding through synthesis.  Or, to paraphrase William Cronon’s famous essay, the humanities are about making connections, only connections.  In a networked world, that’s a valuable skill.

None of this, to me, is in doubt.  What is in doubt is whether this promise made by the humanities and Liberal Arts is actually delivered upon.  Other disciplines synthesize and make connections, too.  They promote critical thinking (the idea that other disciplines, disciplines founded on the scientific method, don’t promote critical thinking is the most arrogant and stupid canard promoted by people in the humanities).  What the humanities desperately need is some proof that what they claim is true is, in fact, true.  They need some data.

In this context, it’s worth taking a look at the Wabash National Study of Liberal Arts Education.  This was an elaborate, longitudinal, multi-institutional study to look at how students in liberal arts programs develop over time.  Students took a battery of tests – on critical reasoning, intercultural effectiveness, moral character, leadership, etc. – at various points in their academic career to see the effects of Liberal Arts teaching, holding constant the effects of things like gender, age, race, prior GPA, etc.  You can read about the results here – and do read them, because it is an interesting study.

At one level, the results are pretty much what we always thought: students do better if they are in classes where the teaching is clear and well-organized, and they learn more where they are challenged to do things, like applying theories to practical problems in new contexts, or integrating ideas from different courses in a project, or engaging in reflective learning.  And as can be seen here in the summary of results, the biggest positive effects of liberal arts education are on moral reasoning, critical thinking, and leadership skills (academic motivation, unfortunately, actually seems to go down over time).

So: mostly good for Liberal Arts/humanities, right?  Not quite.  Let me quote the most interesting bit: the research found that “even with controls for student pre-college characteristics and academic major, students attending liberal arts colleges (as compared to their peers at research universities and regional institutions) reported significantly higher levels of clarity and organization in the instruction they received, as well as a significantly higher frequency of experiences on all three of the deep-learning scales.”  In other words, the effects of Liberal Arts on students in Liberal Arts colleges are significantly greater than the effects on students studying similar programs in other, larger institutions.  That is to say, it’s the teaching environment and teaching practices, not the subject matter itself, which seems to make more of a difference.

Now, this does not suggest that the Liberal Arts/humanities can’t deliver those kinds of benefits at larger universities; it’s just to say that for them to deliver those benefits, the focus needs to be on providing the subject matter using quite specific teaching practices and – not to beat around the bush – keeping class sizes down (which may in turn have implications for teaching loads and research activity, but that’s another story).

There are some good stories for the Liberal Arts in the Wabash data, and some not so good stories.  But the point is, there is data.  There are some actual facts and insights that can be used to improve programs, to make them better at producing well-rounded critical thinkers.  And at the end of the day, the inquiry itself is what’s important.  The humanities’ biggest problem isn’t that they’ve got nothing to sell; it’s that too frequently they act like they have nothing to learn.  If more institutions adopted Wabash-like approaches, and acted upon them, my guess is the Liberal Arts would get a lot more respect than they currently do.

December 03

Every University and College Needs a Fool

OK, yes, lots of ways to complete that sentence (e.g. “Every university and college needs a fool… and mine already has several”, etc.).  But I mean this in a very literal sense.  Institutions need the equivalent of Medieval Fools, or Court Jesters, to help them combat bad institutional culture.

In addition to being a barrel of laughs, Fools had a specific function in medieval and early renaissance courts; namely, they were able to speak truth to power, albeit obliquely (think Robin Williams rather than Jon Stewart). Because they were dressed as figures of fun, they had some license to tweak the noses of the powerful, because their words could be shrugged off as the ravings of a simpleton.  Yet, frequently, those ravings were useful because they presented truths that could not otherwise be said aloud.  Those Fools were no fools; as Shakespeare said, playing the Fool took considerable wisdom.

Now, I’m not actually suggesting that universities and colleges need to dress someone up in an ass’s costume and run around making fun of people in an academic council meeting (as inspiring a thought as that may be).  Nor am I suggesting that there needs to be someone who is specifically charged with poking fun at executive power at a university – most institutions already have enough self-appointed critics filling that job.

No, what I have in mind is something different: someone who has license to speak truth across the institution.  Not constantly, as a gadfly role (that would just get annoying).  But occasionally, maybe once every year, it would be useful for a Fool to give each institution a once-over.  And where I think this could be most useful is not on issues of specific policy – again, each institution has lots of self-appointed critics of management to do that – but rather on issues of institutional culture.

As a friend observed to me yesterday, bad institutional culture never looks bad from the inside.  There are always good reasons for this little bit of secrecy, or that flippant refusal to make data public; there are always good reasons for sanctioning financial or business entanglements that are at best borderline, or for not making tough decisions, thus allowing problems to fester.

No one sets out to be part of a bad institutional culture.  Bad cultures are created gradually, inch-by-inch, so slowly that no one on the inside notices.  The function of a university/college Fool would be to come in from the outside and say, maybe once a year, forcefully and publicly: What the heck are you people doing?  How did you all get this inappropriately cozy with industry?  How did your principles of governance get so undermined that the faculty union thought it appropriate to grieve Senate decisions? (Don’t scoff – this has happened.) Why are you even thinking about evicting a student union from its building? 

Everybody wants to be part of a good academic culture.  Fools might be able to play a role in keeping everyone on the straight and narrow.  It’s got to be at least as good an idea as having organizational behavior consultants crawling all over the place.

December 02

The Seven Habits of Highly Effective Universities and Colleges

One of the problems in higher education is that a whole lot of effort is expended on “who’s the best” (which, as measured by most rankings, is some function of money, age, and size), and not a lot of serious effort is put into answering the question: “how can institutions get better?”  (Or at least, into finding answers that don’t boil down to: publish more/get more international students.)

I get to see a fair number of universities around the world.  And so while I don’t claim the following list is based on anything like empirical data, I can say that nearly all good universities follow these same seven habits.

1)     They are outwardly focussed. Highly effective universities understand that little can be accomplished inside a single institution.  Effective universities need partners – other universities, businesses, governments, whatever.  But building effective partnerships requires three things: a very good understanding of what those potential partners want, an understanding of what they think of you as a partner, and a willingness to change in order to become a better partner.  Some of this depends on having the other qualities listed below; but at root, it depends on being focussed on possibilities that exist outside the university, and doing whatever it takes to exploit them.

2)     They focus on what they can control, not what they can’t.  The surest sign a university isn’t effective is that it spends a lot of time moaning about what government is or isn’t doing.  Sure, government can have positive or deleterious effects.  And it’s important for universities to make their voices heard in order to promote good policies over bad ones.  But it’s even more important not to dwell on this subject.  In most developed countries – and certainly here in Canada – institutions have sufficient control over finance and policy to make an enormous difference to their own situation.  Effective institutions maintain focus on this fact.

3)     They Pay Attention to Hiring.  At the end of the day, an institution’s nature and culture is a product of the hiring process.  Make a mistake – bring in a prof who is a whinger, or who is inclined to slack off after gaining tenure – and you infect a department for a generation.  Every academic hire shapes the institution’s academic profile; every academic hire is implicitly a multi-million dollar decision.  There is literally no job more important at a post-secondary institution than hiring.

4)     They Set High Standards.  There cannot be high performance without standards.  These need not always be written down; in fact, arguably, at the very highest-performing institutions there is no need for codified standards.  But one way or another, institutions need to ensure that units are performing at their best; they also need ways to be seen to be holding people accountable for working at their best.

5)    They Tell Stories.  Strong institutional cultures require a common belief in a narrative about what makes the institution great.  Great university and college leaders spend a lot of time finding ways to create and reinforce those narratives.  The sign of a great institution?  People all tell the same anecdotes to explain how and why their institution came to greatness.

6)     They Know How to Decide and Move On.  Whether they have strong Presidencies, or whether they have remarkably effective governance processes, effective universities don’t faff around.  They take strategy seriously and they take important decisions with due consideration, but not undue delay.

7)     Respect.  The best institutions treat everyone with respect.  Students.  Staff.  Stakeholders (particularly government and taxpayers).  That doesn’t mean they bend over backwards to accommodate every whim from these groups; it just means they treat them with due regard.  Students and staff aren’t patronized; discussions with government and the public are honest and evidence-based.

This isn’t to say money, age, and size don’t help.  But in their absence, these seven traits make it easy to distinguish between the top performers and the rest.

December 01

The Higher Education of Heads of Government

To follow up on yesterday’s musings about the educational history of Canadian Prime Ministers: I think you can tell something about a country’s social structure just by looking at the clustering of leaders’ educational backgrounds.

In this exercise, I look at the records for Canadian, British, Australian, Japanese, and New Zealand Prime Ministers, German Chancellors, and French and American Presidents.  I would have included Italy but politicians’ Wikipedia bios are weirdly silent on education (even in the Italian versions).  I take all leaders back to 1900, except in New Zealand where Dominion Status was not granted until 1907, and Japan where I stop at 1945 because holy moley there are a lot of them.
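(For readers who want to replicate this sort of tally themselves, it amounts to little more than counting alma maters and looking at how lopsided the count is.  Here’s a minimal sketch in Python, using a hypothetical sample list of post-1940 UK prime ministers and the institutions attributed to them in the discussion below – the data is illustrative, not a complete or authoritative record.)

```python
from collections import Counter

# Illustrative sample only: some post-1940 UK prime ministers and where
# they studied (None = did not attend university), per the discussion below.
uk_pms = {
    "Churchill": "Sandhurst",
    "Attlee": "Oxford",
    "Eden": "Oxford",
    "Macmillan": "Oxford",
    "Wilson": "Oxford",
    "Callaghan": None,   # did not attend university
    "Thatcher": "Oxford",
    "Major": None,       # did not attend university
    "Blair": "Oxford",
    "Brown": "Edinburgh",
}

# Tally institutions, ignoring non-attenders.
counts = Counter(school for school in uk_pms.values() if school)

# Crude "concentration" measure: share of university-educated PMs who
# attended the single most common institution.
top_school, top_n = counts.most_common(1)[0]
concentration = top_n / sum(counts.values())
print(top_school, round(concentration, 2))  # → Oxford 0.75
```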

The most interesting thing to me is the degree of concentration we see in each country.  In the UK, it is absolutely absurd, with nine of the last thirteen prime ministers (dating back to 1940) having studied at Oxford (Callaghan and Major did not attend university, Churchill went to Sandhurst, and Brown studied at Edinburgh).  Australia runs a close second.  Of the fifteen prime ministers with a university education (fourteen did not attend), seven went to the University of Sydney, three to Melbourne, and one each to ANU and Western Australia (two went to UK universities, without ever attending an Australian one).

Japan and France have a different sort of concentration.  In France, where every head of state since 1900 has had a post-secondary degree, fourteen of the seventeen Presidents studied in Paris.  The pre-WWI presidents, all of whom went to school before 1908, during a period when there was only one university in France (but lots of different affiliated faculties dotted around the country), nearly all studied Law in Paris.  Since de Gaulle, all Presidents have attended a “Grande École” in Paris, with the exception of Sarkozy, who attended Paris X.  In Japan, 29 of 32 post-war prime ministers studied in Tokyo, the only exceptions being Uno (Kobe), Ikeda (Kyoto), and Tanaka (no PSE).  Eleven of these went to the University of Tokyo, and seven to Waseda, with the rest scattered around the capital’s other, mainly private, universities.  So, in the UK, France, Japan, and – perhaps oddly – Australia, elites come from a fairly narrow set of proving grounds.  The US is a bit better, but maybe not much: the last twenty Presidents have five Harvard degrees and five Yale degrees among them (only one – Bush Junior – has both).

However, Canada and Germany seem to have much less concentrated patterns of attendance.  In Canada, the university with the most prime ministerial graduates is U of T (four out of seventeen).  In Germany, you have to be careful how you count pre-WWII Chancellors, because those guys went to school in the 1800s, when it was still the tradition to wander around taking courses at three universities before eventually sitting a set of exams somewhere (credits were not a thing back then), but Humboldt has five Chancellors (out of twenty-seven) if you don’t get picky about where the exams were taken.  In other words, in these countries, the path to the top seems somewhat more open to people from a wider set of backgrounds.

But that’s nothing compared to New Zealand, where only seven out of twenty-two prime ministers even went to university (the closest competitor on that score is Australia, where fourteen out of twenty-nine were non-attenders), and no university can claim more than two of them.  In fact, they’ve had as many prime ministers who did not finish secondary school as they’ve had prime ministers who finished university.  The contrast with Canada is fascinating; even if you knew nothing else about the two countries, you’d know our society has been much more urbanized and stratified, for longer, than our kiwi cousins’.

To summarize:

- UK: nine of the last thirteen PMs (since 1940) studied at Oxford
- Australia: seven of the fifteen university-educated PMs went to Sydney
- France: fourteen of seventeen Presidents studied in Paris
- Japan: twenty-nine of thirty-two post-war PMs studied in Tokyo (eleven at the University of Tokyo, seven at Waseda)
- United States: five Harvard and five Yale degrees among the last twenty Presidents
- Canada: four of seventeen PMs went to U of T
- Germany: five of twenty-seven Chancellors studied at Humboldt
- New Zealand: only seven of twenty-two PMs attended university, and no institution claims more than two
Another interesting comparison is with respect to study abroad. Canada looks pretty good on this measure, with six PMs having got some education abroad, but we actually come second to Australia, which has seven.  The Japanese do well, too, with five.  Germany has a number of Chancellors who studied at Strasbourg, but at the time it was part of Germany – Adenauer (LSE) and Luther (Geneva) were the two who actually studied abroad.  The New Zealanders have one.  The Americans have one and change, what with Clinton having had a couple of years at Oxford and Kennedy a couple of weeks at LSE.  The French and English, needless to say, have none.

(A final completely tangential fact I have to throw in here, because not enough people know it: before moving to Columbia, Obama started his educational career at Occidental College, which was the real-life setting for “California University” on the original Beverly Hills 90210.  This means he literally could have been at the Peach Pit all those years.  Fabulous.)

Anyways, I’m not sure much of this means anything, but it is an interesting way to think about comparative stratification, both social and educational.
