Higher Education Strategy Associates

January 12

Management in Universities

In organizations, people work in teams, but making teams work together effectively is difficult: this is what management is for.  It doesn’t always work well, but efficient management – making teams work together smarter, faster, and better – is the key to organizational success, whether you are in the private, public, or non-profit sectors.

Universities, of course, are an exception.

OK, not entirely.  Every university has units that must act as a team in order to deliver results.  Bookstores, admissions offices, physical plant: if teamwork goes down, if work is badly managed, the unit will not produce the desired results, and this can have deleterious effects on other units (difficult to do lab work or teach classes if the heat isn’t working properly; tough to pay staff if admissions are falling, etc.).

But in academic units?  Ha!  No.

It’s not that academics are resistant to teamwork.  The lone wolf is rare in academe.  If an academic is running a lab, s/he is running a team.  Any major long-term project – whether funded through a granting council or self-initiated/funded – involves co-operation with one or more scholars and co-authors, and requires co-ordination of work among scholars who may be all over the world.  Teams are everywhere.

But for most profs, the term “team” simply doesn’t apply to the folks down the hall who just happen to have adjacent offices.  That’s not to say they dislike those folks; they may go for coffee together, they may team-teach the odd class, and they recognize “they are all in this together” in the sense that they are all getting paycheques from the same source. But fundamentally, departments and faculties are not seen as a key unit of collaboration.

To people not embedded in the academy, this sounds bananas.  For instance, academic staff in colleges, where departments are seen as teams jointly delivering an integrated academic program, tend to find this behaviour nonsensical.  But in universities, non-professional undergraduate programs (i.e. those not subject to accreditation) and degrees are only dimly seen as a product that requires “management”.  Indeed, the entire academic architecture of North American universities has been set up to avoid thinking of degrees as a specific set of inputs requiring efficient management.

We set up degrees as smorgasbords from which students choose, rather than (as in most of the rest of the world) a fairly structured set of modules requiring integration.  Get so many credits from bucket A, and some from bucket B, and a few from bucket C, and Presto!  A degree.  No integration required.  And then we inculcate professors in a peculiar academic ideology in which the principal meaning of academic freedom is what some call “classroom sovereignty” – i.e. what happens in class is my business and no one else’s.  The idea that a particular class covering a particular subject might belong to the department as a whole – the academic unit responsible for ensuring quality control – rather than to the individual professor is a violation of academic freedom, at least according to the Canadian Association of University Teachers, our august national faculty body.

(Note: I am very definitely not endorsing this point of view.  Just explaining it.)

So, having set up degree programs so that teamwork is unnecessary, except for somewhat pro-forma curriculum reviews, profs are unsurprisingly a bit bewildered to find there are a lot of managers floating around, particularly at the faculty level.  What are they all doing, exactly, one reasonably wonders?

And the answer, briefly, is that a lot of people who get called “managers”, and may even have the title of manager, are in fact not managing anybody, but rather are simply doing tasks that are deemed to require professional competence.  Sometimes these people are academics on secondment (in which case, they get a small bump in pay and an “Associate Dean” title of some sort), or they are non-academics with a particular skill: someone to do communications, marketing, alumni relations, development, event co-ordination, etc.   A lot of them get “director” or “manager” titles not because of managerial responsibility, but rather because of simple title inflation.

So yes, there is a lot of management in universities.  But it doesn’t involve managing academics, who on the whole prefer to be left unmanaged.  And as long as one could assume with some confidence that everyone was pulling their weight, and being rewarded according to their contributions, it would be fine.  I leave it to the reader to decide if that’s actually the case.

January 11

Why Class Size Matters (Up to a Point)

At the outset of the MOOC debate about four years ago, there was a line of argument that went something like this:

MOOC Enthusiast:  These MOOCs are great.  Now the classroom is not a barrier.  Now we can teach hundreds of thousands of students at a time!  Quel efficiency!

Not MOOC Enthusiast:  They’re just videos.  They can’t give you the same human touch as an in-class experience with a professor.

MOOC Enthusiast: How’s that human touch going for you in the 1,000-person intro class?

To which there was never really a particularly good reply, just a lot of sputtering about underfunding, etc. The fact is, from a student’s point of view, there probably isn’t a lot of difference between a 1,000-person classroom and an online course, at least as far as personal touch from a professor is concerned.  There are some other differences, of course, mainly in terms of the kinds of study supports available, but if your argument is that direct exposure to tenured faculty is what matters, then this is kind of beside the point.

There was a period of time during which it was fashionable to say that class size didn’t matter, and that it was what happened in the class, not how big it was, etc., etc.  I am ever less convinced by some of these arguments.  Small classes matter for two reasons.  One is the ability – in science, health, and engineering disciplines in any case – to be in contact with advanced equipment.  If classes are too large, students don’t get enough time with the top equipment and hence aren’t as prepared for careers in their fields as they might be.  Obviously this matters more in places like Africa than in North America, but you’d be surprised at how often this issue pops up here.  I know of at least one “world-class” university in Canada that, faced with budget cuts in the late 1990s, instituted a policy of not offering lab courses to science majors until third year (yes, really).

The second reason is perhaps more universal: the larger the class, the less interaction there is, not just between professors and students but also among students.  And this interaction matters because it is the key to developing many of the soft skills required for employability.  Work that is presented in class and argued over with colleagues – whether assigned to teams or individuals – is pretty much the only setting in which students actually come to understand in real time how arguments are made and broken, how to interact with colleagues and experts, and how to deal with (hopefully constructive) criticism, among other skills. When I go to developing countries (where I am currently doing a lot of work) and I hear that students don’t have labour force skills, this is exactly what employers are talking about, and there’s simply no way to provide them with those skills at the scale of classes currently being offered.  So, small classes are good, but not primarily for disciplinary reasons (though those may benefit as well).  It’s mostly about employability.

Canadian polytechnics actually worked this out a while ago.  One of the most notable differences between degree programs at polytechnics and universities is that class sizes are relatively constant over four years in polytechnics, whereas universities (apart from the smallest of liberal arts colleges) employ a pyramid model, with huge classes in first year and many more small ones in upper years (CUDO data – flawed as it is – suggests that there are more classes of 30 students or fewer for 4th-year students than there are classes of all sizes for first-year students).  Students at polytechnics get the benefits of smaller classes all the way through, while most university students don’t see these benefits until third year at the earliest.

By this, I don’t mean to suggest that class size is destiny.  The point that what happens in a class is a function of more than its size is a relevant one (although a slightly trickier one to make today than in pre-MOOC times).  But interaction matters.  If institutions are going to increase class sizes (as they have done repeatedly over the past two decades, both through admitting more students and reducing professors’ undergraduate course loads), there needs to be a strategy to work out how interaction can be maintained or improved.  Otherwise, it’s very hard to say that quality isn’t being impaired.

January 04

Innovation Buzzword Bingo

Morning all.  Regular service will be pushed back one week to January 10th, but I couldn’t let the Globe op-ed “Southern Ontario Should be an Innovation Cluster, Not a Farm Team” by three Ontario university presidents (McMaster’s Patrick Deane, Toronto’s Meric Gertler, and Waterloo’s Feridun Hamdullahpur) go without comment.

The article reads like someone set out to fill a buzzword bingo card.  Words like “supercluster”, “resilient”, “enhancing interaction”, “external connectivity”, “cluster-building infrastructure”, and “entrepreneurship ecosystem” all duly make an appearance; hell, there’s even a reference to Michael Porter.  And while none of it is wrong, exactly – clusters are good, infrastructure never hurts, etc. – the six actual policy proposals the presidents lay out in terms of creating an innovation cluster are mighty thin.

1)  Invest in organizations that drive local economic development and quality of life, from civically minded governance bodies to cultural institutions.

In what way does this proposal differ from what governments already do?  Do governments everywhere not invest in cultural institutions and things that drive local economic development?  Are there things governments should stop doing in order to prioritize these things?  And how might one distinguish good from bad investments in cultural institutions?  When should the spending stop?

2)  Co-ordinate investments in research areas with both the highest success rates and strongest growth potential, from regenerative medicine to quantum science; from advanced materials to environmental technologies.

Co-ordinate how?  How is “success rate” measured?  Or “growth potential”?  Is this actually a plea to prioritize CFREF-type programs over granting council funding?  Or perhaps it’s a plea for granting councils to become more focussed in their funding?  A lot more detail should be here before anyone signs on to this principle.

3)  Ensure that our immigration rules make us a destination of choice for high-potential individuals.

I think this is a plea for government to streamline immigration procedures, awkwardly phrased.  And, yeah, streamline away.  Can’t hurt.  But generally speaking, being a destination of choice has more to do with economic opportunities in a country than the state of its entry visa system.

4)  Turn taxpayers into equity partners and give the public a share of the upside.

Equity partners in what?  New companies?  Like Mariana Mazzucato wants?  Can anyone name a single successful innovation cluster where this happens?  Try to imagine how the public sector would behave if it had an equity stake in companies.  Imagine what it would do to pick or favour winners in order to maximize share value.  Imagine the pressure to “save” or “bail out” losers.  Imagine the chaos that would surround the decision to ever try to sell shares.  This is a half-baked nightmare of an idea, one which would effectively impose a form of Peronism on any emerging tech sector.  Does anyone truly believe this would make tech companies more successful?  Please.

5)  Support firms that can scale up by connecting them to successful mentors, addressing gaps in our venture financing systems, and leveraging public procurement strategically.

Can anyone point me to a single study that links firm size to quality or quantity of mentorship?  No?  OK, then.

6)  Inculcate a culture of risk-taking that rewards rather than penalizes failure, which fosters adaptability and learning from mistakes.

Can anyone point me to a single instance anywhere of a government successfully inculcating a culture of risk-taking in business?  No?  OK, then.

Like I said before, it’s not so much that these ideas are wrong (well, apart from the taxpayer equity stuff) as that they are painfully unspecific.  It’s great that universities are now at least couching their requests for more research funding in the context of an acknowledgement of innovation ecosystems, and not simply relying on the absurd formula of: $ for university research –> Black Box where miracles occur –> Innovation!

But in practice most of these recommendations are either not particularly workable or vague to the point of being unhelpful.  Better innovation policy is going to require a lot more than this.

December 11

Signing Off (For Now)

A short one today, the last of term.  Normal service will resume January 4th, 2016.

It’s been an interesting semester.  Progress on some fronts, I think.  I was certainly very pleased that, for the first time ever, all three major national parties released election platforms that included investments in students and, in one way or another, targeted lower-income families.  That’s progress.  The incandescently idiotic Green platform suggests there’s still a market for dumb PSE ideas, but you can’t have everything.

But progress in one area, eternal return to bad ideas in another.  Take for example the “students and universities suck these days” piece published in the LA Review of Books last week.  Written by UPEI religious studies prof Ron Srigley, it’s almost a cartoon of every knee-jerk “kids these days!  Why isn’t the academy the way it was in the 1960s?” piece written in the last two decades.  Unengaged students! Too many administrators!  E-learning is an abomination! Etc., etc.  Snore.

(If you’re curious why he wrote this piece for an American publication instead of a Canadian one, it’s possibly because he calls UPEI – and by extension his former haunts at Laurentian and Nipissing – “3rd and 4th tier Canadian institutions”.  Wonder what the Arts faculty Christmas party is going to be like in Charlottetown this year.)

From a personal point of view, of course, the cockroach-like staying power of bad ideas in PSE is depressing.  Why can’t the sector move on?  Why must there be so many dinosaurs?  But of course from a consulting point of view it’s heartening.  The perpetual re-emergence of bad ideas means I get to re-do the same projects every decade or so, which really helps with margins.  Plus I get to re-cycle blog posts.

And speaking of blog posts, it’s time for my twice-yearly bleg for feedback:  How are you enjoying these blog posts?  Is there stuff you like?  Don’t like?  If there are things I can be doing better, I’d love to know.

And finally, thanks to all for reading all year.  It’s awfully nice to have an audience that is as engaged and committed to education as you folks are.  Happy holidays to all.

December 10

Reports, Books, and CUDO

It’s getting close to that time of year when I need to sign off for the holidays (tomorrow will be the last blog until January 4th).  So before then, I thought it would be worth quickly catching up on a few things.

Some reports you may have missed.  A number of reports have come out recently that I have been meaning to review.  Two, I think, are of passing note:

i) The Alberta Auditor-general devoted part of his annual report (see pages 21-28) to the subject of risk-management of cost-recovery and for-profit enterprises in the province’s post-secondary institutions, and concluded that the government really has no idea how much risk the province’s universities and colleges have taken on in the form of investments, partnerships, joint ventures, etc.  And that’s partly because the institutions themselves frequently don’t do a great job of quantifying this risk.  This issue’s a sleeper – my guess is it will increase in importance as time goes on.

ii) The Ontario auditor general reviewed the issue of University Intellectual Property (unaccountably, this story was overlooked by the media in favour of reporting on the trifling fact that Ontarians have overpaid for energy by $37 billion over the last – wait, what?  How much?).  It was fairly scathing about the province’s current activities in terms of ensuring the public gets value for money for its investments. A lot of the recommendations to universities consisted of fairly nitpicky stuff about documentation of commercialization, but there were solid recommendations on the need to track the impact of technology transfer, and in particular the socio-economic impact.  Again, I suspect similar issues will crop up with increasing frequency for both governments and institutions across the country.

Higher Ed Books of the Year.  For best book, I’m going to go with Lauren Rivera’s Pedigree: How Elite Students Get Elite Jobs, which I reviewed back here.   I’ll give a runner-up to Kevin Carey’s The End of College, about which I wrote a three-part review in March (here, here, and here).  I think the thesis is wrong, and as others have pointed out there are some perspectives missing here, but it put a lot of valuable issues about the future of higher education on the table in a clear and accessible way.

Worst book?  I’m reluctantly having to nominate Mark Ferrara’s Palace of Ashes: China and the Decline of American Higher Education.  I say reluctantly because the two chapters on the development of Chinese higher education are pretty good.  But the thesis as a whole is an utter train wreck.  Basically it amounts to: China is amazing because it is spending more money on higher education, and the US is terrible because it is spending less money on higher education (though he never bothers to actually check how much each is spending, say, as a proportion of GDP, which is a shame, as he would quickly see that US expenditure remains way above China’s even after adjusting for the difference in GDP).  The most hilarious bits are the ones where he talks about the erosion of academic freedom due to budget cuts, whereas in China… (you see the problem?  The author unfortunately doesn’t).  Dreck.

CUDO: You may recall I had some harsh things to say about the stuff that Common University Dataset Ontario was releasing on class sizes.  I offered a right of reply, and COU has kindly provided one, which I reproduce below, unedited:

We have looked into the anomalies that you reported in your blog concerning data in CUDO on class size.  Almost all data elements in CUDO derive from third party sources (for example, audited enrolment data reported to MTCU, NSSE survey responses) or from well-established processes that include data verification (for example, faculty data from the National Faculty Data Pool), and provide accurate and comparable data across universities. The class size data element in CUDO is an exception, however, where data is reported by universities and not validated across universities. We have determined that, over time, COU members have developed inconsistent approaches to reporting of the class size data in CUDO.

 COU will be working with universities towards more consistent reporting of class size for the next release of CUDO.

With respect to data concerning faculty workload:  COU published results of a study of faculty work in August 2014,  based on data collected concerning work performed by full-time tenured faculty, using data from 2010 to 2012. We recognize the need for further data concerning teaching done by contract teaching staff. As promised in the 2014 report, COU is in the process of updating the analysis based on 2014-15 data, and is expanding the data collection to include all teaching done in universities by both full-time tenured/tenure track faculty and contract teaching staff. We expect to release results of this study in 2016.

Buonissimo.  ‘Til tomorrow.

December 09

Who Are the U-15, Exactly?

Over the last few years, two new players have been introduced into the Ottawa higher education lobbying ecosystem.  One is Polytechnics Canada – essentially, the country’s largest, most technologically sophisticated, and most research-intensive colleges; the other is the U-15 – essentially the country’s largest, most technologically sophisticated, and most research-intensive universities.  For a variety of reasons, both of these new players have had a pretty good run in Ottawa lately. Certainly, in the U-15’s case, it’s often seemed like the tail wagging the AUCC dog (see CFREF stories, passim).  But for an organization with such clout, its membership criteria are fairly opaque.

The U-15 had its origins in a caucus of five research-intensive Ontario institutions (McMaster, Waterloo, Toronto, Queen’s, and Western) in the late 1980s.  In the early 1990s, the group expanded to include the big three Quebec universities (McGill, Montreal, and Laval), plus UBC and Alberta, and was renamed the Group of Ten, or G-10.  Then, in the mid-2000s, Ottawa, Dalhousie, and Calgary were added, and more recently the inclusion of Manitoba and Saskatchewan has added some geographical balance.  The “G” changed to a “U” and voila!  U-15.

That the U-15 is a coalition of research intensive universities is clear.  But the decisions about which institutions get to join, and which do not, seem remarkably arbitrary.  Are they the 15 biggest universities?  Obviously not, or York would be there.  Are they the 15 universities with the largest granting council incomes?  No; if it were, then Guelph would be in and Dal would be out.  How about the 15 universities with the largest research incomes per faculty?  Nope; you’d have to include INRS, ETS, and Victoria. Research impact?  Depends on the measure used, but you could make a pretty good case for Simon Fraser on some of these.

Put it this way: the original G-10, plus Calgary and Ottawa, have a pretty good case to be a U-12.  Dal, Saskatchewan, and Manitoba are not manifestly different on most performance indicators from Simon Fraser, Guelph, Victoria, and York.  So why are the former members, and the latter not?  The answer, in a word, is politics.  Meteorological conditions in the underworld would need to be unseasonably frosty before UBC lets another BC institution into the club (Toronto and York have a similar issue).  Meanwhile, the Dal/Man/Sask trio give the organization a more pan-Canadian patina – without them, the club would be confined to the country’s four largest provinces.  This extra breadth goes a long way in Ottawa to convincing people you’re a truly “national” organization.

Now, of course the U-15 isn’t really alone in being a self-appointed elite, with fuzzy boundaries between itself and the rest of the pack.  Just last month, an article in the Oxford Review of Education came to the same conclusion regarding the UK’s “Russell Group”, saying that on most measures, only Oxford and Cambridge really stand apart from the rest of the country’s pre-92 universities (in the UK, a pre-92 university is one that was purpose-built as such; a post-92 university is one that was originally a polytechnic – think of the dividing line between “research” and “regional” universities in BC, and you’ve more or less got it).

But it’s not always the case.  In Australia, the G-8 really are the top 8 in research almost any way you slice it.  And in the United States, the Association of American Universities (which, by a weird historic fluke, also includes McGill and U of T) actually has quite a detailed set of membership criteria: research expenditures normalized by number of faculty, number of National Academy members, National Research Council faculty quality indicators, faculty honors, and scholarly citations.  In fact, it’s strict enough that four years ago the organization actually kicked out Nebraska (Syracuse subsequently withdrew voluntarily so as not to suffer the same fate).

However, this is Canada: asking elites to meet some established criteria for proving eliteness isn’t really a thing here.  Because, you know, someone might fail.  Embarrassment would be caused.  And that wouldn’t do, not for someone already “in the club”.  So the likelihood of any U-15 members being asked to leave is pretty slight.  And while the U-15 would probably prefer that the top tier of non-members go and form their own club, as they have done in other Anglophone countries (e.g. the Innovative Universities Group in Australia, or the [now-defunct] 1994 Group in the UK), the likelier outcome is a gradual process of letting in new members – eventually.  Guelph will come first, because it causes the least political friction with an existing member.  SFU and Vic would be the next obvious candidates.

Of course the problem is that if the group is insufficiently exclusive, the top tier may walk away themselves and form their own, even more exclusive group.  U-5, anyone?

December 08

Innovation Ecosystems: Promise and Opportunism

We sometimes think of innovation policy as being about generating better ideas through things like sponsored research.  And that’s certainly one part of it.  But if those ideas are generated in a vacuum, they go nowhere – making ideas spread faster is the second pillar of innovation policy (a third pillar – to the extent that innovation is about new product-generation – has to do with venture capital and regulatory environments, but we’ll leave those aside for now).

Yesterday, I discussed why the key to speeding up innovation was the density of the medium through which new ideas travel: basically, ideas about IT travel faster in Waterloo than in Tuktoyaktuk; ideas about marine biology travel faster in Halifax than in Prince Albert.  And the faster ideas travel and collide (or “have sex” in Matt Ridley’s phrase), the more innovation is produced, ceteris paribus.

Now, although they don’t quite use this terminology, the proponents of big universities and big cities alike find this logic pretty congenial.  You want density of knowledge industries?  Toronto/Montreal/Vancouver have that.  You want density of superstar researchers?  U of T, McGill, and UBC have that (especially if you throw in allied medical institutes).  That makes these places the natural spot to invest money for innovation, say the usual suspects.  All you need to do is invest in “urban innovation ecosystems” (whatever those are – I get the impression it’s largely a real estate play to bring scientists, entrepreneurs, and VCs into closer spatial proximity), and voila!  Innovation!

This is where sensible people need to get off the bus.

It’s absolutely true that innovation requires a certain ecosystem of researchers, and entrepreneurs, and money.  And on average productive ecosystems are likelier to occur in larger cities, and around more research-intensive universities.  But it’s not a slam dunk.  Silicon Valley was essentially an exurb of San Francisco when it started its journey to being a tech hub.  This is super-inconvenient to the “cool downtowns” argument made by the Richard Floridas of this world; as Joel Kotkin has repeatedly pointed out, innovative companies and hubs are as likely (or likelier) to be located in the ’burbs as they are in funky urban spaces, mainly because it’s usually cheaper to live and rent space there.  Heck, Canada’s Silicon Valley was born in the heart of Ontario Mennonite country.

We actually don’t have a particularly good theory of how innovation clusters start or improve.  Richard Florida, for instance, waxes eloquent about trendy co-working spaces in Miami as a reason for its sudden emergence as a tech hub. American observers tend to attribute success to the state’s low tax rate, and presumably there are a host of other possible catalysts.  Who’s right?  Dunno.  But I’m willing to bet it’s not Florida.

We have plenty of examples of smaller communities hitting tech take-off without having a lot of creative amenities or “urban innovation strategies”. Somehow, despite the lack of population density, some small communities manage to get their ideas out in the world in ways that get smart investors’ attention.  No one has a freaking clue how this happens: research on “why some cities grow faster than others” is methodologically no more evolved than research on “why some universities become more research intensive than others”, which is to say it’s all pretty suspect.  Equally, some big cities never get particularly good at innovation (Montreal, for instance, is living proof that cheap rent, lots of universities, and bountiful cultural amenities aren’t a guarantee of start-up/innovation success).

Moreover, the nature of the ecosystem is likely to differ somewhat in different fields of endeavor.  The kinds of relationships required to make IT projects work are quite different from the kinds required to make (for example) biotech work.  The former is quick and transactional; the latter requires considerably more patience, and hence is probably less apt to depend on chance meetings over triple espressos in a shared-work-environment incubator.  Raleigh-Durham and Geneva are both major biotech hubs that are neither large nor particularly hip (nor, in Raleigh’s case, particularly dense).

It’s good that governments are getting beyond one-dimensional policy instruments like “more money in granting councils” or “tax credits”, each of which is unlikely on its own to kickstart innovation.  It’s good that we are starting to think in terms of complex inter-relations between actors (some, but not all, of which involve spatial proximity), and using “ecosystem” metaphors.  Complexity matters.

But to jump from “we need to think in terms of ecosystems” to “an innovation agenda is a cities agenda” is simply policy opportunism.   The real solutions are more complex. We can and should be smarter than this.

December 07

H > A > H

I am a big fan of the economist Paul Romer, who is most famous for putting knowledge and the generation thereof at the centre of discussions on growth.  Recently, on (roughly) the 25th anniversary of the publication of his paper on Endogenous Technological Change, he wrote a series of blog posts looking back on some of the issues related to this theory.  The most interesting of these was one called “Human Capital and Knowledge”.

The post is long-ish, and I recommend you read it all, but the upshot is this: human capital (H) is something stored within our neurons, which is perfectly excludable.  Knowledge (A) – that is, human capital codified in some way, such as writing – is nonexcludable.  And people can use knowledge to generate more human capital (once I read a book or watch a video about how to use SQL, I too can use SQL).  In Romer’s words:

Speech. Printing. Digital communications. There is a lot of human history tied up in our successful efforts at scaling up the H -> A -> H round trip.

And this is absolutely right.  The way we turn patterns of thought in one person’s head into thoughts in many people’s heads is the single most important question in growth and innovation, which in turn is the single most important question in human development.  It’s the whole ballgame.

It also happens to be what higher education is about.  The teaching function of universities is partially about getting certain facts to go H > A > H (that is, subject matter mastery), and partially about getting certain modes of thought to go H > A > H (that is, ways of pattern-seeking, sense-making, meta-cognition, call it what you will). The entire fight about MOOCs, for instance, is a question of whether they are a more efficient method of making H > A > H happen than traditional lectures (to which I think the emerging answer is that they are competitive if the H you are talking about is “fact-based”, and not so much if you are looking at the meta-cognitive stuff).  But generally, “getting better” at H > A > H in this way is about getting more efficient at the transfer of knowledge and skills, which means we can do more of it for the same price, which means that economy-wide we will have a more educated and productive society.

But with a slight amendment it’s also about the research function of universities.  Imagine now that we are not talking H > A > H, but rather H > A > H1.  That is, I have a certain thought pattern, I put it into symbols of some sort (words, equations, musical notation, whatever) and when it is absorbed by others, it generates new ideas (H1). This is a little bit different than what we were talking about before.  The first is about whether we can pass information or modes of thought quickly and efficiently; this one is about whether we can generate new ideas faster.
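For readers who want the formal version, this research loop is (roughly) what the knowledge-production equation in Romer’s 1990 paper captures.  A sketch, using Romer’s own symbols, where H_A is human capital devoted to research, A is the existing stock of codified knowledge, and delta is a productivity parameter:

```latex
% Romer (1990), "Endogenous Technological Change", research sector:
% the rate of new knowledge production depends on human capital
% engaged in research (H_A) working with the existing stock of
% codified knowledge (A); \delta is a productivity parameter.
\dot{A} = \delta H_A A
```

The multiplicative form is the point: the same researcher generates new ideas faster when the surrounding stock of accessible knowledge is larger, which is the same logic as the “density of receptor capacity” argument below.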

I find it helpful to think of new ideas as waves: they emanate outwards from the source and lose intensity as they move further from the source.  But the speed of a wave is not constant: it depends on the density of the medium through which the ideas move (sound travels faster through solids than water, and faster through water than air, for instance).

And this is the central truth of innovation policy: for H > A > H1 to work, there has to be a certain density of receptor capacity for the initial “A”.  A welder who makes a big leap forward in marine welding will see her ideas spread more quickly if she is in Saint John or Esquimalt than if she is in Regina.  To borrow Matt Ridley’s metaphor of innovation being about “ideas having sex”, ideas will multiply more if they have more potential mates.

This is how tech clusters work: they create denser mediums through which idea-waves can pass; hence, they speed up the propagation of new ideas, and hence, under the right circumstances, they speed up the propagation of new products as well.

This has major consequences for innovation policy and the funding of research in universities.  I’ll explain that tomorrow.

December 04

Defending Liberal Arts: Try Using Data

A few weeks back, I wrote about the Liberal Arts/humanities, and some really bad arguments both for and against them.  As usual when I write these, I got a lot of feedback to the effect of: “well, how would you defend the Liberal Arts, smart guy”?  Which, you know, fair enough.  So, here’s my answer.

The humanities, at root, are about pattern recognition in the same way that the sciences and the social sciences are: they just seek patterns in different areas of human affairs – in music, in literature, and in the narrative of history.  And though the humanities cannot test hypotheses about patterns using the same kinds of experimental methods as elsewhere, they can nevertheless promote greater understanding through synthesis.  Or, to paraphrase William Cronon’s famous essay, the humanities are about making connections, only connections.  In a networked world, that’s a valuable skill.

None of this, to me, is in doubt.  What is in doubt is whether this promise made by the humanities and Liberal Arts is actually delivered upon.  Other disciplines synthesize and make connections, too.  They promote critical thinking (the idea that other disciplines, disciplines founded on the scientific method, don’t promote critical thinking is the most arrogant and stupid canard promoted by people in the humanities).  What the humanities desperately need is some proof that what they claim is true is, in fact, true.  They need some data.

In this context, it’s worth taking a look at the Wabash National Study of Liberal Arts Education.  This was an elaborate, longitudinal, multi-institutional study to look at how students in liberal arts programs develop over time.  Students took a battery of tests – on critical reasoning, intercultural effectiveness, moral character, leadership, etc. – at various points in their academic career to see the effects of Liberal Arts teaching, holding constant the effects of things like gender, age, race, prior GPA, etc.  You can read about the results here – and do read them, because it is an interesting study.

At one level, the results are pretty much what we always thought: students do better if they are in classes where the teaching is clear and well-organized, and they learn more where they are challenged to do things, like applying theories to practical problems in new contexts, or integrating ideas from different courses in a project, or engaging in reflective learning.  And as can be seen here in the summary of results, the biggest positive effects of liberal arts education are on moral reasoning, critical thinking, and leadership skills (academic motivation, unfortunately, actually seems to go down over time).

So: mostly good for Liberal Arts/humanities, right?  Not quite.  Let me quote the most interesting bit: the research found that “even with controls for student pre-college characteristics and academic major, students attending liberal arts colleges (as compared to their peers at research universities and regional institutions) reported significantly higher levels of clarity and organization in the instruction they received, as well as a significantly higher frequency of experiences on all three of the deep-learning scales.”  In other words, the effects of Liberal Arts on students in Liberal Arts colleges are significantly greater than the effects on students studying similar programs in other, larger institutions.  That is to say, it’s the teaching environment and teaching practices, not the subject matter itself, which seems to make more of a difference.

Now, this does not suggest that the Liberal Arts/humanities can’t deliver those kinds of benefits at larger universities; it’s just to say that for them to deliver those benefits, the focus needs to be on providing the subject matter using quite specific teaching practices and – not to beat around the bush – keeping class sizes down (which may in turn have implications for teaching loads and research activity, but that’s another story).

There are some good stories for the Liberal Arts in the Wabash data, and some not so good stories.  But the point is, there is data.  There are some actual facts and insights that can be used to improve programs, to make them better at producing well-rounded critical thinkers.  And at the end of the day, the inquiry itself is what’s important.  The humanities’ biggest problem isn’t that they have nothing to sell; it’s that too frequently they act like they have nothing to learn.  If more institutions adopted Wabash-like approaches, and acted upon them, my guess is the Liberal Arts would get a lot more respect than they currently do.

December 03

Every University and College Needs a Fool

OK, yes, lots of ways to complete that sentence (e.g. “Every university and college needs a fool… and mine already has several”, etc.).  But I mean this in a very literal sense.  Institutions need the equivalent of Medieval Fools, or Court Jesters, to help them combat bad institutional culture.

In addition to being a barrel of laughs, Fools had a specific function in medieval and early Renaissance courts; namely, they were able to speak truth to power, albeit obliquely (think Robin Williams rather than Jon Stewart). Because they were dressed as figures of fun, they had some license to tweak the noses of the powerful, since their words could be shrugged off as the ravings of a simpleton.  Yet, frequently, those ravings were useful because they presented truths that could not otherwise be said aloud.  Those Fools were no fools; as Shakespeare said, playing the Fool took considerable wisdom.

Now, I’m not actually suggesting that universities and colleges need to dress someone up in an ass’ costume and run around making fun of people in an academic council meeting (inspiring a thought as that may be).  Nor am I suggesting that there needs to be someone who is specifically charged with poking fun at executive power at a university – most institutions already have enough self-appointed critics filling that job.

No, what I have in mind is something different: someone who has license to speak truth across the institution.  Not constantly, as a gadfly role (that would just get annoying).  But occasionally, maybe once every year, it would be useful for a Fool to give each institution a once-over.  And where I think this could be most useful is not on issues of specific policy – again, each institution has lots of self-appointed critics of management to do that – but rather on issues of institutional culture.

As a friend was observing to me yesterday, bad institutional culture never looks bad from the inside.  There are always good reasons for this little bit of secrecy, or for that flippant refusal to make data public; there’s always a good reason for sanctioning financial or business entanglements which are at best borderline, or for not making tough decisions, thus allowing problems to fester.

No one sets out to be part of a bad institutional culture.  Bad cultures are created gradually, inch-by-inch, so slowly that no one on the inside notices.  The function of a university/college Fool would be to come in from the outside and say, maybe once a year, forcefully and publicly: What the heck are you people doing?  How did you all get this inappropriately cozy with industry?  How did your principles of governance get so undermined that the faculty union thought it appropriate to grieve Senate decisions? (Don’t scoff – this has happened.) Why are you even thinking about evicting a student union from its building? 

Everybody wants to be part of a good academic culture.  Fools might be able to play a role in keeping everyone on the straight and narrow.  It’s got to be at least as good an idea as having organizational behavior consultants crawling all over the place.
