HESA

Higher Education Strategy Associates

Tag Archives: Curriculum

June 14

Two Approaches to Student Success

I’ve been doing a little bit of work recently on student success, and I am struck by the fact that there are two very different approaches to it, depending on which side of the Atlantic you are sitting on.  I’m not sure one is actually better than the other, but they speak to some very different conceptions of where student success happens within an institution.

(To be clear, when I say “student success” I mostly mean “degree/program completion”.  I recognize that there are evolving definitions that mean something more, or different.  Some extend the notion beyond completion to career success – or at least success in launching one’s career; others suggest completion is overrated as a metric, since some students only attend to obtain specific skills and never intend to complete – and if these students drop out in order to take a job, that’s hardly a failure.  I don’t mean to challenge either of these points, but I’m making a point about the more traditional definition of the term.)

What I would call the dominant North American way of thinking about student success is that it is an institutional and, to a lesser extent, a faculty matter rather than something dealt with at the level of the department or the program.  We throw resources from central administration (usually, from Institutional Research) at identifying “at-risk” students.  We use central resources to bolster students’ mental health, and hire counsellors, tutors and academic support staff centrally as well.  Academic advisors tend to be employed by faculties rather than the central admin, but the general point still stands – these are all things that are done on top of, and more or less without reference to, the actual academic curriculum.

The poster child for this kind of approach is Georgia State University (see articles here and here).  It’s an urban university with very significant minority enrolments, one that at the turn of the century had a completion rate of under 30%.  It invested heavily in data analytics and – more importantly – in academic tutors and advisors (I’ve heard but can’t verify that its ratio of students to advisors is 300:1 or less, which is pretty much unimaginable at a Canadian university).  Basically, they throw bodies at the problem.  Horrible, dreaded, non-academic staff bloat bodies.  And it works: their retention rates are now up over 50 percent, and their improvement among minority students has been a whopping 32 percentage points.

But what they don’t seem to do is alter the curriculum much.  It’s a very North American thing, this.  The institution is fine; it’s the students who have to make adjustments, and we have an army of counsellors to help them do so.

Now, take a gander at a fascinating little report from the UK called What Works: Student retention and success change programme phase 2.  In this project, a few dozen individual retention projects were put together across 13 participating institutions, piloted and evaluated.  The projects differed from place to place, but they were built on a common set of principles, the first and most important one being as follows: “interventions and approaches to improve student retention and success should, as far as possible, be embedded into mainstream academic provision”.

So what got piloted were mostly projects that involved some adjustment to curriculum, either in terms of the on-boarding process (e.g. “Building engagement and belonging through pre-entry webinars, student profiling and interactive induction”) or the manner in which assessments are done (e.g., “Inclusive assessment approaches: giving students control in assignment unpacking”) or simply re-doing the curriculum as a whole (e.g. “Active learning elements in a common first-year engineering curriculum”).

That is to say, in this UK program, student success was not treated as an institutional priority dealt with by non-academic staff.  It was treated as a departmental-level priority, dealt with by academic staff.

I would say at most North American universities this approach is literally unimaginable.  Academic staff are not “front-line workers” who deal with issues like academic preparedness; in fact, often professors who do try to work with a student and refer them to central academic or counselling services will discover they cannot follow up an individual case with central services because the latter see it as a matter of “client confidentiality”.  And outside of professional faculties, our profs teach individual courses of their own choosing rather than jointly manage and deliver a set curriculum which can be tweaked.  Making a curriculum more student-friendly assumes there is a curriculum to alter, rather than simply a basket of courses.

Part of this is a function of how university is conceptualized.  In North America, we tend to think that students choose an institution first and a program of study later (based on HESA’s research on student decisions, I think this is decreasingly the case, but that’s another story).  So, when we read all the Vincent Tinto-related research (Tinto being the guru of student retention studies, most of which is warmed-over Durkheim) about “belonging”, “fit” and so on, we assume that what students are dropping out of is the institution, not the program, and assign responsibilities accordingly.  But in Europe, where 3-year degrees are the norm and they don’t mess around with things like breadth requirements, the assumption is that you’re primarily joining a program of study, not an institution.  And so when Europeans read Tinto, they assume the relevant unit is the department or program, not the institution.

But also I think the Europeans – those interested in widening access and participation, anyway – are much less likely to think of the problem as being one of how to get students to adapt to university and its structures.  Quite often, they reverse the problem and say “how can the institution adapt itself to its students”?

It’s worth pondering, maybe, whether we shouldn’t ask that question more often, ourselves.  I think particularly when it comes to Indigenous students, we might be better served with a more European approach.

 

April 11

Populists and Universities, Round Two

There is a lot of talk these days about populists and universities.  There are all kinds of thinkpieces about “universities and Trump”, “universities and Brexit”, etc.  Just the other day, Sir Peter Scott delivered a lecture on “Populism and the Academy” at OISE, saying that over the past twelve months it has sometimes felt like universities were “on the wrong side of history”.

Speaking of history, one of the things I find a bit odd about this whole discussion is how little it is informed by the last time this happened – namely, the populist wave of the 1890s in the United States.  Though the populists never took power nationally, they did capture statehouses in many southern and western states, many of which had relatively recently taken advantage of the Morrill Act to establish important state universities.  And so we do have at least some historical record to work from – one that was very ably summarized by Scott Gelber in his book The University and the People.

The turn-of-the-20th-century populists wanted three things from universities. First, they wanted them to be accessible to farmers’ children – by which they meant both laxer admissions standards and “cheap”.  That didn’t necessarily mean they wanted to increase expenditures on university budgets substantially (though in practice universities did OK under populist governors and legislators); what it meant was they wanted tuition to remain low and if that entailed universities having to tighten their belts, so be it.  And the legacy of the populists lives on today: average state tuition in the US still has a remarkable correlation to William Jennings Bryan’s share of the vote in the 1896 Presidential election.

 

Fig 1: 2014-15 In-State Tuition Versus William Jennings Bryan’s Vote Share in 1896


 

The second thing populists wanted was more “practical” education.  They were not into learning for the sake of learning, they were into learning for the sake of material progress and making life easier for workers and farmers; in many ways, one could argue that their attitude about the purpose of higher education was pretty close to that of Deng/Jiang-era China.  And to some extent they were pushing on an open door because the land-grant universities – particularly the A&Ms – were already supposed to have that mandate.

But there was a tension in the populists’ views on curriculum.  They weren’t crazy about law and humanities programs at state universities (too much useless high culture that divided the masses from the classes), but they did grasp that an awful lot of people who were successful in politics had gone through law and humanities programs and – so to speak – learned the tricks of the trade there (recall that rhetoric was one of the seven Liberal arts which still played a role in 19th century curricula).  And so, there was also concern that if public higher education were made too vocational, its beneficiaries would still be at a disadvantage politically.  There were various solutions to this problem, not all of which were to the benefit of humanities subjects, but the key point was this: universities should remain places where leaders are made.  If that meant reading some Marcus Aurelius, so be it: universities were a ladder into the ruling class, and the populists wanted to make sure their kids were on it.

And here, I think is where times have really changed. The new populists are, in a sense, more Gramscian than their predecessors.  They get that universities are ladders to power for individuals, but they also understand that the cultural function of universities goes well beyond that.  Universities are – perhaps even more so than the entertainment industry – arbiters of acceptable political discourse.  They are where the hegemonic culture is made.  And however much they may want their own kids to get a good education, today’s populists really want to smash those sources of cultural hegemony.

This is, obviously, not good for universities.  We can – as Peter Scott suggested – spend more time trying to make universities “relevant” to the communities that surround them.  Nothing wrong with that.  We can keep plugging away at access: that’s a given no matter who is in power.  But on the core issue of the culture of universities, there is no compromise.  Truth and open debate matter.  A commitment to the scientific method and free inquiry matter.  Sure, universities can exist without these things: see China, or Saudi Arabia.  But not here.  That’s what makes our universities different and, frankly, better.

No compromise, no pasarán.

May 26

Taking Advantage of Course Duplication

I recently came across an interesting blog post from a professor in the UK named Thomas Leeper (see here), talking about the way in which professors the world over spend so much time duplicating each other’s work in terms of developing curricula.  Some key excerpts:

” …the creation of new syllabi is something that appears to have been repeated for decades, if not centuries. And yet, it all seems rather laborious in light of the relatively modest variation in the final courses that each of us creates on our own, working in parallel.”

“… In the digital age, it is incredibly transparent that the particular course offerings at every department are nearly the same. The variation comes in the quality of lectures and discussion sections, the set of assignments required of students, and the difficulty of the grading.”

“We expend our efforts designing strikingly similar reading lists and spend much less time on the factors that actually differentiate courses across institutions: lecture quality, learning activities, and feedback provision… we should save our efforts on syllabus construction and spend that time and energy elsewhere in our teaching.”

Well, quite.  But I think you can push this argument a step further.  I’ve heard (don’t quote me because I can’t remember exactly where) that if you group together similar courses across institutions (e.g. Accounting 100, American History survey courses, etc.), then something like 25% of all credits awarded in the United States are accumulated in just 100 “courses”.  I expect numbers would not be entirely dissimilar in other Anglophone countries.  And though this phenomenon probably functions on some kind of power law – the next 100 courses probably wouldn’t make up 10% of all credits – my guess is your top 1000 courses would account for 50-60% or more of all credits.

Now imagine all Canadian universities decided to get together and make a really top-notch set of e-learning complements to each of these 1000 courses – the kinds of resources that go into a top-notch MOOC (like, for instance, the University of Alberta’s Dino 101) – in order to improve the quality of each of these classes.  Not that the classes would be taught via MOOC – teaching would remain the purview of individual professors – but each would have excellent dedicated online resources associated with it.  Let’s say they collectively put $500,000 into each of them, over the course of four years.  That would be $500M in total, or $125M per year.  Obviously, those aren’t investments any single institution could contemplate, but if we consider the investment from the perspective of the entire sector (which spends roughly $35 billion per year), this is chump change.  $120 per student.  A trifle.
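The arithmetic here is simple enough to check.  A quick sketch, using only the figures given above (the per-student figure in the text implicitly assumes roughly a million students, which is not stated explicitly):

```python
# Back-of-envelope check of the figures above; all dollar amounts come from the text.
courses = 1000                       # top courses to build resources for
cost_per_course = 500_000            # dollars invested per course over four years
years = 4

total_cost = courses * cost_per_course   # $500M in total
annual_cost = total_cost / years         # $125M per year

sector_spending = 35_000_000_000         # rough annual sector spending, per the text
share = annual_cost / sector_spending    # fraction of annual sector spending

print(f"Total: ${total_cost:,}")                 # Total: $500,000,000
print(f"Per year: ${annual_cost:,.0f}")          # Per year: $125,000,000
print(f"Share of sector spending: {share:.2%}")  # Share of sector spending: 0.36%
```

At about a third of one percent of annual sector spending, “chump change” seems fair.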

So, a challenge to university Presidents and Provosts: why not do this?  We’re talking here about a quantum jump in the learning resources available for half the credits undergraduates earn each semester.  Why not collectively invest money to improve the quality of the learning environment?  Why not free up professors’ time so they can focus more on lecture quality and feedback provision?  And to provinces and CMEC: why not create incentives for institutions to do precisely this?

A debate should be had.  Let’s have it.

April 07

Innovation to Watch at the University of Sydney

Australian universities seem to do “Big Change” a lot better than universities elsewhere.  A few years ago, the University of Melbourne radically overhauled its entire curriculum in the space of about two years partly to create a more North American-like distinction between undergraduate and professional degrees and partly to reduce degree clutter by winnowing the number of different degrees from over a hundred to just six.  (For a refresher, I wrote about this back here).

If you read press reports about the University of Sydney’s new strategic plan (read the full document here; it’s completely worth it), you might think Sydney is just aping Melbourne: it’s culling degrees from 120 to 20, mostly by wiping out five-year “double degrees”, and also reducing the number of faculties from 16 to 6.

But the reduction in the number of degrees is actually a much less interesting story than what Sydney plans to do in terms of its curriculum.  From 2018, every program is to have two courses in third year: one to integrate and apply disciplinary skills, and another to apply disciplinary knowledge and skills in context.  Every degree will culminate in a final-year project or practicum.  Every program will have cultural competency embedded within it, and support for international studies will rise so that (hopefully) the proportion of students with an international experience will rise from 19% to 50%.  A strong framework to support career transitions, involving both curricular and co-curricular efforts, will also be set up.

Here’s the most interesting bit: an entirely new “open learning environment” will be created within the university, which will provide short, on-demand courses in areas such as entrepreneurship, ethics, project management, and leadership (you know, all the employability-related skills universities usually claim students pick up by osmosis).  Some of these courses will be online, while some will be blended online/workshop; some will be non-credit and some will be small-credit.

Did I mention they are going to develop a university-wide approach to measuring how desired graduate qualities such as disciplinary depth, interdisciplinary effectiveness, communication ability and cultural competence have been attained?  Yes, really.

What makes this kind of change deeply impressive – and potentially highly significant – is that it is not coming from an ambitious second-tier institution trying to catch attention by doing something new.  This is the country’s oldest university.  This is a big, old, prestigious institution taking big, serious steps to actually change the undergraduate degree structure in order to provide students with better skills without sacrificing academic rigour.  It’s a research university that cares enough about undergraduate learning outcomes that it will measure them in some way beyond graduation rates and immediate employment rates.

This is cutting edge stuff.  It may even be a world first.  We should all hope it is not the last; this kind of approach needs to spread quickly.

April 06

Fuzzy Skills

About a month ago, Universities Canada held a meeting to talk up the Liberal Arts.  I wasn’t there, and can only go by what I saw on Twitter and what I can glean from this University Affairs article, which you can read here.  But if the conversation was actually anything like what the sub-head suggests it was (we need better stories!), I’m not impressed.

At one level, “we need better stories” is always true.  Good communication is always worthwhile.  But if you claim that’s all you need, then basically you’re saying that actual changes in practices are not necessary.  We here in academia are fine; it’s you ignorant lot out there who are the problem – and once we tell better stories, you will see the light.  It’s arrogant, frankly.  More introspection about needed pedagogical changes and less “we need better stories”, please (I note that Mount Allison’s Robert Campbell at least took that tack – good on him).

Moreover, if you look at the “good” stories that Arts faculties want to tell, you’ll find they’re pretty much all about how various social scientists have changed public policy.  Very little is about the humanities (a result, perhaps, of the usual Canadian confusion about the distinction between “Arts”, “Liberal Arts” and “Humanities”).  At best, you get some vague words about how the humanities promote “soft skills”, which frankly isn’t very helpful.  Partly that’s because “soft skills” as a term is somewhat gendered (and thus likely to turn off males), and partly because there’s very little evidence that humanities education does much to foster that cluster of personality traits, social graces, and all the other stuff that clusters around “emotional intelligence”.  It’s possible – maybe even likely – that humanities graduates possess these skills, but that may simply be a question of who chooses to enter these fields rather than what skills get developed by the disciplines.

Yet I think there is a simple and unambiguous way to sell the humanities: they are not about soft skills; they are about “fuzzy skills”.  They are about ambiguity.  They are about pattern recognition.  They are about developing and testing hypotheses in areas of human affairs where evidence is always partial and never clear-cut.  Humanities graduates are not about following rules; they are about interpreting rules when the context changes.

And you know what?  Doing that kind of interpretation well is *hard*.  The worst mistake the humanities have ever made is accepting the public impression that not being an “exact” science means they are “easy”.  They are not.  Good work in the humanities is hard precisely because there are many possible answers to a question.  The difficulty lies in sifting the more plausible from the less plausible (unless, of course, you dive completely into the post-modernist “I’m OK you’re OK” intellectual rathole where every answer is equally correct; then the humanities are just nonsense).

Think about the world of espionage and intelligence: this is extraordinarily difficult work precisely because we never have enough information and empathy to know exactly what a target is thinking or might be doing.  But it is precisely the synthesis of information from across a wide range of disciplines, and the close reading of texts – what we used to call philology – that allows us to make competent guesses.  Quantitative data analysis is useful in this (and lord knows we probably shouldn’t let humanities students graduate without some understanding of statistics and probability); but so too are the basic “fuzzy skills” taught in humanities programs.  When business talks about “critical thinking” skills, it is precisely this kind of analysis and decision-making, writ small, that it is talking about.

I think that’s a pretty good story for the humanities.  The problem is that for these good stories to work, humanities faculties have to live up to them.  Simply telling a good story isn’t enough. Curricula (and more importantly assessment) need to be re-designed in order to show how these fuzzy skills are actually being taught and absorbed.  No more assuming students get these non-disciplinary skills by osmosis because “everybody knows” that’s what humanities do.  Design for fuzzy skills.  Incorporate them.  Measure them.

And then you’ll have both a good story and a good reality.  That would be real and welcome progress.

June 09

STEM and STEAM: The “Two Cultures” and Academic Incentives

About a month ago, I wrote about whether institutions would adjust their program mix if it would help improve economic growth.  Nearly everyone that wrote me implicitly assumed that the “right” mix for economic growth implied a switch to a more STEM-heavy system, before going on to say something like “but what about the humanities?”  I found this kind of amusing, because I actually don’t automatically assume that STEM (Science, Technology, Engineering, and Mathematics) degrees are where it’s at in terms of growth, and there are a couple of quite high-powered papers out there that support this view.

The first, Revisiting the STEM Workforce, comes from the National Science Board in the US.  This publication makes a couple of sensible points, the most important being that STEM skills and STEM degrees are not the same thing.  Lots of STEM graduates end up in non-STEM employment; conversely, many STEM-field jobs are held by people who are not themselves STEM graduates (Steve Jobs, famously, went to Reed College and was self-taught as far as computers went).  Basically, the link between higher education credentials and labour market skills is nowhere near as tight as people tend to assume.

The second new STEM report, from the Canadian Council of Academies, makes an even more important point: namely, that STEM skills are a necessary condition for innovation, but not a sufficient one.  The panel that wrote the report (led by former Bank of Canada Governor David Dodge) did not go quite as far as Don Tapscott did in his plea to replace a focus on STEM degrees with a focus on STEAM degrees (i.e. STEM + Arts).  They did, however, point to a number of other types of skills, such as communication, teamwork, leadership, creativity, and adaptability, which they felt were at least as important as narrow STEM skills.  The panel also made the point that the best way to meet future human resource challenges is to focus more broadly on skill acquisition from pre-primary to higher education, across a range of subjects – because, frankly, you never know what kind of labour market you’re going to need.

Both reports say we need to get over our obsession with STEM, a conclusion that typically brings cheers from the humanities’ defenders.  But be careful here: even if you buy the “more STEAM” conclusion, it says nothing about the number of Arts degrees that should be produced.  Companies are not dying to hire more Arts grads so they can add that little something of creativity and communication to existing teams of STEM workers.  What they are looking for are individuals who can integrate all of those skills.  It’s a call for more crossover degrees involving both Arts and STEM.  It’s a call to get beyond C.P. Snow’s Two Cultures.

The real problem is that universities genuinely do not know how to deliver programs like this.  Fundamentally, they are designed to focus on degrees rather than skills. Sure, programs can cross departmental lines; however, programs that cross faculty lines are the red-headed step-children of higher education.  As a result, “real” programs – read: prestigious programs – more or less follow disciplinary lines.  Within universities, faculties count success by how many students are “theirs”, but cross-faculty programs exist in a kind of no-man’s-land: they simultaneously belong to everyone and no one.  With no incentives, there’s simply no pressure from below – that is, from faculty – to embark on the arduous journey of creating a curriculum, and working it through the academic approval process.  In other words, STEAM only works for Arts at a resource level (and hence a political level) if it means more Arts degrees; if not, then forget it.

It would all be so much easier if institutions were built around what we wanted students to learn; instead, they are organized by academic disciplines that are necessary guardians of research quality, but in many respects actively hinder the development of balanced graduates who can succeed in work and society.  Finding ways to mitigate this problem is one of the most important questions facing higher education, but we can’t seem to talk about it openly.  That’s a problem that needs solving.

March 11

The End of College? (Part 1)

Over the next couple of days, I want to talk a bit about a new book called The End of College, written by the New America Foundation’s Kevin Carey.  It’s an important book not just because it’s been excerpted repeatedly in some major publications, or because the conclusions are correct (in my view: they’re not), but because it has an unerringly precise diagnosis of how higher education came to its present malaise, and the nature of the economic and institutional reasons that impede change in higher education.

Carey’s narrative starts by tracing the origins of universities’ current problems back to the 19th century, when America had three competing types of universities.  First were the small liberal arts colleges devoted either to Cardinal Newman’s ideals, or to training clergy, or both; second were the Land Grant institutions, created by the Morrill Act and devoted to the “practical arts”; and third was a group that wanted to emulate German universities and become what we now call “research universities”.  Faced with three different types of institutions from which to choose, America chose not to choose at all – in effect, it asked universities to embody all three ideals at once.

On top of that, American universities made another fateful decision, which was to adopt what is known as the Elective model (I prefer the term “Smorgasbord model”, and wrote about it back here).  Starting at Harvard under President Charles Eliot, this move did away with programs consisting of a standardized set of courses in a standard curriculum, and replaced them with professors teaching more or less what they felt like, and students getting to choose the courses they liked.  This mix of specialization and scholarly freedom was one of the things that allowed institutions to accommodate both liberal and practical arts within the same faculties.  In Carey’s words: “the American university emerged as an institution that was designed like a research university, charged with practical training and immersed in the spirit of liberal education”.

The problem is that this hybrid university simply didn’t work very well as far as teaching was concerned.  The research end of the university began demanding PhDs – research degrees – as minimum criteria for hiring.  So hiring came to center on research expertise even though this was no guarantee of either teaching quality or ability in practical arts. And over time, universities largely abandoned responsibility for teaching to those people who were experts in research but amateurs at teaching.  No one checked up on teaching effectiveness or learning outcomes.  Degrees came to be a function of time spent in seats rather than actual measures of competence, proficiency, or mastery of a subject.

Because no one could check up on actual outputs or outcomes – not only are our research-crazy institutions remarkably incurious about applying their talents to the actual process of learning, they actively resist outsiders’ attempts to measure, too (see: AHELO) – competition between universities was fought solely on prestige.  Older universities had a head start on prestige; unless lavishly funded by the public (as the University of California was, for a time), the only way to compete with age was with money – often students’ money.  Hence, George Washington University, New York University, the University of Southern California, and (to a lesser extent) Washington University in St. Louis all rose in the rankings by charging students exorbitant fees and ploughing that money into the areas that bring prestige: research, ivy, nicer quads, etc.  (Similarly, Canadian institutions devoted an unholy percentage of all the extra billions they got in tuition and government grants since the late 90s to becoming more research-intensive; in Australia, Go8 universities are shameless in saying that the proceeds of deregulated tuition are going to be ploughed into research.)  The idea that all those student dollars might actually be used to – you know – improve instruction rarely gets much of a look-in.

Maybe if we were cruising along at full employment, no one would care much about all this.  But the last six years have seen slow growth and (in the US at least) unprecedented declines in disposable middle-class incomes, as well as in graduates’ post-school incomes.  So now you’ve got a system that is increasingly expensive (again, more so in the US than in Canada), that doesn’t attempt to set outcome standards or impose standards on its professors, and that doesn’t do much in terms of working out “what works”.

Carey – rightly, I think – sees this as unsustainable: something has to give.  The question is, what? Tomorrow, I’ll discuss Carey’s views on the subject, and on Friday I’ll provide some thoughts of my own.

December 10

The History of the Smorgasbord

One of the things that clouds mutual understanding of higher education systems across the Atlantic is the nature of the Arts curriculum – in particular, the degree to which they actually have one in Europe, and we don’t over here.

When students enroll in a higher education program in Europe, they have a pretty good idea of the classes they’ll be taking for the next three years.  Electives are rare; when you enter a program, the required classes are in large part already laid out.  Departments simply don’t think very much in terms of individual courses – they think in terms of whole programs, and share the teaching duties required to get students through the necessary sequence of courses.

If you really want to confuse a European-trained prof just starting her/his career in Canada, ask: “what courses do you want to teach?”  This is bewildering to them, as they assume there is a set curriculum, and that they’re there to teach part of it.  As often as not, they will answer: “shouldn’t you be telling me what courses to teach?”  But over here, the right to design your own courses, and to have absolute sovereignty over what happens within those courses, is the very definition of academic freedom.

And it’s not just professors who have freedom.  Students do too, in that they can choose their courses to an extent absolutely unknown in Europe. Basically, we have a smorgasbord approach to Arts and Sciences (more the former than the latter) – take a bunch of courses that add up to X credits in this area, and we’ll hand you a degree.  This has huge advantages in that it makes programs flexible and infinitely customizable.  It has a disadvantage in that it’s costly and sacrifices an awful lot of – what most people would call – curricular consistency.

So why do we do this?  Because of Harvard.  Go back to the 1870s, when German universities were the envy of the world.  The top American schools were trying to figure out what was so great about them – and one of the things they found really useful was this idea called “academic freedom”.  But at Harvard, they thought they would go one better: they wouldn’t just give it to profs, they’d give it to students, too. This was the birth of the elective system.  And because Harvard did it, it had to be right, so eventually everyone else did it too.

There was a brief attempt at some of the big eastern colleges to put a more standard curriculum in place after World War II, so as to train their budding elites for the global leadership roles they were expected to assume.  It was meant to be a kind of Great Books/Western Civ curriculum, but profs basically circumvented these attempts by arguing for what amounted to a system of credit “baskets”.  Where the university wanted a single course on “drama and film in modern communication” (say), profs argued for giving students a choice among four or five courses on roughly that theme.  Thus, the institution could require students to take a drama/film credit, but the profs could continue to teach specialist courses on Norwegian Noir rather than suffer the indignity of having to teach a survey course (not that they made their case this way – “student choice” was the rallying cry, natch).

Canadian universities absorbed almost none of this before WWII – until then, our universities were much closer to the European model.  But afterwards, with the need to get our students into American graduate schools, and so many American professors being hired thereafter (where else could we find so many qualified people to teach our burgeoning undergrad population?), Canadian universities gradually fell into line.  By the 1970s, our two systems had coalesced into their present form.

And that, friends, is how Arts faculties got their smorgasbords and, to a large extent, jettisoned a coherent curriculum.

November 24

The Arts Problem(s)

There’s no polite way to say this: Canadian universities have an Arts problem.

At the heart of institutions’ looming fiscal problems is their inability to convince major customer groups (government, students) to pay the desired price for the product they’re offering.  The reason for this, mainly, is the perception that the product on offer is not value-for-money.  Part of this is due to our ludicrously opaque student aid systems, which lead students and families and politicians into thinking that net tuition is a heck of a lot higher than it actually is (see here for more on that, or here for the full report).  But part of it also has to do with the fact that people are under the impression that returns on education ain’t what they used to be.

That’s not entirely fair, of course.   The recession is responsible for most of the downturn in graduate jobs, not some sudden change in what the market “wants” in terms of skills.  And it’s not even true that returns are falling for all fields of study: some have held up relatively well in recent years.  But it is a problem in Arts.  Look what data from the annual survey of Ontario Graduates says: though employment rates remain high, the actual monetary returns are very bad at the moment – down roughly 20% in real terms over the past few years.

Figure 1: Average Income (in $2013) Two Years After Graduation, Ontario Graduating Classes from 2003-2011, Selected Disciplines

Not surprisingly, students are voting with their feet.  Look at the pattern of applications by program in Ontario: after a series of small declines in Arts, last year saw a decline of 10%.

Figure 2: Share of Total Applications to Ontario Universities, by Selected Fields of Study, 2003-14

The point here is that, increasingly, the perception of Arts is that they aren’t very useful.  And yes, it’s annoying that people want to reduce education to considerations of short-term employment, but it is what it is.  When we ask people to pay so much (either privately or via tax dollars), people expect results, and they aren’t seeing them.

So something has to change in the Arts; not just for their own sake, but for the sake of all of higher education, which is being tarred with the same brush.  And that something is a greater focus on employability.

Now, even saying something like that causes paroxysms among some: “I’m not going to create cannon-fodder for the knowledge economy”, etc. etc.  But as I’ve said before, it shouldn’t be beyond the wit of talented academics to devise a curriculum that meets the traditional aims of a liberal arts degree while placing more emphasis on employability skills (what are the abilities to critically appraise arguments, to appreciate complex chains of causation, and to write clearly and effectively, if not employability skills?).  Indeed, I’ve even suggested there are some good models available from fields like medicine to do exactly this.

But if fixing the Arts were as simple as that, it probably would have happened already.  The biggest problem with Arts isn’t that the curriculum is difficult to alter; it’s that, to a large extent, the curriculum simply doesn’t exist.  For decades, Arts faculties in North America have been heading inexorably towards a “buffet”: take a few courses from column A, a few from column B, and we’ll call it a degree as long as the credit hours line up.  Or, more bluntly, there is no curriculum, there’s just a bunch of courses.  This is completely unlike Arts faculties in the rest of the world, where course choice is more limited and degrees are much more structured.

So here’s the real issue: the preliminary work required to improve curriculum – that is, getting folks to realize there’s a curriculum in the first place – is pretty massive.  And this is why, even though Arts needs to change quickly to stem declining enrolments, change is unlikely to actually occur quickly.

In the best of all worlds, this is a task people should have started working on years ago.  But as they say, the second-best time to start anything is now.  We should roll up our sleeves and get cracking.

October 10

A Miracle in Melbourne

Today, I want to tell you about one of the most amazing stories in recent higher education history.  It happened at the University of Melbourne about eight years ago, and it involved having the country’s leading university completely up-end its entire curriculum – every single degree program – in the space of about 24 months.  Ladies and Gentlemen, I give you: the Melbourne Model.

The basic story is this: a decade ago, Melbourne – like all Australian universities – had a three-year undergraduate degree, with law and medicine being direct-entry first-degree programs (a bit like how Quebec allows direct entry to these programs for select CEGEP grads), and with a fourth year acting as an honours year for those wishing to pursue graduate (mainly doctoral) studies.  Then in 2005, a new President (Glyn Davis) came to office, vowing to make Melbourne a more research-intensive kind of place.  In the first draft of a widely-circulated strategic plan, Davis suggested it might be time to “examine the possibility” of moving to what he called a “US graduate-school model”, with a much more generalist three-year undergraduate program followed by graduate/professional studies (it was referred to internally as a 3+2+3 system, which implied a much larger role for Master’s programs).  The proposal was seen as useful both because it might increase research-intensiveness and because a major re-design might force the Melbourne community to think harder about graduate outcomes and what it actually meant to be a “Melbourne Graduate”.

The professional model was by no means the centrepiece of the strategic plan, but it generated curiously little comment, and eventually ended up in the final version in February 2006 without having been subject to much debate.  Having got that far, Davis and his team went for broke: all faculties were told to re-design their curricula in time for implementation in January 2008.

It was at this point, of course, that people freaked.  Much of the Arts faculty thought it was going to be sold down the river – until then, many of its students took joint courses with professional programs (e.g. law/history), and many reckoned that without the professional link they’d be sunk.  It took a while to sink in that, with law now inaccessible to direct entrants (a fact that enraged many parents), more students could spend three years studying something purely for interest.  History – and most of the rest of the Faculty – in fact did just fine.

One of the most interesting decisions was to limit the number of Bachelor’s degrees on offer to just six – Arts, Science, Bioscience, Environments, Music, and Commerce – and to some extent de-link the degrees from the faculties in which professors resided (there were 12 faculties).  These degrees were also designed to have common elements regarding program depth and what they called “knowledge transfer” (what we would probably call experiential learning).  They didn’t achieve this goal perfectly, but then, when you’re trying to re-vamp every single degree in a university of 40,000 students in the space of under 18 months, you can tolerate the odd imperfection.

There still remained the trick of selling the idea to government and students.  The former was important for financial reasons: Australia doesn’t fund non-research graduate degrees, so the switch to a “professional” model theoretically put money for all those students at risk – but since allowing the switch didn’t cost the government anything (it would spend what it had always spent on those students), it was a relatively quick sell.  A more serious issue was convincing students that this was a good idea.  After all, students bent on law or medicine would now have to go through three years of undergraduate study first, while other institutions could still offer those programs straight out of high school.  Partly through effective marketing, and partly because of the institution’s own brand power (Melbourne is essentially Australia’s U of T), the feared flight of applicants never materialized.  Applications from top students held up, and in some fields the institution was able to become even more selective.

Try, if you will, to imagine a Canadian institution that could re-jig all of its curricula from top to bottom in less than 24 months, not because a government told them to, but simply because it seemed like a good way to make the university a better place.  I can’t, but I wish I could.  What Melbourne achieved here is proof positive that universities can change, and at speed, if they wish to do so.  And that’s news everyone needs to hear.
