HESA

Higher Education Strategy Associates

Tag Archives: Credentials

May 10

Why Education in IT Fields is Different

A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth.  (It’s brilliant.  Read it).  It’s an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry.  And the answer, it turns out, is that despite all the investment in capital which permitted vast jumps in labour productivity, in fact wages didn’t rise that much at all.  Like, for about fifty years.

Sound familiar?

What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution.  And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills.  It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers.  Human capital was highly firm-specific.

The upshot of all this: In fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through “learning by doing”.  There’s simply no way to learn the skills in advance.  That meant that workers had lower levels of bargaining power, because they couldn’t necessarily use the skills acquired at one job at another.  It also meant, not to put too fine a point on it, that formal education became much less important compared to “learning by doing”.

The equivalent industry today is Information Technology.  Changes in the industry happen so quickly that it’s difficult for institutions to provide relevant training; it’s still to a large extent a “learning by doing” field.  Yet, oddly, the preoccupation among governments and universities is: “how do we make more tech graduates?”

The thing is, it’s not 100% clear the industry even wants more graduates.  It just wants more skills.  If you look at how community colleges and polytechnics interact with the IT industry, it’s often through the creation of single courses which are designed in response to very specific skill needs.  And what’s interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field.  That means that these college IT courses are true “microcredentials” in the sense that they are short, potentially stackable, and have recognized labour market value.  Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding “bootcamps” attempt to replicate this with varying degrees of success, though since they are usually starting with people from outside the industry, it’s not as clear that the credentials they offer are viewed the same way by industry).

Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT.  The problem is most people are not in IT.  Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries.  You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages.  There is simply no labour-market value to disaggregating a nursing credential, so why bother?

And so the lesson here is this: IT work is a pretty specific type of work in which much store is put in learning-by-doing and formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials.  But most of the world of work doesn’t work that way.  And as a result, it’s important not to over-generalize future trends in education based on what happens to work in IT.  It’s sui generis.

Let tech be tech.  And let everything else be everything else.  Applying tech “solutions” to non-tech “problems” isn’t likely to end well.

March 22

The Next Big Skills Policy Agenda

So today is budget day.  If the papers are anything to go by, there’s something big-ish in there about “skills” which will no doubt be presented as some massive benefit to the country’s middle class (and those trying to join it). I have difficulty imagining what might be announced since most skills policies are in the hands of the provinces.  But what I do know is that skills policy is an area long overdue a makeover.

The labour force is aging.  Any new burst of productivity – essential for rising incomes – is going to have to come from older workers, not newer ones.  Part of that is going to have to come from firms making greater capital investments – that is, better machines and IT infrastructure for workers to use.  But part of it is going to have to come from more intensive and continuous skills upgrading on the part of workers themselves.  And this is a problem, because historically Canada has been uniquely bad at achieving a culture of skills upgrading.  Go back year after year, report after report, and it’s the same story: where continuous upgrading is concerned, it tends to be concentrated among people who already have high levels of skills.  Those that have get; those that do not, do not.

Part of the problem here is funding.  That’s why we sometimes see government get interested in handing money either to individuals or to firms (for example, the Canada Jobs Grant) to subsidize training.  But I’d argue that money is at best a partial barrier to more training.  A larger barrier is time.  And a lot of existing institutional practices are as much a hindrance as a help in this regard.

Workers don’t have a lot of spare time.  They have jobs, kids, parents, families: all of which make time a scarce resource.  We don’t normally think of time as something governments can control, but they actually do have a couple of policy levers they could pull, if they wanted to.  First, they could create incentives or entitlements to time-off for the purpose of training/re-training.  This idea was mooted 35 years ago in the Macdonald Commission report as a “Time Bank” – every year, workers would accrue a certain amount of time off specifically for the purpose of training.  It would no doubt be a colossally unpopular move among employers, but is still probably something worth considering (and might not create that much dissension provided it was fairly applied across all workplaces and didn’t create free-rider problems).

But the other way to make more time available to people is to radically re-consider the nature of the credentials being sought.  Universities, God Bless ‘em, have never seen a labour market problem they couldn’t design a 1- or 2-year Master’s Degree to solve.  The problem is a) not everyone wants to do a year of full-time study (or the part-time equivalent over a longer period of time) and b) who really wants to wait until next September to get started if you just got laid off last week?

From an adult learner’s perspective, the best thing in the world would be credentials that are both shorter and continuously available.  The latter can be solved to some extent simply by throwing money at it.  Continuous intake is relatively easy if you have more instructors to teach more classes at different times of the year.  Putting a greater fraction of classes online could conceivably bring some economies of scale that would assist in the process.

But the bigger problem is reducing the length of credentials.  In theory, there is a pretty clear way forward, called “stackable credentials”.  Many institutions use some variant of this: thirty credits equals a certificate, and once you bunch three certificates together you get an applied degree, or something along those lines.  But even the notion of thirty credits can be kind of off-putting if what you think you need is just a minor skills upgrade.  What is needed is a trusted provider (which usually means a non-profit provider) to come up with smaller-duration credentials which actually convey to employers a sense of competency/mastery in particular fields, and which could also combine over time (i.e. “stack”) into more traditional credentials like diplomas and degrees.
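The stacking arithmetic described above can be sketched as a toy rule.  Everything here is illustrative – the thresholds, names, and the very idea that stacking is a simple credit count are assumptions for the sake of the sketch, not any real institution’s policy:

```python
# Hypothetical "stackable credentials" rule: 30 credits earn a
# certificate, and three certificates stack into an applied degree.
# All thresholds are illustrative, not taken from a real institution.

CREDITS_PER_CERTIFICATE = 30
CERTIFICATES_PER_DEGREE = 3

def credentials_earned(total_credits: int) -> dict:
    """Translate accumulated course credits into stacked credentials."""
    certificates = total_credits // CREDITS_PER_CERTIFICATE
    degrees = certificates // CERTIFICATES_PER_DEGREE
    return {
        "certificates": certificates,
        "applied_degrees": degrees,
        "credits_toward_next_certificate": total_credits % CREDITS_PER_CERTIFICATE,
    }

# A learner with 95 credits holds three certificates (enough to stack
# into one applied degree) plus 5 credits toward a fourth certificate.
print(credentials_earned(95))
```

The point of the sketch is how little a minor skills upgrade registers in such a system: a single short course moves the learner only a few credits toward the next thirty-credit block, which is exactly why smaller-duration credentials with standalone labour-market value matter.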

What’s the government role in this?  Well, the problem is really one of co-ordination.  Individual campuses can experiment with short credentials or competency-based credentials all they like: if employers don’t understand the credentials, they will be worthless.  What is needed is collective action – someone has to corral institutions to work together to create new credential standards, and someone needs to corral business to talk about what features they would find most useful in new, shorter credentials.

That may sound like a job for somebody like the Business-Higher Education Roundtable.  But frankly, some coercion is called for here.  My guess is if BHER floated this you’d probably get a few Polytechnics showing up to play (because it’s the kind of thing they do) and no one else.  But government has the muscle and dollars to make this happen a heck of a lot more quickly and efficiently.

Now, note I say “government” and not “the Government of Canada”.  It would be better all around if provincial governments, who constitutionally are the ones in charge in this area, took the lead.  But one could argue that the feds – provided they stay the hell away from directly funding institutions or getting too far into the curriculum weeds themselves – could at least nudge the key players towards the table.

Bottom line: if we want higher labour productivity we have to get much more serious about creating opportunities for workers to upgrade their skills.  Since the key pressure point for skills upgrading is time, we need to create new, shorter pathways to meaningful credentials.  That means shorter, stackable credentials.  These will need to be designed by employers and institutions together, but the quickest way to start this program runs through governments.  And there’s no time like the present to get started.

November 02

Shifts in Credentialling

As Colin Mathews, President of the technology company Merit, remarked in an excellent little article in Inside Higher Ed a few weeks ago, credentials are a language.  One with limited vocabulary, sure, but a language nonetheless.  Specifically, it is a form of communication from educational institutions to (primarily) the labour market to convey information about their possessors.  There has been a lot of talk in the last couple of years, however, to the effect that the current vocabulary of credentials is inadequate and that change is needed to promote better labour market outcomes for both firms and graduates.  What to make of this?

The complaints about credentials basically come in two categories.  The first has to do with what I call the “chunking” of credentials.  At the moment, we essentially have two sets of building blocks: “Credits” (or “courses”) and degrees.  One is very short, the other often quite long.  Why not something in-between, which indicates mastery over a body of material which might be of interest to employers but is less than a full degree?  These shorter, more focused credentials like Coursera’s “specializations”, or Udacity’s “nanodegrees” are meant to supply labour market skills more quickly than traditional degrees.

Then there is the issue of how to interpret the information conferred by a credential.  A bachelor’s degree mainly tells an employer that the bearer has the stick-to-it-iveness to complete a four-year project (commitment means a lot to employers).  If the employer has hired graduates from a particular university or program before, then they might have a sense of an individual’s technical capabilities, too.  But beyond that, it’s blank.  A transcript might tell an employer what courses a student has taken, but unless the employer is going to take an inordinate amount of time to scrutinize the curriculum, that doesn’t really help them understand what the student has been exposed to.  Marks help in the sense that an employer can get a sense of what a student has achieved at school, but increasingly, employers are finding that this is irrelevant.  What matters in many fields are the “soft skills” or “fuzzy skills”: on these, nearly all credentials are silent.

Enter the idea of “badges”, digital or otherwise.  A solution half-inspired by competency-based education principles and half by Girl Guides/Boy Scouts.  The idea here is to give learners certificates based on particular skills they have demonstrated just as the Guides and Scouts do.  The problems with this are manifold.  First of all, unless a particular skill can be demonstrated through standardized testing, certifying skills is actually a fairly time-consuming and therefore costly activity.  This is why many of the emerging badging systems actually measure achievements and activities rather than skills (one badging system recently profiled in Inside Higher Education, for instance, hands out badges for attending certain types of meetings.  Your typical employer could not care less).

But even if you buy the Guide/Scout analogy, badges quickly run into the same problem as transcripts.  Say you have a knot-tying badge.  Unless an employer is intimately familiar with the Guide/Scout curriculum, s/he will have no actual idea what the knot badge actually signifies in terms of practical skills.  Can they do Zeppelin Bends? Constrictor Knots?  Do they understand ambient isotopy?  Or can they just do a slip knot?  And badges for soft skills are still pretty sketchy, so that part of the equation is still a blank.

All these moves towards what might be termed “microcredentials” are well-meaning.  Degrees are a pretty blunt and opaque way to express achievement and ability; it would be better if we could find ways of making these more transparent.  But the problem here is that all these solutions are being tested largely without talking to employers.  Badges, or whatever new microcredential solution we are talking about here, are all new languages.  Employers understand the language of degrees.  They do not understand the language of badges and see little benefit in learning new languages which seem to bring little additional benefit. For most, the old language of degrees is good enough – for now.  And so the spread of badges and various types of skill-based certification is so far pretty limited.

But it’s early days yet.  My guess is we’re in the first stages of what will likely be a 20- to 30-year shift in credentialing.  As Sean Gallagher notes in his excellent new book The Future of University Credentials, the general trend is going to be towards demanding that individuals be able to demonstrate mastery of particular skills and competencies.  Partly, that can be done through changes to assessment and reporting within existing degree systems; but it may also come through the regularization of certain new credentials, some of which may be issued by existing institutions and others by new providers.  I don’t think the final form of these credentials is going to look anything like the current fashion of badges; they are frankly too clumsy to be up to much.  But the push in this direction is too strong to ignore.  Expect a lot of very interesting experimentation in this field over the next few years.

April 18

What Students Pay For (I)

Anyone who seriously believes in the whole “Great Disruption” meme has to be able to make the case that technologically-driven change of the kinds currently on offer can actually offer an improved value proposition to higher education consumers. No one, to date, has convincingly done so.

Let’s think about this for a minute: what is it students are actually buying when they enrol in a higher education institution? Though the specific combinations will differ from one student to another, all of them to some degree are buying each of the following four benefits:

1. A credential (e.g., a B.A., or an M.Sc.). At the end of the day, people want letters after their name because they perform an important signaling function in the labour market. The letters help young people get a foot in the door in a way that skills alone often don’t.

2. A brand (e.g., a Humber B.A. or a Dalhousie M.Sc.). Education is to some extent a positional good in the way that housing is. Location matters. Older institutions tend to be more prestigious and hence command a higher price.

3. An experience. For some reason, a lot of people tend to overlook this. Being in school, for most people, is a heck of a lot of fun. I know I’d pay good money to be a student again (and do it better this time). Post-secondary education thus isn’t just an investment, it’s a consumption good as well.

4. A set of skills and competencies which are obtained over the course of gaining a credential.

Most of the arguments for a Great Disruption implicitly assume that most of what people want out of post-secondary education are the skills and competencies.  Not coincidentally, of the four benefits listed above, that happens to be the one area where digital learning has the best case to provide better value for money.

Now, presumably there are some students who really only care about the skills. For them, experiments like Sebastian Thrun’s free Stanford class do indeed constitute some kind of Great Disruption. But for everyone else, any reduction in costs provided by these new “disruptive” providers is balanced by a reduction in one or more of the other three sets of benefits. They are simply less able to deliver on credentials, brand and experience than traditional education providers.

Overcoming deficiencies in those three areas is a pretty tall order, and it’s why most of the disruption stuff is vastly overblown. But there are a number of new providers who are in various ways trying to address these gaps. We’ll see how each of them fares tomorrow.


February 10

So, Competency-Based Education, Then

Competency-based education is not rocket science; demonstrate mastery over a particular set of material and you get a credential. This approach is common in informal education: badges for swimming and Guides, belts for martial arts, etc. Red Seal apprenticeships also operate this way.

Formal systems of education are more leery of this approach. In K-12, it is assumed that time served is more important than demonstrated skills in moving students from one level to another. Undergraduate education in North America similarly works on a time basis, with credits being defined in terms of contact hours.

The assumption, of course, is that time spent in the classroom ultimately implies skill acquisition, and hence that time-based education is just competency-based education by approximation. It’s a convenient argument for universities; essentially, it makes their function as certifiers of skills indistinguishable from their function as providers of knowledge/instruction. And by doing so, it gives them monopoly pricing power over instruction.

But what if someone could independently certify a set of knowledge and skills and say “yeah, that’s equivalent to a Bachelor’s degree”? Figuring out how to do that in a reliable way would be a genuine disruption to universities, because it would allow competing routes to credentials. Prior Learning Assessment Recognition (PLAR) – which is widespread in colleges if not universities – uses the same kinds of techniques, though usually for advanced placement rather than delivering entire diplomas. So too, as a cautionary example, do many degree mills.

Western Governors University, a public, online university based in Salt Lake City, bases its degrees around “competency-units” rather than merely time-based credits, and it’s managed to convince a number of U.S. regional accreditors of the validity of such an approach. But despite the hype, WGU has its limitations. It has done very well to boost enrolment to 30,000 in under 15 years, but it’s still basically restricted to certain forms of professional certification and upgrading in business, IT, health and education – areas where external norms are easily available as reference points. Nobody has yet worked out how this would work in basic arts or sciences, where the attitude to the very notion of defined competencies verges on hatred.

That’s why things like Tuning, which documents degree-level outcomes, and AHELO, which is attempting to measure outcomes across different universities around the world, are so important. By getting people to focus on outcomes in areas where they haven’t before, they set the stage for a massive expansion of competency-based higher education.

If you’re looking for a “great disruption,” my money’s here. It’s not glamorous, and it won’t happen quickly, if at all. But unlike recently-hyped techno-solutions, it has the virtue of being both rigorous and realistic.