
May 10

Why Education in IT Fields is Different

A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth.  (It's brilliant.  Read it.)  It's an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry.  And the answer, it turns out, is that despite all the investment in capital, which permitted vast jumps in labour productivity, wages didn't rise much at all.  Like, for about fifty years.

Sound familiar?

What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution.  And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills.  It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers.  Human capital was highly firm-specific.

The upshot of all this: in fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through "learning by doing".  There's simply no way to learn the skills in advance.  That meant workers had less bargaining power, because they couldn't necessarily use the skills acquired at one job at another.  It also meant, not to put too fine a point on it, that formal education became much less important compared to "learning by doing".

The equivalent industry today is Information Technology.  Changes in the industry happen so quickly that it's difficult for institutions to provide relevant training; it's still to a large extent a "learning by doing" field.  Yet, oddly, the preoccupation among governments and universities is: "how do we make more tech graduates?"

The thing is, it's not 100% clear the industry even wants more graduates.  It just wants more skills.  If you look at how community colleges and polytechnics interact with the IT industry, it's often through the creation of single courses designed in response to very specific skill needs.  And what's interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field.  That means these college IT courses are true "microcredentials" in the sense that they are short, potentially stackable, and have recognized labour market value.  Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding "bootcamps" attempt to replicate this with varying degrees of success, though since they usually start with people from outside the industry, it's not as clear that the credentials they offer are viewed the same way by industry).

Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT.  The problem is most people are not in IT.  Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries.  You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages.  There is simply no labour-market value to disaggregating a nursing credential, so why bother?

And so the lesson here is this: IT is a pretty specific type of work, one in which great store is set by learning-by-doing and in which formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials.  But most of the world of work doesn't operate that way.  And as a result, it's important not to over-generalize future trends in education based on what happens to work in IT.  It's sui generis.

Let tech be tech.  And let everything else be everything else.  Applying tech “solutions” to non-tech “problems” isn’t likely to end well.

November 02

Shifts in Credentialling

As Colin Mathews, President of the technology company Merit, remarked in an excellent little article in Inside Higher Ed a few weeks ago, credentials are a language.  One with a limited vocabulary, sure, but a language nonetheless.  Specifically, credentials are a form of communication from educational institutions, primarily to the labour market, conveying information about the people who hold them.  There has been a lot of talk in the last couple of years, however, to the effect that the current vocabulary of credentials is inadequate and that change is needed to promote better labour market outcomes for both firms and graduates.  What to make of this?

The complaints about credentials basically come in two categories.  The first has to do with what I call the "chunking" of credentials.  At the moment, we essentially have two sets of building blocks: "credits" (or "courses") and degrees.  One is very short, the other often quite long.  Why not something in between, which indicates mastery over a body of material that might be of interest to employers but is less than a full degree?  Shorter, more focused credentials, like Coursera's "specializations" or Udacity's "nanodegrees", are meant to supply labour market skills more quickly than traditional degrees.

Then there is the issue of how to interpret the information conferred by a credential.  A bachelor's degree mainly tells an employer that the bearer has the stick-to-it-iveness to complete a four-year project (commitment means a lot to employers).  If the employer has hired graduates from a particular university or program before, then they might have a sense of an individual's technical capabilities, too.  But beyond that, it's blank.  A transcript might tell an employer what courses a student has taken, but unless the employer is going to take an inordinate amount of time to scrutinize the curriculum, that doesn't really help them understand what the student has been exposed to.  Marks show what a student has achieved at school, but increasingly, employers are finding them irrelevant to hiring decisions.  What matters in many fields are the "soft skills" or "fuzzy skills": on these, nearly all credentials are silent.

Enter the idea of "badges", digital or otherwise.  A solution half-inspired by competency-based education principles and half by the Girl Guides/Boy Scouts.  The idea here is to give learners certificates based on particular skills they have demonstrated, just as the Guides and Scouts do.  The problems with this are manifold.  First of all, unless a particular skill can be demonstrated through standardized testing, certifying skills is a fairly time-consuming and therefore costly activity.  This is why many of the emerging badging systems actually measure achievements and activities rather than skills (one badging system recently profiled in Inside Higher Ed, for instance, hands out badges for attending certain types of meetings; your typical employer could not care less).

But even if you buy the Guide/Scout analogy, badges quickly run into the same problem as transcripts.  Say someone has a knot-tying badge.  Unless an employer is intimately familiar with the Guide/Scout curriculum, s/he will have no idea what the badge actually signifies in terms of practical skills.  Can the holder tie a Zeppelin Bend?  A Constrictor Knot?  Do they understand ambient isotopy?  Or can they just do a slip knot?  And badges for soft skills are still pretty sketchy, so that part of the equation remains blank.

All these moves towards what might be termed "microcredentials" are well-meaning.  Degrees are a pretty blunt and opaque way to express achievement and ability; it would be better if we could find ways of making them more transparent.  But the problem is that all these solutions are being tested largely without talking to employers.  Badges, or whatever new microcredential solution we are talking about, amount to new languages.  Employers understand the language of degrees.  They do not understand the language of badges, and they see little reason to learn a new language that offers them little in return.  For most, the old language of degrees is good enough – for now.  And so the spread of badges and various types of skill-based certification has so far been pretty limited.

But it's early days yet.  My guess is we're in the first stages of what will likely be a 20- to 30-year shift in credentialling.  As Sean Gallagher notes in his excellent new book The Future of University Credentials, the general trend is going to be towards demanding that individuals be able to demonstrate mastery of particular skills and competencies.  Partly, that can be done through changes to assessment and reporting within existing degree systems; but it may also come through the regularization of certain new credentials, some of which may be issued by existing institutions and others by new providers.  I don't think the final form of these credentials is going to look anything like the current fashion of badges; they are frankly too clumsy to be up to much.  But the push in this direction is too strong to ignore.  Expect a lot of very interesting experimentation in this field over the next few years.