A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth. (It’s brilliant. Read it.) It’s an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry. And the answer, it turns out, is that despite all the investment in capital which permitted vast jumps in labour productivity, wages didn’t actually rise much at all. Like, for about fifty years.
What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution. And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills. It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers. Human capital was highly firm-specific.
The upshot of all this: in fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through “learning by doing”. There’s simply no way to learn the skills in advance. That meant that workers had lower levels of bargaining power, because they couldn’t necessarily use the skills acquired at one job at another. It also meant, not to put too fine a point on it, that formal education became much less important compared to “learning by doing”.
The equivalent industry today is Information Technology. Changes in the industry happen so quickly that it’s difficult for institutions to provide relevant training; it’s still to a large extent a “learning by doing” field. Yet, oddly, the preoccupation among governments and universities is: “how do we make more tech graduates?”
The thing is, it’s not 100% clear the industry even wants more graduates. It just wants more skills. If you look at how community colleges and polytechnics interact with the IT industry, it’s often through the creation of single courses designed in response to very specific skill needs. And what’s interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field. That means these college IT courses are true “microcredentials” in the sense that they are short, potentially stackable, and have recognized labour market value. Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding “bootcamps” attempt to replicate this with varying degrees of success, though since they are usually starting with people from outside the industry, it’s not as clear that the credentials they offer are viewed the same way by industry).
Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT. The problem is most people are not in IT. Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries. You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages. There is simply no labour-market value to disaggregating a nursing credential, so why bother?
And so the lesson here is this: IT work is a pretty specific type of work in which much store is put in learning-by-doing and formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials. But most of the world of work doesn’t work that way. And as a result, it’s important not to over-generalize future trends in education based on what happens to work in IT. It’s sui generis.
Let tech be tech. And let everything else be everything else. Applying tech “solutions” to non-tech “problems” isn’t likely to end well.