Why Education in IT Fields is Different

A couple of years ago, an American academic by the name of James Bessen wrote a fascinating book called Learning by Doing: The Real Connection Between Innovation, Wages and Wealth.  (It’s brilliant.  Read it.)  It’s an examination of what happened to wages and productivity over the course of the industrial revolution, particularly in the crucial cotton mill industry.  And the answer, it turns out, is that despite all the investment in capital, which permitted vast jumps in labour productivity, wages didn’t actually rise that much at all.  Like, for about fifty years.

Sound familiar?

What Bessen does in this book is to try to get to grips with what happens to skills during a technological revolution.  And the basic problem is that while the revolution is going on, while new machines are being installed, it is really difficult to invest in skills.  It’s not simply that technology changes quickly and so one has to continually retrain (thus lowering returns to any specific bit of training); it’s also that technology is implemented in very non-standard ways, so that (for instance) the looms at one mill are set up completely differently from the looms at another and workers have to learn new sets of skills every time they switch employers.  Human capital was highly firm-specific.

The upshot of all this: in fields where technologies are volatile and skills are highly non-standardized, the only way to reliably increase skill levels is through “learning by doing”.  There’s simply no way to learn the skills in advance.  That meant workers had lower levels of bargaining power, because they couldn’t necessarily use the skills acquired at one job at another.  It also meant, not to put too fine a point on it, that formal education became much less important compared to “learning by doing”.

The equivalent industry today is Information Technology.  Changes in the industry happen so quickly that it’s difficult for institutions to provide relevant training; it’s still to a large extent a “learning by doing” field.  Yet, oddly, the preoccupation among governments and universities is: “how do we make more tech graduates?”

The thing is, it’s not 100% clear the industry even wants more graduates.  It just wants more skills.  If you look at how community colleges and polytechnics interact with the IT industry, it’s often through the creation of single courses designed in response to very specific skill needs.  And what’s interesting is that – in the local labour market at least – employers treat these single courses as more or less equivalent to a certificate of competency in a particular field.  That means these college IT courses are true “micro-credentials” in the sense that they are short, potentially stackable, and have recognized labour-market value.  Or at least they do if the individual has some demonstrable work experience in the field as well (so-called coding “bootcamps” attempt to replicate this with varying degrees of success, though since they usually start with people from outside the industry, it’s not as clear that the credentials they offer are viewed the same way by industry).

Now, when ed-tech evangelists go around talking about how the world in future is going to be all about competency-based badges, you can kind of see where they are coming from because that’s kind of the way the world already works – if you’re in IT.  The problem is most people are not in IT.  Most employers do not recognize individual skills the same way, in part because work gets divided into tasks in a somewhat different way in IT than it does in most other industries.  You’re never going to get to a point in Nursing (to take a random example) where someone gets hired because they took a specific course on opioid dosages.  There is simply no labour-market value to disaggregating a nursing credential, so why bother?

And so the lesson here is this: IT work is a pretty specific type of work in which much store is put in learning-by-doing and formal credentials like degrees and diplomas are to some degree replaceable by micro-credentials.  But most of the world of work doesn’t work that way.  And as a result, it’s important not to over-generalize future trends in education based on what happens to work in IT.  It’s sui generis.

Let tech be tech.  And let everything else be everything else.  Applying tech “solutions” to non-tech “problems” isn’t likely to end well.

4 responses to “Why Education in IT Fields is Different”

  1. Programs and courses that do provide general employability skills include management/business. Employers may hire with a particular discipline in mind, but typically what university graduates are asked to “do” is to lead a team, manage a project, negotiate with a customer/client, make a sale, manage a budget, analyze financial data, help commercialize an innovation. All of these skills are critical to healthy economies, yet when governments talk about economic growth they jump to STEM. Why is business not included?

  2. Excellent point, things in the tech industry work differently than in other industries; however, it’s usually the tech industry that leads in providing solutions to problems, and soon the non-tech industries will follow. There is a need to recognize these micro-credentials in addition to standardized basic training. The labour-market wage pattern in the tech industry will stagnate, similar to the example from the industrial revolution.
    Basic credentials + micro-credentials/on-the-job training in all fields should be considered when setting wage patterns.

    1. Yeah, I guess my point is that I’m not convinced non-tech will follow. Or at least not quickly.

  3. These are thoughtful reflections, and it’s a research question I am thinking a lot about at the Brookfield Institute for Innovation + Entrepreneurship. You raise a great point in distinguishing between “tech graduates” and skills. Anecdotally, even graduates we speak to in “ICT-type jobs” find that they still need to take micro-credential-type offerings, whether through online learning or coding bootcamps, to supplement their skill sets.

    Agreed with the point about not overgeneralizing trends based on what happens to work in the ICT sector, but I wonder if it will truly be sui generis if, for the sake of argument, we take the prediction that more jobs will require one to perform tasks that increasingly rely on tech skills (regardless of whether you are in the ICT industry or not). Or does this only apply to people at the very high end of technological skill proficiency? Be curious to hear your thoughts.
