A few weeks ago, Statistics Canada released a paper profiling graduates of community colleges who already held bachelor’s degrees. A significant number of these were graduates of foreign universities – immigrants who came to the country with a degree and then found they needed a Canadian credential. But there were also a substantial number – fully 8% of all college graduates – who already had a degree from a Canadian university.
In the 1990s, when colleges first started pointing out this university-to-college phenomenon, the popular explanation was “overeducated arts grads going to get some real vocational skills”, an early variation on the “we need more welders and fewer baristas” refrain of the early 2010s. It’s possible that this was happening at the time. Now, however, something different is going on: colleges – particularly in Ontario and British Columbia – have courted university graduates by developing short post-bachelor’s programs designed to address niche labour market needs. This has nothing to do with traditional college programs in the trades. Rather, colleges and polytechnics have re-defined “vocational education” for the digital age, designed shorter programs built around bachelor’s-level courses, and sold them aggressively to a white-collar market.
What’s interesting is that these are programs that universities could have designed, delivered and sold. But they chose not to. Instead, they chose to allow colleges to walk into this market. The question is why, and what it means for the future.
The why, I think, is pretty simple. While universities are sometimes criticized for chasing the almighty dollar, in fact what they really chase is prestige. Often, they chase dollars through fundraising as a means of buying prestige – more buildings, more star professors, whatever. There is prestige in teaching graduate courses, or in developing new undergraduate programs. But there has generally been limited prestige associated with recruiting part-time undergraduates.
Institutionally, part-time and short-course programs at the undergraduate level often lacked a home in universities. They were usually controlled by Faculties of Continuing Education; less frequently, they belonged to individual faculties. In the 1980s, when there was a surge of part-time students (mainly due to provincial governments mandating educational upgrades for teachers and nurses), Cont Ed and part-time studies were briefly in vogue. But when that wave of demand for upgrades passed, Cont Ed’s prestige faded. At many institutions, Cont Ed was downsized or even eliminated altogether, and its programs were scattered to the faculties, which largely did not know what to do with them. Result: universities ended up abandoning their infrastructure for offering short courses and part-time bachelor’s-level education just as colleges were ramping up their offerings.
Now, it has increasingly dawned on universities – only a decade or so into a demographic bust of traditional-aged students – that older students might need courting, whether through traditional short courses/certificates or newfangled “micro-credentials”. Those universities that retained their Continuing Education units more or less intact (e.g. Ryerson or York) are going to do pretty well in this world. Those that dismantled them over the past two decades – y’all know who you are – are going to find it hard to compete. Playing the short course/certificate game means flexible, fast delivery, continual rapid re-design of programming, and specialized expertise in marketing. Chances are that if a university has left all of this in the hands of individual faculties, it will have a really hard time working out how to compete in this landscape.
And colleges, as a result, will likely continue to eat into a market that universities could have dominated but failed to, mainly because they had difficulty distinguishing between bachelor’s-level teaching and actual bachelor’s degrees. This was a failure of imagination, basically. And one that at least some institutions will be rueing for the next few years.
Hi Alex. Very interesting points. But don’t blame “universities” (which tends to be read as university administration). Blame the faculty members. Outside of their research, faculty are typically conservative, and many have a great aversion to applied education. (Typical comment at our Senate meetings: “universities have been around since Medieval times, so why should we change now?”)

Case in point: when I was VP Academic at a research university, I had a series of very interesting discussions with the president of a college-type institution known for hands-on programs and a high employment rate among its alumni. To keep it anonymous, I’ll just call it a “college”. The college proposed that we admit selected students with two years of college experience into the third and fourth years of a program in a university department that was bleeding majors, so that the college students would receive a degree and broaden their understanding of method and theory. In return, our majors could substitute for their electives a year’s worth of experiential learning at the college – with state-of-the-art equipment and strong industry connections – and still graduate in four years. An added bonus for our department was the opportunity to recruit ethnically diverse students from the college into a program with relatively little diversity.

It looked like a win-win all round. The department’s enrollments and majors would increase, and the student body would diversify. Our students would get hands-on experience and industry networking. The college students would receive a degree and broaden their education. The college would be able to recruit more students with the possibility of transferring to a university.

When this was proposed to the department, we got a flat “no”. They were not even interested in coming to a meeting to discuss it. The reason: “we don’t think our students want jobs in industry, so we don’t see the value in substituting practical work for courses in theory”.