Are We Lacking AI-mbition?

If you’ve been following the latest developments in artificial intelligence (AI) in recent months, you’ve probably seen that higher education institutions around the world are moving ever more aggressively to incorporate GenAI into how they operate. We all saw Arizona State University announce its partnership with OpenAI at the beginning of 2024. The Hong Kong University of Science and Technology, for its part, has started inviting artificially generated ‘academics’ to teach students in between lectures from real-life instructors, and a growing number of institutions, particularly in the United States (Harvard University, the University of Michigan, Washington University, the University of California, Irvine, and UC San Diego), are developing their own ChatGPT-like tools for faculty, staff, and students to use.

Notice anything here? These innovative approaches all seem to be happening outside Canada. This is not to say that Canadian institutions aren’t keeping busy with their responses to AI, but it’s notable that most of those responses seem to stay within the realm of “reacting” to the new technology rather than proactively trying to lead the way.

Are Canadian institutions lacking ambition? Or are they perhaps lacking incentives and resources? In its recently released 2024 budget, the Government of Canada announced a $2.4 billion investment in AI to “strengthen Canada’s AI advantage.” Great news, it seems. However, from what we can tell so far, none of it is directly targeted at the higher education sector – or at least, any implications have yet to be elaborated upon. How can we expect already stretched institutions to become innovative and creative with AI when we don’t give them the means to?

Now, to be clear, it’s not that the Government has not invested in AI. It invested quite a bit seven years ago through the Pan-Canadian AI Strategy, which is administered by the Canadian Institute for Advanced Research (CIFAR) and which mainly showers money on three institutions (Université de Montréal, University of Toronto, and the University of Alberta) for advanced research in AI. It also dumped $125 million or so on the Université de Montréal for AI research through the CFREF process a few months ago. It’s rather that these investments have focussed fairly narrowly on the very top end of the AI spectrum – that is, the folks conducting cutting-edge research.

It’s a strategy that made a certain amount of sense six years ago, when we seemed to have a comparative advantage in this area (though if you ask some of the innovation ultras, our lack of an IP strategy that would properly capture the benefits of such research always made this road treacherous); now that hundreds of billions of dollars of US venture capital are being brought to bear on the subject, it’s an open question whether continuing that strategy still makes sense. What remains unaddressed, either by the feds or by their almost entirely silent provincial counterparts, is any sense of how to use higher education institutions to promote the diffusion of the AI skills required to drive faster and more effective adoption of these technologies.

And it’s not as though there aren’t examples out there of how to do this. Malaysia has pushed its universities both to introduce a raft of new degree programs in AI-related fields and to make substantial curricular changes. South Korea is specifically ploughing extra funding into new AI-related undergraduate programs; Sweden has been doing so at the Master’s level. Sweden, France, and Germany all devoted significant chunks of their artificial intelligence strategies to skills; so, too, did Singapore, which also developed a scheme for apprenticeships in AI. In other words, there are a lot of countries that seem far more seized of the need to use post-secondary institutions to diffuse skills, not simply to concentrate money on a few elite research institutes and hope for a unicorn or two.

Now, that’s not to say that the lack of action at the national level should let institutions off the hook. Even if your institution isn’t making bold thematic moves like ASU or HKUST, it can still do a lot for skill diffusion. The first step is simply to keep track of the latest estimates of how employers are deploying AI skills. That means keeping in closer touch than ever with employers to see how their views are evolving. It also means keeping an eye on more general trends and on national or global surveys of AI uptake, preferably broken down by industry. This latter part need not be done by every individual institution: groups of institutions could join together to share the cost of the data gathering.

But just gathering data and intelligence is not enough. Institutions need to find ways to get this material into the curriculum itself. To some extent this means designing entirely new programs, though that is probably the least difficult aspect of keeping up with AI skills. No, rather, the challenge, as with all new technologies, is how to integrate the use of AI across all programs. Any institution wanting to instantly become the country’s top AI institution need only be the first to tell all of its departments that, as they go through regular departmental reviews, they will be required to show how AI-relevant skills are being added to the curriculum.

It’s no big deal, really: curricula change all the time. But being the first institution to systematically collect data on changing AI skill use in the economy AND feed it systematically into the curriculum review process? That would be a big deal. Someone should try it.

One response to “Are We Lacking AI-mbition?”

  1. “No, rather, the challenge, as with all new technologies, is how to integrate the use of AI across all programs. Any institution wanting to instantly become the country’s top AI institution need only be the first to tell all of its departments that, as they go through regular departmental reviews, they will be required to show how AI-relevant skills are being added to the curriculum.”

    Nuts to that. It means telling people what is appropriate to their disciplines. There isn’t any obvious reason why dentists, say, should all be teaching AI, or experts on suicide in Russian literature.

    More importantly, if the overheads can tell us that we all have to add AI to the curriculum, what’s to keep them from making any other mad, impertinent demand, such as that everyone has to work on critical race theory, or against critical race theory, or study Ayn Rand?
