In 1991, Maclean’s began publishing university rankings. In doing so, it relied heavily on university co-operation: in particular, it required institutions to fill in a survey covering various pieces of data on admissions, class sizes, etc. Not all the questions were particularly well-defined, and so there was a lot of data gaming. Eventually, in 2006, the universities decided they were not going to play the game any more: they were going to get out of the rankings business and instead set up their own transparency mechanisms.
Initially, it all looked promising. There was quite a large meeting in (IIRC) 2007 of Canadian university Provosts, with Vivek Goel, then at U of T, as the host. Everyone swore up and down that the country’s students needed better information about institutions and that, now that institutions were no longer sharing information with Maclean’s, they needed to be proactive about providing that data. But then it became clear that different parts of the country wanted to move to such a system at different speeds. So, a regional approach was taken.
Now, as soon as you hear the words “regional approach” from Canadian universities, you need to understand a few things. The first is that it means Ontario is going to take the lead. The Council of Ontario Universities (COU) is by far the most organized and best-funded provincial association. Quebec’s is OK (it used to be better before it blew up in 2013 over the tuition fee debate), but “regional” in Quebec just means they can ignore what the rest of the country is doing. Out east, “regional” means the Atlantic Association of Universities is in charge, but it lacks the institutional weight of the COU, while out west “regional” means zippedy-doo-dah, because there is no permanent institutional infrastructure to co-ordinate the efforts of western universities (though their Presidents do have an informal group with the most delightful acronym in Canadian academia: the Council of Western Canadian University Presidents, or COWCUP).
And so it went: the Atlantic universities tried to create something, but it petered out around 2013 or 2014. I can find no evidence of this on the web, but I recall Quebec attempting something before ditching the project, largely for lack of interest, and the west never really put anything together at all. Only the Common University Data Ontario (CUDO) project really moved forward, with mostly complete coverage by 2008 or so.
Of late, though, Ontario universities appear to have given up on this effort as well. If you go to the Council of Ontario Universities’ webpage, you arrive at a landing page that lists all the universities individually and invites you to go to a site that allows you to compare different institutions. It also contains links to all its member universities’ CUDO sites. The first clue that this latter page hasn’t been updated in a while is that Ontario Tech is still listed as UOIT. The second clue is that for several universities (McMaster, Nipissing, Ryerson, and York) the link sends you to a “404 We can’t find that page” message. Another half-dozen or so institutional links send you to a page from around 2015-16, even though those institutions have more recent data. If you’re someone like me, who uses university websites all the time, it’s easy enough to find the more recent data; I suspect most casual visitors would simply give up at this point.
Most of the universities’ CUDO pages leave much to be desired. Brock, Queen’s, Toronto, Windsor, and York all have 2019 CUDO pages, which cover data from 2017 and 2018. Ottawa and OCADU don’t go past 2015, and the bulk of the rest are providing data from 2016. Trent doesn’t even bother putting its data up on its own site; it just directs people to the comparison site. And if you try using the comparison tool, what you get is a hodge-podge: you can compare data on some items (e.g. class sizes) up to 2018, but on others (e.g. distribution of entering class averages) it only goes up to 2016.
I want to stress that institutions are capable of providing this data: for many of the items CUDO is supposed to cover (e.g. enrolments, finances, granting council data), universities have already published data for 2020 in other, less accessible forums. They are, rather, simply choosing not to make the data available in an accessible and comparable form.
It goes without saying that this is not a good look for Ontario universities. But let me remind everyone that while Ontario universities’ approach to CUDO may seem deeply half-assed, it’s still one cheek better than any other region of the country has managed. So far as Canadian institutions are concerned, it seems that the less anyone can know about them, the less they can be compared, the better. I mean we couldn’t portray one institution as inferior to another, could we? Heavens, no.
A fair objection is that it’s questionable how much any prospective student cares about the data presented in CUDO. For one thing, it provides institutional rather than program-level data, which might be of limited value given that students apply to programs rather than universities. That approach might make sense in the US, where each school has a single admission standard for all programs. But in Ontario, where every mid-sized school is effectively running 40-odd separate application processes for 40-odd different programs, it makes zero sense. In any case, the objection loses its force because no universities are arguing for replacing institutional data with program-level data.
Arguably, we shouldn’t care much about losing this kind of data source, because the indicators always seemed a bit picayune for the average 17-year-old who was the theoretical audience for this stuff. Number of volumes in the university collection? Research Awards by Granting Council? These items are of interest to the academic community, but of little interest to students themselves. What would be useful? Engagement survey results from students in the program, perhaps. Graduation rates and graduate perceptions, for sure. Other countries do this reasonably well: check out the ComparED tool from Australia, or the Discover Uni system run by the UK’s Office for Students. There is no earthly reason why Canadian universities could not do the same, other than that they choose not to. Because…well, who knows, really. Data is bad and must be kept secret.
(One could argue that even this isn’t what students want to know. When choosing an institution, 17-18 year-olds – still the dominant intake at almost every university in the country – are effectively choosing the place where they will make friends for the rest of their lives: it is in many ways a social decision rather than an academic one. What they really want to know is whether people at school X like K-Pop as much as they do, whether the dominant dress style at school Y is business attire or pyjamas, etc. One could imagine someone developing a survey to capture something like that, but it wouldn’t be the universities themselves.)
Still: it’s bad that Ontario universities can’t get their act together for CUDO and it’s worse that universities in the rest of the country couldn’t get their act together on anything at all. It’s time for a return to a debate about institutional transparency in this country.
The information may not be of interest to your typical 17-year-old, but as a parent I used both the CUDO and ACUDS (when it existed) data extensively in researching options for my kids. I also used the data published in Maclean’s, as it was more up to date than what was available from either site. It always frustrated me that the data wasn’t more recent, or available at the more granular program level, but data on admission averages, retention rates, graduation rates, and class sizes helped me guide my kids’ searches.