Remember about twenty months ago when everyone was gaga over the idea that the feds were going to pay for an expanded version of the faculty survey? And there would be data on part-timers! And on equity criteria! And maybe community colleges too!
Of course it was never clear that this would achieve anything like what its supporters claimed (mainly because it’s not clear how many profs are prepared to have certain personal data on things like race and disability recorded by institutions the way Statscan wants). But more to the point, even if it did, to what end? Why do we want to know so many details about academic staff when there are so many other mysterious yet consequential aspects of the system?
In any rational scheme of educational data collection, there are many things which should be higher priorities. Here are six really obvious ones:
A Survey of Post-Secondary Students. Despite the glaring need, apparently no one in government thinks it’s a priority. We last had a Statscan survey of students in 2008. We have zero idea about the socio-economic background or ethnicity of the student body. We have very little idea about how they spend their time, where they get their money, what their goals are, or what they think of their experience. Of course, this only matters if you think measuring access matters. Wait – access does matter! So why aren’t we doing this?
A Longitudinal Survey of Youth. Actually, we did once care enough about youth and access to measure it properly. It was something called the Youth in Transition Survey – a massive, decade-long effort to track two cohorts of kids from secondary school to post-secondary education and into the labour force. If anything, it was too ambitious (response rates by the end were getting pretty low), but it provided mostly excellent data about youth, education and work. It also provided the only data we have ever had on young people who don’t go on to PSE. But it’s been nearly 20 years since that first cohort was in high school; that data is getting dated, policy-wise, and it’s time for another. So why is there so little agitation for a new one?
Better Analysis of Long-term Graduate Outcomes. We’ve had data on recent graduates for years and the data doesn’t seem to change very much (except when Statscan changes the sample timing for no good reason whatsoever). What we need is longer-term data. It’s not hard to do: just link student data to tax data using SINs (or some other identifier). The folks at EPRI have been doing this at an institutional level for a while: so why can’t we do it more generally?
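To make the mechanics concrete, here is a minimal sketch of what that kind of linkage might look like. The file names, column names, and the pandas approach are all illustrative assumptions on my part, not a description of how EPRI or Statscan actually do it; real linkages would use anonymized identifiers and far more careful handling.

```python
# Illustrative sketch only: files and column names are hypothetical.
import pandas as pd

# Graduate records, one row per graduate, keyed on an anonymized SIN.
grads = pd.read_csv("graduates.csv")    # sin_hash, grad_year, credential, field
# Annual tax records, one row per person per tax year.
taxes = pd.read_csv("tax_records.csv")  # sin_hash, tax_year, employment_income

# Link the two files on the common identifier.
linked = grads.merge(taxes, on="sin_hash", how="inner")

# Follow earnings by years since graduation rather than a single snapshot.
linked["years_since_grad"] = linked["tax_year"] - linked["grad_year"]
outcomes = (
    linked[linked["years_since_grad"].between(1, 10)]
    .groupby(["field", "years_since_grad"])["employment_income"]
    .median()
    .reset_index()
)
print(outcomes.head())
```

The point of the sketch is simply that once the identifier exists, long-term outcomes fall out of a join and a group-by; the hard part is administrative will, not analysis.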
Better Data on Student Debt. Rightly, we care a lot about student debt. But twenty years after debt became a headline issue, we still have genuinely very little idea how much of it there is. The federal government knows how much debt it issues and to whom, and each province knows how much combined federal and provincial debt it issues and to whom, but no one puts this data together across the board. We could solve this problem by getting each province to link its student aid data with student-level records from the Postsecondary Student Information System (several provinces are also quite capable of doing this on their own). The only hitch? Statscan genuinely does not care about student assistance. In fact, the folks there knowingly send incomplete data to the OECD every year on the subject (the data you see in the annual Education at a Glance is federal data only, thereby missing over $3 billion in provincial aid to students), because they can’t be bothered to pick up the phone, call ten provinces, and ask them for the data.
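Again, a rough sketch of what such a linkage might involve, with entirely hypothetical file layouts and identifiers (this is not PSIS’s actual schema or any province’s aid-file format):

```python
# Illustrative sketch only: layouts and identifiers are assumptions.
import pandas as pd

# Provincial student aid disbursements, federal and provincial portions.
aid = pd.read_csv("provincial_aid.csv")   # student_id, province, federal_loan, provincial_loan
# Student-level enrolment records in a PSIS-like layout.
psis = pd.read_csv("psis_students.csv")   # student_id, institution, program, credential_type

# One row per student: total debt issued across both levels of government.
debt = (
    aid.groupby("student_id")[["federal_loan", "provincial_loan"]]
    .sum()
    .assign(total_debt=lambda d: d["federal_loan"] + d["provincial_loan"])
    .reset_index()
)

# Attach program and institution detail so debt can be reported by sector and field.
linked = psis.merge(debt, on="student_id", how="left").fillna({"total_debt": 0})
print(linked.groupby("credential_type")["total_debt"].mean())
```

Nothing here is technically difficult; the missing ingredient is someone willing to collect the provincial files in the first place.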
A Survey of College Tuition. It’s a total mystery: we spend a lot of money each year figuring out average tuition, room and board figures at Canadian universities, yet we ignore colleges completely. It’s not as though colleges are any less tuition-dependent than universities (Quebec apart): so why do we have such copious data in one sector and not the other? (Statscan may find colleges less than enthusiastic participants in this, as I did when I tried it a couple of years ago, but frankly that’s probably all the more reason for Statscan to do it.)
Basic Data on Non-Academic Staff. If you absolutely must spend money gathering data on employees, why not count non-academic staff? The growth of “administration” always seems to be at the top of everyone’s list of reasons why post-secondary education is going to hell in a handbasket (not always with a lot of evidence, as far as I can tell). So why not track it?
Better data on faculty composition is wonderful and all, but the amount of money being spent on this is way out of proportion to the actual importance of the issue. We should stop congratulating the feds for a one-off investment, and start asking them for a real data strategy. Higher education is about more than profs.
On the student debt issue – one thing that drives me crazy is that the Statscan wealth surveys (e.g., the Survey of Financial Security and the Canadian Financial Capability Survey) collect wealth data, including debt data, at the household level. So even when you can see a household with a big pile of student debt, it’s impossible to know which household member it belongs to.
That would be an easy one to fix.
Another fix that would be helpful: record *all* PSE credentials, not just the highest one. Right now it’s impossible to identify people who have been to both college and university in the Labour Force Survey or the Census.