Earlier this year, I raged a bit at a project the Ontario government had launched: namely, an attempt to survey every single student in Ontario about sexual assault in a way that seemed to me likely to be (mis)used to construct a league table of which institutions had the highest rates of sexual assault. While getting more information about sexual assault seemed like a good idea, the possibility of a league table – based as it would be on a voluntary survey with likely very low response rates – was a terrible idea, and I suggested it needed to be re-thought.
Well, surprise! It turns out Australian universities actually did this on their own initiative last year. They asked the Australian Human Rights Commission (AHRC) to conduct a survey along almost exactly the lines I said were a terrible idea. And the results are… interesting.
To be precise: the AHRC took a fairly large sample (a shade over 300,000) of university students – not the complete census Ontario is considering – and sent them a well-thought-out survey (the report is here). The response rate was 9.7%, and the report's authors quite diligently and prominently noted the issues with data of this kind, the same issues that bedevil nearly all student survey research, including things like the National Survey of Student Engagement, the annual Canadian Undergraduate Research Consortium studies, etc.
The report went on to outline a large number of extremely interesting and valuable findings. Even if you take the view that these kinds of surveys are likely to overstate the prevalence of sexual assault and harassment because of response bias, the data about things like the perpetrators of assault/harassment, the settings in which it occurs, the reporting of such incidents and the support sought afterwards are still likely to be accurate, and the report makes an incredible contribution by reporting these in detail (see synopses of the report from CNN and Nature). And, correctly, the report does not reveal data by institution.
So everything’s good? Well, not quite. Though the AHRC did not publish the data, the fact that it possessed data which could be analysed by institution set up a dynamic where if the data wasn’t released, there would be accusations of cover-up, suppression, etc. So, the universities themselves – separate from the AHRC report – decided to voluntarily release their own data on sexual assaults.
Now, I don't think I've ever heard of institutions voluntarily releasing data on themselves that a) allowed direct comparisons between institutions, b) touched on such a sensitive subject and c) was of such suspect quality. But they did it. And sure enough, news agencies such as the ABC (the Australian one) and News Corp immediately turned this crap data into a ranking, which means that for years to come the University of New England (it's in small-town New South Wales) will be known as the sexual assault capital of Australian higher education. Is that label justified? Who knows? The data quality makes it impossible to tell. But UNE will have to live with it until the next time universities do a survey.
To be fair, on the whole the media reaction to the survey was not overly sensationalist. For the most part, it focussed on the major cross-campus findings and not on institutional comparisons. Which is good, and suggests that some of my concerns from earlier this year may have been overblown (though I'm not entirely convinced our media would be as responsible as Australia's). That said, on the data-quality front, a much smaller sample with incentives to produce a much higher response rate would still yield far better data than what the AHRC collected, let alone the nonsensical census idea Ontario is considering. The subject is too important to let bad data quality cloud the issue.
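To see why the response rate matters so much more than the raw sample size, here is a toy calculation of my own (not anything from the AHRC report): if you make no assumptions at all about how non-respondents differ from respondents, the true prevalence of an experience can only be pinned down to an interval whose width is one minus the response rate. The 20% observed prevalence below is a made-up placeholder, and the 60% figure is just a stand-in for what an incentivized smaller survey might achieve.

```python
# Toy sketch of worst-case (no-assumption) bounds on true prevalence under
# non-response. This is an illustration, not the AHRC's methodology; the
# observed prevalence and the 60% response rate are hypothetical numbers.

def prevalence_bounds(observed_prevalence: float, response_rate: float):
    """Bounds on population prevalence when non-respondents may differ
    arbitrarily from respondents.

    True prevalence = R * p_respondents + (1 - R) * p_nonrespondents,
    and p_nonrespondents can be anywhere between 0 and 1.
    """
    low = response_rate * observed_prevalence
    high = response_rate * observed_prevalence + (1 - response_rate)
    return low, high


if __name__ == "__main__":
    observed = 0.20  # hypothetical prevalence among survey respondents
    for rate in (0.097, 0.60):  # AHRC-style rate vs. an incentivized survey
        low, high = prevalence_bounds(observed, rate)
        print(f"Response rate {rate:.1%}: true prevalence lies somewhere "
              f"between {low:.1%} and {high:.1%}")
```

At a 9.7% response rate the bounds span nearly the whole range from a few percent to over ninety percent, no matter how enormous the initial sample was; at a 60% response rate they narrow sharply. Anything tighter than that requires assumptions about who chose to respond, which is exactly where the arguments start.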
Erratum: There was a data transcription error in yesterday's piece on tuition. Average tuition in Alberta is $5749, not $5479, meaning it is slightly more expensive than neighbouring British Columbia, not slightly less.