A Flawed Report on Sexual Violence

You may remember that a couple of years ago I expressed some concern about the design of a survey on sexual harassment and sexual violence being developed by the Ministry of Training, Colleges and Universities in Ontario.  In particular, I was concerned that by going for a census rather than a sample, they would in fact get lower overall data quality (why do a census with a superficial quant survey when you could cut costs by sampling and use the savings to ask deeper and more meaningful questions of smaller subsets of students?), and that the report would end up being read as a league table ranking which institutions had higher levels of sexual assault rather than as a tool to help institutions understand and deal with these phenomena.  I changed my mind slightly when a similar survey came out in Australia with sensible design and sensible (not league-table) reporting of obvious policy relevance, and hoped that the Ontario study might turn out for the better.

But no.  How silly of me.  This is Ontario, where the sensible path is never taken. 

Let’s start with the survey instrument they chose.  It’s not bad as these things go – you can see a demo of it here.  To my mind it’s a little too heavy on the attitudinal stuff and is missing some key information on actual incidents, which is what the survey is supposed to be about.  There are three key batteries of questions: one about sexual harassment, one about stalking, and one about sexual violence.  In each of those batteries, students are asked whether a variety of different events had happened to them in the current academic year – for example, under the stalking section, respondents are asked whether they had ever been watched or followed from a distance, been cyberstalked, had unwelcome gifts or items left for them, etc. – and also how often these had occurred.

What’s missing, however, are any questions about where these incidents occurred.  This matters: if the events are happening on campus, or in transit to campus, there are things that the institution can actually do about it.  Based on the very limited data published this week, this would seem to be a serious point; institutions with large residence populations seem to have much higher rates of sexual assault than campuses with a mainly commuter population.  (If things are happening off-campus, it’s still important for institutions to know, because they have a duty to support all their students, but the assaults themselves are very definitely not their responsibility.)  In the absence of this question, what we get are numbers for each institution, but almost no usable insight – which is good if you just want to create a league table and bad if you actually want to improve anything.

Anyways, whatever: if the full report ever gets published, there will be *some* useful data in there.  But unfortunately, on Tuesday, the Government of Ontario published a largely useless “summary report” which to my mind amounts to active disinformation, for two reasons.

First, IT DOES NOT DISAGGREGATE DATA BY GENDER.  Yep, you read that right.  Apparently sexual assault is in no way gendered and breakdowns by gender are meaningless.  The Government of Ontario claims this is because of privacy issues and that it is waiting on a ruling from the Privacy Commissioner to release the full data, but this argument is:

  1. Nonsense, with sample sizes this large (over 150,000 students). Disclosure risk is simply not a concern at this scale, and even if it were, any statistician who draws a regular paycheck can tell you ways to get around the problem, such as suppressing small cells or rounding to the nearest five;
  2. Even if it were a concern, why in the name of everything holy would you rush out a summary report with crap, useless, ungendered data?  It means you can’t even get good summary demographic data to help you interpret the results. Just wait, for God’s sake.
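To see how routine the fix is, here is a minimal sketch of the two standard disclosure-control techniques mentioned above. The threshold and the cross-tab figures are invented for illustration; they are not numbers from the survey.

```python
# Two standard disclosure-control fixes: suppress small cells,
# and round everything else to the nearest five.

SUPPRESSION_THRESHOLD = 10  # hypothetical cutoff; agencies vary (often 5 or 10)

def protect_cell(count, threshold=SUPPRESSION_THRESHOLD):
    """Suppress counts below the threshold; otherwise round to nearest 5."""
    if count < threshold:
        return "<10"              # too few respondents to publish safely
    return 5 * round(count / 5)   # masks the exact count

# Invented cross-tab: incidents reported, by gender, at one institution
raw = {"women": 412, "men": 87, "non-binary": 6}
published = {group: protect_cell(n) for group, n in raw.items()}
print(published)  # {'women': 410, 'men': 85, 'non-binary': '<10'}
```

The point is that gendered breakdowns can be released without identifying anyone: only the genuinely tiny cells get withheld, and everything else survives in slightly coarsened form.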

Second, it only presents aggregate summary data on harassment, stalking and violence.  This matters because each of those three batteries has multiple questions, not all of which would typically be included in the definition.  For instance, one of the thirteen situations/behaviours under harassment is “did someone ever treat you differently because of your gender identity or sexual orientation”, which might be considered discrimination but probably not harassment.  But my guess is that it contributed mightily to the fact that 63.2% of all university students (presumably including some male respondents ALTHOUGH WHO KNOWS BECAUSE THERE IS NO GENDER DATA) and 49.6% of all college students are reported to have experienced “harassment” according to this definition.  It would be nice to have the breakdown, and there is no earthly privacy reason not to publish it.  But they didn’t.

It’s the same thing with the percentage of students who say they have been “assaulted”.  We have some baseline data here: in 2016 a group of Canadian schools ran the American National College Health Assessment survey, which asked whether students reported any of a number of factors as having had an impact on their academic performance: drug use, stress, gambling, etc.  The figure for sexual assault was 1.2%.  But according to this Ontario survey, the figure is a staggering 23% in universities and 17% in colleges.  Why?  We don’t know, but my guess would be that something similar is at work, because one of the possible answers includes simply having someone rub up against you unwantedly, which is toxic behaviour but not “assault” in the conventional sense of the term.  But you absolutely need disaggregated data to know whether you’re right to claim there has been a nineteen-fold increase in sexual assaults.  Wording matters, and there’s clearly a lot of grey zone here.

(To be clear: I’m not arguing against a more expansive definition of assault.  However, the way the question was phrased allowed the analysis to show different totals using different definitions of assault.  Publishing the disaggregated data would have allowed people to see the severity and frequency of various types of activities with no loss of privacy.  They chose not to.  That inhibits understanding.) 

Frankly, it’s hard to read this whole thing as anything other than a hit piece on institutions rather than an honest attempt to deal with sexual violence.  At every turn, the instinct seems to be to print data with sensationally high numbers, denuded of context.  At a key point in the survey design, there was a decision to avoid asking very simple questions about location, which would have helped institutions design more effective interventions.  The only thing that doesn’t read that way is the data on student satisfaction with complaint procedures.  The percentage of incidents which led to reports is unknown (and impossible to calculate if you don’t ask about location, because off-campus incidents wouldn’t be reported anyway – the person in charge of survey design really has some ’splaining to do), but what we do know is that students who are happy with their institution’s handling of their report outnumber those who are unhappy by more than 2.5 to 1.  Which, to put it mildly, isn’t the usual discourse you hear about universities and sexual assault complaints.

Why anyone would release a document this awful is a total mystery. The lack of specifics does nothing to help create effective measures or policies for dealing with this very serious issue. I hope the final report comes soon, but I half-suspect the government will choose to bury it.  Any ministry which could release research this bad can’t have policy improvement on its mind.


2 responses to “A Flawed Report on Sexual Violence”

  1. Another reason to disaggregate the data: Get reliable information on the experiences of LGBTQ2S+ people experiencing sexual harassment and violence. Our experiences are routinely discounted and ignored. #MeToo has been great at shining a light on the abuse of cis straight men toward (cis straight) women, but if that’s all you can capture and report you’re doing it wrong.

    In postsecondary we exist in the explicitly LGBTQ2S+ realm or not at all. Rarely in surveys like this. (Except on vapid “treated differently because of your gender/orientation” questions… because there is always a little bird on your shoulder telling you “IF YOU WERE CIS AND STRAIGHT THIS WOULD DEFINITELY NOT BE A THING”).

    There are best practices on how to ask these questions (and let people opt out if they choose). Of course, you have to read up on them and then implement them thoughtfully, in consultation with the groups affected. A tall order, I suppose.

    1. Very true. Actually, the survey does include an orientation question which is quite inclusive (9-10 options, IIRC), so it would be possible to tease that stuff out of the data. If it is in fact ever published.
