Visa Caps “Lite”

Last week, it was revealed that Immigration, Refugees and Citizenship Canada (IRCC) is trending towards using “Trusted Institution Status” instead of caps on visas.*  The idea is not to decrease the number of visas overall, but to allow “trusted” institutions to access expedited visa processing.  Why is this important?  One, visa processing isn’t really a 12-month thing. Processing clusters at certain points of the year, and IRCC doesn’t want to hire seasonal staff to compensate for those peaks.  Two, several universities with big government relations budgets complained vociferously when their applicants’ visas were delayed past Labour Day, costing them millions in revenue.  So IRCC developed a data-intensive system to determine who is trusted and who is not, and is currently “piloting” the scheme with a limited number of institutions.  They have the more-than-faintly hilarious idea that this whole thing is going to be a live, working scheme by January (no one I know who deals with IRCC thinks the timeline is tenable; 2025 is a much likelier date). 

Now, what to make of this idea? 

First, I have some doubts that it will survive a legal challenge.  This scheme is effectively trying to create two classes of Designated Learning Institutions: “good” and “bad”.  It’s genuinely unclear to me that the Government of Canada can do this: DLI status is a gift of the provinces, and I am not sure under what authority the feds think they have the power to discriminate between them without provincial agreement.  So, everything I said about this in last week’s post on caps stands.

But let’s leave the legal complications aside for a second.  It’s worth understanding that the way IRCC is proposing to distinguish between good and bad is effectively by creating a ranking.  The proposal is that they are going to assemble a list of thirteen indicators – six from their own files, and seven taken from institutions via survey – score institutions on each of them, and weigh the indicators somehow (the document does not indicate how – and I don’t want to hear anyone saying they’re not going to be weighted, because a non-weighted aggregation just means the indicators are weighted equally).  And at some unknown point in the ranking, they are going to draw a line and say that everything above the line is “trusted” and the others “not trusted”. 
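The mechanics described above can be sketched in a few lines of code. Everything here is an illustrative assumption on my part: the IRCC document specifies neither the weights nor the cut-off, and the institution names and scores are invented.

```python
# Hypothetical sketch of the scheme described above: score each institution
# on a set of indicators, aggregate with weights, and draw a cut-off line.
# Weights, cut-off, and data are all illustrative assumptions, not IRCC's.

def trust_ranking(scores, weights=None, cutoff=0.6):
    """scores: {institution: [indicator scores, each in 0..1]}."""
    n = len(next(iter(scores.values())))
    if weights is None:
        # An "unweighted" aggregation is just an equal-weights one.
        weights = [1.0 / n] * n
    totals = {inst: sum(w * s for w, s in zip(weights, vals))
              for inst, vals in scores.items()}
    return {inst: ("trusted" if total >= cutoff else "not trusted")
            for inst, total in totals.items()}
```

Note that leaving `weights` unset does not make the ranking neutral; it simply weights every indicator equally, which is itself a (usually unexamined) methodological choice.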

Thus, the way to analyze this scheme’s indicators is the same way one analyzes a ranking: do the indicators make sense?  What is the quality of the data?  And in what direction do the chosen indicators display bias towards certain types of institutions?

Here are the IRCC indicators. 

Indicator: Study-permit approval rates
Relevance: It’s broadly considered the least biased measure for “good” students.  But approval rates by country of origin suggest a high correlation between the GDP of the country of origin and visa approval rates.
Bias: On current approval patterns, this will affect institutions which disproportionately recruit in Africa (notably francophone ones).

Indicator: Rate of “Adverse Outcomes” (I’m told this is a euphemism for “being arrested”)
Relevance: Er…OK.  I would be keen to know what IRCC thinks individual institutions can do to screen for this.
Bias: I have no idea.  This one has me stumped.

Indicator: Diversity of international students (country of origin?)
Relevance: There is some value to portfolio diversity.
Bias: Depends on how they turn diversity data into a score.  Really unclear how that will happen.

Indicator: PGWP-to-PR transition rates
Relevance: Again: how is this in the control of the institution, exactly?  And why would you even want it to be?
Bias: Institutions might start to screen students on the basis of whether they intend to stay in Canada or not.

Indicator: Language levels of graduates who apply for Express Entry
Relevance: Again, not entirely in institutional control, since the test is taken years after the student leaves the institution.
Bias: Can’t really be gamed, but it’s going to be fun when Quebec says the relevant language test in QC is French regardless of the institution attended.

Indicator: International student graduates earning above the Canadian median
Relevance: Again, not entirely within the institution’s control.  In Canada, income is linked tightly to geography.
Bias: This will play out well for institutions whose graduates end up in big cities and less well for those whose graduates end up in Atlantic Canada and smaller towns.

A few of these are just bizarre.  Institutions have no control over who wants or gets a post-graduate work visa, and they certainly don’t have control over who gets Permanent Residency.  I mean, they could get into that line of work by asking students up-front if that is their intention, but I am quite sure that would confuse the hell out of students who are simultaneously being assessed for visas on the basis of whether they might not return to their home country after their studies (seriously, this is the most common reason for permit rejection under a system which is allegedly designed to encourage immigration…let it never be said the Government of Canada doesn’t have a sense of humor).  And the idea that the system is going to incentivize sending more graduates to big cities instead of keeping them in smaller towns where lower-than-median incomes prevail is not likely to find favor.

Let’s turn now to the seven indicators to be populated via institutionally-supplied data.  Every ranker knows you should try to avoid over-reliance on this kind of data because institutions will try hard to twist and torque the data.  Frankly, what trust can an institution have that other institutions aren’t doing the same?

Indicator: One-year retention rates (multi-year programs only)
Relevance: There are two ways to get high retention rates: teach well, or mark generously.  Which do you want to encourage?
Bias/gaming: Possible for international students to be graded more loosely than at present.

Indicator: On-time program completion
Relevance: See above, but also: why punish institutions where students switch programs because of interest/aptitude?
Bias/gaming: International students might be pressured not to switch programs.

Indicator: Percentage of a DLI’s total revenue derived from international funding
Relevance: The implication here, I think, is that “too much” is bad.  But that means the institutions with the most foreign students will ipso facto be the least trusted on this measure, and conversely, those with the fewest international students will be the most trusted.  This does not make sense.
Bias/gaming: Bizarrely, there is a bias towards institutions that have a tiny number of international students.

Indicator: Dollar value and percentage of total scholarships provided to international students who come from one of the 46 countries on the UN Least Developed Countries list
Relevance: This is basically encouraging institutions to give as much money as possible to students from the precise list of countries with the highest student visa refusal rates (indicator 1).  Also, is this one indicator or two?  Also, also: how does this make an institution trustworthy or not?
Bias/gaming: Only about 5% of international students currently hail from these countries, of whom Bangladeshis make up a third.  With such a low base, it will be easy for institutions to game this figure by switching more funding over to this five percent.

Indicator: Funding for “targeted” international student supports
Relevance: Most student services are not provided separately for international and domestic students, and there is no obvious reason why they should be.  So, what’s the point?
Bias/gaming: Institutions will be incentivized to provide two sets of identical services, one for domestic students and one for international students.  Just the kind of efficiency institutions need.

Indicator: Total number and percentage of international students living in housing administered by their institution
Relevance: On-campus housing is rarely the cheapest option for students, so this may well simply measure the percentage of wealthy international students at an institution.
Bias/gaming: Bias towards high-prestige institutions with a lot of residence beds (Maple League institutions will do well here).

Indicator: Average teacher-student ratio for the 10 courses with the highest numeric enrollment of international students
Relevance: The 10 courses with the highest numeric enrollment of international students will be a lot bigger at big schools than at small ones.  This will look very much like a proxy for size, with the implication that size = bad.
Bias/gaming: Smaller universities and colleges should benefit, but since IRCC did not define the word “teacher”, I could easily imagine large universities gaming this one by claiming all TAs are “teachers”.

Let me be blunt: this is not a good rankings exercise.  Two-thirds of the IRCC-held indicators measure things which are not really in the control of the institution and seem to be included for the sake of having more indicators.  The survey-based indicators either directly contradict the IRCC-held indicators, are open to gaming, or rely on contestable definitions of what makes an institution “good”, and additionally incentivize bad behavior.  Some indicators have a clear bias towards certain types of institution (the direction of the bias switches indicator-by-indicator, suggesting the ranking’s authors don’t have a coherent sense of which institutions are good and which are bad), while others have a bias towards certain geographic areas. 

I understand why IRCC felt the need to do this.  If you’re going to do something legally dubious like discriminate between DLIs, it helps to develop a system that looks accurate and impartial.  This system probably impresses some people as just that because it requires so much data (Canadians are easily impressed by data because we have so little of it).  But simply piling data in a mound without a clear conception of what “quality” means doesn’t magically create accuracy and impartiality.  Rather, it creates a conceptual nightmare.  Which, to be fair, is to be expected if you ask any federal agency, let alone the immigration department, to determine institutional quality in education.

My view?  Stop with the data fishing and bring it down to a single, easily understood indicator.  Of the existing 13 indicators, visa approval rates is probably the most valid one, but unfortunately adopting it as-is would explicitly be saying that good institutions are the ones that don’t recruit in poor countries.  To correct for this, I suggest measuring institutional visa approval rates, normalized for the countries from which the students apply.  Compared to what is on offer, it is simpler, faster to deliver, more intellectually coherent, and substantially less prone to incentivize institutions to do silly things.
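To make the normalization concrete, here is one way it could be computed. This is my own illustrative sketch, not anything IRCC has proposed: compare an institution’s approval rate for applicants from each country against the national approval rate for that country, then average the ratios, weighted by the institution’s applicant mix.

```python
# Illustrative, assumed method: a country-normalized approval rate.
# A score of 1.0 means the institution's applicants are approved at exactly
# the national rate for their countries of origin; above 1.0 means it
# outperforms that baseline, regardless of where it recruits.

def normalized_approval_rate(inst_apps, inst_approvals, national_rates):
    """All three dicts are keyed by country of origin."""
    total_apps = sum(inst_apps.values())
    score = 0.0
    for country, apps in inst_apps.items():
        inst_rate = inst_approvals[country] / apps
        # Ratio of the institution's rate to the national rate for
        # applicants from this country, weighted by applicant share.
        score += (apps / total_apps) * (inst_rate / national_rates[country])
    return score
```

Under a measure like this, an institution recruiting heavily in low-approval countries is judged against the baseline for those countries rather than punished for its recruitment geography.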

You’re welcome.

*Note: I am using “visa” as shorthand to refer to study permits, which apply to students who may have a temporary resident visa or an eTA. Thank you to a previous commenter for this note.

2 responses to “Visa Caps ‘Lite’”

  1. From around 2012 Australia had streamlined visa processing for applicants to study at ‘low risk’ institutions:

    ‘In order to be invited to participate in streamlined visa processing arrangements, education providers must be assessed and found to be of low immigration risk.

    ‘An education provider’s immigration risk level is calculated by assessing data held in departmental systems relating to prospective and actual international students associated with that provider. This includes data about student visa refusals, including fraud, and compliance with visa conditions relating to student visa applicants intending to study at the provider’s institution, as well as actual students who have visas linked to that institution.’

    This was contentious and I think several providers appealed their categorisation.

    From around 2015 the Government replaced this with its simplified student visa framework:

    ‘We use the combined evidence level outcomes of the student’s education provider and country of citizenship to guide whether the student needs to provide evidence of financial and English language capacity with their student visa application.’

    The department compiles a report on each institution which is available only to people authorised by the institution.

    https://immi.homeaffairs.gov.au/what-we-do/education-program/what-we-do/simplified-student-visa-framework

  2. IRCC needs to look beyond mere approval rates, although I agree that should be a heavily weighted indicator. That is because Canada’s ISP (at least at policy level) is not built solely on a “bums in seats” ($) mentality. Rather, Canada’s ISP is guided too by other factors, such as filling labour market gaps, immigrant retention, and the benefits of cross-cultural learning. These make some of the criteria you seem dismissive of REALLY important. I won’t go on, because I find most of your work terrific, but I am suggesting this piece would have been more productive from a more neutral lens.
