If there is one thing that drives me spare about Canadian universities’ and colleges’ government relations operations, it’s their obsession with economic impact studies, and their habit of wasting tens of thousands of dollars every couple of years to do new ones. You all need to stop. It’s not just because no one believes any of the data (or rather, the people at whom these are aimed fully understand that since no opportunity cost analysis ever accompanies these studies, they can say nothing about how investments in PSE compare to investments in any other field); it’s that they can be done cheaply by just about anyone. Today, I will show you exactly how to produce one.
Basically, impact studies look at four things: i) Institutions’ economic footprints, ii) the impact of research, iii) the impact of student spending, iv) the graduate “premium”. Let’s look at each of these in turn.
Institutions’ economic footprint. This is just total institutional spending times some kind of Keynesian multiplier. The formal mathematical model here is just A*B, where A is the sum of institutional expenditures (or “direct spending”) and B is a multiplier, specific to the province and the educational sector, which can be used to calculate the indirect and induced impact on GDP, as well as estimates of the jobs created by direct, indirect and induced spending.
“A” is obtainable either by consulting institutions’ annual financial statements or – for universities at least – by obtaining data on total institutional expenditures from the Statscan/CAUBO national FIUC survey here. With respect to “B”, different consulting companies will say they have fancy methodologies to obtain a “correct” set of multipliers, which will almost invariably end up being somewhere between 1.5 and 2.0. These companies will never reveal the mathematics behind how they arrive at these numbers, leaving you, the client, with no way to compare the validity of different approaches. And yet this multiplier is the only thing that creates variation between estimates.
There is, however, an easily available open-source way of obtaining this data. Statistics Canada regularly updates input-output multipliers for all industries, by province, on its website. With a few clicks, you can see what the multipliers are for GDP (direct, indirect and induced) as well as for jobs created. Just apply those multipliers to total institutional expenditures, and you are good to go. The resulting estimates are no more or less accurate than what consulting firms will give you.
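To show just how little work this is, here is a minimal sketch of the footprint calculation in Python. The dollar figure and multiplier below are invented placeholders, not real data; plug in your own FIUC number and your province’s Statscan multiplier.

```python
def economic_footprint(direct_spending: float, multiplier: float) -> float:
    """A*B: total institutional expenditures times a Statscan
    input-output multiplier for the province and sector."""
    return direct_spending * multiplier

# Placeholder example: $500M in direct spending, a multiplier of 1.6
impact = economic_footprint(500_000_000, 1.6)
print(f"Estimated GDP impact: ${impact:,.0f}")  # → Estimated GDP impact: $800,000,000
```

That’s the entire “model” a consulting firm would charge you for; only the multiplier changes.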
The Impact of Research. The standard way of measuring the economic impact of research in Canada is based on a methodology developed in 1998 by Fernand Martin and Marc Trudeau. The algorithm is just C*D*E*F*G, where “C” is total GDP growth over a certain number of years, “D” is the portion of GDP growth attributable to Total Factor Productivity (TFP) growth, “E” is the proportion of TFP growth attributable to domestic factors, “F” is the proportion of domestic (i.e. provincial) R&D that is attributable to post-secondary institutions in general, and “G” is the proportion of research in the post-secondary sector carried out by a given institution.
How to calculate C is a matter of some debate in the small circles where this stuff gets discussed. Based on the original Martin/Trudeau methodology, it tends to be calculated back to 1971. Of course, this creates problems, because if you are trying to develop a cost-benefit ratio, costs tend to be annualized, and comparing one year’s worth of costs with several decades’ worth of benefits can look ridiculous. This is a matter of choice, however. Pick whatever number of years you want, just try not to look ridiculous.
The portion of GDP growth attributable to TFP growth – that is, D – is not easy to determine because it involves complex calculations and measurements. But consultants hate working as much as the next person, so they cheat by re-using the 20% figure used by Martin & Trudeau, who in turn plucked it from an OECD paper from the mid-90s. It’s really unlikely this number is still true (here’s a recent study suggesting that the number for Canada has declined to 13% in recent years). Either number is probably fine. Just pick one and stick with it.
E is even harder to calculate, so again researchers tend to recycle the number Martin used 22 years ago, which is 69%. You should do the same. F, the proportion of provincial R&D performed in universities, is available from Statistics Canada. G, the proportion of research spending done at a given institution, can be worked out from the same source as A (see above).
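The whole Martin/Trudeau chain is just five numbers multiplied together. A sketch, where every input is a placeholder: the 20% and 69% are the recycled values discussed above, and the GDP growth, PSE share and institutional share figures are made up for illustration.

```python
def research_impact(gdp_growth, tfp_share, domestic_share,
                    pse_share, institution_share):
    """C*D*E*F*G: GDP growth, narrowed step by step down to one
    institution's slice of provincial post-secondary R&D."""
    return (gdp_growth * tfp_share * domestic_share
            * pse_share * institution_share)

# Placeholders: $100B GDP growth (C), 20% TFP share (D), 69% domestic (E),
# 40% of provincial R&D in PSE (F, invented), 10% at this institution (G)
impact = research_impact(100e9, 0.20, 0.69, 0.40, 0.10)  # ≈ $552M
```

Notice that swapping D from 20% to 13% changes the answer by a third, which tells you how sensitive these “impacts” are to one recycled parameter.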
The value of out-of-province students. Most companies make a pretty simple calculation: H*I, where H is the number of out-of-province students (including international students), which is available from your Institutional Research office, and I is an estimate of what they spend while at a given institution, including tuition. Technically, consultants should probably commission a survey of students’ personal expenditures and look at the data for out-of-province/international students to come up with this, but sometimes they also cheat and just multiply out-of-province enrolment by a spending estimate given the imprimatur of authority by having been published by someone else (e.g. this one done by Climacs Economics for the Government of Canada). Or just use the spending estimates your student financial aid office uses to determine need for institutional aid.
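In code, this piece is a one-liner. The enrolment and per-student spending figures below are invented for illustration; substitute your Institutional Research office’s headcount and your financial aid office’s spending estimate.

```python
def student_spending_impact(out_of_province_students: int,
                            avg_annual_spend: float) -> float:
    """H*I: out-of-province/international headcount times an estimate
    of what each student spends per year, tuition included."""
    return out_of_province_students * avg_annual_spend

# Placeholder example: 6,000 students spending $25,000 a year each
impact = student_spending_impact(6_000, 25_000)  # 150,000,000
```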
In some variations of this calculation, consultants might also add the value of visitors to institutions. The calculation is simply H*J*K, where H is the number of out-of-province/international students (as above), J is the number of person-days of visits to the campus associated with each student, and K is the average daily spend of tourists to the area. K is usually derived from surveys conducted for and published by the local tourist board, and J is a totally arbitrary number. In most Canadian institutional studies, the number is 2, but some institutions use numbers as high as 8 (why 8? ¯\_(ツ)_/¯…it’s arbitrary, remember?)
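The visitor add-on, sketched with the same invented 6,000-student headcount, the customary 2 person-days of visits, and a made-up $150/day tourist spend (use your local tourist board’s figure instead):

```python
def visitor_spending_impact(students: int, visit_days_per_student: float,
                            daily_spend: float) -> float:
    """H*J*K: students, times person-days of visits each (the
    arbitrary bit), times the average daily tourist spend."""
    return students * visit_days_per_student * daily_spend

# Placeholders: 6,000 students, 2 person-days of visits, $150/day
impact = visitor_spending_impact(6_000, 2, 150)  # 1,800,000
```

Note that changing J from 2 to 8 quadruples this component, which is exactly why some institutions do it.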
The Value of Added Graduate Income. Estimating graduate “premiums” is trickier than it sounds. No one can actually know what the value of a university education is, because it’s nearly impossible to generate a counter-factual. No one seriously believes that someone with a degree would be making the average wage of high school graduates if they had not gone to university: presumably, the person’s ability to enter and finish university indicates a set of characteristics which probably would have driven them to higher incomes in any case. And yet, this is precisely what every economic impact study claims, because no one knows how else to calculate the premium.
And so the way the premium is usually calculated is L*(M-N), where L is the number of graduates still living in-province (I’m assuming here that the target of the report is the provincial government, but you can substitute a different geographic boundary if you choose), M is the average income of individuals with a university/college (pick as appropriate) education in your province, and N is the average income of individuals with a different baseline of education (for colleges this is always high school graduates, for universities it can be either high school or college graduates). M and N are both available from Statistics Canada. L is hard to know for sure, but your institutional alumni offices will have a pretty good idea, so ask them.
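As a sketch, with every figure invented for illustration (100,000 in-province graduates, a $70,000 average graduate income against a $50,000 baseline):

```python
def graduate_premium(grads_in_province: int, grad_income: float,
                     baseline_income: float) -> float:
    """L*(M-N): in-province graduates times the income gap between
    them and the baseline education group."""
    return grads_in_province * (grad_income - baseline_income)

# Placeholders: 100,000 grads, $70,000 income vs $50,000 baseline
impact = graduate_premium(100_000, 70_000, 50_000)  # 2,000,000,000
```

This is usually the biggest single component of the Big Number, which is worth remembering given how shaky the counter-factual behind it is.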
Putting it all Together
An institution’s “Big Number” for economic impact is (A*B)+(C*D*E*F*G)+(H*I)+(H*J*K)+(L*(M-N)). And I’ve just shown you how to find/generate all 14 of those numbers in the comfort of your own home. Seriously, it takes an afternoon to do, plus whatever time it takes for the institutional research, alumni and financial aid offices to get back to you. The main thing big consulting firms contribute is an allegedly more accurate “B” (i.e. the “multiplier”) and fine infographics. But you know, the Statscan numbers are pretty good.
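The whole study, then, is one function of fourteen inputs. Here it is end to end, with every figure an invented placeholder so you can see the shape of the thing (the letters match the definitions above):

```python
def big_number(A, B, C, D, E, F, G, H, I, J, K, L, M, N):
    """(A*B) + (C*D*E*F*G) + (H*I) + (H*J*K) + (L*(M-N))"""
    return (A*B) + (C*D*E*F*G) + (H*I) + (H*J*K) + (L*(M-N))

# Every figure below is an invented placeholder, not real data
total = big_number(
    A=500_000_000, B=1.6,                      # economic footprint
    C=100e9, D=0.20, E=0.69, F=0.40, G=0.10,   # research impact
    H=6_000, I=25_000,                         # student spending
    J=2, K=150,                                # campus visitors
    L=100_000, M=70_000, N=50_000,             # graduate premium
)
print(f"Big Number: ${total:,.0f}")  # → Big Number: $3,503,800,000
```

An afternoon’s work, as promised; the infographics are up to you.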
So please, folks, I’m begging you. Stop burning money on this kind of impact work. Save yourself some money by following this DIY calculation and use the difference to tell more interesting stories instead, ones that aren’t just exercises in competitive multiplication, but actually tell the story of how institutions make communities flourish. You won’t regret it.