The Trouble with Economic Impact Statements

Every once in a while, universities in Ontario all decide to get on a bandwagon and do the same thing at the same time.  This winter, the bandwagon seems to be economic impact statements.  I imagine that somewhere, Presidents or Provosts throughout the province came to the conclusion that under a Conservative government it might be necessary to “prove their worth” in economic terms, and so, lo and behold, a bunch of institutions (as of Friday) have issued identical calls for proposals asking consultants to help them draft economic impact statements, which both quantify their impact on the local and provincial economies in some very precise ways and put numbers on “intangible benefits” like lower crime rates or enriching the local community.

There are only two problems with this.  The first is that all university economic impact statements are based on some highly questionable assumptions and all deliver the same results because they are based on the same algorithms.  The second is that this Conservative government cares nothing for this kind of bean counting.

Let’s start with the questionable assumptions part.  To look at institutional economic impact, all these exercises work the same way.  You take total institutional expenditures.  Depending on the type of expenditure (salaries, capital, etc.), you plug in a Keynesian multiplier, which Statistics Canada helpfully produces for every province and industry in its Input-Output tables.  (Some use the Sudmant method, which applies a consistent multiplier across categories.)  You multiply.  Voila!  Impact!  Two obvious problems: first, all institutions will necessarily have exactly the same impact relative to one another, because they are all multiplying expenditures by the same Statscan-derived numbers (there might be a little variation with respect to research commercialization, but in the big picture that’s minor).  And second, the whole exercise rests on the assumption that costs are benefits, which is invalid.  The real question one should want answered – is public expenditure here, at this institution or in this sector, a better investment than the others a government might care to make? – simply cannot be answered this way.
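
Mechanically, the exercise really is that simple.  Here is a sketch of the expenditure-times-multiplier arithmetic, with made-up dollar figures and multipliers standing in for actual Statscan Input-Output values:

```python
# Sketch of the standard expenditure-times-multiplier exercise.
# All figures and multipliers are illustrative, NOT real Statscan values.

expenditures = {            # institutional spending by category, in $M
    "salaries": 600.0,
    "capital": 150.0,
    "other_operating": 250.0,
}

multipliers = {             # hypothetical Keynesian multipliers by category
    "salaries": 1.8,
    "capital": 1.5,
    "other_operating": 1.6,
}

def economic_impact(expenditures, multipliers):
    """Total claimed 'impact' = sum of spending x category multiplier."""
    return sum(expenditures[cat] * multipliers[cat] for cat in expenditures)

impact = economic_impact(expenditures, multipliers)
print(f"Claimed economic impact: ${impact:,.0f}M")  # spending in, "impact" out
```

Note what the sketch makes obvious: the only institution-specific input is the spending itself, so bigger budgets mechanically produce bigger “impact.”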

Another metric institutions tend to put into these things is “look how much money our graduates contribute to the economy!”.  This is calculated simply: a) take the average income of graduates (if you have a sample of alumni, survey them and use that number; otherwise, use Statscan census data and assume your university is like everyone else’s); b) take the average income of high-school graduates, again from Statscan.  Subtract b from a, multiply by the number of graduates in the region, and voila!  Economic impact!  Multiply by a suitable marginal tax rate and you can talk about “public returns via increased tax base”, too.  The idea that people who go to university might have been systematically more likely to earn higher salaries anyway must be rigorously ignored here, both because it’s impossible to quantify and because it would shrink a figure which institutions would prefer to be as large as possible.
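
The “graduate premium” arithmetic, sketched out with placeholder numbers (none of these are real census figures):

```python
# Sketch of the graduate-premium calculation described above.
# Every input is an illustrative placeholder, not real Statscan data.

avg_income_grads = 75_000      # a) average income of university graduates
avg_income_hs = 50_000         # b) average income of high-school graduates
n_grads_in_region = 100_000    # alumni living in the region
marginal_tax_rate = 0.30       # assumed combined marginal tax rate

premium_per_grad = avg_income_grads - avg_income_hs   # subtract b from a
annual_impact = premium_per_grad * n_grads_in_region  # "economic impact!"
tax_return = annual_impact * marginal_tax_rate        # "public return"

print(f"Graduate earnings impact: ${annual_impact / 1e9:.1f}B per year")
print(f"Public return via taxes:  ${tax_return / 1e9:.2f}B per year")
```

The selection-effect problem lives in `premium_per_grad`: the whole difference is attributed to the degree, none of it to who enrols in the first place.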

If you want to get fancy (as the Ontario presidents apparently do), you do things like “measuring intangibles”.  Do an alumni/student/staff survey and see how much they volunteer.  Extrapolate the results to all alumni/students/staff, maybe even attach a dollar figure to the number of hours you get.  Attribute this outcome to “the university”.  Or, here’s another (from the current RFPs): attribute a dollar figure to the value of lower incarceration rates of graduates.  This can really only be done by a) looking at the costs of incarceration, b) looking at incarceration rates by level of education (data which I am fairly sure does not exist in Canada), and then c) attributing all the difference in incarceration rates to the university and applying it to the entirety of the institution’s alumni population.
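
The volunteering version of the “intangibles” arithmetic looks something like this — again, every input is a hypothetical placeholder:

```python
# Sketch of the "intangibles" exercise: dollar-valuing volunteer hours.
# All inputs are hypothetical placeholders.

avg_volunteer_hours = 40     # hours/year reported by survey respondents
total_alumni = 150_000       # population the survey average is extrapolated to
hourly_value = 27.0          # assumed dollar value of one volunteer hour

# Extrapolate the survey average to the entire alumni population...
total_hours = avg_volunteer_hours * total_alumni
# ...and attribute the whole dollar figure to "the university".
volunteer_value = total_hours * hourly_value

print(f"Claimed volunteering impact: ${volunteer_value / 1e6:.0f}M per year")
```

Two heroic assumptions are baked in: that survey respondents are representative of all alumni, and that none of them would have volunteered without the degree.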

Now, all of this may strike you as fairly useless.  It certainly strikes me that way, and I’ve done a few of these for institutions (a guy’s got to make a living).  It may even strike institutions as useless, I don’t know: it may simply be that they think that, as useless as these exercises are, they are convincing to non-specialists and hence can be a valuable public or government relations tool.  They may think this is silly but nevertheless “one has to do it”.

If that were true, I’d probably say “guys, vaya con dios.”  The problem is I think it’s not.  The idea that a flawed bean-counting approach to proving institutional value is going to be useful in dealing with a Conservative government skeptical of such claims doesn’t ring true to me.  Partially because these people aren’t dumb: there’s a definite risk that as soon as they get to the part which says costs are actually benefits, they’re going to start looking at institutions skeptically and wondering whether they are being taken for fools.  But also partially because today’s Tories are not yesterday’s Tories.  Their focused bean-counting approach disappeared with Ernie Eves.  Our current premier earned his chops by arguing for what was by far the least fiscally responsible subway plan for Toronto.  Formal cost-benefit ratios be damned: if they like you, you can spend.  If they don’t, you’re toast.

As a result, I would argue very strongly that while institutions need to talk about their impact in ways that Conservatives can appreciate, this “express-everything-in-spuriously-exact-dollar-figures” approach is almost certainly the wrong way to go.  There are better ways to achieve this, which I will explain tomorrow.
