Institutional Economic Impact Statements Part 2

Yesterday, we looked at how Economic Impact Statements are put together.  Today, we want to look at the uses and misuses of these statements.

Let’s start by acknowledging that these statements are not primarily designed to be objective, academic analyses of impact.  Rather, they are political documents, meant to put an institution in a good light.  There’s nothing wrong with that, but it means they need to be read with a certain eye.  Given the built-in incentive to exaggerate the figures, it’s important to know which exaggerations are honest and which are chicanery.

The first honest exaggeration involves the economic footprint of the institution: specifically, the idea that “benefits” are somehow equivalent to costs times some multiplier (we’ll get back to multipliers in a second), which can then also be converted into a number of jobs created.  It’s not that this number is wrong, exactly: it’s a pretty standard Keynesian way of describing the impact of an institution on a local economy.  The issue is that pretty much every university has more or less the same effect.  Provided you are using the same multiplier (which, really, you should be), saying that University X has a $1 billion impact on the economy and University Y has a $500 million impact does not mean one is twice as beneficial as the other; it just means one has twice the costs of the other.  And, of course, it says nothing about opportunity costs (i.e., what benefits similar investments in other areas might bring).  But since everyone does exactly the same thing (not just in education, but in other fields as well), this is what I call one of the honest exaggerations.
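
To make the arithmetic concrete, here is a minimal sketch of the standard footprint calculation.  The institution names, spending figures, and jobs-per-dollar ratio below are all hypothetical; the only number taken from the genre itself is the commonly used 1.5 multiplier discussed further down.

```python
# Minimal sketch of the standard "economic footprint" arithmetic.
# All figures are hypothetical; 1.5 is the commonly used multiplier.

MULTIPLIER = 1.5        # the same for every institution, by assumption
JOBS_PER_MILLION = 10   # illustrative jobs "created" per $1M of output

def economic_impact(spending_millions: float) -> tuple[float, float]:
    """Return (output impact in $M, jobs 'created') for a given spend."""
    impact = spending_millions * MULTIPLIER
    return impact, impact * JOBS_PER_MILLION

# University X spends roughly twice what University Y does...
for name, spending in [("University X", 667), ("University Y", 333)]:
    impact, jobs = economic_impact(spending)
    print(f"{name}: ${spending}M spent -> ${impact:,.0f}M impact, {jobs:,.0f} jobs")

# ...so its headline "impact" is exactly twice as large. The ratio of
# the two numbers reflects relative costs, not relative benefits.
```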

It’s the same thing, broadly speaking, with trying to measure graduate earnings premiums.  There is certainly room for both chicanery and error in terms of how you come up with a number which represents average graduate earnings.  But calculating the premium – that is, the portion of those earnings that you can then claim as your institution’s “value added” – tends to be handled in a similar manner everywhere: just subtract the average amount earned by high school graduates.  This is preposterous, because high school graduates aren’t exactly an equivalent control group (selection effects matter).  But again, no one really knows how to calculate a better counterfactual, and since everyone uses the same practice, it doesn’t give anyone an advantage, per se.
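
For what it’s worth, the whole calculation fits in a couple of lines.  A minimal sketch, with entirely hypothetical earnings figures:

```python
# The standard graduate earnings premium: average graduate earnings
# minus average high school graduate earnings. Figures are hypothetical.

avg_graduate_earnings = 68_000      # institutional estimate (survey or tax data)
avg_high_school_earnings = 45_000   # population-wide benchmark

premium = avg_graduate_earnings - avg_high_school_earnings
print(f"Claimed annual 'value added' per graduate: ${premium:,}")  # $23,000

# The catch: high school graduates are not an equivalent control group,
# so part of this gap (selection effects) would exist without the degree.
```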

The problem, rather, is when institutions engage in what one might call “competitive counting” or “competitive multiplication”.  Here are a few examples.

  1. Using bigger multipliers. As noted yesterday, there is some legitimate debate about what the “correct” multiplier for institutional spending is. But at most Canadian institutions it is a number close to 1.5, and indeed a number of institutions simply use the figure 1.5 to simplify matters. But occasionally, someone substitutes a different number. Trent University, for instance, uses 2.0 as the multiplier, with no rationale given. It certainly makes the institution look better, but it’s not actually a reflection of better performance. This is maybe the purest example of “competitive multiplication”.
  2. Exaggerating student expenditures (1). Dollars spent by international and out-of-province students are certainly attributable to the institution, because without the institution, those dollars would likely not be spent in-province. But what about local or in-province students, who for the most part would presumably be spending dollars at some other institution in the province if they didn’t attend the institution in question? We’d argue these shouldn’t be counted in the impact figures, but in Canada many institutions choose to do so anyway (e.g., Calgary, UBC). Kudos to those which do not (e.g., McMaster, Lakehead, Nipissing). This is definitely an example of competitive counting.
  3. Exaggerating student expenditures (2). There is also the question of how much spending one attributes to those students. These documents are usually pretty cagey about data sources, but just within Ontario, we’ve seen monthly off-campus expenditure estimates of anything from $1,131.42 at Brock, which is probably too low, to $1,949.27 at St. Lawrence College, which is probably too high (and yes, it is spurious precision to estimate these numbers to the penny). These probably reflect some kind of survey data collection effort. But the variation in Ontario is nothing like what you see in Australia, where the University of Wollongong pegs international student spending at about A$2,630, while the Group of 8 (i.e., the big research universities) assume monthly non-tuition spending of A$7,250, and then set the multiplier on that spending at a preposterous 3.14 because, who knows, maybe they like π or something.
  4. Exaggerating visitor numbers. Most economic impact statements like to talk about the economic impact of visitors to campus. These figures are almost without exception based on fictional numbers, because nobody counts visitors. After all, practically speaking, how could you? If you’ve got a decent museum or gallery on campus (e.g., UBC), you can potentially do something around ticket sales. But the number of people who come to visit your students? It’s entirely based on assumptions. Some universities assume 2 visitors per student, others assume 8. Brandon University, for some reason, uses a total number of visitors that works out to 23.6 per student. The fancier ones differentiate between visits per international student and visits per domestic student, but again, it’s all competitive multiplication. Calgary seems to imply that everyone who comes to watch a Stampeders game is technically a U of C visitor (which is sort of true, because U of C operates McMahon Stadium, but come on).
  5. Exaggerating visitor revenues. Now we’re into real flight-of-fancy territory. If you don’t know how many visitors you actually have, you have no way of sampling them to find out how much a visitor spends on an average visit. So normally what people do is just head over to the local tourist board to find out how much an average visitor spends. And voilà! Multiply this number by the (largely fictional) number of visitors and there you go: tourism impact galore. (The sketch below this list shows how quickly these assumptions compound.)
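
As promised above, here is a minimal sketch of how the visitor assumptions compound.  The enrolment, per-visit spending, and multiplier are hypothetical stand-ins; the 2, 8, and 23.6 visitors-per-student figures are the ones cited in the list:

```python
# Sketch of the visitor-impact arithmetic. Every input is an assumption
# of the kind these reports make; none of them is actually measured.

students = 20_000        # hypothetical enrolment
spend_per_visit = 350    # hypothetical average, borrowed from a tourist board
multiplier = 1.5         # and then a spending multiplier on top

# Reports variously assume 2, 8, or (in one case) 23.6 visitors per student.
for visitors_per_student in (2, 8, 23.6):
    visitors = students * visitors_per_student
    tourism_impact = visitors * spend_per_visit * multiplier
    print(f"{visitors_per_student:>5} visitors/student -> "
          f"${tourism_impact / 1e6:,.1f}M of 'tourism impact'")

# The same campus produces $21M or $248M of "impact" depending purely on
# which unverifiable visitors-per-student assumption gets picked.
```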

We could go on, but you get the idea.  Not all universities have the same relative impact: genuinely, some are better than others at generating it.  But given the standard methodologies used by the people who do economic impact assessments, the only obvious way to show an outsized impact, controlling for income, is through competitive counting and competitive multiplication.  And that probably gives the whole genre a bad name.

There are, however, better ways to talk about impact.  Ways that focus less on dollars and cents and more on how institutions are embedded in communities and the paths by which benefits flow from institutions to society.  We’ve developed some innovative methods for this; give HESA Towers a shout if you’re interested in learning more.

One response to “Institutional Economic Impact Statements Part 2”

  1. The reports on economic impact studies and, particularly, multipliers remind me of discussions in which I took part in Ontario’s higher education sector a couple of years ago, as part of the then government’s planning to overhaul formula funding programs and introduce performance funding for colleges and universities. One set of discussions, recognizing that “northern, rural, and small” colleges and universities were very expensive in terms of per-student funding, and that, nonetheless, de-population and job loss persisted, led to consideration of alternate institutional models in other jurisdictions, mainly Scotland and Sweden. The other was a competition for CARDF research grants, which was re-allocated because too few “northern, rural, and small” colleges would have received grants. In both cases the policy assumption was that the “northern, rural, and small” colleges and universities were “key drivers in economic development and job creation.” In other words, there was an assumption that economic impact studies and Statistics Canada’s multiplier reports (and maybe Walter Sudmant’s) proved the point: spending by and on colleges and universities always pays the best dividend in terms of GDP growth and job formation. Case closed.
    Institutional economic impact studies are, as HESA points out, prepared mainly to advance public relations and institutional images. They are often analytically idiosyncratic and casual about what multipliers are, how they should be used, and how Statistics Canada multiplier data are constructed and displayed. Hence the simple Sudmant Method – “just multiply by 1.5” – is perhaps good enough. But does that hold true if the purpose of the impact studies goes beyond burnishing institutional reputations and extends to public policy, for example the performance funding discussions cited above?
    It might surprise some readers that Statistics Canada calculates separate multipliers for economic growth and job growth, for schools, colleges, and universities, and for several other “industry” sectors, for example, health services and defence. One point five is a summary composite for universities across Canada. The multipliers differ from province to province. StatsCan publishes provincial and national multipliers for 14 economic variables for 240 “industries” – including colleges and universities – within the Canadian economy. The data are derived from the supply-use tables, which detail the quantities in which each of the 240 industries supplies and uses 500 products and services. The dataset is not just a few values for a few sectors that vary a bit here and there. It is a rich dataset which contains matters of fact that have implications for public fiscal policy.
    Nationally, the most recent total output multiplier for colleges was 13 per cent higher than that for universities. The total output multipliers for defence and hospitals were higher still, 15 per cent above that of universities.
    Comparing total jobs multipliers, the differences are even more significant. Colleges and hospitals have total jobs multipliers 36 and 34 per cent higher than universities. The jobs multiplier for arts and recreation is comparable to the rates for colleges and higher than that for universities.
    If a provincial government were to expand public spending in, for example, northern Alberta or rural and northern Ontario, it would be poorly informed if it assumed that there were only a slight variance in multipliers between potential receiving industries. Out of context, 1.5 and 1.7 might seem to be in the same ballpark, but if a government were considering, say, $1B of new spending, that would represent a difference of $130M in output and 1,500 jobs. From the figures above, real differences in multipliers are even greater.
    Money is money. Jobs are jobs. The new performance funding model for Ontario (and in time, maybe, Alberta’s) sets economic impact metrics for colleges and universities that rely implicitly on job growth and expanded spending. Several college and university SMAs make promises in that direction. Yet there are rigorous data readily at hand that demonstrate superior alternatives – for example, expanding health care – to performance funding incentives for higher education in regions with declining population density, Pollyannaish enrolment and research targets, and unrealistic prospects for higher institutional spending on operations and wages, which is what multipliers multiply.
    If a provincial government were to expand public spending in, for example, northern Alberta or northern Ontario, the first thought should not be to establish a new college or university or prop up an existing one on the promise of a 1.5 multiplier. First, other public services might promise higher multipliers. Money is money. Jobs are jobs. Second, multipliers alone do not tell the whole story. Total economic impact depends on the volume of college and university spending, including wages. A university in a given region with expenditures of $50M has the same multiplier as a university that spends $500M. Adding or expanding a college or university in a region with declining population density and missed enrolment targets will not improve economic impact. Job growth might, but other sectors with higher multipliers would be a better policy choice.
    So is 1.5 good enough? Maybe for local public relations, but not for public fiscal policy.
