Better Economic Impact Statements

Yesterday I talked about how disappointing/not fit for purpose university and college economic impact statements tend to be.  Today I want to talk about how to make them better.  

Let me start off by rejecting two obvious alternatives.  The first is what one might call the “Look!  Shiny Things!” school of impact statements, which consists mainly of feeding people exciting anecdotes.  For example: “hey look what our one great scientist did!”  “Look at what our small sample of high-performing alumni did!”  Etc.  I can understand why people take this route – it’s basically how Universities Canada and the U15 sell research/innovation in Ottawa and it seems to work despite its essential vacuity – but on the whole it’s not a smart idea.  Provincial politicians, unlike federal ones, actually know about the institutions they are funding and are less likely to buy this kind of superficial approach.

The second alternative is what one might call the “let’s-not-measure-anything-let’s-just-ignore-calls-for-quantification” argument, which for the most part gets pushed by the “proving-relevance-means-the-barbarians-have-won” crowd.  In a word, no.  Institutions do need to show governments – that is, the people on whose funding they rely – that they are contributing to the overall welfare of society in tangible ways.  The question is how, not if.

Let’s start with style.  As I said yesterday, it’s not just that the quant approach to describing impact is methodologically suspect; I don’t think it’s convincing, either.  Governments (this Tory government in particular, but I think it’s true of most governments) don’t care for the quant approach.  What they want is a narrative approach.  What matters is not whether an institution can torque some basically ludicrous figure about the institution’s impact on GDP; what matters is whether you can tell a story – a truthful story, an evidence-based story, but a story nonetheless – about how it supports the society in which it is embedded.

It’s the difference between saying “90% of our graduates got jobs within six months” and saying “the local hospital can’t work without us: 80% of its nurses came from our nursing school”.  It’s the difference between saying “higher education makes for a more innovative population” and saying: “seven of the ten fastest-growing businesses in our region came through our institution and here is what they have to say about how their time here changed them”.  It’s the difference between saying “we bring in $X million in research dollars and got 10 patents last year” and “let me introduce you to the businesses across the province which use technology developed at our institution.”

Now I admit, this is tougher to do than the usual technique of multiplying total budget by some Keynesian multiplier and saying “hey presto!  benefits!”.  It requires institutions to stay in touch with and track graduates more actively than they currently do.  It requires institutions to have contact with the HR departments of major local employers (e.g. hospitals, school boards, the city) so that they always know how many of their graduates are being hired there (and, ideally, keep an eye on how well they are meeting employer expectations).  It means tracking research partnerships with communities and businesses not just for the purpose of recording income from partnerships, but also for what happens afterward and how the knowledge generated in a project gets used.  None of this is easy.  But the results are things that actually describe the impact of an institution in a clear, comprehensive, narrative way.

Of course, this isn’t an either/or.  If you really think you need those methodologically-iffy bean-counting exercises, if only to have something in your back pocket should anyone ask, then go ahead and do them.  Can’t hurt, I guess.  But my advice is: stop thinking like technocrats, start thinking like politicians.  There are better ways to truthfully explain the impact of universities than (dubiously) talking the language of GDP impacts in net-present-value terms.  You can be evidence-driven, but still talk about impacts in ways that are not only more attractive, but actually capture the real value of higher education much better than accounting language ever could.

(On the off chance anyone wants to know more about how to do this, y’all know how to reach me.)


3 responses to “Better Economic Impact Statements”

  1. 100% agree with the need to have better evidence of the impact universities are making, be it from teaching/learning, student experience, alumni, or research portfolios. Research Impact Canada (www.researchimpact.ca) is a network of 16 universities developing and sharing methods to maximize the impacts of research. We have developed and are piloting a tool to more consistently capture (using a semi-structured interview guide when interviewing research stakeholders) and communicate (via a structured case study) the impacts of research in the real world. We need to move beyond citations/bibliometrics as proxies of impact and collect the evidence of impact where it is felt: in community, gov’t, industry, etc. And we need to move beyond anecdotes to substantiated evidence. We described this work in a post (http://researchimpact.ca/watching-impact-in-the-ref-and-how-it-informs-the-canadian-context-le-ref-en-observation-comment-limpact-sy-manifeste-et-son-influence-sur-la-situation-canadienne/). We have done three pilots at York U, and other members of Research Impact Canada are testing it.

  2. Part of me agrees with you, but I tend to think that the more specific the economic impacts which can be described, the more governments will be tempted to shut down anything not obviously contributing to the economy.

    The local hospital relies on nursing graduates? Great! Shut down all but the nursing school! The business school spun out ten new businesses? Great! Keep it too, but turn the rest of the campus into condos.

    None of this is a good reason to (say) study Tolstoy or Byzantine art history or particle physics or undersea steam vents. None of this, in other words, is a defence of the sort of open-ended research that defines universities as such.

  3. Hi Alex…thanks for your two posts about the convenient fictions in higher ed impact statements.

    I did some work in 2017 and 2018 for the BC Association of Institutes and Universities, exploring how the models of Applied Research and Innovation in European Universities of Applied Sciences could be adapted to their context as teaching-intensive, region-serving institutions. The European institutions are about 10 years further along in their evolution than our university colleges and polytechnics: they were given their broader university mission around 1998, versus 2008 in BC.

    I was intrigued by some of the recent advances in measuring impact from applied research and innovation, including developing capability for these activities in the partner organizations (corporate, SME, public sector, NFP social sector). Another development of interest was the ethnographic methods being created to track the process of interaction and knowledge co-creation amongst the academic and workplace partners.

    Happy to chat about this further, or to receive suggestions for updates to this work since 2018.
