Progress Studies and Data Collection in Higher Education

One of the things that absolutely cheeses me off about higher education as a field is how little attention is paid to what might be called “progress studies”.  What are “progress studies”, you ask?  Well, let me turn things over to Patrick Collison and Tyler Cowen, who coined the phrase in an Atlantic article a few years ago.

Progress itself is understudied. By “progress,” we mean the combination of economic, technological, scientific, cultural, and organizational advancement that has transformed our lives and raised standards of living over the past couple of centuries. For a number of reasons, there is no broad-based intellectual movement focused on understanding the dynamics of progress, or targeting the deeper goal of speeding it up. We believe that it deserves a dedicated field of study. We suggest inaugurating the discipline of “Progress Studies”.

If there is one field that needs something similar, it is higher education.  And not just at an institutional level, but at a system level. 

There is of course a significant scholarly literature on higher education – Lord knows, if there’s anything academia likes talking about, it’s itself.  But in North America, a vanishingly small proportion of it is about the actual management of institutions.  You can find endless articles about students, student progress, student mental health, student belongingness, and more recently a similar literature on faculty (experiences of minority faculty, feelings of alienation in the “neo-liberal” university, and so on).  But about institutions?  Crickets.

There’s a reason for this, of course.  In North America, the study of higher education is for the most part an extension of sociology, which on this continent at least tends very much to focus on individuals (and in particular disadvantaged individuals) rather than institutions.  As a result, we simply lack a literature that really focuses on institutions and how they function at a concrete level.

It’s a little bit different in Europe.  Higher education studies on that side of the Atlantic tend to be rooted in management science rather than sociology (this is especially true of the Centre for Higher Education Policy Studies in the Netherlands and the Centre for Higher Education in Germany, but it largely applies to all the institutional members of the Higher Education Development Association, or HEDDA).  There is therefore a much more intelligent type of institutional analysis in European higher education studies (for some examples, see here and here) than in North America.  And yet, this does not really amount to “progress studies”, for the simple reason that the European tradition does not make extensive use of historical analysis to look at outcomes: that is, it’s pretty good at explaining differences between institutional approaches to various problems, but it’s not always great at explaining how institutions came to those different approaches.

What this means is that there isn’t really anyone working on what Francis Fukuyama calls the “getting to Denmark” problem.  That is to say, while some institutions/countries are widely seen as leaders, and there are any number of consultants or other sector leaders encouraging universities or countries to emulate these institutions/systems, we actually have very little systematic evidence telling us how these leading institutions/systems became leading institutions/systems.  And, frankly, without that it’s not clear how useful any of these exhortations actually are.

Just to give you an example at the system level: there are an awful lot of countries around the world interested in investing in universities as a means of advancing national technological sophistication and “climbing the value chain” in global production markets so as to become more prosperous.  The obvious national example here is China, which is probably the one country that has most unambiguously succeeded on this path.  Twenty years ago, there were maybe only two or three Chinese research institutions that would have registered as world-class; now there are dozens of them.  But how did China do it?  Was it simply a function of the government dumping a boatload of money on top research institutions, or was there something more to it, something in the way universities were managed?  If we can’t answer this question, then it’s quite hard for developing countries – who may not have Chinese-style boatloads of money to hand over to institutions – to know where to start.

But it’s true at the institutional level, too.  In the mid-1980s, amidst underfunding and labour strife, it was in no way foreseeable that UBC would be the country’s second-best university forty years later (sorry, McGill, but this is the truth).  At the time, one might have pointed to a number of other institutions – McGill most likely, but also McMaster, Queen’s and a couple of others – as better placed to make that climb.  So how did UBC do it?  What can other institutions learn from this?  We have absolutely no idea, because no one does the research.

Just for giggles, let me enumerate a few obvious areas of research which would help shed light on how different Canadian institutions actually function and which initiatives actually work:

  • Budgetary models: how has Responsibility-Centred Management worked out in practice?
  • Encouraging interdisciplinarity: everyone wants to do it, but no one really knows how to break down disciplinary silos – or do they?
  • Rewarding merit: there are a variety of ways of identifying and rewarding high-performing staff across the country.  Do any of them work as intended?  Are some more burdensome than others?
  • Research overhead and funding: institutions have a variety of ways of supporting big research efforts and managing things like overhead.  Are some of these methods more effective/less intrusive than others?
  • Fundraising: more and more fundraising and stewardship is being pushed down to faculty levels.  Is this a good idea or a terrible one?  What supports are necessary to make this work?

In a sensible world, universities’ own associations would look at these issues: certainly, that kind of data-gathering has been one of the significant achievements of the European University Association.  But in Canada, institutions have not historically been all that interested in learning from one another, and so no such data gathering/analysis ever takes place.  As a result, there is no collective learning, no general understanding of how institutions can achieve various aims in an efficient way.  In other words, no progress studies.

I don’t want to go so far as to say “no progress studies” = “no progress”.  Some rules of thumb travel through the higher education ether, and some headway is made even in the absence of this kind of work.  But my God, life would be easier for everyone in the sector if we actually measured and analyzed what we were doing.

(On the off-chance you’re a president/vice-president/provost and interested in these topics, drop me a line: maybe we can change all this).


2 responses to “Progress Studies and Data Collection in Higher Education”

  1. “Progress studies” sounds rather like a branch of history. I suspect that actual historians, however, would reject it as hopelessly Whiggish. In other words, it takes “progress” more or less for granted. A Tory history, on the other hand, would also see moments of deterioration, or at least places where “progress” was less than obvious.

    For those who think that it is the role of universities to question society and government, the increasing autocracy of China would seem to indicate, if not an actual deterioration of its universities, then certainly a reason to question if any real progress has been made, on anything other than rankings.

    Moreover, the incipient discipline you describe seems to suffer from a problem true of all historiography: determining causation. Did Chinese universities rise in the rankings because of government policy, or in spite of it? The University of California kept growing after the loyalty oath of 1950, but nobody (sane, that is) would say intellectual censorship caused its growth.

  2. You’ve discussed the ideas around interdisciplinary programs quite a few times this year. I’m curious how that aligns with so many provincial governments wanting more specific, direct-to-employment “well-worn path” programs.

    I think interdisciplinary programs are the shake-up universities need to get back to educating for citizenship rather than discipline mastery, but governments seem to be talking about university as a path to employment.

    BC’s new StrongerBC: Future Ready Action Plan mentions “knowledge” five times and “skills” 119 times.

    Now, I’m a huge believer that educating for citizenship develops the skills that are also important to gaining quality employment, but I think it will be difficult to push what you’re talking about (a renewed version of liberal education) when even the NDP sees education as job training. That’s before trying to convince discipline-specific faculty that maybe students don’t need 80 credits of discipline-specific courses.
