Ignoring Naylor

Cast your mind back to 2017–2018, when, in theory, everybody agreed that Canada’s Fundamental Science Review – aka the Naylor Review – was a Good Thing That Must Be Implemented.  And so we got the 2018 Budget, which dispensed billions of dollars, mostly back-ended, over six years and which was touted as the Greatest Research Budget Ever (via some competitive counting of the sort I described last week) even though in total it amounted to about a 14% real increase over those six years.  Ottawa declared Fundamental Science “dealt with” and went back to its favoured hobby of coming up with boutique research programs that made for good “announceables”.

Lost in all of this was one of the main points made by the review: namely, that the balance between investigator-driven research (i.e. the stuff handed out through the granting councils) and priority-driven research (i.e. all the other boutique ideas that periodically emerge from PMO) was out of whack.  Among the largest culprits here were initiatives like the Canada First Research Excellence Fund (CFREF), a competitive program which delivers $200 million/year in large, discrete chunks to individual universities so they can gain scale in a particular area of specialty (e.g. $94 million to Dalhousie over six years for “sustainable development of the ocean frontier”, $64 million to Queen’s for particle astrophysics research, $76 million to Waterloo for transformative quantum technologies).  Basically, universities pitch big ideas in areas in which they can demonstrate (via bibliometrics, CVs and creative writing) some pre-existing strength, the tri-councils decide which ideas sound coolest, and they hand out money.  The Fundamental Science Review did not make any specific recommendations about the future of CFREF, but it did throw some carefully worded shade in its direction for partially obstructing Fundamental Science, and it recommended a mid-term review before any more funding was given out.

Anyhow, said review came out in July with zero fanfare. You can read it here. Despite the summer 2021 release, it gives the impression of having been written sometime in 2019 (why two years from writing to release date?  Welcome to Ottawa). It appears to have been specifically designed not to answer any of the critical questions about the value of CFREF raised by the Fundamental Science Review.  Instead, it answers the following five questions.

1. To what extent does CFREF continue to address a unique need and align with government priorities?

2. How, and to what extent, have institutions implemented structures and processes for prioritizing funding toward research in CFREF priority research areas?

3. To what extent has high-caliber, diverse and interdisciplinary research talent been attracted, retained and trained?

4. To what extent have funded institutions created or strengthened partnerships, collaborations and infrastructure to enhance research capacity?

5. To what extent is the design and delivery of CFREF effective and efficient?

Let’s go through how the report analyzes these questions in reverse order.

To what extent is the design and delivery of CFREF effective and efficient? Super-duper efficient, apparently. At least if, like this report, you only count the part where the government writes cheques to institutions and receives their annual reports. Left unexamined is the design and delivery of the entire infrastructure that each recipient institution must set up internally in order to award competitive grants.

To what extent have funded institutions created or strengthened partnerships, collaborations and infrastructure to enhance research capacity?  This is the one element that seems unambiguously positive – institutions seem to have collectively entered into hundreds of arrangements with partners both domestic and foreign, public and private, with a value close to a billion dollars.  And this despite matching funding not being in any way a criterion for the program.  If there is an argument for putting big pots of money together to work on specific themes, this is probably it.

To what extent has high-caliber, diverse and interdisciplinary research talent been attracted, retained and trained?  This is a weird conflation of issues. “Training” refers primarily to graduate students (who make up well over half of the people working on CFREF projects), most of whom were not “attracted” or “retained” by the program at all. Nor is there any measure of the quality of the training going on: the mere fact that graduate students are involved is good enough. As for “attracting” and “retaining” actual academic staff…well, among the first five CFREF sites (the ones from the 2015 competition), a grand total of 57 new staff were recruited, but nearly 40% of those were internal recruits from within the same university. So, for the quarter-billion or so dispensed to these five institutions, the number of genuinely new academic staff recruited was 35 (57 hires minus the roughly 22 internal ones).  We are not told whether this was above or below expectations.

How, and to what extent, have institutions implemented structures and processes for prioritizing funding toward research in CFREF priority research areas? This section is more descriptive of the mini-bureaucracies each institution had created to manage the funds than it is evaluative.  But apparently there have been no major disasters and that’s good enough.

To what extent does CFREF continue to address a unique need and align with government priorities? I had to look carefully at this question to make sure I understood it. And it is one of the laziest evaluation questions I have ever seen, seemingly made expressly for the purpose of avoiding any hard questions about the program and whether it is achieving anything useful.

Let’s start with “uniqueness”, which is not IN ANY WAY a measure of quality. If it were, my plan to create nuclear fusion by banging TimBiebs together at ultra-high speeds would surely receive funding. Indeed, if uniqueness were a defining criterion of quality, then by definition every damn boutique program imaginable is ipso facto good.

Then there is “alignment with government priorities”. This is essentially defined as whether the program spends money in areas which the federal government has declared to be Very Important.  Of course the answer is yes, because the government’s priorities were baked into the award criteria back in 2015/16; these priorities can and will be switched for future funding competitions, easy peasy lemon squeezy.

None of this of course comes to grips with the real question posed by the Fundamental Science Review, which is whether CFREF is a smarter way of supporting Canadian science than putting more money into the granting councils.  Here is how the review dealt with this question.

“Some stakeholders, echoing an issue raised by the Fundamental Science Review, questioned whether the federal government’s decision to concentrate resources on a small number of grants represents the best funding model, compared to allocating smaller amounts of money to a larger pool of researchers and/or institutions, and expressed concern that this approach may lead to an overconcentration of resources in a subset of institutions. This view was expressed particularly by those applicants who had not been successful in securing CFREF funding. Other interviewees, however, (both funded and unfunded) lauded this approach, saying that it shows strong commitment by the federal government to invest in priority areas and raise the international profile of Canadian postsecondary institutions.”

That’s it, that’s all. Just a little bit of value-free both-sides-ism, and the feds felt free to authorize another $200 million CFREF competition.

Our current federal government is a joke when it comes to evaluating value for money. And not a very good joke, either. Don’t let anyone tell you otherwise.

One response to “Ignoring Naylor”

  1. Having worked on the development of one of the successful CFREFs, and having seen it in action, my perspective is that there would have been much more bang for the buck had the funds gone to the Tri-Agencies and CFI, or even to the federal government’s own underfunded research labs.

    *Creative writing is an understatement. These were political documents far more than scholarly documents, and everyone was aware of that when they were written.
    *University researchers are good at funding others who think like they do.
    *Don’t ever audit the partner funds to sort out how much is real or value added.
    *It’s easy to find students to train; the trick is finding jobs in their fields after graduation. The thought that there was an existing need for a larger number of new graduates with Master’s and PhD degrees in concentrated fields of study doesn’t necessarily jibe with reality in many disciplines.
