Higher Education Strategy Associates

January 31

U.K. Tuition Fees: Early Results Are In

Unless you’ve been in a cave for the last 18 months, you’ve probably heard that the U.K. government is overhauling policies on student fees and government support in England and Wales (Scotland has its own arrangements). Public funding for teaching in the arts and social sciences was eliminated, institutional grants were cut by 41% and, most strikingly, the cap on tuition fees was raised from £3,350/year to £9,000/year.

Since announcing the broad outlines of the policy fifteen months ago, the Cameron government has gone through some fairly bizarre twists and turns on certain details of the funding and tuition policies. At the end of the day, though, the final average tuition for the coming year came to £8,354, or about a 150% increase from last year.

For those of us who watch tuition fees for a living, this was a big experiment. Data from the competitive environment of the United States tends to show that even small changes in tuition fees can have significant effects on institutional enrolment, though usually through shifts from one institution to another rather than on aggregate enrolment. Large across-the-board tuition increases, which raise the minimum price for all forms of higher education, presumably have more severe effects. But these are rarer and harder to observe, so this one-time “big bang” in the U.K. seems like an ideal opportunity to examine the pure effects of a tuition hike.

Well, final application data was released yesterday. Among 18-year-olds, this £5,000 (roughly $8,000) tuition increase has lowered applicant numbers by just 3.6%, or about 8,000 students. Among older students, who tend to have less time to earn back their investments in higher education, the effect was significantly larger (on the order of 11%). Now, even a single student turned away for financial reasons is too many. But if an $8,000 net tuition increase only generates a net impact of less than 4% on traditional-aged students, and only 7% overall, that strongly suggests that tuition fees are not, on their own, a major deterrent to study.

So the next time someone suggests that the $250 tuition fee increase your institution is planning might have major effects on enrolment, there’s a simple reply: such an increase is 1/32nd of what was introduced in England and Wales. Assuming linear effects, a $250 increase might therefore be expected to reduce overall enrolments by between 0.2% and 0.25%. On an incoming class of, say, 3,000 undergraduates, that’s between six and eight students. We’d guess it’s not beyond anyone’s wit to design some student aid to offset that kind of effect.
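For anyone who wants to check the arithmetic, here it is as a minimal sketch, using the figures quoted above. The linearity assumption is, as noted, a deliberate simplification, and the 3,000-student class is just an example.

```python
# Back-of-the-envelope extrapolation from the U.K. numbers quoted above.
# Linearity is an assumed simplification, not an empirical finding.
uk_increase = 8000        # approximate U.K. tuition increase, in dollars
uk_overall_drop = 0.07    # ~7% overall drop in applications

proposed_increase = 250                    # hypothetical local fee increase
scale = proposed_increase / uk_increase    # = 1/32
expected_drop = scale * uk_overall_drop    # ~0.22%

incoming_class = 3000
affected = expected_drop * incoming_class  # ~6.6 students

print(f"{expected_drop:.2%} of a {incoming_class}-student class "
      f"is about {affected:.0f} students")
```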

U.K. tuition policy isn’t something we’d endorse, but clearly it’s not as harmful as some would lead us to believe, either. Real life’s just not that simple.

January 18

Student Surveys We’d Like to See

Surveys of current students tend to focus on just a few areas. Apart from questions about demographics and time use, they ask a lot of specific questions about satisfaction with student services along with a few general questions about overall satisfaction.

This is odd, because at the end of the day students don’t actually think student services are central to the overall quality of their PSE experience. What they care about first and foremost is the quality of the teaching they experience. Yet institutional surveys seem determined to avoid asking all but the most banal questions about teaching.

Sure, we have course evaluations. But these are used exclusively (if sparingly, but that’s another story) by departments. The data are never used to learn which teaching methods work better than others, and never linked to demographic data to see whether satisfaction or reported learning varies with student background, hours of paid work, and so on. They are, in short, a massive lost opportunity.

What about the National Survey of Student Engagement (NSSE)? Well, despite allegedly being outcome-related, NSSE insists on treating a student’s entire class schedule as a single observation. It does ask how often students “work in teams” or “make presentations in class,” but most students have a mix of classes, some of which have these elements and some of which don’t. If you’re trying to understand how different teaching styles affect students, this stuff needs to be broken out class by class. NSSE, for all its cost, is essentially useless for this purpose.

Getting this data isn’t rocket science. Here at HESA, we regularly do surveys which ask questions about details of each of a student’s classes. That’s how we look at class size, it’s how we found out about the characteristics of students’ favourite and least-favourite classes and it’s how we learned about the effectiveness of e-learning resources in Canadian universities. If we can do it, so can universities and colleges.
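To make that concrete, here is a purely illustrative sketch of what class-by-class survey records might look like. The field names and example values are invented for this post; they are not HESA’s or any institution’s actual survey schema.

```python
# Illustrative only: one survey record per student PER CLASS, so teaching
# practices can be analyzed class by class. Field names are invented.
from dataclasses import dataclass

@dataclass
class ClassResponse:
    student_id: str
    class_id: str
    class_size: int
    used_teamwork: bool       # did THIS class involve working in teams?
    used_presentations: bool  # did THIS class involve presentations?
    satisfaction: int         # 1-5 rating of THIS class

@dataclass
class StudentProfile:
    # Demographics stored once per student and joined on student_id.
    student_id: str
    paid_work_hours: float
    first_generation: bool

responses = [
    ClassResponse("s1", "ECON101", 250, False, False, 2),
    ClassResponse("s1", "HIST220", 30, True, True, 5),
    ClassResponse("s2", "ECON101", 250, False, True, 3),
]

# "Does teamwork go with higher satisfaction?" becomes a simple grouped
# comparison across class-level rows; a one-row-per-student survey like
# NSSE cannot support this kind of question.
with_tw = [r.satisfaction for r in responses if r.used_teamwork]
without_tw = [r.satisfaction for r in responses if not r.used_teamwork]
print(sum(with_tw) / len(with_tw), sum(without_tw) / len(without_tw))
```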

From a public policy perspective, the reluctance to look harder at what makes for good teaching looks deeply suspicious. Teaching is what actually matters. It’s why institutions receive tax dollars. Finding out what kinds of pedagogical strategies are most effective should be a moral imperative for anyone who benefits from that amount of public support. But even if that weren’t true, having better knowledge about what works and what doesn’t in teaching is potentially an enormous source of strategic advantage for institutions.

But you can’t act on that advantage until you start collecting the data. So, enough with the student services questions; let’s start asking students real questions about teaching and learning.

December 01

A Prayer for Noah Morris

Noah Morris runs the Ontario Student Assistance Program (OSAP). He is the unfortunate soul tasked with implementing Dalton McGuinty’s promise of a 30% tuition rebate for students from families with less than $160,000 in income. It may have been popular electorally, but in policy terms it’s got “ugly” written all over it.

The government could have implemented this through the OSAP system by simply cutting cheques to student aid recipients. But no: somebody in the Premier’s office had half-read some research about low-income students not always applying for student aid, and so decided the rebate should bypass OSAP and be universal. So who should cut the cheques in this brilliant scheme? The universities and colleges, of course! When students hand over their tuition money, institutions will hand back a rebate cheque. What could be simpler? Or more visible?

Great idea, until you realize that institutions have absolutely no idea what students’ family incomes are unless those students apply for student aid. So we’re back to square one, except that now the government has shifted the administrative burden onto institutions. Under the circumstances, COU’s “we’re sure keen to work with you” throne speech response was a model of turning the other cheek.

Did I mention the nightmarish interaction effects with federal programs? If the award is treated as a tuition waiver, two things happen. The first is that students’ recorded costs fall, which reduces their eligibility for student aid. The province can tweak OSAP rules to ignore the grant in the resource calculation, but it can’t force the federal program to do the same; and since provincial remission is based on combined federal-provincial borrowing, the waiver may in fact be offset by a reduction in students’ eligibility for the province’s OSOG grants.
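To see the mechanics, consider a stylized need-assessment calculation. The formula and numbers below are invented for illustration and are far simpler than the real OSAP or Canada Student Loans rules.

```python
# Stylized need assessment (invented numbers; real OSAP/CSLP rules are far
# more complicated). Shows why a fee waiver that lowers a student's
# recorded costs can claw back aid eligibility.
def assessed_need(tuition, living_costs, resources):
    # Need = allowable costs minus expected student/family resources.
    return max(0, tuition + living_costs - resources)

tuition, living, resources = 6_000, 10_000, 8_000
waiver = 0.30 * tuition  # the 30% rebate, booked as a tuition waiver

before = assessed_need(tuition, living, resources)          # 8000
after = assessed_need(tuition - waiver, living, resources)  # 6200

# In this stylized case the waiver is offset dollar-for-dollar by reduced
# aid eligibility, leaving the student no better off.
print(f"${waiver:.0f} waiver; eligibility falls by ${before - after:.0f}")
```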

The second is that students will receive fewer tax credits. Assuming the program costs roughly $400 million, Ottawa will save about $60 million a year in its tax credit programs at students’ expense; that is the 15% federal tuition credit applied to $400 million in tuition that is no longer paid.

Treating the awards as grants would eliminate the tax credit problem, but would require changing a lot of OSAP rules to avoid a situation where the new grant pushes out other aid dollar-for-dollar. It’s doable, but again, there’s no reason to think Ontario will be able to shield students from clawbacks under the federal program.

So: an administrative nightmare plus nasty program interactions; just your run-of-the-mill trainwreck that results from designing public policy on the back of a cocktail napkin. And it’s OSAP’s Noah Morris who has to make it all work.

He deserves everyone’s sympathy.

October 12

Enough About National Goals, Already

In any discussion of Canadian post-secondary education, you know you’re about to approach an impasse when someone starts blaming some real or imagined ill on the lack of “national goals” or the absence of a federal ministry of higher education.

Honestly, who cares? The lack of a Department of Education until the Carter administration didn’t stop the U.S. from creating one of the world’s great PSE systems. Our lack of one hasn’t prevented us from having world-class research funding or one of the most accessible systems of higher education globally.

But that hasn’t stopped the Canadian Council on Learning from devoting its valedictory report to the subject of – yes, you guessed it – how terrible it is that Canada has no national goals in education.

Once it became clear that CCL, a federally-funded entity, was not going to have the support or co-operation of the provinces (due in no small measure to CEO Paul Cappon’s own behaviour: he secretly worked with the feds to set up the Council while still working for the Council of Ministers of Education, Canada), it was always going to struggle to find a role or a niche. The council decided early on that banging the drum for “national goals” (which is partially but not entirely code for “more federal involvement”) was going to be it.

From then on, virtually any problem one could name in higher education became a problem of national goals. CCL had a hammer, and every problem was a nail. Just read the CCL report and see. Immigrant skills not meeting labour market demand? That’s a result of not having national goals in PSE. StatsCan unable to make its data comparable with the OECD’s? National goals, again. The rather more sensible proposition that poor immigration policy or a lack of imagination at StatsCan could be the issue doesn’t even enter the picture.

The report baselessly asserts that more national action could reduce the male-female attainment gap or improve apprenticeship completion rates. Even more baselessly, it asserts that Canada has no quality assurance agencies in PSE, when in fact seven of ten provinces have one, with an eighth (Saskatchewan) about to bring one in.

It’s the same kind of magical thinking used by Quebec separatists. The nature of the actual problem barely matters: once we have achieved separatism/adopted national policy goals in PSE, those problems will disappear. From an organization that once aspired to thought leadership in the field, it’s a disappointingly simplistic and ahistorical approach.
