HESA

Higher Education Strategy Associates

Author Archives: Alex Usher

August 28

The Australian Experiment (Part 1)

I spent a good part of this month in Australia, talking to people about the radical program introduced in the May budget.  The basics of the system are as follows:

  • A recently-introduced plan of uncapped places, with the government funding as many students as institutions wish to admit, was maintained; however, the average amount of the per-student subsidy will drop by 20%;
  • Tuition fees will be fully de-regulated.  Institutions will be able to charge what they like, subject to the stipulation that fees for domestic students cannot rise above fees for international ones;
  • Institutions will be required to sequester 20% of all tuition revenue over and above what is needed to offset the cut in government subsidies, and devote it to student scholarships (in conception, this is identical to the 30% tuition set-aside in Ontario);
  • The repayment threshold for student loans will be lowered slightly, and students will for the first time be required to pay real interest on the loan (currently, they pay the loan back in constant inflation-adjusted dollars).

The package is seen as favouring the more prestigious and research-intensive universities (known as the Group of 8 – their version of our U-15), since – it is believed – they will be able to charge the higher fees.  And they have made their intentions plain as to where they will go under the new regime: significantly higher fees, reduced undergraduate intake, and ploughing all the extra money back into research, research, and more research.

(Australian universities conceive of research, rankings, and international students as a kind of iron triangle – they need international students for money, good rankings to attract international students, and research to drive their rankings results.  It’s patently not true – the most international-student-intensive university in Australia is Federation University in Ballarat, which is emphatically not research-intensive – but somehow it drives the discourse to the exclusion of all else.)

As might be expected, the sector isn’t unanimously in support.  Students don’t want the higher debt, and the academic staff union is backing them (though, if universities do get more money, it will surely argue for much of that to be directed its way through higher salaries).  The most on-the-fence group are the lower-prestige universities, which aren’t convinced the market will let them charge higher fees, and hence see the whole exercise as one of increasing system stratification.

The budget is at risk of not passing; control of the Senate rests with a number of small parties – some of whom are definitely in wing-nut territory (google the terms: “Motoring Enthusiast Party”, “preference whisperers”, and “Jacqui Lambie radio interview” to get a sense of the nuttiness), and the budget contains a number of other highly-controversial reforms, such as introducing fees for doctor visits.  Some think that there’s a deal to be had – acceptance of the fees in return for ditching the student loan interest rate proposals – but this deal is far from certain.

Complicating matters further is that the G-8 seem determined to go ahead and announce their 2016 fee structure next month, even before the package’s fate in the Senate is assured.  This, they claim, is necessary for guidance counsellors and high school juniors to make decisions.  But it’s a hell of a risk – one which raises the stakes for the government.

And what will happen if it passes?  Tune in tomorrow.

August 27

The Problem at the Back End

Yesterday, we talked about how the Canadian aid system was both generous and clumsily organized, what with most of it being delivered through tax credits and loan remission – neither of which shows up directly to reduce tuition at the time of registration.  This is something that needs to change; if we’re giving students so much money, we should at least give it to them in a form that is both useful and comprehensible.  So why can’t we do it?

Our tax credit system goes back to the Diefenbaker era (backstory here), but it really took off in the late 1990s when the education amount increased nearly eightfold in less than a decade.  Why such an increase?  Well, partly it’s because tax credits are politically convenient at election time – you can count them either as new spending or as tax cuts, depending on what audience you’re talking to.  But it’s also because provinces can’t hijack federal tax credits by diverting the money to other causes (e.g. the Harris or Charest tax cuts), or simply allow federal dollars to push out provincial ones (e.g. the Canada Study Grants for Students with Dependants).  Tax credits go straight from Ottawa to taxpayers.  So even though tax credits are a sub-optimal way of delivering student aid, they are superlative as a displacement-free method of delivering federal money.  If you don’t understand why that matters, you haven’t spent enough time in Ottawa.

Where provinces have built on tax credit nuttiness is by giving out large-scale tuition rebates through the tax system.  Most provinces haven’t succumbed to it, but Saskatchewan, Manitoba, and New Brunswick have – and frankly they’ve gone bananas with it.  In Manitoba and Saskatchewan, students who finish a first undergraduate degree or college program qualify for their province’s graduate rebate. For graduates who remain in the province, these programs are so generous that literally everyone ends up receiving more in subsidy than they pay in tuition.  In Saskatchewan, students from families earning over $120,000 can end up receiving $34,000 in various forms of grants and tax credits, while paying only $26,000 in tuition, if they graduate in four years and stay in the province for seven years after graduation.

This, frankly, is nuts.  What’s the point of paying graduates $20K each to stay in Saskatchewan, regardless of pre- or post-study income?  Even if that argument had a smidgeon of validity fifteen years ago when it was introduced (amidst significant out-migration), now that Saskatchewan is a “have” province it looks more than faintly ridiculous.  Doesn’t Saskatchewan have more pressing needs than to pay middle-class kids to do what they were going to do anyway?
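The Saskatchewan arithmetic above can be sketched in a few lines.  The $26,000 tuition and $34,000 subsidy totals come from the figures cited; the breakdown across subsidy types is invented for illustration only:

```python
# Back-of-envelope net-tuition arithmetic for the Saskatchewan case above.
# The two totals come from the text; the split between subsidy types is a
# made-up illustration, not actual program parameters.
tuition_4_years = 26_000

subsidies = {
    "graduate retention rebate": 20_000,  # hypothetical split
    "tuition tax credits": 10_000,
    "other grants/credits": 4_000,
}

total_subsidy = sum(subsidies.values())
net_tuition = tuition_4_years - total_subsidy

print(f"Total subsidies: ${total_subsidy:,}")        # Total subsidies: $34,000
print(f"All-inclusive net tuition: {net_tuition:,}") # All-inclusive net tuition: -8,000
```

On these numbers the graduate comes out $8,000 ahead of sticker-price tuition, which is the "more in subsidy than they pay in tuition" point in a nutshell.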

If we’re spending all this money making higher education cheap, we should spend it in a way that is more useful and comprehensible than tax credits.  The problem is, these kinds of subsidies are difficult to re-allocate because they benefit so many people.  But finding a way to re-allocate tax credit money is a must.  Stakeholder representative groups could do everyone a service by banding together and working out ways to give governments the necessary political cover to do the right thing.

August 26

What Students Really Pay

In a couple of weeks, Statistics Canada will publish its annual Tuition and Living Accommodation Cost (TLAC) survey, which gives the usual suspects their yearly excuse to complain about tuition fees.  But sticker price is only part of the equation: while governments and institutions ask students to pay for part of the educational costs, they also find ways to lessen the burden through subsidies like grants, loan remission, and tax expenditures.  And Statscan never bothers to count that stuff.

Today, we at HESA are releasing a publication called The Many Prices of Knowledge: How Tuition and Subsidies Interact in Canadian Higher Education.  Unlike any previous publication, it looks not just at a single sticker price, but rather at the many different possible prices that students face depending on their situation.  We take ten student cases (e.g. first-year dependent student in college, family income = $80,000; married university student, spousal income = $40,000; etc.), and we examine how much each student would be able to receive in grants, tax credits, and loan remission in each of the ten provinces.  It thus allows us to compare up-front net tuition (i.e. tuition minus grants) and all-inclusive net tuition (i.e. tuition minus all subsidies) not just across provinces, but also across different students within a single province.
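The two measures can be sketched in code.  All dollar figures below are hypothetical, chosen only to show how the definitions differ; they are not drawn from the report:

```python
# The report's two net-tuition measures, as simple functions.
def up_front_net_tuition(tuition, grants):
    """Tuition minus grants received at the point of registration."""
    return tuition - grants

def all_inclusive_net_tuition(tuition, grants, tax_credits, loan_remission):
    """Tuition minus every subsidy, whenever it arrives."""
    return tuition - (grants + tax_credits + loan_remission)

# Hypothetical first-year university student, family income $40,000
tuition, grants, tax_credits, remission = 6_000, 2_500, 2_800, 800

print(up_front_net_tuition(tuition, grants))                               # 3500
print(all_inclusive_net_tuition(tuition, grants, tax_credits, remission))  # -100
```

The gap between the two numbers is the point of the next day's post: most of the subsidy never appears at the moment the tuition bill does.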

Some nuggets:

  • On average, a first-year, first-time student attending university directly from high school, with a family income of $40,000 or less, receives $63 more in subsidies than they pay in tuition, after all subsidies – including graduate rebates – are accounted for (i.e. they pay net zero tuition on an all-inclusive basis).  If they attend college, they receive roughly $1,880 more in subsidies than they pay in tuition (i.e. -$1,880 tuition);
  • A first-year, first-time university student from a family with an income of $40,000 in Quebec, after all government subsidies, pays -$393 in all-inclusive net tuition.  In Ontario, the same student pays -$200.  But if we were to include institutional aid, the student in Ontario would likely be the one better off, since students in Ontario with entering averages over 80% regularly get $1,000 entrance awards, while students in Quebec tend not to.  For some students at least, Ontario is cheaper than Quebec;
  • On average, college students who are also single parents receive something on the order of $11,000 in non-repayable aid – that is, about $8,500 over and above the cost of tuition.   In effect, it seems to be the policy of nearly all Canadian governments to provide single parents with tuition plus the cost of raising kids in non-repayable aid, leaving the student to borrow only for his/her own living costs.

The upshot of the study is that Canada’s student aid system is indeed generous: in none of our case studies did we find a student who ended up paying more than 62% of the sticker price of tuition when all was said and done, and most paid far less.  But if that’s the case, why are complaints about tuition so rife?

Two reasons, basically.  First, Canada’s aid system may be generous, but it is also opaque.  We don’t communicate net prices effectively to students because institutions, the provinces, and Ottawa each want to get credit for their own contributions.  If you stacked all the student aid up in a comprehensible single pile, no one would get credit.  And we can’t have that.

The second reason is that Canada only provides about a third of its total grant aid at the point where students pay tuition fees.  Nearly all the rest, stupidly, arrives at the end of a year of studies.  More on that tomorrow.

August 25

Back to School 2014

Morning, all.  Ready for the new school year run-down?

One thing already clear is that pretty much the whole sector has finally come to grips with the reality that annual 4% increases in funding aren’t coming back any time soon. That’s causing institutions to think more strategically than they’ve had to in a long time, which is a Good Thing – the downside is that there appear to be some places where this hard thinking is leading to some fairly ugly clashes between management and labour (hel-lo Windsor!).  Contract negotiations this year could be very interesting.

Federally, the big story will likely be how to make the new Canada First Excellence Research Fund work.  Yeah, remember that?  Will it actually make big investments in big universities, the way its U-15 authors hoped?  Or will the usual Canadian political dynamic intervene and turn it into something that distributes excellence funding more widely?  My money’s on option 2, which means in a few years the U-15 will have to convince people to establish an even newer research fund-distributing body, with even tighter funding award criteria.  Fun times.

The spring budget – the last before the next federal election – will almost certainly be about tax cuts, but it will be interesting to see if the feds think there’s something to be gained politically in more higher education spending.  My guess is no, because CFERF was the Tories’ “big swing” for this mandate.  Provincial budgets, I suspect, will all be status quo, but you never know.  The fact is money’s tight, growth is slow, and almost no one (except possibly Alberta) is looking at good times ahead any time soon.  Frankly, we’re only one good financial crisis away from some more swingeing cuts in public spending.

And speaking of swingeing cuts, some of those are on the table Down Under, along with a very radical plan to de-regulate tuition fees.  I spent part of my summer in Australia and New Zealand learning about some of the many interesting policy developments going on there.  Both countries – so similar to our own, yet with intriguing differences – have important lessons for Canadians on the higher education and skills files, and I’ll be talking a lot about them over the next few months, starting Thursday, with a two-part primer on the Australian de-regulation imbroglio.

One of our big summer projects at HESA towers has been following up on our Net Zero Tuition work with a much more detailed look at how students with specific profiles fare in all ten provinces.  It’s possibly the most detailed look ever taken at how student financial assistance plays out on the ground with real students, and we’ll be releasing it tomorrow, and covering it on the blog both Tuesday and Wednesday.  I look forward to the feedback.

But the real event to look forward to this fall is… the Worst Back-to-School Story of the Year Award!  Now in its third year, this award highlights the most ludicrous, under-researched, over-egged story on higher education.  Previous winners include Carol Goar and Gary Mason.  So far this year the back-to-school journalism scene has been pretty quiet, so the field is wide open.  Do send in any nominations to info@higheredstrategy.com.

Back to work!

August 18

Credit-Transfer, Korean Style

There are not many genuinely unique ideas in higher education.  Today, we at HESA are releasing a paper about one of them: Korea’s Academic Credit Bank System (ACBS), available here.

Korea has long had a problem with credit transfer.  Its higher education institutions are fairly rigid in terms of admissions, and few like accepting transfer students.  Another big problem is that Korean males have to do two years (roughly – it depends on the service) of universal military service, and they tend to do it before they’ve finished their higher education. People wanting to combine credits taken from a variety of institutions while on duty were usually out of luck if they tried to get those credits counted at their old institution.

So, in the 1990s, the Korean government started experimenting with new forms of educational credential.  Their first attempt was to create something called the Self-Study Bachelor’s Degree, in which Koreans could basically take a bunch of challenge exams on their own, and obtain a credential.  But the pick-up rate on this was fairly low (lessons here for people who thought MOOCs would create a tsunami of change, had they cared to check), and so the search for solutions was back on.

Eventually, they hit on the answer: if we can’t get existing institutions to agree on transfer credit, why not create a new institution that specializes in granting transfer credit?  In fact, why not create an institution that doesn’t offer any credit of its own, but just develops degree standards and tells people what courses to take in order to meet them?  No more schools telling you: “yeah, you’ve already taken that methods pre-req class, but you haven’t taken our methods pre-req class, so you’ll need to do that material again” – if the class gets accredited by the ACBS, you can use it as a building block to an ACBS degree.

Does it work?  You might say that.  The ACBS now awards roughly 8% of all degrees in South Korea, second only to the Korea National Open University (KNOU).  It has curricula for over 200 individual fields of study, and increasingly, the ACBS target audience is older and more female, with more and more degrees coming in areas like social work and early childhood education.

Would the ACBS work well outside Korea?  Hard to say: it’s not clear, for instance, that a Canadian ACBS would get a lot of clients.  Yes, the Canadian credit transfer system outside British Columbia and Alberta (which have quite developed transfer mechanisms) is less a “system” than a barrel full of ad-hoc solutions.  But as I showed back here and here, this ad-hoc-ery actually works out okay in practice.  So while a Credit Bank might be a much cleaner solution than the piecemeal kind of approach Ontario has taken through ONCAT, it may also be overkill.

Where this approach might make even more sense is in the US, where, as Cliff Adelman and others have shown, there are millions of people with unfinished degrees who want to get a degree, but who are worried about the time and expense of having to re-start their studies.  The main barrier, really, is: how do you get something like that past accreditation agencies?  But given the potential benefits, it’s a problem worth thinking about.

August 11

Improving Career Services Offices

Over the last few years, what with the recession and all, there has been increased pressure on post-secondary institutions to ensure that their graduates get jobs.  Though that’s substantially the result of things like curriculum and one’s own personal characteristics, landing a job also depends on being able to get interviews and to do well in them.  That’s where Career Services Offices (CSOs) come in.

Today, HESA released a paper that looks at CSOs and their activities.  The study explores two questions.  The first question deals specifically with university CSOs and what qualities and practices are associated with offices that receive high satisfaction ratings from their students.  The second question deals with college career services – here we did not have outcome measures comparable to the Globe and Mail’s satisfaction data, so we focussed on a relatively simple question: how do their structure and offerings differ from what we see in the university sector?

Let’s deal with that second question first: college CSOs tend to be smaller and less sophisticated than those at universities of the same size.  At first glance, that seems paradoxical – these are career-focussed organizations, aren’t they?  But the reason for this is fairly straightforward: to a large extent, the responsibility for making connections between students and employers resides at the level of the individual program rather than with some central, non-academic service provider – a lot of what takes place in a CSO at universities takes place in the classroom at colleges.

Now, to universities, and the question: what is it that makes for a good career services department?  To answer this question we interviewed CSO staff at high-, medium-, and low-performing institutions (as measured by the Globe and Mail’s pre-2012 student satisfaction surveys) to try to work out what practices distinguished the high-performers.  So what is it that makes for a really good career services office?  Turns out that the budget, staff size, and location of Career Services Offices aren’t really the issue.  What really matters are the following:

  • Use of Data.  Everybody collects data on their operations, but not everyone puts it to good use.  What distinguishes the very best CSOs is that they have an effective, regular feedback loop to make sure insights in the data are being used to modify the way services are delivered.
  • Teaching Job-seeking Skills.  Many CSOs view their mission as making as many links as possible between students and employers.  The very best-performing CSOs find ways to teach job search and interview skills to students, so that they can more effectively capitalize on any connections.
  • Better Outreach Within the Institution.  It’s easy to focus on making partnerships outside the institution.  The really successful CSOs also make partnerships inside the institution. One of the key relationships to be nurtured is academic staff.  Students, for better or for worse, view profs as frontline staff and ask them lots of questions about things like jobs and careers.  At many institutions, profs simply aren’t prepared for questions like that, and don’t know how to respond.  The best CSOs take the time to reach out to staff and partner with them to ensure they have tools at their disposal to answer those questions, and to direct students to the right resources at the CSOs.

If you want better career services, there’s your recipe.  Bonne chance.

July 28

Coming Soon to Ontario: a British Columbia Solution

Unless you’ve been out of the country, or under a rock, for the last couple of years, you’ll be at least vaguely familiar with the concept that the province of Ontario is broke.  So broke, in fact, that it has departed radically from previous practice and, back in 2012, effectively froze physicians’ pay for two years.  Not individual physicians, of course – but on aggregate.  A zero overall increase.  And the government is now working to try to extend this freeze for another couple of years.

That makes good sense.  Health care costs have historically risen at several percentage points above inflation, and have squeezed other parts of the provincial budget (education has held out OK, all things considered, but other budget lines have been less well-protected), and physician costs are a major item in the health care budget.  And so saving money in this area is to be welcomed, even if it is at the expense of physicians – who are a politically tricky group to offend.

So what might the provincial government think of the post-secondary sector handing its employees raises of 3-4% per year, on aggregate?  Do you think aggrieved doctors won’t point to this anomaly during this pay round?  Can you imagine any possible rejoinder the government could offer that would make the least bit of sense?

Don’t you think maybe this situation is about to come to a rather sudden end?

For whatever reason, universities in Ontario have not been able to resist rising wage pressure from full-time faculty.  Despite money getting tighter, they have felt compelled to sign agreements that are plainly beyond their means (Hel-LO, University of Ottawa!).  Some university presidents, unable to deal with this problem on their own, are saying that they need government to step in to play “bad cop”.

What does the “bad cop” look like?  Well, take a look at how BC has handled wage negotiations over the last few years.  There, universities (and all other public sector entities) are given a biennial mandate with respect to employee negotiations.  In 2010 and 2011, the deal was: negotiate what you like, but the net cost of any new compensation deal cannot exceed zero.  In 2012 and 2013, that was changed to: there could be increases, but they had to be balanced dollar-for-dollar with efficiency savings in other areas.  The 2010-11 arrangements, in conception at least, are close to the approach Ontario has taken with its physicians.  And it worked pretty well, at least in the sense that it has kept costs down with a minimum of political fuss.

Not everyone in Ontario agrees that the province needs to take the bad cop approach.  Quite a number of university presidents (you can probably guess which ones) oppose asking the province to legislate on their behalf; not only do they feel it morally incumbent on universities to manage their own books, but also it is tactical suicide to ask governments to legislate – once you go down that road, there’s no telling where they’ll stop micromanaging universities’ affairs.

The problem is, those presidents haven’t exactly been too successful at reining in costs, either (hello again, University of Ottawa!).  So it seems almost inevitable that some sort of BC-like solution is on the cards.  The idea that profs could gain 15% in salary over four years, while physicians get zero, simply isn’t politically tenable.

July 21

University of Saskatchewan Detritus

We all remember this spring’s controversy at the University of Saskatchewan over the firing of Robert Buckingham, which resulted in the resignation of the University’s Provost, Brett Fairbairn, and the firing of the President, Ilene Busch-Vishniac.  Despite all the coverage, a number of key questions were never answered, like “how could anyone possibly think firing a tenured professor was a good idea?”  And, “whose idea was it to fire him anyway – the Provost’s or the President’s?”

We now have more insight, as Fairbairn recently released a five-page letter providing his perspective on events.  Two key points from his account:

  • The decision to fire Buckingham as Dean was a group decision.  The Provost, “leaders responsible for Faculty Relations, HR internal legal expertise and Communications”, and the President (by phone) were all present.  But the key question of whether to dismiss him from the university altogether was referred to HR for further study.  At this point Busch-Vishniac told Fairbairn: “I will stand behind any actions you deem necessary and will not second-guess”.
  • The decision to fire him from both jobs was the HR department’s recommendation.

How HR came to this conclusion isn’t clear; Fairbairn notes that it had happened before at U of S in a case where there had been an irreparable breakdown in relations between employer and employee. Without knowing the case to which he’s referring, it’s hard to know what to make of this.  Certainly, the employer-employee relationship with Buckingham as a dean was irreparably damaged (which is why they were correct to fire him); it’s not at all clear that he couldn’t have remained as a faculty member, since he wouldn’t have had any real contact with any of the superiors whose trust he had abused as Dean.  For whatever reason, Fairbairn decided to take the “expert advice” from HR, and did so without looping back to the communications people to get their input (which might have been valuable) or checking with Busch-Vishniac.

Far from backing up Fairbairn as promised, Busch-Vishniac threw him under the bus and asked for his resignation three days later.  That was emphatically the wrong call.  From the moment she gave the go-ahead for Buckingham’s dismissal, it was clear that either both of them would stay, or neither would.  Fairbairn decided to go, astutely noting that “the only thing worse than blame and recrimination among senior leaders is mutual recrimination among senior leaders”.

Fairbairn’s letter is a valuable peek into how crises get managed at universities.  I think it shows him as a manager with mostly the right instincts, but who erred in accepting some terrible advice from professionals who should have known better.  Others – mostly people who genuinely have no insight into how major organizations function – will probably see this distinction as irrelevant since the real crime was firing Buckingham as a Dean in the first place.  Former CAUT director James Turk, in particular, has made the “managers should have a right to criticise each other publicly” case – to which the correct response is: “and how much freedom did Turk allow his staff and executive to criticise his management as CAUT director?”.

If I were at the University of Saskatchewan, though, my main question after reading Fairbairn’s letter would be: “how is it that the HR department got off comparatively lightly?”  Food for thought.

July 14

Paul Cappon, Again

You may have noticed that Paul Cappon – former President of the Canadian Council of Learning – had a paper out last week about how what the country really needs is more federal leadership in education.  It is desperately thin.

Cappon starts by dubiously claiming that Canada is in some sort of education crisis.  Canada’s position as a world-leader in PSE attainment is waved away thusly: “this assertion holds little practical value when the competencies of those participants are at the low end compared with other countries”.  In fact, PIAAC data shows that graduates of Canadian universities, on average, have literacy skills above the OECD average.  Cappon backs up his claim with a footnote referencing this Statscan piece, but said document does not reference post-secondary graduates.  Hmm.

And on it goes.  He cites a Conference Board paper putting a $24 billion price tag on skills shortages as evidence of “decline”, even though there’s no evidence that this figure is any worse than it has ever been (Bank of Canada data suggests that skills shortages are always with us, and were worse than at present for most of the aughts), or that a similar methodology would not lead to even bigger figures in other countries.  He cites a low proportion of graduates in engineering and other STEM fields as cause for concern, despite a total lack of evidence that this percentage has anything at all to do with economic growth.

Ridiculously, he proclaims that the Canadian education system (not higher education – all education from kindergarten onwards) is less effective than the Swiss because they beat us on some per capita measures of research output (which are of course largely funded through pockets entirely separate from the education budget).  Yes, really.  For someone so in favour of evidence-based policy, Cappon’s grasp on what actually constitutes evidence is shockingly tenuous.

Having cobbled together a pseudo-crisis, Cappon concludes that “the principal cause of our relative regression is that other countries are coordinating national efforts and establishing national goals… Canada, with no national office or ministry for education, is mired in inertia, each province and territory doing its best in relative isolation.” Now there is no evidence at all that the existence of national systems or national goals makes a damn bit of difference to PIAAC outcomes.  Germany comes in for all sorts of praise because of its co-operation between national and state governments, but their PIAAC and PISA outcomes are actually worse than Canada’s.  Yet Cappon believes – without a single shred of evidence – that if only there were some entity based in Ottawa that could exercise leadership in education, everything would be better.

Two points: first, our nation rests on a very simple compromise.  Back in 1864, the *only* way in which Catholic, Francophone Lower Canada could be tempted into agreeing to a federal government with representation by population was if that new level of government never, ever, ever got its hands on education.  That was, and is, the deal.  It’s not going to change.  Period.

Second, everyone needs to remember that there was a time not long ago that the Director-General of CMEC suggested exactly such a body to some deputy ministers in Ottawa, and Ottawa created a body not dissimilar to what Cappon is describing.  But said CMEC DG didn’t tell the provinces he was negotiating this deal with the feds.  When he was offered the leadership of the new organization, he jumped, taking several CMEC senior staff with him.  The result was that provinces, feeling betrayed by the DG’s chicanery, refused to work with the new organization.  As a result, it achieved little before being shut down in 2011.

That DG, of course, was Paul Cappon.

This report is essentially a self-justification for a dishonest act performed nearly a decade ago, rather than a genuine contribution to public policy.  If Cappon actually cared about repairing federal-provincial relations around learning, a mea culpa would have been more appropriate.

July 07

How to Measure Teaching Quality

One of the main struggles with measuring performance in higher education – whether of departments, faculties, or institutions – is how to measure the quality of teaching.

Teaching does not go entirely unmeasured in higher education.  Individual courses are rated by students through course evaluation surveys, which occur at the end of each semester.  The results of these evaluations do have some bearing on hiring, pay, and promotion (though how much bearing varies significantly from place to place), but these data are never aggregated to allow comparisons of quality of instruction across departments or institutions.  That’s partly because faculty unions are wary about using individual professors’ performance data as an input for anything other than pay and promotion decisions, but it also suits the interests of the research-intensive universities who do not wish to see the creation of a metric that would put them at a disadvantage vis-a-vis their less-research-intensive brethren (which is also why course evaluations differ from one institution to the next).

Some people try to get around the comparability issue by asking students about teaching generally at their institution.  In European rankings (and Canada’s old Globe and Mail rankings), many of which have a survey component, students are simply asked questions about the quality of courses they are in.  This gets around the issue of using course evaluation data, but it doesn’t address a more fundamental problem, which is that a large proportion of academic staff essentially believes the whole process is inherently flawed because students are incapable of knowing quality teaching when they see it.  There is a bit of truth here: it has been established, for instance, that teachers who grade more leniently tend to get better course satisfaction scores.  But this is hardly a lethal argument.  Just control for average class grade before reporting the score.
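That control is simple to apply.  A minimal sketch, with invented data, of one way to do it: fit a least-squares line of satisfaction on class-average grade, then report each course's residual, i.e. how it scores relative to courses graded equally leniently:

```python
# Grade-adjusted satisfaction scores: regress mean satisfaction on the
# class-average grade, then report residuals. A course with a positive
# residual scores better than its grading leniency alone would predict.
# All data below are invented for illustration.
courses = [
    # (average class grade, mean satisfaction score out of 5)
    (68, 3.4), (72, 3.8), (75, 3.9), (80, 4.3), (85, 4.6),
]

n = len(courses)
mean_x = sum(g for g, _ in courses) / n
mean_y = sum(s for _, s in courses) / n
slope = sum((g - mean_x) * (s - mean_y) for g, s in courses) / \
        sum((g - mean_x) ** 2 for g, _ in courses)
intercept = mean_y - slope * mean_x

for grade, score in courses:
    adjusted = score - (intercept + slope * grade)  # residual after grade effect
    print(f"grade {grade}: raw {score:.1f}, grade-adjusted {adjusted:+.2f}")
```

Residuals of this kind let you compare instructors across sections with very different grade distributions, which is exactly what a raw satisfaction average cannot do.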

It’s not as though there isn’t a broad consensus on what makes for good teaching.  Is the teacher clear about goals and expectations?  Does she/he communicate ideas effectively?  Is he or she available to students when needed?  Are students challenged to learn new material and apply this knowledge effectively?  Ask students those kinds of questions and you can get valid, comparable responses.  The results are more complicated to report than a simple satisfaction score, sure – but it’s not impossible to do so.  And because of that, it’s worth doing.

And even the simple questions like “was this a good course” might be more indicative than we think.  The typical push-back is “but you can’t really judge effectiveness until years later”.  Well, OK – let’s test a proposition.  Why not just ask students about a course they took a few years ago, and compare it with the answers they gave in a course evaluation at the time?  If they’re completely different, we can indeed start ignoring satisfaction types of questions.  But we might find that a good result today is in fact a pretty good proxy for results in a few years, and therefore we would be perfectly justified in using it as a measure of teaching quality.
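The proposed test is just a correlation check between the two sets of answers.  A toy sketch, with invented scores (a high correlation would suggest in-course ratings are a decent proxy for considered, after-the-fact judgments):

```python
# Correlate the satisfaction score students gave at the time with the score
# the same students give for the same course years later. All data invented.
then_scores = [4.5, 3.0, 4.0, 2.5, 5.0, 3.5]   # course evaluation at the time
later_scores = [4.0, 3.2, 4.2, 2.0, 4.8, 3.0]  # same courses, asked years later

n = len(then_scores)
mx = sum(then_scores) / n
my = sum(later_scores) / n
cov = sum((x - mx) * (y - my) for x, y in zip(then_scores, later_scores))
var_x = sum((x - mx) ** 2 for x in then_scores)
var_y = sum((y - my) ** 2 for y in later_scores)
r = cov / (var_x * var_y) ** 0.5  # Pearson correlation coefficient

print(f"then-vs-later correlation: r = {r:.2f}")
```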

Students may be inexperienced, but they’re not dumb.  We should keep that in mind when dismissing the results of teaching quality surveys.
