HESA: Higher Education Strategy Associates

Category Archives: Data

November 16

Helicopter Parents: Grounded?

We’ve all seen stories about “helicopter parents,” parents who hover over their children even after they enrol in university. But most of these stories are American in origin and tend to be anecdotal in nature. What’s the reality in Canada?

A few months ago, we asked our regular CanEd Student Research Panel what kind of ongoing involvement their parents had in their lives. Did their parents help them with their homework, or help them select courses or extracurricular activities? Had they helped them find a job, or (helicopter alert!) helped them contest a grade? The figure shows the results.

By some distance, the area in which parents gave the most assistance was finding a job, with the runners-up being help with schoolwork, discussing a problem with a professor or administrator, and suggesting extracurricular activities. Only 3% of students said their parents had behaved in that most helicopter-ish of ways by contesting a grade for them.

Female students were more likely than male students to report parental involvement in every category, and parental education was positively correlated with involvement across the board. On academic matters, such as help with schoolwork and course selection, parental involvement rose with parental level of education. Anglophone parents were more likely than others to assist with schoolwork; allophone parents (many of whom are immigrants) were more likely to assist with course selection. Allophone parents were also more likely to be involved in choosing a career path, but substantially less likely than Anglophone and Francophone parents to help with finding a job.

Clearly, helicopter parents are not the norm among Canadian university students. So why do we hear so much about them? For one, they make a great news story. As well, it is possible that even a small percentage of meddling parents can affect institutional work patterns: at a campus of 30,000 students, if 3% of students’ parents call about their children’s grades, that’s 900 parental calls, or at least two calls a day, into the offices of Deans and Student Affairs. If that’s up from 2% a few years ago, that’s an extra 300 calls. That’s certainly enough to cause stories of helicopter parents to circulate, even if they aren’t in fact all that common.
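The arithmetic is worth making explicit; here’s a minimal sketch in Python (the campus size and percentages come from the example above, while the 250-day working year is an illustrative assumption):

```python
# Back-of-envelope arithmetic: how a small share of grade-contesting
# parents still produces a steady stream of calls at a large campus.

students = 30_000        # campus size, from the example above
share_calling = 0.03     # 3% of students' parents contest a grade
prior_share = 0.02       # the hypothetical rate from a few years ago
working_days = 250       # assumed working days per year (illustrative)

calls_per_year = students * share_calling               # 900 calls
calls_per_day = calls_per_year / working_days           # ~3.6 per working day
extra_calls = students * (share_calling - prior_share)  # ~300 extra calls

print(f"{calls_per_year:.0f} calls/year (~{calls_per_day:.1f} per working day), "
      f"{extra_calls:.0f} more than at a {prior_share:.0%} rate")
```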

Miriam Kramer and Alex Usher

October 04

Cool-hunting at the NBER

If you’re trying to keep abreast of the latest behavioural economics research on education, it’s worth popping in every so often at the Social Science Research Network (SSRN) to check out the latest from the National Bureau of Economic Research (NBER). It’s mostly about K-12, but when it does tackle higher education, it’s unfailingly interesting.

Maybe the most interesting piece published recently is called The Effects of Student Coaching in College, by Rachel Baker and Eric Bettinger (who is, IMHO, a genius). Over a period of two years, students at a number of U.S. institutions were assigned by lottery to a program run by a company called Inside Track, in which they received various forms of personal and academic coaching. (The authors have posted a full-text version of the paper for free here.)

The results were striking: one year of coaching created an immediate increase of 12 percent in year-on-year persistence, which did not shrink in subsequent years. Coaching is a pretty intensive (and expensive) enterprise, but 12 percent is an enormous return, and compares very favourably to the results achieved by increasing student aid.

A second great paper is A Community College Instructor Like Me: Race and Ethnicity Interactions in the Classroom by Robert Fairlie, Florian Hoffman and Philip Oreopoulos. Using data from a community college where low-achieving students are quasi-randomly assigned to instructors, the authors try to work out whether minority students taught by members of their own ethnic group do better than those taught by members of other ethnic groups.

As it turns out, they do – or, at least the younger ones do (there was no role-model effect among older students). When taught by a member of their own ethnic group, non-white students closed roughly half the educational gap with white students, and the effect was even greater among black students.

It’s great research, but unlike the Bettinger piece, the policy implications are less clear-cut: the political acceptability of greater classroom segregation seems limited, even when backed by results like these. And hiring more instructors of one ethnicity may lead to more classroom sorting, which could have other knock-on effects.

Both papers are great, but if you can only read one, read Baker and Bettinger – it’s a result that has the potential to seriously change the way we look at retention.

September 14

Data Point of the Week: StatsCan Gets It Wrong in the EAG

So, as noted yesterday, the OECD’s Education at a Glance (EAG) statfest – all 495 pages of it – was just released. Now it’s our turn to dissect some of what’s in there.

Of most immediate interest was chart B5.3, which shows the relative size of public subsidies for higher education as a percentage of public expenditures on education. It’s an odd measure, because having a high percentage could mean either that a country has very high subsidies (e.g., Norway, Sweden) or very low public expenditures (e.g., Chile), but no matter. I’ve reproduced some of the key data from that chart below.

[Chart: key data from EAG chart B5.3 – public subsidies for higher education as a percentage of public expenditure on education, selected countries]

(No, I’m not entirely clear what “transfers to other entities” means, either. I’m assuming it’s Canada Education Savings Grants, but I’m not positive.)

Anyways, this makes Canada look chintzy, right? But hang on: there are some serious problems with the data.

In 2008, Canada spent around $22 billion on transfers to institutions. For the chart above to be right, Canadian spending on “subsidies” (i.e., student aid) would have to be in the $3.5–4 billion range. But that’s not true – if you take all the various forms of aid into account, the actual figure for 2008 is closer to $8 billion.

What could cause such a discrepancy? Here’s what I’m pretty sure happened:

1) StatsCan didn’t include tax credits in the numbers. Presumably this is because they don’t fit the definition of a loan or a grant, though in reality these measures are a $2 billion subsidy to households. In fairness, the U.S. – the only other country that uses education tax credits to any significant degree – didn’t include them either, but they’re a much bigger deal here in Canada.

2) StatsCan didn’t include any provincial loans, grants or remission either. They have form on this, having done the same thing in the 2009 EAG. Basically, because StatsCan doesn’t have any instrument for collecting data on provincial aid programs, it essentially assumes that such things must not exist. (Pssst! Guys! Next time, ask CMEC for its HESA-produced database of provincial aid statistics going back to 1992!) So, what happens when you add all that in (note: U.S. data also adjusted)?

[Chart: the same comparison, with tax credits and provincial student aid included]

Not so chintzy after all.
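For anyone who wants to check the arithmetic, here is a minimal sketch using only the figures quoted above; note that the provincial-aid number is inferred by subtraction rather than taken from a published total, and treating “public expenditure” as transfers plus subsidies is a simplifying assumption:

```python
# Reconciling the EAG "subsidies" figure with the actual 2008 totals.
# All figures in billions of dollars, as quoted in the text above.

transfers = 22.0        # 2008 transfers to institutions
eag_subsidies = 3.75    # midpoint of the $3.5-4B range the chart implies
actual_subsidies = 8.0  # all forms of student aid counted

tax_credits = 2.0       # omission 1: education tax credits
provincial_aid = actual_subsidies - eag_subsidies - tax_credits
# roughly $2.25B in provincial loans, grants and remission (inferred)

# Subsidies as a share of public expenditure (assumed = transfers + subsidies):
published = eag_subsidies / (transfers + eag_subsidies)
corrected = actual_subsidies / (transfers + actual_subsidies)
print(f"published: {published:.0%}, corrected: {corrected:.0%}")  # ~15% vs ~27%
```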

September 13

Education at a Glance

By the time you read this, the first headlines should be coming through from Paris on the 2011 version of OECD’s annual publication, Education at a Glance (EAG). We’ll be taking a deeper look at some of the statistics tomorrow and over the coming weeks, but today I wanted to offer some thoughts on the product itself.

Over the 16 years since EAG was first published, it has had a considerable effect on policy-making around the world. By drawing direct comparisons between national systems, OECD has kick-started an entire policy sub-culture around benchmarking national outcomes. Canada, however, has had difficulty taking advantage of this explosion of comparative data, because of the challenge of adapting our education data – which is designed for our own policy purposes – to the somewhat abstract categories that OECD uses to make data from such varied countries comparable.

There’s been a lot of hysteria over this last point over the years. Back when the Canadian Council on Learning was still around (ok, they technically still exist, but have you seen what they’ve been putting out since their funding got nuked?), the annual EAG release would reliably be accompanied by anguished wails from CCL about how Statistics Canada’s inability to produce comparable data was depriving the country of much of this benchmarking goodness and turning us into some third-world backwater.

Slowly, however, Statistics Canada has been getting better at this, so tomorrow’s EAG may have more Canada in it than have previous editions. But just remember as you read the press coverage that an awful lot of caveats and simplifications go into EAG in order to make vastly different education systems comparable. For instance, population educational attainment – a measure on which Canada traditionally does very well – is calculated from labour force survey questionnaires which use different questions in different countries. So is Canada really the best-educated country, or do we just have slack labour force survey questions?

Caveat lector.

August 31

Why is there an “S” in STEM?

Governments love to talk about STEM (science, technology, engineering and mathematics) programs. They were given prominent space in the last Canadian federal budget, and the acronym permeates U.S. educational policy discourse. It’s conventional wisdom that increasing the number of STEM graduates is essential to economic growth. You might think that the chief purpose of the modern post-secondary institution is to churn out graduates in STEM fields – and that as a corollary, arts students are some sort of vestigial leftover from a bygone era, kept around only to avoid the pain of their excision.

The full-court press to jack up STEM graduate production rates overlooks one important detail – the STEM fields are hardly a monolith, and there are some very important differences among them. Indeed, sometimes it’s unclear why these fields are grouped together at all. The issue, in large part, lies with the “S” – an undergraduate science degree is much less likely than one in the other three fields to get you a job.

Take a look at the labour force status of the class of 2005 two years after graduation, courtesy of Statistics Canada’s 2007 National Graduates Survey. For comparison, we’ve left in data for the humanities – a field that is seldom lauded as the ticket to immediate success in the job market.

It becomes quickly apparent that one of the STEM fields is not like the others. Graduates in the physical and life sciences have extremely low employment. Barely half of them have a full-time job, only two-thirds are employed at all, and almost a quarter are not in the labour force – two years after graduating. Moreover, they have the highest rate of unemployment (11%). Graduates in engineering or math and computer science, by contrast, have full-time employment rates of around 80% and overall employment rates around 85%, with unemployment under 8%. Based on short-term employment outcomes, the sciences have little in common with the other three fields. It makes you wonder: if “TEM” sounded half as good as “STEM,” would we be so quick to lump in the sciences with the rest?

Of course, the sciences still offer great value to their students and society – even if that value doesn’t pay off as employment in the short term. And should science’s showing on these graphs make it feel lonely, there’s another field that might be its friend. As the data shows, a science student’s employment prospects are rather similar to those of a humanities graduate. And that’s something we shouldn’t hide behind an acronym.

August 30

Anticipating Demographic Shifts

I was in Regina last week speaking to the university’s senior management team about challenges in Canadian post-secondary education, when someone asked a really intriguing question.

“Given the changing demographics of Canada, with fewer traditional-aged students, are there any examples of good practice of universities altering their programming to serve non-traditional students instead?”

I have to admit, I was stumped.

You’d think, for instance, that Maritime universities, which have been facing demographic decline for quite some time, would have some experience of this, but they don’t, really. Think about it: when Memorial started hurting for students because of Newfoundland’s awful demographics, the main response was to lower tuition fees and begin raiding other nearby provinces for traditional-aged students. In the rest of the Maritimes, they’ve been sucking traditional-aged students out of Ontario for a couple of decades now, and the primary solution to any shortfall is to go looking for traditional-aged students in other parts of the world.

[Chart from Statistics Canada]

There have, admittedly, been some advances recently in attracting non-traditional-aged students in Northern Ontario and the Prairies – specifically, Aboriginal students, who tend to arrive at university in their mid- to late-20s (often after having had children). But even here, what they are doing for the most part is trying to put in as many supports as possible so that they can be taught as if they were traditional, full-time students. One might conclude that universities are going to great lengths to avoid re-engineering themselves to serve older populations.

Taking demographics seriously means that some universities are going to have to move towards much more modular delivery of courses, more e-learning alternatives, and more evening courses. There are pockets of this, of course, but it hardly constitutes a major trend. Generally speaking, community colleges and polytechnics have been doing much better on this front than universities.

As the demographic shift continues, what happens if governments conclude that they should put more resources into lifelong learning and fewer into traditional-aged students? That possibility may open up some big opportunities for those institutions (mostly colleges) that have already invested heavily in this kind of delivery, and leave those (mostly universities) that have not in a politically vulnerable position.

August 24

Data Point of the Week: Comparing Academic Salaries

If there’s one subject we write about that gets people riled up, it’s academic salaries in Canada and the U.S. It’s a complicated issue – so let’s look at concrete examples from three of the better-paying Canadian institutions (Trent, Calgary and McMaster) and three prestigious American universities (Dartmouth, Washington and Berkeley).

If you just look at baseline salaries for the two sets of institutions, you see some pretty big differences, as shown in Figure 1, below. The gap is bigger for associate professors than for full professors, but either way, Canadian professors appear to be making out a lot better than American ones.

Figure 1: Unadjusted Average Base Salaries at Selected Institutions, in Thousands of Dollars.

But this isn’t quite the whole story. Our professors get a 12-month salary, and receive the same pay no matter what they do in the summer months. In the U.S., however, pay is on a nine-month basis – in the summer, many profs are working on research and drawing a separate income from their research grant. Different funders have slightly different rules about how much salary professors can take, but the basic rule of thumb – based on National Science Foundation (NSF) rules that came into effect in 2009 – is that they can take another two months’ worth of salary (i.e., 2/9 of their regular annual pay).

How does that affect average compensation in the U.S.? NSF data seems to indicate that about two-thirds of all professors at research universities hold grants. So, multiplying that out, average compensation across all faculty would look like this:

Figure 2: Adjusted Average Base Salaries at Selected Institutions, in Thousands of Dollars.
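The arithmetic behind Figure 2 is simple enough to spell out. Here’s a minimal sketch, assuming the 2/9 summer top-up and the two-thirds grant share quoted above (the $100,000 base salary is purely hypothetical):

```python
# Adjusting a U.S. nine-month base salary for summer research salary,
# following the rule of thumb described above.

def adjusted_us_compensation(nine_month_salary: float,
                             grant_share: float = 2/3) -> float:
    """Average annual compensation across all faculty at a given rank.

    Grant holders may draw up to two extra months of pay (2/9 of the
    nine-month base); grant_share is the fraction of faculty who do.
    """
    summer_top_up = nine_month_salary * (2 / 9)
    return nine_month_salary + grant_share * summer_top_up

# A hypothetical $100,000 base works out to about $114,800 on average:
print(adjusted_us_compensation(100_000))  # 114814.81...
```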

At the associate professor level, there is still an advantage to being in Canada – Trent, for instance, still has better salaries than Berkeley. But because promotion carries greater rewards in the U.S., the advantage reverses for full professors. There, all three Canadian universities have higher salaries than the University of Washington, but lower than those at Dartmouth and Berkeley.

If we only looked at research-active faculty, the numbers would look even better for the U.S. than they do in Figure 2. On the other hand, if we look at research-inactive faculty, the accurate comparison is Figure 1.

Another way of putting all this is that for older, research-active faculty, Canadian institutions may still face a bit of a compensation gap. For younger (i.e., associate professor rank) research-active faculty, Canada is the better bet; even Trent outspends Berkeley. But where Canada really kicks tail is with research-inactive faculty, where our three selected universities have a collective compensation advantage of almost 25% for associate professors and 10% for full professors.

Which raises an interesting question: given the choice, is that the category in which we really want to have an advantage?

August 17

Data Point of the Week: These are Fantastic Graphs

…from U.S. researchers Stuart Rojstaczer and Christopher Healy on the subject of grade inflation. So fantastic, in fact, that I think I’ll mostly let them speak for themselves.

[Graph: average undergraduate grades rising over time (Rojstaczer and Healy)]

And this, of course, is at a time when institutions are becoming less selective, not more. Interestingly, though, it’s U.S. private universities – generally speaking more selective than publics – that are leading the grade inflation charge.

[Graph: grade inflation at private vs. public U.S. institutions (Rojstaczer and Healy)]

Since there’s no data to suggest that students are working harder than they used to, this is pretty much a straight-up change in grading practices. But what’s causing the change?

Canadian commentators James Coté and Anton Allahar would probably have you believe that grade inflation (or grade compression, as they more accurately dub it) is all due to the way that larger classrooms, disengaged students and manipulable teacher-evaluation schemes have given professors incentives to reduce standards – “they pretend to learn, we pretend to evaluate,” so to speak.

But this data – which shows that grade inflation/compression was more severe at the smaller and more selective institutions – suggests something different may be going on. In fact, Rojstaczer and Healy posit that the causality runs the other way – that diminished faculty expectations might be leading to student disengagement.

Food for thought.
