Higher Education Strategy Associates

May 14

Revisiting the Looming Labour Shortage Theory

Various bits of labour market paranoia have been driving PSE policy lately.  The “skills shortage” is one – even if the case for its actual existence is pretty weak.  Another, though, is the broader idea that we’re about to hit a major labour shortage as boomer retirements… well, boom.  Time to explore that idea a bit.

At the heart of the labour market shortage meme – popularized mainly by Rick Miner in papers such as People Without Jobs, Jobs Without People and Jobs of the Future – is the uncontroversial point that the core working-age population (25-54 year-olds) is shrinking as a percentage of the overall population.  As a result, even as the population increases, there will be fewer potential workers, hence labour shortages, hence rising wages and (though nobody says this part) productivity growth heading for the trash can.  Scary stuff.

But it’s worth examining in more detail how Miner developed his scenario.  To derive labour market demand, he used 2006 HRSDC data on employment growth to arrive at both a 2011 baseline and a 2015 projection, and then assumed that subsequent growth would continue at a pace roughly equal to HRSDC’s 2011-15 projection (0.8% per year).  To derive labour market supply, he took “current” (the base year is unclear) rates of participation by age group and projected them forward to 2031.  These estimates produced a potential demand of 21.1 million jobs, and a supply (using his “medium estimates”) of about 18.4 million – a deficit of 2.7 million.
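To see how the demand side compounds, here’s a minimal sketch; the 2015 base below is back-solved from Miner’s 2031 figure and is a placeholder, not HRSDC’s published number.

```python
# Sketch of Miner's demand-side projection: employment growing at 0.8% per
# year from a 2015 base out to 2031. The 2015 base is a back-solved
# placeholder, not HRSDC's actual figure.
GROWTH = 0.008                       # HRSDC's 2011-15 growth rate, per year
demand_2015 = 18.6                   # million jobs (placeholder base)
demand_2031 = demand_2015 * (1 + GROWTH) ** (2031 - 2015)

supply_2031 = 18.4                   # Miner's "medium" supply estimate
print(f"Projected demand, 2031: {demand_2031:.1f} million")              # ~21.1
print(f"Projected shortfall:    {demand_2031 - supply_2031:.1f} million")  # ~2.7
```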

But are constant labour market participation rates realistic?  If labour markets tighten, won’t supply adjust?  Miner seems to make this assumption – not because he thinks it’s true, but because he wants to highlight the scale of the coming transition.  As I showed back here (and as Miner himself notes), we have already seen a shift in the employment pattern of older workers: since 2000, employment rates have been rising by one percentage point per year among 55-64 year-olds, and by half a point per year among the over-65s.  And there’s no obvious reason those trends can’t continue; even if the current rate of growth lasted twenty years, our post-55 employment rates would still be behind the current rates of New Zealand and Iceland.

So, what happens if you replace the no-change projection with one based on employment rates continuing to increase at their post-2000 rate?  Check this out:

Future Employment Rates, Based on Different Assumptions About Employment Rates Among Workers Over 55
That’s a 2.2 million job gap between the two projections – enough to entirely wipe out Miner’s projected shortage.
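For anyone who wants to poke at the mechanics, here’s a minimal sketch of the two supply scenarios; the population and employment-rate inputs are placeholders for illustration, not StatsCan projections or Miner’s actual figures.

```python
# Two supply scenarios for 2031: employment rates frozen at today's levels
# (Miner-style) vs. 55+ rates continuing to rise at the post-2000 pace.
# All inputs are placeholders -- substitute real StatsCan population
# projections and employment rates before drawing any conclusions.

POP_2031 = {"25-54": 14.5, "55-64": 5.0, "65+": 9.5}        # millions, placeholder
EMP_RATE_NOW = {"25-54": 0.82, "55-64": 0.60, "65+": 0.12}  # placeholder rates
ANNUAL_GAIN = {"25-54": 0.0, "55-64": 0.010, "65+": 0.005}  # post-2000 trend
YEARS_OUT = 20

def employed(rates):
    """Employed persons (millions) implied by a set of employment rates."""
    return sum(POP_2031[g] * rates[g] for g in POP_2031)

frozen_supply = employed(EMP_RATE_NOW)
rising_rates = {g: min(EMP_RATE_NOW[g] + ANNUAL_GAIN[g] * YEARS_OUT, 1.0)
                for g in EMP_RATE_NOW}
rising_supply = employed(rising_rates)

print(f"Frozen rates: {frozen_supply:.1f} million employed")
print(f"Rising rates: {rising_supply:.1f} million employed")
print(f"Gap between scenarios: {rising_supply - frozen_supply:.1f} million")
# With these placeholder inputs the gap is on the order of 2 million jobs.
```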

To be clear, this projection is no better than Miner’s – we’re both just straight-lining different aspects of current reality.  But it does show that the whole “looming labour shortage” meme depends heavily on initial assumptions.

Tomorrow, we’ll look at the policy implications of this.

May 11

Hooked on School

What do Canadian students do when they’ve finished their university studies? And how do they differ from students in other parts of the world? We recently had the opportunity to examine country-level graduate surveys around the world.

Now, there are important caveats – no two countries conduct the same survey among exactly the same population of graduates at exactly the same time (and international data agencies like the OECD restrict most of their graduate analysis to fairly basic indicators, such as employment rates and earnings). Fortunately, centrally coordinated surveys in Canada, Australia, France, Germany, New Zealand, Sweden, the United States and the United Kingdom permit meaningful international comparisons of life after a Bachelor’s degree.

So how do Canadian Bachelor’s degree-holders compare? For one thing, those who decide to work (excluding those who work and study part-time) report the second-highest earnings among the countries examined here, along with the U.S. – around $45,000 annually.* German students with a “traditional degree” report earnings of CAD $55,000.

More interesting, though, is the proportion of Canadian graduates who pursue further education. According to Statistics Canada’s National Graduates Survey, 42% of Bachelor’s degree-holders from the class of 2005 had pursued a new course of study by 2007; only 62% were working (some did both). No one, other than German graduates of “new” degree programs, studies as much after completing an undergraduate degree as Canadians do – even graduates in other countries who were surveyed much longer after graduation were less likely to have returned to school.

Of the 42% of Canadian graduates who went on to pursue an additional program (the share who pursued any kind of schooling, whether within a program or through one or more individual courses, is closer to 75%), only 16% had completed their studies when surveyed two years after graduation.

Of course, not all students are as gung-ho about pursuing post-graduate education. Looking at graduates at all levels, while 62% of life-science graduates and 56% of humanities graduates went on to pursue another program of study, the same was true for less than one-third of architecture and engineering graduates. Education graduates were the most likely (64%) to pursue individual courses after graduation; however, fewer than 20% undertook a structured program.

Whether a Bachelor’s degree doesn’t confer the benefits Canadian students anticipate, or whether it cultivates a thirst for learning that can only be quenched in the classroom, isn’t clear. What is certain, though, is that most Canadian undergraduates aren’t ready to stop studying.

* Australian graduates actually reported higher earnings – $61,000 vs. $45,000 in Canada and the U.S. – but they were surveyed five years after graduation, compared to one to two years in Canada and the States.

May 04

Student Stereotypes in Four Graphs

We all know the stereotypes when it comes to students: computer science students resemble characters from The Big Bang Theory, arts students are inordinately fond of hacky sack, etc. But is there any truth to this?

Well, there is some, as it turns out. About a year ago we asked our CanEd Student Research Panel a series of questions about their attitudes toward academic challenges. The answers we got were interesting because of the way they broke down by field of study. Below are the answers to four questions about academic challenges for the six fields of study for which we had more than 100 observations.

Figure 1 Strongly Agree that “Being Challenged in School is Important to Me”

Slightly fewer business students strongly agree that being challenged in school is important to them, but there isn’t a huge difference across fields of study.

Figure 2 Strongly Agree that Classes that Require Application of New Concepts and Techniques are Enjoyable

Engineers like trying new things, business and humanities students less so.

Figure 3 Agree and Strongly Agree that Courses that Require Long Hours of Work are Enjoyable

Are you getting the picture yet? This last one’s my favourite.

Figure 4 Strongly Agree that “I Prefer to Take Courses where the Instructor is an Easy Marker”

In sum, there are moderate but significant differences in academic outlook among students in different disciplines. It’s possible that these traits were acquired while in school, but I tend towards the view that different academic disciplines simply attract different kinds of students (recall this little beauty from February?). Which in turn makes you wonder whether some of the cross-disciplinary differences in learning outcomes that Arum and Roksa found in Academically Adrift were less a reflection of the kind of education students receive in different disciplines and more a reflection of systemic differences in learners’ personalities.

November 18

Can You Build Your Way to Happiness?

With a half dozen universities currently planning upgrades to their athletics facilities, it’s worth asking the question: what’s the impact of these things on student satisfaction?

(Yes…we know…satisfaction isn’t everything. But it’s not nothing, either. And it has the singular value of being measurable, so…onwards!)

We have two recent case studies here. In 2009, Queen’s completed a new $230 million athletics complex, while in 2010, Trent completed an $18 million renovation to its own athletics building. What kind of effects did these renos have on satisfaction?

On our nine-point satisfaction scale, Queen’s saw a 3.4-point jump in satisfaction with athletics facilities after completion of the new building; Trent saw a 2.3-point bump after its renovations were done. Clearly, it’s not dollars alone that push satisfaction – Trent got 0.126 points of satisfaction per million dollars spent, while Queen’s got only 0.015, nearly an order of magnitude less.
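As a quick sketch of that per-dollar arithmetic, using the rounded figures quoted above:

```python
# Satisfaction gain per million dollars of capital spending, using the
# rounded project costs and satisfaction bumps quoted in the text.
projects = {
    "Queen's": {"cost_m": 230.0, "bump": 3.4},   # new athletics complex
    "Trent":   {"cost_m": 18.0,  "bump": 2.3},   # renovation
}

for school, p in projects.items():
    per_million = p["bump"] / p["cost_m"]
    print(f"{school}: {per_million:.3f} satisfaction points per $1M")
# Queen's: ~0.015; Trent: ~0.128 (the 0.126 quoted above presumably
# reflects unrounded inputs) -- nearly an order of magnitude apart.
```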

But that’s just satisfaction with facilities. What about overall satisfaction with recreational and athletic programs themselves? It turns out these see a bump, too, but it’s not as large: the bump is about 1.7 (out of 9) at Queen’s and 1.3 at Trent.

Let’s take this still further. Satisfaction with athletic buildings and facilities is one of a number of buildings and facilities questions we ask. How much satisfaction “flows through” to overall satisfaction with buildings and facilities?

Answer: Not much. While both universities see an increase in overall satisfaction with buildings and facilities, Queen’s increase is small (about 0.22) and not out of line with the increase it saw the previous year. Trent does have an anomalous bump of 0.30, which is more than one would expect from statistical noise.

Finally, let’s ask the big question – do these investments have a clear impact on overall satisfaction with the educational experience at these schools?

Answer: No – or, at least, not enough to stand out amidst all of the other factors that affect students’ satisfaction from year to year. Both schools actually saw small decreases in overall satisfaction in the years their projects were completed.

In sum, it doesn’t seem like you can build your way to student satisfaction: students can’t be bought quite that easily. It would be interesting to have a counter-factual to Queen’s in order to find out what happens if you stick with an old, run-down athletics building and spend $230 million on decreasing class sizes or improving pedagogy instead. Our guess is the effect would be much more dramatic.

Maybe one day we’ll get a chance to try that out.

Alex Usher and Jason Rogers

November 16

Helicopter Parents: Grounded?

We’ve all seen stories about “helicopter parents,” parents who hover over their children even after they enrol in university. But most of these stories are American in origin and tend to be anecdotal in nature. What’s the reality in Canada?

A few months ago, we asked our regular CanEd Student Research Panel what kind of on-going involvement their parents had in their lives. Did their parents help them with their homework or help them select courses or extracurricular activities? Had they helped them find a job, or (helicopter alert!) helped them contest a grade? The figure shows the results.

By some distance, the area in which parents gave the most assistance was finding a job, with the runners-up being help with school work, discussing a problem with a professor or administrator, and suggesting extra-curricular activities. Only 3% of students said their parents had behaved in that most helicopter-ish of ways by contesting a grade for them.

Female students were more likely than male students to report parental involvement in all of the categories, and parental education was positively correlated with all categories as well. On academic matters, such as getting help with schoolwork and course selection, parental involvement increased with parents’ level of education. Anglophone parents were more likely than other parents to assist with schoolwork; allophone parents (many of whom are immigrants) were more likely to assist with course selection. On career matters, allophone parents were more likely to be involved in choosing a career path, but substantially less likely than Anglophone and Francophone parents to be involved in finding a job.

Clearly, helicopter parents are not the norm among Canadian university students. So why do we hear so much about them? For one, they make a great news story. As well, it is possible that even a small percentage of meddling parents can affect institutional work patterns: at a campus of 30,000 students, if 3% of students’ parents call about their children’s grades, that’s 900 parental calls – at least two calls a day – into the offices of Deans and Student Affairs. If that’s up from 2% a few years ago, that’s an extra 300 calls. That’s certainly enough to keep stories of helicopter parents circulating, even if such parents aren’t in fact all that common.
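The back-of-envelope behind that estimate, sketched out; the 3% figure and the 30,000 enrolment come from above, while the 365-day divisor is an assumption:

```python
# Back-of-envelope behind the "900 parental calls" figure above.
enrolment = 30_000
contest_rate = 0.03        # share of students whose parents contest a grade
days_per_year = 365        # assumed divisor; a teaching year would be shorter

calls = enrolment * contest_rate
print(f"{calls:.0f} calls per year, about {calls / days_per_year:.1f} per day")
# 900 calls, roughly 2.5 a day -- "at least two calls a day"
```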

Miriam Kramer and Alex Usher

October 04

Cool-hunting at the NBER

If you’re trying to keep abreast of the latest behavioural economics research on education, it’s worth popping in every so often at the Social Science Research Network (SSRN) to check out the latest from the National Bureau of Economic Research (NBER). It’s mostly about K-12, but when it does tackle higher education, it’s unfailingly interesting.

Maybe the most interesting piece published recently is called The Effects of Student Coaching in College, by Rachel Baker and Eric Bettinger (who is, IMHO, a genius). Over a period of two years, students at a number of U.S. institutions were assigned by lottery to a program run by a company called Inside Track in which they received various forms of personal and academic coaching. (The authors have posted a fulltext version of the paper for free here.)

The results were striking: one year of coaching created an immediate increase of 12 percent in year-on-year persistence, which did not shrink in subsequent years. Coaching is a pretty intensive (and expensive) enterprise, but 12 percent is an enormous return, and compares very favourably to the results achieved by increasing student aid.

A second great paper is A Community College Instructor Like Me: Race and Ethnicity Interactions in the Classroom by Robert Fairlie, Florian Hoffman and Philip Oreopoulos. Using data from a community college where low-achieving students are quasi-randomly assigned to instructors, the authors try to work out whether minority students taught by members of their own ethnic group do better than those taught by members of other ethnic groups.

As it turns out, they do – or, at least the younger ones do (there was no role-model effect among older students). When taught by a member of their own ethnic group, non-white students closed roughly half the educational gap with white students, and the effect was even greater among black students.

It’s great research but, unlike the Bettinger piece, the policy implications are less clear-cut: the political acceptability of greater classroom segregation seems limited, even when backed by results like these. And hiring more instructors of one ethnicity may lead to more classroom sorting, which could have other knock-on effects.

Both papers are great, but if you can only read one, read Baker and Bettinger – it’s a result that has the potential to seriously change the way we look at retention.

September 19

International Student Recruitment: Not as Good as We Think We Are

One of the most startling things about Canada’s recent success in attracting international students is how easy it has all been. Australia and the U.K. took decades to build up their positions in international higher education, and in Australia’s case that meant years of sustained, government-backed investment in developing overseas networks. Our own extraordinary spurt of growth – particularly in the Indian market – came in the space of about five years, in a comparatively uncoordinated way.

So are Canadians just brilliant at this stuff or are there other factors at work?

I’d argue for the latter. Consider that in recent years the Americans have been imposing ludicrous visa regimes, the U.K. has been making menacing noises about rejecting international students and Australia’s image has been tarnished by events that have highlighted problems of racism and student security. We’ve therefore reaped the benefits without making any serious investments ourselves. We didn’t hit a triple; we were born on third base.

But this situation isn’t going to last forever. Universities around the developed world are heading for big financial trouble, and they are all going to be spending more time trying to tap the foreign student market. And in the developing world, institutions are improving all the time, strengthening their value proposition vis-à-vis our own. Competition is going to increase, and it’s not clear how well placed we are to win.

At HESA, we’ve developed the Global Student Survey to examine the views of students in various exporting countries about education in general and international education in particular. Our India survey, available for purchase as of today, shows some of the obvious vulnerabilities Canadian institutions have – both the questions about our value proposition and the rising competition from Indian institutions are clearly there.

More importantly, our national brand in education is a problem. We rank well behind the U.S. and U.K. as a destination in Indian students’ minds, and even Singapore and the U.A.E. rank above us in some categories. And whereas Indian students describe American, British and Singaporean higher education in terms that are generic synonyms for excellence, Canada gets described like this:

[Image: phrases Indian students associate with Canada]

Forget the temporarily rosy enrolment statistics: we have a problem here. We ignore it at our peril.

September 14

Data Point of the Week: StatsCan Gets it Wrong in the EAG

So, as noted yesterday, the OECD’s Education at a Glance (EAG) statfest – all 495 pages of it – was just released. Now it’s our turn to dissect some of what’s in there.

Of most immediate interest was chart B5.3, which shows the relative size of public subsidies for higher education as a percentage of public expenditures on education. It’s an odd measure, because having a high percentage could mean either that a country has very high subsidies (e.g., Norway, Sweden) or very low public expenditures (e.g., Chile), but no matter. I’ve reproduced some of the key data from that chart below.

[Chart: key data from EAG chart B5.3 – public subsidies as a share of public expenditure on education]

(No, I’m not entirely clear what “transfers to other entities” means, either. I’m assuming it’s Canada Education Savings Grants, but I’m not positive.)

Anyways, this makes Canada look chintzy, right? But hang on: there are some serious problems with the data.

In 2008, Canada spent around $22 billion on transfers to institutions. For the chart above to be right, Canadian spending on “subsidies” (i.e., student aid) would have to be in the $3.5-4 billion range. But that’s not true – if you take all the various forms of aid into account, the actual figure for 2008 is closer to $8 billion.
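Here’s a rough sketch of that inversion; it assumes, for simplicity, that the chart’s denominator is transfers to institutions plus household subsidies, which is a simplification of the OECD’s actual expenditure definitions:

```python
# Implied subsidy share under the two figures discussed above. Treats the
# base as transfers to institutions plus subsidies to households, which is
# a simplification of the OECD's actual expenditure definitions.
transfers = 22.0        # $ billions, transfers to institutions (2008)
chart_implied = 4.0     # $ billions, upper end of what the chart implies
with_all_aid = 8.0      # $ billions, including provincial aid and tax credits

def subsidy_share(subsidies: float) -> float:
    return subsidies / (subsidies + transfers)

print(f"Share implied by the chart: {subsidy_share(chart_implied):.0%}")   # ~15%
print(f"Share with all aid counted: {subsidy_share(with_all_aid):.0%}")    # ~27%
```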

What could cause such a discrepancy? Here’s what I’m pretty sure happened:

1) StatsCan didn’t include tax credits in the numbers. Presumably this is because they don’t fit the definition of a loan or a grant, though in reality these measures are a $2 billion subsidy to households. In fairness, the U.S. – the only other country that uses education tax credits to any significant degree – didn’t include it either, but it’s a much bigger deal here in Canada.

2) StatsCan didn’t include any provincial loans, grants or remission either. They have form on this, having done the same thing in the 2009 EAG. Basically, because StatsCan doesn’t have any instrument for collecting data on provincial aid programs, it essentially assumes that such things must not exist. (Pssst! Guys! Next time, ask CMEC for its HESA-produced database of provincial aid statistics going back to 1992!) So, what happens when you add all that in (note: U.S. data also adjusted)?

[Chart: the same measure with provincial aid and tax credits included]

Not so chintzy after all.

August 31

Why is there an “S” in STEM?

Governments love to talk about STEM (science, technology, engineering and mathematics) programs. They were given prominent space in the last Canadian federal budget, and the acronym permeates U.S. educational policy discourse. It’s conventional wisdom that increasing the number of STEM graduates is essential to economic growth. You might think that the chief purpose of the modern post-secondary institution is to churn out graduates in STEM fields – and that as a corollary, arts students are some sort of vestigial leftover from a bygone era, kept around only to avoid the pain of their excision.

The full-court press to jack up STEM graduate production rates overlooks one important detail – the STEM fields are hardly a monolith, and there are some very important differences among them. Indeed, sometimes it’s unclear why these fields are grouped together at all. The issue, in large part, lies with the “S” – an undergraduate science degree is much less likely to get you a job than a degree in the other STEM fields.

Take a look at the labour force status of the class of 2005 two years after graduation, courtesy of Statistics Canada’s 2007 National Graduates Survey. For comparison, we’ve left in data for the humanities – a field that is seldom lauded as the ticket to immediate success in the job market.

It quickly becomes apparent that one of the STEM fields is not like the others. Graduates in the physical and life sciences have extremely low employment: barely half of them have a full-time job, only two-thirds are employed at all, and almost a quarter are not in the labour force – two years after graduating. Moreover, they have the highest rate of unemployment (11%). Graduates in engineering or in math and computer science, by contrast, have full-time employment rates of around 80% and overall employment rates of around 85%, with unemployment under 8%. Based on short-term employment outcomes, the sciences have little in common with the other three fields. It makes you wonder: if “TEM” sounded half as good as “STEM,” would we be so quick to lump the sciences in with the rest?

Of course, the sciences still offer great value to their students and society – even if that value doesn’t pay off as employment in the short term. And should science’s showing on these graphs make it feel lonely, there’s another field that might be its friend. As the data shows, a science student’s employment prospects are rather similar to those of a humanities graduate. And that’s something we shouldn’t hide behind an acronym.

August 24

Data Point of the Week: Comparing Academic Salaries

If there’s one subject we write about that gets people riled up, it’s academic salaries in Canada and the U.S. It’s a complicated issue – so let’s look at concrete examples at three of the better-paying Canadian institutions (Trent, Calgary and McMaster) and three prestigious American universities (Dartmouth, Washington and Berkeley).

If you just look at baseline salaries for the two sets of institutions, you see some pretty big differences, as shown in Figure 1, below. The gap is bigger for associate professors than for full professors, but either way, Canadian professors appear to be making out a lot better than American ones.

Figure 1: Unadjusted Average Base Salaries at Selected Institutions, in Thousands of Dollars.

But this isn’t quite the whole story. Our professors get a 12-month salary, and receive the same pay no matter what they do in the summer months. In the U.S., however, pay is on a nine-month basis – in the summer, many profs are working on research and drawing a separate income from their research grant. Different funders have slightly different rules about how much salary professors can take, but the basic rule of thumb – based on National Science Foundation (NSF) rules that came into effect in 2009 – is that they can take another two months’ worth of salary (i.e., 2/9 of their regular annual pay).

How does that affect average compensation in the U.S.? NSF data seems to indicate that about two-thirds of all professors at research universities hold grants. So, multiplying that out, average compensation across all faculty would look like this:

Figure 2: Adjusted Average Base Salaries at Selected Institutions, in Thousands of Dollars.

At the associate professor level, there is still an advantage to being in Canada – Trent, for instance, still has better salaries than Berkeley. But because promotion carries greater rewards in the U.S., the advantage reverses for full professors. There, all three Canadian universities have higher salaries than the University of Washington, but lower than those at Dartmouth and Berkeley.
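For the curious, here’s a minimal sketch of the adjustment behind Figure 2; the 2/9 summer top-up and the two-thirds grant-holding share come from the text above, while the base salary is a placeholder rather than any institution’s actual figure.

```python
# Adjusted average U.S. compensation: nine-month base plus a 2/9 summer
# top-up for the (roughly two-thirds of) faculty who hold research grants.
# The base salary below is a placeholder, not any institution's figure.
SUMMER_TOPUP = 2 / 9     # NSF rule of thumb: two extra months of salary
GRANT_SHARE = 2 / 3      # approximate share of faculty holding grants

def adjusted_compensation(nine_month_base: float,
                          grant_share: float = GRANT_SHARE) -> float:
    """Average compensation across all faculty, grant-holders and not."""
    return nine_month_base * (1 + SUMMER_TOPUP * grant_share)

base = 100_000.0
print(f"${adjusted_compensation(base):,.0f}")   # $114,815 -- about 15% above base
```

Setting grant_share to 1.0 gives the research-active-only comparison, while setting it to zero reproduces the unadjusted Figure 1 numbers.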

If we only looked at research-active faculty, the numbers would look even better for the U.S. than they do in Figure 2. On the other hand, if we look at research-inactive faculty, the accurate comparison is Figure 1.

Another way of putting all this is that for older, research-active faculty, Canadian institutions may still face a bit of a compensation gap. For younger (i.e., associate professor rank) research-active faculty, Canada is the better bet; even Trent outspends Berkeley. But where Canada really kicks tail is with research-inactive faculty: our three selected universities have a collective compensation advantage of almost 25% for associate professors and 10% for full professors.

Which raises an interesting question: given the choice, is that the category in which we really want to have an advantage?
