Higher Education Strategy Associates

Tag Archives: Student Services

November 03

The European Way of Student Services

One of the delights of working in international higher education is that while higher education is pretty much isomorphic the world over, it’s not entirely so. There’s not so much variation that expertise isn’t transferable, but not so little that you can’t learn something new by appreciating another country’s system. One area of particular interest is student accommodations and student services.

In North America we take it for granted that student services and residences are the responsibility of institutions – who else would do it? But there are at least two other answers to that question: the private sector could do it, or a public corporation not associated with any particular institution could.

If you hang out near universities for any length of time in Australia, for instance, you’ll see plenty of private-sector solutions for student housing. Companies build cheap, small-ish dorms (50-100 occupants) near universities, and rent them to students.  Which university?  Doesn’t matter.  So long as you’re a student, they’ll rent you a small single apartment.  It’s nothing special – IKEA to the max – but for someone looking for something cheap and full of fellow students, it makes a lot of sense.

(Some – but by no means all – of the companies building and operating student residences seem to be international in scope. I met and chatted with representatives of one such company at NAFSA in Denver earlier this year, and for the life of me I cannot figure out how this makes sense. Every country has its own laws and building codes, so where would economies of scale accrue? Still, apparently someone thinks there’s money to be made in this business.)

In Europe, however, there is a different approach: national student services organizations. In Germany, the Deutsches Studentenwerk (DSW), a government-funded non-profit, is responsible for a network of student residences scattered across the country, in addition to canteens and counseling services at a number of universities. In France, the similarly organized CNOUS is responsible for residences alongside things like student exchanges. DSW also provides certain other services for students, like legal aid and help in obtaining student financial assistance (DSW in some ways seems to think of itself at least partially as a protector/champion of student rights, somewhat in opposition to the universities and government). The important thing here is that residence is not tied to enrolment in a particular university. At a given residence in Munich or Berlin, you might find students from any of a number of local institutions.

Now, you can see some real advantages to this. First, an organization that specializes in student services might be more effective at delivering them than one that sometimes views them as tangential to the “main” mission (say, your average research university). Second, it might be cheaper and more efficient. In Montreal, why duplicate housing services across UQAM, McGill and Concordia, all of which are within five stops of each other on the city’s Green Line? Why not stick them all together?

But what’s possible in Europe isn’t always possible in North America. Until fairly recently, European universities were much more creatures of government than North American ones ever were – having a different state agency take over part of the system was no big deal (hell, the Max Planck Society and CNRS took over most of the research mission, so why not student services?). In North America, institutions (some of them, anyway) think of student services as part of their value proposition, an element on which they compete with other institutions. Until very recently, that notion of inter-institutional “competition” was mostly absent from the European way of thinking.

In truth, the real reason most universities in North America would be loath to take the European route is that residences matter for alumni donations. The research on this is pretty clear: the shine people take to their university is closely related to the strength of the attachments they form while there. And though residences aren’t the only place students form attachments, they’re high up the list. Take residences out of the institutional experience, and your appeal to alumni is going to be a lot weaker.

But it’s an interesting model to ponder nonetheless. It makes you think about where the “boundaries” of a university really lie.

March 10

A New Focus for Student Unions

It’s that time of year when student elections are on, and occasionally I get asked questions like “what’s the future of student unions?” and “what could student unions be doing better?” These are good questions. Here’s my answer.

For the most part, student union budgets go into providing “services”.  Often, an awful lot of this ends up simply paying for light, heat and maintenance of student union buildings.  Big chunks also go to managing and overseeing the vast number of clubs.  This is irritating, nitpicky stuff, but it’s what most students actually find useful about student unions, so it’s probably money well-spent.

As a result, a fairly small proportion goes into actually representing students, which is odd, since in theory that’s the whole purpose of student associations. And the majority of that money goes to the external/government relations portfolio, to talk to government. Only a tiny fraction of student union funds goes to representing students inside their own institutions.

And yet representing students to their administration is the one thing students can’t count on anyone else to do for them.  Lower tuition?  Any number of outside groups can argue for that.  Actually changing things inside an institution to improve the standard of education?  Only students can do that.

The number one issue for most students today is the fear of not getting a good job after graduation. There’s not a whole lot student unions can do about that directly, but what they can do is put a lot more pressure on institutions to make sure graduates are as well-prepared as possible. They can push institutions to deliver experiential learning. They can push administrations to look at how to display co-curricular records on transcripts. They can push faculty and departmental units to engage more with the labour market and adjust teaching and assessment accordingly. These are all things that student organizations could do (but usually don’t) in a co-ordinated, effective, and meaningful way.

One of the reasons student leaders don’t focus on this area is that victories – when they occur – are so slow in coming. It’s a rare student politician who can push a change in academic process or planning and expect to see a positive decision within the one-year lifespan of his or her career as an executive. Student unions, by nature, are after quick hits. But this is where provincial and national organizations like the Canadian Alliance of Student Associations and the Canadian Federation of Students can play a role: student leaders there have slightly longer tenures, and are thus able to focus on longer-term issues. For instance, in the UK, the National Union of Students puts a lot of work into helping student associations engage with quality assurance within their institutions. That’s somewhat easier to do in the UK than here, because quality assurance processes are a lot more transparent, public and standardized over there. But it’s not impossible to imagine it happening here.

Imagine local student unions spending time engaging their members to find out what kinds of outcomes they want from their time in university.  Imagine them spending time translating that into real policy options within the institution.  Imagine national student organizations spending time training people at the local level, teaching them how to understand university administrative and political structures, how to talk “Senate-ese”, and how to be effective champions of curricular change.  Imagine local student organizations putting time and effort into making sure that every student on every periodic review knew how to advocate effectively for change during the review process.

(Actually, if they were smart, universities themselves would get on this effort: increasing the number of students who can make intelligent contributions to university governance activities can really only be to the good).

To sum up: Canadian students have talked for years about access.  Less frequently have they really faced up to the question: access to what?  It’s past time they engaged more on this question, and just as importantly, empowered their members to act effectively in this area.

September 03

One Lens for Viewing “Administrative Bloat”

The Globe’s Gary Mason wrote an interesting article yesterday about the Gupta resignation. Actually, let me qualify: he wrote a very odd article, which ignored basically everything his Globe colleagues Simona Chiose and Frances Bula had reported the previous week, in order to peddle a tale in which the UBC Board fired Gupta for wanting to reduce administrative costs. This, frankly, sounds insane. But Mason’s article did include some very striking statistics on the growth of administrative staff at UBC over the past few years – such as the fact that, between 2009-10 and 2014-15, professional administrative staff numbers increased by 737, while academic staff numbers increased by only 28. Eye-opening stuff.

And so, this seems as good a time as any to start sharing some of the institution-by-institution statistics on administrative & support (A&S) staff I’ve been putting together, which I think you will find kind of interesting. But before I do that, I want to show you some relevant national-level data. Not on actual staff numbers, mind you – that data doesn’t exist nationally. However, through the annual CAUBO/Statscan Financial Information of Universities and Colleges (FIUC) survey, we can track how much universities pay staff in various functions. And that gives us a way to look at where, within the university, administrative growth is occurring.

FIUC tracks both “academic” salaries and “other” (i.e. A&S) salaries across seven categories: “Instruction & Non-Sponsored Research” (i.e. at the faculty level); “Non-Credit Instruction” (i.e. cont. ed); “Library, Computing, and Communications”; “Physical Plant”; “Student Services”; “External Relations” (i.e. government relations plus advancement); and “Administration” (i.e. central administration). Figure 1 shows the distribution of A&S salary expenditures across these categories for 2013-14. A little over 32% of the total is spent at the faculty level, while another 23% is spent in central administration. Physical plant and student services account for about 11% apiece, while the remaining three areas account for 18% combined.
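
Since the figure itself isn’t reproduced here, here is a minimal sketch of how those shares would be derived from the FIUC salary totals. The dollar figures below are purely illustrative placeholders, not the actual 2013-14 FIUC data:

```python
# Purely illustrative: placeholder A&S salary totals by FIUC function (in $000s).
# These are NOT the real 2013-14 FIUC figures; they only show how the shares
# in Figure 1 are computed from reported totals.
as_salaries = {
    "Instruction & Non-Sponsored Research": 3200,
    "Non-Credit Instruction": 250,
    "Library, Computing, and Communications": 1000,
    "Physical Plant": 1100,
    "Student Services": 1100,
    "External Relations": 550,
    "Administration": 2300,
}

total = sum(as_salaries.values())
for function, amount in as_salaries.items():
    print(f"{function}: {100 * amount / total:.1f}% of A&S salary spending")
```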

Figure 1: Distribution of A&S Salaries by Function, in 000s of Dollars, Canada, 2013-14

A zoom-in on the figures for central administration is warranted, as there has been some definitional change over time, which makes time-series analysis a bit tricky. Back in 1998, the reporting rules were changed in a way that increased reported costs by about 30%. Then, in 2003, about 15% of this category was hacked off to create a new category, “external relations” – presumably because institutions wanted to draw a distinction between the bits of central administration that increase revenues and those that consume them. Figure 2 shows how that looks over time.
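
If you want a continuous series across that kind of break, one standard fix is to splice: scale the pre-1998 observations up by the size of the reporting jump (roughly 30% here) so they are comparable with later years. Here is a minimal sketch, using made-up yearly totals rather than the actual FIUC numbers:

```python
# Splicing a series across a definitional break (illustrative values only).
# The 1998 reporting change raised reported central-administration costs by ~30%,
# so pre-1998 observations are scaled up to put the whole series on the new basis.

BREAK_YEAR = 1998
SPLICE_FACTOR = 1.30  # assumed size of the 1998 reporting jump

# year -> reported central-administration A&S salaries ($000s, made up)
reported = {1996: 900_000, 1997: 930_000, 1998: 1_250_000, 1999: 1_290_000}

spliced = {
    year: value * SPLICE_FACTOR if year < BREAK_YEAR else value
    for year, value in reported.items()
}

for year in sorted(spliced):
    print(year, f"{spliced[year]:,.0f}")
```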

Figure 2: Expenditure on Administrative & Support Salaries in Central Administration, in 000s of 2014 Real Dollars, Canada

Long story short: from the 80s through to the mid-90s, administrative & support salaries in central administration rose by a little over 3% per year in real terms. Then, briefly, they fell for a couple of years, before resuming an upward trend. Ignoring the one-time upward re-adjustment, aggregate A&S salaries in these two categories (central administration plus external relations) have been rising at 5.3% per year, after inflation, since 1999. Which is, you know, a lot.
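
For anyone who wants to check that kind of figure, the annualized growth rate is just a compound growth rate calculated on inflation-adjusted totals. A quick sketch, with placeholder start and end values rather than the actual FIUC series:

```python
# Compound annual growth rate on inflation-adjusted (real) dollars.
# The start/end values below are placeholders, not the actual FIUC totals.

def annualized_growth(start_value: float, end_value: float, years: int) -> float:
    """Constant yearly growth rate taking start_value to end_value over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

# e.g. a real salary mass growing from $2.0B in 1999 to $4.3B in 2014
rate = annualized_growth(2.0e9, 4.3e9, 2014 - 1999)
print(f"{rate:.1%} per year")  # about 5.2% per year in this made-up example
```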

Now, let’s look at what’s been going on across the university as a whole. Figure 3 shows changes in total A&S salary paid over time, relative to a 1979 base. For this graph, I dropped the “non-credit” category (because it’s trivial); for central admin, I’ve both combined it with “external relations” and corrected for the 1998 definitional change. Also, for reference, I’ve included two dotted lines, which represent change in student numbers (in red) and change in total academic salary mass (in yellow).

Figure 3: Change in Real Total Academic & Support Salary Spending (1979-80 = 100) by Function, Canada

Since 1979, student FTEs rose 120%, while academic salary mass doubled, after inflation. A&S spending in libraries and physical plant rose by considerably less than this – by 27% and 57%, respectively. A&S spending on “instruction” (that is, faculty & departmental offices) rose almost exactly in tandem with student numbers. Spending on A&S salaries in central admin and in ICT rose about twice as fast as that, ending the 35-year period at roughly three-and-a-half times their original level. But the really huge increases occurred in student services, where expenditures on A&S salaries are now six times as high as they were in 1979.

Over the next couple of weeks, I’ll be able to supplement this picture with institutional data, but the key take-aways for now are as follows: i) “central administration” salaries are growing substantially faster than enrolment and academic salary mass, but they represent less than a quarter of total A&S spending; ii) the largest component of A&S spending – that is, salaries of staff reporting to academic deans – is actually growing exactly on pace with enrolment; and iii) the fastest-growing component of A&S spending is student services. So there has been a shift in A&S spending, but it’s not entirely to the bad, unless you’ve got a thing against student services.

More next week.

March 13

The “Standard Model” in Aboriginal Services

One of the things I’ve noticed about services provided to Aboriginal students in Canadian PSE is that somehow, Canadian institutions have all arrived at essentially the same model.  Here it is:

The recruitment function: If you’re going to recruit on-reserve, you need someone to visit reserves.  Repeatedly. First Nations students aren’t going to make a multi-year commitment to you unless you visit them, look them in the eye and tell them “you can succeed with us and we’ll do what we can to help you do so.”

The “communicating with band offices” function:  Someone has to fill in the progress reports to bands so that students can continue to receive PSSSP support.  Occasionally, someone also has to dun the bands so they’ll actually pay the PSSSP monies owing.

The counseling function:  Where Aboriginal students are mostly from urban areas, this is pretty basic: someone who can do a bit of academic and personal counseling, perhaps linked with some academic and career support as well.  But where you have large numbers of students from fly-in communities, this function becomes much more about healing and dealing with extreme trauma (in some institutions it’s relatively common to hear of students interrupting their term because of the death or suicide of a family member).  These students also have serious issues regarding adjustment to urban life.  Few have ever paid rent, many have never taken a bus – overall, the transition is overwhelming. Counseling support for these students is actually seriously underfunded.

The academic support function: There are a number of institutions that have created specialized academic support for Aboriginal students. In some cases, it’s to bring kids from communities with weak secondary schools – again, mostly fly-in communities – up to a grade 12 level (it would be better to have specialized bridge programs, but PSSSP unfortunately doesn’t fund those).  In others, it’s about providing extra support for students going into professional programs (e.g., the University of Manitoba’s ACCESS program).

The social function: This involves programming Aboriginal activities – bringing elders and other speakers to campus, arranging feasts and pow-wows. Indirectly, this is about persistence – since these events attract Aboriginal students who might not come forward to ask for services, they are a means to identify clients for future assistance.

And finally, there’s space – a separate place for Aboriginal students to congregate. Sometimes this is done brilliantly (FNUC, UVic) and sometimes it’s abysmal (Lakehead).  Put all this together, and you have the model suite of Aboriginal student services.

Does it work?  There’s not much good evaluative research, though parts of the model were validated through the LE,NONET Project. But it’s what knowledgeable front-line workers tend to recommend, which is a recommendation in itself.

November 18

Can You Build Your Way to Happiness?

With a half dozen universities currently planning upgrades to their athletics facilities, it’s worth asking the question: what’s the impact of these things on student satisfaction?

(Yes…we know…satisfaction isn’t everything. But it’s not nothing, either. And it has the singular value of being measurable, so…onwards!)

We have two recent case studies here. In 2009, Queen’s completed a new $230 million athletics complex, while in 2010, Trent completed an $18 million renovation of its own athletics building. What kind of effect did these projects have on satisfaction?

On our nine-point satisfaction scale, Queen’s saw a 3.4-point jump in satisfaction with athletics facilities after completion of the new building; Trent saw a 2.3-point bump after its renovations were done. Clearly, it’s not dollars alone that push satisfaction – Trent got 0.126 points of satisfaction per million dollars spent, while Queen’s got only 0.015, a difference of nearly an order of magnitude.
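
The per-dollar comparison is just the satisfaction bump divided by the project cost. A quick sketch using the rounded figures quoted above (the published ratios presumably reflect unrounded inputs, so the numbers here come out very slightly differently):

```python
# Satisfaction gain per million dollars spent, using the rounded figures above.
projects = {
    "Queen's": {"cost_millions": 230, "satisfaction_bump": 3.4},
    "Trent": {"cost_millions": 18, "satisfaction_bump": 2.3},
}

for school, p in projects.items():
    ratio = p["satisfaction_bump"] / p["cost_millions"]
    print(f"{school}: {ratio:.3f} satisfaction points per $1M")
# Queen's: ~0.015 points per $1M; Trent: ~0.128 points per $1M
```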

But that’s just satisfaction with facilities. What about overall satisfaction with recreational and athletic programs themselves? It turns out these see a bump, too, but it’s not as large: the bump is about 1.7 (out of 9) at Queen’s and 1.3 at Trent.

Let’s take this still further. Satisfaction with athletic buildings and facilities is one of a number of buildings-and-facilities questions we ask. How much of that satisfaction “flows through” to overall satisfaction with buildings and facilities?

Answer: Not much. While both universities see an increase in overall satisfaction with buildings and facilities, the increase at Queen’s is small (about 0.22) and not out of line with the increase it saw the previous year. Trent does have an anomalous bump of 0.30, which is more than one would expect from statistical noise.

Finally, let’s ask the big question – do these investments have a clear impact on overall satisfaction with the educational experience at these schools?

Answer: No – or, at least, not enough to stand out amidst all of the other factors that affect students’ satisfaction from year to year. Both schools actually saw small decreases in overall satisfaction in the years the projects were completed.

In sum, it doesn’t seem like you can build your way to student satisfaction: students can’t be bought quite that easily. It would be interesting to have a counterfactual to Queen’s, in order to find out what happens if you stick with an old, run-down athletics building and spend $230 million on decreasing class sizes or improving pedagogy instead. Our guess is that the effect would be much more dramatic.

Maybe one day we’ll get a chance to try that out.

Alex Usher and Jason Rogers