Higher Education Strategy Associates

August 18

Credit-Transfer, Korean Style

There are not many genuinely unique ideas in higher education.  Today, we at HESA are releasing a paper about one of them: Korea’s Academic Credit Bank System (ACBS), available here.

Korea has long had a problem with credit transfer.  Its higher education institutions are fairly rigid in terms of admissions, and few like accepting transfer students.  Another big problem is that Korean males have to do roughly two years (it depends on the service) of universal military service, and they tend to do it before they’ve finished their higher education.  People who picked up credits from a variety of institutions while on duty were usually out of luck when they tried to get those credits counted at their old institution.

So, in the 1990s, the Korean government started experimenting with new forms of educational credential.  Their first attempt was to create something called the Self-Study Bachelor’s Degree, in which Koreans could basically take a bunch of challenge exams on their own, and obtain a credential.  But the pick-up rate on this was fairly low (lessons here for people who thought MOOCs would create a tsunami of change, had they cared to check), and so the search for solutions was back on.

Eventually, they hit on the answer: if we can’t get existing institutions to agree on transfer credit, why not create a new institution that specializes in granting transfer credit?  In fact, why not create an institution that doesn’t offer any credit of its own, but just develops degree standards and tells people what courses to take in order to meet them?  No more schools telling you: “yeah, you’ve already taken that methods pre-req class, but you haven’t taken our methods pre-req class, so you’ll need to do that material again” – if the class gets accredited by the ACBS, you can use it as a building block to an ACBS degree.
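
To make the mechanics concrete, here is a toy sketch in Python of the credit-bank idea (every name, subject area, and credit count below is invented for illustration): the bank offers no courses of its own, just degree standards, and checks whether accredited credits from any institution add up to a credential.

    # Toy model of a credit bank: it offers no courses of its own, only
    # degree standards, plus a check that banked credits satisfy them.
    # Every name and number below is invented for illustration.

    STANDARDS = {
        # degree -> minimum credits required in each subject area
        "BA Social Work": {"core": 60, "electives": 40, "practicum": 20},
    }

    # A student's "bank account": accredited courses from any institution.
    transcript = [
        ("Intro to Social Work", "core", 30),   # taken at University A
        ("Case Management", "core", 30),        # taken at College B
        ("Psychology 101", "electives", 40),    # taken while on military duty
        ("Field Placement I", "practicum", 10),
    ]

    def credits_owing(degree, courses):
        """Credits still owing in each area (zero means requirement met)."""
        earned = {}
        for _title, area, n in courses:
            earned[area] = earned.get(area, 0) + n
        return {area: max(0, need - earned.get(area, 0))
                for area, need in STANDARDS[degree].items()}

    print(credits_owing("BA Social Work", transcript))
    # -> {'core': 0, 'electives': 0, 'practicum': 10}: ten credits short

The point of the model is that no single institution’s acceptance decision matters: any accredited course is interchangeable currency toward the standard.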

Does it work?  You might say that.  The ACBS now awards roughly 8% of all degrees in South Korea, second only to the Korea National Open University (KNOU).  It has curricula for over 200 individual fields of study, and increasingly the ACBS target audience is older and more female, with more and more degrees coming in areas like social work and early childhood education.

Would the ACBS work well outside Korea?  Hard to say: it’s not clear, for instance, that a Canadian ACBS would get a lot of clients.  Yes, the Canadian credit transfer system outside British Columbia and Alberta (which have quite developed transfer mechanisms) is less a “system” than a barrel full of ad hoc solutions.  But as I showed back here and here, this ad-hockery actually works out okay in practice.  So while a Credit Bank might be a much cleaner solution than the piecemeal approach Ontario has taken through ONCAT, it may also be overkill.

Where this approach might make even more sense is in the US, where, as Cliff Adelman and others have shown, there are millions of people with unfinished degrees who want to get a degree, but who are worried about the time and expense of having to re-start their studies.  The main barrier, really, is: how do you get something like that past accreditation agencies?  But given the potential benefits, it’s a problem worth thinking about.

August 11

Improving Career Services Offices

Over the last few years, what with the recession and all, there has been increased pressure on post-secondary institutions to ensure that their graduates get jobs.  Though that’s substantially the result of things like curriculum and one’s own personal characteristics, landing a job also depends on being able to get interviews and to do well in them.  That’s where Career Services Offices (CSOs) come in.

Today, HESA released a paper that looks at CSOs and their activities.  The study explores two questions.  The first deals specifically with university CSOs: what qualities and practices are associated with offices that receive high satisfaction ratings from their students?  The second deals with college career services – here we did not have any outcome measures like the Globe and Mail’s satisfaction surveys, so we focussed on a relatively simple question: how do their structure and offerings differ from what we see in the university sector?

Let’s deal with that second question first: college CSOs tend to be smaller and less sophisticated than those at universities of the same size.  At first glance, that seems paradoxical – these are career-focussed organizations, aren’t they?  But the reason for this is fairly straightforward: to a large extent, the responsibility for making connections between students and employers resides at the level of the individual program rather than with some central, non-academic service provider – a lot of what takes place in a CSO at universities takes place in the classroom at colleges.

Now, to universities, and the question: what makes for a good career services department?  To answer this, we interviewed CSO staff at high-, medium-, and low-performing institutions (as measured by the Globe and Mail’s pre-2012 student satisfaction surveys) to work out which practices distinguished the high-performers.  It turns out that the budget, staff size, and location of Career Services Offices aren’t really the issue.  What really matters are the following:

  • Use of Data.  Everybody collects data on their operations, but not everyone puts it to good use.  What distinguishes the very best CSOs is that they have an effective, regular feedback loop to make sure insights in the data are being used to modify the way services are delivered.
  • Teaching Job-seeking Skills.  Many CSOs view their mission as making as many links as possible between students and employers.  The very best-performing CSOs find ways to teach job search and interview skills to students, so that they can more effectively capitalize on any connections.
  • Better Outreach Within the Institution.  It’s easy to focus on making partnerships outside the institution.  The really successful CSOs also make partnerships inside the institution.  One of the key relationships to be nurtured is with academic staff.  Students, for better or for worse, view profs as frontline staff, and ask them lots of questions about things like jobs and careers.  At many institutions, profs simply aren’t prepared for questions like that, and don’t know how to respond.  The best CSOs take the time to reach out to academic staff and partner with them, to ensure they have tools at their disposal to answer those questions, and to direct students to the right resources at the CSO.

If you want better career services, there’s your recipe.  Bonne chance.

July 28

Coming Soon to Ontario: a British Columbia Solution

Unless you’ve been out of the country, or under a rock, for the last couple of years, you’ll be at least vaguely familiar with the concept that the province of Ontario is broke.  So broke, in fact, that it has departed radically from previous practice and, back in 2012, effectively froze physicians’ pay for two years.  Not individual physicians, of course – but on aggregate.  A zero overall increase.  And the government is now working to try to extend this freeze for another couple of years.

That makes good sense.  Health care costs have historically risen at several percentage points above inflation, and have squeezed other parts of the provincial budget (education has held out OK, all things considered, but other budget lines have been less well-protected), and physician costs are a major item in the health care budget.  And so saving money in this area is to be welcomed, even if it is at the expense of physicians – who are a politically tricky group to offend.

So what might the provincial government think of the post-secondary sector handing its employees raises of 3-4% per year, on aggregate?  Do you think aggrieved doctors won’t point to this anomaly during this pay round?  Can you imagine any possible rejoinder the government could offer that would make the least bit of sense?

Don’t you think maybe this situation is about to come to a rather sudden end?

For whatever reason, universities in Ontario have not been able to resist rising wage pressure from full-time faculty.  Despite money getting tighter, they have felt compelled to sign agreements that are plainly beyond their means (Hel-LO, University of Ottawa!).  Some university presidents, unable to deal with this problem on their own, are saying that they need government to step in to play “bad cop”.

What does the “bad cop” look like?  Well, take a look at how BC has handled wage negotiations over the last few years.  There, universities (and all other public sector entities) are given a biennial mandate with respect to employee negotiations.  In 2010 and 2011, the deal was: negotiate what you like, but the net cost of any new compensation deal cannot exceed zero.  In 2012 and 2013, that was changed to: there could be increases, but they had to be balanced dollar-for-dollar with efficiency savings in other areas.  The 2010-11 arrangements, in conception at least, are close to the approach Ontario has taken with its physicians.  And it worked pretty well, at least in the sense that it kept costs down with a minimum of political fuss.
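
For illustration only, here is roughly what a “net zero” mandate check amounts to (all dollar figures below are hypothetical, not actual BC numbers): every dollar of new compensation has to be offset by savings found within the deal itself.

    # Minimal sketch of a net-zero compensation mandate check.
    # All figures are hypothetical, for illustration only.

    def net_cost(increases, offsets):
        """Net annual cost of a proposed deal: new spending minus savings."""
        return sum(increases.values()) - sum(offsets.values())

    proposal = {
        "across-the-board salary increase": 2_400_000,
        "improved benefits": 300_000,
    }
    savings = {
        "reduced overtime provisions": 1_900_000,
        "changes to leave banking": 800_000,
    }

    cost = net_cost(proposal, savings)
    print("compliant" if cost <= 0 else f"offside by ${cost:,.0f}")
    # -> compliant ($2.7M in new compensation, $2.7M in offsets)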

Not everyone in Ontario agrees that the province needs to take the bad cop approach.  Quite a number of university presidents (you can probably guess which ones) oppose asking the province to legislate on their behalf; not only do they feel it morally incumbent on universities to manage their own books, they also think it is tactical suicide to ask governments to legislate – once you go down that road, there’s no telling how far governments will go in micromanaging universities’ affairs.

The problem is, those presidents haven’t exactly been too successful at reining in costs, either (hello again, University of Ottawa!).  So it seems almost inevitable that some sort of BC-like solution is on the cards.  The idea that profs could gain 15% in salary over four years, while physicians get zero, simply isn’t politically tenable.

July 21

University of Saskatchewan Detritus

We all remember this spring’s controversy at the University of Saskatchewan over the firing of Robert Buckingham, which resulted in the resignation of the University’s Provost, Brett Fairbairn, and the firing of the President, Ilene Busch-Vishniac.  Despite all the coverage, a number of key questions were never answered, like “how could anyone possibly think firing a tenured professor was a good idea?”  And, “whose idea was it to fire him anyway – the Provost’s or the President’s?”

We now have more insight, as Fairbairn recently released a five-page letter providing his perspective on events.  Two key points from his account:

  • The decision to fire Buckingham as Dean was a group decision.  The Provost, “leaders responsible for Faculty Relations, HR internal legal expertise and Communications”, and the President (by phone) were all present.  But the key question of whether to dismiss him from the university altogether was referred to HR for further study.  At this point Busch-Vishniac told Fairbairn: “I will stand behind any actions you deem necessary and will not second-guess”.
  • The decision to fire him from both jobs was the HR department’s recommendation.

How HR came to this conclusion isn’t clear; Fairbairn notes that it had happened before at U of S, in a case where there had been an irreparable breakdown in relations between employer and employee.  Without knowing the case to which he’s referring, it’s hard to know what to make of this.  Certainly, the employer-employee relationship with Buckingham as a dean was irreparably damaged (which is why they were correct to fire him as Dean); it’s not at all clear that he couldn’t have remained as a faculty member, since he wouldn’t have had any real contact with any of the superiors whose trust he had abused as Dean.  For whatever reason, Fairbairn decided to take the “expert advice” from HR, and did so without looping back to the communications people to get their input (which might have been valuable), or checking with Busch-Vishniac.

Far from backing up Fairbairn as promised, Busch-Vishniac threw him under the bus and asked for his resignation three days later.  That was emphatically the wrong call.  From the moment she gave the go-ahead for Buckingham’s dismissal, it was clear that either both of them would stay, or neither would.  Fairbairn decided to go, astutely noting that “the only thing worse than blame and recrimination among senior leaders is mutual recrimination among senior leaders”.

Fairbairn’s letter is a valuable peek into how crises get managed at universities.  I think it shows him as a manager with mostly the right instincts, but who erred in accepting some terrible advice from professionals who should have known better.  Others – mostly people who genuinely have no insight into how major organizations function – will probably see this distinction as irrelevant since the real crime was firing Buckingham as a Dean in the first place.  Former CAUT director James Turk, in particular, has made the “managers should have a right to criticise each other publicly” case – to which the correct response is: “and how much freedom did Turk allow his staff and executive to criticise his management as CAUT director?”.

If I were at the University of Saskatchewan, though, my main question after reading Fairbairn’s letter would be: “how is it that the HR department got off comparatively lightly?”  Food for thought.

July 14

Paul Cappon, Again

You may have noticed that Paul Cappon – former President of the Canadian Council of Learning – had a paper out last week about how what the country really needs is more federal leadership in education.  It is desperately thin.

Cappon starts by dubiously claiming that Canada is in some sort of education crisis.  Canada’s position as a world leader in PSE attainment is waved away thusly: “this assertion holds little practical value when the competencies of those participants are at the low end compared with other countries”.  In fact, PIAAC data shows that graduates of Canadian universities, on average, have literacy skills above the OECD average.  Cappon backs up his claim with a footnote referencing this Statscan piece, but said document does not reference post-secondary graduates.  Hmm.

And on it goes.  He cites a Conference Board paper putting a $24 billion price tag on skills shortages as evidence of “decline”, even though there’s no evidence that this figure is any worse than it has ever been (Bank of Canada data suggests that skills shortages are always with us, and were worse than at present for most of the aughts), or that a similar methodology would not produce even bigger figures in other countries.  He cites the low proportion of graduates in STEM fields like engineering as cause for concern, despite a total lack of evidence that this percentage has anything at all to do with economic growth.

Ridiculously, he proclaims that the Canadian education system (not higher education – all education from kindergarten onwards) is less effective than Switzerland’s because they beat us on some per capita measures of research output (which are, of course, largely funded through pockets entirely separate from the education budget).  Yes, really.  For someone so in favour of evidence-based policy, Cappon’s grasp on what actually constitutes evidence is shockingly tenuous.

Having cobbled together a pseudo-crisis, Cappon concludes that “the principal cause of our relative regression is that other countries are coordinating national efforts and establishing national goals… Canada, with no national office or ministry for education, is mired in inertia, each province and territory doing its best in relative isolation.”  Now, there is no evidence at all that the existence of national systems or national goals makes a damn bit of difference to PIAAC outcomes.  Germany comes in for all sorts of praise because of its co-operation between national and state governments, but its PIAAC and PISA outcomes are actually worse than Canada’s.  Yet Cappon believes – without a single shred of evidence – that if only there were some entity based in Ottawa that could exercise leadership in education, everything would be better.

Two points: first, our nation rests on a very simple compromise.  Back in 1864, the *only* way in which Catholic, Francophone Lower Canada could be tempted into agreeing to a federal government with representation by population was if that new level of government never, ever, ever got its hands on education.  That was, and is, the deal.  It’s not going to change.  Period.

Second, everyone needs to remember that there was a time, not long ago, when the Director-General of CMEC suggested exactly such a body to some deputy ministers in Ottawa, and Ottawa created a body not dissimilar to what Cappon is describing.  But said CMEC DG didn’t tell the provinces he was negotiating this deal with the feds.  When he was offered the leadership of the new organization, he jumped, taking several CMEC senior staff with him.  The provinces, feeling betrayed by the DG’s chicanery, refused to work with the new organization, and it achieved little before being shut down in 2011.

That DG, of course, was Paul Cappon.

This report is essentially a self-justification for a dishonest act performed nearly a decade ago, rather than a genuine contribution to public policy.  If Cappon actually cared about repairing federal-provincial relations around learning, a mea culpa would have been more appropriate.

July 07

How to Measure Teaching Quality

One of the main struggles with measuring performance in higher education – whether of departments, faculties, or institutions – is how to measure the quality of teaching.

Teaching does not go entirely unmeasured in higher education.  Individual courses are rated by students through course evaluation surveys, which occur at the end of each semester.  The results of these evaluations do have some bearing on hiring, pay, and promotion (though how much bearing varies significantly from place to place), but these data are never aggregated to allow comparisons of quality of instruction across departments or institutions.  That’s partly because faculty unions are wary about using individual professors’ performance data as an input for anything other than pay and promotion decisions, but it also suits the interests of the research-intensive universities who do not wish to see the creation of a metric that would put them at a disadvantage vis-a-vis their less-research-intensive brethren (which is also why course evaluations differ from one institution to the next).

Some people try to get around the comparability issue by asking students about teaching generally at their institution.  In European rankings (and Canada’s old Globe and Mail rankings), many of which have a survey component, students are simply asked questions about the quality of courses they are in.  This gets around the issue of using course evaluation data, but it doesn’t address a more fundamental problem, which is that a large proportion of academic staff essentially believes the whole process is inherently flawed because students are incapable of knowing quality teaching when they see it.  There is a bit of truth here: it has been established, for instance, that teachers who grade more leniently tend to get better course satisfaction scores.  But this is hardly a lethal argument.  Just control for average class grade before reporting the score.
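
As a minimal sketch of what that control could look like (illustrative numbers only, using numpy): fit satisfaction against average class grade, and report each course’s residual, i.e. how it rates once grading leniency is accounted for.

    # Grade-leniency adjustment, sketched with made-up data.
    import numpy as np

    # Each row: (mean class grade out of 100, mean satisfaction out of 5)
    courses = np.array([
        [72.0, 3.9],
        [81.0, 4.4],
        [65.0, 3.5],
        [77.0, 4.0],
        [88.0, 4.6],
    ])
    grades, satisfaction = courses[:, 0], courses[:, 1]

    # Least-squares line: satisfaction ~ a + b * grade
    b, a = np.polyfit(grades, satisfaction, 1)

    # The adjusted score is the residual: how much better (or worse) a
    # course was rated than its grading leniency alone would predict.
    adjusted = satisfaction - (a + b * grades)
    for (g, s), r in zip(courses, adjusted):
        print(f"grade {g:4.0f}   raw {s:.2f}   adjusted {r:+.2f}")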

It’s not as though there isn’t a broad consensus on what makes for good teaching.  Is the teacher clear about goals and expectations?  Does she or he communicate ideas effectively?  Is she or he available to students when needed?  Are students challenged to learn new material and apply this knowledge effectively?  Ask students those kinds of questions and you can get valid, comparable responses.  The results are more complicated to report than a simple satisfaction score, sure – but it’s not impossible to do so.  And because of that, it’s worth doing.

And even simple questions like “was this a good course?” might be more indicative than we think.  The typical push-back is: “but you can’t really judge effectiveness until years later”.  Well, OK – let’s test that proposition.  Why not ask students about a course they took a few years ago, and compare their answers with the ones they gave in the course evaluation at the time?  If the two are completely different, we can indeed start ignoring satisfaction-type questions.  But we might find that a good result today is in fact a pretty good proxy for results a few years out, and therefore that we would be perfectly justified in using it as a measure of teaching quality.
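
The test itself would be statistically trivial; something like this sketch (with hypothetical paired ratings) would do:

    # Do end-of-term ratings predict retrospective ratings years later?
    # The paired ratings below are hypothetical.
    import numpy as np

    end_of_term   = np.array([4.1, 3.2, 4.5, 2.8, 3.9, 4.4, 3.0, 3.6])
    retrospective = np.array([4.0, 3.5, 4.3, 2.5, 4.1, 4.2, 3.2, 3.4])

    r = np.corrcoef(end_of_term, retrospective)[0, 1]
    print(f"correlation: r = {r:.2f}")
    # A high r says today's satisfaction is a decent proxy for the
    # longer-run judgment; an r near zero supports the critics.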

Students may be inexperienced, but they’re not dumb.  We should keep that in mind when dismissing the results of teaching quality surveys.

June 30

The Effects of Tuition Fees (Part 2)

As I mentioned last week, a major paper I’ve been working on for over a year with colleagues from DZHW on the subject of the effects of fees was published last Monday by the EC (available here).  In my last post, I talked about how fees affected institutions – today, I want to talk about how they affect students.

In our report, we looked at case studies over 15 years (1995-2010) from nine countries – Austria, Canada, England, Finland, Germany, Hungary, Poland, Portugal, and South Korea.  These countries represent very different experiences.  Some have private higher education, others don’t.  Some have public sectors that charge fees, while others don’t (Hungary and Poland are in the middle: their public universities provide education free to some students while charging others).  And looking across all the different cases, we found the following:

1)      Most tuition increases have no perceptible effect on enrolment.  The only cases where clear-cut effects could be discerned were England in 2012, where the increase was about $10,000 in a single year, and the de-regulation of professional fees in Ontario in the mid-90s (where, bizarrely, low-income students were not affected, but middle-income students were).

2)      That said, there are also some clear-cut cases where tuition has been a driver of increased access.  In both Poland and South Korea, major increases in enrolments were driven by the existence of fee-supported places (mainly but not exclusively in private institutions).

3)      Though this is partly a matter of many countries having education data-sets that make Canada’s look enviable, there is very little evidence that changes in fees have done much to change the composition of the student body.  In every country where there is data, underrepresented groups have done better over time, regardless of the fee regime.  Even in the extreme case of England 2012, under-represented groups (the poorest income quintile, black and Asian students) tended to be less affected by the tuition increase than richer, whiter students. (The one exception here is older students, who were disproportionately affected by the changes.)  To the extent that late-entrants to higher education come from poorer backgrounds, this should be seen as a kind of hidden socio-economic effect of fees.

4)      Changes in tuition fees seem to have had no discernible effects on students’ choice of major, and few discernible effects on students’ decisions about where to study (in Canada, for instance, rates of out-of-province study are actually up over the last decade).

5)      It is not so much that fees themselves have no effect; rather, it is that in nearly all cases, fees are introduced with accompanying increases in student aid.  Sometimes the aid is paltry compared to the size of the fees required (e.g. South Korea in the 90s); sometimes it is implemented in a fairly clunky way (Germany, mid-2000s).  But it is always there to offset the worst effects of fees.  And in the case of England 2012, it was there to ensure that students weren’t required to pay a single extra penny of costs up-front, which seems to have been a major factor in limiting the impact of the world’s largest-ever tuition increase (in the short-term, at least).

The lesson here?  Unless you’re planning on going England-style crazy, the international evidence suggests fee increases are unlikely to affect access in a measurable way.

June 23

The Effects of Tuition Fees (Part 1)

For the last eighteen months or so, I’ve been working on a project with colleagues Dominic Orr and Johannes Wespel of the Deutsches Zentrum für Hochschul- und Wissenschaftsforschung (DZHW) for the European Commission, looking at the effects of changes in tuition fees and fee policies on institutions and students.  The Commission published the results on Friday, and I want to tell you a little bit about them – this week I’ll be telling you about the effects on institutions, and next week I’ll summarize the results with respect to students.

The first question we answered had to do with whether or not a rise in tuition ultimately benefits higher education institutions.  Critics of fees sometimes suggest that extra fees do not in fact result in institutions receiving more money, because governments simply pull a fast one on the public and withdraw public money from the system, thus leaving institutions no better off.  Our examination of nine case studies revealed there were certainly some occasions where this was the case – Canada in the mid-90s, Austria in 2001, and the UK in 2012 – but that in the majority of cases fee increases were accompanied by stable or increased government funding.  Moreover, in all the cases where there was an accompanying decrease in public funding, it was signalled well in advance by governments, and indeed the increase in fees was deliberately designed to be a replacement for public funds. We did not find a case where a government “pulled a fast one”.

The second question we asked was how universities reacted to the introduction of fees: did they suddenly start chasing money and becoming much more sensitive to the demands of students and donors?  The answer, by and large, was no, for three reasons.  First, tuition isn’t the only financial incentive on offer to institutions; particularly if they are already funded on a per-student basis, the introduction or increase of fees isn’t likely to change behaviour.  Second, institutions won’t go after fees in ways that they think will negatively affect their prestige.  In Germany, for instance, many universities have considerable latitude to raise income via teaching through continuing-education-like programs, but in practice they don’t do this, because they believe that engaging in that sort of activity isn’t prestige-enhancing.  And third, institutions often delay altering their behaviour because they don’t believe government policy will “stick”.  In Germany, specifically, the feeling was that the introduction of fees was unlikely to last, and so there was no point in getting too invested in attracting new students to take advantage of it.

In fact, although fees in public institutions are often touted as a way to make universities more flexible and more responsive to business, the labour force, etc., this never actually works in reality, because universities are saddled with enormous legacy costs (you can close a program, but you still have to pay the profs), and have a particular self-image that keeps them closely tied to traditional ways.  What does seem to work – at least to some degree – is to allow the emergence of new types of higher education institutions altogether.  In Poland, it was only the emergence of private universities that allowed the system to take on the explosion of demand in the 1990s.  In Finland, an entirely new type of higher education institution (ammattikorkeakoulu, or “polytechnics”) was developed to take care of applied education, and it has accounted for 80% of all enrolment growth since 1995.

Next week: the effects on students.  See you then.

June 16

Summer Reading

Hi all.  Enjoying summer yet?

Three recent works that I think are worth a peek at over the summer:

1.       George Fallis’ Rethinking Higher Education: Participation, Research and Differentiation.  The thing you need to know about George Fallis is that the size of the books he writes is all out of proportion to the point he is trying to make.  They’re good books, substantial books, useful books, but the actual point he makes could probably be made in an article of 15 pages or so.  And so it is here: this is a pretty good all-around 250-page look at the Ontario university system.  And at the end of the day he makes two original points: 1) demographic change means that there probably isn’t much growth left in the Ontario system, so we should stop framing policy in terms of access; and, 2) what Ontario really needs is a policy on university research and graduate studies, rather than allowing them to continue to grow haphazardly as afterthoughts to undergraduate enrolments.  I’m not entirely sold on the first proposition (there’s scope for growth from people switching from colleges to universities, and there’s scope to grow from continued international enrolments), but he’s absolutely bang-on with the second one.  As a recommendation for policy, it’s so far from current government practice that it’s basically in another time-zone, but it’s a point that needed to be made.

2.       Just released last week by The Council of Ministers of Education, Canada was a fascinating report entitled The Role of Education Agents in Canada’s Education Systems.  Authored by Robert Coffey and Leanne Perry of Michigan State University, the report provides everything from a survey of education providers about their use of education agents (less enlightening than you’d think), to an international comparison of rules and codes of conduct regarding the use of agents (Canada is apparently more of a wild west than, say, Australia or the UK), to a really top-notch discussion of why and how institutions use agents in the first place, and how misconduct occurs and is dealt with.  There’s little that’s earth-shattering in here, but as a primer on an increasingly important topic, it’s well worth a read.

3.       An edited volume of works on Australian higher education called The Dawkins Revolution 25 Years On (not available in Canada, but it can be ordered directly through Melbourne University Press).  Back in 1988, the country’s Education Minister, John Dawkins, brought in a series of changes (converting colleges into universities, introducing a limited amount of competition into funding, and introducing student charges via the Higher Education Contribution Scheme) that fundamentally restructured higher education in a way rarely seen anywhere in the world, let alone Australia.  This excellent book of essays from people like Simon Marginson, Bruce Chapman, Gavin Moodie, Julie Wells, and Andrew Norton both traces the development of the Dawkins agenda and cogently explains its impacts over the subsequent 25 years.  It’s the kind of book you read and wonder: “why can’t we write this kind of stuff in Canada?”  Part of the answer, of course, is that our system is much more fractured nationally and less amenable to single narratives than Australia’s; part of it, too, is that we’ve never had anything as interesting as Dawkins’s reforms to write about.  But even so, our attempts at similar essay collections (for example, Higher Education in Canada and A Challenge for Higher Education in Ontario) fall short of the standard set here.  This book deserves a wide readership.

June 12

Arigato, Sayonara

With election night, the World Cup getting under way, and tomorrow being Friday the 13th, it seems like as good a time as any to shut down the blog for the summer – I apologize to those of you who were hoping for one last rant on the election results (though you can probably catch my thoughts over on twitter, where my handle is @AlexUsherHESA).  Starting Monday, this blog will be on a once-a-week schedule until August 25th, when normal daily service will resume.

We’ve got an interesting program of research going on at HESA Towers over the summer, and we’ll have some very interesting material for you come the Fall, especially around internationalization.  I’ll be spending a bit of time Down Under to get the skinny on some of the big changes currently happening in higher education there, and will report the findings back to you.  And we’ll have some new work on affordability that I think you’ll rather enjoy (or at least, those of you who don’t hate-follow me will enjoy it).

But before I sign off, two very quick requests:

1)      My summer reading project is to read as many institutional histories as possible (thrill a minute at HESA Towers, never a dull moment).  If you know the name and author of your university or college’s history, could you send them my way so I can find a used copy online?

2)      You guys read my stuff every day (well, most days, anyway).  Are there subjects you want to hear more about?  Less?  If you can take a couple of minutes to let me know, I’d really appreciate the feedback.

Have a great summer, everyone.
