On Saturday morning, the US Department of Education released the College Scorecard. What the heck is the College Scorecard, you ask? And why did they release it on a Saturday morning? Well, I have no earthly idea about the latter, but as for the former: it’s a bit of a long story.
You might remember that a little over a year ago, President Obama came up with the idea for the US Government to “rate” colleges on things like affordability, graduation rates, graduate earnings, and the like. The thinking was that this kind of transparency would punish institutions that provided genuinely bad value for money by exposing said poor value to the market, while at the same time encouraging all institutions to become more attentive to costs and outcomes.
The problem with the original idea was three-fold. First, no one was certain that the quality of available data was good enough. Second, the idea of using the same set of ratings both for quality improvement and to enforce minimum standards was always a bit dicey. And third, the politics of the whole thing were atrocious – the idea that a government might declare that institution X is better than institution Y was a recipe for angry alumni pretty much everywhere.
So back in July, the Administration gave up on the idea of rating institutions (though it had been quietly backing away from it for months); however, it didn’t give up on the idea of collecting and disseminating the data. Thus, on Saturday, what it released instead was a “scorecard”: a way to look up data on every institution without actually rating those institutions. But also – and this is what had nerds in datagasm over the weekend – it released all of the data (click “download all data” here). Several hundred different fields’ worth. For 20 years. It’s totally unbelievable.
Some of the data, being contextual, is pretty picayune: want to know which institution has the most students who die within four years of starting school? It’s there (three separate branches of a private mechanics school called Universal Technical Institute). But other bits of the data are pretty revealing. School with the highest average family income? (Trinity College, Connecticut.) With the lowest percentage of former students earning over $25,000 eight years after graduation? (Emma’s Beauty Academy in Mayaguez, PR.) With the highest default rates? (Seven different institutions – six private, one public – have 100% default rates.)
Now, two big caveats about this data. The first is that institutional-level data isn’t, in most cases, all that helpful (graduate incomes are more a function of field of study than institution, for instance). The second caveat is that the information about former students and earnings relates only to student aid recipients (it’s a political/legal thing – basically, the government could look up the post-graduation earnings of students who received aid, but not of students who funded themselves). The government plans to rectify that first caveat ahead of next year’s release; but you’d better believe that institutions will fight to their dying breath over that second caveat, because nothing scares them more than transparency. As a result, while lots of the data is fun to look at, it’s not exactly the kind of stuff with which students should necessarily make decisions (a point made with great aplomb by the University of Wisconsin’s Sara Goldrick-Rab).
Caveats aside, this data release is an enormous deal. It completely raises the bar for institutional transparency, not just in the United States but everywhere in the world. Canadian governments should take a good long look at what America just did, and ask themselves why they can’t do the same thing.
No… scratch that. We ALL need to ask governments why they can’t do this. And we shouldn’t accept any answers about technical difficulties. The simple fact is that it’s a lack of political will, an unwillingness to confront the obscurantist self-interest of institutions.
But as of Saturday, that’s not good enough anymore. We all deserve better.