Sierra Magazine has released its third annual “Cool Schools” list, which it calls a “comprehensive guide to the most eco-enlightened U.S. universities.” In principle, generating this kind of data is a worthwhile goal, since it is helpful for current and prospective students, staff and faculty to have good information. But after looking at what they have presented online, I’m left feeling distinctly unenlightened.
Let’s begin with the fact that the rankings are based on surveys sent to the schools. So, the first flaw is self-reporting. There may be no cost-effective way around this, since even the Association for the Advancement of Sustainability in Higher Education’s Sustainability Tracking, Assessment & Rating System (STARS) uses self-reporting. Nonetheless, self-reporting as the sole source of information means that the results should be read with a grain of salt.
Second, aside from mentioning that they “e-mailed a lengthy questionnaire to sustainability experts at hundreds of schools,” the report says nothing about the specific contents of the survey or the methods used to analyze the results. Without at least some transparency, how can readers evaluate Sierra Magazine’s (and for that matter, the schools’) claims? It’s not as if the magazine doesn’t have room online for these details. I guess we’re going to need some rock salt.
Next up is the fact that the “guide” appears to be nothing more than a table of numbers representing scores in each of eight areas of campus sustainability. It can hardly be called “comprehensive” when there is no qualitative data about the specifics of each school’s sustainability initiatives. To be fair, the web site does include brief reports from students at six of the schools in which they “cheer their college’s unique conservation efforts.”
OK, so what of the numbers? Yikes, someone needs to go back and take math over again… Third grade math that is. You know, addition. Have a look at this table of the magazine’s Honor Roll of top 20 schools.
Notice anything not quite right? Like the fact that the rows don’t add up to the “scores” in the last column? Let’s look more closely. The rankings have 8 categories scored out of 10 points, plus up to 5 bonus points, which I calculate (8×10 + 5 = 85) to mean a maximum possible score of 85. If we add up the top-ranked University of Colorado at Boulder’s numbers, we get a total score of 69, not 100.
How can this make sense? Maybe it’s a percentage. OK, 69 ÷ 85 ≈ 81%. Nope, I guess it’s not that (actually, I knew it wasn’t). What’s the explanation? All it says on the magazine’s web page is that “the top 20 were calibrated to create an out-of-100 rating system for publication in the magazine.” Oh, of course, but what the heck does “calibrated” mean? Did they put the scores on a curve? Doesn’t look like it. Um, pass the salt, please…
If we look at the complete listings page, where 135 schools are ranked, we get a somewhat more sensible picture. There, the rows do add up properly. And there, the University of Colorado at Boulder gets a real score of 69. Comparing that table to the honor roll table, we can discern that “calibrated” seems to mean the editors added about 30 points to each score. But why?
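For the record, the arithmetic above takes only a few lines to check. This is just a sketch: the raw total (69) and the published “calibrated” score (100) for the University of Colorado at Boulder are taken straight from the magazine’s two tables, and the category structure (8 categories of 10 points, 5 bonus points) is as described on its web page.

```python
# Figures quoted from Sierra Magazine's two tables (see above).
categories = 8
points_per_category = 10
bonus = 5

# Maximum possible raw score: 8 * 10 + 5 = 85.
max_score = categories * points_per_category + bonus

raw_total = 69    # CU Boulder, from the complete listings page
published = 100   # CU Boulder, from the Honor Roll table

# If "calibrated" meant a percentage, 69/85 would be about 81%, not 100.
percentage = raw_total / max_score

# The actual difference suggests a flat offset of roughly +30 points.
offset = published - raw_total

print(f"max possible: {max_score}")                     # 85
print(f"as a percentage: {percentage:.0%}")             # 81%
print(f"apparent 'calibration' offset: +{offset}")      # +31
```

Run against the other Honor Roll rows, the same flat offset (give or take a point of rounding) appears throughout, which is what makes a simple additive fudge the most plausible reading of “calibrated.”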
Why give a school that actually scored 81% a grade of 100% and an A+? Why take these top 20 and artificially inflate their scores to make it look as if they are exemplars of sustainability? I have nothing against the University of Colorado at Boulder, but I highly doubt that it has no room for improvement on the path to sustainability. Yet that’s exactly what a score of 100% or A+ suggests.
In my classes, students have to really earn their grades. What Sierra Magazine is doing is a kind of inexplicable grade inflation that distorts readers’ understanding and exaggerates progress toward sustainability in higher education. And that’s not particularly helpful.
If they submitted this to me, I would give them a do-over and insist they provide details of their methods and a clear explanation of their interpretation of the data and its significance. What, they didn’t think any academics would read this? Let’s hope next year’s report is an improvement. Even better, it’s not too late to supplement this year’s report.