Re-posted from Campus Morning Mail with permission.
Last week I pledged to work for any Australian university for free if they abandoned league tables. So far no university has been bold enough to take up my offer.
I am one of many academics who have criticised league tables using science, demonstrating their many statistical flaws and negative consequences. But all this scientific criticism has had no impact. Instead, the number of league tables is growing and so is their influence. Hence my attempt to pierce the ballooning league tables with humour.
I wish I could raise the stakes by gradually adding willing colleagues to the deal. How many professors would it take for a university to bite? A whole statistics department? This people auction would tell us how much universities value actual brains over the nebulous value of a good ranking.
Any valuation universities put on league tables would be wrong, unless it was negative. League tables cost universities money in the time they spend trying to climb the rankings, and cause harm in the wonky decision-making they inspire. Universities would be better off taking their league-table budget and spending it on some actual tables where actual students could sit.
The Kool-Aid wisdom is that higher rankings in league tables bring in international students and hence vital earnings. I think this greatly underestimates the intelligence of potential students.
Ironically, I would hope that most students who complete a degree in the mathematical sciences would finish with the skills to see the numerical incontinence of university league tables. Students graduating in marketing might appreciate the wheeze. But both should understand that they have been duped.
One of many massive ironies of the league table racket is that when students have a bad experience at a “high ranking” university they might complete their satisfaction surveys as if everything was awesome. Why? Because their surveys are used in the rankings and they want to be able to show potential employers that they graduated from a high ranking university.
If there were a league table for Goodhart’s law, which warns that when a measure becomes a target it ceases to be a good measure, then university league tables would be ranked number one.
The cuckoo idea of league tables has made its way into every Australian nest. And our universities now squawk loudly about how they are the biggest and best cuckoo.
Worse still, the cuckoos are breeding. There’s now a league table for the United Nations’ Sustainable Development Goals. What a disaster for making real progress in that vitally important area, as I expect the field will soon see a rise in gaming and fraud aimed at climbing the rankings.
Can we overturn these bolted-down tables?
There are new league tables where universities won’t want to be top, such as Ben Goldacre’s TrialsTracker, which monitors funded clinical trials that have failed to report their results. It would be relatively easy to rank Australian universities on this indicator of research waste, and to expand it to cover all NHMRC and ARC funded research.
There’s been a recent boom in algorithms that can “read” published papers and accurately assess things like transparency and methodological rigour. It would be possible to examine all the papers produced by every Australian university and give each an overall assessment and ranking of its transparency and good research practice.
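To make the aggregation concrete, here is a minimal sketch in Python of how per-paper scores from such a screening algorithm could be rolled up into a university-level ranking. The universities, scores and field names are entirely made up for illustration; in practice the per-paper scores would come from the screening tool, not be typed in by hand.

```python
# Minimal sketch: roll up hypothetical per-paper transparency scores
# into a university-level ranking. All data here is invented.
from statistics import mean

papers = [
    {"university": "University A", "transparency": 0.62},
    {"university": "University A", "transparency": 0.71},
    {"university": "University B", "transparency": 0.55},
    {"university": "University B", "transparency": 0.48},
]

# Collect the per-paper scores for each university.
scores = {}
for paper in papers:
    scores.setdefault(paper["university"], []).append(paper["transparency"])

# Rank universities by their mean per-paper score, highest first.
ranking = sorted(
    ((uni, mean(vals)) for uni, vals in scores.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for rank, (uni, score) in enumerate(ranking, start=1):
    print(f"{rank}. {uni}: mean transparency {score:.2f}")
```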
But these automated tables would be vulnerable to universities making cosmetic changes without making real improvements. A more radical idea is to audit universities to truly assess their quality. For example, for every Australian university take a random sample of 100 papers published 10 years ago and examine them in detail to see whether the work has been translated into practice or had some other positive impact. That would give a true measure of research impact and something worth ranking.
It is impossible to game the scattergun of a random sample and universities would have to think about the overall quality of the research they produce.
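For the sampling step only (the detailed examination of each paper would still need human reviewers), a minimal sketch might look like the following; the paper identifiers and counts are hypothetical placeholders, and a fixed seed is used so the sample can be reproduced and checked.

```python
# Minimal sketch of drawing a reproducible random audit sample of papers
# per university. Paper identifiers and counts are hypothetical.
import random

def sample_for_audit(papers_by_university, n=100, seed=2014):
    """Draw a reproducible random sample of up to n papers per university."""
    rng = random.Random(seed)
    sample = {}
    for university, papers in papers_by_university.items():
        # If a university has fewer than n eligible papers, audit all of them.
        k = min(n, len(papers))
        sample[university] = rng.sample(papers, k)
    return sample

papers_by_university = {
    "University A": [f"paper_A_{i}" for i in range(2500)],
    "University B": [f"paper_B_{i}" for i in range(800)],
}

audit_sample = sample_for_audit(papers_by_university)
print({uni: len(papers) for uni, papers in audit_sample.items()})
```

Publishing the seed and the sampling code alongside the results would let anyone verify that the sample was not cherry-picked.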
League tables claim to rank quality, but their data and methods are deeply flawed. If we really want to know the quality of Australian universities then we need to spend some money to gather worthwhile independent data on what universities are really producing.