In The Australian this morning, Simon Marginson suggests that there is something wrong with the methodology in Graduate Winners.
He begins by accusing Grattan of starting with conclusions and backing them with ‘selective studies’ and cherry-picking data from other sources. My colleagues who did the empirical work for Graduate Winners are very unimpressed with this impugning of their professional integrity. We started with the public benefits claimed in the base funding review, and looked for whatever primary Australian data we could find. There was no cherry-picking and no selectivity – and nobody has come forward with anything Australian that we missed.
Particularly on the non-financial benefits, I can confidently say that nobody in Australia has ever analysed this issue as carefully and comprehensively as we did. Jim Savage’s work on this did not get the attention it deserved due to the political controversy over tuition subsidies, but his technical paper has to be the starting point for any future work in this area.
Marginson’s alternative methodology is to look at OECD comparisons, stating that ‘it is significant that the Grattan report carefully avoids both the method and content of the OECD. It would have us believe Australian higher education has nothing to learn from global comparisons.’ Given how often this point has come up, in hindsight perhaps I should have included a section on this subject. But I don’t think the OECD funding data in itself tells us much, other than that countries have very different mixes of state and private funding, and that these mixes are reflected in their higher education financing systems. A high-fee university system would not mesh well with Scandinavian tax rates. But it does fit with the lower tax rates of Australia, the US or Japan.
Similarly, I find Marginson’s claim that other countries report stronger relationships between education and social engagement uncompelling (we report this fact; he did not need to go to the OECD). American colleges and universities expressly inculcate civic values; Australian universities very rarely do so. The differences between the countries reflect the different histories of their higher education systems, not public funding levels.
If I had 15 minutes to prepare a debating case on higher education, I probably would turn to the OECD for some handy facts and figures. The key OECD document is called Education at a Glance for a reason. But if, as was the case, I had months and the help of colleagues to explore the Australian data and think through the conceptual issues, that is surely preferable. To my mind, the two big issues in higher education public funding are whether it causes significant additional public benefits (on top of those that would be derived from a market system), and whether fees have access implications. We focused on these big issues.
In any case, if we had used OECD data it would have tended to support our conclusion that the level of public funding is not the key variable in higher education systems. As I showed in an earlier post, there is no evidence that lower fees result in higher attainment. Indeed, the data suggest the reverse. One of my colleagues updated our analysis today with the latest Education at a Glance data, and (unsurprisingly) it shows again that high fees and high attainment tend to go together.
Even if OECD comparisons were a better methodology, they don’t always get Marginson where he wants to go.