Category Archives: Students and teaching

Does attending a prestige research university reduce earnings?

The latest HILDA statistical report has a finding that the AFR says:

threatens to undermine the prevailing view that it’s worthwhile for students to sweat out a high ATAR score in year 12 so they make it into one of the elite Group of Eight universities.

What the report says is that Group of Eight graduates in their sample earn less than graduates who went to universities in the Australian Technology Network or Innovative Research Universities groups.

In my view the finding is not so much wrong as misleading. We did similar research at Grattan last year, using the same HILDA data source. We found a salary premium for the Group of Eight and Australian Technology Network universities of about 6 per cent, after controlling for various personal attributes and, most importantly, discipline.
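For readers who want to see the mechanics, below is a minimal sketch of the kind of regression this involves. It is illustrative only: the data file and variable names (log_salary, uni_group, discipline and so on) are hypothetical, not the actual HILDA or Grattan variables.

```python
# Illustrative sketch of a graduate salary regression with controls for
# discipline and personal attributes. Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("graduate_salaries.csv")  # hypothetical extract of graduate records

model = smf.ols(
    "log_salary ~ C(uni_group) + C(discipline) + age + I(age**2) + female",
    data=df,
).fit()

# With log salary as the outcome, a coefficient of roughly 0.06 on a Group of
# Eight dummy corresponds to a salary premium of about 6 per cent, holding
# discipline and the other controls constant.
print(model.params.filter(like="uni_group"))
```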

I believe that the main reason the HILDA report gets this result is that Group of Eight enrolments skew towards relatively low-paid disciplines such as arts and, to a lesser extent, science. As can be seen in the chart below, these tend to have lifetime earnings in the lower half of the income distribution.

[Chart: median lifetime earnings premium of a bachelor degree, by discipline]
Source: Grattan analysis of ABS Census 2011 data. It shows the median lifetime private financial benefit of holding a bachelor degree in the stated discipline, compared to the median person of the same gender who finished their education at Year 12.

Just taking 2013 completions, 31 per cent of Group of Eight completions were in the ‘society and culture’ field, which includes the humanities, compared to 23 per cent of IRU completions and 14.5 per cent of ATN completions. For science, the shares were 16 per cent of Group of Eight completions but only 8 per cent of IRU and 6 per cent of ATN completions.

All this said, past research that does control for discipline has consistently found that the financial advantage of attending a Group of Eight university is less than we might expect, given their relatively high prestige. Despite some anecdotal evidence to the contrary, employers of new graduates don’t seem to pay a Group of Eight salary premium. And Grattan’s work using HILDA data found only a fairly small advantage in the long run. As we did not control for prior ability, even that small advantage might be overstated, since it partly reflects the higher prior academic ability, as measured by school results, of the students who attend Group of Eight universities.

If a prospective student wants to maximise their income, the key advice is that what they study matters more than where they study it.

But for Group of Eight universities to come out worse in the long run within the same disciplines, they would have to be doing significantly worse at adding human capital during their courses, bad enough to cancel out the positive effects of higher prior ability and whatever proxy value their brands have in the labour market. That seems pretty unlikely. I think that if the analysis were done again controlling for discipline, we’d see a finding more consistent with theory and past research: a small Group of Eight advantage.

The science bubble finally shows signs of deflating

Since 2009, demand for science courses has been growing strongly. This is leading to a serious labour market over-supply, and so I believe there needs to be a correction. Unfortunately, the information flowing to prospective science students contains a lot of misleading signals from STEM boosters, who persist in high-profile, but in my view incorrect, claims that this area of education needs encouragement.

The February 2015 applications statistics show some tentative signs that the market is adjusting to science realities. Through the tertiary admissions centres, there has been a 3 per cent decline in science applications. The real drop is probably larger than this, as I understand that a course reclassification which caused a big fall in environmental studies applications should have boosted science. Against this trend, however, there has been a 5 per cent increase in direct applications for science courses (among non-Year 12 applicants, there is a trend towards applying directly rather than through tertiary admissions centres).

[Chart: February 2015 application demand by field]

Setting aside the environmental studies reclassification, the biggest drop in demand has been for education courses, down 9 per cent through tertiary admissions centres. This is probably because, with a clear occupational outcome in teaching, negative labour market information is transmitted much more effectively. Although this is a sensible adjustment, Kim Carr will still be on the warpath about nearly 900 education course offers to applicants with ATARs below 50.

Total applicants (as opposed to applications) are up 1 per cent on 2014, indicating a market that is essentially stable.

IT, engineering and economics students least satisfied with teaching quality

Yesterday I reported data from the 2014 University Experience Survey suggesting that students at non-university higher education providers were, on most indicators, more satisfied with their education than students at universities.

There are also significant differences between disciplines in satisfaction with teaching quality, as seen in the chart below. I have taken out disciplines with fewer than 1,000 respondents in the UES, as well as most ‘other’ disciplinary categories as too vague. This removed both the discipline with the highest satisfaction (language and literature, 89 per cent) and the one with the lowest (mechanical engineering, 72 per cent).

[Chart: satisfaction with teaching quality by discipline]

Most of the relatively low-satisfaction disciplines are popular with males and international students, who report lower overall satisfaction than females and domestic students. But I can’t tell from the available data which way the causation might be running – whether students in engineering, IT and commerce faculties are less satisfied at least partly because they tend to be male and/or from overseas, or whether males and international students are less satisfied because they are enrolled in engineering, IT and commerce faculties.

Non-uni higher education provider students more satisfied than uni students

The 2014 University Experience Survey results have been released in the last few days, and this time they included students from non-university higher education providers (NUHEPs). A total of 1,444 students from 15 NUHEPs completed the survey. Given that there are about 130 NUHEPs, the results aren’t conclusive, but they are interesting.

As can be seen in the chart below, NUHEP students were generally more satisfied with their educational experience than university students. Each of the categories combines multiple related questions to produce an overall satisfaction rate. For example, the teaching quality scale contains questions on whether teachers explained things clearly and gave helpful comments on work, whether assessment tasks challenged students to learn, and other similar topics.
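As a rough illustration of how a multi-item scale can be turned into a single rate, here is a small sketch. It is not the UES’s documented scoring method, and the item names and responses are invented.

```python
# Hedged sketch: one simple way several related items could be combined into a
# single satisfaction rate. Not the UES's actual scoring method; data invented.
teaching_quality_items = ["explained_clearly", "helpful_comments", "assessment_challenged"]

def scale_satisfaction(respondents, items):
    """Share of respondents who, on average, answered the scale's items positively (1 = positive)."""
    positive = 0
    for r in respondents:
        answered = [r[i] for i in items if r.get(i) is not None]
        if answered and sum(answered) / len(answered) >= 0.5:
            positive += 1
    return positive / len(respondents)

sample = [
    {"explained_clearly": 1, "helpful_comments": 1, "assessment_challenged": 0},
    {"explained_clearly": 0, "helpful_comments": 0, "assessment_challenged": 1},
]
print(f"{scale_satisfaction(sample, teaching_quality_items):.0%}")  # 50%
```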

[Chart: satisfaction by survey scale, NUHEP and university students]

The one area where university students are more satisfied than NUHEP students is ‘learning resources’, which includes questions about the quality of teaching spaces, library facilities, online learning materials, student spaces and common areas, and related topics. Possibly the big university campuses, with their economies of scale, do better on these things.

The positive responses on ‘learner engagement’ are noticeably lower for both groups. For example, only 53 per cent of students said they had a sense of belonging to their university. The recent first year experience survey picked up a negative trend in this area.

Although there is room for improvement in some areas, for most questions responses were more positive in 2014 than 2013. That supports the conclusion of the end-of-degree course experience questionnaire (trend data at p.76 of Mapping Australian higher education) that teaching quality in Australian universities is slowly but steadily improving.

What’s going on in the new graduate labour market?

Late last year the mainstream media picked up on the graduate un/under-employment story. At Grattan we have been doing a bit more work to see what is going on.

One of the things we wanted to look at was whether the poor employment outcomes were driven by more graduates, as students from the enrolment boom that began in 2009 finish their courses, or by a declining labour market, or both.

Completions data is published, but there is no published time series of the number of recent graduates with jobs. What we’ve done is apply the proportion of recent graduates with full-time jobs in the Graduate Destination Survey to the completions numbers. To the extent that the GDS is an imperfect sample, our numbers are likely to be a little wrong, but I doubt this will affect the trend.
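In code, the estimate is just a multiplication. The sketch below uses made-up numbers purely to show the calculation, not the actual GDS or completions figures.

```python
# Sketch of the estimation approach described above, with invented figures.
def estimate_ft_employed(completions: int, gds_ft_rate: float) -> float:
    """Estimated new graduates in full-time work = completions x GDS full-time employment rate."""
    return completions * gds_ft_rate

def yoy_change(current: float, previous: float) -> float:
    """Year-on-year percentage change, of the kind quoted below."""
    return (current - previous) / previous * 100

# Hypothetical illustration only (not the real data):
est_2012 = estimate_ft_employed(75_000, 0.75)
est_2013 = estimate_ft_employed(80_000, 0.65)
print(round(yoy_change(est_2013, est_2012), 1))  # -7.6
```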

As can be seen in the chart below, both supply and demand factors are affecting outcomes. The graduate labour market peaked in 2007, when nearly 61,000 new bachelor graduates found (or already had) full-time jobs. In 2013 and 2014, just over 52,000 new bachelor graduates had full-time jobs about four months after completing their degrees.

[Chart: estimated recent graduates in full-time employment and domestic bachelor completions]

There seem to have been two shocks to the graduate employment market. The first was the onset of the global financial crisis, which was felt most strongly by students completing in 2008, with a decline of 7 per cent in the number of graduate jobs on the previous year. Perhaps surprisingly, there was a slightly bigger shock in 2013, with a 7.6 per cent decline on 2012. One reason it was worse in 2013 is that the big health fields, which had been little affected by the 2009 downturn, declined significantly. This is consistent with fewer health occupations appearing on the skills shortage list (p. 68).

While graduate employment opportunities have trended down, the number of domestic bachelor degree completions has trended up, by 17 per cent between 2008 and 2014. Given there are still some big student cohorts enrolled in our universities, the number of completions will only increase in the next few years. Unfortunately, we cannot have the same confidence about full-time jobs for recent graduates.

Increases in low SES uni participation, 1991-2011

On the basis of trend data like that in the chart below, it is often said that we are making little progress in increasing higher education participation for people from low SES backgrounds.

[Chart: low SES students as a share of all domestic students over time]

The chart shows domestic low SES students as a percentage of all domestic students. But the denominator is important: it means that low SES enrolment has to increase more quickly than enrolment generally for the percentage to go up.

A more meaningful indicator is low SES enrolment as a percentage of the relevant low SES population. This tells us whether people from low SES backgrounds are becoming more likely to attend university over time.
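A tiny sketch makes the difference between the two indicators concrete. The numbers are invented; the point is only that the share of all students can stay flat while the participation rate of the low SES population rises, if overall enrolment grows just as fast.

```python
# Contrast of the two indicators discussed above, with invented figures.
def low_ses_share_of_students(low_ses_enrolled: int, all_enrolled: int) -> float:
    """Low SES students as a share of all domestic students (the chart's measure)."""
    return low_ses_enrolled / all_enrolled

def low_ses_participation_rate(low_ses_enrolled: int, low_ses_population: int) -> float:
    """Low SES enrolment as a share of the relevant low SES population."""
    return low_ses_enrolled / low_ses_population

# Hypothetical: enrolment doubles across the board; the low SES population is unchanged.
print(low_ses_share_of_students(15_000, 100_000))   # 0.15
print(low_ses_share_of_students(30_000, 200_000))   # still 0.15
print(low_ses_participation_rate(15_000, 120_000))  # 0.125
print(low_ses_participation_rate(30_000, 120_000))  # 0.25: participation has doubled
```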

An interesting paper out from the Group of Eight today (disclosure: drawing on some of my work from a few years back) shows how, for the late teenage children of low SES workers, university attendance has become more likely over time.

For example, in 1991, 16 per cent of the children of tradespeople were at university. Twenty years later that figure was 26 per cent. The gaps between SES groups remain very wide, but with participation growth slowing in the leading SES group, professionals, they are not as large as they were in the past.

[Chart: university attendance of 18-19 year olds by parental occupation, 1991-2011 censuses]

Note: The data is drawn from the census, using 18 and 19 year olds living at home; living at home is needed to determine parental occupation. According to the two latest censuses, about 80% of 18 year old university students and 70% of 19 year old university students are living with their parents.

Language background should be dropped as a higher ed equity category

At The Conversation, Tim Pitman has analysed enrolment changes for the official equity groups under the demand driven system.

He mentions in passing one equity group that survives on the list despite it not predicting educational disadvantage: coming from a non-English speaking background and arriving in Australia in the last decade.

Census data suggests that it is people from English-speaking backgrounds who lag in university attendance. Limiting the analysis to 18 to 20 year olds who are citizens (to avoid international students skewing the analysis), only people who speak Australian Indigenous languages at home have lower rates of university attendance than people who speak English at home.

[Chart: university attendance of 18-20 year old citizens by language spoken at home]

Narrowing the analysis to people who arrived in Australia between 2001 and 2011 does not change the broad picture: people speaking an African language at home have about the same rate of university attendance as people who speak English at home, while the other groups have higher, and often significantly higher, rates of attendance.

[Chart: university attendance of recent arrivals (2001-2011) by language spoken at home]

Speaking English at home is not, of course, in itself a disadvantage when it comes to going to university. Class, cultural and locational factors explain these differences. These factors are already covered by other equity categories, making language background redundant.

Update: In the comments below, Tim Pitman questions whether restricting the analysis to 18-20 year olds is enough to sustain the argument. I give reasons below why I think it is. However, to test this I have analysed 30-34 year olds. I don’t think these numbers are as good as those for the 18-20 year olds, as they are more affected by adult migration by people who already have degrees. There will also be some double counting of people who have a degree and are studying. But they are a guide. Here we do get one language group, Southwest and Central Asian (without double-checking the numbers, I am guessing mainly Arabs, Afghans and Turks), which has lower rates of educational attainment and participation. However, the differences aren’t large, and overall it is still very difficult to argue that speaking a language other than English at home is in itself associated with educational disadvantage.

[Chart: educational attainment and participation of 30-34 year olds by language spoken at home]

The case for including for-profit higher education providers in the demand driven system

Reaction to the report of the demand driven review, which I co-authored with David Kemp, has been pretty positive overall. But our proposal to extend Commonwealth supported places to non-university higher education providers, especially those operated on a for-profit basis, is attracting some negative comment.

Professor Greg Craven, vice-chancellor of Australian Catholic University, said:

There is a basic psychological difference between a statutory body (university) ploughing money back into the enterprise and a private college whose modus operandi is to make a profit.

Whether or not that is true, a higher education system needs to be robust to the weaknesses and variability of human motivations. Indeed, the public universities themselves are a case study in the limitations of a ‘just trust us’ model in higher education.

As the report discusses (pages 9-10 especially), the universities were for a long time, and still are to a lesser extent, able to get away with poor practices in teaching. This showed in the abysmal results of the first national student surveys conducted in the mid-1990s. Things have improved since then through a combination of public information, government programs and incentives, market competition, and more recently regulation.

The report recommends that all these measures apply to the non-university providers as well. Indeed, they have another layer of scrutiny that the universities lack, which is that their courses need to be individually approved by the Tertiary Education Quality and Standards Agency. It also recommends extending the University Experience Survey to the non-university providers, and publishing the results on a replacement for the MyUniversity website to make it easier for potential students to compare courses.

New data on the close link between SES and university attendance

I’ve criticised the government’s exclusive focus on attracting more university students from the lowest 25% of geographic areas, as measured by an index of education and occupation. I had found several data sources suggesting that educational achievement in the second-lowest quartile wasn’t much better than in the lowest quartile.

Today the ABS released an update to its online 2011 census package that lets us classify students according to their socioeconomic status ($$$ if you want access). I calculated university attendance rates for 20-24 year olds by SES decile, with decile one the lowest and decile ten the highest.

I think my general point stands: there are low rates of university attendance well above the lowest 25%. Someone in the 4th decile is well above the lowest 25%, but is still only about a third as likely to attend university as someone in the top 10%. Even removing early school leavers from the analysis, their chances of attending university are still less than half those of someone in the top 10%.* We need a re-investigation of the role that poor school results, as against other factors, play in this outcome.
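The relative-likelihood comparison above is simple arithmetic; the sketch below shows it with invented attendance rates rather than the actual census figures.

```python
# Illustration of the decile comparison, with hypothetical attendance rates
# (decile 1 lowest SES, decile 10 highest). Not the actual census numbers.
attendance_rate = {4: 0.20, 10: 0.60}

relative_likelihood = attendance_rate[4] / attendance_rate[10]
print(f"Decile 4 attendance is {relative_likelihood:.0%} of decile 10's")  # 33%
```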

[Chart: university attendance rates of 20-24 year olds by SES decile]

However, the data is less lumpy than I expected. There is an upper middle class at deciles nine and ten, with high rates of education and professional employment, which is quite different from the rest of the population. But below that, attendance rates slowly but steadily increase as people move up the SES spectrum, without the large and weakly-differentiated lowest 50% I expected from other sources.

* The decile differences are somewhat exaggerated due to students who move from low SES areas, especially in regional areas, to live near universities which are in high SES areas.

The complicated university teaching-research relationship

In The Age this morning, Don Aitken argues that university teaching has come off second best. ‘Today research, and only research, is really important,’ he says.

I certainly think that university teaching needs improving. But the story is not one of the decline of teaching and the rise of research, with one improving at the clear expense of the other.

Up until the Dawkins reforms of the late 1980s and early 1990s, more than half of higher education students attended colleges of advanced education or institutes of technology. Their mission was teaching rather than research, although some of their academics were doing research. The universities were teaching-research institutions, but with weaker research pressures than today. Most research funding was delivered as a block grant that was (unlike today) not linked to indicators of research performance.

If the teaching-focused colleges of advanced education and institutes of technology were good at teaching, we would expect their positive legacy to show when the first national student survey (the course experience questionnaire) was conducted in the mid-1990s. In reality, the CEQ showed generally dismal results. Across the country, the average positive response to six teaching-related questions was around one-third.

As the government started emphasising research performance in its funding policies, the apparent incentive was to focus on research over teaching. But this does not show in the trend data (the figure below). The time series was upset in 2010 in ways that exaggerate satisfaction compared to the past, but the steady upward trend in satisfaction cannot be disputed. (Some theories as to why are here.)

[Chart: good teaching scale (GTS) satisfaction trend]

A consistently calculated time series on research productivity only goes back to 1997. It shows steadily increasing productivity up to 2005, after which it stabilises at an average of 2.1-2.2 publications per full-time researcher per year (counting teaching-research staff as 0.4 of a full-time equivalent in research, in line with common time use expectations).
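To make the weighting explicit, here is a small worked sketch. The staff and publication counts are invented; only the 0.4 full-time-equivalent weighting comes from the text above.

```python
# Publications per research FTE, counting teaching-research staff at 0.4 FTE.
def publications_per_research_fte(publications: float,
                                  research_only_staff: float,
                                  teaching_research_staff: float) -> float:
    research_fte = research_only_staff + 0.4 * teaching_research_staff
    return publications / research_fte

# Hypothetical example: 5,200 publications, 500 research-only staff,
# 5,000 teaching-research staff -> 5,200 / (500 + 2,000) = 2.08 per FTE.
print(publications_per_research_fte(5_200, 500, 5_000))  # 2.08
```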

[Chart: publications per full-time equivalent researcher per year]

Rather than research rising at the expense of teaching, on these indicators they both rose together until the middle of last decade. In research, the focus has since shifted to quality – it’s still too early to put numbers on it, but at the same time as satisfaction with teaching continues to rise, universities are culling weaker researchers and focusing their investment in areas of relative research strength.

As well as it being difficult to find evidence over time of research coming at the expense of teaching, our recent Grattan research project failed to find much evidence that low-research departments are better at teaching than high-research departments, as measured by recent student surveys.

My view is that at the dawn of the Dawkins era universities were under-performing institutions, across both teaching and research. Research was further down the path of professionalisation and favoured in academic culture. But both teaching and research needed to improve a lot, and that is what we have seen.

Just removing research and making some universities ‘teaching only’ would not on its own make things better. Improved teaching needs concerted effort, whether or not it occurs in an institution that also produces research.