Category Archives: Students and teaching

IT, engineering and economics students least satisfied with teaching quality

Yesterday I reported that data from the 2014 University Experience Survey suggested that students at non-university higher education providers were, on most indicators, more satisfied with their education than students at universities.

There are also significant differences between disciplines in satisfaction with teaching quality, as seen in the chart below. I have taken out disciplines with fewer than 1,000 respondents in the UES, as well as most ‘other’ disciplinary categories as too vague. This removed both the discipline with the highest satisfaction (language and literature, 89%) and the discipline with the lowest (mechanical engineering, 72%).

[Chart: satisfaction with teaching quality by discipline]

Most of the relatively low-satisfaction disciplines are popular with males and international students, who report lower overall satisfaction than females and domestic students. But I can’t tell from the available data which way the causation might be running – whether students in engineering, IT and commerce faculties are less satisfied at least partly because they tend to be male and/or from overseas, or whether males and international students are less satisfied because they are enrolled in engineering, IT and commerce faculties.

Non-uni higher education provider students more satisfied than uni students

The 2014 University Experience Survey results have been released in the last few days, and this time they included students from non-university higher education providers (NUHEPs). A total of 1,444 students from 15 NUHEPs completed the survey. Given that there are about 130 NUHEPs, the results aren’t conclusive, but they are interesting.

As can be seen in the chart below, NUHEP students were generally more satisfied with their educational experience than university students. Each of the categories combines multiple related questions to produce an overall satisfaction rate. For example, the teaching quality scale contains questions on whether teachers explained things clearly, whether they gave helpful comments on work, whether assessment tasks challenged students to learn, and other similar topics.
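
As a rough illustration of how a multi-item scale might be collapsed into a single satisfaction rate, here is a minimal sketch. The item names and the aggregation rule are assumptions for illustration only; the UES has its own documented methodology.

```python
# Hypothetical sketch only: collapse several related survey items into one
# "per cent satisfied" figure. Item names and the aggregation rule are
# illustrative assumptions, not the UES's actual method.

def scale_satisfaction(responses, items, positive_threshold=4):
    """Share of respondents whose average answer across the scale's items
    falls in the positive part of a 1-5 agreement scale."""
    satisfied = sum(
        1 for student in responses
        if sum(student[i] for i in items) / len(items) >= positive_threshold
    )
    return 100 * satisfied / len(responses)

teaching_quality_items = ["explained_clearly", "helpful_comments", "assessment_challenged"]
sample = [
    {"explained_clearly": 4, "helpful_comments": 5, "assessment_challenged": 4},
    {"explained_clearly": 2, "helpful_comments": 3, "assessment_challenged": 3},
]
print(scale_satisfaction(sample, teaching_quality_items))  # 50.0
```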

[Chart: satisfaction of NUHEP and university students by survey category]

The area where university students are more satisfied than NUHEP students is ‘learning resources’, which includes questions about the quality of teaching spaces, library facilities, online learning materials, student spaces and common areas, and related topics. Possibly the big university campuses, with their economies of scale, are better at these things.

The positive responses on ‘learner engagement’ are noticeably lower for both groups. For example, only 53 per cent of students said they had a sense of belonging to their university. The recent first year experience survey picked up a negative trend in this area.

Although there is room for improvement in some areas, for most questions responses were more positive in 2014 than 2013. That supports the conclusion of the end-of-degree course experience questionnaire (trend data at p.76 of Mapping Australian higher education) that teaching quality in Australian universities is slowly but steadily improving.

What’s going on in the new graduate labour market?

Late last year the mainstream media picked up on the graduate un/under-employment story. At Grattan we have been doing a bit more work to see what is going on.

One of the things we wanted to look at was whether the poor employment outcomes were driven by more graduates, as students from the enrolment boom that began in 2009 finish their courses, by a declining labour market, or by both.

We have published completions data, but there is no published time series of the number of recent graduates with jobs. What we’ve done is apply the proportion of recent graduates with full-time jobs in the Graduate Destination Survey to the completions number, to estimate how many recent graduates had full-time jobs. To the extent that the GDS is an imperfect sample, our numbers are likely to be a little wrong, but I doubt this will affect the trend.
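
In code, the estimate as I read it is just the GDS full-time employment rate applied to the completions count for the same cohort; a minimal sketch with placeholder figures (the published GDS and completions series are the real inputs):

```python
# Minimal sketch of the estimate described above: apply the GDS
# full-time employment rate to the completions count for a cohort.
# The inputs below are placeholders, not the published data.

def estimate_ft_employed(completions, gds_ft_rate):
    """Estimated number of new bachelor graduates in full-time work."""
    return round(completions * gds_ft_rate)

# e.g. a hypothetical year with 70,000 domestic bachelor completions
# and a GDS full-time employment rate of 75 per cent:
print(estimate_ft_employed(70_000, 0.75))  # 52500
```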

As can be seen in the chart below, both supply and demand factors are affecting outcomes. The graduate labour market peaked in 2007, when nearly 61,000 new bachelor graduates found (or already had) full-time jobs. In 2013 and 2014, just over 52,000 new bachelor graduates had full-time jobs about four months after completing their degrees.

[Chart: recent graduate full-time employment and completions]

There seem to have been two shocks to the graduate employment market. The first was the onset of the global financial crisis, which was felt most strongly by the 2008 completing students, with a decline of 7 per cent in the number of graduate jobs on the previous year. Perhaps surprisingly, there was a slightly bigger shock in 2013, with a 7.6 per cent decline on the number of jobs in 2012. One reason it was worse in 2013 is that the big health fields, which had been little affected by the 2009 downturn, declined significantly. This is consistent with fewer health occupations appearing on the skills shortage list (p. 68).

While graduate employment opportunities have trended down, the number of domestic bachelor degree completions has trended up, by 17 per cent between 2008 and 2014. Given there are still some big student cohorts enrolled in our universities, the number of completions will only increase in the next few years. Unfortunately, we cannot have the same confidence about full-time jobs for recent graduates.

Increases in low SES uni participation, 1991-2011

It is often said, based on trend data like that in the chart below, that we are making little progress in increasing higher education participation for people from low SES backgrounds.

[Chart: low SES students as a share of all domestic students over time]

The chart shows domestic low SES students as a percentage of all domestic students. But the denominator is important: it means that low SES enrolment has to increase more quickly than enrolment generally for the percentage to go up.

A more meaningful indicator is low SES enrolment as a percentage of the relevant low SES population. This tells us whether people from low SES backgrounds are becoming more likely to attend university over time.
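
A stylised example of why the denominator matters, with invented numbers: low SES participation can rise substantially while the low SES share of all students barely moves, because total enrolment grows just as fast.

```python
# Invented figures to illustrate the two indicators discussed above.
low_ses_students = {"year_1": 75_000, "year_2": 120_000}
all_students = {"year_1": 500_000, "year_2": 800_000}
low_ses_population = {"year_1": 1_000_000, "year_2": 1_050_000}

for year in ("year_1", "year_2"):
    share_of_students = 100 * low_ses_students[year] / all_students[year]
    share_of_population = 100 * low_ses_students[year] / low_ses_population[year]
    # year_1: 15.0% of students, 7.5% of the low SES population
    # year_2: 15.0% of students, ~11.4% of the low SES population
    print(year, round(share_of_students, 1), round(share_of_population, 1))
```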

An interesting paper out from the Group of Eight today (disclosure: drawing on some of my work from a few years back) shows how, for the late teenage children of low SES workers, university attendance has become more likely over time.

For example, in 1991 16 per cent of the children of tradespeople were at university. Twenty years later that figure was 26 per cent. The gaps between SES groups remain very wide, but with participation growth slowing in the leading SES group, professionals, the gaps are not as large as they were in the past.

[Chart: university participation by parental occupation, census trends]

—-
Note: The data is drawn from the census, using 18 and 19 year olds living at home. Living at home is needed so that parental occupation can be determined. According to the two latest censuses, about 80% of 18 year old university students and 70% of 19 year old university students are living with their parents.

Language background should be dropped as a higher ed equity category

At The Conversation, Tim Pitman has analysed how enrolments of the official equity groups have changed under the demand driven system.

He mentions in passing one equity group that survives on the list despite it not predicting educational disadvantage: coming from a non-English speaking background and arriving in Australia in the last decade.

Census data suggests that it is people from English speaking backgrounds who lag in university attendance. Limiting the analysis to 18 to 20 year olds who are citizens (to avoid international students skewing the results), only people who speak Australian Indigenous languages at home have lower rates of university attendance than English speakers.

[Chart: university attendance of 18-20 year old citizens by language spoken at home]

Narrowing the analysis to people who arrived in Australia between 2001 and 2011 does not change the broad picture: people speaking an African language at home have about the same rate of university attendance as people who speak English at home, while the other groups have higher, and often significantly higher, rates of attendance.

[Chart: university attendance by language spoken at home, arrivals 2001-2011]

Speaking English at home is not, of course, in itself a disadvantage when it comes to going to university. Class, cultural and locational factors explain these differences. These factors are already covered by other equity categories, making language background redundant.

Update: Tim Pitman, in the comments below, questions whether restricting the analysis to 18-20 year olds is enough to sustain the argument. I give reasons below why I think it is. However, to test this I have analysed 30-34 year olds. I don’t think these numbers are as good as those for the 18-20 year olds, as they are more affected by adult migration by people who already have degrees. There will also be some double counting of people who have a degree and are still studying. But they are a guide. Here we do get one language group, Southwest and Central Asian (without double-checking the numbers, I am guessing mainly Arabs, Afghans and Turks), which has lower rates of educational attainment and participation. However, the differences aren’t large, and overall it is still very difficult to argue that speaking a language other than English at home is in itself associated with educational disadvantage.

[Chart: educational attainment and participation by language spoken at home, 30-34 year olds]

The case for including for-profit higher education providers in the demand driven system

Reaction to the report of the demand driven review, which I co-authored with David Kemp, has been pretty positive overall. But our proposal to extend Commonwealth supported places to non-university higher education providers, especially those operated on a for-profit basis, is attracting some negative comment.

Professor Greg Craven, vice-chancellor of Australian Catholic University, said:

“There is a basic psychological difference between a statutory body (university) ploughing money back into the enterprise and a private college whose modus operandi is to make a profit.”

Whether or not that is true, a higher education system needs to be robust to the weaknesses and variability of human motivations. Indeed, the public universities themselves are a case study in the limitations of a ‘just trust us’ model in higher education.

As the report discusses (pages 9-10 especially), the universities were for a long time, and still are to a lesser extent, able to get away with poor practices in teaching. This showed in the abysmal results of the first national student surveys conducted in the mid-1990s. Things have improved since, through a combination of public information, government programs and incentives, market competition, and more recently regulation.

The report recommends that all these measures apply to the non-university providers as well. Indeed, they have another layer of scrutiny that the universities lack, which is that their courses need to be individually approved by the Tertiary Education Quality and Standards Agency. It also recommends extending the University Experience Survey to the non-university providers, and publishing the results on a replacement for the MyUniversity website to make it easier for potential students to compare courses.

New data on the close link between SES and university attendance

I’ve criticised the government’s exclusive focus on attracting more university students from the lowest 25% of geographic areas, as measured by an index of education and occupation. I had found several data sources suggesting that educational achievement in the second-lowest quartile wasn’t much better than in the lowest quartile.

Today the ABS released an update to its online 2011 census package that lets us classify students according to their socioeconomic status ($$$ if you want access). I calculated university attendance rates for 20-24 year olds by SES decile, with decile one the lowest and decile ten the highest.

I think my general point stands: there are low rates of university attendance well above the lowest 25%. Someone in the 4th decile is well above the lowest 25%, but is still only about a third as likely to attend university as someone in the top 10%. Even removing early school leavers from the analysis, their chances of attending university are still less than half those of someone in the top 10%.* We need a re-investigation of the relative roles that poor school results and other factors play in this outcome.
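
The underlying calculation is simple: for each decile, census counts of 20-24 year olds at university divided by the decile’s 20-24 year old population, and then ratios between deciles. A sketch with placeholder counts (not the census figures):

```python
# Placeholder counts only, not census figures: 20-24 year olds at
# university and the total 20-24 year old population in two SES deciles.
at_university = {"decile_4": 14_000, "decile_10": 42_000}
population = {"decile_4": 100_000, "decile_10": 100_000}

attendance_rate = {d: at_university[d] / population[d] for d in population}
print(attendance_rate["decile_4"] / attendance_rate["decile_10"])  # ~0.33, about a third
```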

[Chart: university attendance of 20-24 year olds by SES decile]

However, the data is less lumpy than I expected. There is an upper middle class at deciles nine and ten, with high rates of education and professional employment, which is quite different from the rest of the population. But below that, attendance rates do slowly but steadily increase as people move up the SES spectrum, without the large and weakly-differentiated lowest 50% I expected from other sources.

* The decile differences are somewhat exaggerated due to students who move from low SES areas, especially in regional areas, to live near universities which are in high SES areas.

The complicated university teaching-research relationship

In The Age this morning, Don Aitken argues that university teaching has come off second best. ‘Today research, and only research, is really important,’ he says.

I certainly think that university teaching needs improving. But the story is not one of the decline of teaching and the rise of research, with one improving at the clear expense of the other.

Up until the Dawkins reforms of the late 1980s and early 1990s more than half of higher education students attended colleges of advanced education or institutes of technology. Their mission was teaching rather than research, although some of their academics were doing research. The universities were teaching-research institutions, but with weaker research pressures than today. Most research funding was delivered as a block grant that was (unlike today) not linked to indicators of research performance.

If the teaching-focused colleges of advanced education and institutes of technology were good at teaching, we would expect their positive legacy to show when the first national student survey (the course experience questionnaire) was conducted in the mid-1990s. In reality, the CEQ showed generally dismal results. Across the country, the average positive response to six teaching-related questions was around one-third.

As the government started emphasising research performance in its funding policies, the apparent incentive was to focus on it over teaching. But this does not show in the trend data (the figure below). The time series was upset in 2010 in ways that exaggerate satisfaction compared to the past, but the steady upward trend in satisfaction cannot be disputed. (Some theories as to why are here.)

[Chart: CEQ good teaching scale, satisfaction over time]

A consistently calculated time series on research productivity only goes back to 1997. It shows steadily increasing productivity up to 2005, after which it stabilises at an average of 2.1-2.2 publications per full-time researcher per year (counting teaching-research staff as 0.4 full-time equivalent in research, in line with common time use expectations).
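
The productivity figure is publications divided by research full-time-equivalent staff, with teaching-research academics weighted at 0.4 of a research FTE. A minimal sketch with illustrative numbers:

```python
# Illustrative numbers only: publications per research FTE, counting
# teaching-research staff as 0.4 of a full-time-equivalent researcher.
publications = 1_050
research_only_staff = 300      # full-time in research
teaching_research_staff = 500  # weighted at 0.4 research FTE each

research_fte = research_only_staff + 0.4 * teaching_research_staff  # 500.0
print(publications / research_fte)  # 2.1 publications per research FTE
```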

[Chart: publications per academic]

Rather than research rising at the expense of teaching, on these indicators they both rose together until the middle of last decade. In research, the focus has since shifted to quality. It is still too early to put numbers on this, but alongside ongoing increases in satisfaction with teaching, universities are culling weaker researchers and focusing their investment in areas of relative research strength.

As well as it being difficult to find evidence over time of research improving at the expense of teaching, our recent Grattan research project failed to find much evidence that low-research departments are better at teaching than high-research departments, as measured by recent student surveys.

My view is that at the dawn of the Dawkins era universities were under-performing institutions, across both teaching and research. Research was further down the path of professionalisation and favoured in academic culture. But both teaching and research needed to improve a lot, and that is what we have seen.

Just removing research and making some universities ‘teaching only’ would not on its own make things better. Improved teaching needs concerted effort, whether or not it occurs in an institution that also produces research.

Why is student satisfaction with teaching increasing?

My new Grattan report argues that teaching in Australian universities could be improved. But despite remaining shortcomings, I think significant progress has been made since the 1990s.

We often hear that, with higher student-staff ratios, Australian academics have less time to spend on students. But in the long-running course experience questionnaire survey, it is the time-use questions that have shown the greatest improvement over time.

The figure below shows that the proportion of completing students agreeing that staff ‘put a lot of time into commenting on my work’ and ‘normally gave me helpful feedback on how I was going’ has roughly doubled since 1997.*

[Chart: CEQ feedback questions, percentage agreeing over time]

I think a major explanation is likely to be technology. The increase last decade coincides with the spread of home internet connections. Academic staff became much more accessible via email and learning management systems than they had ever been before, and were also able to efficiently give the same or similar feedback to multiple students.

There were also good improvements (20 percentage points plus) in agreement with propositions such as ‘lecturers were good at explaining things’, ‘teachers motivated me to do my best work’, and ‘staff worked hard to make their subjects interesting’. These are not so obviously technology driven, suggesting that other forces for good teaching were at work.

These might include the spread of subject-level student surveys and their link to promotion, and greater (though typically still very short) training courses in university teaching.

Whatever the exact causes, these results highlight how increasing funding is not necessarily the key to improved education. Through most of these years, real per student funding for Commonwealth-supported students was declining. How universities organise themselves is the most important factor.

* All five points on the response scale were labelled for the first time in 2010, with points labelled strongly disagree, disagree, neither agree nor disagree, agree and strongly agree. In previous years, only the anchor points of strongly disagree and strongly agree were labelled. This seems to have increased positive responses.

The persistence of health and education students

I recently received some new data on completion and attrition rates by ATAR, a surprisingly under-examined topic in Australian higher education. My Mapping Australian higher education publication summarises research suggesting a weak relationship between ATAR and average marks. However, data on 2005 commencing students shows a quite strong relationship between ATAR and completion – the higher the ATAR, the higher the chance of completion. The whole cohort data is in this article.

We also have the data by field of education. Most disciplines have the same general pattern. But two, health and education, have higher persistence at lower ATARs, as can be seen below.

[Chart: completion rates by ATAR, health and education fields]
Source: DIICCSRTE

The same two broad fields of study also have graduates with high rates of retention in jobs related to their field of study, as seen in the chart below.

[Chart: graduates working in jobs related to their field of study, by field]

I’m inclined to think that the main reason is that people who choose these degrees have a relatively strong commitment to the end occupation from day one. A colleague notes that this may in part be because students in these fields don’t necessarily have many attractive alternatives. For people with lowish ATARs who don’t want to do voc ed, teaching and nursing have been paths to relatively secure and reasonably paid careers.