71% of Australian uni staff are using AI. What are they using it for? What about those who aren’t?

Since ChatGPT was released at the end of 2022, there has been a lot of speculation about the actual and potential impact of generative AI on universities.

Some studies have focused on students’ use of AI. There has also been research on what it means for teaching and assessment.

But there has been no large-scale research on how university staff in Australia are using AI in their work.

Our new study surveyed more than 3,000 academic and professional staff at Australian universities about how they are using generative AI.

Our study

Our survey gathered responses from 3,421 university staff, mostly from 17 universities around Australia.

It included academics, sessional academics (who are employed on a session-by-session basis) and professional staff. It also included adjunct staff (honorary academic positions) and senior staff in executive roles.

Academic staff represented a wide range of disciplines including health, education, natural and physical sciences, and society and culture. Professional staff worked in roles such as research support, student services and marketing.

The average age of respondents was 44.8 years and more than half the sample was female (60.5%).

The survey was open online for around eight weeks in 2024.

We surveyed academic and professional staff at universities around Australia.
Panitan/Shutterstock

Most university staff are using AI

Overall, 71% of respondents said they had used generative AI for their university work.

Academic staff were more likely to use AI (75%) than professional staff (69%) or sessional staff (62%). Senior staff were the most likely to use AI (81%).

Among academic staff, those from information technology, engineering, and management and commerce were most likely to use AI. Those from agriculture and environmental studies, and natural and physical sciences, were least likely to use it.

Professional staff in business development, and learning and teaching support, were the most likely to report using AI. Those working in finance and procurement, and legal and compliance areas, were least likely to use AI.

Given how much publicity and debate there has been about AI in the past two years, the fact that nearly 30% of university staff had not used AI suggests adoption is still at an early stage.

What tools are staff using?

Survey respondents were asked which AI tools they had used in the previous year. They reported using 216 different AI tools, which was many more than we anticipated.

Around one-third of those using AI had only used one tool, and a further quarter had used two. A small number of staff (around 4%) had used ten tools or more.

General AI tools were by far the most frequently reported. For example, ChatGPT was used by 88% of AI users and Microsoft Copilot by 37%.

University staff are also commonly using AI tools with specific purposes such as image creation, coding and software development, and literature searching.

We also asked respondents how frequently they used AI for a range of university tasks. Literature searching, writing and summarising information were the most common, followed by course development, teaching methods and assessment.

ChatGPT was the most common generative AI tool used by our respondents.
Monkey Business Images/Shutterstock

Why aren’t some staff using AI?

We asked staff who had not yet used AI for work to explain their thinking. The most common reason they gave was that AI was not useful or relevant to their work. For example, one professional staff member stated:

While I have explored a couple of chat tools (Chat GPT and CoPilot) with work-related questions, I’ve not needed to really apply these tools to my work yet […].

Others said they weren’t familiar with the technology, were uncertain about its use or didn’t have time to engage. As one academic told us plainly, “I don’t feel confident enough yet”.

Ethical objections to AI

Others raised ethical objections or viewed the technology as untrustworthy and unreliable. As one academic told us:

I consider generative AI to be a tool of plagiarism. The uses to date, especially in the creative industries […] have involved machine learning that uses the creative works of others without permission.

Respondents also raised concerns about AI undermining human activities such as writing, critical thinking and creativity, which they saw as central to their professional identities. As one sessional academic said:

I want to think things through myself rather than trying to have a computer think for me […].

Another academic echoed this sentiment:

I believe that writing and thinking is fundamental to the work we do. If we’re not doing that, then […] why do we need to exist as academics?

How should universities respond?

Universities are at a crucial juncture with generative AI. They face an uneven uptake of the technology by staff in different roles and divided opinions on how universities should respond.

These different views suggest universities need to have a balanced response to AI that addresses both the benefits and concerns around this technology.

Despite differing opinions in our survey, there was still agreement among respondents that universities need to develop clear, consistent policies and guidelines to help staff use AI. Staff also said it was crucial for universities to prioritise staff training and invest in secure AI tools.

Alicia Feldman receives an Australian Government Research Training Program Scholarship and Fee Offset.

Paula McDonald receives funding from the Australian Research Council.

Abby Cathcart and Stephen Hay do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.