The Guardian - AU
Caitlin Cassidy

Voice from the past: how one university is countering AI with ancient examination techniques

Conversation-based assessments at the University of South Australia are helping lecturers better determine their students’ understanding of subjects. Photograph: Tracey Nearmy/The Guardian

For millennia, debating the issues of the day has served as a rite of passage for graduating from university in Europe and parts of the Muslim world.

A town square in the ancient world might not be the most obvious place to tackle the challenges of artificial intelligence facing universities. But the oral tradition of viva voce, which from Latin translates as “word of mouth”, is being adopted as a more accurate way to test knowledge in the age of chatbots, cheating and commercialisation.

Dr Chris Della Vedova, a senior lecturer of biochemistry and biomedical sciences at the University of South Australia, first noticed problems with essay and multiple choice exams when students were forced online at the start of the pandemic.

“With digital exams, we didn’t really know if anybody knew anything, which made it hard to assess,” he said. “It was rare to fail students unless they didn’t complete them [exams].”

So, his team pivoted – trying out a new format of 20-minute conversations where assessors would draw from a random pool of questions based on material covered in lectures.

Students would answer, and the assessor would then ask a series of follow-up questions, asking them to expand on their answers or to put their ideas in the context of the course as a whole.

UniSA began using the viva voce system in 2022, replacing the final written exam with oral assessments across a range of its science degrees. In an age when generative AI can produce complex responses in seconds, there have been zero academic integrity breaches since the format was introduced.

Della Vedova says conversation-based assessments are helping lecturers better determine their students’ understanding of subject content because of the “fluid, personalised nature of a conversation”.

“Often you’ll ask the first question and get a good memorised answer, but we want to make sure they understand what they’re saying, so follow-ups give the opportunity to gauge how solid their understanding is,” he said.

“When students are nervous or unclear, we can give them prompts – have them step back, whereas in an exam they might look at a question and leave it blank.”

It’s not without its challenges.

A professor of artificial intelligence at the University of New South Wales, Toby Walsh, said the format may unfairly disadvantage students who use English as a second language or introverted people who dislike public speaking.

Della Vedova conceded that international students – who made up between 15% and 30% of science courses – did worry about the oral format. But he said practice activities built confidence, and spoken communication skills themselves were not being assessed.

“Most of these students are going into health professions where interactions will be verbal,” he said.

Della Vedova said he was “more confident” he could work out what students understood by sitting down and talking to them.

“There’s no reason it can’t work at any university in the age of generative AI, where there’s a lot we can’t really trust any more.

“With a 10-page essay, you have no idea where it’s come from.”

But there are practical barriers.

“Viva voce is what we reserve for the highest examinations we do – PhDs – to see what they’ve done, and I’m very supportive of it, it’s fantastic,” Walsh said, adding: “But it doesn’t scale very well. With a sit-down exam, you have hundreds of students completing it at once. Orally, you can only examine one.

“My colleagues using more oral examinations are always running into the problem that a first-year undergraduate course has thousands of students, [and] they don’t have the staff.

“It’s difficult to compare students. With an exam, there’s one comparison, but oral exams may go in a different direction.”

An artificial intelligence expert, Dr Stefan Popenici, says that in smaller cohorts oral examinations are the “perfect form of assessment”. But he says the method is in direct conflict with the model of “commercialised higher education”.

“If we want to have graduates able to read and write, it’s important to consider alternative forms of assessment,” he said.

“You can’t cheat in front of a panel – you have to say what you know. Students clicking a button? That’s scary.”

Is there a third way?

For Walsh – it is the open-book exam.

“Why do we get people to remember stuff if they can look it up in the real world?” he said.

“We should set things that are challenging, accepting the fact everyone has access to the knowledge. Make it so it requires expertise on top of tools.”
