ChatGPT will formally be rolled out in all Australian schools for the first time this year after the education ministers’ backing of a framework guiding AI use.
From embracing the technology as a learning tool to blanket bans and returning to pen-and-paper exams, the education sector has grappled with how to respond to the chatbot since it was released in late 2022.
Now that it’s firmly here to stay, here’s what you need to know about schools in the age of AI.
What’s in the framework?
The framework, released in December, outlines principles for the use of emergent technologies including privacy and security standards, equity and accessibility.
It was developed by the national AI schools taskforce, in consultation with school sectors, education unions, experts and First Nations Australians.
As part of the requirements, schools must engage students on how generative AI tools work – including their “potential limitations and biases”, with teachers deferred to as “subject-matter experts” in the classroom.
Student work, including assessments, must outline whether and how generative AI tools may be used, including through appropriate attribution.
The framework also takes note of the ability of AI to support students with disabilities, from diverse backgrounds and in rural and remote communities – provided it is accessible and equitable.
How has Australia responded to ChatGPT?
Every state and territory, excluding South Australia, moved to temporarily restrict ChatGPT in public schools last year as concerns mounted about privacy and plagiarism, while some private schools adopted it in their teaching and services.
Since then, the education minister, Jason Clare, has launched an inquiry into the use of generative artificial intelligence, looking into the opportunities and risks it poses to students and teachers.
Clare told Guardian Australia his main focus was ensuring schools didn’t use generative AI products that sold student data.
“If we get this right, generative AI can help personalise education and make learning more compelling and effective,” he said. “We will continue to review the framework to keep pace with developments.”
A Department of Education spokesperson, Julie Birmingham, told the inquiry that while the technology was developing quickly, Australia had been “leading the way” in its response.
Early research showed AI could provide intelligent tutoring systems, better personalisation, more targeted learning materials and help educate at-risk students, she said.
“The question will be how do we operationalise [the taskforce] and support teachers and schools to deal with the challenges,” she said.
Who is leading the way?
South Australia has been a notable national outlier, opting against banning the technology when ChatGPT was released.
Its minister for education, training and skills, Blair Boyer, told Guardian Australia schools would be doing young people an “incredible disservice” if they didn’t educate students about the appropriate use of AI. “AI will be a part of our work and lives in the future,” he said.
Since ChatGPT’s release, South Australia’s Department of Education has developed a generative AI chatbot app that uses the same language model as ChatGPT, with built-in safeguards to protect students’ privacy and filter inappropriate content.
The app, EdChat, is being trialled at 16 public schools and, unlike ChatGPT, doesn’t store students’ input or use it for training.
The trial will inform how AI is implemented in the state’s curriculum, and its findings were shared during the creation of Australia’s framework.
Are other states and territories on board?
Queensland has also completed a small trial in state high schools that piloted an AI teaching and learning tool called Cerego among 500 students.
The adaptive learning platform, which uses AI to create quiz-based learning that adjusts to the needs of individual students, will be released to all state schools later in 2024.
Victoria was one of the first states and territories to lift its restriction on access to ChatGPT, scrapping the ban in term two last year.
A spokesperson for the state’s Department of Education said any use of AI in the curriculum would be determined by schools within the “overarching principles” of safe and ethical use.
Tasmania is similarly preparing its own policy, procedures and materials for the 2024 school year, noting the latest version of the Australian curriculum includes means to integrate the teaching of AI.
Western Australia has been considering AI trials held in other states as a way to streamline lesson planning, marking and assessment development, and to reduce teacher workloads.
It has pointed to AI’s ability to automate tasks such as excursion planning, meeting preparation and writing general correspondence.
The Australian Capital Territory is taking a “cautious approach”, noting the continued risks of algorithmic bias and unauthorised use of student data by tech companies.
“Our current priority is to establish a robust educational framework, to guide teachers in how students use AI tools responsibly,” an ACT education directorate spokesperson said.
The secretary of the New South Wales education department, Murat Dizdar, said the state was committed to national collaboration and actively involved in discussions to prepare learners for a future where “generative AI is part of everyday life”.
What risks remain?
Leslie Loble, a UTS academic and former deputy secretary in the NSW education department, said while the “shock” of ChatGPT’s emergence had worn off, the issues surrounding generative AI were “by no means settled and resolved”.
“Going from seven states banning it to now is a sign of how far schools and systems have come in understanding the potential benefits and risks,” she said.
“But we really have to get started on clearer standards and expectations for what AI should deliver and how it should be defined and governed.”
She said the framework was an “excellent foundation” but key risks remained – particularly the persistent problem of equity and the digital divide.
“A lot of students still aren’t getting the basics of computers and wifi,” she said, pointing to funding gaps between government and independent schools in terms of resourcing.
“There’s a clear gap in take-up of these sophisticated generative AI tools between households and schools that have resources and those that don’t.
“To the extent these technologies can have a positive impact on learning – that divide is deeply troubling and will lead to a learning gap getting even worse.”
While Australia’s framework stood out globally in connecting AI to teaching and learning, Loble said the worst outcome would be to “set and forget” rather than consistently anchoring AI in teacher-led programs.
“That means investment in professional learning and support,” she said.
“Teacher workload is enormous and growing but if we don’t start to provide much better information about these tools, AI will end up increasing teacher workload because it will fall to them to decide [how to use it].
“It’s a potential workload shift we can’t afford right now.”