Anxieties about chatbot software and academic cheating really reveal worries about the process of learning, argues Prof Carol Mutch.
Opinion: ChatGPT (Chat Generative Pre-trained Transformer) is the subject of heated debate in the education sector. It is a text chatbot created by OpenAI, released late last year, which Microsoft has incorporated into its Bing search engine. ChatGPT uses artificial intelligence, trained on millions of text sources, to quickly generate a new piece of text that answers a question or meets a set of criteria.
What puts ChatGPT a step ahead of other chatbots is that it appears to create unique pieces of text that avoid being picked up by plagiarism software. It produces a credible and seemingly authentic result. Educators have experimented by asking ChatGPT to answer an exam or essay question and then giving the answer to peers to mark. In these experiments, it has been awarded a passing grade. In the US, it has passed college exams.
The concern that students will now use ChatGPT for homework, essays and their online exams has led to a range of knee-jerk reactions. It has been banned in New York schools. Universities in Australia are returning to pen and paper exams. Schools in New South Wales and Queensland will install firewalls to block its use.
Other educators feel we should embrace the new technology and use its strengths to improve teaching and learning.
When I was first introduced to ChatGPT by an enthusiast, my response was also knee-jerk: not another piece of technology I have to master! But I found that ChatGPT can not only answer an essay-type question; you can also carry on a conversation with the chatbot, asking it to change the style or format or to include extra information, until you are satisfied.
In its current form, ChatGPT does have limitations. When I tried it, it couldn't provide very recent information, and sometimes the way it used words wasn't quite right. I asked students in my January summer school class if they had heard of it or used it. Only one had, and he felt the essay it produced wouldn't quite be good enough to pass.
It prompted me to think about the issue of academic integrity. Would I be able to recognise whether something I was marking was a student's own work, or to distinguish it from text written by ChatGPT? In my academic career, I have dealt with problems of plagiarism, inadvertent or deliberate copying, and blatant cheating. While my university uses plagiarism software and has processes in place, it is never a pleasant task to raise your concerns with a student whom you suspect of academic dishonesty.
There are myriad questions to consider about what might make a student cheat, which in turn raises further questions about their learning experience, and what we as educators could do to improve it.
Do students cheat because of financial, personal, family or social pressure? Most in my summer school class were working in paid employment while studying and were often stressed and tired.
Is cheating the result of universities becoming too focused on the bottom line? Undergraduate students often learn in large, sterile, impersonal lecture theatres where they can feel anonymous and invisible. How can lecturers get to know their students when they can barely see their faces, let alone learn their names?
My summer school class gave me an opportunity to think about ChatGPT and academic integrity differently. Rather than focus on the outcome, I would focus on the process of learning.
Over the years, I have learned about what works for me as a teacher and what students tell me works for them. I talked about what it meant to be a student in this class, about our student code of conduct and how we could make everyone feel comfortable and included.
Students say they like to get to know their lecturer as a person. I told them a little about my background to introduce an activity where they thought about how their backgrounds had shaped their ideas and beliefs. I provided multiple opportunities for them to discuss ideas and process content with each other. I could see them getting to know each other and beginning to feel that we were creating a community of learners.
Students tell me they want to know why they are learning something. How will they use it in later life? I used a teach-process-apply approach so that even the heavy philosophical content we were covering could be applied to everyday ethical conundrums.
Students do not like exams. They tell me their mental health plummets at exam time. For this summer school delivery, I was able to negotiate not having an exam. Instead, I chose two assessments that were more tailored to the students' interests but would still build on the skills of writing fluently, outlining an argument, supporting it with evidence and using referencing conventions.
Students could choose the issue they wanted to explore, the theory they would use to explain it, and the sources that would help them meet the assessment requirements. There were regular discussions, with me and with each other, to clarify their topics, plan their strategies and justify their choices. Before they hand in their assessments, they will have one-on-one graded discussions with me about their work, how it relates to the course content and what they have learned.
By the time I mark these assignments, I will know my students and their topics well. I will hear their individual voices in my head as I read their work, and I will find out whether I have achieved my goal of authentic learning and assessment. In the end, that is what academic integrity is. Perhaps concern about ChatGPT is misplaced anxiety, the tail wagging the dog, when the answer is closer at hand.