Since the launch of ChatGPT in late 2022, academics have expressed concern over the impact the artificial intelligence service could have on student work.
But educational institutions trying to safeguard academic integrity could be looking in the wrong direction. Yes, ChatGPT raises questions about how to assess students’ learning. However, it should be less of a concern than the persistent and pervasive use of ghostwriting services.
Essentially, academic ghostwriting is when a student submits a piece of work as their own which is, in fact, written by someone else. Often dubbed “contract cheating,” the outsourcing of assessment to ghostwriters undermines student learning.
Universities and other institutions have employed plagiarism-detection tools, such as Turnitin, in an effort to combat the more obvious forms of academic misconduct.
But contract cheating is increasingly commonplace as time-poor students juggle jobs to meet the soaring costs of education. And the internet creates the perfect breeding ground for willing ghostwriting entrepreneurs.
In New Zealand, 70-80% of tertiary students engage in some form of cheating. While most of this misconduct takes the form of collusion with peers or plagiarism, the emergence of artificial intelligence has been described as a battle academia will inevitably lose.
It is time for universities to take a new approach.
Allowing students to use ChatGPT could help reduce contract cheating: the AI can do some of the heavy lifting of academic work while still giving students the opportunity to learn.
The risky business of ghostwriting
Universities have been cracking down on ghostwriting to ensure quality education, to protect their students from blackmail and even to prevent international espionage.
Contract cheating websites store personal data, leaving students unwittingly vulnerable to extortion if they want to avoid exposure, potential expulsion from their institution or the loss of their qualification.
Read more: ChatGPT is the push higher education needs to rethink assessment
Some researchers are warning there is an even greater risk – that private student data will fall into the hands of foreign state actors.
Preventing student engagement with contract cheating sites, or at least detecting students who use them, reduces the likelihood of graduates in critical job roles being targeted for nationally sensitive data.
ChatGPT as friend not foe
It is inevitable that ChatGPT will increasingly become part of how students complete their assigned work. While it will change the way assignments are completed – and assessed – there are a number of reasons why ChatGPT could also be harnessed as an educational tool.
ChatGPT still requires a certain level of engagement from students. They have to guide the AI through various stages of the research and writing process. By meticulously defining their research question, crafting precise prompts, critically assessing generated content and integrating it with their original thoughts, students retain control over their intellectual journey.
Read more: ChatGPT: Student insights are necessary to help universities plan for the future
Given the underworld associated with ghostwriting, artificial intelligence has the potential to bust the contract cheating economy. This would keep students safer by providing them with free, instant and accessible resources instead.
Using natural language processing and machine learning algorithms, ChatGPT fosters originality by offering students instant, personalised feedback. By stimulating creativity, broadening vocabulary and enhancing structural coherence, ChatGPT could cultivate an environment where students can flourish and develop their distinct style.
Finally, those who argue that artificial intelligence technologies like ChatGPT may contribute to the erosion of academic integrity overlook the game-changing potential of this sort of technology to refine citation practices. ChatGPT can provide students with style guides and citation generators. These tools can enable students to appropriately credit sources and avoid plagiarism.
If students input the relevant context, ChatGPT can assess an author's background, considering cultural, political and ethical biases that may influence their views. In turn, it can recommend alternative readings that offer a well-rounded array of viewpoints.
Levelling the playing field
Arguably, the most significant advantage of artificial intelligence tools lies in their potential to level the playing field for students.
Students from diverse backgrounds face several challenges, including navigating uncharted academic terrain, adapting to unfamiliar environments, and grappling with the pressures of independence. These obstacles are amplified for marginalised students and those attending underprivileged schools.
Read more: ChatGPT killed the student essay? Philosophers call bullshit
Academic integrity fundamentally hinges on promoting fairness in the educational process. However, ensuring equal access to resources and support for all students is a daunting task, particularly when confronted with large classes or students with varying academic preparedness.
ChatGPT can serve as a valuable tool in advancing academic integrity by granting all students access to the same resource for honing their writing skills and obtaining feedback on their work, irrespective of their backgrounds or academic prowess.
There needs to be more research on the learning opportunities offered by artificial intelligence programmes like ChatGPT. But it is here, and for a variety of reasons students are using it. Rather than banning ChatGPT and programmes like it, we should be using these tools to help students. In doing so, we would reduce the need for students to seek out other – potentially harmful – ways of completing their assessments.
The authors do not work for, consult for, own shares in or receive funding from any company or organisation that would benefit from this article, and have declared no relevant affiliations beyond the academic appointment cited above.
This article was originally published on The Conversation. Read the original article.