ChatGPT has thrown higher education into tumult. Universities were already using artificial intelligence technology for their own daily business: to remind students to pay off tuition balances, to answer questions about campus life, or even to check students’ work for plagiarism. But ChatGPT, an AI chatbot released to the general public last November, has turned the tables. Now a student can recruit it to generate a passable paper on just about any topic in seconds. Feminism in Virginia Woolf’s fiction? No problem. The heroic code in “Beowulf”? Done. The potential for cheating becomes immense.
Some universities, like Sciences Po in France, have banned ChatGPT for classwork unless students have permission from instructors. Open Universities of Australia has offered students guidelines for using ChatGPT ethically. The University of Toronto advises instructors to specify which digital tools are allowed for assignments but warns instructors against using unapproved AI tools to evaluate student work. Perhaps they read the tweets joking that teachers will soon use AI to come up with assignments, students will use AI to do them, and AI will grade the result—at which point everyone can leave education to the bots and go for coffee.
Not all educators are worried. “Students today need to be prepared for a future in which writing with AI is already becoming essential,” writes Lucinda McKnight for Times Higher Education. She also suggests various ways to integrate AI into the classroom. Students can use the technology to do basic research, she proposes, or they can analyze and compare the text produced on a given topic by different AI writers. They can even use programs such as Google’s Verse by Verse to turn out randomized poems—to what end remains a mystery.
For all the opportunities ChatGPT might bring, its greatest threat right now is to the teaching of writing. There are other ways to assess students’ knowledge: oral exams, multiple-choice tests, handwritten answers. So what is the university paper for? If students can use a chatbot to produce a draft in seconds, why have them write anything at all?
ChatGPT is a bot that engages in conversational dialogue—it feels as though you are chatting with a person. What makes that trick possible is something called a large language model. “These models,” explains Luke Stark, assistant professor of information and media studies at Western University, “are created using a computational technique called deep learning, in which simulated electronic neurons infer patterns in a large amount of data.” In essence, they learn by example. Trained on text from the internet, they are like a supercharged autocorrect, determining what word or phrase is most likely to come next.
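The “supercharged autocorrect” idea can be made concrete with a toy sketch. This is not how ChatGPT itself works—real large language models use deep neural networks trained on vast corpora—but a simple bigram counter over a made-up training text shows the core principle of predicting the most likely next word:

```python
from collections import Counter, defaultdict

# Toy training text (invented for illustration).
training_text = (
    "the cat sat on the mat the cat ate the cat ran under the mat"
)

# Build bigram counts: for each word, tally the words that follow it.
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the continuation seen most often in training, or None."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # prints "cat" — it follows "the" most often
```

Everything a model like this “knows” comes from counting patterns in its training data, which is also why, as the next section notes, gaps and biases in that data flow straight through to the output.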
Large language models don’t think. What they do is make interpretations based on the material fed into them—at its worst, Stark says, it’s “garbage in, garbage out.” This is why they make so many errors at present, and it’s also why they can duplicate the gaps and biases of their sources. What ChatGPT can’t find, it often makes up: users have noticed it will churn out material with facts and bibliographic references that sound plausible but are wholly imaginary. Without appropriate filters, AI can also reproduce descriptions of illegal and violent activities.
Large language models are good at mimicking our rules, however. This is why I can ask ChatGPT to write a sonnet about doves, in French, and it can rhyme in all the right places. What ChatGPT produces “is a version of what we ask students to do,” says John Warner, author of Why They Can’t Write: Killing the Five-Paragraph Essay and Other Necessities (2018). Functionally, the bot ends up imitating the form of a standard essay. The problem, Warner explains, is that students are often taught to write according to formulas too. If students are judged based on how well they stick to a model, it’s understandable that they will look for the most efficient way to reach that goal. For them, writing is a product they deliver in return for a grade.
The suggestion that teachers can adapt to the new technology by having their students analyze and revise AI-produced work in class shares this basic belief. It assumes that, after they graduate, students will never need to write a first draft again. As long as they have the skills to refine and correct prose made by AI, they will be competitive in the workplace. But the goal of school writing isn’t to produce goods for a market. We do not ask students to write a ten-page essay on the Peace of Westphalia because there’s a worldwide shortage of such essays. Writing is an invaluable part of how students learn. And much of what they learn begins with the hard, messy work of getting the first words down.
Think at length about writing and you may come to the conclusion that it’s an impossible task. The world doesn’t come readily organized into a neat narrative. Ideas do not appear in a linear fashion, labelled by words that represent them perfectly. The writer faces masses of information and her growing network of still-inexpressible hunches about them and tries to translate them into prose that’s bound to seem inadequate. It’s like crafting a shell necklace while swimming in the ocean, the molluscs wriggling away as you reach for them.
At every step of the process, the writer has to make decisions about what to focus on and what to leave out. Each one of these choices opens up new opportunities and closes others off. This back-and-forth movement between the structure of form—even in the most basic sense of words on a page—and the chaos of human thought is generative. It can produce something new: a fresh way of expressing an idea, a thought the writer finds surprising and worth pursuing. Sometimes this struggle helps the writer discover notions that were already in her but that she wasn’t yet capable of articulating. We speak of the writer drafting, shaping, and revising a text, but at times, the text changes her.
“The value of writing a first draft is akin to the value of learning to fall off your bike when you’re beginning to ride it,” says Stark. A certain amount of discomfort is built in. Students must acquire the habits of mind and body they will need for a lifetime of writing, to “develop muscle memory.” Warner, too, talks about writing as “embodied practice,” meaning you bring your whole self to it. It may seem odd to think of an intellectual process in physical terms, as though writers were lifting weights at the gym, but the metaphor isn’t too far off the mark. Writing a long piece of text—like, say, a university essay—takes stamina. It is hard to concentrate on one task for long periods of time, hard even to keep still long enough to finish a work. (As some medieval scribes wrote at the end of their works, three fingers write but the whole body labours.)
Through experience, writers learn not just how they can best write but also how they can best think. Much thinking does not, after all, happen at the surface of consciousness. The human brain is not a machine that takes in a query and spits out a result. While working on a demanding project, a writer discovers which tricks he can use to access his subconscious intelligence. He might write by hand or with a timer or cover his computer screen. He might try writing in different places or compose in the shower or while taking a walk. He might dream on it. To be fair, mulling over a mathematics problem or tricky lab results might look the same. The point is not that writing is the only way to think but that schools must train students in how to persist through intellectual challenges, even if it would be easier to throw them at AI. The cost of making things easier is self-knowledge.
To bypass this process in the name of efficiency means losing other benefits as well. Students miss out on the chance to gain expertise by grappling with material on their own. Most of my memories from my own schooling are related to projects I worked on, not to information I absorbed passively from books or teachers. This is in line with what cognitive psychologists call the “generation effect.” It’s been widely observed that people tend to remember information better if they are prompted to produce it themselves rather than read it. “One of my beliefs about writing is that you progress more quickly when you get to ground some of what you’re doing in expertise,” explains Warner. In other words, it’s a cycle: writing helps students absorb knowledge, and that knowledge makes it easier to write the next assignment.
Other things can be lost too. Writing offers students a chance to develop their expressive abilities, to find voices suitable for different writing occasions. Done well, it demands that they think through the needs of their readers, the conventions of the genre they’re working in, and the effects they can achieve through a particular choice of words. Perhaps worst of all, outsourcing the difficult part of writing denies students the opportunity to build an idea and its expression. There is a craft element to intellectual work that goes beyond sorting, criticizing, revising, and annotating. Human beings have a basic need to feel that they can make things in the world. This is why, despite the reported decline of the English major in the past decade, enrolments in creative writing programs are booming. Students, it turns out, actually want to write.
To accept writing as a process through which students come to understand themselves means thinking of education as something more expansive than training in on-the-job skills. Any time we outsource our thinking to something that cannot think, says Warner, “it’s a mistake. It’s merely a shortcut to an end product, and that product is going to be diminished.” He admits that a lot of the writing done in the professional world is, in fact, boilerplate content (such as social media posts, press releases, and product descriptions) that could easily be given to an AI program to complete. And maybe it should be. The onus is on teachers to have their students write the kinds of things a machine can’t—and for that work to be valued.
This gets at the core of a problem. A great deal of the writing students do in school doesn’t fit the best-case scenario I’ve described above. Many writing assignments are just steps to credentialing—or at least they seem so to students. Teaching writing well takes time, effort, and skill. As governments in places like Canada and the UK push to increase class sizes in public schools, and as universities shift more of their teaching load onto overworked adjunct instructors, the likelihood that students will receive robust writing instruction—even from the most dedicated teachers—falls. Given less guidance and under pressure to perform, students understandably gravitate to ChatGPT and other AI programs. “The neoliberal university is already set up for efficiency,” says Stark, and society is conditioning our students to care solely about outputs and credentials.
Limited resources are a problem, but they also present an opportunity to reflect on the goals of a good education and what tools best serve those ends. AI is here to stay, and students will likely use it throughout their lives. But it is still up to teachers to weigh whether they should spend valuable class time training students in how to massage the products of a large language model. It is also up to us to articulate why writing matters, perhaps most of all when it’s difficult. It matters because people have the capability to grow and learn throughout their lives, and the basis for that is set in school. It matters because human beings are never finished products. The developments in AI force us to ask: Are we teaching people in ways that cultivate their humanity, or are we teaching them to do what a machine could do?