Students Are Using AI Text Generators to Write Papers—Are They Cheating?

Chris Stokel-Walker:

Educators who are becoming increasingly aware of their students’ use of text-generating tools admit to being perplexed by the technology, perceiving both dangers and potential benefits in its use. Cath Ellis, associate dean for education at the University of New South Wales in Sydney, Australia, and a leading researcher in academic integrity, believes that using AI to complete one’s homework or exams is a form of cheating. (She bases that definition on whether a student can look a lecturer in the eye and tell them what they’ve done.) But she’s not an alarmist about the tools. “I think we should stop freaking out about it and calling it an existential crisis,” she said. In fact, Ellis puts them in the same category as word processors’ built-in spelling checkers or slideshow software’s automated design suggestions. “These are tools coming into our professional and personal worlds, and we’re using them,” she said.

One of Ellis’ students told her that she had used QuillBot, an AI-powered paraphrasing tool, to summarize sections of prior academic research she wanted to cite in her own work. The same student had also used DeepAI’s text generator to create elements of essays in the past. “I’m not particularly troubled about students using AI bots to write their essays, if I have a conversation with them about their work and they can explain it to me,” Ellis said. It’s when they can’t explain what the bots have generated that she finds it particularly problematic. “It’s not so much that they haven’t done the work. It’s that they haven’t embodied the learning. They don’t have the knowledge on board,” Ellis said.