As word of students using AI to automatically complete essays continues to spread, some lecturers are beginning to rethink how they should teach their pupils to write.
Writing is a difficult task to do well. The best novelists and poets write furiously, dedicating their lives to mastering their craft. The creative process of stringing together words to communicate thoughts is often viewed as something complex, mysterious, and unmistakably human. No wonder people are fascinated by machines that can write too.
Although AI can generate text with perfect spelling and solid grammar and syntax, the content often isn’t that good beyond a few paragraphs. The writing becomes less coherent over time, with no logical train of thought to follow. Language models also fail to get their facts right – meaning quotes, dates, and ideas may well be fabricated. Students have to inspect the output closely and correct its mistakes for their work to be convincing.
Prof: AI-assisted essays ‘not good’
Scott Graham, associate professor at the Department of Rhetoric & Writing at the University of Texas at Austin, tasked his pupils with writing a 2,200-word essay about a campus-wide issue using AI. Students were free to lightly edit and format their work, with the only rule being that most of the essay had to be automatically generated by software.
In an opinion article on Inside Higher Ed, Graham said the AI-assisted essays were “not good,” noting that the best of the bunch would have earned a C or C-minus grade. To score higher, students would have had to rewrite more of the essay in their own words, or craft increasingly narrow and specific prompts to coax back more useful content.
“You’re not going to be able to push a button or submit a short prompt and generate a ready-to-go essay,” he told The Register.
“I think if students can do well with AI writing, it’s not actually all that different from them doing well with their own writing. The main skills I teach and assess mostly happen after the initial drafting,” he said.
“I think that’s where people become really talented writers; it’s in the revision and the editing process. So I’m optimistic about [AI] because I think that it will provide a framework for us to be able to teach that revision and editing better.
“Some students have a lot of trouble sometimes generating that first draft. If all the effort goes into getting them to generate that first draft, and then they hit the deadline, that’s what they will submit. They don’t get a chance to revise, they don’t get a chance to edit. If we can use those systems to speed write the first draft, it might really be helpful,” he opined.
Listicles, informal blog posts, and news articles are easier to imitate than niche academic papers or literary masterpieces. Teachers will need to be thoughtful about the essay questions they set, and make sure students’ knowledge is really being tested, if they don’t want students to cut corners.
“The onus now is on writing teachers to figure out how to get to the same kinds of goals that we’ve always had about using writing to learn. That includes students engaging with ideas, teaching them how to formulate thoughts, how to communicate clearly or creatively. I think all of those things can be done with AI systems, but they’ll be done differently.”
The line between using AI as a collaborative tool and using it to cheat, however, is blurry. None of the academics teaching writing who spoke to The Register thought students should be banned from using AI software. “Writing is fundamentally shaped by technology,” Vee said.
“Students use spell check and grammar check. If I got a paper where a student didn’t use these, it stands out. But it used to be, 50 years ago, writing teachers would complain that students didn’t know how to spell so they would teach spelling. Now they don’t.”
Most teachers, however, told us they would support regulating the use of AI-writing software in education.
Mills was particularly concerned about AI reducing the need for people to think for themselves, given that language models carry forward the biases in their training data. “Companies have decided what to feed it and we don’t know. Now, they are being used to generate all sorts of things from novels to academic papers, and they could influence our thoughts or even modify them. That is an immense power, and it’s very dangerous.”
Lauren Goodlad, professor of English and Comparative Literature at Rutgers University, agreed. If students parrot what AI comes up with, they may end up more likely to associate Muslims with terrorism or repeat conspiracy theories, for example.
“As teachers, we are experimenting, not panicking,” Monroe told The Register.
“We want to empower our students as writers and thinkers. AI will play a role… This is a time of exciting and frenzied development, but educators move more slowly and deliberately… AI will be able to assist writers at every stage, but students and teachers will need tools that are thoughtfully calibrated.”