Tuesday, 13 May 2025

The AI Writing Debate: Missing the Forest for the Trees

By Richard Sebaggala

A recent article published by StudyFinds titled “How College Professors Can Easily Detect Students’ AI-Written Essays” revisits an ongoing debate about whether generative AI tools like ChatGPT can really match the nuance, flair, and authenticity of human writing. Drawing on a study led by Professor Ken Hyland, the article argues that AI-generated texts leave much to be desired in terms of persuasiveness and audience engagement. They lack rhetorical questions, personal asides, and other so-called 'human' features that characterise good student writing. The conclusion seems simple: AI can write correctly, but it can't make connections.

But reading the article made me uneasy. Not because the observations are wrong, but because they rest on a narrow and, frankly, outdated understanding of what constitutes good academic writing. More importantly, they misrepresent the role of generative AI in the writing process. The arguments often portray generative AI as if it were another human from a distant planet trying to mimic the way we express ourselves, rather than what it actually is: a tool designed to help us. And here’s the irony. I have experienced first-hand the limitations of human writing, even my own, and I see AI not as a threat to our creativity, but as a reflection of the weaknesses we have inherited and rarely challenged.


When I started my PhD at the University of Agder in Norway, many friends back home in Uganda already thought I was a good writer. I had been writing for years, publishing articles and teaching economics. But my confidence was shaken when my supervisor returned my first paper with over two thousand comments. Some of them were brutally honest. My writing was too verbose, my sentences too long, and my arguments lacked clarity. What I had previously thought was polished human writing was actually a collection of habits I had picked up from outdated academic conventions. It was a difficult but necessary realisation: being human doesn’t automatically make your writing better. And yet many critics of AI-generated texts would have us believe that it's the very mistakes we’ve internalised, such as poor grammar, excessive verbosity, and vague engagement, that make writing human and valuable.


This is why the obsession with 'engagement markers' as the main test of authenticity is somewhat misleading. In good writing, especially in disciplines such as economics, business, law, or public policy, clarity, structure, and logical flow are often more important than rhetorical flair. If an AI-generated draft avoids rhetorical questions or personal asides, this is not necessarily a weakness. Rather, it often results in a more direct and focused text. The assumption that emotionally engaging language is always better ignores the different expectations across academic disciplines. What is considered persuasive in a literary essay may be completely inappropriate in a technical research report.


Another omission in the argument is the role of the prompter. The AI does not decide on its own what tone to strike. It follows instructions. If it is asked to include rhetorical questions or to adopt a more conversational or analytical tone, it does so. The study’s criticism that ChatGPT failed to use personal asides and interactive elements says more about the design of the prompts than the capabilities of the tool. This is where instruction needs to change. Writing classes need to teach students how to create, revise, and collaborate using AI prompts. This does not mean that critical thinking is lost, but that it is enhanced. Students who know how to evaluate, refine, and build upon AI-generated texts are doing meaningful intellectual work. We're not lowering the bar; we're modernising the skills.

A recent study by the Higher Education Policy Institute (HEPI) in the UK revealed that 92% of university students are using AI in some form, with 49% starting papers and projects, 48% summarising long texts, and 44% revising writing. Furthermore, students who actively engaged with AI by modifying its suggestions demonstrated improved essay quality, including greater lexical sophistication and syntactic complexity. This active engagement underscores that AI is not a shortcut but a tool that, when used thoughtfully, can deepen understanding and enhance writing skills.

It's also worth asking why AI in writing causes more discomfort than AI in data analysis, mapping, or financial forecasting. No one questions the use of Excel in managing financial models or STATA in econometric analysis. These are tools that automate human work while preserving human judgment. Generative AI, if used wisely, works in the same way. It does not make human input superfluous. It merely speeds up the process of creating, organising, and refining. For many students, especially those from non-English speaking backgrounds or under-resourced educational systems, AI can level the playing field by providing a cleaner, more structured starting point.

The claim that human writing is always superior is romantic, but untrue. Many of us have written texts that are grammatically poor, disorganised, or simply difficult to understand. AI, on the other hand, often produces clearer drafts that more reliably follow an academic structure. Of course, AI lacks originality if it is not guided, but this is also true of much student writing. Careful revision and critical thinking are needed to improve both. This is not an argument in favour of submitting AI-generated texts. Rather, it is a call to rethink the use of AI as a partner in the writing process, not a shortcut around it.

Reflecting on this debate, I realise that much of the anxiety around AI stems from nostalgia. We confuse familiarity with excellence. But the writing habits many of us grew up with, such as cumbersome grammar, excessive length, and jargon-heavy arguments, are not standards to be preserved. They are symptoms of a system that is overdue for reform. The true power of AI lies in its ability to challenge these habits and force us to communicate more consciously. Instead of fearing AI's so-called impersonality, we should teach students to build on its strengths while reintroducing their own voice and judgment.

We are not teaching students to surrender their minds to machines. We are preparing them to think critically in a world where the tools have evolved. That means they need to know when to use AI, how to challenge it, how to add nuance, and how to edit its output to reflect deeper understanding. Working alongside AI requires more thinking, not less.

The writing habits we've inherited are not sacred. They are not the gold standard just because they are human. We need to stop missing the forest for the trees. AI is not here to replace the writer; it's here to make our writing stronger, clearer, and more focused, if only we let it.
