By Shyam Sharma
Associate Professor and Graduate Program Director
Program in Writing and Rhetoric
shyam.sharma@stonybrook.edu
Last year, students in my WRT 102 class systematically explored how AI chatbots can help them as writers–that is, if they learn how to use the tools effectively and responsibly. Breaking the writing of a research-based paper into a sequence of steps, my students asked ChatGPT to assist them with a few dozen different tasks related to those steps. The results have so far been fascinatingly mixed.
Unfortunately, most faculty across campus do not have the luxury of time (i.e., small classes), teaching experience/expertise (in some cases), or the curricular space that writing teachers do to engage their students in research and writing processes, with or without the assistance of AI. With these challenges in mind, the following are some strategies that can be helpful to colleagues across campus:
Time
The first challenge, lack of time, is most significant for faculty teaching larger classes. Natural language processing tools can be a convenient way to reduce the time and labor that writing-intensive assignments require from instructors. Unfortunately, this convenience comes with a number of risks, including plagiarism and the bypassing of learning and skill development. Consequently, all instructors who assign writing might have to allocate some time to writing instruction, as well as time to help students use the new tools effectively and responsibly.
Adopting the “writing to learn” (WTL) approach (rather than just “learning to write”) can justify the investment of some time in teaching writing, as it helps students “discover” ideas and perspectives, “create” connections and structures, and “interpret” text or the real world around them. If writing is to mediate learning, then it must be seen as more than merely a means for encoding pre-existing ideas in words; it calls for assisting students through the process, including in using AI tools productively and responsibly.
With some scaffolding/support and the right approach, even a small amount of time to help students use AI tools meaningfully can greatly help to harness the power of writing as a means for learning, fostering disciplinary identity, and preparing for professional careers. Below I share a range of strategies to optimize whatever little time faculty can invest.
Trust
Trust is the second challenge when individualized attention is not feasible, and AI tools can complicate it even further. I started my teaching experiments with AI using a simple rule–“cite what you use”–but even in a small class, that simple rule didn’t survive the complexity of how students use the tools. Students used chatbots in too many ways throughout the process to simply cite specific words or ideas!
My experience so far is that the only thing we can do is develop trust in our students as we help them build their own “brain muscles” for research and writing, with or without AI tools. AI tools are making academic integrity issues too complex to address through technology or policy alone. That said, they can also enhance writing if used effectively and responsibly.
In place of doubt and distrust, we must teach our students where to draw the line for themselves. We can no longer just specify “requirements” like page or word count, topic or method specifications, number of citations or strategies of source engagement, etc. We should help students understand and achieve the goals of the assignment by using appropriate tools and resources. We should help students answer their own educational questions: Why am I in this course? What skills will I develop if I invest adequate time and effort–including with AI assistance?
Broadly put, educators are bound to shift focus from policing plagiarism to bolstering originality, from requirements to commitment toward learning, from fear to interest, from policy statements to support, from challenge to confidence, from moralizing to motivating. Students can best decide when and how to use AI tools if they possess sufficient skills and confidence and are inspired enough to take on the challenges of learning.
Teaching
Beyond allocating some time and shifting focus toward trust building, faculty across the disciplines need new teaching strategies to mitigate the challenges posed by AI tools. That requires first educating students about what “writing” means in the context of learning and in relation to AI.
Some students ask: Could we soon be just asking AI to do all our writing? This question views writing as a product, ignoring that the use of a text generator in the process of learning is fundamentally different from a businessperson using it to cut costs, a father using it to make lasagna, or a freelance journalist using it to speed up writing. Unlike other users, students must use writing to develop their own brain muscles for researching and reading, summarizing and synthesizing, citing and engaging sources, developing and defending an intellectual position, organizing and creating flow in their ideas, and so on–with and without AI tools, as these become more and more a part of our world. Simply asking a chatbot to “do” these things for us is more like asking for the answers to all math problems and less like using a calculator to better handle the more complex ones.
Professors should also identify and address the distinct challenges posed by AI use in writing processes in different disciplines and professions, from the ethical in medicine to the legal in engineering to the financial in business. Creating and using machine-generated language adds more layers of responsibility for “languaging” than we have previously known. This calls for some teaching of “critical AI literacy” skills–including technical skills, rhetorical savvy, and political and ethical considerations.
To summarize, a little time, a focus on trust, and a few teaching strategies could turn a menace into a meaningful resource. From their explorations so far, my students have created a list of tasks that ChatGPT can (potentially) assist them with to write better, faster, etc–that is, if they have the skills and invest the time to make that assistance meaningful: find sources**, suggest new ideas or perspectives*, help to brainstorm or start writing*, jog memory on a topic, find/generate basic knowledge about a topic**, outline a paper*, write up thesis statements and topic sentences*, elaborate topic sentences or citations*, tighten and otherwise revise a draft*, recognize rhetorical strategies in samples, change the style of a draft such as by reducing jargon, give feedback or critique on a draft*, edit for clarity and correctness, etc. In the list above, to represent the cautions my students say are necessary, I’ve used two asterisks where they’ve flagged a task for unreliability (such as making up sources and facts) and one asterisk for other kinds of problems.
I must also add that my class has found that ChatGPT isn’t very reliable even with papers based on library and internet research–not to mention papers that are lab-based or fieldwork-based, creative or contextual, culturally informed or sensitive. And yet, where there is instructional support and seriousness on the part of student writers, the tool becomes more and more useful. Hence the need for some time and trust. In contrast, instructors who simply assign essays and wait for the deadline are going to receive chatbot-generated papers, paragraphs, or paraphrases from many students.
With the three major challenges above in mind, as well as the cautions, I would like to share a class handout that I created for AI-assisted writing instruction for a research-based paper. Please adapt any part of it as best serves the needs of your courses, assignments, disciplines, and contexts. The handout can be found at the link below.
Independent Versus AI-Assisted Learning of Research and Writing Skills Handout