ChatGPT is not your friend
As a high school English teacher and an Independent Educational Consultant, my fall seasons are awash in essays. This season’s rains have been unusually acidic.
In my English classroom, I teach students to argue persuasively, avoid logical fallacies, and craft insightful analyses. In my independent work, I help clients develop reflective college application essays. In both, I prioritize student voice, clarity, and concision.
ChatGPT is good at none of these things.
ChatGPT, my dear students, is not your friend.
There are practical reasons for avoiding the large language model. Colleges run essays through AI-detection tools, and if they have doubts, they will ask students for evidence of originality. In school, most students have already discovered ChatGPT’s compulsive lying problem: asked for references for a research paper, ChatGPT fabricated author names and titles that sounded plausible but did not exist. Asked about important scenes in George Orwell’s 1984, ChatGPT shared details about the pivotal “prostitute march,” a scene that never occurs in the novel. Like a toddler who swears he “didn’t cross the street” when HE ALREADY IS ACROSS THE STREET, ChatGPT fabricates information to bridge gaps in reality, just to please you.
These lies, like the toddler’s, are easy to catch and overcome. What worries me more is the use of ChatGPT to “smarten up” student writing. Students now ask AI to “improve” their writing or “correct the grammar.” What students fail to…