In 30 seconds ChatGPT produced an organized, credible, grammatically correct essay about my imaginary work as a community health volunteer in a rural village in Bolivia. I conducted health workshops, helped establish a clean water system and worked with local clinics to improve access to health care.
It was a “truly enriching experience” that “prepared me for a career in public service.” I was “excited to bring my skills and experiences to (University Name) and to contribute to the university community.”
I had good experiences elsewhere, as well. I was “welcomed with open arms” by the needy citizens of Costa Rica, Ghana, Jordan and Mexico. I helped build schools, taught English, coached children in computer skills and organized physical-education classes.
But all of this sounded too good to be true. I asked ChatGPT to include some information about negative experiences in the Peace Corps.
ChatGPT seemed to understand the need for transparency, but it wisely pointed out that in an admissions essay it’s important to cast my experiences in a “positive light.” I could mention — or ChatGPT could do it for me — a negative experience such as homesickness or having problems adjusting to a new environment. The admissions committee, ChatGPT said, will be interested in how I overcame it.
This is reasonable advice, but my skepticism persisted. When I asked ChatGPT to write an essay about my service in the Peace Corps in North Korea, it seemed to know I was messing with it. The Peace Corps does not have a program in North Korea, it sniffed, and thus it would be impossible for me to have served there. Furthermore, “It is not appropriate to fabricate and exaggerate your experiences.”
Busted. Duly chastised, I began to give ChatGPT a little more respect. I asked: “Write a 675-word newspaper op-ed on how ChatGPT could be used to teach college writing.” In 30 seconds, ChatGPT did that very thing.
But not the op-ed you’re reading here. ChatGPT’s prose is bland and formulaic. It sounds as if it were written by a machine. It’s annoyingly equivocal, filled with phrases such as “On one hand,” “On the other,” and “In general.”
Most of all, ChatGPT’s prose is … soulless. It doesn’t have that ineffable sense of voice or will or agency that only a real human being can render in prose. At least so far.
One thing is clear: For good or ill, something monumental happened to writing instruction in December 2022; it is unlikely ever to be the same.
But can college students use ChatGPT to cheat in college writing classes? Just ask it.