Artificial Intelligence (AI) is becoming more advanced by the minute and more integral to society every day. Among the latest innovations is ChatGPT, an AI chatbot that spits out humanlike conversational dialogue, drawing on patterns learned from enormous amounts of text and on the prompts it is given. ChatGPT can research or write anything from original academic papers to company memos. This tool is also used in schools by both teachers and students.
As a recent college graduate who has watched those around me pick up ChatGPT, I can't help noticing that this incredibly impressive innovation can simplify day-to-day tasks like writing articles or emails (before you ask, yes, I wrote this myself). It can also make the workplace much more efficient. But should bosses credit work to an employee who typed a prompt into ChatGPT and received a finished document from the program? Should a teacher give a good grade to a student whose paper amounted largely to asking AI to do their work?
According to WRAL, certain districts in North Carolina are embracing ChatGPT in their middle and high school curricula, most notably Wake County and Chapel Hill-Carrboro City. Others, like Granville County, are open to its use but have not yet developed an official policy.
ChatGPT’s advocates argue that the chatbot can enrich students’ experience in the classroom. To counter the worry that it hinders the development of necessary skills, some point out that because the AI’s outputs are not always accurate, students can hone their critical thinking by sorting what is true from what is not in the information it produces. Millbrook High School history teacher Mark Grow told WRAL that ChatGPT can help students access information quickly in the classroom, creating an avenue for them to have “higher-level” conversations sooner and become more “methodical curators of information.”
Other districts, like Wilson and Sampson counties, believe ChatGPT ought to be banned from school servers and excluded from the curriculum, arguing that it is nearly impossible to tell whether students are using it to cheat on assignments and tests or to plagiarize academic papers.
It seems to me that those who are skeptical of the technology’s impact are seeing the situation more clearly. ChatGPT largely does students’ work for them, eroding their work ethic in and out of the classroom. Perhaps just as worrisome, ChatGPT will replace the need for still-developing minds to learn research and writing skills.
Why spend hours working on a project or a paper when it can be done for you in seconds, with almost no risk of being caught? And why hone study skills when chatbots can find answers to your questions just as fast? Before long, students will find that their capacity for retention has noticeably weakened. In essence, educators risk trading vital skills like deep reading, research, writing, (true) critical thinking, memory and retention, and creativity for mere proficiency in using AI.
The stifling of creativity may be equally detrimental. If students use AI to write their work or do their research for them, their creativity is squandered every time. Present the same assignment to a class of students who all hand it off to ChatGPT, and each may receive a product that looks as distinct as if they had written it themselves. But, importantly, they did not. Rather than letting their unique perspectives shape the final product, they watch as an AI chatbot instantaneously produces it for them.
Here is a little story from Business Insider to put this into perspective: earlier this year, the military created a highly sophisticated AI robot capable of detecting humans approaching from relatively large distances. But when human creativity was brought to bear to test the limits of the software, a group of Marines outsmarted the wildly expensive Pentagon tool eight times out of eight.
To avoid being detected, the Marines figured out that all they had to do was not walk like humans. Some did somersaults, while others giggled as they approached under a cardboard box or dressed up like trees. If human creativity is not cultivated the way it should be in middle and high school, and students instead become dependent on AI technology, they become no more impressive than this limited robot. In the worst case, over time they will be unable to come up with ways to outsmart the very technology they depend on, as the Marines did.
Student achievement must be prioritized in schools. I believe that granting students free rein with not-yet-regulated AI tools will undoubtedly hurt them more than it will help them.
Schools should not simply embrace this powerful technology and assume students will use it responsibly, or that they won’t be harmed by its easy answers. If schools want to teach students how to use it responsibly, I recommend offering extracurricular courses or optional electives for high school students (middle school seems far too young; the earlier the tool is introduced, the less time students have for authentic development).
Since certain schools are already taking the leap and embracing AI, it is now more important than ever for parents who don’t want their children to become dependent on it to exercise their school choice options. It is one thing to opt into a ChatGPT elective; it is another matter entirely to be forced to learn with it or to be surrounded by peers using it. Students who would rather hone the very skills ChatGPT hinders deserve the option to attend a school where they can.