You’ve heard the argument that every new technology poses its own existential threat. The calculator diluted our ability to do mental math; our cellphones made us less sociable; study aids like Chegg and SparkNotes made us, ironically, study less.
And just as everyone was worried that computers and the internet would be the downfall of mankind, some believe that AI poses the same kind of threat, to a much greater degree.
It’s worth asking: where do the clichés end and the truths begin? In many cases, those who are skeptical of AI are written off as paranoid traditionalists, much like the camp of people who were wary of the proliferation of search engines. But consider this…
At the turn of the century, E.D. Hirsch Jr. published an essay called “‘You Can Always Look It Up’… Or Can You?” In it, he argued that the now-integral method of teaching “learning skills” actually hinders students’ ability to absorb knowledge. This is because students are taught how to look up information without being taught how to distill it. They are inadvertently robbed of the process by which they come to understand the pieces of knowledge they “look up.”
Of course, the internet proved to be a net positive (in most cases) for productivity and communication. However, it is hard to deny that students now read less, write more poorly, and put less effort into studying. The accessibility of surface-level information, and the encouragement to capitalize on it, are arguably to blame.
The same alarm bells that sounded for Hirsch 25 years ago are sounding for education leaders like Robert Pondiscio in this new age of AI. In his eloquent phrasing: “What Hirsch feared might happen to students who relied too heavily on search engines pales next to the complacency invited by AI, which offers the illusion of mastery without the work of learning. It allows both students and teachers to skip the hard part—the thinking.”
“The illusion of mastery without the work of learning.” The risk that AI poses today goes beyond limiting students’ competency; it is, at best, diminishing their work ethic and, at worst, rendering raw skills obsolete.
When students use AI, answers that once took diligent research arrive wrapped with a neat bow. The mind that receives them, as Pondiscio points out, remains idle.
Of course, this concern is nothing new. Brilliant minds have been calling attention to the tradeoffs of rapid technological advancement since the beginning of time. But as technology increasingly supplants the need for inherently human cognition, the problems I mentioned become more dire, and more urgent.
Students are using chatbots to do the studying for them, as schools and teachers increasingly look to AI to improve efficiency or even craft assignments.
Overdependence on AI will have the opposite effect of what its siren song promises. The dilution of our inherent human ability to think and reason will be the price paid for “efficiency” and good grades.
At least when search engines were proliferating, users still needed good judgment to search for the right things and sift through the limitless links those searches returned. AI, in many cases, does the researching, compiling, thinking, judging, and presenting for us.
Our minds will be left untrained in every aspect but the one which conditions us to trust rather than verify.
Again, Pondiscio captures it best: “This is not an argument for technophobia. It’s an argument for intellectual vigilance. The civic mission of education is not to make students efficient consumers of information but to form judicious minds—citizens who can weigh evidence, detect bias, and recognize coherence or nonsense when they see it. If AI makes those faculties atrophy, it will have made us not smarter but shallower.”
The new technology, while it invites this more passive disposition, is not in itself an inherent evil. But used improperly or without restraint, it can do real harm. Schools have a duty to teach students how to use AI responsibly, including as an amplifier for understanding and retaining knowledge. That means the curious mind must be cultivated first, before AI is introduced as a helpful tool.