
The hype around artificial intelligence has driven its adoption all around us, from businesses to search engines. Yet the technology remains a topic of fierce debate, as more and more studies show the serious harm AI infrastructure poses to our planet, our health and our job security. Most concerning, though, is its recent introduction into our education systems. Teachers and students alike have rapidly adopted the technology as a path toward seemingly faster learning. Even here at the University of Connecticut, the use of AI has seen a marked increase this year. That alone is no surprise: as a quick and easy means to an end, it has earned a place of honor among the stressed-out, sleep-deprived students of our college, who seek an easy solution to their workload. Even some of our professors have begun encouraging the use of artificial intelligence, building it into their pedagogy, with classes as disparate as computer science, linguistics and journalism now incorporating the technology. This pattern, however, poses a serious danger to our educational environments, threatening to damage not only our classroom experience but our learning itself.
In the 2025 school year, 85% of teachers reported using AI in some form in their classrooms, and 86% of students admitted to using AI alongside their schoolwork. Proponents of AI welcome these numbers, crediting the technology with opening up education and streamlining the teaching process. Some teachers describe AI as invaluable for lesson planning and grading. In many minds, the free time created by these shortcuts allows for more student-teacher interaction, creating a thriving school system.
…Or at least, that’s what many expect. In reality, AI use has already created severe impediments to learning. Socially, and contrary to popular belief, it has been shown to harm student-teacher relationships. Even if AI frees up more time for educators and their pupils to interact directly, its mere presence in the classroom creates a sense of disconnection. In one study, 50% of students reported feeling less connected to their teachers when an AI language model was part of the classroom learning experience, rather than feeling neutral or positive about it. Teachers describe the same strain: 71% of educators say the presence of artificial intelligence, both in and out of the classroom, has created an additional burden, forcing them to question the legitimacy of their students’ work.
The true danger of artificial intelligence, however, comes from its impact on student learning. A 2026 study by the Brookings Institution, covering school systems in over 50 countries, found that AI in education risks undermining the core foundational development of students. Administrators often like to believe that by allowing AI as a supplemental resource to teacher instruction, they are giving an extra tool to students and educators alike. In English classrooms, it is often introduced as a way to automatically edit and fix students’ grammar and writing. Yet students who use AI to this end often perform worse than their peers who don’t. The reason is a pattern known as cognitive offloading: users of AI models increasingly shift their thinking onto the technology, leading to a kind of cognitive atrophy commonly associated with aging brains, a concerning development for an education system meant to capitalize on the neuroplasticity and problem-solving abilities of young learners.

Notably, cognitive offloading is not a new issue in education. Calculators decreased reliance on mental arithmetic, and keyboards (such as the one used to write this article) reduced the need for handwriting, contributing to the partial extinction of cursive. AI, however, represents a rapid acceleration of this phenomenon, which makes it far more dangerous. Calculators and keyboards were built into our education system as end-stage shortcuts: the underlying skills are taught before the tools are, so students aren’t reliant on them. The same does not hold true for artificial intelligence, and students who use AI perform, on average, worse than their peers.

The belief that AI is actually good for learning comes instead from the false assumption that high grades must equal deep learning. Imagine this: Student A takes an arithmetic test they didn’t study for and earns an “A-” after getting lucky on the multiple-choice questions. Happy with their grade, they walk away. Student B fails the test at first, but goes back, studies and reviews their mistakes. After retaking the test, they earn the same “A-” as their peer. Both grades are identical, but who really learned more from the experience? Who is better prepared to apply that knowledge in the future?

This thought experiment has already played out in practice. A report by Oxford University found that six in ten students felt AI negatively impacted their retention of the skills they learned in class; only by learning through their own mistakes were they able to make real improvement. Yet AI interferes with students’ ability to recognize and grow from these errors. Language models are inherently sycophantic, prone to agreeing with the user in order to elicit a positive response.
This tendency not only encourages incorrect work but leaves students uncomfortable with and unwilling to accept constructive criticism, undermining the administrative argument that AI helps students improve from their mistakes.
It is clear that the presence of AI in our learning spaces is not only counterproductive, but dangerous. As the debate rages on about the ethical implications of AI’s continued expansion, we must not forget the consequences of its introduction into the minds of the next generations. For the sake of our children and the futures ahead of them, the message must be clear: keep the robots out of our classrooms.
