Nine in ten UK undergraduates surveyed by the Higher Education Policy Institute (Hepi) reported using generative AI (GenAI) in their assessments, a striking shift in academic practice. Students can now produce polished assignments while engaging only superficially with the underlying concepts, raising concerns that automation may hollow out learning itself. The risk is that intellectual struggle becomes a mere performance of intelligence, which is why universities must reassert a basic truth: learning cannot be automated or outsourced. Writing, researching, and reflecting remain essential to genuine understanding.
To respond to both the opportunities and the challenges of GenAI, educational institutions must anchor their innovations in the fundamental principles of human learning. One way to do this is to use AI to support, rather than supplant, cognitive development. Purposefully designed AI tools can ease cognitive overload by providing structured explanations, examples, and formative feedback, freeing students to focus on reasoning and interpretation. However, this potential is realized only when students are explicitly taught metacognition: the ability to think about their own thinking.
Metacognitive learners become more discerning users of technology because they learn to recognize how AI influences their thinking. This skill can be fostered through straightforward classroom practices. For instance, students might keep brief AI reflection logs, recording which tools they used, what outputs they received, and how they revised or challenged those outputs. A “compare and critique” exercise, in which students complete a task both with and without AI and then reflect on differences in reasoning, depth, and confidence, is similarly effective. “Think-aloud” activities, in which a tutor evaluates an AI response in real time by questioning its assumptions, evidence, and tone, model the same habit. Such routines turn AI from a mere shortcut into a supportive scaffold, helping students become aware of their own learning habits and protecting the reflective capacity that is central to intellectual growth.
Despite this digital shift, the foundations of effective teaching remain steadfast. Clarity is essential: students must understand not only what they are learning but also why it matters. Transparency about learning goals and assessment criteria turns evaluation into a tool for growth rather than a final judgment. Feedback, whether generated by humans or AI, should drive improvement, making assessment a continuing part of learning rather than its end point.
Moreover, learning continues to be a profoundly social endeavor. In a world dominated by personalized digital feeds and algorithmic recommendations, universities must maintain their role as venues for dialogue, disagreement, and collaborative discovery. Group discussions, collaborative projects, and peer reviews foster empathy, intellectual curiosity, and resilience—qualities that even the most advanced machine cannot replicate. These attributes remain the hallmarks of meaningful education.
AI as a cognitive co-pilot
The most effective application of AI in higher education may come from what can be termed extended cognitive hygiene: a disciplined approach to selecting and using digital tools. Every use of AI should serve a clear pedagogical purpose, whether prompting reflection, stimulating creativity, or easing cognitive load. Central to this is AI literacy, which must be taught explicitly rather than assumed. Educators can build “verification tasks” into lessons that require students to cross-check AI-generated claims against reliable sources and identify inaccuracies or gaps. Comparing how different AI tools respond to the same prompt can likewise sharpen students’ ability to spot bias and evaluate outputs critically.
Engaging students in labeling claims as “evidence-based,” “speculative,” or “unsupported” cultivates habits of critical analysis. Over time, these structured exercises can transform skepticism into skill, empowering students to engage confidently with AI while maintaining academic integrity. Additionally, prompt design can serve as a new avenue for developing critical thinking. Activities such as prompt-redrafting, where students refine their questions based on feedback, and the “question ladder,” which encourages successive improvements to a vague prompt, can reinforce this skill set. Thinking aloud while crafting questions helps students grasp how intention influences outcomes, demonstrating that effective inquiries are products of disciplined thought. In this way, AI can function as a cognitive co-pilot, enhancing rather than replacing human capabilities and allowing universities to merge digital fluency with intellectual integrity.
GenAI does not herald the demise of learning; rather, it invites a re-examination of what it truly means to learn. The objective is to harmonize the precision of machines with the empathy, creativity, and ethical awareness that are uniquely human. As GenAI continues to evolve and improve, the future of higher education hinges on the cultivation of intelligent, reflective, and compassionate individuals. The challenge ahead lies not in outsmarting AI but in outgrowing it.
Patrice Seuwou is an associate professor of learning and teaching and director of the Centre for the Advancement of Racial Equality at the University of Northampton.