Harvard researchers are grappling with the implications of generative AI tools in education, particularly their potential impact on students’ cognitive development and critical thinking skills. In a recent episode of the podcast “Harvard Thinking,” experts including Tina Grotzer, a cognitive scientist at Harvard’s Graduate School of Education, and Michael Brenner, a professor of applied mathematics, discussed the challenges and opportunities presented by AI technologies in learning environments.
As generative AI becomes increasingly prevalent, a debate has emerged over its role in educational settings. Samantha Laine Perfas, the podcast’s host and a writer for The Harvard Gazette, highlighted concerns that such tools may undermine students’ ability to think independently. Reports indicate that assignments that once required hours of effort can now be completed in a fraction of the time, raising questions about long-term effects on skill development.
“Once you start to know what your mind can do that’s so much better than AI, it makes sense that some tasks are well-relegated to AI and others are not,” Grotzer stated, emphasizing the need for a balanced approach to integrating AI into learning. The conversation turned to determining which educational tasks should be assisted by AI and which should remain the purview of traditional learning methods.
Grotzer noted that while educational systems have long prioritized factual knowledge, critical and creative thinking are equally vital. “Learning also involves thinking about your mind, how we use our minds well, understanding our minds,” she explained. This perspective is crucial as educators seek to harness AI’s capabilities without sacrificing students’ cognitive growth.
Ying Xu, an assistant professor of education at Harvard, raised additional concerns about the age at which students should engage with AI tools. “What kinds of tools are we thinking of introducing?” she asked, pointing out that specialized educational tools can enhance learning, while more general AI applications may require higher levels of cognitive maturity. Xu’s research indicates that self-regulation is necessary for students to interact effectively with AI, as they must learn to balance its use with independent thinking.
In a survey of 7,000 high school students, nearly half reported feeling overly dependent on AI for their learning, with many expressing difficulty in limiting its use. This highlights a broader challenge: how to incorporate AI in educational settings while preserving essential skills like critical thinking and self-regulation.
Brenner, who teaches graduate-level courses, shared his experience adapting his curriculum in light of AI advancements. He noted that students should be challenged to tackle more complex problems rather than relying on AI for straightforward answers. “I think that because we have AI, students should do more, they should solve harder problems,” he asserted. Brenner recounted redesigning his course to require students to create problems that existing AI tools could not solve, fostering deeper engagement and understanding.
Yet Grotzer expressed concern over the quality of student submissions as AI-generated content became more prevalent. “I’m reading 60 pages of glop; it’s not terribly thoughtful,” she remarked, pointing to the need for educators to rethink how they assess student work in an era of AI. The shift toward more authentic assessments is critical, as is ensuring that students maintain a sense of agency in their learning.
Xu highlighted the importance of human interaction in education, noting that while AI can provide information, it cannot replace the relational aspects of learning. “Learning is much more than just exchanging information and receiving feedback,” she stated. Her studies show that students often prefer feedback from human instructors over AI-generated responses, indicating the value of personal connection in the educational process.
As the discussion concluded, the panelists urged educators and parents alike to consider the implications of AI for children’s lives. “What I would add is that AI is very important, but there are also many other things going on in young people’s lives that are equally important,” Xu added. The need for comprehensive education that balances technology with personal growth and emotional development is more pressing than ever.
In this rapidly evolving educational landscape, the challenge will be to integrate AI tools in a manner that enhances learning while fostering critical thinking, emotional intelligence, and agency among students. As Grotzer emphasized, the conversation is ongoing, requiring continuous reflection on the capabilities of both AI and the human mind.