A recent publication from Google DeepMind has ignited discussion about artificial intelligence and its relationship to consciousness. The paper, authored by DeepMind scientist Alexander Lerchner, directly contradicts CEO Demis Hassabis's assertions that artificial general intelligence (AGI) is imminent. Lerchner's findings suggest that consciousness in AI systems is unattainable, despite the industry's prevailing belief that sufficiently complex computation can give rise to conscious experience.
In his March 2026 paper, Lerchner introduces the concept of the “abstraction fallacy,” emphasizing that AI systems only simulate consciousness without genuinely experiencing it. He compares this to a celebrity impersonator who, despite delivering an impressive performance, lacks the authenticity of the actual celebrity. This framework challenges the assumption that impressive language manipulation by AI indicates any internal consciousness.
Central to Lerchner’s argument is the idea of “mapmaker dependency.” He posits that AI systems rely on humans to categorize and label the complexities of reality into forms the machines can understand. For example, the human workers who label training images are essential in creating the meanings that large language models (LLMs) appear to generate independently.
Lerchner further asserts that true consciousness requires physical presence and intrinsic drives, characteristics absent in digital systems. Johannes Jäger, an evolutionary systems biologist, reinforces this point: “You have to eat, breathe, and you have to constantly invest physical work just to stay alive, and no non-living system does that.” Without physical embodiment, LLMs remain mere “patterns on a hard drive,” activated only when prompted and devoid of any internal motivation or meaning beyond human-defined tasks.
This distinction between simulation and genuine instantiation suggests that AGI, in the absence of consciousness, may amount to a highly advanced tool rather than a sentient being. The implications are significant: the argument calls for a reevaluation of the narratives that companies like Google have promoted about the capabilities of their AI technologies.
While Lerchner’s arguments are being received with interest, they do not represent a groundbreaking shift in the discourse on AI consciousness. Philosophy professors and leading researchers in the field, including Mark Bishop of Goldsmiths, University of London, acknowledge the validity of Lerchner’s arguments but point out that similar positions have been articulated before. Bishop says he supports “99 percent” of Lerchner’s conclusions but finds it unsurprising that Google allowed the publication, given the company’s existing narrative promoting AGI.
This situation creates a credibility paradox for those assessing claims made by AI companies. When internal researchers publish findings that contradict the corporate vision, it lays bare the divide between marketing strategies and scientific realities. The presence of such mixed signals raises questions about whether this discrepancy reflects a genuine uncertainty within the company or a calculated positioning in an ongoing debate about consciousness in machines.
As the discussion surrounding AI and consciousness continues to evolve, the need for transparency in the technology sector remains critical. The distinction between what AI can simulate and what it truly embodies will shape both public perception and future research directions. This ongoing dialogue may ultimately influence the trajectory of AI development and the ethical considerations that accompany it.