As 2025 concludes, the tech world reflects on a year marked by a remarkable evolution in artificial intelligence (AI) and its implications for society. The year brought both rapid advances and increasing scrutiny of the path toward artificial general intelligence (AGI). While 2024 sparked widespread curiosity about AI, the consensus among industry leaders is that 2025 has underscored AI’s profound impact on daily life.
Prominent voices in the field have offered contrasting perspectives on the future of AI. In mid-2025, Sam Altman, CEO of OpenAI, asserted in his blog post titled “The Gentle Singularity” that the blueprint for AGI is already established. “We already know how to build AGI. In 2026, we will see systems that can generate original insights,” he proclaimed. Altman argues that the Scaling Law has not yet peaked and predicts that, as data-center production becomes automated, the cost of intelligence will converge toward the cost of electricity.
In a similar vein, Jensen Huang, the CEO of NVIDIA, has shifted discussions from a focus solely on computing power to what he terms the “AI factory.” In a speech at the end of 2025, he remarked, “The bottleneck of AI is no longer imagination but electricity. In the future, the Scaling Law will not only involve stacking models but also a 100,000-fold leap in inference efficiency.”
By contrast, Yann LeCun, former chief scientist at Meta, has voiced skepticism about large language models (LLMs). Before departing to start a new venture, he stated, “LLMs are a dead end on the path to AGI. They have no world model, like a castle in the air without a body.” This divergence illustrates the ongoing debate over the viability of current AI models and the future trajectory of AGI.
As 2026 approaches, questions linger about the sustainability of the Scaling Law. A recent article by Zhengdong Wang, a researcher at Google DeepMind, gained traction on social media, asserting, “The Scaling Law is not dead! Computing power is still king, and AGI is just getting started.” Wang’s analysis reviews the substantial growth in computing power over the past decade, emphasizing that historical trends suggest that with increasing computational capabilities, AI models often surpass expectations.
This year marks a significant turning point in the narrative surrounding AI, with a palpable shift toward viewing computing power as the cornerstone of progress. Over the past fifteen years, the AI community has watched the computational resources used to train models grow by a factor of four to five each year. This progression is supported by empirical research indicating a stable power-law relationship between model performance and training compute.
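The power-law relationship described above can be sketched in a few lines. The constants below are invented for illustration only; published scaling-law studies fit their own values from experiments, and the article does not specify any.

```python
# Illustrative scaling-law sketch: loss(C) = a * C**(-b).
# The constants a and b are made up for demonstration; real
# empirical fits use values estimated from training runs.
a, b = 10.0, 0.05

def loss(compute):
    """Predicted loss for a given training-compute budget (arbitrary units)."""
    return a * compute ** (-b)

# Compute growing roughly 4x per year, as the article describes:
for year in range(6):
    c = 4 ** year
    print(f"year {year}: compute x{c:>5}, predicted loss {loss(c):.3f}")
```

The point of the power law is that loss falls smoothly but slowly: each 4x jump in compute buys a fixed multiplicative improvement, which is why sustained exponential growth in resources is needed to keep performance climbing.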
Moreover, the Scaling Law has been observed to induce qualitative leaps in AI capabilities, giving rise to what are termed “emergent capabilities.” These include logical reasoning and the ability to follow complex instructions, suggesting that increased computing power not only enhances efficiency but also fosters intelligence. This has led to a paradigm shift, where discussions within DeepMind have evolved from “Can this problem be solved?” to “How much computing power is needed to solve this problem?”
Despite the optimism, the infrastructure challenges are significant. In 2025, the conversation has shifted towards “AI factories,” reflecting the growing recognition that AI has evolved into a heavy industry requiring substantial land and energy resources. Amin Vahdat, Google’s chief infrastructure officer, emphasized the necessity of doubling computing power capacity every six months to meet burgeoning demands. This urgent call for scaling aligns with Huang’s vision of the AI factory, a concept underscoring the integration of land, energy, and custom silicon chips in the AI landscape.
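The arithmetic behind Vahdat's cadence is worth making explicit. The sketch below simply compounds a six-month doubling period; the horizons chosen are arbitrary illustrations, not forecasts from the article.

```python
# Growth implied by "doubling computing power capacity every six months".
def capacity_multiple(years, doubling_period_years=0.5):
    """Total capacity growth factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

print(capacity_multiple(1))   # 4.0   -> 4x in one year
print(capacity_multiple(5))   # 1024.0 -> roughly 1000x in five years
```

A six-month doubling compounds to 4x per year, which is in the same range as the historical four-to-fivefold annual growth in training compute the article cites; sustaining it over even five years implies a thousandfold buildout, which is why the "AI factory" framing centers on land, energy, and silicon.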
On the hardware front, the release of NVIDIA’s Blackwell platform in 2025 has provided a critical foundation for future AI development. The platform interconnects large numbers of GPUs, boosting inference speed and easing the memory constraints that previously limited the largest models. As power demands surge, however, liquid cooling has become all but mandatory, marking a new era in AI infrastructure.
As the year draws to a close, the tech community remains engrossed in the evolving story of AI. With the Scaling Law appearing to thrive amid challenges, the dialogue surrounding AGI continues to intensify. The reflections of 2025 offer a glimpse into a future driven by unprecedented computational power and innovative thinking, with significant implications for how society interacts with technology.