
Scaling Law Endures as AI Giants Predict 2026 Breakthrough in Computing Power

AI leaders predict a 2026 breakthrough as NVIDIA’s Blackwell platform boosts inference speed, while Google calls for doubling computing power every six months.

As 2025 concludes, the tech world reflects on a year marked by a remarkable evolution in artificial intelligence (AI) and its implications for society. This year has seen both rapid advancements and increasing scrutiny of the path towards artificial general intelligence (AGI). While 2024 sparked widespread curiosity about AI, the consensus among industry leaders is that 2025 has underscored AI’s profound impact on daily life.

Prominent voices in the field have offered contrasting perspectives on the future of AI. In mid-2025, Sam Altman, CEO of OpenAI, asserted in his blog post titled “The Gentle Singularity” that the blueprint for AGI is already established. “We already know how to build AGI. In 2026, we will see systems that can generate original insights,” he proclaimed. Altman argues that the Scaling Law has not yet reached its peak and predicts that the cost of intelligence will soon approach zero due to advances in automated electricity production.

In a similar vein, Jensen Huang, the CEO of NVIDIA, has shifted discussions from a focus solely on computing power to what he terms the “AI factory.” In a speech at the end of 2025, he remarked, “The bottleneck of AI is no longer imagination but electricity. In the future, the Scaling Law will not only involve stacking models but also a 100,000-fold leap in inference efficiency.”

By contrast, Yann LeCun, former chief scientist at Meta, has voiced skepticism regarding large language models (LLMs). Prior to his departure to start a new venture, he stated, “LLMs are a dead end on the path to AGI. They have no world model, like a castle in the air without a body.” This divergence in viewpoints illustrates the ongoing debate over the viability of current AI models and the future trajectory of AGI.

As 2026 approaches, questions linger about the sustainability of the Scaling Law. A recent article by Zhengdong Wang, a researcher at Google DeepMind, gained traction on social media, asserting, “The Scaling Law is not dead! Computing power is still king, and AGI is just getting started.” Wang’s analysis reviews the substantial growth in computing power over the past decade, emphasizing that historical trends suggest that with increasing computational capabilities, AI models often surpass expectations.

This year marks a significant turning point in the narrative surrounding AI, with a palpable shift toward viewing computing power as the cornerstone of progress. Reflecting on the past fifteen years, the AI community has witnessed an exponential increase in the computational resources used to train models, growing by a factor of four to five each year. This progression is supported by empirical research indicating a stable relationship between performance and computing power, with improvements in performance correlating to increases in computational capability.
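The compound-growth figures above are easy to check with back-of-envelope arithmetic. The short sketch below does exactly that; the power-law coefficients in `scaling_loss` are hypothetical placeholders for illustration, not figures from this article or any published study.

```python
# Illustrative arithmetic for the "4-5x per year" training-compute trend.

def compute_growth(factor_per_year: float, years: int) -> float:
    """Total multiplier after compounding annual growth."""
    return factor_per_year ** years

# A 4x annual growth rate sustained for a decade:
total = compute_growth(4.0, 10)
print(f"{total:.2e}")  # 4^10 = 1,048,576 -- roughly a million-fold increase

# A generic power-law scaling curve: loss falls smoothly as compute rises.
# The coefficients a and b here are made-up placeholders, not fitted values.
def scaling_loss(compute: float, a: float = 10.0, b: float = 0.05) -> float:
    return a * compute ** (-b)
```

The point of the power-law form is the one the article describes: performance improves predictably (if slowly) with compute, which is why the field's question shifted from whether a problem is solvable to how much compute solving it requires.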

Moreover, the Scaling Law has been observed to induce qualitative leaps in AI capabilities, giving rise to what are termed “emergent capabilities.” These include logical reasoning and the ability to follow complex instructions, suggesting that increased computing power not only enhances efficiency but also fosters intelligence. This has led to a paradigm shift, where discussions within DeepMind have evolved from “Can this problem be solved?” to “How much computing power is needed to solve this problem?”

Despite the optimism, the infrastructure challenges are significant. In 2025, the conversation has shifted towards “AI factories,” reflecting the growing recognition that AI has evolved into a heavy industry requiring substantial land and energy resources. Amin Vahdat, Google’s chief infrastructure officer, emphasized the necessity of doubling computing power capacity every six months to meet burgeoning demands. This urgent call for scaling aligns with Huang’s vision of the AI factory, a concept underscoring the integration of land, energy, and custom silicon chips in the AI landscape.
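Vahdat's "doubling every six months" target implies striking multi-year multipliers, which a one-liner makes concrete (pure arithmetic, no external data assumed):

```python
# Implied capacity multiplier if compute doubles every six months.

def capacity_multiplier(months: int, doubling_period_months: int = 6) -> float:
    """Growth factor after `months` of doubling every `doubling_period_months`."""
    return 2 ** (months / doubling_period_months)

print(capacity_multiplier(12))  # 4.0    -> 4x in one year
print(capacity_multiplier(60))  # 1024.0 -> roughly a thousand-fold in five years
```

That pace outstrips the historical 4-5x annual trend only modestly per year, but the compounding is what drives the land, energy, and silicon demands the "AI factory" framing describes.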

On the hardware front, the release of NVIDIA’s Blackwell platform in 2025 has provided a critical foundation for future AI development. The system interconnects multiple GPUs, enhancing inference speed and allowing even the largest models to run without hitting the memory limits of a single device. However, as power demand surges, liquid cooling has become imperative, marking a new era in AI infrastructure.

As the year draws to a close, the tech community remains engrossed in the evolving story of AI. With the Scaling Law appearing to thrive amid challenges, the dialogue surrounding AGI continues to intensify. The reflections of 2025 offer a glimpse into a future driven by unprecedented computational power and innovative thinking, with significant implications for how society interacts with technology.

Written By AiPressa Staff

The AiPressa Staff team brings you comprehensive coverage of the artificial intelligence industry, including breaking news, research developments, business trends, and policy updates. Our mission is to keep you informed about the rapidly evolving world of AI technology.


© 2025 AIPressa · Part of Buzzora Media · All rights reserved.