
Scaling Law Endures as AI Giants Predict 2026 Breakthrough in Computing Power

AI leaders predict a 2026 breakthrough as NVIDIA’s new Blackwell platform boosts inference speed, while Google calls for computing capacity to double every six months.

As 2025 concludes, the tech world reflects on a year marked by a remarkable evolution in artificial intelligence (AI) and its implications for society. This year has seen both rapid advancements and increasing scrutiny of the path toward artificial general intelligence (AGI). While 2024 sparked widespread curiosity about AI, the consensus among industry leaders is that 2025 has underscored AI’s profound impact on daily life.

Prominent voices in the field have offered contrasting perspectives on the future of AI. In mid-2025, Sam Altman, CEO of OpenAI, asserted in his blog post titled “The Gentle Singularity” that the blueprint for AGI is already established. “We already know how to build AGI. In 2026, we will see systems that can generate original insights,” he proclaimed. Altman argues that the Scaling Law has not yet reached its peak and predicts that the cost of intelligence will soon approach zero due to advances in automated electricity production.

In a similar vein, Jensen Huang, the CEO of NVIDIA, has shifted discussions from a focus solely on computing power to what he terms the “AI factory.” In a speech at the end of 2025, he remarked, “The bottleneck of AI is no longer imagination but electricity. In the future, the Scaling Law will not only involve stacking models but also a 100,000-fold leap in inference efficiency.”

By contrast, Yann LeCun, former chief scientist at Meta, has voiced skepticism regarding large language models (LLMs). Prior to his departure to start a new venture, he stated, “LLMs are a dead end on the path to AGI. They have no world model, like a castle in the air without a body.” This divergence in viewpoints illustrates the ongoing debate over the viability of current AI models and the future trajectory of AGI.

As 2026 approaches, questions linger about the sustainability of the Scaling Law. A recent article by Zhengdong Wang, a researcher at Google DeepMind, gained traction on social media, asserting, “The Scaling Law is not dead! Computing power is still king, and AGI is just getting started.” Wang’s analysis reviews the past decade’s substantial growth in computing power, arguing that, historically, models given more compute have often surpassed expectations.

This year marks a significant turning point in the narrative surrounding AI, with a palpable shift toward viewing computing power as the cornerstone of progress. Over the past fifteen years, the AI community has witnessed an exponential increase in the computational resources used to train models, growing by a factor of four to five each year. This progression is supported by empirical research indicating a stable relationship between performance and the computing power used in training: as compute grows, performance improves in a predictable way.
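The stable compute–performance relationship the research describes is typically modeled as a power law, where loss falls smoothly as training compute grows. A minimal sketch of that idea follows; the constants `a` and `alpha` are illustrative assumptions for this example, not fitted values from any published study.

```python
# Illustrative power-law scaling: loss L(C) = a * C**(-alpha),
# where C is training compute in FLOPs. The constants below are
# assumptions chosen for illustration, not fitted values.
a, alpha = 10.0, 0.05

def loss(compute):
    """Predicted loss at a given training-compute budget (FLOPs)."""
    return a * compute ** -alpha

# Growing compute 4x per year for a decade multiplies the budget
# by 4**10 (about a million-fold); loss shrinks from ~0.89 to ~0.45.
c0 = 1e21
c10 = c0 * 4 ** 10
print(round(loss(c0), 3), round(loss(c10), 3))
```

The key property, consistent with the article's framing, is that gains never stop outright: each multiplicative increase in compute buys a further, smaller reduction in loss.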

Moreover, the Scaling Law has been observed to induce qualitative leaps in AI capabilities, giving rise to what are termed “emergent capabilities.” These include logical reasoning and the ability to follow complex instructions, suggesting that increased computing power not only enhances efficiency but also fosters intelligence. This has led to a paradigm shift, where discussions within DeepMind have evolved from “Can this problem be solved?” to “How much computing power is needed to solve this problem?”

Despite the optimism, the infrastructure challenges are significant. In 2025, the conversation has shifted towards “AI factories,” reflecting the growing recognition that AI has evolved into a heavy industry requiring substantial land and energy resources. Amin Vahdat, Google’s chief infrastructure officer, emphasized the necessity of doubling computing power capacity every six months to meet burgeoning demands. This urgent call for scaling aligns with Huang’s vision of the AI factory, a concept underscoring the integration of land, energy, and custom silicon chips in the AI landscape.
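Vahdat's "double every six months" target compounds quickly. A quick sketch of the arithmetic (pure compounding, ignoring efficiency gains) shows it matches the four-to-five-fold annual growth in training compute cited above:

```python
# Capacity that doubles every 6 months compounds to 4x per year,
# in line with the 4-5x annual growth in training compute the
# article cites for the past decade.
def capacity_multiple(months, doubling_period_months=6):
    """Total capacity growth after `months` of steady doubling."""
    return 2 ** (months / doubling_period_months)

print(capacity_multiple(12))   # one year   -> 4.0
print(capacity_multiple(60))   # five years -> 1024.0
```

Sustained for five years, the same rule implies more than a thousandfold buildout, which is why the discussion turns to land, energy, and custom silicon rather than chips alone.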

On the hardware front, the release of NVIDIA’s Blackwell platform in 2025 has provided a critical foundation for the future of AI development. The new system interconnects multiple GPUs, enhancing inference speed and allowing even the largest models to run without hitting single-GPU memory limits. However, as power density surges, liquid cooling has become essential, marking a new era in AI infrastructure.

As the year draws to a close, the tech community remains engrossed in the evolving story of AI. With the Scaling Law appearing to thrive amid challenges, the dialogue surrounding AGI continues to intensify. The reflections of 2025 offer a glimpse into a future driven by unprecedented computational power and innovative thinking, with significant implications for how society interacts with technology.

Staff
Written By

The AiPressa Staff team brings you comprehensive coverage of the artificial intelligence industry, including breaking news, research developments, business trends, and policy updates. Our mission is to keep you informed about the rapidly evolving world of AI technology.


© 2025 AIPressa · Part of Buzzora Media · All rights reserved.