Counterpoint Research reported on Thursday that the AI infrastructure market is undergoing a significant transformation as hyperscale cloud providers increasingly shift away from legacy x86 central processing units (CPUs) from Intel Corp. and Advanced Micro Devices Inc. in favor of proprietary designs built on Arm Holdings plc architecture. The transition aims to improve cost efficiency, control, and performance.
Leading tech giants, including Alphabet Inc.'s Google, Amazon.com Inc.'s Amazon Web Services (AWS), Microsoft Corp., and Meta Platforms Inc., are at the forefront of this shift, redesigning their AI server architectures to incorporate Arm-based CPUs, according to Counterpoint.
Traditionally, these companies relied on x86 processors for their software compatibility and established infrastructure. However, the growing prevalence of custom AI accelerators has necessitated a move toward heterogeneous computing architectures, accelerating the adoption of Arm-based CPUs built on Neoverse cores.
This transition is part of a broader strategy wherein hyperscalers are increasingly designing their own silicon to reduce dependency on external vendors, improve profit margins, and lower costs associated with running AI workloads at scale. Counterpoint highlights that Arm’s architecture offers significantly better performance per watt compared to traditional x86 systems, a crucial advantage for power-constrained data centers.
Current deployments of Arm-based CPUs across AI infrastructure illustrate this trend. Google is scaling its Axion CPU alongside the next generation of its Tensor Processing Units (TPUs). Meanwhile, AWS is expanding its Graviton processors alongside Trainium chips, and Microsoft has integrated its Azure Cobalt Arm CPU with its Maia AI accelerators, embedding Arm technology into its AI stack from the outset.
These developments indicate that Arm technology is evolving beyond general-purpose cloud workloads and becoming integral to AI server design. As hyperscale companies align their CPU designs with proprietary AI accelerators, Meta Platforms has reinforced this trend by choosing Arm as a strategic partner for its next-generation Meta Training and Inference Accelerator (MTIA) infrastructure. Meta also serves as the initial customer for Arm's Artificial General Intelligence (AGI) CPU platform.
The transition to Arm-based CPUs is expected to gain momentum in the second half of 2026, fueled by a wider rollout of in-house Arm CPUs. Counterpoint projects that Arm-based CPUs could account for approximately 90% of host CPU deployments in custom AI Application-Specific Integrated Circuit (ASIC) servers by 2029, up from about 25% in 2025.
As hyperscalers escalate their in-house silicon strategies, the ramifications extend throughout the semiconductor supply chain, with rising demand for advanced manufacturing supporting both AI accelerators and Arm-based CPUs. This coordinated shift underscores a pivotal moment in the tech industry as companies seek to optimize performance and efficiency in an increasingly competitive landscape.