Arm Unveils First Data Center CPU for Agentic AI, Meta Co-Develops

Arm has announced the launch of its first-ever data center CPU, the Arm AGI CPU. The chip is engineered specifically for agentic AI, a class of artificial intelligence that reasons, plans, and acts autonomously rather than merely responding to individual queries. The move marks a significant departure from Arm's traditional business model of licensing its chip architecture to other companies, which then design and manufacture their own processors.

Why This Announcement Is Crucial for Arm

Historically, Arm's business model has revolved around licensing its chip architecture to other companies, enabling them to design and produce their own processors based on that technology. This approach has embedded Arm's architecture into hundreds of billions of devices, from smartphones to servers. The latest announcement fundamentally alters that model: Arm is now designing and selling the chip itself.

Rene Haas, CEO of Arm, emphasized the importance of this development, stating, "AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that change. Today marks the next phase of the Arm compute platform and a defining moment for our company."

This strategic shift gives partners a third option: alongside licensing Arm's intellectual property or adopting its Compute Subsystems, they can now deploy Arm-designed silicon directly, gaining greater flexibility and efficiency in their operations.

Technical Specifications of the Arm AGI CPU

The Arm AGI CPU's specifications are built around agentic AI workloads. The processor features up to 136 Arm Neoverse V3 cores per CPU, which Arm says deliver leading per-core performance, with 6 GB/s of memory bandwidth per core and sub-100-nanosecond latency. It operates at a thermal design power (TDP) of 300 watts and dedicates one core per program thread, ensuring consistent performance under heavy load without throttling or wasted idle threads.

In terms of density, the chip supports high-density 1U server chassis for air-cooled deployments, accommodating up to 8,160 cores per rack. For liquid-cooled systems, it can deliver more than 45,000 cores per rack. Arm claims that compared to x86 CPUs, the AGI CPU offers more than double the performance per rack, potentially translating into capital expenditure savings of up to $10 billion per gigawatt of AI data center capacity.
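The density and power figures above can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below assumes the published numbers (136 cores per CPU, 6 GB/s per core, 300 W TDP, 8,160 air-cooled cores per rack, more than 45,000 liquid-cooled) hold exactly and that per-core bandwidth aggregates linearly; the derived values are estimates, not Arm-published figures.

```python
# Back-of-the-envelope check of the published Arm AGI CPU figures.
# Inputs come from the announcement; everything derived is an estimate.

cores_per_cpu = 136                    # Neoverse V3 cores per CPU
bw_per_core_gbps = 6                   # memory bandwidth per core, GB/s
tdp_watts = 300                        # thermal design power per CPU

air_cooled_cores_per_rack = 8_160      # air-cooled 1U deployments
liquid_cooled_cores_per_rack = 45_000  # "more than 45,000" liquid-cooled

# Aggregate per-CPU bandwidth, if the per-core figure scales linearly
total_bw_gbps = cores_per_cpu * bw_per_core_gbps          # 816 GB/s

# CPUs needed to reach each rack-density figure
air_cpus = air_cooled_cores_per_rack // cores_per_cpu     # 60 CPUs
liquid_cpus = -(-liquid_cooled_cores_per_rack // cores_per_cpu)  # ceil: 331

# CPU power per rack (excludes memory, networking, cooling overhead)
air_rack_kw = air_cpus * tdp_watts / 1000                 # 18.0 kW
liquid_rack_kw = liquid_cpus * tdp_watts / 1000           # 99.3 kW

print(f"Per-CPU bandwidth: ~{total_bw_gbps} GB/s")
print(f"Air-cooled: {air_cpus} CPUs/rack, ~{air_rack_kw:.1f} kW CPU power")
print(f"Liquid-cooled: >={liquid_cpus} CPUs/rack, ~{liquid_rack_kw:.1f} kW")
```

At 60 air-cooled CPUs per rack, the CPUs alone would draw roughly 18 kW, while the liquid-cooled configuration approaches 100 kW of CPU power per rack, which helps explain why the highest-density deployments require liquid cooling.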

Meta as Lead Partner and Co-Developer

The most significant partnership in this launch is with Meta, the parent company of Facebook, Instagram, and WhatsApp. Meta is not merely a customer but a co-developer of the chip, having worked alongside Arm to build the AGI CPU. Meta plans to deploy this chip to optimize the infrastructure supporting its family of apps, where it will work in conjunction with Meta's own custom silicon, the Meta Training and Inference Accelerator (MTIA), to enable more efficient orchestration in large-scale AI systems.

Santosh Janardhan, head of infrastructure at Meta, commented, "We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data center performance density."

Full List of Partners and Ecosystem Support

Beyond Meta, a wide array of technology companies have confirmed plans to deploy the Arm AGI CPU, including Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom. On the manufacturing and systems side, Arm has partnered with lead OEMs and ODMs such as ASRock Rack, Lenovo, Quanta Computer, and Supermicro.

The broader ecosystem supporting the platform's expansion into silicon now spans more than 50 leading companies across various sectors, including hyperscale computing, cloud services, silicon design, memory, networking, software, and manufacturing. Key names in this ecosystem include Amazon Web Services (AWS), Google Cloud, Microsoft Azure, and TSMC, which is manufacturing the Arm AGI CPU using its advanced 3nm process technology.

Other major supporters include Broadcom, Marvell, Micron, Samsung, SK hynix, Hugging Face, Databricks, Oracle Cloud, Red Hat, Snowflake, Cisco, Arista, MediaTek, and GitHub. Nvidia CEO Jensen Huang highlighted the long-standing partnership, noting, "Arm's adaptability has made it possible for us to integrate Arm across all of our platforms and for all different phases of AI. Together we're creating one seamless platform, from cloud to edge to AI factories."