Anthropic Considers Developing Custom AI Chips to Boost Efficiency
Anthropic, the company behind the Claude AI models, is exploring the possibility of designing its own custom artificial intelligence chips. According to a Reuters report published Thursday, which cited three anonymous sources, the initiative is still at an early stage: Anthropic has neither finalized a chip design nor assembled a dedicated team for the project, and it could ultimately abandon the idea and continue purchasing chips from current suppliers such as Nvidia, Google, and Amazon. Anthropic declined to comment on the report.
Revenue Growth Drives Silicon Ambitions
The primary motivation behind this exploration is Anthropic's rapid revenue growth. Earlier this week, the company announced that its annualized revenue run rate had surpassed $30 billion, up from $9 billion at the end of 2025. That surge translates into an enormous volume of AI inference work, making proprietary silicon increasingly attractive on both cost and performance grounds.
Recent Compute Deals and Current Infrastructure
This news comes just days after Anthropic secured its largest compute agreement to date. Prior to the Reuters report, the company announced a long-term partnership with Google and Broadcom, securing approximately 3.5 gigawatts of TPU capacity set to become available from 2027. That is triple the computing power Anthropic was utilizing earlier this year. Additionally, AWS's CEO confirmed to CNBC that Anthropic's latest models, including its most advanced, named Mythos, are being trained on Amazon's Trainium chips.
Anthropic is not currently facing a compute shortage. The company strategically runs Claude across a mix of Google TPUs, Amazon Trainium chips, and Nvidia GPUs, selecting the most suitable hardware for each workload. Custom chips would complement this existing ecosystem rather than replace it entirely.
Industry-Wide Shift Toward Custom Silicon
Anthropic's potential entry into custom chip design aligns with a broader industry trend. Meta has been developing its MTIA chip line for both training and inference. OpenAI entered into a 10-gigawatt custom accelerator deal with Broadcom in October 2025. Google and Amazon have long used their own TPU and Trainium chips, respectively, to power internal AI operations. Broadcom has emerged as a key design partner for multiple companies and reportedly has a fifth, unnamed custom chip customer.
Every major player in the AI sector is moving toward owning at least part of its silicon stack, a shift that poses a growing challenge to Nvidia's market dominance.
Financial Considerations and Market Impact
Reuters estimates the cost of designing an advanced AI chip at around $500 million. While Anthropic remains unprofitable, its revenue run rate more than tripling in a matter of months gives it a stronger financial foundation for such a substantial investment. The exploration underscores the competitive pressures and strategic maneuvering shaping the future of AI hardware infrastructure.