Meta Expands AI Ambitions with Multi-Billion Dollar Chip Deal with Google

In a significant move to bolster its artificial intelligence capabilities, Meta, the parent company of Facebook, is reportedly expanding its massive AI spending by entering into a chip partnership with Google. According to a report from The Information, cited by Reuters, the Mark Zuckerberg-led company has signed a multi-billion dollar agreement to rent Google's specialized AI chips. This deal aims to support the development of Meta's next generation of AI models.

Meta's Broader Chipmaker Alliances

This multi-year agreement with Google comes just days after Meta announced a tie-up with AMD, highlighting the company's aggressive strategy to secure advanced hardware for AI. Meta currently maintains deals with two of the biggest chipmakers in the industry: Nvidia and AMD.

Earlier this week, AMD revealed that it would sell up to $60 billion worth of AI chips to Meta. The arrangement calls for deploying up to 6 gigawatts of AI computing capacity across Meta's next-generation AI infrastructure, using multiple generations of AMD Instinct GPUs.

Additionally, Meta recently signed another deal to purchase Nvidia's current and future high-end processors. As one of Nvidia's largest customers, Meta relies on these GPUs for critical tasks, including the training of its AI models.

Google's TPUs: A Strategic Shift

The Information further reported that, beyond renting, Meta is in discussions to buy Google's Tensor Processing Units (TPUs) outright for its own data centers as early as next year. Notably, Google is the latest company to offer expanded access to its proprietary TPUs to outside corporate customers.

Originally built exclusively for Google's internal use, these chips are now being rented out to other companies. This strategic shift has transformed Google's cloud business into a substantial revenue generator.

While GPUs are typically used for training AI models, Google's TPUs are optimized for inference, the process by which a trained model generates outputs from new inputs quickly and at lower cost.

Market Impact and Industry Reactions

In December, a report claimed that Meta Platforms was in advanced talks to spend billions on Google's competing AI chips. The news had a dramatic effect on the market, hammering Nvidia's stock and erasing approximately $250 billion of its market value.

In response, Nvidia issued a public statement defending its market position. The company wrote in a post on X, "We're delighted by Google's success—they've made great advances in AI, and we continue to supply to Google. Nvidia is a generation ahead of the industry—it's the only platform that runs every AI model and does it everywhere computing is done." It is worth noting that Google remains one of the biggest customers of Nvidia GPUs.

This series of deals underscores the intense competition and collaboration in the AI hardware sector, as tech giants like Meta seek to leverage cutting-edge technology to advance their AI initiatives.