Anthropic earlier announced plans to significantly expand its partnership with Google Cloud through a multi-billion-dollar agreement under which the company will use up to one million of Google's Tensor Processing Unit (TPU) accelerators.
The collaboration is expected to add more than 1 GW (gigawatt) of computing capacity for Anthropic by 2026, supporting the company's continued AI research and product development.
As the AI computing power arms race intensifies, Anthropic bets on Google TPU
Anthropic's move highlights the extreme demand for computing resources in developing large-scale AI models. Its main competitor OpenAI has recently not only partnered with AMD but also teamed up with Broadcom to develop a custom AI chip, underscoring that securing sufficient computing power has become a key factor in AI companies' competitiveness.
The scale of the agreement is staggering. Acquiring one million TPUs will elevate Anthropic's computing power to a whole new level, enabling it to train larger and more complex foundation models and to accelerate the iteration and expansion of products such as its Claude series of AI services.
On the other hand, the deal with Anthropic further consolidates Google Cloud's position as a key infrastructure supplier for major AI companies. And as Google continues to expand its TPU services, it is also bringing significant business growth to its supply-chain partners: US chip designer Broadcom and Taiwan's MediaTek are expected to be the main beneficiaries, making their 2026 operating outlooks highly optimistic.
