A Broadcom sign is pictured as the company prepares to launch new optical chip tech to fend off Nvidia in San Jose, California, U.S., September 5, 2025.
Brittany Hosea-Small | Reuters
Broadcom revealed during its September earnings call that it had signed a new customer with a $10 billion order for custom chips.
At the time, Broadcom didn’t say who it was, but on Thursday, CEO Hock Tan revealed that the mystery customer was AI lab Anthropic, which placed an order for Google’s latest tensor processing units, or TPUs.
“We received a $10 billion order to sell the latest TPU Ironwood racks to Anthropic,” said Tan, speaking on Broadcom’s fourth-quarter earnings call on Thursday. He also said Anthropic had placed an additional $11 billion order with Broadcom in the company’s latest quarter.
While Broadcom typically doesn’t disclose its large customers, Tan’s September remark drew significant investor attention amid the AI infrastructure boom. A Broadcom official told CNBC in October that the mystery customer wasn’t OpenAI, which has its own agreement to purchase chips from the chipmaker.
Broadcom makes custom chips known as ASICs, or application-specific integrated circuits, which some experts believe are more efficient for certain artificial intelligence workloads than the market-dominating chips from Nvidia. Broadcom helps make Google’s TPUs, and last month, the search company bragged that it trained its state-of-the-art Gemini 3 model entirely on its TPUs.
The chipmaker calls its custom AI chips XPUs, and on Thursday, Tan said his company was delivering entire server racks — not just chips — to Anthropic, which is Broadcom’s fourth XPU customer.
Broadcom on Thursday also said it had secured a fifth customer for its custom chip business. That customer placed a $1 billion order during the fourth quarter, but once again, Broadcom did not reveal the customer’s identity.
“It’s a real customer, and it will grow,” Tan said.

Google-Anthropic deal
In late October, Anthropic and Google announced a sweeping cloud partnership that underscored just how quickly the AI infrastructure race is moving.
The deal is valued in the tens of billions of dollars. It gives Anthropic access to as many as 1 million Google TPUs and is expected to bring well over a gigawatt of new AI compute capacity online in 2026.
Anthropic employs a multi-cloud, multi-chip strategy.
The startup spreads workloads across Google’s TPUs, Amazon’s custom Trainium chips and Nvidia graphics processing units. Anthropic’s various models are tuned to run on whichever platform makes the most sense for training, inference or research.
For Google, Anthropic’s bet on TPUs is exactly the kind of validation investors have been rewarding, as Wall Street increasingly ties Alphabet’s stock run-up to clear demand for its in-house AI chips.
Google Cloud CEO Thomas Kurian previously said Anthropic’s decision to dramatically expand its TPU usage reflects “the strong price-performance and efficiency” the startup has seen over several years.
After more than a decade of internal development, Google is making TPUs widely available to cloud customers as a service, rather than selling the hardware outright, and has been leaning on its chips as a differentiator as it tries to keep pace with surging demand for AI compute.
Analysts have described TPUs as the most credible alternative to Nvidia’s GPUs, arguing that Google’s tightly tailored ASICs and power-efficient designs could become a meaningful driver of the company’s cloud business growth as power constraints — not chip supply — emerge as the key bottleneck for AI.
