Intel Xeon 6 processors are shown to CNBC at Intel’s advanced packaging facility in Chandler, Arizona, on November 17, 2025.
Tony Puyol
Google has committed to using multiple generations of Intel central processing units in its artificial intelligence data centers, an expansion of an existing partnership.
The internet giant has long relied on Intel processors, dating back to its earliest server rack ambitions nearly three decades ago. Intel’s newest Xeon 6 CPUs will now run AI training and inference workloads, potentially giving the chipmaker a stronger position in an AI market that’s so far been dominated by Nvidia.
“Their Xeon roadmap gives us confidence that we can continue to meet the growing performance and efficiency demands of our workloads,” Amin Vahdat, Google’s chief technologist for AI infrastructure, said in a statement Thursday.
No financial terms were disclosed, nor did the companies provide a timeline for the agreement.
The deal lands as the CPU takes center stage in the next phase of the AI race. Dion Harris, Nvidia’s head of AI infrastructure, told CNBC in March that CPUs are “becoming the bottleneck” as agentic workloads move compute needs beyond the graphics processing units that have ruled AI thus far.
“Scaling AI requires more than accelerators — it requires balanced systems,” Intel CEO Lip-Bu Tan said in a statement about the Google deal on Thursday.
Intel, which has been struggling for years to keep pace with new trends in technology, sold a 10% stake to the U.S. government in August, with the Trump administration touting the chipmaker’s ability to make advanced chips on U.S. soil. The following month, Nvidia said it would purchase a $5 billion stake in Intel.
Shares of Intel have nearly tripled in the past year, fueled by those investments.
Intel makes the latest Xeon processor on its most advanced 18A technology at its Arizona chip fabrication plant, which opened last year. Despite pouring billions into the foundry side of its business, Intel itself remains the new fab’s largest customer, using it to make its own processors.
But Tan posted on LinkedIn earlier this week that Elon Musk has tapped Intel to design, fabricate and package custom chips for SpaceX, xAI and Tesla at his ambitious Terafab project in Texas, though no financial details or timeline were announced.
As part of Thursday’s announcement, Google and Intel reiterated that they’re collaborating on another type of chip, the infrastructure processing unit, or IPU, which the two companies have worked on together since 2022. In a press release, Intel said this programmable accelerator is used to “offload networking, storage and security functions from host CPUs.”
Google told CNBC in an email that the IPU was a first-of-its-kind chip when the companies first collaborated on it four years ago. Google said it’s designed to help customers better utilize the main CPU in a traditional data center by taking over “overhead” tasks, such as routing network traffic, managing storage, encrypting data and running virtualization software.
For over a decade, Google has also developed its own custom AI accelerator, the tensor processing unit, or TPU. In 2024, Google started making its own custom CPU, Axion, choosing an Arm-based design over Intel’s leading x86 architecture.
— CNBC’s Kristina Partsinevelos contributed to this report.
WATCH: How advanced packaging became the next bottleneck for making AI chips