Google Bets on Intel Xeon CPUs to Boost AI Training and Inference

April 9, 2026

Google is expanding its partnership with Intel to use multiple generations of the chipmaker’s CPUs in its AI data centers, positioning Intel’s latest Xeon 6 processors for both training and inference workloads.

The agreement, announced Thursday, deepens a long-standing relationship between the two companies as Google looks to diversify the hardware powering its AI infrastructure. Intel said its processors will play a larger role in handling the growing demands of large-scale AI systems, while Google pointed to performance and efficiency gains as key factors in the decision.

“Their Xeon roadmap gives us confidence that we can continue to meet the growing performance and efficiency demands of our workloads,” Amin Vahdat, Google’s chief technologist for AI infrastructure, said in a statement.

The companies did not disclose financial terms or a deployment timeline. Still, the move underscores a shift in how AI infrastructure is being built, with CPUs taking on a more prominent role alongside the GPUs that have dominated recent AI development.

Intel CEO Lip-Bu Tan framed the partnership around system-level balance rather than reliance on a single type of chip. “Scaling AI requires more than accelerators — it requires balanced systems,” he said.

That framing aligns with broader industry signals. Nvidia’s head of AI infrastructure, Dion Harris, said earlier this year that CPUs are “becoming the bottleneck” as newer AI workloads place additional strain on systems beyond graphics processors.

Beyond Xeon, Google and Intel are also continuing work on a separate chip known as the infrastructure processing unit, or IPU. The programmable accelerator is designed to handle networking, storage, and security tasks that would otherwise consume CPU resources. Google said the chip helps offload “overhead” functions such as routing traffic, managing storage, and running virtualization software.

The expanded collaboration comes as Intel attempts to reestablish itself in the AI hardware market after years of trailing competitors. The company has invested heavily in manufacturing, including producing its latest Xeon chips on its 18A process at a new Arizona fabrication facility.

At the same time, Google continues to pursue a multi-pronged chip strategy. The company has long developed its own AI accelerators, known as tensor processing units, and recently introduced an Arm-based CPU called Axion for its data centers.

Taken together, the partnership suggests Google is broadening its hardware base rather than relying on a single architecture or supplier. While GPUs remain central to AI workloads, the renewed focus on CPUs reflects evolving system requirements as models grow more complex and infrastructure demands increase.

This analysis is based on reporting from CNBC.

Image courtesy of Intel.

This article was generated with AI assistance and reviewed for accuracy and quality.

Last updated: April 9, 2026

