Alongside the partnership, Anthropic said it is doubling Claude Code’s five-hour rate limits for Pro, Max, Team, and seat-based Enterprise plans. The company is also removing peak-hour usage reductions for Pro and Max users and raising API rate limits across Claude Opus models.
The deal marks Anthropic’s latest push to secure large-scale AI infrastructure as demand for Claude continues to grow. The company framed the SpaceX agreement as part of a broader compute expansion strategy that already includes partnerships with Amazon, Google, Broadcom, Microsoft, NVIDIA, and Fluidstack.
Anthropic said its Amazon agreement includes up to 5 gigawatts of capacity, with nearly 1 gigawatt expected online by the end of 2026. Additional infrastructure commitments include a 5-gigawatt arrangement with Google and Broadcom beginning in 2027, a strategic partnership with Microsoft and NVIDIA tied to $30 billion of Azure capacity, and a $50 billion investment in US AI infrastructure with Fluidstack.
The company said it continues to run Claude across multiple hardware platforms, including AWS Trainium, Google TPUs, and NVIDIA GPUs, while exploring additional ways to bring more compute online.
As part of the SpaceX partnership, Anthropic said the two companies have also expressed interest in jointly developing orbital AI compute infrastructure in the future.
The expansion comes as Anthropic increases its focus on international infrastructure deployments for enterprise customers operating in regulated industries such as healthcare, financial services, and government. The company said some of its future capacity growth will be deployed in Asia and Europe to meet data residency and compliance requirements.
Anthropic added that it is prioritizing expansion in countries with “legal and regulatory frameworks” that support large-scale AI infrastructure investment and secure supply chains for hardware, networking, and facilities.
The announcement underscores how access to compute capacity has become a central competitive issue for major AI companies, as demand for model training and inference infrastructure continues to accelerate across the industry.
This analysis is based on reporting from Anthropic.
Image courtesy of Anthropic.
This article was generated with AI assistance and reviewed for accuracy and quality.