[News] Google Reportedly Courts Smaller CSPs to Host TPUs, Taking Aim at NVIDIA


2025-09-04 | Emerging Technologies | Editor

According to The Information, citing sources, Google has recently approached smaller cloud providers that primarily lease NVIDIA chips, urging them to also host its AI processors in their data centers. As the report notes, the goal may be to encourage broader adoption of Google's TPUs, a move that would place Google in more direct competition with NVIDIA.

Sources cited in the report say Google has struck a deal with at least one cloud provider, London-based Fluidstack, to deploy its TPUs in a New York data center. Google's efforts extend beyond Fluidstack: according to the report, it has also sought similar deals with other NVIDIA-focused providers, including Crusoe, which is building a data center full of NVIDIA chips for OpenAI, and CoreWeave, which leases NVIDIA hardware to Microsoft and has a contract to supply OpenAI.

The companies Google is targeting are largely emerging cloud providers that rely heavily on NVIDIA chips, the report highlights. To win them over, Google has reportedly offered Fluidstack incentives to support its TPU expansion; if Fluidstack cannot cover the lease on its new New York data center, Google would act as a "backstop" with up to $3.2 billion, the report notes.

Google’s TPU Strategy: From Internal Demand to External Growth

Google’s push to advance its own AI chip has been in motion for some time. As the report highlights, sources say the company has considered ramping up the TPU business to boost revenue and reduce its dependence on NVIDIA chips.

Its TPU and AI business is on the rise: Morningstar analysts estimate that Google's TPU operations and its DeepMind AI research arm together could be valued at around $900 billion. As the report indicates, the sixth-generation Trillium TPUs, released in December 2024, are in high demand, and demand is also expected to grow for the seventh-generation Ironwood TPU, the company's first designed for large-scale inference.

As The Information points out, Google primarily uses TPUs for its own AI projects, such as the Gemini models, with internal demand soaring in recent years. The company has also long rented TPUs to outside firms — including Apple and Midjourney via Google Cloud, according to the report.


(Photo credit: Google)

Please note that this article cites information from The Information and Morningstar.

