Vivold Consulting
June 27, 2025

OpenAI turns to Google's AI chips to power its products, source says

OpenAI has begun renting artificial intelligence chips—specifically tensor processing units (TPUs)—from Google to support its AI products such as ChatGPT, a move signaling a strategic shift from its traditional reliance on Nvidia GPUs and Microsoft-backed infrastructure. This partnership marks a notable collaboration between AI sector competitors, as OpenAI seeks to expand computing capacity through Google Cloud services.

OpenAI previously depended heavily on Nvidia chips for both training and inference tasks, but rising costs have prompted interest in Google's TPUs as a potentially more affordable alternative. Though Google is not providing its most powerful TPUs to OpenAI, this marks the first major instance of OpenAI utilizing non-Nvidia chips.

The deal also underscores Google's effort to commercialize its in-house AI technology and expand its cloud business, with clients ranging from tech giants like Apple to startups including Anthropic and Safe Superintelligence. Both companies declined to comment on the specifics of the arrangement.