
OpenAI is no longer "all in" on Nvidia as it shifts to Google TPUs, breaking the chipmaker's dominance!

OpenAI has begun using TPU AI chips manufactured by Google, marking its first departure from relying solely on NVIDIA chips. The shift should help OpenAI reduce inference costs and kicks off a vendor diversification strategy. The collaboration may also promote TPUs as an alternative to NVIDIA GPUs. Morgan Stanley believes the agreement will strengthen market confidence in Google and accelerate the growth of its cloud business. Google's TPUs have already attracted several tech companies, but Google has not rented its most powerful TPU models to OpenAI.
According to The Information, Microsoft (MSFT.US)-backed artificial intelligence startup OpenAI has begun using AI chips manufactured by Google (GOOGL.US) to power products including ChatGPT, marking its first significant shift away from relying solely on NVIDIA (NVDA.US) chips.
This collaboration around Google's Tensor Processing Units (TPUs) represents OpenAI's first substantial adoption of non-NVIDIA chips and the start of a supplier diversification strategy. The company has long relied on NVIDIA chips both for training AI models and for running inference (the operational phase after a model is trained), and it has been one of the largest purchasers of NVIDIA graphics processing units (GPUs).
OpenAI expects that leasing TPUs through Google Cloud will help reduce costs associated with the inference phase. The Information states that this could position TPUs as a cheaper alternative to NVIDIA GPUs.
Earlier this month, OpenAI was reported to be planning to add Google Cloud services to meet its growing computational needs, a surprising collaboration between two high-profile competitors in the AI field. Morgan Stanley also published a research note supporting Google, arguing that if the agreement is confirmed, it would signal Google's confidence in the long-term position of its search business and accelerate the growth of Google Cloud, supporting a valuation multiple of more than 18 times.
For Google, the collaboration comes as it expands the external availability of its self-developed Tensor Processing Units (TPUs), which were previously used mainly for internal projects. The chips have now attracted tech giants such as Apple (AAPL.US), as well as ChatGPT competitors founded by former OpenAI core members, such as Anthropic and Safe Superintelligence.
However, Google has reportedly not leased its most powerful TPU models to OpenAI, suggesting it plans to reserve the most advanced versions for internal projects, including its self-developed Gemini large language model.