While "backstabbing" Microsoft and engaging in internal competition: OpenAI is reported to have reached a cooperation with Google Cloud, with o3 prices dropping by 80%

Wallstreetcn
2025.06.10 17:26

Media reports indicate that OpenAI and Google parent Alphabet had been in talks for months and finalized a cooperation agreement last month, under which Google Cloud will provide computing power for OpenAI to train and run its models. A deal could not be reached earlier because of OpenAI's exclusivity agreement with Microsoft; before January of this year, Microsoft was OpenAI's sole cloud service provider.

Recent news shows that OpenAI is reaching out to Google, the old rival of its major backer Microsoft, even as it actively stokes industry competition and wages a price war.

On Tuesday, June 10, Eastern Time, OpenAI CEO Sam Altman announced on social media that the price of the reasoning model o3 would be cut by 80%, adding that he looked forward to seeing people's reactions and believed they would also be pleased with the performance and pricing of o3 Pro.

This steep price cut is another step by OpenAI to push the "involution" of large models further in the wake of DeepSeek's rise.

After DeepSeek released a groundbreaking, ultra-cost-effective open-source model in January this year, OpenAI launched o3-mini, its most cost-effective reasoning model, at the end of that month, opening a reasoning model to free users for the first time: ChatGPT free users could try o3-mini by selecting "Reason" in the message editor or by regenerating a response.

On the same Tuesday, shortly before Altman announced the o3 price cut, media reports revealed that OpenAI had reached a cloud services agreement with Google, a shared rival of Microsoft, under which OpenAI will use Google's computing resources to support its business. This unprecedented cooperation not only marks a further loosening of OpenAI's dependence on Microsoft but also lays bare the harsh reality of computing-power scarcity behind the AI arms race: the need to survive outweighs everything else, and yesterday's enemies can become today's allies.

The "Impossible Alliance" Driven by Computing Power Hunger

According to media reports, this seemingly improbable cooperation between OpenAI and Google was officially finalized in May this year after months of discussions. For OpenAI, it is the latest move to reduce its heavy reliance on Microsoft; for Google, it is a significant win for its cloud services business, even though it means supplying ammunition to a strong competitor of its own AI business.

According to the reports, Google Cloud will provide OpenAI with additional computing power for training and running its AI models. The emergence of OpenAI's ChatGPT poses the most serious threat in years to Google's dominant search business.

This cooperation highlights a harsh reality in the AI industry: the enormous computing demands are reshaping the competitive landscape. On Monday, an OpenAI spokesperson revealed that OpenAI's annual recurring revenue (ARR) has reached $10 billion, nearly doubling from $5.5 billion in the same period last year. However, behind this rapid growth is an insatiable thirst for computing power.

A report from September last year stated that OpenAI expects its model-training computing costs to rise sharply in the coming years, potentially reaching $9.5 billion a year by 2026, not including the amortization of training costs from its earlier large language model (LLM) research. A report from February this year said OpenAI is expected to burn through even more cash in the coming years as it pours all of its revenue into the computing needed to run existing models and develop new ones. OpenAI estimates that its total computing costs from this year through 2030 will exceed $320 billion.

Microsoft Is No Longer "Exclusive"

On January 21 of this year, Microsoft announced that it would no longer serve as OpenAI's exclusive cloud service provider but would retain a "right of first refusal." Reports at the time indicated that the change stemmed from OpenAI senior management's dissatisfaction with the slow pace of Microsoft's new data center construction.

Although Microsoft is no longer OpenAI's exclusive cloud provider, it still retains the exclusive right to resell OpenAI's models on its Azure cloud platform and can use OpenAI's intellectual property in its own products. Microsoft has said the current cooperation agreement between the two parties runs until 2030. In addition, Microsoft is entitled to 25% of OpenAI's revenue and has the right to share in the profits of its future products.

Media reports on Tuesday indicated that before Microsoft's official announcement in January, Microsoft's Azure cloud service had been OpenAI's exclusive data center infrastructure provider, and Google had been unable to strike a partnership with OpenAI because of that exclusivity agreement. Microsoft and OpenAI are currently renegotiating the terms of Microsoft's multi-billion-dollar investment agreement, including the equity stake Microsoft will hold in OpenAI going forward.

OpenAI's diversification strategy also includes the $500 billion "Stargate" infrastructure project in collaboration with SoftBank and Oracle, as well as a multi-billion dollar computing power agreement with CoreWeave.

In February, the Wall Street Journal reported that most of OpenAI's power and data center computing resources are currently provided by Microsoft, but OpenAI is shifting to rely on SoftBank as the main investor in the Stargate project. OpenAI expects that by 2030, the Stargate data centers will supply three-quarters of the computing power it needs to operate and develop AI models.

Additionally, there have been reports that OpenAI plans to finalize the design of its first self-developed chip this year to reduce its reliance on external hardware providers.