
Microsoft courts Musk's Grok AI for Azure, stirring new ripples in its relationship with OpenAI

According to sources, Microsoft has instructed engineers working on its AI infrastructure to prepare to host Elon Musk's Grok AI model.
Reports indicate that Microsoft is currently in talks with Musk's xAI, planning to host the Grok AI model through Azure cloud services and make it available to Microsoft customers and its own product teams. Analysts believe this move may spark controversy within Microsoft and further escalate tensions with its partner OpenAI.
If the deal progresses, Grok will be integrated into Microsoft's AI development platform Azure AI Foundry, providing developers with AI services, tools, and pre-trained models for building AI applications and agents. This will allow developers to integrate Grok into their applications and may enable Microsoft to use the model in its own applications and services.
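If Grok does land in Azure AI Foundry, developers would presumably call it the same way they call other Foundry-hosted models today, for example through the Azure AI Inference SDK for Python. The sketch below is only an illustration of that pattern: the endpoint, key, and the "grok-3" deployment name are assumptions, since no Grok deployment details have been announced.

```python
# Minimal sketch of calling a model hosted in Azure AI Foundry via the
# Azure AI Inference SDK (pip install azure-ai-inference).
# The endpoint, key, and "grok-3" deployment name are placeholders;
# no Grok-on-Azure deployment details have been announced.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_AI_ENDPOINT"],  # e.g. https://<resource>.services.ai.azure.com/models
    credential=AzureKeyCredential(os.environ["AZURE_AI_KEY"]),
)

response = client.complete(
    model="grok-3",  # hypothetical deployment name
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what Azure AI Foundry offers developers."),
    ],
)

print(response.choices[0].message.content)
```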
Azure AI Foundry Business Continues to Expand
Over the past year, Microsoft has steadily expanded its Azure AI Foundry business, rapidly adopting models from AI labs that compete with its partner OpenAI. When DeepSeek made waves in the AI field earlier this year, Microsoft moved quickly to adopt its ultra-low-cost R1 model: CEO Satya Nadella pushed the team to complete testing and deployment of R1 within a few days, a turnaround viewed internally as exceptionally fast.
Nadella had previously advocated for Microsoft to host the Grok model, eager to make Microsoft the preferred hosting platform for every popular or emerging AI model. To build out its AI platform and turn AI agents into a digital workforce, the Azure AI team continues to onboard new models and procure hardware that strengthens its AI capabilities.
Asha Sharma, Vice President of Microsoft AI Platform Business, stated in an interview last month:
“All the systems we have built over the past 50 years need to serve AI agents. At Azure AI Foundry, we are thinking about how to evolve into the operating system behind every AI agent.”
Opening Grok to developers through Azure is part of Microsoft's strategy to build a core platform for AI models and agents, but it does not mean AI labs are shifting their model-training workloads to Microsoft. Last year, Musk walked away from a server deal with Oracle reportedly worth up to $10 billion and stated on X that xAI would train its models "internally" going forward, no longer relying on Oracle's servers.
It remains unclear whether Microsoft will secure an exclusive agreement to host the Grok model or whether competitors such as Amazon will also be able to host it. Reports suggest Microsoft is currently only looking to provide computing capacity for hosting Grok, not the servers needed for future model training.
Hosting Grok May Strain Microsoft's Relationship with OpenAI
Analysts believe that hosting Musk's Grok model may create some tension within Microsoft, especially given Musk's ties to the controversial "Department of Government Efficiency" (DOGE) project. Musk has said he will gradually step back from his DOGE work this month, and Grok's arrival on Azure may be announced at the Microsoft Build developer conference on May 19. Beyond the DOGE issue, hosting Grok could further strain the relationship between Microsoft and OpenAI. OpenAI has already countersued Elon Musk, accusing him of attempting to slow OpenAI's development through "malicious strategies."
Meanwhile, multiple reports point to friction between Microsoft and OpenAI over the allocation of computing power and access to models. The Wall Street Journal reported this week that Satya Nadella's relationship with OpenAI CEO Sam Altman is "deteriorating." Nadella's hiring of Mustafa Suleyman last year to build an in-house AI team was seen as an insurance policy against the risks of depending on OpenAI.
However, Suleyman and his Microsoft AI team have made little progress on models that can compete with OpenAI's, leaving Microsoft still dependent on OpenAI to power major AI features in Office and Copilot. Microsoft had reportedly expected OpenAI to launch GPT-5 this month, but with the new model repeatedly delayed and resources stretched thin after the upgrade to its image-generation capabilities, a GPT-5 launch this month now looks unlikely.
Bringing Grok to Azure clearly signals Microsoft's willingness to embrace other AI model providers. GitHub Copilot already supports models from Anthropic and Google alongside OpenAI, so it is not hard to imagine the main Copilot product eventually letting developers choose among multiple AI models, especially as that would fit Microsoft's ambition to become the preferred platform for AI developers and users.