
NVIDIA spends $1.5 billion to rent GPU servers equipped with its own chips from Lambda

NVIDIA is both a supplier to and an investor in Lambda, and has now become its largest customer. Media reports indicate that the company has reached agreements with Lambda worth a total of $1.5 billion to lease servers equipped with its own GPUs. Analysts believe the deals continue NVIDIA's strategy of supporting "small cloud service providers," similar to its earlier agreement with CoreWeave, and are aimed at strengthening these providers' competitiveness against traditional cloud giants like Amazon and Google.
Media reports indicate that Lambda, a small cloud service provider preparing for an IPO, has recently received strong support from its most important supplier, NVIDIA.
Insiders revealed to the media that this summer, NVIDIA agreed to lease 10,000 GPU servers equipped with its own AI chips from Lambda for four years, with a total value of $1.3 billion.
Additionally, NVIDIA has reached another deal with the company worth $200 million to lease 8,000 servers equipped with NVIDIA chips, with the specific timing yet to be determined. These contracts make NVIDIA Lambda's largest customer to date and lay the groundwork for the company's upcoming initial public offering (IPO).
The media states that this is the latest case of NVIDIA pushing its chips into the cloud market through a "circular" financial arrangement, helping small cloud service providers compete with traditional giants like Amazon and Google. It also demonstrates how capital in the AI field circulates internally: NVIDIA acts as supplier, investor, and customer, supporting multiple small cloud companies, also referred to as "neoclouds."
Business Model Similar to CoreWeave
Lambda's business model involves leasing data center space, deploying servers equipped with NVIDIA GPUs, and signing contracts with customers to rent those servers. It remains unclear how much Lambda pays NVIDIA for the GPU servers it deploys, or how NVIDIA will record these transactions financially, since it is simultaneously the buyer, the seller, and a shareholder of Lambda.
Another insider told the media that NVIDIA's own researchers will also use the GPU servers leased from Lambda.
In addition to NVIDIA, Lambda's other major customers include Amazon and Microsoft, which together contributed nearly $114 million of the company's cloud revenue in the second quarter. Notably, Amazon and Microsoft use Lambda's GPU servers primarily for internal purposes rather than to serve customers on the AWS or Azure platforms.
Lambda expects its cloud revenue to exceed $1 billion by 2026 and $20 billion by 2030, aiming to secure contracts with major AI developers including OpenAI, Google, Anthropic, and xAI.
Lambda also anticipates that by 2030 its computing capacity will reach nearly 3 GW (gigawatts), equivalent to nearly half the total capacity of some of today's largest cloud service providers, up from just 47 megawatts in the second quarter of this year. How the company will achieve this growth remains unclear, but going public may help it take on debt to fund the expansion.
Lambda's business model and high customer concentration are similar to CoreWeave. The latter is a larger GPU cloud service provider that has recently gone public and has also received significant support from NVIDIA.
Previously, Lambda primarily signed small-scale, short-term GPU leasing contracts; this deal with NVIDIA is the largest in its history and is likely to bolster its marketing ahead of an IPO planned for the first half of next year.
Supporting Small Cloud Providers Is NVIDIA's Consistent Strategy
NVIDIA has consistently supported companies that are willing to use its chips and that are more willing than the traditional cloud giants to buy a broad range of its hardware. For example, NVIDIA has clashed with Microsoft over the design of GPU server racks, while Lambda executives have been discussing internally whether to adopt the new optical networking technology NVIDIA is developing.
NVIDIA also helped fuel CoreWeave's rapid rise. CoreWeave, which pivoted from cryptocurrency mining, signed an agreement with NVIDIA early in its transition that was almost identical to Lambda's. That agreement helped CoreWeave secure debt financing and expand its cloud business, capturing market share from traditional cloud providers.
NVIDIA's support for small cloud service providers aims to protect its core business in the long term. Although NVIDIA's largest customers remain Microsoft, Amazon, and Google, these tech giants are also developing their own AI chips to reduce reliance on NVIDIA.
Customer Concentration Risk
Although Amazon and Microsoft procure far more GPUs for their own data centers than they rent from third parties like Lambda, both companies have stated that their GPU servers operate at near full capacity almost around the clock, and the pace of data center expansion is not keeping up with demand.
The scale of Microsoft's contract with Lambda is much smaller than its leasing agreement with CoreWeave, but Lambda executives have indicated that the company is negotiating larger-scale collaborations with other potential customers.
However, it remains uncertain whether Lambda can secure such deals, and company executives have acknowledged that, like other cloud service providers, Lambda faces issues such as power supply and data center space shortages.
CoreWeave previously relied on external leased data centers but recently acquired a large power and data center company for $9 billion, planning to build its own sites to reduce costs.
NVIDIA's Collaborative Relationships
Media reports indicate that, according to Lambda executives, the $1.3 billion GPU leasing agreement NVIDIA signed is codenamed "Project Comet" and will support NVIDIA's emerging cloud computing business, DGX Cloud. Through this platform, NVIDIA leases GPUs from cloud service providers and then subleases them to companies developing AI; NVIDIA's own researchers also use the platform.
Analysts suggest that NVIDIA values Lambda for multiple reasons, one of which is that Lambda is attracting more customers to switch to NVIDIA GPUs. For instance, Lambda recently signed a one-year cooperation agreement with image generation startup Midjourney to help the company migrate code originally running on Google AI chips to NVIDIA's new generation Blackwell GPUs.
Lambda executives stated that converting Google AI chip users to NVIDIA GPU users has earned the company higher praise within NVIDIA.
Google's TPU (Tensor Processing Unit) chips have become more competitive in the AI field in recent years. Google has also approached GPU-focused cloud service providers such as CoreWeave about deploying its chips, and one such company has already agreed to do so.