
NVIDIA accelerates its AI ecosystem buildout in Europe, expanding the DGX Cloud Lepton platform to connect local developers

NVIDIA is accelerating its AI ecosystem buildout in Europe with an expanded version of NVIDIA DGX Cloud Lepton, a platform aimed at connecting local developers with the company's global computing ecosystem. The platform provides GPUs through multiple cloud service providers, supporting high-performance computing, and Amazon AWS and Microsoft Azure will be among the first participants. In addition, NVIDIA is collaborating with European venture capital firms to provide computing resources to startups and promote regional development.
According to Zhitong Finance APP, NVIDIA (NVDA.US) said it is working to connect developers in Europe with its global computing ecosystem. The company announced an expanded version of NVIDIA DGX Cloud Lepton, an AI platform with a global compute marketplace that connects developers building agentic and physical AI applications with computing capacity. The platform's GPUs are now available through a network of cloud service providers.
The AI chip giant said that Mistral AI, Nebius, Nscale, Firebird, Fluidstack, Hydra Host, Scaleway, and Together AI are now offering NVIDIA Blackwell and other NVIDIA-architecture GPUs through the platform, expanding regional access to high-performance computing.
NVIDIA pointed out that Amazon (AMZN.US) AWS and Microsoft (MSFT.US) Azure will be among the first large cloud service providers to participate in the DGX Cloud Lepton project.
NVIDIA added that these companies, along with CoreWeave (CRWV.US), Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, and Yotta Data Services, have joined the platform.
NVIDIA is also working with several European venture capital firms, including Accel, Elaia, Partech, and Sofinnova Partners, to provide credits for their portfolio companies on the DGX Cloud Lepton platform, giving startups access to accelerated computing resources and driving regional development.
Additionally, by integrating with NVIDIA's software suite (including NVIDIA NIM, NeMo microservices, and Cloud Functions), DGX Cloud Lepton can simplify and accelerate every stage of AI application development and deployment, regardless of scale.
NVIDIA noted that, to broaden access to accelerated computing across the global AI field, Hugging Face is launching a new service called Training Cluster as a Service. The service integrates with the DGX Cloud Lepton platform, seamlessly connecting AI researchers and developers building foundation models with NVIDIA's computing ecosystem.
NVIDIA's founder and CEO Jensen Huang stated, "We are working with partners in the region to build a network of AI factories. Developers, researchers, and enterprises can leverage this network to transform local breakthroughs into global innovations."