
Amazon's AWS pioneers an "AI Model Supermarket" to build a differentiated competitive barrier

Amazon Web Services (AWS) is pursuing differentiated competition by building an "AI model supermarket" that lets customers freely choose their technology routes instead of relying on a single large language model (LLM). The Bedrock service has become a growth engine for AWS, contributing over 18% of the unit's total revenue. AWS has adopted a combined "self-developed + investment" strategy, doubling its investment in Anthropic to $8 billion in November 2024 to avoid dependence on any single model.
According to Zhitong Finance APP, in the face of the artificial intelligence arms race, Amazon (AMZN.US) Web Services (AWS) is taking a differentiated approach: rather than betting on the success or failure of a single large language model (LLM), it is building a model marketplace that lets customers freely choose their technology paths. This strategy is helping AWS establish a unique barrier on the cloud computing battlefield.
As the core of AWS's artificial intelligence strategy, the Bedrock service has become a growth engine just two years after its launch. In the first quarter of this year, the platform contributed over 18% of AWS's total revenue, with its core competitiveness lying in providing an "AI toolbox" composed of over 100 models.
Figure 1
Dave Brown, Vice President of AWS Compute and Networking, put it candidly: "No single model can cover all scenarios." That view was validated during the DeepSeek episode: when the Chinese startup released its disruptive model, AWS launched a hosted version on Bedrock within a week, demonstrating impressive technical responsiveness.
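The multi-model flexibility described above shows up concretely in Bedrock's unified API: swapping between models from different vendors is essentially a one-line change of the model identifier, with the request shape unchanged. The sketch below, using boto3's Converse API, illustrates the idea; the specific model IDs are illustrative examples and may differ by region or availability.

```python
# Hedged sketch: Bedrock exposes many vendors' models behind one request
# format (the Converse API), so switching models means changing modelId only.

def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 256, "temperature": 0.2},
    }

# The same helper targets different vendors' models on the one platform.
# These IDs are illustrative; check the Bedrock console for current ones.
example_model_ids = [
    "anthropic.claude-3-haiku-20240307-v1:0",  # Anthropic Claude
    "amazon.nova-lite-v1:0",                   # Amazon's own Nova family
]

for model_id in example_model_ids:
    request = build_converse_request(model_id, "Summarize our Q1 results.")
    # Actually invoking requires AWS credentials, e.g.:
    # client = boto3.client("bedrock-runtime")
    # response = client.converse(**request)
```

The commented-out `converse()` call is where a real invocation would go; the point is that the payload construction is model-agnostic, which is what makes a "model supermarket" practical for developers.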
The inception of Bedrock can be traced back to 2020, when product director Artur Diao wrote an internal memo proposing a code generation tool built on LLMs. Although then-CEO Andy Jassy initially dismissed it as a "pipe dream," when CodeWhisperer officially launched in 2023, the team realized that what customers truly needed was the ability to choose across models. This shift in understanding prompted AWS to become the first cloud provider to offer a multi-model selection platform.
To avoid over-reliance on any single model, AWS adopted a combined "self-developed + investment" strategy. In November 2024, AWS doubled its investment in Anthropic to $8 billion, with the deal requiring Anthropic to use only AWS chips to train its large language model Claude. (By comparison, Microsoft has invested over $13 billion in OpenAI.) The $8 billion deal lets Amazon showcase its AI training infrastructure and deepen its ties with an LLM provider while continuing to offer customers a variety of model choices on Bedrock.
Additionally, Amazon launched its own foundation model series, named Nova, at the end of 2024, two years after the release of ChatGPT. This dual-track approach lets AWS bind deeply with Anthropic while keeping the platform open. Dan Rosenthal, Head of Market Partnerships at Anthropic, acknowledged: "We focus on winning in our areas of strength, and when customers need multi-model choices, Bedrock is the best vehicle."
At the hardware level, AWS is challenging NVIDIA's dominance with self-developed chips. The Bedrock platform lets customers choose freely between AWS's own chips and hardware from Intel (INTC.US), AMD (AMD.US), and NVIDIA (NVDA.US). This "co-opetition" model addresses an industry pain point: NVIDIA chips are powerful, but AWS uses optimization techniques to lower the cost of its own chips in specific scenarios. With capital expenditures expected to reach $100 billion in 2025 (a 20% increase over 2024), the battle over computing costs is entering a heated phase.
Although AWS's first-quarter revenue grew 16.9% year-on-year to $29.27 billion, it has fallen short of expectations for three consecutive quarters. The "triple-digit growth" in AI revenue that Jassy disclosed in his shareholder letter underscores the urgency of the strategic shift. Amazon is now directing the majority of its capital expenditures toward AI, and the cost-performance advantage of its self-developed chips may become a key variable in breaking NVIDIA's grip.
Figure 2
In this battle for AI infrastructure, AWS is demonstrating the new rules of competition through action: while the industry fixates on the commanding heights of models, the real decisive factor may be who can provide developers with more flexible technology combinations and more cost-effective computing solutions. This wisdom of "not putting all eggs in one basket" is precisely the unique leverage cloud computing giants hold over new tech challengers.