
Baidu Releases Two New Models Priced at Only 25% of DeepSeek; Robin Li: Without Applications, Models and Chips Have No Value

The Wenxin Large Model X1 Turbo is priced at 1 yuan per million input tokens and 4 yuan per million output tokens, only 25% of DeepSeek-R1's prices. Robin Li said that some models, represented by DeepSeek, still have "pain points" such as single modality, high hallucination rates, slow speed, and high prices, and that Baidu released the two new models to address these issues.
Baidu launches an AI price war, with new model costs sharply reduced.
On April 25, Baidu founder, chairman, and CEO Robin Li announced a batch of new AI products at the Create 2025 Baidu AI Developer Conference, including Wenxin Large Model 4.5 Turbo, Wenxin Large Model X1 Turbo, highly persuasive digital humans, and the general multi-agent collaboration app "Xinxiang," and said Baidu will help developers fully embrace MCP.
According to Robin Li, the two Wenxin large models focus on multi-modal capabilities, strong reasoning, and low cost. The X1 Turbo is priced at 1 yuan per million input tokens and 4 yuan per million output tokens, only 25% of DeepSeek-R1's prices.
Compared with Wenxin 4.5, the price of 4.5 Turbo has also dropped sharply, by 80%: 0.8 yuan per million input tokens and 3.2 yuan per million output tokens, only 40% of DeepSeek-V3's prices.
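For reference, a minimal back-of-the-envelope sketch of how the quoted ratios work out. The DeepSeek list prices used here (roughly 4 and 16 yuan per million input/output tokens for R1, and 2 and 8 yuan for V3) are assumptions for illustration, not figures stated in the article:

```python
# Price comparison sketch; all figures are yuan per million tokens.
prices = {
    "X1 Turbo":    {"input": 1.0, "output": 4.0},
    "4.5 Turbo":   {"input": 0.8, "output": 3.2},
    "DeepSeek-R1": {"input": 4.0, "output": 16.0},  # assumed list price
    "DeepSeek-V3": {"input": 2.0, "output": 8.0},   # assumed list price
}

def ratio(model, baseline):
    """Price of `model` as a fraction of `baseline`, for input and output tokens."""
    return {k: prices[model][k] / prices[baseline][k] for k in ("input", "output")}

print(ratio("X1 Turbo", "DeepSeek-R1"))   # {'input': 0.25, 'output': 0.25} -> "25%"
print(ratio("4.5 Turbo", "DeepSeek-V3"))  # {'input': 0.4,  'output': 0.4}  -> "40%"
```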
The pricing is not just highly aggressive: on multiple benchmark test sets, Wenxin Large Model 4.5 Turbo's multi-modal capabilities outperform GPT-4o.
In addition to releasing low-cost, high-performance foundational models, Baidu has also launched a series of AI applications and tools aimed at building a complete developer ecosystem.
According to reports, the conference also unveiled highly persuasive digital humans, the general multi-agent collaboration app "Xinxiang," and the content operating system "Cangzhou OS," among others. At the same time, Baidu announced that it will help developers fully embrace MCP (Model Context Protocol) and released China's first e-commerce transaction MCP server, search MCP server, and other servers for developers to call.
"Price War" Escalates! Directly targeting competitors like DeepSeek
Baidu's combination of "better performance, lower prices" is clearly aimed at attracting developers and enterprise users in the AI model field through extreme cost-effectiveness, rapidly eating into competitors' market share.
Robin Li also candidly pointed out the problems currently existing in the AI model market. He believes that some models represented by DeepSeek still have "pain points" such as single modality, high hallucination rates, slow speeds, and high prices.
For example, DeepSeek mainly processes text and cannot understand multi-modal content such as audio, images, and video, which is precisely what many customers need. At the same time, its high "hallucination rate" poses risks in commercial applications. Robin Li emphasized that Baidu released the two new models precisely to solve these problems.
Robin Li: Applications Are King, Models Must Address "Pain Points"
Robin Li has publicly stated multiple times that without applications, models and chips are worthless.
At the conference, Robin Li reiterated this "applications first" view, his latest statement on the AI industry:
"AI applications are what truly create value. Without AI applications, no model or chip has any value."
Reducing costs is the key to driving an explosion of applications. Robin Li pointed out:
"With such a super capable foundational model, we can create super useful and super interesting AI applications."
Robin Li had previously stated publicly that closed-source models outperform open-source models in efficiency, effectiveness, and cost, calling open-source models an "intelligence tax." After DeepSeek-V3/R1 became popular worldwide this year, that statement drew renewed discussion.
To support its AI ambitions, Baidu also "lit up" China's first fully self-developed 30,000-card Kunlun chip cluster at the conference. Robin Li stated that the cluster can simultaneously carry out full training of multiple large models with hundreds of billions of parameters and support large numbers of customers in fine-tuning models.
In the second half of the AI race, price and applications will become the core of competition. AI models will accelerate from being "out of reach" to "affordable and usable," and the era of application explosion may be accelerating.