
Storage is exploding again: AI inference and multimodal data are driving a data explosion, and hard disk and flash memory manufacturers stand to be the biggest beneficiaries

The market is undergoing a logical shift from "buying shovels (GPUs)" to "building warehouses (storage and hardware)." With the explosion of AI inference demand and the exponential growth of multimodal data, storage is no longer just a container for data but an indispensable "working memory" in AI computing. Bank of America believes this transformation will make storage, edge device, and network connectivity vendors the next beneficiaries after GPUs.
As the wave of AI shifts from the training phase to large-scale inference applications, the storage sector, regarded as the "AI working memory," is undergoing an unprecedented revaluation.
On Tuesday, U.S.-listed storage stocks surged across the board: SanDisk soared 27.56%, its best single-day performance since February, while Western Digital and Seagate Technology followed with gains of 16.77% and 14.00%, respectively.
The direct catalyst came from NVIDIA CEO Jensen Huang's remarks at CES: "In terms of storage, this is currently a completely untapped market. This is a market that has never existed before and is likely to become the largest storage market globally, essentially carrying the working memory of global AI." NVIDIA also showcased a new storage platform at CES optimized for agentic AI inference, promising a fivefold improvement in energy efficiency over traditional platforms.
This surge is no coincidence; it reflects a deeper correction in how the market understands the current phase of AI development. Bank of America Merrill Lynch analyst Wamsi Mohan pointed out in a recent report that 2026 will be a turning point for enterprise and edge AI. As multimodal AI (spanning text, images, and video) proliferates, the volume of data generated will grow exponentially, extending the hardware spending cycle.
Bank of America believes the theme of AI investment is shifting from capital-expenditure-driven model training to an AI inference phase centered on return on investment (ROI), and that this transition will make storage, edge device, and network connectivity vendors the next beneficiaries after GPUs.
AI Inference and Multimodal: The True Drivers of Data Explosion
The revaluation of the storage sector is fundamentally driven by a shift in the nature of AI workloads.
According to IDC, the volume of data generated globally each year is expected to soar from 173 ZB in 2024 to 527 ZB in 2029, roughly tripling in five years at a compound annual growth rate of about 25%. The explosion of AI model data places unprecedented demands on storage capacity and speed.
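To make the arithmetic behind those IDC figures explicit, here is a minimal Python sketch (the variable names are illustrative; the figures are taken from the projection cited above) that computes the implied growth multiple and compound annual growth rate:

```python
# IDC projection cited above: 173 ZB generated in 2024 -> 527 ZB in 2029
start_zb, end_zb, years = 173.0, 527.0, 5

multiple = end_zb / start_zb        # total growth over the period
cagr = multiple ** (1 / years) - 1  # compound annual growth rate

print(f"Growth multiple over {years} years: {multiple:.2f}x")  # ~3.05x
print(f"Implied CAGR: {cagr:.1%}")                             # ~25.0%
```

The output shows why the five-year figure is closer to a tripling than a doubling, while the annualized rate still rounds to about 25%.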
Analyst Wamsi Mohan emphasized in the report: “As we look towards 2026 and beyond, we expect AI inference to dominate.” Enterprises need to retain an increasing amount of data for training, analysis, and compliance purposes, leading to a "synchronous surge" in storage demand.
This is especially true with the rise of multimodal generative AI. Unlike early text-only AI, multimodal systems must process and generate unstructured data such as images, video, and audio. This data is not only massive but must also be read and written frequently to support the "monitoring/verification/reintegration" loop, which makes storage no longer a passive archiving tool but an active, indispensable participant in the AI computing process.
Hard Drive Revival: Opportunities for Seagate and Western Digital
In the realm of mass data storage, hard disk drives (HDDs) still hold an irreplaceable position thanks to their cost advantages and capacity density. Bank of America Merrill Lynch believes growing multimodal AI and inference demand will directly drive HDD shipment growth and push customers toward higher-capacity drives, lifting the per-TB value captured by Seagate (STX) and Western Digital (WDC).
Mass Storage Demand (HDD): Seagate (STX) and Western Digital (WDC) are the main beneficiaries. Content generated by multimodal AI (such as generative video, surveillance logs, and synthetic images) requires low-cost, high-capacity storage media. Seagate's HAMR (Heat-Assisted Magnetic Recording) technology and Western Digital's UltraSMR technology are aimed squarely at maximizing per-drive capacity and energy efficiency.
High-Performance Cache Demand (NAND): The inference process is not "read-only." Modern AI systems need to store prompts, feedback labels, security logs, and vector databases for RAG (Retrieval-Augmented Generation). This generates large volumes of random I/O and write operations, directly boosting demand for high-performance enterprise SSDs.
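To illustrate why inference is write-heavy rather than read-only, the following Python sketch shows the kind of per-request bookkeeping an inference service might perform. It is purely illustrative: the file layout, field names, and the toy embedding function are assumptions for this example, not any vendor's actual implementation.

```python
import hashlib
import json
import time
from pathlib import Path

STORE = Path("inference_store")  # hypothetical local store for this sketch
STORE.mkdir(exist_ok=True)

def toy_embedding(text: str, dims: int = 8) -> list[float]:
    """Stand-in for a real embedding model: hash the text into a small vector."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:dims]]

def handle_request(prompt: str, response: str, feedback: str) -> None:
    """Each inference request triggers several small writes, not just reads."""
    record = {
        "ts": time.time(),
        "prompt": prompt,
        "response": response,
        "feedback": feedback,                # label kept for later analysis/audit
        "embedding": toy_embedding(prompt),  # vector kept for RAG-style retrieval
    }
    # Append-only log of prompts and responses (monitoring, compliance)
    with open(STORE / "requests.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")
    # Separate append standing in for a vector-database insert
    with open(STORE / "vectors.jsonl", "a") as f:
        f.write(json.dumps({"ts": record["ts"], "vec": record["embedding"]}) + "\n")

handle_request("What drove storage stocks on Tuesday?", "AI inference demand.", "thumbs_up")
```

Every request here produces several small appended records, which is the random-write, capacity-accumulating access pattern the report argues favors high-performance enterprise SSDs.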
Rise of Edge AI: SanDisk and the Battlefield of High-Performance Flash Memory
Jensen Huang made the same point at CES: storage is "a completely untapped market" that is "likely to become the largest storage market globally, essentially carrying the working memory of global AI."
This view aligns with Bank of America Merrill Lynch's analysis. Beyond the data center, AI is rapidly spreading to edge devices. Edge AI, which runs AI directly on terminal devices such as smartphones, PCs, cars, and drones, will be another huge growth area.
Opportunities for SanDisk: Edge AI demands extremely low latency and high reliability. Devices must locally process high-definition video streams, sensor data, and real-time decisions, pushing storage media from low-end parts toward high-performance UFS and NVMe interfaces. As a leader in embedded and removable flash memory, SanDisk stands to benefit from rising per-device storage capacity and a richer product mix (such as automotive-grade UFS 4.1).
Revival of Terminal Manufacturers: Apple (AAPL) is seen as the "ultimate edge AI player." With a vast installed base of devices, custom Apple Silicon (M-series/A-series) chips, and a strong focus on privacy, Apple can deliver faster, more private experiences through on-device inference, and the potential integration of third-party models such as Gemini into Siri is expected to further strengthen ecosystem stickiness. Meanwhile, Dell (DELL) and HP (HPQ) will also benefit from "AI PC" demand as enterprises refresh their fleets, with Gartner forecasting that AI PCs will account for 43% of all PC shipments by 2025.
Price Increases and Market Outlook
As demand surges, tightness on the supply side is also supporting the rally. According to Bloomberg Intelligence analyst Jake Silverman, with AI training and inference demand growing, storage supply is tightening and prices are soaring. Earlier this week, the Korea Economic Daily reported that Samsung Electronics and SK Hynix are seeking to raise server DRAM prices in the first quarter by 60% to 70% over the fourth quarter of last year.
Bank of America Merrill Lynch concluded that hardware spending's share of IT industry revenue has risen year after year since 2022, and that the industry is entering a "hardware renaissance." In this cycle, the core beneficiaries of the AI inference wave will include not only Nvidia, which supplies compute, but also the storage manufacturers that supply "memory" and connectivity providers such as Amphenol and Corning.
