
Taking on HBM: SoftBank partners with Intel to develop "ZAM," aiming to make AI memory cheaper and more energy-efficient

SoftBank is partnering with Intel to bet on the next-generation AI memory technology ZAM. The technology originates from a U.S. government project and focuses on low power consumption and low cost, directly addressing AI's energy-consumption bottleneck and supply-chain strains. A prototype is expected in 2028. Following the news, shares of both SoftBank and Intel surged more than 5%.
SoftBank and Intel are joining forces on next-generation AI memory technology, attempting to carve out a new path in a market dominated by high-bandwidth memory (HBM). The "Z-Angle Memory" (ZAM) project developed by the two parties focuses on reducing power consumption and cost, directly addressing the energy-consumption bottleneck and supply-chain challenges that AI currently faces.
SoftBank's subsidiary Saimemory announced on Tuesday that it has signed a cooperation agreement with Intel to jointly promote the commercialization of this next-generation memory technology aimed at artificial intelligence and high-performance computing. The technology aims to improve traditional dynamic random access memory (DRAM) architecture to meet the growing performance demands of AI applications.
According to a press release from SoftBank, a prototype from the ZAM project is expected to be completed within the fiscal year ending March 31, 2028, with commercialization targeted for fiscal year 2029.
Joshua Fryman, Chief Technology Officer of Intel's Government Technology Division and an Intel Fellow, said in a statement that standard memory architectures cannot keep up with AI demands, and that the new architectures and assembly methods developed by Intel enhance DRAM performance while reducing power consumption and cost.
Following the announcement, SoftBank's shares rose 5.13% in Tokyo trading, while Intel's stock gained 5%.

Technology Originates from U.S. Government Project
Saimemory, established in December 2024, will leverage Intel's expertise in memory technology, particularly the technology Intel developed as a participant in the U.S. Department of Energy's Advanced Memory Technology Project.
This Department of Energy project focuses on developing core technologies for advanced memory, with Intel responsible for improving the performance and energy efficiency of the next generation of DRAM used in computers and servers.
According to a report by Nikkei Asia last year, the Japanese multinational IT equipment and services company Fujitsu is also involved in this project.
AI Demand Triggers Supply Chain Tension
The industrial backdrop to this collaboration is the surge in memory demand driven by AI applications, with demand growth far outpacing supply capacity and causing shortages across the entire memory supply chain.
Currently, AI chips rely heavily on high-performance memory solutions such as HBM, but these products are difficult to produce, expensive, and sourced from a highly concentrated set of suppliers.
The ZAM project aims to provide an alternative path by improving traditional DRAM architecture, reducing manufacturing complexity and costs while ensuring performance, which may offer more options for the AI hardware supply chain.
Energy Efficiency Becomes a Core Consideration
The ZAM project's emphasis on energy efficiency reflects the industry's growing concern over the enormous energy consumption of AI computing. Joshua Fryman stated that the new memory architecture and assembly methods developed by Intel could see broader adoption over the next decade, positioning the technology to significantly reduce power consumption while enhancing performance. As AI models continue to scale, the memory bandwidth and capacity required for training and inference keep growing, and the corresponding energy consumption has become one of the bottlenecks constraining AI development.
If a more energy-efficient memory technology can be successfully commercialized, it will directly affect data-center operating costs and the economic viability of AI applications.
