In 2018, OpenAI introduced the first GPT model, opening a new chapter in the global technology revolution as the wave of artificial intelligence swept rapidly across the world. In the process, the massive compute demand driven by large AI models has grown explosively, and the accompanying strong demand for data centers has placed higher requirements on memory and storage.
01
High-capacity SSDs arrive right on time as a new battle over AI storage begins.
According to the vice president of Asia-Pacific sales at Solidigm, an SK Hynix subsidiary, the parameter counts of GPT models have kept climbing since GPT applications took off: GPT-3 has billions of parameters, while GPT-4 is reported to reach the trillion scale.
Faced with trillion-parameter models, HBM (High Bandwidth Memory), a new class of memory chip paired with CPUs and GPUs, has emerged. With its high bandwidth, high capacity, low latency, and low power consumption, HBM has become standard equipment in AI servers.
On the NAND side, QLC (quad-level cell) enterprise SSDs have gradually become a preferred storage option for data centers thanks to their high capacity, low power consumption, and fast read speeds. With North American customers in particular expanding their storage orders, demand for QLC enterprise SSDs has climbed. According to global market research firm TrendForce, QLC enterprise SSD shipments are expected to reach 30EB (exabytes) in 2024, a fourfold increase over 2023.
Currently, AI inference servers are dominated by read operations, writing data far less frequently than AI training servers. Compared with HDDs, QLC enterprise SSDs deliver faster reads and capacities of up to 64TB.
Beyond higher capacity and faster reads, another crucial reason QLC enterprise SSDs are gaining ground in AI applications is their superior TCO (Total Cost of Ownership). Specifically, higher storage density, better server and floor-space utilization, and lower energy consumption allow QLC SSDs to help large-scale data centers reduce overall TCO while still meeting high-performance storage needs.
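To make the TCO argument concrete, here is a minimal sketch of how such a capacity-plus-energy comparison might be set up. All drive prices, capacities, power draws, and the electricity rate below are hypothetical placeholders, not vendor data; a full analysis would also weigh performance per watt, endurance, replacement rates, and rack-space costs.

```python
import math

def tco_per_tb(target_tb, drive_tb, drive_price_usd, watts_per_drive,
               years=5, usd_per_kwh=0.10):
    """Rough TCO per TB: hardware cost plus lifetime energy cost."""
    drives = math.ceil(target_tb / drive_tb)   # drives needed for the target
    capex = drives * drive_price_usd           # hardware cost
    kwh = drives * watts_per_drive * 24 * 365 * years / 1000
    opex = kwh * usd_per_kwh                   # lifetime energy cost
    return (capex + opex) / target_tb

# Hypothetical profiles: a 64TB QLC SSD vs. a 20TB nearline HDD,
# both sized for a 1PB (1000TB) raw-capacity target.
qlc = tco_per_tb(1000, drive_tb=64, drive_price_usd=4000, watts_per_drive=20)
hdd = tco_per_tb(1000, drive_tb=20, drive_price_usd=400, watts_per_drive=9)
print(f"QLC SSD: ${qlc:.2f}/TB   HDD: ${hdd:.2f}/TB")
```

Note that on acquisition cost alone HDDs can still come out ahead per TB; the TCO case made above rests on the density, floor-space, and efficiency terms that a fuller model would add.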
In short, QLC enterprise SSDs are gradually displacing HDDs to become the new favorite in AI storage. As AI training grows ever more energy-hungry, energy efficiency has become a key consideration for manufacturers across the supply chain, making high-capacity QLC SSDs a strong fit for read-intensive, capacity-hungry applications.
02
Upstream storage manufacturers are making strides, with Solidigm and Samsung leading the way.
Driven by the AI boom, data centers' demand for ultra-large-capacity SSDs keeps growing, and QLC-based NAND flash products have become one of the best options for meeting it.
However, power costs are a constraint for AI data centers, pushing operators to equip each storage server with as much capacity as possible. Recently, Hyun Jae-woong, vice president of product planning at Samsung, noted that compared with multi-level cell (MLC) and triple-level cell (TLC) devices, QLC NAND stores more data per cell, significantly increasing storage density.
From a technology standpoint, Samsung and Solidigm lead the QLC enterprise SSD field; they are currently the only two manufacturers with validated QLC products on the market.
Solidigm is currently shipping its fourth-generation 192-layer QLC NAND in volume. Launched in 2023, the fourth-generation QLC NAND is based on floating-gate technology, stacks 192 layers, and reaches a single-die density of 1.3 terabits. Compared with the first-generation 64-layer QLC NAND, its programming speed is 2.5 times higher, random read performance 5 times higher, and read latency 1.5 times lower.
Samsung will begin mass production of its ninth-generation V-NAND with quad-level cells (QLC) in the second half of this year. Beyond Samsung and SK Group, Western Digital is also pushing to ship high-capacity storage products and is expected to mass-produce 162-layer QLC SSDs in the future.
In terms of revenue and market share, Samsung and SK Group are also far ahead. According to a previous report by TrendForce, in the first quarter of 2024, the combined market share of Samsung and SK Group (SK Hynix & Solidigm) in the global enterprise SSD field reached 77.8%, with Samsung leading at 47.4% and SK Group following closely at 30.4%.
On the revenue side, supplier production cuts meant that the surge in large-capacity orders since the fourth quarter of 2023 was never fully met. Combined with other end products expanding orders through low-priced inventory procurement and the sharp growth in large-capacity storage demand from AI servers, enterprise SSD procurement bits rose more than 20% quarter-on-quarter in the first quarter of 2024. With both volume and price rising, enterprise SSD revenue reached $3.758 billion in the quarter, up 62.9% sequentially. The top two, Samsung and SK Group, posted revenues of $1.782 billion and $1.144 billion, respectively.
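The quoted shares follow directly from these revenue figures, as a quick arithmetic check shows (the 77.8% combined share is the sum of the two individually rounded shares):

```python
# Sanity-check the TrendForce figures quoted above: do the vendor
# revenues reproduce the stated market shares?
total = 3.758      # Q1 2024 enterprise SSD revenue, USD billions
samsung = 1.782    # Samsung revenue, USD billions
sk_group = 1.144   # SK Hynix + Solidigm revenue, USD billions

samsung_share = samsung / total * 100                    # ≈ 47.4%
sk_share = sk_group / total * 100                        # ≈ 30.4%
combined = round(samsung_share, 1) + round(sk_share, 1)  # 77.8

print(f"Samsung {samsung_share:.1f}%, SK Group {sk_share:.1f}%, "
      f"combined {combined:.1f}%")
```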
TrendForce predicts that the demand for high-capacity SSDs from AI servers will continue to rise in the second quarter, pushing up enterprise SSD contract prices by more than 20% and potentially increasing enterprise SSD revenue growth by another 20%.
03
Conclusion
With the explosive popularity of AI large models like ChatGPT and Sora, artificial intelligence technology has been widely applied across various industries, with AI phones, AI PCs, and AI servers gradually coming to the forefront. In the future, storage technology will be one of the key drivers for further increasing the penetration rate of artificial intelligence.
Disclaimer: This article is created by the original author. The content of the article represents their personal opinions. Our reposting is for sharing and discussion purposes only and does not imply our endorsement or agreement. If you have any objections, please get in touch with us through the provided channels.