In January 2025 at CES, major storage manufacturers showcased their latest advances, each demonstrating storage capabilities aimed at AI applications. SK hynix, Samsung, and Micron unveiled their newest technologies and products, leading the way in AI storage innovation.
01
SK hynix
At CES, SK hynix showcased transformative memory products aimed at advancing the AI era, including its HBM, server DRAM, eSSD, CXL, and PIM products.
As a leading HBM supplier, SK hynix revealed a sample of its 16-layer HBM3E, the world’s largest-capacity HBM product. The 48GB 16-layer HBM3E is optimized specifically for AI training and inference and is still in development. Additionally, the company displayed DDR5 RDIMM and MCR DIMM, two high-speed server DRAM modules tailored for data center environments that provide fast data processing and large memory capacity.
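As a rough sanity check on the capacity figure, a 48GB stack built from 16 layers implies 3GB (24Gbit) per DRAM die; the short sketch below simply works through that arithmetic. The per-die density is an inference from the quoted numbers, not something stated in the announcement.

```python
# Back-of-the-envelope check of the 16-layer HBM3E capacity figure.
# Assumption: capacity is split evenly across the 16 stacked DRAM dies.

total_capacity_gb = 48   # quoted stack capacity, in gigabytes
layers = 16              # quoted number of stacked DRAM dies

per_die_gb = total_capacity_gb / layers   # 3.0 GB per die
per_die_gbit = per_die_gb * 8             # 24 Gbit per die

print(f"Per-die capacity: {per_die_gb:.0f} GB ({per_die_gbit:.0f} Gbit)")
```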
As data storage becomes increasingly important to AI applications, SK hynix highlighted its eSSD solutions in particular, including the PS1010, PEB110, and PE9010. These products are tailored for data centers and offer the reliability and high read/write speeds needed to handle the massive volumes of data generated by AI applications.
Moreover, SK hynix showcased its cutting-edge CXL technology, including CMM-DDR5 and CMM-Ax, which advances memory interfaces to improve the flexibility and scalability of AI systems. PIM solutions, including GDDR6-AiM and AiMX, were also presented: high-speed, low-power accelerator products that handle complex computing tasks and improve data-processing efficiency in AI environments.
At the same time, SK hynix demonstrated its edge AI solutions, highlighting how widely its products are used in AI services. These included LPCAMM2, the mobile NAND flash solution ZUFS 4.0, and PCB01, the industry-leading high-performance SSD for edge AI PCs. Through these demonstrations, SK hynix emphasized how the energy efficiency of these technologies delivers top-tier performance across a wide range of applications, from mobile AI to consumer electronics.
02
Samsung
At CES 2025 in Las Vegas, Samsung presented its full range of AI-driven smart products, reimagined to connect the best technologies with everyday lifestyles for a more sustainable and accessible world. Earlier, its 10.7Gbps LPDDR5X DRAM received a 2025 CES Innovation Award.
Samsung developed the world’s first 12nm LPDDR5X DRAM, which supports an industry-leading speed of 10.7Gbps. LPDDR5X combines high performance with the industry’s smallest chip size and is optimized for next-generation edge AI applications. JunYoung Lee, project manager of Samsung’s memory product planning team, and JinSuk Chung, chief engineer, elaborated on its technical features.
LPDDR5X is a memory solution optimized for high-performance edge AI in smartphones, tablets, and laptops. The key design objective was a thin, low-power chip that still delivers higher performance. Through collaboration across departments, from planning and design to process and packaging, LPDDR5X reached the industry’s fastest speed of 10.7Gbps and the thinnest package at 0.65mm.
Reaching 10.7Gbps required securing the performance of the high-speed data-transfer I/O circuits. The design also uses techniques that lower the internal operating voltage, enabling high-speed operation while keeping power consumption low.
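To put the per-pin figure in context, the sketch below estimates peak package bandwidth assuming a 64-bit data bus, a common LPDDR5X package width that the article does not explicitly state.

```python
# Rough peak-bandwidth estimate for an LPDDR5X package.
# Assumption: a 64-bit-wide package; the article only quotes the per-pin rate.

per_pin_gbps = 10.7   # quoted per-pin data rate, in Gbit/s
bus_width_bits = 64   # assumed package data-bus width

peak_gbps = per_pin_gbps * bus_width_bits   # total Gbit/s across the bus
peak_gbytes = peak_gbps / 8                 # convert to GB/s

print(f"Peak bandwidth: {peak_gbytes:.1f} GB/s")   # ~85.6 GB/s
```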
DVFS lowers the voltage on the data paths during certain operating intervals, reducing the power consumed in signal transmission. Samsung’s F-DVFS (Full Dynamic Voltage Frequency Scaling) extends this capability so that voltage can be adjusted across the entire operating range, from the minimum to the maximum frequency. This flexibility to match voltage to operating speed maximizes energy efficiency and extends battery life.
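As a conceptual illustration only, not Samsung’s actual implementation, the sketch below contrasts coarse step-wise voltage scaling with a policy that adjusts voltage continuously across the whole frequency range, which is the basic idea behind extending DVFS to full-range operation. All voltage and frequency values are made up for illustration.

```python
# Conceptual illustration of step-wise vs. full-range voltage scaling.
# All voltage/frequency values are illustrative, not real LPDDR5X parameters.

F_MIN, F_MAX = 0.8, 4.8    # hypothetical operating frequency range, GHz
V_MIN, V_MAX = 0.5, 1.05   # hypothetical supply voltage range, V

def stepwise_voltage(freq_ghz: float) -> float:
    """Coarse DVFS: pick one of a few fixed voltage steps."""
    if freq_ghz < 2.0:
        return 0.7
    if freq_ghz < 3.5:
        return 0.9
    return V_MAX

def full_range_voltage(freq_ghz: float) -> float:
    """Full-range scaling: interpolate voltage across the whole frequency span."""
    ratio = (freq_ghz - F_MIN) / (F_MAX - F_MIN)
    return V_MIN + ratio * (V_MAX - V_MIN)

for f in (1.0, 2.5, 4.0, 4.8):
    print(f"{f:.1f} GHz -> step {stepwise_voltage(f):.2f} V, "
          f"full-range {full_range_voltage(f):.2f} V")
```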
Samsung’s LPDDR5X is also the industry’s thinnest 12nm DRAM. To achieve this, two chips are combined into a single unit and four such units make up the package, and the EMC (epoxy molding compound) that protects the circuitry was optimized. Finally, back-polishing technology, which thins the back of the wafer while maintaining operational characteristics and reliability, was essential.
Collaboration with customers was crucial during development and verification. Internally, the DRAM design team responsible for the low-power, high-performance product and the package (PKG) development team that worked relentlessly to make LPDDR5X thinner played key roles. This combination of internal and industry-wide collaboration led to the product’s success and the CES Innovation Award.
With the growing demand for foldable smartphones, consumers are seeking thinner devices, which requires reducing the thickness of every component. Thinner components leave more internal airflow space inside the device, making heat easier to manage, so smartphones can sustain high performance, for example in gaming, without throttling due to heat.
Samsung experts noted that as AI advances, the data center market for large-scale servers is growing, and energy efficiency is becoming a key issue. While many associate AI memory technology with HBM, there is a rising demand for products like LPDDR, which can achieve certain performance levels at low power. This is also true for LP modules (LPCAMM, SOCAMM, etc.) designed to replace existing DDR modules.
Compared to DDR, LPDDR has advantages in energy efficiency and high-performance operation, thus increasing its demand. Additionally, LPDDR’s use in personal computers is rising, and with the growing popularity of electric vehicles and autonomous driving, Auto LPDDR5X is also gaining traction in the automotive market.
Samsung Electronics is actively developing its next-generation LPDDR6 solution for 2026, which will further improve on LPDDR5X in both performance and energy efficiency. Module products such as LPCAMM and SOCAMM are currently being verified with customers, and Samsung is preparing high-performance, low-power DRAM technologies such as LPW, which raises bandwidth by increasing the number of data I/Os, and LP-PIM, which adds computing functionality. These are expected to extend beyond edge AI into broader application fields.
03
Micron
During CES 2025, Micron announced an expansion of its Crucial consumer memory and storage lineup, including the launch of the high-speed Crucial P510 SSD and new density and form-factor options across its existing DRAM range, giving consumers broader choice and flexibility. The P510 delivers read/write speeds of up to 11,000/9,550 MB/s, bringing Gen5 performance to mainstream users.
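For context, the quick calculation below compares the quoted sequential read speed with the approximate usable bandwidth of a PCIe 5.0 x4 link; the link parameters are standard PCIe figures assumed here, not taken from Micron’s announcement.

```python
# Compare the quoted sequential read speed with the PCIe 5.0 x4 link ceiling.
# Link parameters are standard PCIe 5.0 figures (32 GT/s, 128b/130b encoding).

lane_rate_gtps = 32.0            # PCIe 5.0 transfer rate per lane, GT/s
encoding_efficiency = 128 / 130  # 128b/130b line encoding
lanes = 4                        # typical M.2 NVMe link width

link_gbytes = lane_rate_gtps * encoding_efficiency * lanes / 8   # ~15.75 GB/s
quoted_read_gbytes = 11.0                                        # 11,000 MB/s

print(f"PCIe 5.0 x4 ceiling: ~{link_gbytes:.2f} GB/s")
print(f"Quoted read speed uses ~{quoted_read_gbytes / link_gbytes:.0%} of the link")
```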
The new P510 SSD is designed for laptops and desktops and features a new, more power-efficient architecture, consuming 25% less power than previous Crucial Gen5 SSDs to support longer battery life.
Its fast read/write speeds let users maximize productivity and enjoy smoother, more immersive experiences, from gaming to creative applications. The SSD comes in 1TB and 2TB capacities with an integrated heatsink, and its single-sided design makes installation easy, even in newer laptops that support Gen5.
The Crucial lineup also gained the Crucial P310 2280 SSD with a heatsink. The integrated heatsink is designed for the PlayStation 5 but can also be used in desktops, delivering excellent performance for gamers and creators. This SSD reaches read speeds of up to 7,100 MB/s, offering fast, reliable, and cost-effective storage.
04
Summary
Beyond the fierce competition in high-bandwidth memory (HBM) and the race to develop and mass-produce HBM4, high-speed, low-power storage products have become indispensable to the AI application boom. High-performance storage is critical to AI data processing and storage efficiency, and this demand will continue to drive innovation in storage technology.