Samsung unveiled its next-generation high-bandwidth memory, HBM3E DRAM, for AI-model training and inference in the data center.
The HBM3E boasts a per-pin speed of 9.8 Gbps, meaning it can achieve transfer rates exceeding 1.2 terabytes-per-second (TBps). To enable higher layer stacks and improve thermal characteristics, Samsung has optimized its non-conductive film (NCF) technology to eliminate gaps between chip layers and maximize thermal conductivity.
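For context, the headline bandwidth follows directly from the per-pin rate. The sketch below is a back-of-the-envelope check that assumes the standard 1,024-bit HBM stack interface; the article itself does not state the pin count, so that width is an assumption.

```python
# Rough sanity check of the quoted HBM3E bandwidth figure.
PIN_SPEED_GBPS = 9.8          # per-pin data rate quoted for HBM3E
INTERFACE_WIDTH_BITS = 1024   # assumed HBM stack interface width (not stated in the article)

# Aggregate bandwidth: bits per second across all pins, converted to bytes.
bandwidth_gbps = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS   # gigabits per second
bandwidth_gbytes = bandwidth_gbps / 8                     # gigabytes per second

print(f"{bandwidth_gbytes:.1f} GBps (~{bandwidth_gbytes / 1000:.2f} TBps)")
# -> 1254.4 GBps, i.e. just over the 1.2 TBps cited
```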
Samsung’s 8H and 12H HBM3 products are currently in mass production, and samples of the HBM3E, code-named Shinebolt, are shipping to customers.
At its annual Memory Tech Day in San Jose, Samsung also highlighted:
- a 32Gb DDR5 DRAM with the industry’s highest capacity
- the industry’s first 32Gbps GDDR7
- the petabyte-scale PBSSD, which offers a significant boost to storage capabilities for server applications
- new 3D structures for sub-10-nanometer (nm) DRAM, allowing larger single-chip capacities that can exceed 100 gigabits (Gb)
- its next-generation 11nm-class DRAM, which is set to offer the industry’s highest density
In addition, Samsung is developing NAND flash breakthroughs that shrink cell sizes and refine channel hole etching techniques, paving the way for 1,000-layer vertical NAND (V-NAND).
“The new era of hyperscale AI has brought the industry to a crossroads where innovation and opportunity intersect, presenting a time with potential for great leaps forward, despite the challenges,” said Jung-Bae Lee, President and Head of Memory Business at Samsung Electronics. “Through endless imagination and relentless perseverance, we will continue our market leadership by driving innovation and collaborating with customers and partners to deliver solutions that expand possibilities.”