Samsung Develops Industry-First 36GB HBM3E 12H DRAM

Samsung Electronics has announced the development of the industry's first 12-stack HBM3E DRAM, named HBM3E 12H, the highest-capacity HBM product to date. The new memory delivers bandwidth of up to 1,280 gigabytes per second (GB/s) and a capacity of 36 gigabytes (GB), an improvement of more than 50% over the previous 8-stack HBM3 8H.
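The 50% capacity gain follows directly from the stack heights. A quick sketch confirms the arithmetic, assuming uniform per-die capacity (3 GB, inferred here from the stated 36 GB total across 12 dies; Samsung does not quote the per-die figure directly):

```python
# Sanity-check of the capacity figures quoted above.
# Per-die capacity is derived from the stated 36 GB / 12-die total,
# not given explicitly in the announcement.
PER_DIE_GB = 36 / 12              # -> 3 GB per stacked DRAM die (assumption)

hbm3e_12h_gb = PER_DIE_GB * 12    # new 12-high stack
hbm3_8h_gb = PER_DIE_GB * 8       # previous 8-high stack

gain = hbm3e_12h_gb / hbm3_8h_gb - 1
print(f"HBM3E 12H: {hbm3e_12h_gb:.0f} GB, "
      f"HBM3 8H: {hbm3_8h_gb:.0f} GB, "
      f"capacity gain: {gain:.0%}")
# prints "HBM3E 12H: 36 GB, HBM3 8H: 24 GB, capacity gain: 50%"
```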

The HBM3E 12H is designed to meet growing demand for high-capacity, high-performance memory in the AI era. Samsung's Executive Vice President of Memory Product Planning noted that AI service providers are seeking higher-capacity HBM, a need the new HBM3E 12H is intended to address.

One of the key technologies applied in the HBM3E 12H is the advanced thermal compression non-conductive film (TC NCF), which enables the 12-layer stack to maintain the same height specification as 8-layer stacks, meeting current HBM package requirements. This technology enhances vertical density by over 20% compared to the previous HBM3 8H product, while also improving thermal properties.

Moreover, Samsung's advanced TC NCF technology optimizes thermal properties by utilizing bumps of various sizes between the chips. This approach not only aids in heat dissipation but also contributes to higher product yield during the chip bonding process.

As AI applications continue to expand, the HBM3E 12H is positioned as an ideal solution for future systems requiring increased memory capacity. The higher performance and capacity of this memory solution are expected to enable customers to manage resources more efficiently and reduce total cost of ownership (TCO) for data centers.

Initial sampling of the HBM3E 12H has commenced, with mass production scheduled for the first half of the year. The memory solution is anticipated to enhance AI training speed by 34% and expand the number of simultaneous users for inference services by more than 11.5 times, compared to the previous HBM3 8H product.
