AnandTech
Anton Shilov

Micron Kicks Off Production of HBM3E Memory

Micron Technology on Monday said that it had initiated volume production of its HBM3E memory. The company's HBM3E known good stack dies (KGSDs) will be used for Nvidia's H200 compute GPU for artificial intelligence (AI) and high-performance computing (HPC) applications, which will ship in the second quarter of 2024.

Micron has announced it is mass-producing 24 GB 8-Hi HBM3E devices with a data transfer rate of 9.2 GT/s and a peak memory bandwidth of over 1.2 TB/s per device. Compared to HBM3, HBM3E increases data transfer rate and peak memory bandwidth by a whopping 44%, which is particularly important for bandwidth-hungry processors like Nvidia's H200.
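The quoted figures can be sanity-checked with some quick arithmetic. This sketch assumes the standard 1024-bit-wide interface of an HBM stack and the 6.4 GT/s pin speed of baseline HBM3; only the 9.2 GT/s rate comes from Micron's announcement.

```python
# Back-of-the-envelope check of Micron's HBM3E bandwidth figures.
DATA_RATE_GTS = 9.2    # giga-transfers per second per pin (from the announcement)
BUS_WIDTH_BITS = 1024  # assumed: standard HBM stack interface width
HBM3_RATE_GTS = 6.4    # assumed: baseline HBM3 pin speed

# Peak bandwidth per stack: transfers/sec x bits/transfer, converted to bytes.
bandwidth_gbs = DATA_RATE_GTS * BUS_WIDTH_BITS / 8
print(f"Peak bandwidth per stack: {bandwidth_gbs:.1f} GB/s")  # 1177.6 GB/s, i.e. ~1.2 TB/s

# The ~44% uplift over HBM3 follows from the pin speeds alone.
uplift_pct = (DATA_RATE_GTS / HBM3_RATE_GTS - 1) * 100
print(f"Uplift over HBM3: {uplift_pct:.0f}%")  # ~44%
```

Both results line up with the "over 1.2 TB/s per device" and 44% figures in the announcement.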

Nvidia's H200 is based on the Hopper architecture and offers the same compute performance as the H100, but it is equipped with 141 GB of HBM3E memory delivering up to 4.8 TB/s of bandwidth, a significant upgrade over the H100's 80 GB of HBM3 and up to 3.35 TB/s.

Micron's memory roadmap for AI is further solidified by a 36 GB 12-Hi HBM3E product planned for March 2024, though it remains to be seen where those devices will be used.

Micron produces its HBM3E using its 1β (1-beta) process technology, a notable achievement in its own right: the company is applying its latest production node to data center-grade products, a testament to the maturity of that manufacturing technology.

Beating competitors SK Hynix and Samsung to mass production of HBM3E is an important win for Micron, which currently holds about a 10% share of the HBM market. The move lets the company introduce a premium product ahead of its rivals, potentially increasing its revenue and profit margins while gaining market share.

"Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile," said Sumit Sadana, executive vice president and chief business officer at Micron Technology. "AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications."

Source: Micron
