In a rather unexpected move, Samsung said late on Thursday that it had completed development of the industry's first GDDR7 memory chip. The new device features a data transfer rate of 32 GT/s, uses pulse-amplitude modulation (PAM3) signaling, and promises a 20% power efficiency improvement over GDDR6. To achieve this, Samsung had to implement several new technologies.
Samsung's first 16Gb GDDR7 device features a data transfer rate of 32 GT/s and therefore offers 128 GB/s of bandwidth per chip, up significantly from the 89.6 GB/s per chip provided by GDDR6X at 22.4 GT/s. To put that into perspective, a 384-bit memory subsystem built from 32 GT/s GDDR7 chips would provide a whopping 1.536 TB/s of bandwidth, far exceeding the GeForce RTX 4090's 1.008 TB/s.
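For those who like to check the math, here is a minimal sketch of the bandwidth arithmetic behind these figures; it assumes the usual 32-bit interface per GDDR device, which the announcement does not explicitly state.

```python
# Per-chip and per-subsystem bandwidth math behind the figures above.
# Assumes the customary 32-bit interface per GDDR device (not confirmed by Samsung).

def chip_bandwidth_gbs(data_rate_gtps: float, interface_bits: int = 32) -> float:
    """Peak bandwidth of a single memory chip in GB/s."""
    return data_rate_gtps * interface_bits / 8

def bus_bandwidth_tbs(data_rate_gtps: float, bus_width_bits: int) -> float:
    """Peak bandwidth of a full memory subsystem in TB/s."""
    return data_rate_gtps * bus_width_bits / 8 / 1000

print(chip_bandwidth_gbs(32.0))      # GDDR7 at 32 GT/s    -> 128.0 GB/s per chip
print(chip_bandwidth_gbs(22.4))      # GDDR6X at 22.4 GT/s -> 89.6 GB/s per chip
print(bus_bandwidth_tbs(32.0, 384))  # 384-bit GDDR7 subsystem -> 1.536 TB/s
```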
To hit unprecedentedly high data transfer rates, GDDR7 uses PAM3 signaling, a form of pulse amplitude modulation with three distinct signaling levels (-1, 0, and +1). This scheme transfers three bits of data over two signaling cycles, which is more efficient than the two-level NRZ signaling used by GDDR6. However, it is important to note that PAM3 signals are more complex to generate and decode than NRZ signals (which means additional power consumption), and they can be more susceptible to noise and interference. Still, the benefits of PAM3 appear to outweigh its challenges, as it is set to be adopted by both GDDR7 and USB4 v2.
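As a rough illustration of what PAM3 buys over NRZ, the sketch below compares bits carried per signaling cycle; the 3-bit-to-two-symbol mapping in it is purely hypothetical and is not the actual encoding table defined for GDDR7.

```python
# Illustrative comparison of NRZ and PAM3 signaling efficiency.
# The 3-bit -> two-symbol mapping below is made up for illustration; the real
# GDDR7 encoding table is defined by the JEDEC specification and is not shown here.

NRZ_BITS_PER_SYMBOL = 1.0      # two levels: 1 bit per signaling cycle
PAM3_BITS_PER_SYMBOL = 3 / 2   # three levels: 3 bits over two signaling cycles

# Hypothetical mapping of 3-bit groups onto pairs of PAM3 levels (-1, 0, +1).
# Two PAM3 symbols give 3 * 3 = 9 combinations, enough for all 8 bit patterns.
example_pam3_map = {
    0b000: (-1, -1), 0b001: (-1, 0), 0b010: (-1, +1), 0b011: (0, -1),
    0b100: (0, +1),  0b101: (+1, -1), 0b110: (+1, 0),  0b111: (+1, +1),
}

ratio = PAM3_BITS_PER_SYMBOL / NRZ_BITS_PER_SYMBOL
print(f"PAM3 carries {ratio:.1f}x the bits per signaling cycle of NRZ")
```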
In addition to higher performance, Samsung's 32 GT/s GDDR7 chip is also said to deliver a 20% improvement in power efficiency compared to 24 GT/s GDDR6, though Samsung does not specify how it measures power efficiency. Memory makers typically measure energy per transferred bit, which is a fair metric, and by that measure GDDR7 promises to be more efficient than GDDR6.
However, this does not mean that GDDR7 memory chips and GDDR7 memory controllers will consume less power than today's GDDR6 ICs and controllers. PAM3 encoding and decoding is more complex and will require more power. In fact, Samsung goes on to say that it used an epoxy molding compound (EMC) with high thermal conductivity and 70% lower thermal resistance for GDDR7 packaging to ensure that the active components (the IC itself) do not overheat, an indication that GDDR7 memory devices run hotter than GDDR6 devices, especially at high clocks.
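The arithmetic behind that caveat is straightforward and can be sketched using only the figures Samsung has quoted (a 20% per-bit improvement and 32 GT/s versus 24 GT/s); the baseline energy-per-bit value below is a placeholder, not a real measurement.

```python
# Why a 20% per-bit efficiency gain does not guarantee lower absolute power.
# Only the 20% improvement and the 32 vs 24 GT/s data rates come from the article;
# the baseline energy-per-bit figure is an arbitrary placeholder.

GDDR6_RATE_GTPS = 24.0
GDDR7_RATE_GTPS = 32.0
EFFICIENCY_GAIN = 0.20                      # 20% less energy per transferred bit

gddr6_energy_per_bit = 1.0                  # arbitrary units (placeholder)
gddr7_energy_per_bit = gddr6_energy_per_bit * (1 - EFFICIENCY_GAIN)

# Interface power scales with energy-per-bit times the bits moved per second.
gddr6_power = gddr6_energy_per_bit * GDDR6_RATE_GTPS
gddr7_power = gddr7_energy_per_bit * GDDR7_RATE_GTPS

print(f"Relative interface power, GDDR7 vs GDDR6: {gddr7_power / gddr6_power:.2f}x")
# -> about 1.07x: moving more data per second outweighs the per-bit saving
```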
It is also noteworthy that Samsung's GDDR7 components will offer a low operating voltage option for applications like laptops, but the company does not disclose what kind of performance we should expect from such devices.
Truth be told, Samsung's announcement is a little light on details. The company does not say when it plans to start mass production of its GDDR7 components or which process technology it is set to use. Given that AMD and Nvidia introduce new GPU architectures roughly every two years, it is logical to expect next-generation graphics processors to hit the market in 2024, and they are more than likely to adopt GDDR7.
Meanwhile, Samsung expects artificial intelligence, high-performance computing, and automotive applications to take advantage of GDDR7 as well, so some AI or HPC ASICs may adopt GDDR7 ahead of GPUs.
"Our GDDR7 DRAM will help elevate user experiences in areas that require outstanding graphic performance, such as workstations, PCs and game consoles, and is expected to expand into future applications such as AI, high-performance computing (HPC) and automotive vehicles," said Yongcheol Bae, Executive Vice President of Memory Product Planning Team at Samsung Electronics. "The next-generation graphics DRAM will be brought to market in line with industry demand and we plan on continuing our leadership in the space."