Samsung launches 36GB 12H HBM3E DRAM for AI, slated for mass production in H1 2024, with 50%+ more peak bandwidth and capacity compared to its previous HBM gen

March 15, 2024


Samsung announced late on Monday the completion of the development of its 12-Hi 36 GB HBM3E memory stacks, just hours after Micron said it had kicked off mass production of its 8-Hi 24 GB HBM3E memory products. The new memory packages, codenamed Shinebolt, increase peak bandwidth and capacity compared to their predecessors, codenamed Icebolt, by over 50% and are currently the world's fastest memory devices.
As the description suggests, Samsung's Shinebolt 12-Hi 36 GB HBM3E stacks place twelve 24 Gb memory devices on top of a logic die featuring a 1024-bit interface. The new 36 GB HBM3E memory modules feature a data transfer rate of 10 GT/s and thus offer a peak bandwidth of 1.28 TB/s per stack, the industry's highest per-device (or rather per-module) memory bandwidth.
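As a rough sanity check of those figures, the following minimal Python sketch (with hypothetical helper names, not anything from Samsung's spec sheet) derives the 36 GB capacity and 1.28 TB/s peak bandwidth from the per-die density, stack height, interface width, and data rate quoted above.

```python
BITS_PER_BYTE = 8

def stack_capacity_gb(dies: int, die_density_gbit: int) -> float:
    """Capacity of one HBM stack in GB: dies per stack times die density in Gb."""
    return dies * die_density_gbit / BITS_PER_BYTE

def peak_bandwidth_tbps(interface_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth of one HBM stack in TB/s: interface width times per-pin data rate."""
    return interface_bits * data_rate_gtps / BITS_PER_BYTE / 1000

if __name__ == "__main__":
    # 12-Hi stack of 24 Gb dies -> 36.0 GB
    print(stack_capacity_gb(dies=12, die_density_gbit=24))
    # 1024-bit interface at 10 GT/s -> 1.28 TB/s
    print(peak_bandwidth_tbps(interface_bits=1024, data_rate_gtps=10))
```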
Meanwhile, keep in mind that developers of HBM-supporting processors tend to be cautious, so they will likely run Samsung's HBM3E at much lower data transfer rates, partly because of power consumption and partly to ensure stability for artificial intelligence (AI) and high-performance computing (HPC) applications.

[Table: Samsung HBM Memory Generations]
