Memory chipmaker SK hynix recently updated one of its spec pages for HBM2E and, perhaps inadvertently, gave away the initial specifications of the next-gen HBM3 standard. Despite its name, HBM3 is actually the fourth generation of the High Bandwidth Memory (HBM) design, and according to SK hynix, the new standard will offer 665GB/s or more of bandwidth with an I/O speed of 5.2Gbps.
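As a quick sanity check, the bandwidth figure follows from the quoted I/O speed if we assume the conventional 1024-bit interface that every HBM generation so far has used (the bus width is not stated on SK hynix's page, so this is an assumption):

```python
# Back-of-the-envelope check of the leaked figure, assuming the standard
# 1024-bit HBM stack interface (not confirmed by SK hynix's page).
io_speed_gbps = 5.2        # per-pin data rate, Gb/s
bus_width_bits = 1024      # assumed interface width per stack
bandwidth_gbps = io_speed_gbps * bus_width_bits   # total Gb/s across the stack
bandwidth_GBps = bandwidth_gbps / 8               # convert bits to bytes

print(f"{bandwidth_GBps:.1f} GB/s per stack")     # ~665.6 GB/s
```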
This works out to an uplift of close to 50% in bandwidth as well as in I/O speed over current-gen HBM2E. However, some other key specs of HBM3 remain unknown. For example, details on the 3D stack height and die density of the next-gen design are unavailable at the moment, which means we can't determine the maximum memory capacity each HBM3 stack will be able to accommodate.
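To illustrate why those two figures matter, per-stack capacity is simply the number of stacked dies multiplied by the capacity of each die. The values below are placeholders for the sake of the arithmetic, not leaked HBM3 numbers:

```python
# Hypothetical illustration: stack capacity = dies per stack x per-die capacity.
# Both inputs are placeholder assumptions, not HBM3 specifications.
dies_per_stack = 8        # e.g. an 8-Hi stack (assumption)
die_density_GB = 2        # e.g. a 16Gb (2GB) die (assumption)
stack_capacity_GB = dies_per_stack * die_density_GB
print(f"{stack_capacity_GB} GB per stack")   # 16 GB with these placeholder values
```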
Below is a graphic comparing the gen-on-gen improvement for HBM:
In terms of availability, HBM3 is likely still some way off, since HBM2E already provides sufficient throughput for most of the GPUs we have today.
In related HBM news, Samsung has come up with an innovative technology called HBM-PIM (processing-in-memory), which embeds AI processing inside the memory itself to improve data flow efficiency. Once HBM3 is out, it will be interesting to see how the combination of the two will perform.
Source: SK hynix