05 November 24, 07:07
Quote:SK hynix preparing for next-gen HBM4, PCIe 6 SSDs and LPCAMM2
In September, SK hynix announced the mass production of its new HBM3e memory featuring 12 layers. Now, just two months later, the company has unveiled a 16-layer version.
The standard 8-layer High Bandwidth Memory stack offers a capacity of 24GB, since each layer provides 3GB. Increasing the layer count to 12 raises the capacity to 36GB, and the new 16-layer version pushes it to 48GB per stack, double the capacity currently used in large AI accelerators from AMD and NVIDIA.
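The capacity math above can be sketched in a few lines, assuming the article's figure of 3GB per DRAM layer:

```python
# Per-stack HBM capacity for a given layer count,
# assuming 3GB per layer as stated in the article.
GB_PER_LAYER = 3

def stack_capacity_gb(layers: int) -> int:
    """Total HBM stack capacity in GB for a given layer count."""
    return layers * GB_PER_LAYER

for layers in (8, 12, 16):
    print(f"{layers}-layer stack: {stack_capacity_gb(layers)} GB")
# 8-layer: 24 GB, 12-layer: 36 GB, 16-layer: 48 GB
```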
SK hynix describes itself as a “Full Stack AI Memory Provider,” since HBM is primarily aimed at the data-center market and is no longer a suitable or cost-effective solution for consumers. Marketing language aside, the press release SK hynix shared today says the 16-layer HBM3e memory can deliver 18% faster training and 32% faster inference compared to the 12-layer HBM3e product.
The HBM3e announcement wasn’t the only one made at the SK AI Summit; the company’s CEO also revealed that HBM4 memory (also using 16-high stacks) is on the roadmap, and that development of LPCAMM2 and LPDDR6 memory is already underway. There were also comments on PCIe 6.0 SSDs already in development. Have you guys upgraded to PCIe 5.0 yet?
Continue Reading...