

SK hynix Announces World’s First 48GB HBM3E 16-Hi Memory — Next-Generation PCIe 6.0 SSDs and UFS 5.0 Storage Also in the Works


CEO of SK hynix.

During the SK AI Summit 2024, SK hynix's CEO took the stage and unveiled the industry's first 16-Hi HBM3E memory, beating Samsung and Micron to the punch. As HBM4 development continues, SK hynix has prepared a 16-layer version of its HBM3E lineup to ensure "technological stability" and aims to ship samples as early as next year.

A few weeks ago, SK hynix unveiled a 12-Hi variant of its HBM3E memory, securing contracts from AMD (MI325X) and Nvidia (Blackwell Ultra). Having raked in record profits last quarter, SK hynix is in great shape again: the giant has just announced a 16-layer upgrade to its HBM3E range, offering 48 GB per stack (3 GB per individual die). This increase in density allows AI accelerators to carry up to 384 GB of HBM3E memory in an 8-stack configuration.
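The capacity figures above follow directly from the per-die density; a quick sketch of the arithmetic (illustrative only, not vendor data):

```python
# Capacity math for a 16-Hi HBM3E stack, per the figures in the article.
gb_per_die = 3          # HBM3E die density in GB
dies_per_stack = 16     # 16-Hi stack
stacks_per_gpu = 8      # 8-stack accelerator configuration

stack_capacity = gb_per_die * dies_per_stack      # GB per stack
total_capacity = stack_capacity * stacks_per_gpu  # GB per accelerator
print(stack_capacity, total_capacity)  # 48 384
```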

SK hynix claims an 18% improvement in training performance and a 32% increase in inference performance. Like its 12-Hi counterpart, the new 16-Hi HBM3E incorporates packaging technologies such as MR-MUF, which connects the dies by melting the solder between them. SK hynix expects 16-Hi HBM3E samples to be ready by early 2025. However, this memory could be short-lived, as Nvidia's next-generation Rubin chips are expected to enter mass production later next year and will be based on HBM4.

SK hynix HBM3E 16-Hi

That's not all: the company is also actively working on PCIe 6.0 SSDs, high-capacity QLC (Quad-Level Cell) eSSDs for AI servers, and UFS 5.0 for mobile devices. Additionally, to power future laptops and even handhelds, SK hynix is developing an LPCAMM2 module and soldered LPDDR5/6 memory built on its 1c nm node. There is no mention of CAMM2 modules for desktop computers, so desktop users will have to wait, at least until CAMM2 adoption matures.

To overcome what SK hynix calls the "memory wall," the memory maker is developing solutions such as processing-near-memory (PNM), processing-in-memory (PIM), and computational storage. Samsung has already demonstrated its version of PIM, in which data is processed inside the memory itself so it does not need to be moved to an external processor.

HBM4 will double the channel width from 1,024 bits to 2,048 bits while supporting up to 16 vertically stacked (16-Hi) DRAM dies, each containing up to 4 GB of memory. These are monumental generation-over-generation upgrades that should be enough to meet the high memory demands of upcoming AI GPUs.
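Putting those HBM4 headline numbers together (a simple illustration of the figures quoted above, not a specification):

```python
# HBM4 vs. HBM3E headline figures from the paragraph above.
hbm3e_channel_bits = 1024
hbm4_channel_bits = 2 * hbm3e_channel_bits   # doubled channel width

gb_per_die = 4                               # up to 4 GB per HBM4 die
dies_per_stack = 16                          # 16-Hi stack
hbm4_stack_gb = gb_per_die * dies_per_stack  # GB per HBM4 stack
print(hbm4_channel_bits, hbm4_stack_gb)  # 2048 64
```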

Samsung's HBM4 tape-out is expected to take place later this year. Meanwhile, reports suggest that SK hynix already completed its tape-out phase in October. Following a traditional silicon development lifecycle, Nvidia and AMD are expected to receive qualification samples by the first and second quarters of next year, respectively.

The SK AI Summit 2024 was held at the COEX Convention Center in Seoul on November 4-5. The company claims it is the largest AI symposium in Korea.