HBM Market Expansion

Nvidia CEO Jensen Huang introduces Nvidia’s next-generation AI chip, the Grace Hopper (GH) 200, at SIGGRAPH 2023 in Los Angeles, the United States, on Aug. 8 (local time).

Nvidia, which leads the global artificial intelligence (AI) chip market, is expected to introduce a new advanced graphics processing unit (GPU) that will boost AI performance, fueling the expansion of the advanced memory chip market led mainly by Korean companies. SK hynix and Samsung Electronics stand to benefit significantly in the next-generation high-bandwidth memory (HBM) market, as the two Korean chipmakers are trailblazers in the HBM domain.

Nvidia CEO Jensen Huang announced at the SIGGRAPH 2023 event in Los Angeles on Aug. 8 that the company will begin rolling out the GH200, a super GPU featuring HBM3E, a next-generation HBM chip, in the second quarter of 2024. The GH200 combines an ARM-based NVIDIA Grace central processing unit (CPU) with the Hopper GPU architecture using NVIDIA interconnect technology.

A notable part of the announcement is that Huang emphasized the GPU’s competitiveness by citing HBM3E. Currently, only SK hynix and Samsung Electronics produce advanced HBM globally. HBM is a high-performance memory built by vertically stacking multiple DRAM dies, and it is used in GPUs for AI.

HBM has been developed across successive generations under the following product names: the first generation (HBM), second generation (HBM2), third generation (HBM2E), fourth generation (HBM3), fifth generation (HBM3E), and sixth generation (HBM4). These chips are considered essential memory for running high-performance AI systems.

While fourth-generation HBM3 is used in today’s high-end GPUs, HBM3E is suited to maximizing the performance of the new GH200, Nvidia explained. With 141 GB of HBM3E, the GH200 supports 5 TB/s of bandwidth. This is 1.7 times the capacity and 1.5 times the bandwidth of the H100 GPU, currently the most popular AI-specific product. At present, the H100 is loaded only with HBM3 made by SK hynix.
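The stated multiples can be sanity-checked arithmetically. The H100 reference figures below (80 GB of HBM3 and roughly 3.35 TB/s of memory bandwidth for the SXM variant) are commonly cited specifications assumed here, not figures given in the article:

```python
# Sanity-check the "1.7x capacity, 1.5x bandwidth" comparison between
# the HBM3E-equipped GH200 and the H100.
GH200_CAPACITY_GB = 141        # per the announcement
GH200_BANDWIDTH_TBS = 5.0      # per the announcement
H100_CAPACITY_GB = 80          # assumed H100 SXM HBM3 capacity
H100_BANDWIDTH_TBS = 3.35      # assumed H100 SXM memory bandwidth

capacity_ratio = GH200_CAPACITY_GB / H100_CAPACITY_GB       # about 1.76
bandwidth_ratio = GH200_BANDWIDTH_TBS / H100_BANDWIDTH_TBS  # about 1.49

print(f"capacity: {capacity_ratio:.2f}x, bandwidth: {bandwidth_ratio:.2f}x")
```

Under these assumed H100 figures, the ratios come out close to the roughly 1.7x and 1.5x multiples Nvidia cites.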

The HBM3E in the GH200 will reportedly be supplied by SK hynix and Samsung Electronics. Samsung and SK hynix are more than doubling their HBM production capacity to meet growing global demand, including from Nvidia.

TrendForce forecasts that next-generation products from SK hynix and Samsung, such as HBM3 and HBM3E (as well as HBM3P), will dominate the relevant chip market next year. Samsung and SK hynix are expected to release HBM3E samples in the first quarter of next year, with volume production starting in the second half of 2024.

SK hynix plans to start mass-producing fifth-generation HBM3E in the first half of next year and sixth-generation HBM4 in 2026. Samsung will ship HBM3 and fifth-generation HBM3P starting in the fourth quarter of this year. The company has not yet announced when HBM4 will hit the market.

HBM3E is made on a fifth-generation 10 nm-class advanced process. Recently, U.S. semiconductor company Micron announced that it will develop its own HBM3E and join the HBM market led by SK hynix and Samsung, drawing attention from industry insiders. However, those insiders predict that catching up with SK hynix and Samsung, the leaders in this field, will be a real challenge for Micron.

Copyright © BusinessKorea. Prohibited from unauthorized reproduction and redistribution