Driven by HBM, DDR5

Demand for the high-performance memory semiconductors that AI chips require is growing explosively as global tech companies in the cloud server business race to secure AI chips. Expectations are therefore high for the upcoming business prospects of the domestic companies that supply these memory chips. Attention is focused on the next-generation High Bandwidth Memory (HBM) products that Samsung Electronics and SK hynix plan to launch next year.

In an analysis released on Aug. 1, market research firm TrendForce suggested that competition in the AI accelerator chip market will intensify as cloud service providers (CSPs) in North America and China conduct further validation of AI chip-related technologies, supporting an optimistic outlook for the HBM market.

Samsung Electronics and SK hynix currently lead the global market in the AI era with HBM, a DRAM technology drawing significant attention. HBM is a high-performance product that vertically stacks multiple DRAM chips and is used in devices such as graphics processing units (GPUs) designed for AI processing.

HBM has been developed over multiple generations: first-generation HBM, second-generation HBM2, third-generation HBM2E, fourth-generation HBM3, and fifth-generation HBM3E. Each new generation increases the chip’s bandwidth, delivering faster speeds and higher memory capacity and making HBM a preferred choice for AI-related tasks.
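As a rough illustration of why each generation matters, per-stack bandwidth scales with the interface width and the per-pin data rate. The sketch below uses a 1,024-bit stack interface and approximate, commonly cited per-pin speeds; these figures are assumptions for illustration and are not taken from this article.

```python
# Rough per-stack bandwidth estimate: (interface width in bits * per-pin data rate in Gbps) / 8.
# The 1,024-bit stack width and the per-pin speeds below are approximate, commonly cited
# figures, used here only to illustrate how bandwidth grows across HBM generations.
INTERFACE_BITS = 1024

pin_speed_gbps = {   # approximate per-pin data rates by generation (assumed, not from the article)
    "HBM2":  2.4,
    "HBM2E": 3.6,
    "HBM3":  6.4,
}

for gen, gbps in pin_speed_gbps.items():
    bandwidth_gb_s = INTERFACE_BITS * gbps / 8
    print(f"{gen}: ~{bandwidth_gb_s:.0f} GB/s per stack")
# Prints roughly: HBM2 ~307 GB/s, HBM2E ~461 GB/s, HBM3 ~819 GB/s
```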

According to TrendForce, global tech giants involved in the cloud server business such as Google, Amazon, and Microsoft are increasingly adopting HBM, with HBM2E currently the predominant choice. Notably, NVIDIA’s “A100” and “A800” chips, as well as AMD’s “MI200,” all widely used in AI-related tasks, feature HBM2E. Both Samsung and SK hynix currently supply HBM2E to global customers and dominate the market.

In its annual report released on July 28 (local time), Microsoft highlighted a new risk factor in its cloud business, mentioning a GPU shortage for the first time and emphasizing the need to expand the supply of AI GPUs. Microsoft said, “Data center operations depend on predictable land, energy, networking supply and servers, including graphics processing units (GPUs).”

TrendForce anticipates that SK hynix and Samsung’s next-generation products, including HBM3 and HBM3E, will dominate the related chip market next year. Currently, SK hynix is the sole global producer of HBM3, which is integrated into NVIDIA’s “H100,” a leading AI chip.

However, with news that NVIDIA plans to launch its next-generation “GB100” chip in 2025, concerns about domestic companies’ HBM3 supply have grown. To meet the demand, Samsung and SK hynix are expected to release HBM3E samples in the first quarter of next year, with mass production beginning in the second half of 2024. HBM3E is built on fifth-generation 10-nanometer-class DRAM process technology. Micron, a U.S.-based semiconductor company, recently drew industry attention by announcing that it will independently develop HBM3E, declaring its entry into the HBM market led by SK hynix and Samsung. Nevertheless, the industry believes it will not be easy for Micron to catch up with the two companies’ technological capabilities in this field.

In addition, Double Data Rate 5 (DDR5), a core memory product that has emerged as a key player in the server market since the second half of this year, has seen a recent price rebound, bolstering expectations of an “earnings turnaround” for Samsung and SK hynix. According to TrendForce, the average fixed transaction price of DDR5 8GB products in July was US$15.30, up 3.13 percent from US$14.84 the previous month. The increase for DDR5 16GB products was even larger at 37.9 percent. The price of DDR5 8GB had declined steadily from US$44.70 in December 2021, but it recently showed signs of a rebound after about a year and a half.
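For readers checking the figures, here is a minimal sketch of the month-over-month change implied by the DDR5 8GB prices quoted above; the small gap versus the 3.13 percent cited in the text presumably comes from rounding of the underlying prices, and the 16GB figure is not recomputed here.

```python
# Month-over-month change in the DDR5 8GB average fixed transaction price,
# using the two prices quoted by TrendForce in the paragraph above.
june_price = 14.84   # US$, June average fixed transaction price
july_price = 15.30   # US$, July average fixed transaction price

change_pct = (july_price - june_price) / june_price * 100
print(f"DDR5 8GB month-over-month change: {change_pct:.2f}%")  # prints ~3.10%
```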

Copyright © BusinessKorea. Unauthorized reproduction and redistribution prohibited.