An HBM3E chip from SK hynix

SK hynix has already sold out all of the HBM3E, a memory chip for artificial intelligence (AI), that it plans to produce starting in 2024.

“We have sold out all the production volume of not only HBM3 but also HBM3E in 2023,” an SK hynix official said in a conference call announcing its third-quarter earnings on Oct. 26, adding that the Korean chipmaker is receiving additional demand from customers. He added that the company is also in talks with customers over production volume for 2025.

High bandwidth memory (HBM) is a memory semiconductor for AI servers whose demand has grown explosively. The latest product, HBM3E, is an upgrade of HBM3 and is expected to be used in the GH200, an AI chip that Nvidia plans to mass-produce next year.

SK hynix also expects that expanding demand for HBM3E will not pull down the average selling price (ASP) of the existing HBM3, because overall DRAM demand is expected to recover in earnest from 2024 and demand for HBM3 remains strong.

Accordingly, SK hynix plans to increase its investment in 2024 from 2023 levels to respond to the expanding demand, while keeping the increase to a minimum by prioritizing where it invests. This year’s investment shrank by 50 percent from the previous year due to the slump in the semiconductor industry.

For the third quarter, SK hynix reported sales of 9.622 trillion won (US$7.088 billion) and an operating loss of 1.792 trillion won. The loss narrowed by about 1 trillion won from the second quarter as its DRAM business returned to profit.

Copyright © BusinessKorea. Unauthorized reproduction and redistribution prohibited.