Artificial intelligence is highly dependent on the latest in computer processor technology.

The AI semiconductor market, currently concentrated on Nvidia and High Bandwidth Memory (HBM), is expected to diversify into DRAM and other areas moving forward.

On March 28, Kwon Seok-jun, a professor in the School of Chemical Engineering at Sungkyunkwan University, said at the 2024 ACE Semiconductor Press Briefing hosted by Korea Investment Trust Management, “The term ‘AI semiconductor’ is used broadly to refer to chips for various purposes, including learning, inference, creation, on-device, server, and cloud usage. As the market diversifies, the monopoly structure of Nvidia and HBM could be broken.” Professor Kwon is a leading semiconductor expert and the author of The Three Kingdoms of Semiconductors.

Professor Kwon predicted that demand for a variety of products, including DRAM, would increase, since no single universal chip can address all of AI’s diverse applications. He said, “The ‘Mach 1’ AI chip announced by Samsung Electronics and Naver does not incorporate HBM but will instead use Low Power DDR5 (LPDDR5) DRAM. LPDDR5 has lower bandwidth than HBM but is cheaper and consumes less power, so AI chips could diversify as models like this emerge.”

An improvement in the memory business is expected this year, with Samsung Electronics poised to see significant DRAM earnings thanks to its early investments. “If interest shifts away from HBM and the price gap between HBM and DRAM narrows from five to six times to two to three times, producing DRAM at a stable yield could become advantageous. Samsung Electronics, a latecomer in HBM, may also pursue strategic collaborations with companies like AMD,” Professor Kwon added.

Copyright © BusinessKorea. Unauthorized reproduction and redistribution are prohibited.