ChatGPT Boosts Demand for High-performance Memories

ChatGPT is expected to change the landscape of the global memory semiconductor market.

The advent of ChatGPT, an artificial intelligence (AI) chatbot, is opening up new business opportunities for Korean memory semiconductor makers. ChatGPT is trained on vast amounts of data with a super-large AI model and answers questions in natural language. DRAM data processing speed has become important for better and faster ChatGPT services, and Korean companies produce the high-performance DRAMs essential for such services.

Since the beginning of this year, Samsung Electronics and SK Hynix have seen a surge in orders for high bandwidth memories (HBMs). HBMs significantly increase data processing speeds compared to other DRAMs by vertically stacking several DRAM dies. They work together with central processing units (CPUs) and graphics processing units (GPUs) and can greatly improve the learning and computation performance of servers.
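The bandwidth advantage of that stacked design can be sketched with a back-of-the-envelope calculation. The pin rates and bus widths below are typical published figures for HBM3 and a standard DDR5-4800 channel, used here purely for illustration; they are not taken from the article:

```python
# Peak bandwidth (GB/s) = per-pin data rate (Gb/s) * bus width (bits) / 8.
# Stacking dies lets HBM expose a very wide 1024-bit interface, so even at a
# lower per-pin rate it far outpaces a conventional 64-bit DRAM channel.

def peak_bandwidth_gbps(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s for one memory interface."""
    return pin_rate_gbps * bus_width_bits / 8

hbm3 = peak_bandwidth_gbps(6.4, 1024)   # one HBM3 stack (assumed spec)
ddr5 = peak_bandwidth_gbps(4.8, 64)     # one DDR5-4800 channel (assumed spec)

print(f"HBM3 stack:   {hbm3:.1f} GB/s")   # 819.2 GB/s
print(f"DDR5 channel: {ddr5:.1f} GB/s")   # 38.4 GB/s
print(f"Ratio: {hbm3 / ddr5:.0f}x")       # 21x
```

Under these assumptions a single HBM3 stack delivers roughly twenty times the peak bandwidth of a conventional DRAM channel, which is why AI servers pair HBM with GPUs despite the higher price.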

Until now, despite their excellent performance, HBMs have had fewer applications than general DRAMs. This is because HBMs’ average selling price (ASP) is at least three times that of conventional DRAM; HBMs require a complex production process and highly advanced technology. The expansion of AI services has turned the tide.

Nvidia, the world's largest GPU company, has been asking SK Hynix to supply its latest product, HBM3 chips. Intel, the world's No. 1 server CPU company, is also working hard to sell products equipped with SK Hynix's HBM3. An industry insider said, “The price of HBM3 has risen to as much as five times that of the highest-performance DRAM.”

As the high-performance memory semiconductor market is expected to grow rapidly, product development competition is heating up between Samsung Electronics and SK Hynix. The HBM market is still nascent, as HBMs began to go into AI servers in earnest only in 2022, but SK Hynix and Samsung Electronics are focusing on securing customers through new product launches.

SK Hynix is taking the lead in the HBM market. It developed the world's first HBM in cooperation with AMD in 2013. The Korean chipmaker has since released a first-generation HBM (HBM), a second-generation HBM (HBM2), a third-generation HBM (HBM2E), and a fourth-generation HBM (HBM3), and has secured a market share of 60-70 percent.

In February 2021, Samsung Electronics developed HBM-PIM, which combines a memory semiconductor and an AI processor into one chip, in collaboration with AMD. When an HBM-PIM chip is installed alongside a CPU or a GPU, it can significantly increase the server's calculation speed. SK Hynix also unveiled a product solution powered by PIM technology in February 2022.

In the mid- to long-term, experts predict that the development of AI-specific DRAMs such as HBM will bring a big change to the semiconductor industry. “The era when memory semiconductor companies were busy developing ultra-micro fabrication processes has passed,” said an official of the Korean semiconductor industry. “The development of AI semiconductor technology that not only stores data efficiently but can also process it will become so important that it will decide the future of chipmakers.”

Copyright © BusinessKorea. Unauthorized reproduction and redistribution prohibited.