Samsung Electronics HBM3E 12H DRAM product image

Samsung Electronics has made a decisive move to reshape the market dynamics of High Bandwidth Memory (HBM), a key component of artificial intelligence semiconductors, by moving to become the first in the industry to mass-produce the 5th generation HBM, known as “HBM3E,” in a 12-layer configuration in the first half of this year.

On Feb. 27, Samsung announced the development of HBM3E 12H, which uses Through-Silicon Via (TSV) technology to stack 12 layers of 24 Gb DRAM chips. The product delivers a maximum bandwidth of 1,280 GB per second and the industry’s highest capacity of 36 GB, enough bandwidth to transfer about 40 ultra-HD movies of 30 GB each in a single second. Both performance and capacity have improved by more than 50% compared to its predecessor, the HBM3 8H.
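As a rough sanity check, the headline figures quoted above are internally consistent: 12 dies of 24 Gb each yield 36 GB, and at 1,280 GB per second roughly 40 movies of 30 GB apiece could be moved every second. The short Python sketch below only reproduces that arithmetic; the 24 GB capacity used for the HBM3 8H comparison is the commonly cited figure for that part and is not stated explicitly in the article.

```python
# Back-of-envelope check of the figures quoted in the article.
# Illustrative arithmetic only, not a benchmark; the HBM3 8H capacity
# of 24 GB is an assumption, not taken from the article text.

layers = 12
die_capacity_gbit = 24                        # each DRAM die holds 24 Gb
stack_capacity_gbyte = layers * die_capacity_gbit / 8
print(f"HBM3E 12H capacity: {stack_capacity_gbyte:.0f} GB")          # 36 GB

peak_bandwidth_gbyte_s = 1280                 # maximum bandwidth, GB/s
movie_size_gbyte = 30                         # one ultra-HD movie
movies_per_second = peak_bandwidth_gbyte_s / movie_size_gbyte
print(f"Movies per second: {movies_per_second:.1f}")                 # ~42.7, i.e. about 40

hbm3_8h_capacity_gbyte = 24                   # assumed predecessor capacity (8 x 24 Gb dies)
gain = stack_capacity_gbyte / hbm3_8h_capacity_gbyte - 1
print(f"Capacity gain over HBM3 8H: {gain:.0%}")                     # 50%, in line with the cited improvement
```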

HBM is a product that stacks individual DRAM dies vertically to increase data processing speed and power efficiency. One challenge has been heat: as more dies are stacked, the stack grows thicker and thermal control becomes more difficult. Another has been physical warpage, because each die must be made thinner to fit the stack. To overcome these challenges, Samsung adopted Advanced Thermal Compression Non-Conductive Film (Advanced TC NCF), a technique that places a non-conductive adhesive film between the stacked DRAM dies and bonds them under heat and pressure, minimizing warpage and enabling higher stacks.

Despite having 12 layers, the product is the same height as the previous 8-layer version. Samsung achieved this by continuously reducing the thickness of the NCF material, reaching the industry’s smallest gap between chips at 7 micrometers (µm) and improving vertical integration density by more than 20% compared to the HBM3 8H.

Server operators adopting HBM3E 12H can expect a lower total cost of ownership (TCO), since fewer graphics processing units (GPUs) are needed for the same workload. In server systems, HBM3E 12H can increase AI training speed by an average of 34% compared to HBM3 8H and, for inference, support up to 11.5 times more AI service users.

Samsung has already sent samples of the HBM3E 12H to major clients, with mass production scheduled for the first half of this year. The successful development is seen as a pivotal moment in Samsung’s bid to take the lead in the next-generation HBM market. SK hynix and Micron have also completed development of HBM3E and announced mass production for the first half of the year, but their products are all 8-layer; to date, Samsung is the only company to have announced the successful development of a 12-layer product.

This year, Samsung aims to increase its HBM supply volume by more than 2.5 times compared to last year, with significant investment in facilities. The plan is to quickly increase market share by leveraging stable production capacity in an HBM market where supply falls short of demand.

Copyright © BusinessKorea. Unauthorized reproduction and redistribution prohibited.