An image illustrating what On-Device AI means

Following the rise of generative AI led by ChatGPT, the On-Device AI market is now opening up, drawing attention to a new type of memory semiconductor. On-Device AI refers to technology that implements AI functions within information technology (IT) devices such as smartphones, without relying on servers or the cloud.

According to industry sources on Nov. 13, Samsung Electronics is developing Low Latency Wide IO (LLW) DRAM, with mass production targeted for the end of next year. LLW is a special type of DRAM that achieves higher bandwidth than conventional mobile LPDDR by widening the input/output (I/O) pathways, the channels through which data enters and exits the chip. Since bandwidth is the product of I/O width and per-pin transmission speed, this type of DRAM is significantly more efficient at processing data that devices generate in real time.
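To make the relationship concrete, here is a minimal back-of-envelope sketch of how widening the I/O bus raises peak bandwidth. The figures are illustrative assumptions for an LPDDR5X-class interface and a hypothetical wide-I/O part, not published LLW specifications:

```python
# Peak DRAM bandwidth ~= bus width (bits) x per-pin data rate / 8 (bits -> bytes).
# All figures below are illustrative assumptions, not product specifications.

def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin rate in Gbit/s."""
    return bus_width_bits * pin_rate_gbps / 8

# LPDDR5X-class package: 64-bit bus at 8.5 Gbit/s per pin -> ~68 GB/s.
lpddr5x_like = peak_bandwidth_gb_s(64, 8.5)

# Hypothetical wide-I/O part: 512-bit bus at a slower 3 Gbit/s per pin -> ~192 GB/s.
wide_io_like = peak_bandwidth_gb_s(512, 3.0)

print(f"LPDDR5X-like: {lpddr5x_like:.0f} GB/s")
print(f"Wide-I/O-like: {wide_io_like:.0f} GB/s")
```

Note that the wide bus wins even at a slower per-pin rate, and running many pins at lower speed is generally more power-efficient than driving a few pins very fast, which is the appeal for battery-powered devices.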

Samsung Electronics has been developing lightweight On-Device AI algorithms since before 2020 and applying them to systems on chips (SoCs), memory, and sensors, strengthening its competitiveness in semiconductors for On-Device AI. The company is expected to begin capturing the market in earnest next year, starting with the deployment of its in-house generative AI, Samsung Gauss, in mobile products.

SK hynix is also set to supply its specialized DRAM for Apple’s next-generation augmented reality (AR) device, the Vision Pro, slated for release early next year. This DRAM works with Apple’s newly developed R1 chip for the Vision Pro to support real-time high-definition video processing. Apple switched to SK hynix’s high-bandwidth DRAM for the R1 chip during development, continuing a collaborative relationship between the two companies.

On-Device AI performs a variety of functions directly within IT devices such as smartphones, autonomous vehicles, and extended reality (XR) and augmented reality (AR) headsets, including speech recognition, document summarization, location awareness, and operational control. Unlike server AI, which processes complex computations through the cloud, On-Device AI performs hundreds of millions of operations on the device itself. To process large volumes of data quickly without consuming excessive power, improving the performance of the DRAM that supports these computations is essential.
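One way to see why DRAM performance gates on-device generative AI: producing each token with a generative language model typically requires streaming essentially all of the model’s weights through memory, so the decode rate is capped at roughly bandwidth divided by model size. A minimal sketch under that simplification (the model size, quantization, and bandwidth figures are illustrative assumptions, not device specifications):

```python
# Simplified model: LLM decoding is memory-bandwidth bound, so the ceiling is
# tokens/sec ~= memory bandwidth / bytes of weights streamed per token.
# All numbers are illustrative assumptions, not device specifications.

def decode_ceiling_tokens_s(params_billion: float, bytes_per_param: float,
                            bandwidth_gb_s: float) -> float:
    """Upper-bound token rate if each token reads every weight once."""
    model_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# A 3B-parameter model quantized to 4 bits (0.5 bytes per parameter):
for bw in (68, 192):  # GB/s: LPDDR-class vs. wide-I/O-class bandwidth
    rate = decode_ceiling_tokens_s(3.0, 0.5, bw)
    print(f"{bw} GB/s -> ceiling of ~{rate:.0f} tokens/s")
```

Under these assumptions the wide-I/O configuration nearly triples the achievable token rate, which is why memory bandwidth, rather than raw compute, often determines how responsive on-device generative AI feels.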

Industry insiders expect that, following HBM, the expansion of the LLW DRAM market will herald the beginning of an era of customized memory. Because On-Device AI-equipped devices vary widely and each requires different functions, close collaboration with customers from the development stage is needed to determine production methods and volumes. Moving away from mass-producing a small variety of products, memory manufacturers can operate an order-based business model, maintaining pricing power and securing stable earnings.

Copyright © BusinessKorea. Unauthorized reproduction and redistribution prohibited.