Control Just with Hand Motions

An example of applying the deep neural network processing unit (DNPU) to an intelligent pet robot. (Photo courtesy: IITP)

A South Korean research team succeeded in developing an artificial intelligence (AI) chip that can control robots with hand motions alone.

According to the Institute for Information & communications Technology Promotion (IITP; Director Lee Sang-hong) on August 28, a research team led by Professor Yoo Hoi-jun of KAIST has successfully developed the “deep neural network processing unit (DNPU),” a low-power deep-learning chip that can run AI systems on a variety of mobile platforms.

The DNPU is a low-power chip that integrates support for the multilayer perceptron (MLP), which handles classification according to its intended use, the convolutional neural network (CNN), which recognizes images, and the recurrent neural network (RNN), which recognizes changes in data over time.
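
To make the three workload types concrete, the following is a minimal, illustrative PyTorch sketch of an MLP, a CNN, and an RNN. The layer sizes and shapes are arbitrary examples chosen for the sketch and are not taken from the DNPU itself.

```python
# Illustrative sketch only: toy examples of the three network types the DNPU
# is described as accelerating. Sizes are arbitrary, not the chip's workloads.
import torch
import torch.nn as nn

# MLP: classification from a fixed-size feature vector
mlp = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# CNN: image recognition
cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10),
)

# RNN: data that changes over time (sequences)
rnn = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)

print(mlp(torch.randn(1, 128)).shape)        # torch.Size([1, 10])
print(cnn(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 10])
print(rnn(torch.randn(1, 20, 32))[0].shape)  # torch.Size([1, 20, 64])
```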

The team implemented the CNN on one dedicated accelerator and the MLP and RNN on another, and then integrated the two accelerators into a single chip, securing high-performance processing capability.
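
As a purely conceptual sketch, and not the DNPU’s actual programming interface, this split can be pictured as routing each layer of a mixed network to one of two on-chip blocks: a convolution accelerator for CNN layers and a matrix accelerator for fully connected (MLP) and recurrent (RNN) layers. The block names and routing function below are hypothetical.

```python
# Conceptual sketch only: routing layers to one of two hypothetical accelerator
# blocks, mirroring the article's description of a CNN accelerator and an
# MLP/RNN accelerator integrated on one chip.
import torch.nn as nn

def route_layer(layer: nn.Module) -> str:
    """Return which (hypothetical) on-chip block would execute this layer."""
    if isinstance(layer, nn.Conv2d):
        return "convolution accelerator"
    if isinstance(layer, (nn.Linear, nn.RNN, nn.LSTM, nn.GRU)):
        return "MLP/RNN accelerator"
    return "host processor"  # anything else stays off the accelerators

for layer in [nn.Conv2d(3, 16, 3), nn.Linear(64, 10), nn.LSTM(32, 64)]:
    print(type(layer).__name__, "->", route_layer(layer))
```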

Based on this, AI technologies such as object and action recognition and image captioning can be realized in real time at low power not only on mobile devices such as cellphones and robots but also on wearable and Internet of Things devices.

Meanwhile, the findings of the research, led by KAIST doctoral student Shin Dong-joo, were presented at the Hot Chips conference held in San Jose in the U.S. from August 20 to 22, where they attracted attention for the chip’s energy efficiency, which is up to four times higher than that of the tensor processing unit (TPU) at the core of Google’s AlphaGo.

Copyright © BusinessKorea. Unauthorized reproduction and redistribution prohibited.