Interworking technology of neural network and data among deep learning frameworks


Bibliographic Details
Main Authors: Jaebok Park, Seungmok Yoo, Seokjin Yoon, Kyunghee Lee, Changsik Cho
Format: Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI), 2019-09-01
Series: ETRI Journal
Subjects:
ai
cnn
Online Access: https://doi.org/10.4218/etrij.2018-0135
Description
Summary: Based on the growing demand for neural network technologies, various neural network inference engines are being developed. However, each inference engine has its own neural network storage format, and there is a growing demand for standardization to solve this problem. This study presents interworking techniques for ensuring the compatibility of neural networks and data among the various deep learning frameworks. The proposed technique standardizes the graph expression grammar and the learning-data storage format using the Neural Network Exchange Format (NNEF) of Khronos. The proposed converter includes a lexical analyzer and a syntax analyzer (parser). This NNEF parser converts neural network information into a parse tree and quantizes the data. To validate the proposed system, we verified that the MNIST example executes immediately after importing AlexNet's neural network and learned data. Therefore, this study contributes an efficient design technique for a converter that can execute a neural network and learned data in various frameworks, regardless of the storage format of each framework.
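The converter pipeline described in the abstract (lexical analysis, syntax analysis into a parse tree, then quantization of the learned data) can be illustrated with a minimal sketch. The grammar subset, function names, and int8 quantization scheme below are illustrative assumptions, not the actual NNEF specification or the authors' implementation.

```python
import re

# Hypothetical token pattern for a tiny NNEF-like statement grammar:
# identifiers, punctuation, and numeric literals.
TOKEN_RE = re.compile(r"\s*([A-Za-z_]\w*|=|\(|\)|,|;|\[|\]|[-+]?\d+(?:\.\d+)?)")

def tokenize(text):
    """Lexical analysis: split NNEF-like source into a flat token list."""
    text = text.strip()
    pos, tokens = 0, []
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if not m:
            raise SyntaxError(f"unexpected character at {pos}: {text[pos]!r}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

def parse(tokens):
    """Syntax analysis: turn 'name = op(arg, ...);' statements into tree nodes."""
    nodes, i = [], 0
    while i < len(tokens):
        name, eq, op, lparen = tokens[i:i + 4]
        assert eq == "=" and lparen == "(", "expected 'name = op(...)'"
        i += 4
        args = []
        while tokens[i] != ")":
            if tokens[i] != ",":        # commas only separate arguments
                args.append(tokens[i])
            i += 1
        assert tokens[i + 1] == ";", "statements must end with ';'"
        i += 2
        nodes.append({"name": name, "op": op, "args": args})
    return nodes

def quantize_int8(weights):
    """Symmetric linear quantization of float weights to int8 plus a scale."""
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    return [round(w / scale) for w in weights], scale
```

For example, `parse(tokenize("x = external(input); y = conv(x, w);"))` yields two parse-tree nodes, the second being `{"name": "y", "op": "conv", "args": ["x", "w"]}`; an importer would then map each node's `op` onto the target framework's corresponding layer and attach the quantized weights.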
ISSN: 1225-6463