M-ARY quantized neural networks

Jen-Tzung Chien, Su Ting Chang

Research output: Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Parameter quantization is crucial for model compression. This paper generalizes binary and ternary quantization to M-ary quantization for adaptive learning of quantized neural networks. To compensate for the performance loss, the representation values and the quantization partitions of the model parameters are jointly trained to optimize the resolution of the gradients used for parameter updating, where the non-differentiable quantization function in the back-propagation algorithm is tackled. An asymmetric quantization is implemented, so the restriction on parameter quantization is substantially relaxed. The resulting M-ary quantization scheme is general and adapts to different values of M. Training of the M-ary quantized neural network (MQNN) can be tuned to balance the tradeoff between system performance and memory storage. Experimental results show that MQNN achieves image classification performance comparable to that of a full-precision neural network (FPNN), while requiring far less memory storage than the FPNN.
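To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the forward pass of an M-ary, asymmetric quantizer: each weight is mapped to the nearest of M representation values. In the paper both the representation values and the partitions are learned jointly; here, as a simplifying assumption, the partition is the nearest-neighbour (midpoint) partition induced by the values, and the values are given rather than trained. The function name `m_ary_quantize` is hypothetical.

```python
import numpy as np

def m_ary_quantize(w, values):
    """Map each weight to the nearest of M representation values.

    `values` stands in for the paper's learned representation values.
    Because the values need not be symmetric about zero, this realizes
    an asymmetric quantization. The induced partition is the
    nearest-neighbour one (boundaries at midpoints between values),
    a simplification of the jointly trained partitions in the paper.
    """
    values = np.sort(np.asarray(values, dtype=float))
    # Distance from every weight to every representation value,
    # via broadcasting; pick the closest value per weight.
    idx = np.abs(np.asarray(w, dtype=float)[..., None] - values).argmin(axis=-1)
    return values[idx]

# M = 4 (2-bit) example with asymmetric representation values.
w = np.array([-0.9, -0.2, 0.05, 0.4, 1.1])
print(m_ary_quantize(w, [-0.8, -0.1, 0.3, 1.0]))  # → [-0.8 -0.1 -0.1  0.3  1. ]
```

During training, the quantizer's zero gradient would be bypassed with a surrogate such as the straight-through estimator, which is the usual way the non-differentiability mentioned in the abstract is handled.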

Original language: English
Title of host publication: 2020 IEEE International Conference on Multimedia and Expo, ICME 2020
Publisher: IEEE Computer Society
Number of pages: 6
ISBN (Electronic): 9781728113319
ISBN (Print): 978-1-7281-1332-6
DOIs
Publication status: Published - Jul 2020
Event: 2020 IEEE International Conference on Multimedia and Expo, ICME 2020 - London, United Kingdom
Duration: 6 Jul 2020 → 10 Jul 2020

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
Volume: 2020-July
ISSN (Print): 1945-7871
ISSN (Electronic): 1945-788X

Conference

Conference: 2020 IEEE International Conference on Multimedia and Expo, ICME 2020
Country/Territory: United Kingdom
City: London
Period: 6/07/20 → 10/07/20
