Mitigating Asymmetric Nonlinear Weight Update Effects in Hardware Neural Network Based on Analog Resistive Synapse

Chih Cheng Chang, Pin Chun Chen, Teyuh Chou, I. Ting Wang, Boris Hudec, Che Chia Chang, Chia-Ming Tsai, Tian-Sheuan Chang, Tuo-Hung Hou*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Asymmetric nonlinear weight update is considered one of the major obstacles to realizing hardware neural networks based on analog resistive synapses, because it significantly compromises the online training capability. This paper provides new solutions to this critical issue through co-optimization with hardware-applicable deep-learning algorithms. New insights into engineering activation functions and a threshold weight update scheme effectively suppress the undesirable training noise induced by inaccurate weight updates. We successfully trained a two-layer perceptron network online and improved the classification accuracy on the MNIST handwritten digit data set to 87.8%/94.8% using 6-/8-b analog synapses, respectively, despite extremely high asymmetric nonlinearity.
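The asymmetric nonlinear update behavior and the threshold weight update scheme mentioned in the abstract can be illustrated with the commonly used saturating conductance-update model. A minimal sketch follows; the function names, the parameter values (`alpha_p`, `alpha_d`, `theta`), and the specific update equations are illustrative assumptions for exposition, not the paper's actual formulation:

```python
import numpy as np

def pulse_update(w, direction, alpha_p=1.0, alpha_d=5.0):
    """Apply one potentiation (direction > 0) or depression (direction < 0)
    pulse to a normalized weight w in [0, 1].

    The step size shrinks as w approaches its bound (nonlinearity), and
    alpha_p != alpha_d makes potentiation and depression unequal (asymmetry).
    All constants here are illustrative, not from the paper.
    """
    if direction > 0:
        # Potentiation: saturates toward 1
        return w + (1.0 - np.exp(-alpha_p)) * (1.0 - w)
    else:
        # Depression: saturates toward 0, with a stronger nonlinearity
        return w - (1.0 - np.exp(-alpha_d)) * w

def threshold_update(w, delta, theta=0.05):
    """Threshold weight-update scheme: skip updates whose requested
    change |delta| falls below theta, so small noisy gradient signals
    do not trigger inaccurate device updates."""
    if abs(delta) < theta:
        return w  # update suppressed
    return pulse_update(w, np.sign(delta))
```

In this toy model, a depression pulse near mid-range moves the weight much farther than a potentiation pulse, which is the kind of asymmetry the paper identifies as training noise; the threshold scheme limits how often such inaccurate updates are applied.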

Original language: English
Pages (from-to): 116-124
Number of pages: 9
Journal: IEEE Journal on Emerging and Selected Topics in Circuits and Systems
Volume: 8
Issue number: 1
DOIs
State: Published - 1 Mar 2018

Keywords

  • multilayer perceptron
  • neuromorphic computing
  • RRAM
  • synapse
