On the error and parameter convergence of back-propagation learning

Fu-Chuang Chen*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

The author presents a convergence result for a modified backpropagation training rule, which is identical to the standard backpropagation algorithm except that a dead zone around the origin of the error coordinates is incorporated into the update. It is shown that, if the network modeling error and the initial parameter errors are small enough, then the norm of the parameter error converges to a constant, the increment of the network parameters converges to zero, and the output error between the network and the nonlinear function converges into a small ball. Simulations are used to verify the theoretical results.
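
A minimal sketch of the dead-zone idea described in the abstract: standard backpropagation on a one-hidden-layer network, except that the parameter increment is set to exactly zero whenever the output error falls inside a dead zone around the origin. The network size, learning rate, target function, and the exact form of the dead-zone test are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer network approximating a scalar function (sizes are assumptions)
n_hidden = 10
W1 = rng.normal(scale=0.5, size=(n_hidden, 1))   # input-to-hidden weights
b1 = np.zeros((n_hidden, 1))
W2 = rng.normal(scale=0.5, size=(1, n_hidden))   # hidden-to-output weights
b2 = np.zeros((1, 1))

def forward(x):
    h = np.tanh(W1 @ x + b1)     # hidden activations
    y = W2 @ h + b2              # linear output layer
    return h, y

target = np.sin                  # stand-in for the unknown nonlinear function
eta = 0.05                       # learning rate (assumed)
dead_zone = 0.05                 # dead-zone radius in error coordinates (assumed)

for step in range(20000):
    x = rng.uniform(-np.pi, np.pi, size=(1, 1))
    h, y = forward(x)
    e = y - target(x)            # output error

    # Dead-zone rule: inside the dead zone, the parameter increment is zero.
    if abs(e.item()) <= dead_zone:
        continue

    # Otherwise, standard backpropagation for the squared error 0.5 * e^2.
    dy = e                                   # dL/dy
    dW2 = dy @ h.T
    db2 = dy
    dh = W2.T @ dy
    dz = dh * (1.0 - h ** 2)                 # tanh'(z) = 1 - tanh(z)^2
    dW1 = dz @ x.T
    db1 = dz

    W2 -= eta * dW2
    b2 -= eta * db2
    W1 -= eta * dW1
    b1 -= eta * db1

# After training, output errors on fresh inputs should sit near the dead zone,
# consistent with the "small ball" behavior the abstract describes.
xs = np.linspace(-np.pi, np.pi, 5).reshape(1, -1)
print(np.abs(forward(xs)[1] - target(xs)))
```

Because updates stop inside the dead zone, noise-sized errors no longer drive the weights, which is what allows the parameter increments to die out rather than chatter indefinitely.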

Original language: English
Title of host publication: 91 IEEE Int Jt Conf Neural Networks IJCNN 91
Publisher: IEEE
Pages: 1092-1097
Number of pages: 6
ISBN (Print): 0780302273
DOIs
State: Published - 18 Nov 1991
Event: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91 - Singapore, Singapore
Duration: 18 Nov 1991 - 21 Nov 1991

Publication series

Name: 91 IEEE Int Jt Conf Neural Networks IJCNN 91

Conference

Conference: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
City: Singapore, Singapore
Period: 18/11/91 - 21/11/91

