On the error and parameter convergence of back-propagation learning

Fu-Chuang Chen*

*Corresponding author for this work

Research output: Conference contribution › Peer-reviewed

4 Citations (Scopus)

Abstract

The author presents a convergence result based on a modified backpropagation training rule, which is the same as the standard backpropagation algorithm except that a dead-zone around the origin of the error coordinates is incorporated in the training rule. It is shown that, if the network modeling error and the initial parameter errors are small enough, then the norm of the parameter error will converge to a constant, the increment of network parameters will converge to zero, and the output error between the network and the nonlinear function will converge into a small ball. Simulations are used to verify the theoretical results.
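
The dead-zone modification described above can be illustrated with a minimal sketch: a standard gradient (LMS-style) update is applied only when the output error leaves a small zone around the origin, and the parameters are frozen otherwise. The threshold `delta`, the learning rate, and the single-linear-neuron setting are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def dead_zone_step(w, x, y_target, lr=0.1, delta=0.05):
    """One training update: skip the gradient step inside the dead-zone.

    `delta` is a hypothetical dead-zone radius around zero output error.
    """
    e = float(np.dot(w, x) - y_target)  # output error of the linear neuron
    if abs(e) <= delta:                 # inside the dead-zone: no parameter change
        return w
    return w - lr * e * x               # otherwise, standard gradient (LMS) step

# Usage: fit w toward a noise-free linear target; updates stop once the
# error stays inside the dead-zone, so w converges into a small ball
# around w_true rather than exactly onto it.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
w = np.zeros(2)
for _ in range(500):
    x = rng.normal(size=2)
    w = dead_zone_step(w, x, np.dot(w_true, x))
```

This mirrors the qualitative behavior stated in the abstract: parameter increments go to zero once errors fall inside the dead-zone, and the output error converges into a small ball whose size is set by the dead-zone radius.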

Original language: English
Title of host publication: 91 IEEE Int Jt Conf Neural Networks IJCNN 91
Publisher: Publ by IEEE
Pages: 1092-1097
Number of pages: 6
ISBN (Print): 0780302273
DOIs
Publication status: Published - 18 Nov 1991
Event: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91 - Singapore, Singapore
Duration: 18 Nov 1991 - 21 Nov 1991

Publication series

Name: 91 IEEE Int Jt Conf Neural Networks IJCNN 91

Conference

Conference: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
City: Singapore, Singapore
Period: 18/11/91 - 21/11/91
