TY - GEN
T1 - On the error and parameter convergence of back-propagation learning
AU - Chen, Fu-Chuang
PY - 1991/11/18
Y1 - 1991/11/18
N2 - The author presents a convergence result based on a modified backpropagation training rule, which is the same as the standard backpropagation algorithm except that a dead-zone around the origin of the error coordinates is incorporated in the training rule. It is shown that, if the network modeling error and the initial parameter errors are small enough, then the norm of the parameter error will converge to a constant, the increment of network parameters will converge to zero, and the output error between the network and the nonlinear function will converge into a small ball. Simulations are used to verify the theoretical results.
UR - http://www.scopus.com/inward/record.url?scp=0026298236&partnerID=8YFLogxK
DO - 10.1109/IJCNN.1991.170542
M3 - Conference contribution
AN - SCOPUS:0026298236
SN - 0780302273
T3 - 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
SP - 1092
EP - 1097
BT - 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
PB - IEEE
T2 - 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
Y2 - 18 November 1991 through 21 November 1991
ER -