On the learning and convergence of the radial basis networks

Fu-Chuang Chen, Mao Hsing Lin

Research output: Conference contribution › Peer-reviewed

9 citations (Scopus)

Abstract

Although radial basis networks have been shown to be able to model any "well-behaved" nonlinear function to any desired accuracy, there is no guarantee that the correct network weights can be learned using any existing training rule. This paper reports a convergence result for training radial basis networks based on a modified gradient descent training rule, which is the same as the standard gradient descent algorithm except that a deadzone around the origin of the error coordinates is incorporated into the training rule. The result states that, if the deadzone size is large enough to cover the modeling error and if the learning rate is selected within a certain range, then the norm of the parameter error will converge to a constant, and the output error between the network and the nonlinear function will converge into a small ball. Simulations are used to verify the theoretical results.
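The deadzone-modified rule described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the Gaussian basis, the restriction to adapting only the linear output weights, and all function names and parameter values are assumptions. The key point is that a gradient step is taken only when the output error lies outside the deadzone.

```python
import numpy as np

def rbf_output(x, centers, widths, weights):
    """RBF network output: weighted sum of Gaussian basis functions."""
    phi = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))
    return weights @ phi

def train_deadzone(xs, ys, centers, widths, lr=0.05, deadzone=0.1, epochs=200):
    """Gradient descent on the output weights with a deadzone:
    the weights are updated only when |error| exceeds the deadzone,
    so adaptation stops once the error is as small as the assumed
    modeling-error bound (hypothetical sketch of the modified rule)."""
    weights = np.zeros(len(centers))
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            phi = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))
            e = weights @ phi - y
            if abs(e) > deadzone:          # inside the deadzone: no update
                weights -= lr * e * phi    # standard gradient step otherwise
    return weights
```

Under the paper's conditions (deadzone larger than the modeling error, learning rate in the admissible range), such a rule drives the output error into a ball whose radius is tied to the deadzone size, rather than to zero.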

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks, ICNN 1993
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 983-988
Number of pages: 6
ISBN (electronic): 0780309995
DOIs
Publication status: Published - 1 Jan 1993
Event: IEEE International Conference on Neural Networks, ICNN 1993 - San Francisco, United States
Duration: 28 Mar 1993 → 1 Apr 1993

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
Volume: 1993-January
ISSN (print): 1098-7576

Conference

Conference: IEEE International Conference on Neural Networks, ICNN 1993
Country/Territory: United States
City: San Francisco
Period: 28/03/93 → 1/04/93
