Convergence analysis for a class of neural networks
Date: 1992
ISBN: 0-7803-0164-1
Publisher: Publ by IEEE
Source: Proceedings. IJCNN - International Joint Conference on Neural Networks (IJCNN-91-Seattle)
Abstract
Summary form only given, as follows. The authors consider the convergence issues that arise when backpropagation algorithms are applied to a special class of neural network architectures, referred to as structured networks, which are used for solving matrix algebra problems. They develop bounds on the learning rate under which exponential convergence of the training procedure is guaranteed, and they investigate methods for improving the rate of convergence. For a special class of problems, they introduce the orthogonalized backpropagation algorithm, an optimal recursive update law for minimizing a least-squares cost functional, which guarantees exact convergence in one epoch. The results provide insight into neural network learning and unify certain learning procedures used by connectionists and by adaptive control theorists.
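The abstract's learning-rate bound can be illustrated with a minimal sketch; this is not the authors' structured-network algorithm, but the standard gradient-descent setting it builds on. For the least-squares cost 0.5 * ||W X - Y||^2, the error contracts geometrically (i.e., exponentially) whenever the step size stays below 2 / lambda_max(X X^T). All variable names, dimensions, and the specific step size below are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy matrix-algebra task: find W minimizing 0.5 * ||W @ X - Y||_F^2.
X = rng.standard_normal((4, 8))   # inputs (4-dim, 8 samples)
Y = rng.standard_normal((3, 8))   # targets (3-dim, 8 samples)
W = np.zeros((3, 4))              # weights to be learned

# The gradient iteration W <- W - lr * (W X - Y) X^T converges
# exponentially when lr < 2 / lambda_max(X X^T); we pick a step
# size safely inside that bound.
lmax = np.linalg.eigvalsh(X @ X.T).max()
lr = 1.0 / lmax

for _ in range(5000):
    W -= lr * (W @ X - Y) @ X.T

# Closed-form least-squares optimum for comparison.
W_star = Y @ X.T @ np.linalg.inv(X @ X.T)
err = np.linalg.norm(W - W_star)
print(err)  # geometric decay drives this toward zero
```

By contrast, the one-epoch exact convergence claimed for the orthogonalized algorithm corresponds, in this toy setting, to reaching `W_star` directly rather than iterating, which is what an update law that decorrelates (orthogonalizes) the input directions achieves.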