Convergence of a Neural Network Classifier

Type:
Conference Paper (invited and refereed articles in conference proceedings)
Authored by:
Baras, John S.; LaVigna, Anthony
Conference date:
December 1990
Conference:
The 29th Conference on Decision and Control (CDC), pp. 1735-1740
Abstract: 

In this paper, we show that the LVQ learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. We show that the learning algorithm performs stochastic approximation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. We also present a modification to the learning algorithm which, we argue, results in convergence of LVQ for a larger set of initial conditions. Finally, we show that LVQ is a general histogram classifier and that its risk converges to the Bayesian optimal risk as the appropriate parameters grow to infinity with the number of past observations.
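For context, the learning rule analyzed in the paper is the standard LVQ update: the Voronoi vector nearest to a training sample is moved toward the sample when their class labels agree and away from it when they disagree. The sketch below illustrates one such update step; the function name, array layout, and fixed learning rate are illustrative assumptions, not the paper's exact formulation (which lets the gain sequence decay to satisfy the stochastic-approximation conditions).

```python
import numpy as np

def lvq_step(vectors, labels, x, y, lr):
    """One LVQ update (illustrative sketch).

    vectors : (k, d) array of Voronoi (codebook) vectors
    labels  : length-k sequence of class labels for the vectors
    x, y    : training sample and its class label
    lr      : learning rate (a decaying gain sequence in the paper's analysis)

    Moves the nearest Voronoi vector toward x if its label matches y,
    away from x otherwise. Returns the index of the updated vector.
    """
    dists = np.linalg.norm(vectors - x, axis=1)   # distance to each vector
    w = int(np.argmin(dists))                     # nearest (winning) vector
    sign = 1.0 if labels[w] == y else -1.0        # attract on match, repel on mismatch
    vectors[w] += sign * lr * (x - vectors[w])
    return w

# Example: the nearest vector (index 0) shares the sample's label,
# so it is pulled toward the sample.
V = np.array([[0.0, 0.0], [1.0, 1.0]])
w = lvq_step(V, [0, 1], x=np.array([0.2, 0.0]), y=0, lr=0.5)
```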