Time Series Modeling by Perceptrons: A Likelihood Approach
Type:
Conference Paper › Invited and refereed articles in conference proceedings
Authored by:
Sonmez, Kemal M., Baras, John S.
Conference date:
July 1993
Conference:
The World Congress on Neural Networks, Vol. IV, pp. 601-604
Full Text Paper:
Abstract:
We consider neural network learning problems in which the objective is to learn the relationship between the inputs and the probability distribution of a proposition. We regard successive truth values of the proposition as a dependent binary time series whose instantaneous probability of truth is a function of the past behavior of the joint process of the analog inputs and the binary output truth values. In this context, we identify the gradient descent learning algorithm using the Kullback-Leibler relative entropy cost function on a perceptron with a Maximum Partial Likelihood (MPL) estimator of the perceptron model for the probability of a binary event in terms of its covariates. The implications of this result are: (i) the neural network models obtained by relative entropy learning are shown to have the desirable large-sample (i.e., training set size) properties of MPL estimates: consistency and asymptotic normality; (ii) logistic regression, an important and widely used statistical inference technique, can be efficiently implemented on analog perceptrons for time series modeling and prediction.
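
To make the identification in the abstract concrete, here is one standard way to write it; the notation below is assumed for illustration and is not taken from the paper. With covariate vector x_t collecting past inputs and past truth values, and a sigmoidal perceptron output, the partial log-likelihood of the binary series is

\[
p_\theta(x_t) = \sigma(\theta^\top x_t) = \frac{1}{1 + e^{-\theta^\top x_t}},
\qquad
L_n(\theta) = \sum_{t=1}^{n} \Big[ y_t \log p_\theta(x_t) + (1 - y_t)\log\big(1 - p_\theta(x_t)\big) \Big].
\]

The Kullback-Leibler relative entropy cost between the targets y_t and the perceptron outputs equals -L_n(\theta) up to a term independent of \theta, so gradient descent on the cost is gradient ascent on the partial log-likelihood, with

\[
\nabla_\theta L_n(\theta) = \sum_{t=1}^{n} \big( y_t - \sigma(\theta^\top x_t) \big)\, x_t .
\]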
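
The following is a minimal sketch, not the authors' implementation, of the resulting procedure: logistic regression for a dependent binary series fitted by gradient descent on the cross-entropy (KL) cost, with covariates built from lagged inputs and lagged truth values. The function names and the lag orders p, q are assumptions made for this example.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_covariates(u, y, p=2, q=2):
    # Build x_t = (1, u_{t-1}, ..., u_{t-p}, y_{t-1}, ..., y_{t-q}) and targets y_t.
    start = max(p, q)
    X, targets = [], []
    for t in range(start, len(y)):
        row = np.concatenate(([1.0], u[t - p:t][::-1], y[t - q:t][::-1]))
        X.append(row)
        targets.append(y[t])
    return np.array(X), np.array(targets)

def fit_mpl(X, y, lr=0.1, iters=2000):
    # Gradient descent on the KL / cross-entropy cost, i.e. gradient ascent
    # on the partial log-likelihood: grad = sum_t (y_t - sigma(theta.x_t)) x_t.
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        p_hat = sigmoid(X @ theta)
        theta += lr * X.T @ (y - p_hat) / len(y)
    return theta

# Usage on synthetic data: simulate a binary series driven by past inputs
# and past outputs, then recover the parameters (theta_true is hypothetical).
rng = np.random.default_rng(0)
T = 500
u = rng.normal(size=T)
y = np.zeros(T)
theta_true = np.array([-0.5, 1.5, 0.0, 0.8, 0.0])  # bias, u lags, y lags
for t in range(2, T):
    x = np.array([1.0, u[t - 1], u[t - 2], y[t - 1], y[t - 2]])
    y[t] = rng.random() < sigmoid(x @ theta_true)

X, targets = make_covariates(u, y)
theta_hat = fit_mpl(X, targets)
print(theta_hat)  # approaches theta_true as T grows, reflecting MPL consistency

The print at the end illustrates the large-sample property claimed in the abstract: as the training set size grows, the relative-entropy-trained perceptron weights behave like a consistent, asymptotically normal MPL estimate.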