1 August 1990 Enhanced neural net learning algorithms for classification problems
This paper considers the application of a "global" optimization scheme to the training of multilayer perceptrons for signal classification. The study is motivated by the fact that the error surface of a multilayer perceptron is a highly nonlinear function of its parameters; consequently backpropagation, which is a gradient-descent algorithm, can converge to a local minimum. As an example, we consider a signal classification problem in which the optimum classifier has been shown to have exponential complexity and the optimum decision boundary is nonlinear and nonconvex. In this example, when standard backpropagation is used to train the weights of a multilayer perceptron, the network is shown to classify with a "linear" decision boundary, which corresponds to a local minimum of the network configuration. We propose to enhance the learning process of the network with an optimization scheme referred to as simulated annealing, which has proven effective in finding global minima in many applications. We derive an iterative training algorithm based on this "global" optimization technique, using backpropagation as the "local" optimizer, and verify the effectiveness of the learning algorithm via an empirical analysis of two signal classification problems.

1 PRELIMINARIES

Artificial neural networks are highly interconnected networks of relatively simple processing units (commonly referred to as nodes, e.g., perceptrons) which operate in parallel.
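As a rough illustration of the kind of hybrid scheme the abstract describes, the sketch below alternates simulated-annealing perturbations of the weights with a local gradient-descent refinement on a toy XOR-style classification task. A finite-difference gradient stands in for backpropagation, and the network size, cooling schedule, and all function names and parameter values are illustrative assumptions, not the authors' implementation.

```python
import math
import random

random.seed(0)

# Toy 2-2-1 multilayer perceptron; the 9 weights are flattened into one list:
# w[0:4] hidden-layer weights, w[4:6] hidden biases,
# w[6:8] output weights, w[8] output bias.
def forward(w, x):
    h = [math.tanh(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[4 + j])
         for j in range(2)]
    return math.tanh(w[6] * h[0] + w[7] * h[1] + w[8])

# XOR with +/-1 targets: a nonconvex problem a linear boundary cannot solve.
DATA = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]

def loss(w):
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def local_descent(w, steps=20, lr=0.5, eps=1e-4):
    # Finite-difference gradient descent: a crude stand-in for the
    # backpropagation "local" optimizer used in the paper.
    for _ in range(steps):
        base = loss(w)
        grad = []
        for i in range(len(w)):
            w[i] += eps
            grad.append((loss(w) - base) / eps)
            w[i] -= eps
        w = [wi - lr * gi for wi, gi in zip(w, grad)]
    return w

def anneal(T0=1.0, cooling=0.95, outer=60):
    w = local_descent([random.uniform(-1, 1) for _ in range(9)])
    best, best_loss, T = w[:], loss(w), T0
    for _ in range(outer):
        cand = [wi + random.gauss(0, T) for wi in w]  # global perturbation
        cand = local_descent(cand)                    # local refinement
        delta = loss(cand) - loss(w)
        # Metropolis acceptance: always take improvements, sometimes take
        # uphill moves so the search can escape local minima.
        if delta < 0 or random.random() < math.exp(-delta / T):
            w = cand
        if loss(w) < best_loss:
            best, best_loss = w[:], loss(w)
        T *= cooling                                  # cooling schedule
    return best, best_loss

weights, final_loss = anneal()
print(round(final_loss, 3))
```

The key design choice mirrored here is that annealing supplies the occasional uphill move that pure gradient descent lacks, while the local optimizer keeps each accepted candidate near a nearby minimum.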
© (1990) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Behnaam Aazhang and Troy F. Henson "Enhanced neural net learning algorithms for classification problems", Proc. SPIE 1294, Applications of Artificial Neural Networks, (1 August 1990);
