Differential theory of learning for efficient neural network pattern recognition
Paper, 2 September 1993
John B. Hampshire II, Bhagavatula Vijaya Kumar
Abstract
We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We present a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
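To make the contrast concrete: differential learning optimizes a margin-based figure of merit, rewarding the gap between the correct class output and the strongest rival, whereas probabilistic learning fits the outputs to target probabilities. The sketch below illustrates this distinction under stated assumptions; the function names, the sigmoid transfer, and the alpha parameter are illustrative choices, not the paper's exact formulation.

    import numpy as np

    def differential_loss(outputs, target_idx, alpha=1.0):
        """Illustrative differential (margin-based) objective:
        reward the gap between the correct class output and the
        largest incorrect output, rather than matching probabilities.
        (Sigmoid transfer and alpha are assumptions for this sketch.)"""
        correct = outputs[target_idx]
        rival = np.max(np.delete(outputs, target_idx))  # best wrong class
        delta = correct - rival                         # classification margin
        # Smooth, monotonic function of the margin; minimizing this
        # loss pushes the margin positive, i.e. a correct decision.
        return -1.0 / (1.0 + np.exp(-alpha * delta))

    def probabilistic_loss(outputs, target_idx):
        """Traditional probabilistic objective: squared error against a
        one-hot target, which tries to model the full class posterior."""
        target = np.zeros_like(outputs)
        target[target_idx] = 1.0
        return np.sum((outputs - target) ** 2)

    outputs = np.array([0.30, 0.45, 0.25])  # classifier outputs, 3 classes
    print(differential_loss(outputs, target_idx=1))   # already negative margin loss
    print(probabilistic_loss(outputs, target_idx=1))  # penalizes all outputs

Note that the differential loss is already low here because class 1 wins the comparison, while the squared-error loss still penalizes every output for not matching 0 or 1 exactly; this is the sense in which differential learning spends its resources only on the classification decision.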
© 1993 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
John B. Hampshire II and Bhagavatula Vijaya Kumar "Differential theory of learning for efficient neural network pattern recognition", Proc. SPIE 1965, Applications of Artificial Neural Networks IV, (2 September 1993); https://doi.org/10.1117/12.152523
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Error analysis, Neural networks, Information operations, Artificial neural networks, Optical character recognition, Pattern recognition, Databases
