Paper
Feature space trajectory (FST) classifier neural network
10 October 1994
Leonard Neiberg, David P. Casasent
Abstract
A new classifier neural network is described for distortion-invariant multi-class pattern recognition. Training data for each class are described by points in a feature space. As a distortion parameter (such as aspect view) of a training-set object is varied, an ordered training set is produced. This ordered training set describes the object as a trajectory in feature space, with different points along the trajectory corresponding to different aspect views; different object classes are described by different trajectories. Classification involves calculating the distance from an input feature space point to the nearest trajectory (which denotes the object class) and the position of the nearest point along that trajectory (which denotes the pose of the object). Comparisons to other neural networks and other classifiers show that this feature space trajectory neural network yields better classification performance and can reject non-object data. The FST classifier performs well with different numbers of training images and hidden-layer neurons and also generalizes better than other classifiers.
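The abstract does not give implementation details, so the following is a minimal sketch of the classification rule it describes, assuming each class trajectory is stored as an ordered array of feature vectors (one per aspect view) and that distances are measured to the piecewise-linear segments joining consecutive vertices. All names (fst_classify, point_to_segment, reject_dist) are illustrative, not from the paper.

```python
# Sketch of FST-style classification: nearest trajectory -> class,
# position of the nearest point along it -> pose (aspect) estimate.
import numpy as np

def point_to_segment(x, a, b):
    """Distance from point x to segment a-b, and fractional position t in [0, 1]."""
    d = b - a
    denom = np.dot(d, d)
    t = 0.0 if denom == 0.0 else float(np.clip(np.dot(x - a, d) / denom, 0.0, 1.0))
    p = a + t * d
    return np.linalg.norm(x - p), t

def fst_classify(x, trajectories, reject_dist=None):
    """trajectories: dict mapping class label -> (n_views, n_features) ordered vertex array."""
    best = (np.inf, None, None)              # (distance, class label, pose index)
    for label, verts in trajectories.items():
        for i in range(len(verts) - 1):
            dist, t = point_to_segment(x, verts[i], verts[i + 1])
            if dist < best[0]:
                best = (dist, label, i + t)  # i + t interpolates between aspect views
    dist, label, pose = best
    if reject_dist is not None and dist > reject_dist:
        return None, pose, dist              # too far from every trajectory: reject as non-object
    return label, pose, dist

# Illustrative usage with synthetic feature trajectories (random data, two classes):
rng = np.random.default_rng(0)
trajectories = {
    "class_A": rng.normal(size=(12, 8)),      # 12 aspect views, 8 features each
    "class_B": rng.normal(size=(12, 8)) + 3.0,
}
label, pose, dist = fst_classify(trajectories["class_A"][4] + 0.05, trajectories)
```

The rejection threshold reflects the abstract's claim that the FST classifier can reject non-object data: an input whose distance to every trajectory exceeds the threshold is assigned to no class.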
© (1994) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Leonard Neiberg and David P. Casasent "Feature space trajectory (FST) classifier neural network", Proc. SPIE 2353, Intelligent Robots and Computer Vision XIII: Algorithms and Computer Vision, (10 October 1994); https://doi.org/10.1117/12.188901
CITATIONS
Cited by 11 scholarly publications.
KEYWORDS
Neurons, Databases, Neural networks, Distortion, Image enhancement, Prototyping, Target recognition