20 August 1992 Large-memory-based learning systems
George Cybenko, Sirpa Saarinen
Abstract
The recent introduction of compact, large-capacity memories has opened up possibilities for more aggressive use of data in learning systems. Instead of using complex, global models for the data, we investigate the use of well-known but modified and extended local non-parametric methods for learning. Our focus is on learning problems where a large set of data is available to the system designer; such applications include speech recognition, character recognition, and local weather prediction. The general system we present, called an Adaptive Memory (AM), is adaptive in the sense that part of the sample data is stored and a local non-parametric model is updated when new training data become available. This makes training possible throughout the usable lifetime of the system, in contrast with many popular learning algorithms, such as neural networks and other parametric methods, that have a distinct learning phase. In the past, designers of learning systems have been reluctant to store data samples in memory because of the inherent slowness of searching and storing. However, with the advent of parallel searching algorithms and high-speed large memories, the AM approach is competitive with parametric methods and may ultimately exceed their performance for a large class of problems.
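The memory-based scheme the abstract describes — store labeled samples as they arrive and answer queries with a local non-parametric model over the nearest stored samples — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the class name, the choice of k-nearest-neighbour voting as the local model, and the Euclidean metric are all assumptions.

```python
import heapq
import math

class AdaptiveMemory:
    """Illustrative memory-based learner: stores labeled samples and
    predicts by majority vote over the k nearest stored samples.
    (Hypothetical interface; not from the paper.)"""

    def __init__(self, k=3):
        self.k = k
        self.samples = []  # list of (feature_vector, label) pairs

    def update(self, x, y):
        # "Training" never ends: each new labeled sample is simply stored,
        # so the model adapts throughout the system's lifetime.
        self.samples.append((tuple(x), y))

    def predict(self, x):
        # Local non-parametric prediction: search memory for the k samples
        # closest to x (Euclidean distance) and return their majority label.
        nearest = heapq.nsmallest(self.k, self.samples,
                                  key=lambda s: math.dist(s[0], x))
        labels = [y for _, y in nearest]
        return max(set(labels), key=labels.count)

# Usage: new observations are absorbed one update() at a time.
am = AdaptiveMemory(k=3)
for x, y in [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
             ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]:
    am.update(x, y)
print(am.predict((0.05, 0.1)))  # query near the "a" cluster prints: a
```

The linear scan in `predict` is exactly the searching cost the abstract says once deterred designers; the parallel search algorithms and large fast memories it cites would replace that scan with a hardware-accelerated lookup while leaving the overall scheme unchanged.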
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
George Cybenko and Sirpa Saarinen "Large-memory-based learning systems", Proc. SPIE 1706, Adaptive and Learning Systems, (20 August 1992); https://doi.org/10.1117/12.139955
KEYWORDS
Data modeling, Databases, Error analysis, Evolutionary algorithms, Adaptive optics, Data storage, Neural networks