Paper
21 May 2018 Design and simulation of optoelectronic neuron equivalentors as hardware accelerators of self-learning equivalent convolutional neural structures (SLECNS)
Abstract
In this paper, we consider the urgent need for highly efficient hardware accelerators for machine learning algorithms, including convolutional and deep neural networks (CNNs and DNNs), associative memory models, clustering, and pattern recognition. These algorithms typically involve a large number of multiply-accumulate (and similar) operations. We give a brief overview of our related work on the advantages of equivalent models (EMs) for describing and designing neural networks and bio-inspired recognition systems. The storage capacity of neural networks based on EMs and their modifications, including auto- and hetero-associative memories for 2D images, is several times the number of neurons. Such neuroparadigms are very promising for processing, clustering, recognizing, and storing large, strongly correlated, and highly noised images. They are also promising for the problem of unsupervised machine learning. Since the basic operational functional nodes of EMs are vector-matrix or matrix-tensor procedures built on such continuous-logic operations as the normalized vector operations "equivalence", "nonequivalence", "autoequivalence", and "auto-nonequivalence", we consider in this paper new conceptual approaches to the design of full-scale arrays of such neuron-equivalentors (NEs) with extended functionality, including different activation functions. Our approach is based on analog and mixed (specially coded) methods for implementing the required operations, building NEs (with 8 to 128 synapses and more) and their base cells and nodes from photosensitive elements and CMOS current mirrors. We present modeling results for the proposed new modular, scalable implementations of NEs, and we estimate and compare them. Simulation results show that the processing time in such circuits does not exceed a few microseconds, and for some variants is 50-100 nanoseconds.
The circuits are simple, operate at a low supply voltage (1.5-3.3 V) with low power consumption (milliwatts) and low input signal levels (microwatts), lend themselves to integrated construction, and satisfy the requirements of interconnection and cascading. The output signals of such neurons can be digital, analog, or hybrid, and can also be provided as two complementary outputs. They realize a principle of dualism that gives such complementary dual NEs a number of advantages.
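As a rough software illustration (not taken from the paper itself), the normalized continuous-logic "equivalence" of two vectors with components in [0, 1] is commonly defined componentwise as min(a, b) + min(1 - a, 1 - b), averaged over the vector; "nonequivalence" is its complement. The sketch below assumes that definition and is only a behavioral model of what the optoelectronic NE computes in hardware:

```python
import numpy as np

def normalized_equivalence(a, b):
    """Normalized continuous-logic 'equivalence' of two vectors.

    Assumes components are normalized to [0, 1] and uses the common
    continuous-logic form  min(a_i, b_i) + min(1 - a_i, 1 - b_i),
    averaged over all components. This is an assumed definition; the
    paper's hardware realization (current mirrors, photosensors) may
    use a different but functionally similar form.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.mean(np.minimum(a, b) + np.minimum(1 - a, 1 - b)))

def normalized_nonequivalence(a, b):
    """Complement of the equivalence measure (dual output)."""
    return 1.0 - normalized_equivalence(a, b)

x = np.array([0.2, 0.8, 0.5])
print(normalized_equivalence(x, x))   # identical vectors give 1.0
print(normalized_equivalence([0.0, 1.0], [1.0, 0.0]))   # fully opposite binary vectors give 0.0
```

Note how the two functions naturally form a complementary (dual) output pair, mirroring the dualism principle mentioned above.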
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Vladimir G. Krasilenko, Alexander A. Lazarev, and Diana V. Nikitovich "Design and simulation of optoelectronic neuron equivalentors as hardware accelerators of self-learning equivalent convolutional neural structures (SLECNS)", Proc. SPIE 10689, Neuro-inspired Photonic Computing, 106890C (21 May 2018); https://doi.org/10.1117/12.2316352
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Analog electronics, Neurons, Neural networks, Image processing, Convolution, Nonlinear filtering, Process modeling
