Paper · 1 March 1992
Efficient activation functions for the back-propagation neural network
Surender K. Kenue
Abstract
The back-propagation algorithm is the most widely used algorithm in artificial neural network research. The standard activation (transfer) function is the logistic function s(x) = 1/(1 + exp(−x)). The derivative of this function is used in correcting the error signals for updating the coefficients of the network. The maximum value of the derivative is only 0.25, which yields slow convergence. A new family of activation functions is proposed whose derivatives belong to the sech^n(x) family for n = 1, 2, …. The maximum value of the derivative varies from 0.637 to 1.875 for n = 1–6, so a member of the activation-function family can be selected to suit the problem. Results of using this family of activation functions show orders-of-magnitude savings in computation. A discrete version of these functions is also proposed for efficient implementation. For the parity-8 problem with 16 hidden units, the new activation function f3 learns in 300 epochs, compared with 500,000 epochs for the standard activation function.
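The abstract does not give the activation functions in closed form, but its quoted maxima (0.637 for n = 1 up to 1.875 for n = 6) are consistent with scaling each f_n so that it saturates at ±1, i.e. a scale factor c_n = 2 / ∫ sech^n(x) dx, which is then also the peak slope. A minimal numerical sketch under that assumption (not the paper's code), contrasting it with the logistic derivative's 0.25 bound:

```python
import numpy as np

def sech(x):
    return 1.0 / np.cosh(x)

# Dense grid; sech^n decays fast enough that [-20, 20] captures the integral.
x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]

# Standard logistic derivative s(x)(1 - s(x)) peaks at 0.25 (at x = 0).
s = 1.0 / (1.0 + np.exp(-x))
print(f"max logistic derivative: {(s * (1 - s)).max():.4f}")

# Assumed normalization: c_n = 2 / integral of sech^n over the real line,
# so the activation f_n saturates at -1 and +1. The peak derivative is
# then c_n * sech^n(0) = c_n.
for n in range(1, 7):
    c_n = 2.0 / ((sech(x) ** n).sum() * dx)  # Riemann-sum approximation
    print(f"n={n}: max derivative = {c_n:.3f}")
```

Under this assumption the computed maxima run 0.637 (n = 1, i.e. 2/π), 1.000, 1.273, 1.500, 1.698, 1.875 (n = 6, i.e. 15/8), matching the range quoted in the abstract.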
© 1992 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Surender K. Kenue "Efficient activation functions for the back-propagation neural network", Proc. SPIE 1608, Intelligent Robots and Computer Vision X: Neural, Biological, and 3-D Methods, (1 March 1992); https://doi.org/10.1117/12.135110
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS
Computer programming, Neurons, Evolutionary algorithms, Computer vision technology, Machine vision, Matrices, Neural networks
