Rethinking the loss function in image classification
Litian Lin, Biao Chen, Feng Ye, Yizong Lai
Abstract

Recent years have witnessed tremendous success of convolutional neural networks in image classification. For training image classification models, the softmax cross-entropy (SCE) loss is widely used because of its clear physical meaning and concise gradient computation. We reformulate the SCE loss into a more basic form, in light of which we can better understand the SCE loss and rethink loss functions for image classification. We propose a novel loss function called the another metric (AM) loss, which differs from the SCE loss in form but is the same in essence. The AM loss not only retains the advantages of the SCE loss but also provides more gradient information in the later phase of training and yields larger inter-class distances. Extensive experiments on CIFAR-10/100, Tiny ImageNet, and three fine-grained datasets show that the AM loss can outperform the SCE loss, which also demonstrates the great potential of improving deep neural network training by exploring new loss functions.
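
For reference, the sketch below gives a minimal NumPy implementation of the standard softmax cross-entropy loss that the abstract takes as its baseline. The function name, shapes, and example values are illustrative assumptions, not code from the paper; the proposed AM loss is defined only in the full text and is not reproduced here.

```python
# Minimal sketch of the standard softmax cross-entropy (SCE) loss discussed
# in the abstract. Shapes and names are illustrative; the paper's AM loss is
# not reproduced because its formulation appears only in the full text.
import numpy as np

def softmax_cross_entropy(logits: np.ndarray, labels: np.ndarray) -> float:
    """Mean SCE loss for a batch.

    logits: (batch, num_classes) raw network outputs
    labels: (batch,) integer class indices
    """
    # Subtract the row-wise max before exponentiating for numerical stability.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log-likelihood of the true class, averaged over the batch.
    return float(-log_probs[np.arange(len(labels)), labels].mean())

# Example: 2 samples, 3 classes
logits = np.array([[2.0, 0.5, -1.0], [0.1, 0.2, 3.0]])
labels = np.array([0, 2])
print(softmax_cross_entropy(logits, labels))
```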

© 2024 SPIE and IS&T
Litian Lin, Biao Chen, Feng Ye, and Yizong Lai "Rethinking the loss function in image classification," Journal of Electronic Imaging 33(5), 053018 (24 September 2024). https://doi.org/10.1117/1.JEI.33.5.053018
Received: 14 May 2024; Accepted: 3 September 2024; Published: 24 September 2024
KEYWORDS: Education and training, Image classification, Neural networks, Data modeling, Feature extraction, Convolutional neural networks, Facial recognition systems