Research on contrastive image classification algorithm based on scalable self-attention
22 May 2024
Xin Tian, Jian Liu, Zhenwei Cui, Yueming Guo, Yifei Xu, Yuan Sun, Kai Qiao
Proceedings Volume 13176, Fourth International Conference on Machine Learning and Computer Application (ICMLCA 2023); 131760A (2024) https://doi.org/10.1117/12.3029022
Event: Fourth International Conference on Machine Learning and Computer Application (ICMLCA 2023), 2023, Hangzhou, China
Abstract
This study introduces a contrastive image classification algorithm based on scalable self-attention, aiming to train a backbone network suitable for downstream image classification tasks through contrastive learning. It first addresses the high training overhead and low model accuracy encountered when applying contrastive learning models such as MoCo v3 to image classification, and then proposes the Momentum Contrast for Image Classification (MocoIC) algorithm as a solution. MocoIC comprises a Scalable Shift Window Transformer (SSWT) feature-extraction network, a feature-mapping network, a prediction head, and the InfoNCE loss function. Compared with the original contrastive learning model, MocoIC achieves better algorithmic performance with lower training overhead.
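The abstract does not give implementation details of MocoIC, but the InfoNCE loss it uses is standard in MoCo-style contrastive learning: a query embedding is pulled toward the key from an augmented view of the same image and pushed away from keys of other images. The sketch below is a generic, per-query InfoNCE computation (function name, embeddings, and the temperature value 0.07 are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def info_nce_loss(q, k_pos, k_negs, temperature=0.07):
    """Generic InfoNCE loss for a single query (illustrative sketch).

    q      : (d,)   query embedding from the online encoder
    k_pos  : (d,)   positive key (augmented view of the same image)
    k_negs : (n, d) negative keys (embeddings of other images)
    """
    # L2-normalize so dot products become cosine similarities
    q = q / np.linalg.norm(q)
    k_pos = k_pos / np.linalg.norm(k_pos)
    k_negs = k_negs / np.linalg.norm(k_negs, axis=1, keepdims=True)

    # Temperature-scaled similarity logits: positive first, then negatives
    l_pos = np.dot(q, k_pos) / temperature
    l_negs = k_negs @ q / temperature
    logits = np.concatenate([[l_pos], l_negs])

    # Cross-entropy with the positive pair as the target class,
    # computed via a numerically stable log-sum-exp
    m = logits.max()
    return float(-l_pos + m + np.log(np.exp(logits - m).sum()))
```

Minimizing this loss over many queries trains the backbone (here, the SSWT network) to produce embeddings that are invariant to augmentation while remaining discriminative between images.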
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Xin Tian, Jian Liu, Zhenwei Cui, Yueming Guo, Yifei Xu, Yuan Sun, and Kai Qiao "Research on contrastive image classification algorithm based on scalable self-attention", Proc. SPIE 13176, Fourth International Conference on Machine Learning and Computer Application (ICMLCA 2023), 131760A (22 May 2024); https://doi.org/10.1117/12.3029022
KEYWORDS: Image classification, Machine learning, Feature extraction, Data modeling, Education and training, Performance modeling, Transformers