Paper
Adversarial attacks against face recognition: a study
12 April 2023
Proceedings Volume 12565, Conference on Infrared, Millimeter, Terahertz Waves and Applications (IMT2022); 125652D (2023) https://doi.org/10.1117/12.2662621
Event: Conference on Infrared, Millimeter, Terahertz Waves and Applications (IMT2022), 2022, Shanghai, China
Abstract
With the continuous development of artificial intelligence, face recognition is widely used in identity authentication, payment, intelligent security, and other fields. However, adversarial examples pose serious security risks to face recognition. By adding tiny perturbations to a face image, an attacker can cause a face recognition system to misidentify a person, leading to security threats such as system intrusion, illegal escalation of privileges, property theft, and evasion of legal responsibility. In this paper, we first introduce the basic concepts of adversarial attacks and briefly analyze typical adversarial example generation methods proposed in recent years. We then discuss the adversarial-attack security problems of face recognition. Finally, we analyze the current state of research on adversarial attacks against face recognition.
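The "tiny perturbation" idea described in the abstract can be illustrated with the fast gradient sign method (FGSM), one of the typical adversarial example generation methods. The sketch below is purely illustrative and not the paper's own method: it uses a hypothetical linear "face verifier" (real systems use deep CNN embeddings) so the gradient can be computed by hand, and perturbs the input along the sign of the loss gradient, bounded by a small epsilon.

```python
import numpy as np

# Toy linear "face verifier": sigmoid(w . x + b) > 0.5 means "same identity".
# The weights and input below are hypothetical stand-ins for a real model.
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # hypothetical model weights
b = 0.1
x = rng.normal(size=64)          # hypothetical face-image feature vector

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(v):
    return sigmoid(w @ v + b)

def fgsm(v, y_true, eps):
    """FGSM: step along the sign of the loss gradient w.r.t. the input."""
    p = predict(v)
    # Gradient of binary cross-entropy w.r.t. the input is (p - y) * w
    grad = (p - y_true) * w
    # Each feature changes by at most eps, so the perturbation stays tiny
    return v + eps * np.sign(grad)

y = 1.0                          # true label: "same identity"
x_adv = fgsm(x, y, eps=0.05)

print(f"clean match score:       {predict(x):.3f}")
print(f"adversarial match score: {predict(x_adv):.3f}")  # pushed toward mismatch
```

Because the model is linear, the attack provably lowers the match score: the logit drops by exactly eps times the L1 norm of the weights, even though no individual feature moves by more than eps. Against a deep network the same one-step recipe applies, with the gradient obtained by backpropagation.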
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yunfei Cheng, Yuexia Liu, and Wu Wang "Adversarial attacks against face recognition: a study", Proc. SPIE 12565, Conference on Infrared, Millimeter, Terahertz Waves and Applications (IMT2022), 125652D (12 April 2023); https://doi.org/10.1117/12.2662621
KEYWORDS: Facial recognition systems, Data modeling