TY - GEN
T1 - Query-Based External Information Leakage Attacks on Face Recognition Models
AU - Grolman, Edita
AU - Giloni, Amit
AU - Kremer, Ryuta
AU - Saito, Hiroo
AU - Shibata, Tomoyuki
AU - Omino, Tsukasa
AU - Komatsu, Misaki
AU - Hanatani, Yoshikazu
AU - Shabtai, Asaf
AU - Elovici, Yuval
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/1/1
Y1 - 2024/1/1
N2 - Recent studies have demonstrated external information leakage (EIL) attacks, which allow an attacker to infer various sensitive implicit properties of a machine learning (ML) model's training data. Most of those attacks assumed 1) a white-box scenario in which the attacker has complete access to the ML model, its structure, and its parameters, or 2) a black-box (or gray-box) scenario with unrealistic requirements, such as a high query budget or substantial computational resources for the attacker. In this paper, we propose two practical query-based (i.e., black-box) EIL attacks that target face recognition ML models and allow an attacker to infer sensitive implicit properties, such as the facial characteristics, gender, ethnicity, income level, and average age of the individuals in the training data, with a limited number of queries. The first proposed attack, referred to as the random noise injection (RNI) attack, exploits the effect that injecting random noise into input samples has on the target model's predictions. The second proposed attack, referred to as the property substitute model (PSM) attack, creates a substitute model for each examined property value, whose predictions are compared to the target model's predictions. Our comprehensive evaluation (a total of 730 experiments), performed on the CelebA dataset, shows that the proposed attacks outperform existing EIL attacks and successfully infer private information, posing a threat to the privacy and security of face recognition models.
AB - Recent studies have demonstrated external information leakage (EIL) attacks, which allow an attacker to infer various sensitive implicit properties of a machine learning (ML) model's training data. Most of those attacks assumed 1) a white-box scenario in which the attacker has complete access to the ML model, its structure, and its parameters, or 2) a black-box (or gray-box) scenario with unrealistic requirements, such as a high query budget or substantial computational resources for the attacker. In this paper, we propose two practical query-based (i.e., black-box) EIL attacks that target face recognition ML models and allow an attacker to infer sensitive implicit properties, such as the facial characteristics, gender, ethnicity, income level, and average age of the individuals in the training data, with a limited number of queries. The first proposed attack, referred to as the random noise injection (RNI) attack, exploits the effect that injecting random noise into input samples has on the target model's predictions. The second proposed attack, referred to as the property substitute model (PSM) attack, creates a substitute model for each examined property value, whose predictions are compared to the target model's predictions. Our comprehensive evaluation (a total of 730 experiments), performed on the CelebA dataset, shows that the proposed attacks outperform existing EIL attacks and successfully infer private information, posing a threat to the privacy and security of face recognition models.
KW - External Information Leakage
KW - Face Recognition
KW - Privacy Violation Attacks
KW - Privacy and Security
KW - Property Inference
UR - http://www.scopus.com/inward/record.url?scp=85204954217&partnerID=8YFLogxK
U2 - 10.1109/IJCNN60899.2024.10651362
DO - 10.1109/IJCNN60899.2024.10651362
M3 - Conference contribution
AN - SCOPUS:85204954217
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers
T2 - 2024 International Joint Conference on Neural Networks, IJCNN 2024
Y2 - 30 June 2024 through 5 July 2024
ER -