Query-Based External Information Leakage Attacks on Face Recognition Models

Edita Grolman, Amit Giloni, Ryuta Kremer, Hiroo Saito, Tomoyuki Shibata, Tsukasa Omino, Misaki Komatsu, Yoshikazu Hanatani, Asaf Shabtai, Yuval Elovici

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recent studies have demonstrated external information leakage (EIL) attacks, which allow an attacker to infer various sensitive implicit properties related to a machine learning (ML) model's training data. Most of those attacks assumed 1) a white-box scenario in which the attacker has complete access to the ML model, its structure, and its parameters, or 2) a black-box (alternatively gray-box) scenario with unrealistic requirements, such as a high query budget or high computational resources for the attacker. In this paper, we propose two practical query-based (i.e., black-box) EIL attacks that target face recognition ML models and allow an attacker to infer sensitive implicit properties, such as the facial characteristics, gender, ethnicity, income level, and average age of the individuals in the training data, with a limited number of queries. The first proposed attack, referred to as the random noise injection (RNI) attack, exploits the effect of injecting random noise into input samples on the target model's predictions. The second proposed attack, referred to as the property substitute model (PSM) attack, creates a substitute model for each property value examined, whose predictions are compared to the target model's predictions. Our comprehensive evaluation (a total of 730 experiments), performed on the CelebA dataset, shows that the proposed attacks outperform existing EIL attacks and successfully infer private information, posing a threat to the privacy and security of face recognition models.
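The abstract only outlines the RNI attack at a high level, so the following is a minimal illustrative sketch of the general idea: query a black-box model with a clean sample and with noise-perturbed copies, and measure how much its predictions shift. Everything here (the `target_model` stand-in, `rni_probe`, the noise scale and query budget) is a hypothetical construction for illustration, not the authors' implementation; how prediction shifts are mapped to inferred training-data properties is specific to the paper and not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_model(x):
    # Hypothetical stand-in for the black-box target model:
    # returns a single confidence score in (0, 1) for the queried sample.
    return 1.0 / (1.0 + np.exp(-x.mean()))

def rni_probe(model, sample, noise_scale=0.1, n_queries=20):
    """Sketch of a random-noise-injection probe (assumed interface):
    query the model with the clean sample and with noise-perturbed
    copies, and report the mean absolute shift in its prediction.
    In an EIL setting, the pattern of such shifts across chosen
    inputs is what an attacker would analyze."""
    base = model(sample)
    shifts = []
    for _ in range(n_queries):
        noisy = sample + rng.normal(0.0, noise_scale, size=sample.shape)
        shifts.append(abs(model(noisy) - base))
    return float(np.mean(shifts))

sample = rng.normal(size=(32, 32))  # toy stand-in for a face image
sensitivity = rni_probe(target_model, sample)
```

Note that the probe uses only `n_queries + 1` model queries, consistent with the paper's emphasis on a limited query budget.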

Original language: English
Title of host publication: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers
ISBN (Electronic): 9798350359312
DOIs
State: Published - 1 Jan 2024
Event: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Yokohama, Japan
Duration: 30 Jun 2024 – 5 Jul 2024

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2024 International Joint Conference on Neural Networks, IJCNN 2024
Country/Territory: Japan
City: Yokohama
Period: 30/06/24 – 5/07/24

Keywords

  • External Information Leakage
  • Face Recognition
  • Privacy Violation Attacks
  • Privacy and Security
  • Property Inference

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
