Enhancing Resilience: Redundancy Reduction for Object Detection in Adversarial Conditions

Shubham Agarwal, Raz Birman, Ofer Hadar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Vision systems, like other deep learning-based systems, are constrained by their training data and struggle under adversarial conditions such as varying lighting and weather. In this paper, we propose a knowledge distillation framework aimed at bolstering the resilience of computer vision systems under such conditions. Specifically, we focus on the object detection task in adverse weather and demonstrate that our system matches or exceeds state-of-the-art accuracy: it achieves a 2% higher mean average precision (mAP@50) in hazy conditions and a 9% higher mAP@50 in low-light conditions than the nearest state-of-the-art frameworks.
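
The abstract names knowledge distillation as the core technique but does not spell out the framework; the sketch below is a minimal, hypothetical illustration of the standard distillation loss (Hinton et al.) as it could be applied to a detector's per-region class logits, with a clear-weather teacher guiding a student trained on hazy or low-light images. The function name and the temperature/alpha hyperparameters are assumptions for illustration, not values from the paper.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5):
        # Knowledge distillation: blend the hard-label cross-entropy
        # with the KL divergence between temperature-softened teacher
        # and student class distributions.
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        # The KL term is scaled by T^2 so its gradient magnitude stays
        # comparable to the cross-entropy term (standard practice).
        kd = F.kl_div(log_soft_student, soft_teacher,
                      reduction="batchmean") * temperature ** 2
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1.0 - alpha) * ce

    # Example: 8 region proposals, 20 object classes (random placeholders;
    # teacher_logits would come from a model trained on clear images).
    student_logits = torch.randn(8, 20)
    teacher_logits = torch.randn(8, 20)
    labels = torch.randint(0, 20, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels)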

Original language: English
Title of host publication: 2024 IEEE Space, Aerospace and Defence Conference, SPACE 2024
Publisher: Institute of Electrical and Electronics Engineers
Pages: 580-583
Number of pages: 4
ISBN (Electronic): 9798350367386
DOIs
State: Published - 1 Jan 2024
Event: 2024 IEEE Space, Aerospace and Defence Conference, SPACE 2024 - Bangalore, India
Duration: 22 Jul 2024 → 23 Jul 2024

Publication series

Name: 2024 IEEE Space, Aerospace and Defence Conference, SPACE 2024

Conference

Conference: 2024 IEEE Space, Aerospace and Defence Conference, SPACE 2024
Country/Territory: India
City: Bangalore
Period: 22/07/24 → 23/07/24

Keywords

  • adverse conditions
  • computer vision
  • knowledge distillation
  • object detection
  • redundancy reduction

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Signal Processing
  • Aerospace Engineering
  • Instrumentation
