Compressive imaging for defending deep neural networks from adversarial attacks

Vladislav Kravets, Bahram Javidi, Adrian Stern

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Despite their outstanding performance, convolutional deep neural networks (DNNs) are vulnerable to small adversarial perturbations. In this Letter, we introduce a novel approach to thwart adversarial attacks. We propose to employ compressive sensing (CS) to defend DNNs from adversarial attacks and, at the same time, to encode the image, thus preventing counterattacks. We present computer simulations and optical experimental results of object classification in adversarial images captured with a CS single-pixel camera.
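The abstract's core idea, acquiring the scene through compressive measurements that also act as an encoding, can be illustrated with a minimal single-pixel-camera-style sketch. The code below is a hypothetical toy example, not the authors' implementation: it projects a flattened scene onto random ±1 patterns (the sensing matrix), producing a short measurement vector that is uninformative without knowledge of the patterns, and recovers an estimate via least squares (practical CS systems would use sparse recovery instead).

```python
import numpy as np

# Toy single-pixel compressive sensing sketch (illustrative assumptions:
# scene size, measurement count, and +/-1 patterns are arbitrary choices).
rng = np.random.default_rng(0)

n = 64  # number of pixels in the (flattened) toy scene
m = 32  # number of compressive measurements, m < n

x = rng.random(n)                            # toy scene, flattened
phi = rng.choice([-1.0, 1.0], size=(m, n))   # random +/-1 sensing patterns

# Single-pixel measurements: one scalar per projection pattern.
y = phi @ x

# Without phi, y reveals little about x -- this is the encoding property
# the abstract refers to. A holder of phi can estimate the scene; here we
# use a plain least-squares (minimum-norm) solve for illustration only.
x_hat, *_ = np.linalg.lstsq(phi, y, rcond=None)

print(y.shape)      # (32,)
print(x_hat.shape)  # (64,)
```

Because the system is underdetermined (m < n), the least-squares solution reproduces the measurements exactly but is only a minimum-norm estimate of the scene; CS reconstruction would additionally exploit sparsity of the image in some basis.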

Original language: English
Pages (from-to): 1951-1954
Number of pages: 4
Journal: Optics Letters
Volume: 46
Issue number: 8
State: Published - 15 Apr 2021

