Comparison of deep learning-based compressive imaging from a practitioner's viewpoint

Guy Hanzon, Or Nizhar, Vladislav Kravets, Adrian Stern

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

For nearly twenty years, a multitude of Compressive Imaging (CI) techniques have been under development. Modern CI approaches leverage Deep Learning (DL) tools to enhance both the sensing model and the reconstruction algorithm. Unfortunately, most of these DL-based CI methods have been developed by simulating the sensing process while overlooking the limitations associated with the optical realization of the optimized sensing model. This article presents an overview of the foremost DL-based CI methods from a practitioner's standpoint. We conduct a comparative analysis of their performance, with particular emphasis on practical considerations such as the feasibility of the sensing matrices and robustness to measurement noise.
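
To make the practical considerations above concrete, the following minimal Python sketch contrasts an idealized, simulation-only sensing matrix with an optically realizable binary one under measurement noise. It illustrates only the generic compressive sensing forward model y = Φx + n, not the paper's specific methods; all variable names and parameter values are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    n, m = 1024, 256               # scene size and number of measurements (m << n)
    x = rng.random(n)              # vectorized scene, e.g., a flattened image patch

    # Unconstrained, real-valued sensing matrix of the kind simulation-only
    # DL training tends to produce; signed entries are hard to realize optically.
    phi_free = rng.standard_normal((m, n))

    # An optically feasible counterpart: binary {0, 1} entries, implementable
    # as on/off micromirror patterns on a DMD or as SLM masks.
    phi_bin = (phi_free > 0).astype(float)

    sigma = 0.01                   # assumed measurement noise level
    noise = sigma * rng.standard_normal(m)

    y_free = phi_free @ x + noise  # idealized measurements
    y_bin = phi_bin @ x + noise    # measurements under the hardware constraint

Binary matrices of this kind can be displayed directly as on/off mirror patterns, which is one reason sensing-matrix feasibility matters when moving a learned model from simulation to an optical setup.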

Original language: English
Title of host publication: Applications of Machine Learning 2023
Editors: Michael E. Zelinski, Tarek M. Taha, Barath Narayanan Narayanan
Publisher: SPIE
ISBN (Electronic): 9781510665644
DOIs
State: Published - 1 Jan 2023
Event: Applications of Machine Learning 2023 - San Diego, United States
Duration: 23 Aug 2023 – 24 Aug 2023

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 12675
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: Applications of Machine Learning 2023
Country/Territory: United States
City: San Diego
Period: 23/08/23 – 24/08/23

Keywords

  • Compressive Imaging
  • Deep Learning
  • Neural Networks

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
