Analyzing target detection performance with multispectral fused images

J. Lanir, M. Maltz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

With the advance of multispectral imaging, image fusion has emerged as a new and important research area. Many studies have considered the advantages of specific fusion methods over the individual input bands in terms of human performance, yet few comparative studies have been conducted to determine which fusion method is preferable to another. This paper examines four different fusion methods and compares the performance of human observers viewing the fused images in a target detection task. In the presented experiment, we implemented an approach that has not generally been used in the context of image fusion evaluation: we used the paired comparison technique to assess and scale the subjective value of the fusion methods. Results indicated that the false color and average methods performed best.
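As a rough illustration of the paired comparison scaling mentioned in the abstract (Thurstone's law of comparative judgement, Case V), the sketch below converts a win-count matrix from paired comparisons of four fusion methods into interval scale values. The method names and counts here are invented for illustration only and are not the paper's data.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical win counts: wins[i, j] = number of trials in which method i
# was preferred over method j (illustrative numbers, not the paper's data).
methods = ["average", "false color", "PCA", "wavelet"]
wins = np.array([
    [ 0, 14, 18, 17],
    [16,  0, 19, 18],
    [12, 11,  0, 15],
    [13, 12, 15,  0],
], dtype=float)

trials = wins + wins.T                  # trials per pair of methods
p = np.full_like(wins, 0.5)             # a method "ties" with itself
np.divide(wins, trials, out=p, where=trials > 0)
p = np.clip(p, 0.01, 0.99)              # keep z-scores finite

z = norm.ppf(p)                         # unit normal deviates (Case V)
scale = z.mean(axis=1)                  # average deviate across comparisons
scale -= scale.min()                    # anchor the lowest method at zero

# Methods ranked from most to least preferred on the derived interval scale
for name, s in sorted(zip(methods, scale), key=lambda t: -t[1]):
    print(f"{name:12s} {s:.2f}")
```

Higher scale values indicate stronger observer preference; applied to real paired comparison data, this kind of scaling yields the ranking of fusion methods reported in the paper.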

Original language: English
Title of host publication: Applications of Digital Image Processing XXIX
State: Published - 9 Nov 2006
Event: Applications of Digital Image Processing XXIX - San Diego, CA, United States
Duration: 15 Aug 2006 - 17 Aug 2006

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 6312
ISSN (Print): 0277-786X

Conference

Conference: Applications of Digital Image Processing XXIX
Country/Territory: United States
City: San Diego, CA
Period: 15/08/06 - 17/08/06

Keywords

  • Image fusion
  • Law of comparative judgement
  • Multispectral imaging
  • Paired comparison
  • Target detection

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
