Robust Bayesian estimation via the K-divergence

Yair Sorek, Koby Todros

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we introduce a novel framework for robust Bayesian parameter estimation using the K-divergence. The framework incorporates an outlier-resilient pseudo-posterior density function, called the K-posterior, which is based on an empirical version of the K-divergence. The latter utilizes Parzen's non-parametric kernel density estimator to mitigate the influence of outliers. Under the quadratic loss, a new robust analog of the posterior mean estimator (PME), referred to here as the KPME, is obtained. In the paper, we examine the asymptotic behavior of the KPME and investigate its robustness in the presence of outliers. Furthermore, we tackle the task of data-guided selection of the kernel's bandwidth parameter in order to optimize a performance-oriented objective. Lastly, the KPME is successfully applied to robust Bayesian source localization under intermittent jamming.
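A core ingredient named in the abstract is Parzen's non-parametric kernel density estimator, whose low density at aberrant observations is what lets divergence-based procedures down-weight outliers. The sketch below illustrates only this standard estimator with a Gaussian kernel, not the paper's K-divergence or KPME; the helper name `parzen_kde` and the use of Silverman's rule of thumb for the bandwidth are illustrative assumptions.

```python
import numpy as np

def parzen_kde(samples, query, bandwidth):
    """Gaussian-kernel Parzen density estimate evaluated at each query point."""
    samples = np.asarray(samples, dtype=float)
    query = np.asarray(query, dtype=float)
    # Pairwise standardized differences: shape (n_query, n_samples)
    z = (query[:, None] - samples[None, :]) / bandwidth
    kernel = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    # Average the kernels over the samples and rescale by the bandwidth
    return kernel.mean(axis=1) / bandwidth

# Clean Gaussian data plus one gross outlier; the KDE assigns the
# outlier very low density, the property robust fits exploit.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), [25.0]])
h = 1.06 * x.std() * len(x) ** (-1 / 5)  # Silverman's rule of thumb (assumed choice)
density = parzen_kde(x, np.array([0.0, 25.0]), h)
```

In the paper, the bandwidth is instead selected in a data-guided way to optimize a performance-oriented objective; the rule-of-thumb above is just a convenient stand-in for the sketch.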

Original language: English
Article number: 109440
Journal: Signal Processing
Volume: 220
DOIs
State: Published - 1 Jul 2024

Keywords

  • Bayesian estimation
  • Divergences
  • Estimation theory
  • Non-parametric density estimation
  • Robust statistics

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
