On the Minimum K-Divergence Estimator

Yair Sorek, Koby Todros

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, we address the problem of robust parameter estimation in the presence of outlying measurements. To that end, we introduce a new divergence, called the K-divergence, that involves a weighted version of the hypothesized log-likelihood function. To down-weight low-density areas, attributed to outliers, the corresponding weight function is a convolved version of the underlying density with a strictly positive smoothing 'K'ernel function that is parameterized by a bandwidth parameter. The resulting minimum K-divergence estimator (MKDE) operates by minimizing the empirical K-divergence w.r.t. the vector parameter of interest. The MKDE utilizes Parzen's non-parametric kernel density estimator, arising from the nature of the weight function, to suppress outliers. We show that, by proper selection of the kernel's bandwidth parameter, the MKDE can attain enhanced estimation performance along with implementation simplicity compared to other robust estimators. The MKDE is illustrated for parameter estimation in a contaminated linear latent variable model, for direction-of-arrival estimation in the presence of intermittent directional jamming, and for robust estimation of location and scatter.
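The abstract does not reproduce the K-divergence objective itself, so the following Python sketch only illustrates the down-weighting principle it describes: each sample's hypothesized log-likelihood is weighted by a Parzen kernel density estimate, so that samples in low-density regions (likely outliers) contribute less to the fit. The Gaussian location-scale model, the bandwidth value h = 0.5, and the helper names parzen_kde and objective are illustrative assumptions, not the paper's exact MKDE.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def parzen_kde(points, data, h):
        # Gaussian-kernel Parzen density estimate at `points` with bandwidth h.
        return norm.pdf((points[:, None] - data[None, :]) / h).mean(axis=1) / h

    def objective(theta, data, h):
        # Weighted negative log-likelihood (a stand-in for the empirical
        # K-divergence): Parzen-KDE weights suppress low-density samples.
        mu, log_sigma = theta
        weights = parzen_kde(data, data, h)
        loglik = norm.logpdf(data, loc=mu, scale=np.exp(log_sigma))
        return -np.sum(weights * loglik) / np.sum(weights)

    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(0.0, 1.0, 190),   # nominal samples
                           rng.normal(8.0, 0.5, 10)])   # 5% outlying samples

    res = minimize(objective, x0=np.array([np.median(data), 0.0]),
                   args=(data, 0.5), method="Nelder-Mead")
    print("location:", res.x[0], "scale:", np.exp(res.x[1]))

Consistent with the abstract, the bandwidth is the key tuning knob: it controls how aggressively low-density samples are down-weighted, and the paper's reported performance gains hinge on selecting it properly; the fixed h above merely stands in for that choice.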

Original language: English
Pages (from-to): 4337-4352
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
Volume: 70
DOIs
State: Published - 1 Jan 2022

Keywords

  • Divergences
  • estimation theory
  • robust statistics

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
