Robust Parameter Estimation Based on the K-Divergence.

Yair Sorek, Koby Todros

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, we present a new divergence, called the K-divergence, that involves a weighted version of the hypothesized log-likelihood function. To down-weight low-density areas, attributed to outliers, the corresponding weight function is a convolved version of the underlying density with a strictly positive smoothing "K"ernel function parameterized by a bandwidth parameter. The resulting minimum K-divergence estimator (MKDE) operates by minimizing the empirical K-divergence w.r.t. the vector parameter of interest. Owing to the structure of the weight function, the MKDE utilizes Parzen's non-parametric kernel density estimator to suppress outliers. We show that, by proper selection of the kernel's bandwidth parameter, the MKDE can attain enhanced estimation performance along with implementation simplicity compared to other robust estimators.
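To make the mechanism described above concrete, here is a minimal, hypothetical Python sketch. It is not the paper's actual K-divergence objective; it only mirrors the idea stated in the abstract, weighting a hypothesized log-likelihood by a Parzen kernel density estimate of the samples so that low-density samples attributed to outliers contribute less. The Gaussian model, the fixed bandwidth of 0.5, and the names parzen_kde and mkde_objective are all illustrative assumptions.

```python
# Illustrative sketch only: weights a Gaussian log-likelihood by a Parzen
# kernel density estimate of the data, down-weighting low-density samples.
# The exact K-divergence objective is defined in the paper, not here.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def parzen_kde(x, data, bandwidth):
    """Parzen window density estimate with a Gaussian kernel."""
    # (1 / (n * h)) * sum_i K((x - x_i) / h), averaged over all samples.
    return norm.pdf((x[:, None] - data[None, :]) / bandwidth).mean(axis=1) / bandwidth

def mkde_objective(theta, data, weights):
    """Weighted negative log-likelihood for a Gaussian model (mu, log_sigma)."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # log-parameterization keeps sigma > 0
    loglik = norm.logpdf(data, loc=mu, scale=sigma)
    # Hypothesized weighting: each sample's log-likelihood is scaled by the
    # KDE value at that sample, so outliers in low-density regions count less.
    return -np.sum(weights * loglik)

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=200)
outliers = rng.normal(8.0, 0.5, size=20)  # contamination
data = np.concatenate([clean, outliers])

weights = parzen_kde(data, data, bandwidth=0.5)  # bandwidth chosen ad hoc
res = minimize(mkde_objective,
               x0=np.array([np.median(data), 0.0]),
               args=(data, weights),
               method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"weighted estimate:        mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
print(f"sample mean (non-robust): mu = {data.mean():.3f}")
```

Running this sketch, the KDE-weighted estimate of the mean stays close to the clean cluster, while the plain sample mean is pulled noticeably toward the contaminating cluster, which is the down-weighting behavior the abstract describes.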
Original language: English
Title of host publication: ICASSP
Pages: 5767-5771
Number of pages: 5
State: Published - 2022

Keywords

  • Smoothing methods
  • Parameter estimation
  • Density measurement
  • Conferences
  • Signal processing
  • Bandwidth
  • Estimation
  • Robust statistics
  • Estimation theory
  • Divergences
