Robust Parameter Estimation Based on the K-Divergence.

Yair Sorek, Koby Todros

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation


In this paper we present a new divergence, called the K-divergence, that involves a weighted version of the hypothesized log-likelihood function. To down-weight low-density areas attributed to outliers, the weight function is obtained by convolving the underlying density with a strictly positive smoothing "K"ernel function parameterized by a bandwidth parameter. The resulting minimum K-divergence estimator (MKDE) operates by minimizing the empirical K-divergence w.r.t. the vector parameter of interest. Owing to the structure of the weight function, the MKDE employs Parzen's non-parametric kernel density estimator to suppress outliers. We show that, with proper selection of the kernel's bandwidth parameter, the MKDE attains enhanced estimation performance along with implementation simplicity compared to other robust estimators.
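The abstract's core idea can be illustrated with a small sketch. The paper's exact K-divergence objective is not reproduced here; the code below implements only the general mechanism the abstract describes — a log-likelihood weighted by a Parzen kernel density estimate, so that samples in low-density regions (outliers) contribute little — on a toy Gaussian location problem. The bandwidth value `h`, the unit-variance model, and all function names are illustrative assumptions, not the paper's notation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Inlier samples from N(0, 1) plus a small cluster of outliers far away.
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(8.0, 0.5, 20)])


def parzen_weights(x, h):
    """Parzen (Gaussian-kernel) density estimate evaluated at each sample."""
    d = (x[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * d**2) / np.sqrt(2.0 * np.pi)
    return k.mean(axis=1) / h  # (1 / (n*h)) * sum_j K((x_i - x_j) / h)


def weighted_nll(mu, x, w):
    """Density-weighted Gaussian negative log-likelihood (unit variance)."""
    return float((w * 0.5 * (x - mu) ** 2).sum())


h = 0.5                       # kernel bandwidth (tuning parameter)
w = parzen_weights(x, h)      # low-density (outlier) points get small weight
mu_hat = minimize_scalar(lambda m: weighted_nll(m, x, w)).x
mu_mean = x.mean()            # non-robust baseline (pulled toward outliers)
```

Because the outlier cluster sits in a low-density region of the Parzen estimate, its samples receive small weights, and `mu_hat` stays much closer to the true location (0) than the plain sample mean.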
Original language: English
Title of host publication: ICASSP
Publisher: Institute of Electrical and Electronics Engineers
Number of pages: 5
ISBN (Electronic): 9781665405409
State: Published - May 2022
Event: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Virtual, Online, Singapore
Duration: 23 May 2022 - 27 May 2022

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149


Conference: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022
City: Virtual, Online


Keywords

  • Smoothing methods
  • Parameter estimation
  • Density measurement
  • Signal processing
  • Bandwidth
  • Estimation
  • Robust statistics
  • Estimation theory
  • Divergences


