Abstract
In this paper we present a new divergence, called the K-divergence, that involves a weighted version of the hypothesized log-likelihood function. To down-weight low-density areas, attributed to outliers, the corresponding weight function is a convolved version of the underlying density with a strictly positive smoothing "K"ernel function parameterized by a bandwidth parameter. The resulting minimum K-divergence estimator (MKDE) operates by minimizing the empirical K-divergence with respect to the vector parameter of interest. By the nature of its weight function, the MKDE utilizes Parzen's non-parametric kernel density estimator to suppress outliers. We show that, by proper selection of the kernel's bandwidth parameter, the MKDE can attain enhanced estimation performance along with implementation simplicity compared to other robust estimators.
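The down-weighting idea in the abstract can be sketched numerically: each sample's log-likelihood term is weighted by a Parzen (kernel) density estimate, so samples in low-density regions contribute little to the objective. This is a minimal illustrative sketch only, not the paper's exact K-divergence objective (which contains additional terms); the Gaussian model, the bandwidth `h=0.5`, the grid search, and all function names are assumptions made for illustration.

```python
import numpy as np

# Toy data: inliers from N(0, 1) plus a small cluster of outliers near 8.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(8.0, 0.5, 10)])

def parzen_weights(x, h):
    """Parzen (Gaussian-kernel) density estimate evaluated at each sample."""
    d = (x[:, None] - x[None, :]) / h
    return np.exp(-0.5 * d ** 2).sum(axis=1) / (len(x) * h * np.sqrt(2.0 * np.pi))

def weighted_nll(mu, x, w):
    """Per-sample-weighted negative Gaussian log-likelihood (unit variance)."""
    ll = -0.5 * (x - mu) ** 2 - 0.5 * np.log(2.0 * np.pi)
    return -np.sum(w * ll)

# Low-density samples (the outliers) receive small weights.
w = parzen_weights(x, h=0.5)

# Minimize the weighted objective over a grid of candidate means.
grid = np.linspace(-2.0, 10.0, 2001)
mu_hat = grid[np.argmin([weighted_nll(m, x, w) for m in grid])]

print(f"sample mean: {x.mean():.3f}, weighted estimate: {mu_hat:.3f}")
```

The plain sample mean is pulled toward the outlier cluster, while the density-weighted estimate stays near the inlier center, mirroring the robustness mechanism the abstract describes.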
Original language | English |
---|---|
Title of host publication | ICASSP |
Pages | 5767-5771 |
Number of pages | 5 |
DOIs | |
State | Published - 2022 |
Keywords
- Smoothing methods
- Parameter estimation
- Density measurement
- Conferences
- Signal processing
- Bandwidth
- Estimation
- Robust statistics
- Estimation theory
- Divergences