Abstract
In this paper, we propose a method for P-variate probability density estimation assuming a Gaussian mixture model (GMM). Our method uses a regularization technique to improve the estimation accuracy of the GMM component covariance matrices. We derive an expectation-maximization algorithm for fitting our regularized GMM (RGMM), which exploits an analytical Ledoit-Wolf-type shrinkage estimation of the covariance matrices. Our method is compared with recent model-based and variational Bayes approximation methods on synthetic and real data sets. The results show that the proposed RGMM method achieves a significant improvement in multivariate probability density estimation over the competing methods on both the synthetic and the real data sets.
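The Ledoit-Wolf-type shrinkage mentioned in the abstract can be illustrated with a minimal sketch. The code below is not the paper's exact M-step update; it shows the standard analytical Ledoit-Wolf estimator, which shrinks the sample covariance toward a scaled identity with a closed-form intensity, and the function name is hypothetical:

```python
import numpy as np

def ledoit_wolf_shrinkage(X):
    """Analytical Ledoit-Wolf shrinkage of the sample covariance (sketch).

    Shrinks the sample covariance S toward the scaled identity m*I using
    the closed-form intensity of Ledoit & Wolf (2004). Returns the shrunk
    covariance estimate and the shrinkage intensity rho in [0, 1].
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                      # center the data
    S = Xc.T @ Xc / n                            # (biased) sample covariance
    m = np.trace(S) / p                          # target scale: mean eigenvalue
    d2 = np.sum((S - m * np.eye(p)) ** 2) / p    # dispersion of S around m*I
    # average squared deviation of per-sample outer products from S
    b2_bar = np.mean([np.sum((np.outer(x, x) - S) ** 2) for x in Xc]) / (n * p)
    b2 = min(b2_bar, d2)                         # estimation error, capped by d2
    rho = b2 / d2                                # analytical shrinkage intensity
    return (1.0 - rho) * S + rho * m * np.eye(p), rho
```

In an EM setting, an update of this form would replace the plain sample covariance of each mixture component in the M-step, keeping the component covariance estimates well conditioned when the dimension is large relative to the component's effective sample size.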
Original language | English |
---|---|
Pages (from-to) | 460-470 |
Number of pages | 11 |
Journal | IEEE Transactions on Neural Networks and Learning Systems |
Volume | 24 |
Issue number | 3 |
DOIs | |
State | Published - 8 Oct 2013 |
Keywords
- Expectation maximization (EM) algorithm
- Gaussian mixture model (GMM)
- Model selection
- Multivariate density estimation
- Regularization
- Shrinkage estimation
ASJC Scopus subject areas
- Software
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence