Regularized mixture density estimation with an analytical setting of shrinkage intensities

Zohar Halbe, Maria Bortman, Mayer Aladjem

Research output: Contribution to journal › Article › peer-review

12 Scopus citations

Abstract

In this paper, we propose a method for P-variate probability density estimation assuming a Gaussian mixture model (GMM). Our method exploits a regularization technique for improving the estimation accuracy of the GMM component covariance matrices. We derive an expectation maximization algorithm for fitting our regularized GMM (RGMM), which exploits an analytical Ledoit-Wolf-type shrinkage estimation of the covariance matrices. Our method is compared with recent model-based and variational Bayes approximation methods on synthetic and real data sets. The results show that the proposed RGMM significantly outperforms the other methods in multivariate probability density estimation on both the synthetic and the real data sets.
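The core regularization idea in the abstract is a Ledoit-Wolf-type shrinkage of a covariance estimate toward a scaled identity target, with the shrinkage intensity set analytically rather than by cross-validation. As a rough illustration only (not the paper's exact RGMM estimator, which applies shrinkage per mixture component inside EM with responsibility-weighted statistics), the following sketch implements the classical Ledoit-Wolf (2004) analytic intensity for a single sample covariance; the function name and interface are assumptions for this example:

```python
import numpy as np

def ledoit_wolf_shrinkage(X):
    """Analytic Ledoit-Wolf shrinkage toward a scaled identity target.

    X : (n, p) data matrix.
    Returns (sigma, lam) with sigma = (1 - lam) * S + lam * mu * I,
    where S is the sample covariance and mu = trace(S) / p.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                              # sample covariance (MLE)
    mu = np.trace(S) / p                           # scale of the identity target
    d2 = np.sum((S - mu * np.eye(p)) ** 2) / p     # dispersion of S around target
    # average squared distance of per-sample outer products from S
    b2_bar = sum(np.sum((np.outer(x, x) - S) ** 2) for x in Xc) / (n ** 2 * p)
    b2 = min(b2_bar, d2)
    lam = b2 / d2 if d2 > 0 else 1.0               # analytic intensity in [0, 1]
    sigma = (1 - lam) * S + lam * mu * np.eye(p)
    return sigma, lam
```

Because `lam` has a closed form, the regularized covariance can be recomputed cheaply at every EM M-step, which is what makes an analytical setting of the shrinkage intensities attractive for mixture fitting.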

Original language: English
Pages (from-to): 460-470
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 24
Issue number: 3
DOIs
State: Published - 8 Oct 2013

Keywords

  • Expectation maximization (EM) algorithm
  • Gaussian mixture model (GMM)
  • Model selection
  • Multivariate density estimation
  • Regularization
  • Shrinkage estimation

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
