Bayesian network classification using spline-approximated kernel density estimation

Yaniv Gurwicz, Boaz Lerner

Research output: Contribution to journal › Article › peer-review

8 Scopus citations


The likelihoods of patterns of continuous features needed for probabilistic inference in a Bayesian network classifier (BNC) may be computed by kernel density estimation (KDE), letting every pattern influence the shape of the probability density. Although usually accurate, KDE suffers from a computational cost that makes it impractical in many real-world applications. We smooth the density using a spline, so that the estimation requires only a few coefficients rather than the whole training set, allowing rapid implementation of the BNC without sacrificing classifier accuracy. Experiments conducted on several real-world databases reveal an acceleration in computational speed, sometimes by several orders of magnitude, in favor of our method, making the application of KDE to BNCs practical.
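The idea in the abstract can be illustrated with a minimal sketch: evaluate a Gaussian KDE once at a small set of knots, then answer all subsequent likelihood queries by interpolating between those precomputed values instead of summing over the whole training set. This is only an assumption-laden illustration, not the paper's method — it uses piecewise-linear interpolation in place of the authors' spline, a fixed bandwidth `h`, and an arbitrary knot count.

```python
import math
import bisect

def kde(x, samples, h):
    # Exact Gaussian kernel density estimate at x: O(n) per query.
    c = 1.0 / (len(samples) * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

def fit_approx(samples, h, n_knots=64):
    # Evaluate the exact KDE once at n_knots equally spaced points;
    # these few values are all we keep (cf. "very few coefficients").
    lo, hi = min(samples) - 3.0 * h, max(samples) + 3.0 * h
    knots = [lo + i * (hi - lo) / (n_knots - 1) for i in range(n_knots)]
    values = [kde(k, samples, h) for k in knots]
    return knots, values

def eval_approx(x, knots, values):
    # O(log n_knots) query via piecewise-linear interpolation,
    # independent of the training-set size.
    if x <= knots[0]:
        return values[0]
    if x >= knots[-1]:
        return values[-1]
    i = bisect.bisect_right(knots, x) - 1
    t = (x - knots[i]) / (knots[i + 1] - knots[i])
    return values[i] * (1.0 - t) + values[i + 1] * t
```

A BNC would call `eval_approx` once per continuous feature when computing a class-conditional likelihood; the cost per query no longer grows with the number of training patterns, which is the source of the reported speed-up.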

Original language: English
Pages (from-to): 1761-1771
Number of pages: 11
Journal: Pattern Recognition Letters
Issue number: 11
State: Published - 1 Aug 2005


Keywords

  • Bayesian networks
  • Classification
  • Kernel density estimation
  • Naïve Bayesian classifier
  • Spline

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
