A growing and pruning method for radial basis function networks

M. Bortman, M. Aladjem

Research output: Contribution to journal › Article › peer-review

86 Scopus citations


A recently published generalized growing and pruning (GGAP) training algorithm for radial basis function (RBF) neural networks is studied and modified. GGAP is a resource-allocating network (RAN) algorithm, which means that a network unit that consistently makes little contribution to the network's performance can be removed during training. GGAP provides a formula for computing the significance of the network units, which requires a d-fold numerical integration for an arbitrary probability density function p(x) of the input data x (x ∈ Rd). In this work, the GGAP formula is approximated using a Gaussian mixture model (GMM) for p(x), and an analytical solution of the approximated unit significance is derived. This makes it possible to employ the modified GGAP for input data having a complex and high-dimensional p(x), which was not possible in the original GGAP. The results of an extensive experimental study show that the modified algorithm outperforms the original GGAP, achieving both a lower prediction error and a reduced complexity of the trained network.
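The key idea, replacing a d-fold numerical integration with a closed form, can be illustrated with a small sketch. For a Gaussian RBF unit, the expected contribution E_p(x)[|α·exp(-‖x-μ‖²/σ²)|] under a GMM p(x) reduces to a sum of Gaussian-against-Gaussian integrals, each of which is analytic. The following Python sketch shows this principle only; the function name, parameterization, and significance measure are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.stats import multivariate_normal


def unit_significance(alpha, mu, sigma, weights, means, covs):
    """Closed-form E_p(x)[|alpha| * exp(-||x - mu||^2 / sigma^2)]
    when p(x) is a Gaussian mixture (illustrative sketch, not the
    paper's exact significance formula)."""
    d = len(mu)
    # A Gaussian RBF is an unnormalized Gaussian with covariance (sigma^2/2) I:
    # exp(-||x - mu||^2 / sigma^2) = (pi sigma^2)^(d/2) * N(x; mu, (sigma^2/2) I)
    scale = (np.pi * sigma ** 2) ** (d / 2)
    rbf_cov = (sigma ** 2 / 2) * np.eye(d)
    total = 0.0
    for w, m, S in zip(weights, means, covs):
        # Gaussian product integral: ∫ N(x; mu, C) N(x; m, S) dx = N(mu; m, C + S)
        total += w * multivariate_normal.pdf(mu, mean=m, cov=rbf_cov + S)
    return abs(alpha) * scale * total
```

Because every term is a Gaussian density evaluation, the significance costs O(d²) per mixture component instead of a d-fold quadrature, which is what makes high-dimensional p(x) tractable.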

Original language: English
Pages (from-to): 1039-1045
Number of pages: 7
Journal: IEEE Transactions on Neural Networks
Issue number: 6
State: Published - 1 Jul 2009


Keywords

  • Gaussian mixture model (GMM)
  • Growing and pruning algorithms
  • Radial basis function (RBF) neural networks
  • Resource-allocating network (RAN)
  • Sequential function approximation

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence


