Neural Estimation of Statistical Divergences

Sreejith Sreekumar, Ziv Goldfeld

Research output: Contribution to journal › Article › peer-review

13 Scopus citations


Statistical divergences (SDs), which quantify the dissimilarity between probability distributions, are a basic constituent of statistical inference and machine learning. A modern method for estimating these divergences relies on parametrizing an empirical variational form by a neural network (NN) and optimizing over the parameter space. Such neural estimators are widely used in practice, but the corresponding performance guarantees are partial and call for further exploration. We establish non-asymptotic absolute error bounds for a neural estimator realized by a shallow NN, focusing on four popular f-divergences: Kullback-Leibler, chi-squared, squared Hellinger, and total variation. Our analysis relies on non-asymptotic function approximation theorems and tools from empirical process theory to bound the two sources of error involved: function approximation and empirical estimation. The bounds characterize the effective error in terms of the NN size and the number of samples, and reveal scaling rates that ensure consistency. For compactly supported distributions, we further show that neural estimators of the first three divergences above with an appropriate NN growth rate are minimax rate-optimal, achieving the parametric convergence rate.
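The estimation approach described in the abstract can be illustrated with a small self-contained sketch (not the paper's code): the KL divergence is estimated from samples via its Donsker-Varadhan variational form, KL(P||Q) = sup_f E_P[f] − log E_Q[exp f], with the potential f parametrized by a shallow (one-hidden-layer) network. The toy distributions (P = N(0,1), Q = N(1,1), true KL = 0.5), the tanh network of width 8, and the finite-difference gradient ascent are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy distributions: P = N(0,1), Q = N(1,1); the population KL(P||Q) is 0.5.
n = 1000
xp = rng.normal(0.0, 1.0, n)
xq = rng.normal(1.0, 1.0, n)

H = 8  # hidden width of the shallow network (illustrative choice)

def f(theta, x):
    # Shallow NN potential: f(x) = w2 . tanh(w1 * x + b1) + b2
    w1, b1, w2 = theta[:H], theta[H:2 * H], theta[2 * H:3 * H]
    b2 = theta[3 * H]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def dv_objective(theta):
    # Empirical Donsker-Varadhan lower bound on KL(P||Q):
    #   mean_P[f] - log mean_Q[exp f]
    return f(theta, xp).mean() - np.log(np.exp(f(theta, xq)).mean())

# Gradient ascent with central finite differences (chosen for brevity;
# a real implementation would use automatic differentiation).
theta = 0.1 * rng.standard_normal(3 * H + 1)
eps, lr = 1e-4, 0.3
for _ in range(800):
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        grad[i] = (dv_objective(theta + d) - dv_objective(theta - d)) / (2 * eps)
    theta += lr * grad

est = dv_objective(theta)
print(f"neural DV estimate of KL: {est:.3f}  (population value: 0.5)")
```

The two error sources analyzed in the paper are visible here: the restricted function class (a width-8 shallow NN) induces approximation error, while using n samples in place of the expectations induces empirical estimation error.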

Original language: English
Journal: Journal of Machine Learning Research
State: Published - 1 Jan 2022
Externally published: Yes


Keywords

  • Approximation theory
  • empirical process theory
  • f-divergence
  • minimax estimation
  • neural estimation
  • neural network
  • statistical divergence
  • variational form

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability


