How To Train Your Program

David Tolpin

Research output: Working paper/Preprint


Abstract

We present a Bayesian approach to machine learning with probabilistic programs. In our approach, training on available data is implemented as inference on a hierarchical model. The posterior distribution of model parameters is then used to stochastically condition a complementary model, such that inference on new data yields the same posterior distribution of latent parameters for the new data as inference on the hierarchical model over the combination of previously available and new data, at a lower computational cost. We frame the approach as a design pattern of probabilistic programming, referred to herein as 'stump and fungus', and illustrate the realization of the pattern on a didactic case study.
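
To make the pattern concrete, here is a minimal sketch in Python (NumPy only). The model, names, and numbers are illustrative assumptions, not the paper's case study: a Normal-Normal hierarchy with known variances tau and sigma and a flat prior on the shared mean mu. The 'stump' is the posterior over mu obtained by training on the available groups; the 'fungus' stochastically conditions inference for a new group on samples drawn from that posterior, so only the new data is processed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model (not the paper's case study):
#   theta_i ~ Normal(mu, tau^2)        -- per-group latent mean
#   y_ij    ~ Normal(theta_i, sigma^2) -- observations within group i
tau, sigma = 1.0, 0.5                  # assumed known in this sketch
mu_true = 2.0
train = [rng.normal(rng.normal(mu_true, tau), sigma, size=20) for _ in range(8)]

# "Stump": training = inference on the hierarchical model.  With known
# tau, sigma and a flat prior on mu, the posterior of mu given the group
# sample means is conjugate, so posterior samples are drawn directly.
means = np.array([g.mean() for g in train])
n = train[0].size
var_mean = tau**2 + sigma**2 / n       # variance of a group mean around mu
mu_samples = rng.normal(means.mean(), np.sqrt(var_mean / len(train)), 1000)

# "Fungus": for a new group, stochastically condition the complementary
# model on the stump's posterior samples instead of re-running inference
# over the combined old and new data.
y_new = rng.normal(rng.normal(mu_true, tau), sigma, size=5)

def theta_post(y, mu):
    """Conjugate posterior (mean, variance) of a group mean theta given mu."""
    prec = 1.0 / tau**2 + len(y) / sigma**2
    return (mu / tau**2 + y.sum() / sigma**2) / prec, 1.0 / prec

# Mixing the conditional posteriors over the mu draws approximates the full
# hierarchical posterior of theta_new, touching only the new data.
theta_samples = np.array(
    [rng.normal(m, np.sqrt(v)) for m, v in (theta_post(y_new, mu) for mu in mu_samples)]
)
print("theta_new: mean %.3f, sd %.3f" % (theta_samples.mean(), theta_samples.std()))
```

Averaging over the stump's posterior samples of mu is what the abstract calls stochastic conditioning: the new group's posterior matches what joint inference on the combined data would give, at the cost of inference over the new group alone.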
Original language: English (GB)
State: Published - 8 May 2021

Publication series

Name: arXiv cs.LG

Keywords

  • cs.LG
