How To Train Your Program: A Probabilistic Programming Pattern for Bayesian Learning From Data

Research output: Working paper/Preprint


Abstract

We present a Bayesian approach to machine learning with probabilistic programs. In our approach, training on available data is implemented as inference on a hierarchical model. The posterior distribution of model parameters is then used to 'stochastically condition' a complementary model, such that inference on new data yields the same posterior distribution of latent parameters for the new data as inference on a hierarchical model over the combination of previously available and new data, at a lower computational cost. We frame the approach as a probabilistic programming design pattern, referred to herein as 'stump and fungus', and illustrate a realization of the pattern on a didactic case study.
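The equivalence claimed in the abstract can be illustrated with a minimal sketch. The paper works with general probabilistic programs and stochastic conditioning; the conjugate Beta-Bernoulli model below is only an analytic stand-in chosen so the two inference routes can be compared exactly. All names (`posterior_params`, `old_data`, `new_data`) are illustrative assumptions, not from the paper.

```python
# Hedged sketch: Beta-Bernoulli stand-in for the abstract's claim that
# conditioning a complementary model on the posterior from old data
# gives the same result as re-running inference on all data combined.

def posterior_params(alpha, beta, data):
    """Conjugate update: Beta(alpha, beta) prior + Bernoulli observations."""
    successes = sum(data)
    return alpha + successes, beta + len(data) - successes

old_data = [1, 1, 0, 1]  # previously available observations
new_data = [0, 1]        # newly arriving observations

# Route A: inference on the combined data (full hierarchical re-run).
a_full, b_full = posterior_params(1, 1, old_data + new_data)

# Route B: use the posterior from old data to condition the model,
# then run inference on the new data only (cheaper: touches less data).
a_old, b_old = posterior_params(1, 1, old_data)
a_inc, b_inc = posterior_params(a_old, b_old, new_data)

assert (a_full, b_full) == (a_inc, b_inc)  # same posterior, lower cost
print(a_full, b_full)  # Beta(5, 3) posterior over the latent rate
```

In the conjugate case the posterior is a closed-form prior for the next round, so the equivalence is exact; the paper's contribution is making the analogous move work in a general probabilistic program, where the posterior is only available as samples.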
Original language: English
DOIs
State: Published - 8 May 2021

Keywords

  • cs.LG
