TY - UNPB
T1 - How To Train Your Program
T2 - A Probabilistic Programming Pattern for Bayesian Learning From Data
AU - Tolpin, David
N1 - Submitted to PROBPROG 2021
PY - 2021/05/08
Y1 - 2021/05/08
N2 - We present a Bayesian approach to machine learning with probabilistic programs. In our approach, training on available data is implemented as inference on a hierarchical model. The posterior distribution of model parameters is then used to stochastically condition a complementary model, such that inference on new data yields the same posterior distribution of the latent parameters corresponding to the new data as inference on a hierarchical model on the combination of previously available and new data, at a lower computational cost. We frame the approach as a design pattern of probabilistic programming, referred to herein as 'stump and fungus', and illustrate a realization of the pattern on a didactic case study.
AB - We present a Bayesian approach to machine learning with probabilistic programs. In our approach, training on available data is implemented as inference on a hierarchical model. The posterior distribution of model parameters is then used to stochastically condition a complementary model, such that inference on new data yields the same posterior distribution of the latent parameters corresponding to the new data as inference on a hierarchical model on the combination of previously available and new data, at a lower computational cost. We frame the approach as a design pattern of probabilistic programming, referred to herein as 'stump and fungus', and illustrate a realization of the pattern on a didactic case study.
KW - cs.LG
U2 - 10.48550/arXiv.2105.03650
DO - 10.48550/arXiv.2105.03650
M3 - Preprint
BT - How To Train Your Program
ER -