
Stochastically Differentiable Probabilistic Programs

  • David Tolpin
  • Yuan Zhou
  • Hongseok Yang

    Research output: Working paper/Preprint

    Abstract

    Probabilistic programs with mixed support (both continuous and discrete latent random variables) commonly appear in many probabilistic programming systems (PPSs). However, the presence of discrete random variables precludes the use of many basic gradient-based inference engines, which makes inference on such models particularly challenging. Existing PPSs either require the user to manually marginalize out the discrete variables, or compose inference by running separate inference procedures on the discrete and continuous variables. The former is infeasible in most cases, whereas the latter has some fundamental shortcomings. We present a novel approach to running inference efficiently and robustly in such programs using the stochastic gradient Markov chain Monte Carlo family of algorithms. We compare our stochastic gradient-based inference algorithm against conventional baselines in several important cases of probabilistic programs with mixed support, and demonstrate that it outperforms existing composed-inference baselines and works almost as well as inference in marginalized versions of the programs, but with less programming effort and at lower computational cost.
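    The contrast between marginalization and stochastic gradient estimation can be illustrated with a minimal sketch (not from the paper; the toy model, function names, and parameter values are illustrative assumptions). For a model with one Bernoulli latent z and a continuous parameter theta, the exact gradient of the marginal log-likelihood equals the expectation, over the posterior of z, of the per-branch gradient, so sampling z yields an unbiased stochastic gradient of the kind usable by stochastic gradient MCMC:

    ```python
    import math
    import random

    # Toy mixed-support model (illustrative assumption, not the paper's example):
    #   z ~ Bernoulli(0.5)            discrete latent variable
    #   y ~ Normal(mu_z, 1)           continuous observation, mu_0 = 0, mu_1 = theta

    def log_normal_pdf(y, mu, sigma=1.0):
        """Log density of Normal(mu, sigma) at y."""
        return -0.5 * math.log(2 * math.pi * sigma**2) - (y - mu)**2 / (2 * sigma**2)

    def branch_grad(theta, y, z):
        """d/dtheta log p(y | z, theta); only the z = 1 branch depends on theta."""
        return (y - theta) if z == 1 else 0.0

    def posterior_z1(theta, y):
        """Posterior weight of z = 1 given y (exact, since z is binary)."""
        p0 = 0.5 * math.exp(log_normal_pdf(y, 0.0))
        p1 = 0.5 * math.exp(log_normal_pdf(y, theta))
        return p1 / (p0 + p1)

    def marginalized_grad(theta, y):
        """Exact gradient of log p(y | theta) after summing out z."""
        return posterior_z1(theta, y) * (y - theta)

    def stochastic_grad(theta, y, n=20000, seed=0):
        """Monte Carlo estimate: sample z from its posterior, average branch gradients."""
        rng = random.Random(seed)
        w1 = posterior_z1(theta, y)
        total = 0.0
        for _ in range(n):
            z = 1 if rng.random() < w1 else 0
            total += branch_grad(theta, y, z)
        return total / n

    exact = marginalized_grad(1.5, 2.0)
    approx = stochastic_grad(1.5, 2.0)
    print(exact, approx)  # the estimates agree up to Monte Carlo error
    ```

    In the binary case the marginalization is trivial; the point of the stochastic approach is that sampling the discrete variables scales to programs where exact summation over all discrete configurations is infeasible.
    
    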
    Original language: English
    State: Published - 5 May 2020
