On learning parametric-output HMMs

Aryeh Kontorovich, Boaz Nadler, Roi Weiss

Research output: Contribution to conference › Paper › peer-review

16 Scopus citations


We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by decoupling the learning task into two steps: first estimating the output parameters, and then estimating the hidden state transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show that they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with encouraging empirical results.
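The two-step decoupling described above can be sketched in numpy on a toy two-state HMM with Gaussian outputs. This is an illustrative reconstruction, not the paper's implementation: the mixture parameters are assumed already estimated (step 1), and the paper's convex quadratic program for the transitions is replaced here by a simpler stand-in, namely responsibility-weighted pair counts followed by a Euclidean projection of each row onto the probability simplex. All names and parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ground truth: a 2-state HMM with 1-D Gaussian outputs (illustrative only).
A_true = np.array([[0.8, 0.2],
                   [0.3, 0.7]])          # hidden-state transition matrix
means, sigma = np.array([-2.0, 2.0]), 1.0

# Sample a long output sequence from the HMM.
T = 20000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=A_true[states[t - 1]])
y = rng.normal(means[states], sigma)

# Step 1 (assumed done): parameters of the stationary output mixture.
# Here we reuse the true means/sigma; in practice they would come from an
# EM fit of a Gaussian mixture to the pooled outputs.
evals, evecs = np.linalg.eig(A_true.T)   # stationary distribution of A_true
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Posterior responsibilities r_t(i) = P(state = i | y_t) under the mixture.
dens = np.exp(-0.5 * ((y[:, None] - means[None, :]) / sigma) ** 2)
post = pi * dens
post /= post.sum(axis=1, keepdims=True)

# Step 2: estimate transitions from consecutive-pair statistics, then enforce
# the stochasticity constraints by projecting each row onto the simplex
# (a simplified stand-in for the paper's convex quadratic program).
C = post[:-1].T @ post[1:]               # expected consecutive-pair counts
A_hat = C / C.sum(axis=1, keepdims=True)

def project_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1), 0.0)

A_hat = np.array([project_simplex(row) for row in A_hat])
print(np.round(A_hat, 2))                # should be close to A_true
```

Because the two Gaussian components here are well separated, the responsibilities are close to hard state indicators and the recovered matrix lands near `A_true`; with heavily overlapping components this naive pair-count estimator is biased, which is precisely where the paper's QP formulation and its perturbation analysis matter.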

Original language: English
Number of pages: 9
State: Published - 1 Jan 2013
Event: 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States
Duration: 16 Jun 2013 – 21 Jun 2013


Conference: 30th International Conference on Machine Learning, ICML 2013
Country/Territory: United States
City: Atlanta, GA

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Sociology and Political Science

