Hidden Markov Models
A Hidden Markov Model (HMM) is often used for sequential data: the words of a sentence, the sentences of a document, the phonemes of speech, financial market prices over time, and so on. HMMs are typically used when one wants to estimate the value of a variable over time, given imperfect observations.
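To make this concrete, an HMM is fully specified by an initial state distribution, a transition matrix over hidden states, and an emission matrix linking hidden states to observations. The following sketch uses a hypothetical two-state weather model (the states, observations, and all numbers are illustrative assumptions, not taken from the text):

```python
import numpy as np

# Hypothetical example: the hidden state is the true weather, and the
# imperfect observation is whether a person carries an umbrella.
states = ["Rainy", "Sunny"]
observations = ["umbrella", "no_umbrella"]

# Initial state distribution: pi[i] = P(state_0 = i)
pi = np.array([0.6, 0.4])

# Transition matrix: A[i, j] = P(state_{t+1} = j | state_t = i)
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Emission matrix: B[i, k] = P(observation_t = k | state_t = i)
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Each row of A and B is a conditional distribution, so rows must sum to 1.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
```

All of the inference algorithms described below operate on exactly these three quantities.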
Here we describe the theory behind HMMs, as well as some sample applications. Often, the values of some variables connected by an HMM are known, while others must be estimated. Three different algorithms are available for performing different types of inference on HMMs:
- The Forward-Backward Algorithm: When one wants to estimate a distribution over an individual variable, i.e. Marginal Inference.
- The Viterbi Algorithm: When one wants to jointly estimate the most likely value for all variables simultaneously, i.e. Maximum A Posteriori (MAP) inference.
- The Markov Chain Monte Carlo (MCMC) Algorithm: When one wants to estimate confidence intervals on variable values, or when the dimensionality of each variable is very high, this approximation algorithm may be the best choice.
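As an example of the second option, the Viterbi algorithm can be sketched in a few lines of NumPy. This is a standard log-space dynamic program, not an implementation from the text; the two-state model at the end uses made-up numbers purely for illustration:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Return the most likely hidden-state sequence (MAP estimate).

    obs: list of observation indices
    pi:  initial state distribution, shape (n_states,)
    A:   transition matrix, A[i, j] = P(j | i)
    B:   emission matrix, B[i, k] = P(obs k | state i)
    """
    n_states, T = A.shape[0], len(obs)
    with np.errstate(divide="ignore"):  # log(0) -> -inf is acceptable here
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    # delta[t, i]: log-probability of the best path ending in state i at time t
    delta = np.zeros((T, n_states))
    backptr = np.zeros((T, n_states), dtype=int)
    delta[0] = log_pi + log_B[:, obs[0]]

    for t in range(1, T):
        # scores[i, j]: best path through state i at t-1, then transition to j
        scores = delta[t - 1][:, None] + log_A
        backptr[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]

    # Trace back from the most likely final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Illustrative two-state model (states: 0 = Rainy, 1 = Sunny).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
best = viterbi([0, 0, 1], pi, A, B)  # -> [0, 0, 1]
```

Working in log space avoids the numerical underflow that occurs when many small probabilities are multiplied over a long sequence; the forward-backward algorithm is structured similarly but sums over paths instead of maximizing.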