Hidden Markov Models
- Models sequential data by assuming an underlying sequence of unobserved (hidden) states that generate observable outputs.
- Uses transition probabilities between hidden states and emission probabilities from states to observations.
- Enables inference about the most likely hidden state sequence or the probability of a state given observed data.
Definition
A hidden Markov model (HMM) is a statistical model that assigns probabilities to sequences of observations by assuming they are generated by a sequence of underlying hidden states.
Explanation
An HMM defines:
- A set of hidden states that are not directly observable.
- A set of observations that are emitted probabilistically from those hidden states.
- Transition probabilities governing how the system moves between hidden states over time.
- Emission probabilities specifying how likely each observation is given the current hidden state.
Together with an initial distribution over hidden states, these components fully specify the model. Given a sequence of observations, an HMM can be used to compute the likelihood of that sequence, to infer the probability of being in a particular hidden state at a given time, or to determine the most likely sequence of hidden states that produced the observations.
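As a sketch of how these pieces fit together, the following Python snippet implements the forward algorithm, which computes the likelihood of an observation sequence. All parameter values here are made up for illustration; a real model would learn them from data.

```python
import numpy as np

# Illustrative HMM with two hidden states and two observation symbols.
# Rows of `trans` are the current state; columns are the next state.
trans = np.array([[0.7, 0.3],   # P(next state | state 0)
                  [0.4, 0.6]])  # P(next state | state 1)
emit = np.array([[0.9, 0.1],    # P(observation | state 0)
                 [0.2, 0.8]])   # P(observation | state 1)
init = np.array([0.5, 0.5])     # initial state distribution

def forward(obs):
    """Forward algorithm: return P(obs) under the model."""
    # alpha[i] = P(observations so far, current state = i)
    alpha = init * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return alpha.sum()
```

Summing the final forward variables over all states gives the total probability of the observation sequence; the same recursion underlies the state-inference tasks described above.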
Examples
Dog barking example
Consider a dog with two hidden states: barking and not barking. Observations are either barking noises or silence. The HMM specifies:
- Higher transition probability for the dog to remain in the barking state than to switch from barking to not barking.
- Higher emission probability of barking noises than of silence from the barking state.
Given observed noises, the HMM can compute the probability that the dog is currently barking, based on previous observations and the hidden-state transitions.
Speech recognition example
In speech recognition, hidden states represent phonemes (the smallest units of sound) and observations are the acoustic signals produced by the speaker. For example, if the hidden state is the phoneme “b”, the HMM assigns a higher probability to observations corresponding to the sound “b” than to the sound “a”. The model accounts for variation in how a phoneme can be produced in different contexts (for instance, “b” in “bat” versus “bit”) by using state-dependent emission probabilities to improve phoneme prediction.
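Recovering a phoneme sequence from acoustic observations is a decoding problem, typically solved with the Viterbi algorithm. The sketch below uses a toy two-state model standing in for two phonemes and two acoustic observation classes; all parameter values are invented for illustration:

```python
import numpy as np

# Toy stand-in for two phoneme states and two acoustic observation
# classes; the numbers are illustrative, not learned from speech data.
trans = np.array([[0.6, 0.4],
                  [0.5, 0.5]])
emit = np.array([[0.8, 0.2],
                 [0.3, 0.7]])
init = np.array([0.5, 0.5])

def viterbi(obs):
    """Return the most likely hidden-state sequence for the observations."""
    # Work in log space to avoid underflow on long sequences.
    delta = np.log(init) + np.log(emit[:, obs[0]])
    backptr = []
    for o in obs[1:]:
        # scores[i, j]: best path score ending with transition i -> j
        scores = delta[:, None] + np.log(trans)
        backptr.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + np.log(emit[:, o])
    # Trace back from the best final state.
    path = [int(delta.argmax())]
    for bp in reversed(backptr):
        path.append(int(bp[path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1, 1]))  # -> [0, 0, 1, 1]
```

In a real recognizer the state space covers many phonemes and the emissions are continuous acoustic features, but the decoding recursion is the same.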
Use cases
- Speech recognition
- Financial forecasting
- Biological sequence analysis
Related terms
- Hidden states
- Observations
- Transition probabilities
- Emission probabilities
- Phoneme