Hidden Markov Model Homework: State Space S={1,2,3}, Alphabet A={a,b,c}

In summary, the conversation discusses the parameters and equations of a Hidden Markov Model, specifically with a state space of S={1,2,3} and an alphabet of A={a,b,c}. The initial probability vector π puts probability 1 on state 1 and 0 on all other states. The probabilities of transitioning from one state to another (P) are given as a 3x3 matrix, and the model also includes observation probabilities (b) for each state and symbol. The problem at hand is finding the probability of observing the sequence (a,b,c) in this model, as well as determining the most likely state sequence that could have generated this observation. This requires considering all possible state sequences Q = (q1, q2, q3) and their probabilities.
  • #1
daneault23

Homework Statement


Define a Hidden Markov Model with the following parameters: state space S = {1,2,3}, alphabet A = {a,b,c}, and transition matrix (row i, column j gives the probability of moving from state i to state j)

P = | 0    1/2  1/2 |
    | 1    0    0   |
    | 0    1    0   |

initial probability vector π = (1, 0, 0)

b1(a) = 1/2 ; b1(b) = 1/2 ; b1(c) = 0
b2(a) = 1/2 ; b2(b) = 0; b2(c) = 1/2
b3(a) = 0; b3(b) = 1/2 ; b3(c) = 1/2

List all possible state sequences Q = (q1, q2, q3). Consider the observations
O = (a, b, c). What is the probability of observing this sequence in the model, i.e.,
find P(O|λ)? Given these observations, which state sequence most likely
generated it, i.e., which Q maximizes P(Q|O)?

Homework Equations



Remember the collection of parameters of a Hidden Markov Model is denoted λ = (P, B, π)
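
For concreteness, the given λ can be transcribed into code. This is just a restatement of the parameters above in Python/NumPy (my own encoding, not part of the original problem, with states and symbols indexed from 0):

```python
import numpy as np

# Transition matrix P: P[i, j] = probability of moving from state i+1 to state j+1
P = np.array([[0.0, 0.5, 0.5],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Emission matrix B: B[i, k] = b_{i+1}(symbol k), with symbols ordered (a, b, c)
B = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])

# Initial distribution pi: start in state 1 with probability 1
pi = np.array([1.0, 0.0, 0.0])
```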

The Attempt at a Solution



Since π has a 1 in the first entry, we always start in state 1, so any sequence starting in another state contributes nothing (for example, P(O=abc|Q=312)·P(Q=312) = 0). From state 1 the model moves to state 2 or state 3, each with probability 1/2, and the next step is then forced (2→1 and 3→2 both have probability 1), so the only state sequences with nonzero probability are Q=121 and Q=132. Right off the bat, the probability of emitting b from state 2 is 0, so P(O=abc|Q=121)·P(Q=121) = 0. That leaves P(O=abc|Q=132)·P(Q=132) = (1/2)^4 = 1/16, so the probability of observing the sequence (a,b,c) in this model is 1/16.

Am I doing something wrong here?
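
As a check on this (not part of the original post), the standard forward algorithm sums P(O, Q|λ) over all state sequences without enumerating them. A minimal Python/NumPy sketch, assuming the encoding a=0, b=1, c=2:

```python
import numpy as np

P  = np.array([[0.0, 0.5, 0.5], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
B  = np.array([[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]])
pi = np.array([1.0, 0.0, 0.0])

O = [0, 1, 2]  # observation sequence (a, b, c) encoded as indices

# Forward recursion: alpha[i] = P(o_1..o_t, q_t = i+1 | lambda)
alpha = pi * B[:, O[0]]
for o in O[1:]:
    alpha = (alpha @ P) * B[:, o]

print(alpha.sum())  # P(O | lambda)
```

The printed value, 0.0625 = 1/16, matches the hand computation.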
 
  • #2
I just realized that the problem also asks to list all possible state sequences Q = (q1, q2, q3), such as Q=111, 112, 113, 121, 122, and 133, even though most of them turn out to have zero probability under P. It wasn't as easy as I originally thought.
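
To settle which sequences actually contribute, one can brute-force all 27 candidates. A sketch (my own code, same encoding as above) that prints every Q with nonzero joint probability P(O, Q|λ):

```python
import itertools
import numpy as np

P  = np.array([[0.0, 0.5, 0.5], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
B  = np.array([[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]])
pi = np.array([1.0, 0.0, 0.0])
O  = [0, 1, 2]  # (a, b, c)

total = 0.0
for Q in itertools.product(range(3), repeat=3):  # all 27 state sequences
    p = pi[Q[0]] * B[Q[0], O[0]]                 # start state and first emission
    for t in range(1, 3):
        p *= P[Q[t - 1], Q[t]] * B[Q[t], O[t]]   # transition, then emission
    total += p
    if p > 0:
        print([q + 1 for q in Q], p)
print(total)  # P(O | lambda) = 0.0625
```

Only Q = (1, 3, 2) survives, with joint probability 1/16: every other sequence is killed either by a zero transition probability (row 1 of P rules out 1→1, for instance) or by a zero emission probability.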
 

FAQ: Hidden Markov Model Homework: State Space S={1,2,3}, Alphabet A={a,b,c}

What is the purpose of a Hidden Markov Model (HMM)?

A Hidden Markov Model is a statistical model of a system that moves through unobserved (hidden) states over time, where each state emits observable symbols according to fixed probabilities. It is commonly used in fields such as speech recognition, bioinformatics, and finance.

What is the state space and alphabet in this HMM?

The state space (S) in this HMM is {1,2,3}, which means there are three possible states that the system can be in at any given time. The alphabet (A) is {a,b,c}, which represents the possible observations or outputs that can occur in each state.

How is the transition matrix determined in an HMM?

When the transition probabilities are not given (as they are in this homework), they are estimated from observed sequences. This is typically done with the Baum-Welch algorithm, an instance of the Expectation-Maximization (EM) algorithm, which finds maximum-likelihood estimates of the model parameters.
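
As a rough illustration only: assuming the third-party hmmlearn library, whose recent versions expose a CategoricalHMM class for discrete alphabets (the class name and call signature are version-dependent, so treat this as a sketch rather than a definitive recipe), estimation might look like:

```python
import numpy as np
from hmmlearn import hmm  # third-party; API assumed per recent hmmlearn versions

# Training data: symbols encoded as integers (a=0, b=1, c=2), concatenated into
# one column vector, with per-sequence lengths given separately.
X = np.array([[0], [1], [2], [0], [2], [1]])
lengths = [3, 3]  # two observation sequences of length 3

model = hmm.CategoricalHMM(n_components=3, n_iter=100, random_state=0)
model.fit(X, lengths)  # Baum-Welch (EM) estimation

print(model.startprob_)     # estimated initial distribution
print(model.transmat_)      # estimated transition matrix
print(model.emissionprob_)  # estimated emission probabilities
```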

What is the difference between the state space and the observation space in an HMM?

The state space refers to the hidden states of the system, while the observation space refers to the observable symbols that can be emitted from each state. The two need not be the same size (in this problem both have three elements); what makes the states hidden is that each state can emit several different symbols, and the same symbol can be emitted from several states.

How does the Viterbi algorithm work in an HMM?

The Viterbi algorithm finds the most likely sequence of hidden states in an HMM given a sequence of observations. It is a dynamic-programming method: at each time step it computes, for every state, the probability of the best path ending in that state, and it then backtracks from the best final state to recover the full sequence. It is commonly used for tasks such as speech recognition and part-of-speech tagging.
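
For this thread's model, the Viterbi recursion is short enough to write out directly. A sketch in Python/NumPy (my own code, not from the thread); for O = (a, b, c) it should recover Q = (1, 3, 2):

```python
import numpy as np

P  = np.array([[0.0, 0.5, 0.5], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
B  = np.array([[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]])
pi = np.array([1.0, 0.0, 0.0])
O  = [0, 1, 2]  # (a, b, c)

# delta[i] = probability of the best path ending in state i; psi stores backpointers
delta = pi * B[:, O[0]]
psi = []
for o in O[1:]:
    scores = delta[:, None] * P          # scores[i, j]: best path to i, then i -> j
    psi.append(scores.argmax(axis=0))    # best predecessor for each state j
    delta = scores.max(axis=0) * B[:, o]

# Backtrack from the best final state
q = [int(delta.argmax())]
for back in reversed(psi):
    q.append(int(back[q[-1]]))
best = [s + 1 for s in reversed(q)]
print(best, delta.max())  # [1, 3, 2] with joint probability 1/16
```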
