How Do I Create a Transition Matrix for This Markov Chain Scenario?

In summary: The first column is $\begin{bmatrix}0 \\ \frac12 \\ \frac12 \end{bmatrix}$ because the student never eats the same kind of food for 2 consecutive weeks, and she is equally likely to switch from Chinese to Greek or Italian. By the same reasoning, from Greek she is four times as likely to choose Chinese as Italian, giving the second column $\begin{bmatrix}\frac45 \\ 0 \\ \frac15 \end{bmatrix}$; from Italian she is twice as likely to choose Chinese as Greek, giving the third column $\begin{bmatrix}\frac23 \\ \frac13 \\ 0 \end{bmatrix}$. So the matrix is $$\begin{bmatrix}0 & \frac45 & \frac23 \\ \frac12 & 0 & \frac13 \\ \frac12 & \frac15 & 0\end{bmatrix}$$
  • #1
spence
I just discovered this website and want to thank everyone who is willing to contribute some of their time to help me. I appreciate it more than you know

First off, assume that state 1 is Chinese and that state 2 is Greek, and state 3 is Italian.

A student never eats the same kind of food for 2 consecutive weeks. If she eats a Chinese restaurant one week, then she is equally likely to have Greek as Italian food the next week. If she eats a Greek restaurant one week, then she is four times as likely to have Chinese as Italian food the next week. If she eats an Italian restaurant one week, then she is twice as likely to have Chinese as Greek food the next week.

I feel like I am on the right track, but I'm having trouble translating the words to notation (the most important part).
Chinese could be represented by x, Greek by y, and Italian by z, correct? And that has to add up to 1?

"If she eats a Greek restaurant one week, then she is four times as likely to have Chinese as Italian food the next week."
(I'm using ... to mean I think that there is something more in the equation)

y=4x...

"If she eats an Italian restaurant one week, then she is twice as likely to have Chinese as Greek food the next week."

z=2x...
So yeah. I kinda get the idea, I kinda don't. I think
 
  • #2
spence said:
"If she eats a Greek restaurant one week, then she is four times as likely to have Chinese as Italian food the next week."
(I'm using ... to mean I think that there is something more in the equation)

y=4x...

"If she eats an Italian restaurant one week, then she is twice as likely to have Chinese as Greek food the next week."

z=2x...

She eats restaurants?
 
  • #3
spence said:
I just discovered this website and want to thank everyone who is willing to contribute some of their time to help me. I appreciate it more than you know

First off, assume that state 1 is Chinese and that state 2 is Greek, and state 3 is Italian.

A student never eats the same kind of food for 2 consecutive weeks. If she eats a Chinese restaurant one week, then she is equally likely to have Greek as Italian food the next week. If she eats a Greek restaurant one week, then she is four times as likely to have Chinese as Italian food the next week. If she eats an Italian restaurant one week, then she is twice as likely to have Chinese as Greek food the next week.

I feel like I am on the right track, but I'm having trouble translating the words to notation (the most important part).
Chinese could be represented by x, Greek by y, and Italian by z, correct? And that has to add up to 1?

"If she eats a Greek restaurant one week, then she is four times as likely to have Chinese as Italian food the next week."
(I'm using ... to mean I think that there is something more in the equation)

y=4x...

"If she eats an Italian restaurant one week, then she is twice as likely to have Chinese as Greek food the next week."

z=2x...
So yeah. I kinda get the idea, I kinda don't. I think
In this problem there are three states: Chinese, Greek and Italian. The transition matrix tells you the probability of changing from one state to another, so it will be a $3\times3$ matrix. The rows and columns of the matrix correspond to the states: row 1 corresponds to Chinese, row 2 to Greek, and row 3 to Italian, and similarly for the columns. The $(i,j)$-element of the matrix gives the probability of changing from state $j$ to state $i$. So for example the $(1,1)$-element of the matrix is the probability of changing from Chinese to Chinese. But the student never eats the same kind of food for 2 consecutive weeks. That tells you that the $(1,1)$-element of the matrix is $0$.

The $(2,1)$-element is the probability of changing from Chinese to Greek, and the $(3,1)$-element is the probability of changing from Chinese to Italian. Those probabilities must add up to $1$, and you are told that they are equally likely. So they must both be $\frac12$. The first column of the matrix is therefore $\begin{bmatrix}0 \\ \frac12 \\ \frac12 \end{bmatrix}$.

Now you have to do the same thing for the other two columns.
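For concreteness, here is a short Python sketch (not from the thread itself) that fills in all three columns by the same reasoning and checks the two properties the reply relies on: a zero diagonal and columns that sum to $1$. The worked-out values for columns 2 and 3 are my own derivation from the problem statement, not something given in the post.

```python
from fractions import Fraction as F

# Columns are "from" states and rows are "to" states, as in the reply above:
# index 0 = Chinese, 1 = Greek, 2 = Italian.
# From Greek, Chinese is 4 times as likely as Italian: 4t + t = 1, so 4/5 and 1/5.
# From Italian, Chinese is twice as likely as Greek: 2t + t = 1, so 2/3 and 1/3.
P = [
    [F(0),    F(4, 5), F(2, 3)],  # to Chinese
    [F(1, 2), F(0),    F(1, 3)],  # to Greek
    [F(1, 2), F(1, 5), F(0)],     # to Italian
]

# Sanity checks: the diagonal is 0 (never the same food two weeks in a row)
# and each column sums to 1 (next week she eats at exactly one of the three).
for j in range(3):
    assert P[j][j] == 0
    assert sum(P[i][j] for i in range(3)) == 1
```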
 

FAQ: How Do I Create a Transition Matrix for This Markov Chain Scenario?

What is a Markov chain?

A Markov chain is a mathematical model used to describe the probability of transitioning from one state to another in a system. It is a type of stochastic process in which the probability of the next state depends only on the current state, not on any previous states.

How is a transition matrix used in Markov chains?

A transition matrix is used to represent the probabilities of transitioning from one state to another in a Markov chain. Each row and column in the matrix represents a different state, and the values in the matrix represent the probabilities of transitioning from one state to another.

Can you provide an example of a transition matrix for a simple scenario?

Sure, let's say we have a system with three states: A, B, and C. The transition matrix may look like this:

| State | A | B | C |
|-------|-----|-----|-----|
| A | 0.4 | 0.2 | 0.4 |
| B | 0.3 | 0.5 | 0.2 |
| C | 0.1 | 0.3 | 0.6 |

This means that if the system is currently in state A, there is a 40% chance it will remain in state A, a 20% chance it will transition to state B, and a 40% chance it will transition to state C.
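To make this row convention concrete, here is a small Python sketch (the dictionary layout and the `step` helper are illustrative, not from any particular library) that draws next week's state from the current state's row:

```python
import random

# The example matrix above, stored row by row: T[current][next] is the
# probability of moving from `current` to `next` (each row sums to 1).
T = {
    "A": {"A": 0.4, "B": 0.2, "C": 0.4},
    "B": {"A": 0.3, "B": 0.5, "C": 0.2},
    "C": {"A": 0.1, "B": 0.3, "C": 0.6},
}

def step(state, rng=random):
    """Sample the next state using the current state's transition row."""
    nxt = list(T[state])
    weights = [T[state][s] for s in nxt]
    return rng.choices(nxt, weights=weights, k=1)[0]
```

Starting from `"A"` and calling `step` repeatedly simulates one trajectory of the chain.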

How do you create a transition matrix for a specific scenario?

To create a transition matrix, you first need to identify all the possible states in the system and determine the probabilities of transitioning from one state to another. These probabilities can be based on historical data or estimated based on expert knowledge. Once you have all the probabilities, you can arrange them in a matrix format as shown in the previous example.
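As a sketch of the "historical data" route (the function name is hypothetical), the probabilities can be estimated by counting how often each transition occurs in an observed sequence of states and normalizing each row:

```python
from collections import Counter, defaultdict

def estimate_transitions(seq):
    """Estimate a row-stochastic transition matrix from an observed
    state sequence, using the fraction of times each transition occurred."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(seq, seq[1:]):  # consecutive pairs in the sequence
        counts[cur][nxt] += 1
    matrix = {}
    for state, nxts in counts.items():
        total = sum(nxts.values())
        matrix[state] = {t: c / total for t, c in nxts.items()}
    return matrix

# e.g. estimate_transitions("AABAB") gives A -> B with probability 2/3,
# since two of the three observed transitions out of A go to B.
```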

Can you explain the concept of a stationary distribution in Markov chains?

In a Markov chain, the stationary distribution is the long-term probability of being in each state. It is reached when the system has gone through many transitions and has reached a stable state where the probability of being in each state remains constant. This distribution can be calculated using the transition matrix and can provide insights into the behavior of the system over time.
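One simple way to approximate the stationary distribution is plain power iteration: start from any distribution and apply the transition matrix repeatedly until it stops changing. The sketch below uses the example matrix from earlier; for that matrix, solving $\pi T = \pi$ by hand gives $\pi = (7/29, 10/29, 12/29) \approx (0.241, 0.345, 0.414)$, which the iteration converges to.

```python
# The row-stochastic example matrix from earlier in the FAQ.
T = [
    [0.4, 0.2, 0.4],
    [0.3, 0.5, 0.2],
    [0.1, 0.3, 0.6],
]

def stationary(T, iters=1000):
    """Approximate the stationary distribution pi = pi T by power iteration."""
    n = len(T)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        # one step: pi_next[j] = sum_i pi[i] * T[i][j]
        pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
    return pi
```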
