Probability of Amy Not Being Mad in the Next Three Days

  • Thread starter jumbogala
  • #1
jumbogala

Homework Statement


On any given day, Amy is either cheerful (C), neutral (N), or mad (M). If she is cheerful today, then she will be C, N, or M tomorrow with probabilities 0.5, 0.4, 0.1 respectively. The remaining transition probabilities are given similarly, yielding the transition matrix P (rows and columns ordered C, N, M):

P = [
0.5 0.4 0.1
0.3 0.4 0.3
0.2 0.3 0.5
]

Amy is currently in a cheerful mood. What is the probability that she is not in a mad mood on any of the following three days?

Homework Equations


The Attempt at a Solution


I'm not really sure how to go about this. To find the probability that Amy would be mad in 3 days, I would find P^3 and look at the (1,3) entry, correct?

Then to find the probability that she is NOT mad in 3 days, I'd take 1 minus that probability.

But it doesn't ask for the 3rd day, it asks for "any of the following 3 days". I'm confused about how to do that!

The solutions manual changes the matrix to
[
0.5 0.4 0.1
0.3 0.4 0.3
0.0 0.0 1.0
]

Raises that to the power of 3, then takes 1 - [the (1,3) entry of the matrix]

I don't get why. Thanks!
 
  • #2


I would approach this problem using conditional probability and the law of total probability. First, I would rewrite the given information in a clearer, more organized form:

Current Mood: Cheerful (C)

Transition Probabilities:
C -> C: 0.5
C -> N: 0.4
C -> M: 0.1

N -> C: 0.3
N -> N: 0.4
N -> M: 0.3

M -> C: 0.2
M -> N: 0.3
M -> M: 0.5

Next, I would use the law of total probability to find Amy's mood distribution on each day. For example, the probability that Amy is cheerful on day 2 (given she is cheerful on day 1) sums over every mood she could have been in the day before:

P(C on day 2) = P(C -> C) * P(C on day 1) + P(N -> C) * P(N on day 1) + P(M -> C) * P(M on day 1)
= (0.5 * 1) + (0.3 * 0) + (0.2 * 0)
= 0.5

Similarly, the probability that Amy is in a neutral mood on day 2 is:

P(N on day 2) = P(C -> N) * P(C on day 1) + P(N -> N) * P(N on day 1) + P(M -> N) * P(M on day 1)
= (0.4 * 1) + (0.4 * 0) + (0.3 * 0)
= 0.4
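These one-day updates are just a row vector multiplied by P, so iterating the product propagates the distribution forward a day at a time. A short sketch in plain Python (the variable names are my own, not from the problem) shows this:

```python
# Transition matrix: rows/columns ordered C, N, M.
P = [[0.5, 0.4, 0.1],   # from cheerful
     [0.3, 0.4, 0.3],   # from neutral
     [0.2, 0.3, 0.5]]   # from mad

def step(v, P):
    """Advance a mood distribution v = [P(C), P(N), P(M)] by one day."""
    return [sum(v[i] * P[i][j] for i in range(3)) for j in range(3)]

day1 = [1.0, 0.0, 0.0]   # Amy is certainly cheerful today
day2 = step(day1, P)     # [0.5, 0.4, 0.1] -- row 1 of P
day3 = step(day2, P)     # [0.39, 0.39, 0.22] -- row 1 of P^2
```

Note that day 2 is simply the first row of P, and day 3 the first row of P^2, which is why powers of the matrix appear in the solution.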

Now, "not in a mad mood on any of the following three days" is the complement of "mad on at least one of the three days" — not merely the complement of "mad on day 3". By complement probability:

P(never mad in 3 days) = 1 - P(mad at least once in 3 days)

This is why taking 1 minus the (1,3) entry of P^3 does not work: that entry also counts paths that pass through M on day 1 or 2 and then leave it again, so it measures only the mood on day 3, not the whole three-day history.

The solutions manual handles "at least once" by making M an absorbing state: the bottom row of the matrix is replaced with (0, 0, 1), so that in the modified chain, once Amy enters M she stays there forever. A path in the modified chain ends in M after three steps exactly when the original chain visited M at least once. Therefore the (1,3) entry of the modified matrix cubed is the probability of being mad at some point during the three days, and 1 minus that entry is the answer.
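The absorbing-state calculation can be checked with a short sketch in plain Python (no external libraries; the helper function is my own):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Modified chain from the solutions manual: once Amy is mad, she stays mad.
P_abs = [[0.5, 0.4, 0.1],
         [0.3, 0.4, 0.3],
         [0.0, 0.0, 1.0]]

P3 = matmul(matmul(P_abs, P_abs), P_abs)
p_ever_mad = P3[0][2]      # (1,3) entry: mad at least once in 3 days
answer = 1 - p_ever_mad    # never mad in the 3 days, approx 0.585
```

Working it through by hand gives p_ever_mad = 0.415, so the probability that Amy is never mad in the next three days is 0.585.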
 

FAQ: Probability of Amy Not Being Mad in the Next Three Days

1. What is a Markov chain?

A Markov chain is a mathematical model that describes a system that changes over time in a probabilistic manner. It is made up of a set of states and a set of probabilities that dictate the likelihood of transitioning from one state to another.

2. How is a Markov chain used in science?

Markov chains are used in a variety of scientific fields, including biology, economics, and physics. They are used to model dynamic systems and make predictions about future states based on the current state and transition probabilities.

3. What are some real-world applications of Markov chains?

Markov chains have been used to model the spread of diseases, analyze stock market trends, and predict weather patterns. They are also commonly used in natural language processing and speech recognition algorithms.

4. How are Markov chains different from other types of models?

Unlike other models, Markov chains only consider the current state and not the entire history of the system. They also assume that the future state is only dependent on the present state, not on any previous states.

5. Are there any limitations to using Markov chains?

One limitation of standard Markov chains is that they assume time-homogeneity: the transition probabilities do not change over time. They also do not take into account external factors that may affect the system.
