Conditionally independent with a Random Chain.

In summary, the conversation is about finding a recursion for a joint probability involving a Markov chain and discrete observations that are conditionally independent given the chain. The poster's attempted derivation gives the wrong answer, and they ask whether one specific factorization step is valid and how to reach the correct recursion.
  • #1
fortune
I found this post somewhere on the net. I am facing a similar problem. Can you all please help?


+++++++++++++++

I have a problem that I am trying to solve, but I keep getting stuck.
Can you please read my solution and point out where I went wrong?

X(n) is a Markov chain on the states i = 1,...,M, with initial distribution P(X(0)=i) = a_i and transition probabilities P(X(n+1)=i | X(n)=j) = P_ji.
The Y(n) are discrete random variables which are conditionally independent given X, with

P(Y(n)=i | X(n)=j) = f(i,j)


I need to find a recursion for alpha_N(i)= P(Y(N),Y(N-1),...,Y(0), and X(N)=i).

where P(.) is the probability of (.)


My solution is that:


alpha_N(i) = P(Y(N),Y(N-1),...,Y(0), and X(N)=i)

= P(Y(N),Y(N-1),...,Y(0) | X(N)=i) * P(X(N)=i)
= P(Y(N) | X(N)=i) * P(Y(N-1),...,Y(0) | X(N)=i) * P(X(N)=i) (1)


***question*******
Is (1) correct? I am confused because, according to the provided information, the Y(n) are discrete random variables which are conditionally independent given X. That means P(Y(0),...,Y(N)|X) = P(Y(0)|X)*...*P(Y(N)|X),
where X is the whole chain, not just one sample X(n).
Does that still guarantee P(Y(0),...,Y(N)|X(N)) = P(Y(0)|X(N))*...*P(Y(N)|X(N))?

***end question****


= f(Y(N),i) * Sum_k{ P(X(N)=i | X(N-1)=k) * P(X(N-1)=k) } * P(Y(N-1),...,Y(0) | X(N)=i)

= f(Y(N),i) * Sum_k{ P_ki * P(X(N-1)=k) } * P(Y(N-1),...,Y(0) | X(N)=i)

I get stuck here. My answer is wrong, while the correct answer should be:

alpha_N(i) = f(Y(N),i) * Sum_k{ P_ki * P(X(N-1)=k) * P(Y(N-1),...,Y(0) | X(N-1)=k) }

= Sum_k{ f(Y(N),i) * P_ki * alpha_(N-1)(k) }, where Sum_k means the sum over k = 1 to M.
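For concreteness, here is a minimal Python/NumPy sketch of this target recursion, with made-up values for M, a_i, P_ji, f and the observation sequence (none of these numbers come from the problem). It computes alpha_N(i) both by the recursion and by brute-force summation over all hidden paths, and the two agree:

import itertools

import numpy as np

M = 3                                   # number of hidden states (made-up)
rng = np.random.default_rng(0)

a = rng.dirichlet(np.ones(M))           # a_i = P(X(0) = i)
P = rng.dirichlet(np.ones(M), size=M)   # P[j, i] = P_ji = P(X(n+1) = i | X(n) = j)
f = rng.dirichlet(np.ones(4), size=M)   # f[j, y] = P(Y(n) = y | X(n) = j), 4 symbols

y = [2, 0, 3, 1]                        # an arbitrary observation sequence Y(0), ..., Y(N)
N = len(y) - 1

# Forward recursion: alpha_0(i) = a_i * f(Y(0), i),
# alpha_n(i) = f(Y(n), i) * sum_k P_ki * alpha_{n-1}(k).
alpha = a * f[:, y[0]]
for n in range(1, N + 1):
    alpha = f[:, y[n]] * (alpha @ P)

# Brute force: sum the joint probability over all hidden paths ending in X(N) = i.
brute = np.zeros(M)
for path in itertools.product(range(M), repeat=N + 1):
    p = a[path[0]] * f[path[0], y[0]]
    for n in range(1, N + 1):
        p *= P[path[n - 1], path[n]] * f[path[n], y[n]]
    brute[path[-1]] += p

print(np.allclose(alpha, brute))        # True: the recursion reproduces P(Y(0..N), X(N)=i)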

I think the reason I cannot get the correct answer is step (1).


Can you please help me out?

Thanks
 
  • #2
Can anyone help, please?
 
  • #3


+++++++++++++++

Hi there,

Thank you for sharing your solution and question. I would be happy to help.

First of all, let's start by defining some terms and concepts to make sure we are on the same page.

Markov Chain: A Markov Chain is a sequence of random variables where the probability of each variable depends only on the previous variable in the sequence. In other words, the current state of the chain only depends on the previous state, not the entire history.
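For a concrete (made-up) example of this: a two-state chain can be written down as an initial distribution plus a transition matrix, and simulated by drawing each new state from the matrix row picked by the current state alone, never the earlier history.

import numpy as np

rng = np.random.default_rng(1)
a = np.array([0.5, 0.5])          # initial distribution P(X0 = i)
P = np.array([[0.9, 0.1],         # P[j, i] = P(X(n+1) = i | X(n) = j)
              [0.3, 0.7]])

x = rng.choice(2, p=a)            # draw X0
path = [x]
for _ in range(10):
    x = rng.choice(2, p=P[x])     # the next state depends only on the current state x
    path.append(x)
print(path)                       # e.g. a run of 11 states in {0, 1}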

Conditionally Independent: Two random variables are conditionally independent given a third variable if the probability of one variable does not affect the probability of the other variable, given the third variable.

Random Chain: I am assuming that by "Random Chain" you mean the Markov Chain mentioned earlier.

Now, let's look at your solution and question.

In your solution, you have correctly split P(Y(N),Y(N-1),...,Y(0)|X(N)=i) into P(Y(N)|X(N)=i)*P(Y(N-1),...,Y(0)|X(N)=i). Splitting off Y(N) alone is valid here: in this model Y(N) depends only on X(N), so once X(N)=i is given, Y(N) is independent of the earlier observations Y(N-1),...,Y(0) (and of the earlier states).

However, your question is really about whether the conditional independence stated for the whole chain X also gives the full factorization when you condition only on the single state X(N). It does not, in general. Conditioning on the whole chain X fixes every hidden state, and each Y(n) then depends only on its own X(n); conditioning only on X(N) leaves X(0),...,X(N-1) random, so the earlier observations Y(0),...,Y(N-1) remain dependent on one another through that hidden history. Therefore P(Y(0),...,Y(N)|X(N)) is not, in general, equal to P(Y(0)|X(N))*...*P(Y(N)|X(N)). Splitting off only Y(N), as in step (1), is still fine.

So the problem is not step (1) but the step after it. The way to get the recursion is to bring X(N-1) in by summing over it inside the joint probability, rather than expanding P(X(N)=i) while the observations stay conditioned on X(N)=i:

alpha_N(i) = P(Y(N),...,Y(0), X(N)=i)

= Sum_k P(Y(N),...,Y(0), X(N)=i, X(N-1)=k)

= Sum_k P(Y(N)|X(N)=i) * P(X(N)=i|X(N-1)=k) * P(Y(N-1),...,Y(0), X(N-1)=k)

= f(Y(N),i) * Sum_k{ P_ki * alpha_(N-1)(k) },

which is exactly the correct answer you quoted. In your attempt, the term P(Y(N-1),...,Y(0)|X(N)=i) stays conditioned on X(N)=i, so it can never turn into alpha_(N-1)(k).
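To make both points concrete, here is a small brute-force sketch in Python/NumPy (all numbers are made up, not from the thread). On a two-state chain of length three it checks that splitting off Y(2) given X(2)=i, as in step (1), reproduces P(Y(0),Y(1),Y(2)|X(2)=i), while the full product P(Y(0)|X(2)=i)*P(Y(1)|X(2)=i)*P(Y(2)|X(2)=i) does not:

import itertools

import numpy as np

a = np.array([0.6, 0.4])                       # P(X0 = i)
P = np.array([[0.9, 0.1],                      # P[j, i] = P(X(n+1)=i | X(n)=j)
              [0.2, 0.8]])
f = np.array([[0.7, 0.3],                      # f[j, y] = P(Y(n)=y | X(n)=j)
              [0.1, 0.9]])

# Joint distribution over (X0, X1, X2, Y0, Y1, Y2) by enumeration.
joint = {}
for xs in itertools.product(range(2), repeat=3):
    for ys in itertools.product(range(2), repeat=3):
        p = a[xs[0]] * P[xs[0], xs[1]] * P[xs[1], xs[2]]
        p *= f[xs[0], ys[0]] * f[xs[1], ys[1]] * f[xs[2], ys[2]]
        joint[xs + ys] = p

def prob(y_fixed, x2):
    """P(Y(n) = v for each (n, v) in y_fixed, and X2 = x2)."""
    return sum(p for k, p in joint.items()
               if k[2] == x2 and all(k[3 + n] == v for n, v in y_fixed.items()))

i, obs = 0, (1, 0, 1)                          # condition on X2 = i, pick one Y triple
px2 = prob({}, i)                              # P(X2 = i)

lhs = prob({0: obs[0], 1: obs[1], 2: obs[2]}, i) / px2                     # P(Y0,Y1,Y2 | X2=i)
step1 = (prob({2: obs[2]}, i) / px2) * (prob({0: obs[0], 1: obs[1]}, i) / px2)
full = np.prod([prob({n: obs[n]}, i) / px2 for n in range(3)])

print(np.isclose(lhs, step1))   # True:  splitting off Y2 alone, as in step (1), is valid
print(np.isclose(lhs, full))    # False: Y0 and Y1 are not independent given only X2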
 

FAQ: Conditionally independent with a Random Chain.

What does "conditionally independent" mean?

Conditionally independent refers to a statistical relationship between two variables, where the probability of one variable occurring does not affect the probability of the other variable occurring, given the value of a third variable.

How is "conditionally independent" different from "independent"?

While "independent" means that there is no relationship between two variables, "conditionally independent" means that there is no relationship between two variables given the value of a third variable. In other words, the variables are not independent in general, but they become independent when a third variable is taken into consideration.

What is a "Random Chain" in the context of conditional independence?

A Random Chain is a sequence of random variables that are connected through conditional dependencies. In other words, each variable in the chain is conditionally independent of all the earlier variables, given the value of the variable that comes immediately before it (the Markov property).

How is "conditionally independent with a Random Chain" used in scientific research?

This concept is commonly used in statistical modeling and data analysis to understand the relationships between multiple variables. It allows researchers to control for the effects of a third variable and determine if there is a direct relationship between two variables or if the relationship is due to the influence of the third variable.

Can a relationship between two variables be both "conditionally independent" and "dependent" at the same time?

Yes, in the sense the previous answer describes: two variables can be dependent on their own (marginally) and yet conditionally independent once a third variable is taken into account. What cannot happen is for the same pair of variables to be both conditionally independent and conditionally dependent given the same third variable. A variable can also be conditionally independent of one variable and dependent on another variable simultaneously.
