Homework Statement
Hi guys, I'm absolutely stuck on the following problem:
Consider a random walker who can make steps only to neighboring sites in "D" dimensions, where D is an arbitrary natural number. Assume that the distance between 2 adjacent sites is the same for all sites and that the...
Hi all,
I am trying to understand the concept of a Markov chain (a scanned copy of 2 pages is attached); before this text I already studied the notes on Markov chains at:
http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf
I am a little confused about the...
I am using a Markov Chain to get the 10 best search results from the union of 3 different search engines. The top 10 results are taken from each engine to form a set of 30 results.
The chain starts at state x, uniformly distributed over the set S = {1,2,3,...,30}. If the current state is page i...
The title says it all. It looks like the configuration probability only depends on where you want to go, not what state you are in now. Yet when I watch simulations, there is clearly a dependence on the previous state. Is there something pretty basic I'm misunderstanding about configuration...
Homework Statement
A taxicab moves between the airport, Hotel A, and Hotel B according to a Markov chain with transition probabilities:
P(airport → A) = 0.7,
P(airport → B) = 0.3,
P(A → airport) = 0.9,
P(A → B) = 0.1,
P(B → airport) = 0.8,
P(B → A) = 0.2.
A-If the taxicab starts...
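The excerpt is cut off, but the given probabilities can be put straight into a matrix and iterated; a minimal numpy sketch (starting the cab at the airport is an assumption, not part of the original question):

```python
import numpy as np

# Transition matrix with states ordered (airport, Hotel A, Hotel B);
# row i holds the probabilities of moving out of state i.
P = np.array([
    [0.0, 0.7, 0.3],   # airport -> A, airport -> B
    [0.9, 0.0, 0.1],   # A -> airport, A -> B
    [0.8, 0.2, 0.0],   # B -> airport, B -> A
])

def distribution_after(start, k):
    """Row vector of state probabilities after k trips."""
    return start @ np.linalg.matrix_power(P, k)

# Hypothetical start: the cab begins at the airport.
start = np.array([1.0, 0.0, 0.0])
print(distribution_after(start, 2))
```

The same `matrix_power` call answers any "where is the cab after k trips" sub-question by changing `start` and `k`.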
Homework Statement
Find all Invariant Probability Measures for P (Markov Chain)
E = {1,2,3,4,5}
The screenshot below has P and my attempted solution. I am wondering if it acceptable to have infinitely many answers ("all" seems to indicate that is acceptable). Basically, I had too many unknowns...
Homework Statement
Find \lambda = \{\lambda_i\} with \lambda_0 = 1 if P(X_{n+1}=i+1|X_n=i) = p, P(X_{n+1} = i-1|X_n = i) = 1 - p, i \in \mathbb{Z}, 0<p<1
Homework Equations
\lambda P = \lambda
The Attempt at a Solution
I used my relevant equation to write out:
(1-p)\lambda_0 + p\lambda_{-2} =...
Homework Statement
This is not really a homework question but just something that I'm confused about.
I'm having trouble understanding the movement of a Markov chain.
Say, if S = {0,1,2...} and 0<q<1. Let p(i,i+1)=q and p(i,i-1)=1-q. I can see how this Markov chain moves (back and forth...
Homework Statement
Homework Equations
N/A
The Attempt at a Solution
I'll shortly explain what my reasoning is so far, but please ignore it if it comes across too jumbled:
---
Let P denote the Markov matrix associated with this problem; I think I was able to argue that the...
Question: Let X_n be the maximum score obtained after n throws of a fair die.
a) Prove that X_n is a Markov chain and write down the transition matrix.
I'm having a problem starting the transition matrix.
I'm assuming the states are meant to be the maximum score so far. Then do you write out the transition...
Hi, I need help with answering this question. Firstly, I'm not sure what the transition matrix should look like. Should there be 2 states? One where both switches are off and one where both switches are on?
The question is:
Suppose that each of 2 switches is either on or off during the day. On...
This is probably a noob question, background in probability theory isn't great but I was shown this problem in a lecture:
"Suppose a gambler starts out with £n, and makes a series of £1 bets against the house.
Let the probability of winning each bet be p, and of losing be q = 1 − p. If...
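The question is cut off, but assuming it asks for the probability of ruin before reaching a target fortune of £N (the classic gambler's ruin setup), the standard answer is:

```latex
P(\text{ruin} \mid \text{start with } \pounds n) =
\begin{cases}
\dfrac{(q/p)^{n} - (q/p)^{N}}{\,1 - (q/p)^{N}\,}, & p \neq q,\\[2ex]
1 - \dfrac{n}{N}, & p = q = \tfrac12,
\end{cases}
```

obtained by solving the one-step recurrence u_n = p u_{n+1} + q u_{n-1} with boundary conditions u_0 = 1 and u_N = 0.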
Suppose X is a random walk with probability
P(X_k=+1)=p and P(X_k=-1)=q=1-p
and S_n=X_1+X_2+...+X_n
Can anyone explain why line 3 equals line 4?
P(S_k - S_0 ≠ 0, S_k - S_1 ≠ 0, …, S_k - S_{k-1} ≠ 0)
= P(X_k + X_{k-1} + ⋯ + X_1 ≠ 0, X_k + X_{k-1} + ⋯ + X_2 ≠ 0, …, X_k ≠ 0)
= P(X_k ≠ 0, X_k + X_{k-1} ≠ 0...
I have a state transition probability matrix and a state probability vector
[0.9 0.1; 0.1 0.9] & [0.4 0.6] respectively.
Now, I want to generate the states of 1 and 0 according to this. say 100 state sequence.
Any sort of help would be appreciated.
Thanks.
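One way to do this (a minimal numpy sketch, not the only one): draw the first state from the given vector, then repeatedly draw the next state from the row of the transition matrix indexed by the current state.

```python
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])   # row-stochastic transition matrix
p0 = np.array([0.4, 0.6])    # initial state probabilities

n_steps = 100
states = np.empty(n_steps, dtype=int)
states[0] = rng.choice(2, p=p0)                    # draw the initial state
for t in range(1, n_steps):
    states[t] = rng.choice(2, p=P[states[t - 1]])  # next state given current
```

Here the states are the indices 0 and 1; relabel them however the application needs.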
In a Markov chain, show that a state i is persistent if and only if the mean number of visits to the state i is infinite given the chain started in state i.
I thought about looking at the mean recurrence time, but that's all I have so far.
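One standard route, sketched here: let f_i be the probability of ever returning to i, and N_i the number of returns to i, starting from i. By the strong Markov property each return "restarts" the chain, so N_i is geometric:

```latex
P(N_i \ge k) = f_i^{\,k}, \qquad
E[N_i] \;=\; \sum_{k=1}^{\infty} P(N_i \ge k) \;=\; \sum_{k=1}^{\infty} f_i^{\,k}
\;=\; \sum_{n=1}^{\infty} p_{ii}(n),
```

and the geometric series is finite (equal to f_i/(1-f_i)) iff f_i < 1. Hence E[N_i] = ∞ iff f_i = 1, i.e. iff i is persistent.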
Let X be a Markov chain with a state s that is absorbing, i.e. pss(1) = 1. All other states
communicate with s i.e. i → s for all states i ∈ S. Show that all states in S except s are
transient.
I understand this intuitively, but I'm not really sure how to start the proof.
* I have already posted this in the General Math, but I guess the problem is more like a linear algebra problem.
Currently I am using a rather simple method: solve for the vector w from (M-I)w = 0 (replacing one equation by w1+w2+...+wn = 1). Is there any faster way to do this? Thank you.
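For reference, the replace-one-row approach is itself just one dense linear solve, which is already essentially optimal for moderate n; a sketch with a hypothetical column-stochastic M (columns sum to 1, matching the convention (M-I)w = 0):

```python
import numpy as np

# Hypothetical column-stochastic M (columns sum to 1), so (M - I) w = 0
# has the stationary vector w in its null space.
M = np.array([[0.5, 0.2],
              [0.5, 0.8]])

n = M.shape[0]
A = M - np.eye(n)
A[-1, :] = 1.0                  # replace the last equation by w1 + ... + wn = 1
b = np.zeros(n)
b[-1] = 1.0
w = np.linalg.solve(A, b)       # stationary vector, normalised to sum to 1
```

For very large sparse chains, power iteration or sparse eigensolvers scale better than a dense solve.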
Homework Statement
Homework Equations
The Attempt at a Solution
A few things.
First of all, the homework problem notes that "all the columns should sum to 1," whereas Wikipedia says ∑_j P_ij = 1 when we sum along row i.
Second of all, I don't know where to...
What is the difference between a martingale and a Markov chain? As far as I can tell, if a process is a martingale, then the expected future value depends on the current value of the process, while in a Markov chain the probability distribution of the future value (not the expected value) depends on the...
Today we encountered a problem with markov chains and are wondering if this can be solved analytically.
Suppose we have a banded transition probability matrix M of the following form:
M=
[
P P 0 0 0 ...
Q 0 P 0 0 ...
0 Q 0 P 0 ...
0 0 Q 0 P ...
0 0 0 Q 0 ...
. . . . .
. . . . . ]...
Hello, in relation to Markov chains, could you please clarify the following equations:
In particular, could you please expand on why the first line is equal. Surely from , along with the first equation, this implies that:
I just don't see why they are all equal. Please could you...
How do I find the stationary distribution of the Markov chain with the countable state space {0, 1, 2, ..., n, ...}, where each point, including 0, can either
a. return to 0 with probability 1/2, or
b. move to the right n -> n+1 with probability 1/2?
Thanks.
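A guess-and-verify sketch: for n ≥ 1 the only way to be at n is to have been at n-1 and moved right, and the chance of being at 0 is 1/2 regardless of the previous state, so the balance equations give a geometric distribution:

```latex
\pi_0 = \tfrac12 \sum_{m \ge 0} \pi_m = \tfrac12, \qquad
\pi_n = \tfrac12\,\pi_{n-1} \;\;(n \ge 1)
\;\Longrightarrow\;
\pi_n = \left(\tfrac12\right)^{n+1},
```

which indeed sums to 1 over n = 0, 1, 2, ....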
Homework Statement
Let \left( X_n \right)_{n \geq 0} be a Markov chain on {0,1,...} with transition probabilities given by:
p_{01} = 1, p_{i,i+1} + p_{i,i-1} = 1, p_{i,i+1} = \left(\frac{i+1}{i} \right)^2 p_{i,i-1}
Show that if X_0 = 0 then the probability that X_n \geq 1 for all n \geq 1 is...
Hi
I am trying to model the behaviour of 2 independent ON-OFF sources. My state diagram is as follows
state 0 = both sources are OFF
state 1 = exactly one of the sources is ON
state 2 = both sources are ON
The transition rates are given as
BIRTH RATE = lambda(i) = (N-i)*lambda
DEATH RATE =...
A water distribution company in southern California gets its water supply from the north and sells it to its customers in Orange County. Assume the following simplified scheme: 3 MG (millions of gallons) of water arrives from the north at the beginning of the month. The company can store up...
Ok, I had a homework problem that I cannot, for the life of me, figure out. I've tried to Google for different sources that would show me how to find the stationary distribution of a Markov chain, but I can't seem to find one that makes sense to me.
The transition matrix of a Markov chain is...
Hello there,
yet another trivial problem:
I attended the 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; there is some simple pattern for this, I presume.
thanks for your help,
rahl.
I would like to construct a model using a Markov chain that has a different stochastic process for each state in the chain. Is there a term for such a thing, or anything similar to it?
Thanks
Hi,
I have a (Markov chain) transition matrix X which I understand. In particular, each row of this matrix sums to 1.
I have used this transition matrix to construct its generator, Y; i.e., Y is the continuously compounded transition matrix,
X = exp(Y)
X*X = exp(2Y), etc
both X and Y...
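A sketch of extracting such a generator, assuming a symmetric two-state X for illustration (so the matrix logarithm reduces to an eigendecomposition; `scipy.linalg.logm` handles a general X the same way):

```python
import numpy as np

# One-period transition matrix X (rows sum to 1).  Because this X is
# symmetric, its matrix logarithm comes from an eigendecomposition.
X = np.array([[0.9, 0.1],
              [0.1, 0.9]])

vals, vecs = np.linalg.eigh(X)
Y = vecs @ np.diag(np.log(vals)) @ vecs.T   # generator: X = exp(Y)

# Recover X by exponentiating the same eigenvalues.
X_back = vecs @ np.diag(np.exp(np.log(vals))) @ vecs.T
```

A valid generator has rows summing to 0 and non-negative off-diagonals; not every stochastic X admits one, which is worth checking before trusting Y.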
Homework Statement
I have the following queuing system: http://img39.imageshack.us/img39/8264/immaginetd.jpg
that models voice traffic arriving with parameters \alpha and \beta on both queue 1 and queue 2. When a voice source is active, it generates traffic with exponentially distributed inter-arrival times, which...
Please Help! Markov Chain
Homework Statement
Let X0 be a random variable with values in a countable set I. Let Y1, Y2, ... be a
sequence of independent random variables, uniformly distributed on [0, 1].
Suppose we are given a function G : I x [0, 1] -> I
and define inductively for n >=...
Homework Statement
Rat and Cat move between rooms 1 and 2 using different paths. Their motions are governed by their respective transition matrices:
[0.9, 0.1 ; 0.2, 0.8] [0.6, 0.4 ; 0.3, 0.7]
(semi colon is a new line in the matrix, like in matlab)
If they are ever in the same room...
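The excerpt is cut off, but if the question asks for the expected time until rat and cat occupy the same room, a quick way to sanity-check an analytic answer is simulation. A sketch, assuming (hypothetically, since the excerpt doesn't say) that the rat starts in room 1 and the cat in room 2:

```python
import numpy as np

rng = np.random.default_rng(1)

R = np.array([[0.9, 0.1], [0.2, 0.8]])   # rat: row = current room (1, 2)
C = np.array([[0.6, 0.4], [0.3, 0.7]])   # cat

def steps_until_meet(rat=0, cat=1):
    """Simulate both chains until they share a room; return the move count."""
    t = 0
    while rat != cat:
        rat = rng.choice(2, p=R[rat])
        cat = rng.choice(2, p=C[cat])
        t += 1
    return t

est = np.mean([steps_until_meet() for _ in range(10_000)])
```

The exact value comes from first-step analysis on the 4-state joint chain (rat room, cat room); the simulation estimate should agree to within sampling error.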
Homework Statement
Well, I have my algebra exam coming up, and my teacher told us that there is going to be a Markov chain problem. The only problem I have is that I don't know how to get the initial transition matrix, which is crucial for getting full marks. Can someone help me in determining how...
Homework Statement
I have a scenario in which I have to find the proportion of time spent in each area by a person, using Markov chains. I was given a word problem, which I have put into a matrix, and the question asks what proportion of time is spent in each of areas A, B and C.
Homework...
I've attached the diagram of 4 rooms, which a rat must move through. Each period he changes his room (his state). As you can see if you click on the image, the rat cannot access room 2 from room 3, or vice versa.
If I assume the rat begins in room 1, how do I calculate the probability he will be in...
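The probability of being in each room after n moves is just the start vector times the n-th matrix power. A sketch with a hypothetical layout (the real probabilities depend on the attached diagram, which isn't reproduced here): room 1 connects to rooms 2 and 3, room 4 connects to rooms 2 and 3, rooms 2 and 3 are not connected, and the rat picks an accessible room uniformly.

```python
import numpy as np

# Hypothetical transition matrix; index k corresponds to room k+1.
# Rooms 2 and 3 are not connected, so P[1, 2] = P[2, 1] = 0.
P = np.array([[0.0, 0.5, 0.5, 0.0],
              [0.5, 0.0, 0.0, 0.5],
              [0.5, 0.0, 0.0, 0.5],
              [0.0, 0.5, 0.5, 0.0]])

start = np.array([1.0, 0.0, 0.0, 0.0])   # rat begins in room 1

def room_distribution(n):
    """Probabilities of being in each room after n moves."""
    return start @ np.linalg.matrix_power(P, n)
```

Swap in the matrix read off the actual diagram and the same two lines answer the question for any n.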
Homework Statement
Hi!
I have been given such a task:
A population of firms can assume three states: good-bad-bankrupt (default)
The cumulative frequencies of default (DP) from year 1 to 10 are given.
Find an appropriate transition matrix (TM)
I'm given a matrix of historical cumulative...
Homework Statement
-A taxi is located either at the airport or in the city. From the city, the next trip is to the airport with 1/4 probability, or to somewhere else in the city with 3/4. From the airport, the next trip is always to the city.
(a) find the stationary distribution
(b)...
Homework Statement
Transition matrix is
0 0 1
0 0 1
(1/3) (2/3) 0
"argue that this chain is aperiodic"
Homework Equations
definition of aperiodicity - there must exist a time n such that there is a non-zero probability of going from state i to state j for all i & j
The...
my transition matrix is
0 0 1
0 0 1
(1/3) (2/3) 0
I'm supposed to argue that this chain is aperiodic.
A Markov chain is aperiodic iff there exists a time n such that there is a positive probability of going from state i to state j, for all i and j...
Hi all,
I'm given a Markov chain Q_k, k>0, with stationary transition probabilities. The state space is uncountable.
What I want to show is that the chain is asymptotically stationary, that is it converges in distribution to some random variable Q.
All I have at hand is a k-independent upper...
Let \bold{X} be a discrete random variable whose set of possible values is \bold{x}_j, \ j \geq 1 . Let the probability mass function of \bold{X} be given by P \{\bold{X} = \bold{x}_j \}, \ j \geq 1 , and suppose we are interested in calculating \theta = E[h(\bold{X})] =...
[SOLVED] Markov chain problem
Hi, all! This is not a homework thing or something like that. I'm just curious.
Suppose a Markov chain p(n+1)=Q*p(n), where Q=[q11, q12; q21, q22] is the transition probability matrix and p(n)=[p1(n);p2(n)] is the probability vector.
Given Q and p(1), we can...
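The forward iteration is straightforward to sketch; here Q is taken column-stochastic (columns sum to 1), matching the convention p(n+1) = Q p(n) with p a column vector, and the numbers are hypothetical:

```python
import numpy as np

# Hypothetical column-stochastic Q (columns sum to 1).
Q = np.array([[0.7, 0.4],
              [0.3, 0.6]])
p = np.array([1.0, 0.0])   # p(1): chain starts in state 1

for _ in range(50):
    p = Q @ p              # p(n+1) = Q p(n)
# p has numerically converged to the stationary vector pi with Q pi = pi.
```

Because the second eigenvalue of this Q is 0.3, the iterates converge geometrically fast to the stationary distribution regardless of p(1).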
I'm reading the Wikipedia article on them and I can't really get an understanding of what they are.
I'm writing a paper for psychology, and I keep coming across articles that say 'learning can be modeled with Markov chains'.
what does that mean?
Let \left( {X_n } \right)_{n \ge 0} be a Markov chain (discrete time).
I have
{\bf{P}} = \left[ p_{ij} \right]_{i,j} = \left[ P\left( X_1 = j|X_0 = i \right) \right]_{i,j},
and the initial probability distribution {\bf{p}}^{\left( 0 \right)}.
I need to calculate
P\left(...