Markov chain Definition and 97 Threads

  1. WMDhamnekar

    A Coin flipping problem (Markov chain)

    b) Suppose that the coin flipped on Monday comes up heads. What is the probability that the coin flipped on Friday of the same week also comes up heads? My attempt to answer this question:
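
    A minimal sketch in R of the standard approach: Monday to Friday is four transitions, so the answer is the (H, H) entry of the fourth power of the one-day transition matrix. The entries 0.7 and 0.4 below are placeholders, not the thread's actual probabilities.

      P <- matrix(c(0.7, 0.3,   # from H: P(next H), P(next T) -- assumed values
                    0.4, 0.6),  # from T: P(next H), P(next T) -- assumed values
                  nrow = 2, byrow = TRUE)
      P4 <- P %*% P %*% P %*% P  # Monday -> Friday is four flips later
      P4[1, 1]                   # P(Friday = H | Monday = H)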
  2. J

    A discrete-time queue Markov Chain problem

    The following problem is seriously tricky and I urgently need help with it, thanks. For part a, we have the following transition probability matrix: $$P = \begin{pmatrix} a_0 & a_1 & a_2 & a_3 \\ a_0 & a_1 & a_2 & a_3 \\ 0 & a_0 & a_1 & b_2 \\ 0 & 0 & a_0 & b_1 \end{pmatrix}$$ Now, is ##a_0 = a_1 = a_2 = a_3 =##...
  3. J

    The infinite limits of the probability transition matrix for Markov chain

    Consider a Markov chain with state space {1, 2, 3, 4} and transition matrix P given below. Now, I have already figured out the solutions for parts a, b, and c. However, I don't know how to go about solving part d. I mean, the question says we can't use higher powers of matrices to justify our...
  4. M

    I Proving P(X_4|X_1,X_2)=P(X_4|X_2) for Markov Chain with Finite Possibilities

    Given events ##X_i## and the following: ##P(X_3|X_1,X_2)=P(X_3|X_2)## and ##P(X_4|X_1,X_2,X_3)=P(X_4|X_3)## prove ##P(X_4|X_1,X_2)=P(X_4|X_2)##? Special case proof: Finite number of possibilities for each ##X_i##. Transition matrix from ##X_2## to ##X_4## is product of transitions from...
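
    A sketch of the marginalization step for the finite case, using only the two given conditions:

    $$P(X_4\mid X_1,X_2)=\sum_{x_3}P(X_4\mid X_1,X_2,x_3)\,P(x_3\mid X_1,X_2)=\sum_{x_3}P(X_4\mid x_3)\,P(x_3\mid X_2).$$

    Averaging the second hypothesis over ##X_1## gives ##P(X_4\mid X_2,X_3)=P(X_4\mid X_3)##, so the same marginalization applied to ##P(X_4\mid X_2)## produces the identical sum, which proves the claim.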
  5. A

    A Question about a property of a matrix of transition probabilities

    In a 2012 article published in the Mathematical Gazette, hole-score probability distributions in the game of golf were derived for a par three, four, and five, based on Hardy's ideas of how a hole score comes about. Hardy (1945) assumed that there are three types of strokes: a good (##G##)...
  6. IreneFerri

    New Ph.D candidate in Statistical Physics on networks, Hello

    My name is Irene and I've just started my Ph.D. at the University of Barcelona (October 2020). I have a Physics degree and a Master's in Computational Modelling. I work in a research group named ClabB (complexity lab Barcelona), where I develop a large-scale opinion model using Monte Carlo...
  7. S

    B Question about transition matrix of Markov chain

    The notes I got from my teacher state that, for a transition matrix, the column part is the current state and the row part is the future state (let this be matrix A), so the sum of each column must equal 1. But I read from another source that the row part is the current state and the column...
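
    Both conventions appear in the literature, and they are transposes of each other; what matters is which side the distribution vector multiplies on. A small R sketch with made-up numbers:

      # Row convention: P[i, j] = P(next = j | current = i); rows sum to 1,
      # and a distribution (row) vector is updated by right-multiplication.
      P_row <- matrix(c(0.9, 0.1,
                        0.5, 0.5), nrow = 2, byrow = TRUE)
      pi0 <- c(1, 0)          # start in state 1
      pi0 %*% P_row           # distribution after one step

      # Column convention: A = t(P_row); columns sum to 1,
      # and the update is left-multiplication by A.
      A <- t(P_row)
      A %*% pi0               # same numbers, as a column vector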
  8. J

    Markov chain on state space {1, 2, 3, 4, 5, 6, 7}

    I need this for a programming project. Could you help?
  9. S

    Understanding Recurrence in Probability: Solving for ##h_N(1)## and ##c_N##

    I set ##h_N(1)## equal to ##c_N##, but I'm confused about how I'd be able to solve it, and because of that I was not able to conclude that 0 is recurrent when ##q_x/p_x = \infty##
  10. caffeinemachine

    MHB Regarding mixing time of a Markov chain

    Let $X=X_0, X_1, X_2, \ldots$ be a Markov chain on a finite state space $S$, and let $P$ denote the transition matrix. Assume that there is an $\varepsilon>0$ such that whenever $\mu_0$ and $\nu_0$ are point distributions on $S$ (in other words, $\mu_0$ and $\nu_0$ are Dirac masses) we have...
  11. K

    A Martingale, calculation of probability - Markov chain needed?

    I am trying to estimate the probability of losing (probability of bankruptcy, ##P_b##) when using a martingale system in betting. I will illustrate my problem with the following example. Let ##p## = probability of NOT getting a draw (in some match). We will use the following system for betting: 1) We will bet only...
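
    A Monte Carlo sketch in R of this kind of ruin estimate; the win probability, bankroll, and bet sizes below are placeholders, not the poster's actual numbers:

      set.seed(1)
      ruin_prob <- function(p_win = 0.3, bankroll = 1000, base = 1,
                            n_rounds = 500, n_sims = 10000) {
        ruined <- 0
        for (s in 1:n_sims) {
          money <- bankroll; stake <- base
          for (r in 1:n_rounds) {
            if (stake > money) { ruined <- ruined + 1; break }  # cannot cover the bet
            if (runif(1) < p_win) {          # win: recover losses, reset the stake
              money <- money + stake; stake <- base
            } else {                         # loss: double the next stake
              money <- money - stake; stake <- 2 * stake
            }
          }
        }
        ruined / n_sims
      }
      ruin_prob()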
  12. user366312

    Finding conditional and joint probabilities from a table of data

    Let
      alpha <- c(1, 1) / 2
      mat <- matrix(c(1 / 2, 0, 1 / 2, 1), nrow = 2, ncol = 2)
      chainSim <- function(alpha, mat, n) {
        out <- numeric(n)
        out[1] <- sample(1:2, 1, prob = alpha)
        for (i in 2:n)
          out[i] <- sample(1:2, 1, prob = mat[out[i - 1], ])
    ...
  13. user366312

    Finding the value of ##P(X_3 = 1|X_1 = 2) = ?## in a Markov Chain

    ##P^2=\begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0 \end{bmatrix} \begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0 \end{bmatrix}= \begin{bmatrix} 1/2 & 1/4 & 1/4\\ 1/4 & 1/2 & 1/4\\ 1/4 & 1/4 & 1/2 \end{bmatrix}## So, ##P(X_3 = 1|X_1 = 2) = 1/4##. Is this solution correct?
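
    A quick numerical check of the hand computation in R:

      P <- matrix(c(1/2, 0,   1/2,
                    0,   1/2, 1/2,
                    1/2, 1/2, 0), nrow = 3, byrow = TRUE)
      (P %*% P)[2, 1]   # 0.25 = P(X_3 = 1 | X_1 = 2), matching the post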
  14. user366312

    Finding ##P(X_2 = 2)## of a Markov Chain

    My solution: ##X_1 = \begin{bmatrix} 1/2&1/2&0 \end{bmatrix} \begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0 \end{bmatrix} = \begin{bmatrix} 1/4&1/4&1/2 \end{bmatrix}## ##X_2 = \begin{bmatrix} 1/4&1/4&1/2 \end{bmatrix} \begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0...
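
    A quick R check of the same propagation, reading off the second component after two steps (the value works out to 3/8):

      P   <- matrix(c(1/2, 0,   1/2,
                      0,   1/2, 1/2,
                      1/2, 1/2, 0), nrow = 3, byrow = TRUE)
      pi0 <- c(1/2, 1/2, 0)    # initial distribution from the post
      (pi0 %*% P %*% P)[2]     # P(X_2 = 2) = 3/8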
  15. user366312

    How can I compute the expected return time of a state in a Markov Chain?

    Problem Statement I was watching a YouTube video regarding the calculation of expected return time of a Markov Chain. I haven't understood the calculation of ##m_{12}##. How could he write ##m_{12}=1+p_{11}m_{12}##? I have given a screenshot of the video.
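
    The identity quoted from the video is first-step analysis. Assuming a two-state chain where ##m_{12}## is the expected time to reach state 2 from state 1: the first step always costs one unit of time; with probability ##p_{11}## the chain is back at state 1 and needs ##m_{12}## more steps in expectation, and with probability ##p_{12}## it has already arrived, contributing nothing further. Hence

    $$m_{12} = 1 + p_{11}m_{12} + p_{12}\cdot 0 \quad\Longrightarrow\quad m_{12} = \frac{1}{1-p_{11}} = \frac{1}{p_{12}}.$$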
  16. I

    MHB Optimizing Markov Chain Production System Throughput w/ Exponential Rate of 50

    I am new to Markov chains, and I want to model this as a continuous-time Markov chain. A wind turbine manufacturer would like to increase the throughput of its production system. For this purpose it intends to install a buffer between the pre-assembly and the final assembly of the wind turbines...
  17. caffeinemachine

    MHB "Two step" Markov chain is also a Markov chain.

    Let $X$ be a compact metric space and $\mathcal X$ be its Borel $\sigma$-algebra. Let $\mathscr P(X)$ be the set of all the Borel probability measures on $X$. A **Markov chain** on $X$ is a measurable map $P:X\to \mathscr P(X)$. We write the image of $x$ under $P$ as $P_x$. (Here $\mathscr P(X)$...
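
    Presumably the two-step chain is the composition of kernels; a minimal sketch of that definition:

    $$P^{(2)}_x(A) \;=\; \int_X P_y(A)\, \mathrm{d}P_x(y), \qquad A \in \mathcal X,$$

    so showing the claim amounts to checking that $x \mapsto P^{(2)}_x$ is again a measurable map $X\to\mathscr P(X)$.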
  18. P

    MHB How can I set up this problem as a continuous Markov chain?

    I request your help in order to know how I can configure this problem as a continuous Markov chain; I need to define the main variable, the states, the transition rates, and the matrix. I thought that it could be related to the independent status of the machines, because if machine 1 is...
  19. Mayan Fung

    I A seemingly simple problem about probability

    My friend is now taking an introductory course about statistics. The professor raised the following question: A light bulb has a lifespan with a uniform distribution from 0 to 2/3 years (i.e. with a mean of 1/3 years). You change a light bulb when it burns out. How many light bulbs are expected to...
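
    A simulation sketch in R for this renewal problem, assuming a one-year horizon (the horizon is not stated in the snippet):

      set.seed(1)
      bulbs_used <- function(horizon = 1) {
        t <- 0; n <- 0
        while (t < horizon) {
          t <- t + runif(1, 0, 2/3)   # lifetime of the next bulb
          n <- n + 1
        }
        n
      }
      mean(replicate(10000, bulbs_used()))  # compare with the renewal-theory answer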
  20. T

    Performing Metropolis-Hastings algorithm for a Poisson Distribution

    Homework Statement The number of busy lines in a trunk group (Erlang system) is given by a truncated Poisson distribution. I am asked to generate values from this distribution by applying the Metropolis-Hastings algorithm. Homework Equations The distribution is given in the attached picture...
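
    A sketch in R of a random-walk Metropolis-Hastings sampler for a Poisson distribution truncated to {0, 1, ..., m}; the values of lambda and m are placeholders, not the ones in the attached picture:

      set.seed(1)
      lambda <- 5; m <- 10; n_iter <- 10000
      # unnormalized pmf; the normalizing constant cancels in the MH ratio
      target <- function(k) if (k < 0 || k > m) 0 else lambda^k / factorial(k)
      x <- numeric(n_iter); x[1] <- 0
      for (i in 2:n_iter) {
        prop <- x[i - 1] + sample(c(-1, 1), 1)  # symmetric +/-1 proposal
        if (runif(1) < min(1, target(prop) / target(x[i - 1])))
          x[i] <- prop
        else
          x[i] <- x[i - 1]
      }
      table(x) / n_iter   # should approximate the truncated Poisson pmf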
  21. FallenApple

    A Markov Chain as a function of dimensions

    Here is an animation I created in R. I built this Markov chain of order 50 by correlating the information in one of the coordinates while randomly varying the rest. Is there an explanation for the clustering and flattening out over increasing dimensions of the vector space? Is it due to the...
  22. grquanti

    I Jump probability of a random walker

    Hello everybody. I have a Markovian homogeneous random walk. Given the transition matrix of the chain, I know that ##P[ X(t) = i | X(t-1) = j ] ≡ P_{j→i}=T_{ij}##, where ##T## is the transition matrix and ##X(t)## is the position of the walker...
  23. F

    Construct a Markov Chain: How to Generate Xn's Using the Sequence U0, U1, ...?

    Homework Statement Describe the construction of a Markov chain X0, X1, ... on Ω = (0, 1) with state space S = {1, 2, ..., s}, an S × S PTM P, and initial state X0 ~ ν (probabilities distributed like the vector ν). Use the sequence U0, U1, ... to generate the Xn's. Homework Equations U0, U1 is a...
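
    A sketch in R of the usual construction: partition (0, 1) by the cumulative row probabilities and let each uniform pick the next state. The PTM below is illustrative, not from the problem:

      set.seed(1)
      next_state <- function(P, current, u) {
        # smallest k with u <= P[current, 1] + ... + P[current, k]
        sum(u > cumsum(P[current, ])) + 1
      }
      P <- matrix(c(0.2, 0.8,
                    0.6, 0.4), nrow = 2, byrow = TRUE)
      U <- runif(10)               # stands in for U1, U2, ...
      X <- numeric(11); X[1] <- 1  # X0 would be drawn from nu via U0 the same way
      for (n in 1:10) X[n + 1] <- next_state(P, X[n], U[n])
      X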
  24. C

    Can the Markov chain be used in qubit manipulation?

    Can a Markov chain be used to generate a set of qubits given their initial state?
  25. J

    Limit of a continuous time Markov chain

    Homework Statement Calculate the limit $$\lim_{s,t\to\infty} R_X(s, s+t) = \lim_{s,t\to\infty}E(X(s)X(s+t))$$ for a continuous-time Markov chain $$(X(t);\ t \geq 0)$$ with state space ##S = \{0, 1\}## and generator $$G= \begin{pmatrix} -\alpha & \alpha \\ \beta & -\beta \end{pmatrix}$$
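
    A sketch of how the limit typically works out here, assuming states 0 and 1: since ##X## is 0/1-valued, ##E(X(s)X(s+t)) = P(X(s)=1)\,P(X(s+t)=1\mid X(s)=1)##, and both factors converge to the stationary probability ##\pi_1 = \alpha/(\alpha+\beta)## (from the balance equation ##\pi_0\alpha = \pi_1\beta##), giving

    $$\lim_{s,t\to\infty} R_X(s,s+t) = \pi_1^2 = \left(\frac{\alpha}{\alpha+\beta}\right)^2.$$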
  26. J

    Birth and death process -- Total time spent in state i

    Homework Statement Let X(t) be a birth-death process with parameters $$\lambda_n = \lambda > 0, \quad \mu_n = \mu > 0,$$ where $$\lambda > \mu, \quad X(0) = 0.$$ Show that the total time ##T_i## spent in state ##i## is ##\text{Exp}(\lambda-\mu)##-distributed. The Attempt at a Solution: I have a hard time understanding this...
  27. A

    Can QM be described by Markov chain theory?

    Can we describe the intensity of spectral lines using Markov theory? No matter what is the initial state vector of the system, the final state will be reduced to a stationary vector whose elements represent the intensity of the spectral lines.
  28. J

    Markov Chain - Time Reversibility proof

    Homework Statement Let X = {Xn : n ≥ 0} be an irreducible, aperiodic Markov chain with finite state space S, transition matrix P, and stationary distribution π. For ##x, y \in \mathbb{R}^{|S|}##, define the inner product ##\langle x,y\rangle = \sum_{i\in S} x_i y_i \pi_i##, and let ##L^2(\pi) = \{x \in \mathbb{R}^{|S|} : \langle x,x\rangle < \infty\}##. Show that X is time-reversible...
  29. mmarkov

    Markov chains: period of a state

    Hello, I am trying to understand the intuition behind the definition of the period of a state in a Markov chain. Say, for example, we can go from state i to state i in 3, 5, 7, 9, 11, ... steps. The gcd here is one. So is this state aperiodic, or does it have period 2? Thanks
  30. W

    Markov's Inequality for Geometric Distribution.

    Homework Statement Let ##X\sim \text{Geometric}(p)##. Using Markov's inequality, find an upper bound for ##P(X\geq a)##, for a positive integer ##a##. Compare the upper bound with the real value of ##P(X\geq a)##. Then, using Chebyshev's inequality, find an upper bound for ##P(|X - EX| \geq b)##. Homework Equations ##P(X\geq a) \leq E[X]/a##...
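
    For the support convention ##X \in \{1, 2, \ldots\}## with ##E[X] = 1/p## and ##P(X\geq a) = (1-p)^{a-1}##, the comparison is

    $$P(X\geq a) \leq \frac{E[X]}{a} = \frac{1}{pa}, \qquad P(X\geq a) = (1-p)^{a-1}.$$

    For example, with ##p = 1/2## and ##a = 5## the bound gives 0.4 while the exact value is ##(1/2)^4 = 0.0625##, so Markov's bound is quite loose here.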
  31. B

    Finding the rate of convergence for a markov chain

    Homework Statement For the following Markov chain, find the rate of convergence to the stationary distribution: $$\begin{bmatrix} 0.4 & 0.6 \\ 1 & 0 \end{bmatrix}$$ Homework Equations None The Attempt at a Solution I found the eigenvalues, which were ##\lambda_1 = -0.6## and ##\lambda_2 = 1##. The...
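
    The rate of convergence is governed by the second-largest eigenvalue in absolute value, here 0.6, so deviations from the stationary distribution shrink roughly like ##0.6^n## per step. A quick check in R:

      P <- matrix(c(0.4, 0.6,
                    1.0, 0.0), nrow = 2, byrow = TRUE)
      eigen(P)$values   # the spectrum is {1, -0.6}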
  32. C

    Markov chain: finding a general solution

    1. The problem statement Given a stochastic matrix P with states s_1, ..., s_5: $$P = \begin{pmatrix} 1 & p_2 & 0 & 0 & 0\\ 0 & 0 & p_3 & 0 & 0\\ 0 & q_2 & 0 & p_4 & 0\\ 0 & 0 & q_3 & 0 & 0 \\ 0 & 0 & 0 & q_4 & 1 \end{pmatrix}$$ and the matrix A (which is obviously related to P, but I can't see...
  33. P

    What Is the Expected First-Passage Time in a Markov Chain?

    Homework Statement A Markov process has probabilities ##p_{j,j+1} = 1-p_{j,0} = \left(\frac{j+1}{j+2}\right)^k## for ##j=0,1,2,\ldots## If ##T_j = \min\{n>1 : X_n=j\}##, what is ##E[T_j\mid X_0=j]## for ##j=0,1,2,\ldots##? Homework Equations The Attempt at a Solution I figured that ##T_j\mid X_0=j## is ##j+1## but don't know how to work out the...
  34. P

    MHB Some more Brownian motions and Birth-Death processes in Markov Chain

    Hi, I have a deadline tomorrow and I need some urgent help. I don't have a background in Markov chains, so I would be very thankful if you could help with step-by-step solutions. Thanks a lot in advance. Here are the questions: 2) Give an example of birth rates λ(i) > 0 such...
  35. P

    MHB Some questions about Brownian Motion and Birth-Death in Markov chain

    Hi, I need urgent answers. Basically, I don't have a background in Markov chains, and I don't actually need to learn the theory right now, but I have to solve the questions below somehow. If somebody can give detailed answers to the questions below (from the beginning to the final solution, with explanations), then I will...
  36. ElijahRockers

    What is the proportion of time the runner runs barefooted using a Markov Chain?

    Homework Statement A certain laid-back runner owns three pairs of shoes, which he keeps by either the front or back doors of his house. Each morning he is equally likely to leave through the front or back door, and then after his run, he is equally likely to enter through the front or back...
  37. J

    Continuous-Time Markov Chain populations

    Hello, I'm working on a CTMC three-state model to obtain time-dependent populations of each state. S <=> E <=> G I have built a rate matrix for this (diffusion) process. K = \begin{pmatrix} K_{SS} & K_{SE} & K_{SG}\\ K_{ES} & K_{EE} & K_{EG}\\ K_{GS} & K_{GE} & K_{GG} \end{pmatrix} =...
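
    A sketch in R of getting time-dependent populations from a rate matrix via the matrix exponential, ##p(t) = p(0)\,e^{Kt}## (row convention, rows of K summing to zero); the rate values below are placeholders for the thread's diffusion rates:

      library(Matrix)                   # provides expm()
      K <- matrix(c(-0.2,  0.2,  0.0,
                     0.1, -0.3,  0.2,
                     0.0,  0.1, -0.1), nrow = 3, byrow = TRUE)
      p0 <- c(1, 0, 0)                  # start everything in state S
      t  <- 5
      p0 %*% as.matrix(expm(Matrix(K * t)))   # populations of S, E, G at time t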
  38. M

    MHB Markov Chain - Is state 2 periodic?

    Hey! Given the Markov chain $\{X_n, n \geq 1\}$ and the following probability transition matrix: $\begin{pmatrix} 0 & 1/3 & 2/3\\ 1/4 & 3/4 & 0\\ 2/5 & 0 & 3/5 \end{pmatrix}$ All states communicate, so the chain is irreducible, isn't it? Could you tell me if state $2$ is periodic?
  39. D

    Solving Markov Chain Problem: Probability of Infection

    Homework Statement Consider the following (simple) epidemic model: A population of size N consists of infected and susceptible individuals. During each time period, each of the N choose 2 possible pairs in the population will come in contact with probability p. If a pair is in contact and...
  40. F

    The significance of pi in a Markov chain

    A question in regards to nomenclature: I am curious as to the significance of using the symbol π for a time-independent distribution. Does it have any relation to the number π or circular geometry? Or does it maybe come from the invariance of the ##i## in the ##P## matrix, in the sense that ##P_{i,j} \to \pi_j##? I don't like...
  41. K

    Markov chain: chance one state is reached before another

    Hey, could someone explain why this is true? I am trying to understand how to solve such a problem, but I don't understand the solution. Problem: Given a Markov chain $\left\{X_{n}: n\in\mathbb{N}\right\}$ with four states 1, 2, 3, 4 and transition matrix P = \begin{pmatrix} 0 &...
  42. M

    Transition Matrix ( Markov Chain Monte Carlo)

    1. Find a regular transition matrix that is not time reversible, i.e., one that doesn't satisfy the balance equations. 2. ##P_{i,j} = 0 \neq P_{j,i}## for some ##i## and ##j##. My understanding from Markov chain Monte Carlo is that for the transition matrix to be regular the matrix has to have all positive entries and each row...
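
    One standard example, assuming "balance equations" means detailed balance: a chain with a consistent cyclic drift. Take

    $$P = \begin{pmatrix} 0.8 & 0.2 & 0 \\ 0 & 0.8 & 0.2 \\ 0.2 & 0 & 0.8 \end{pmatrix}.$$

    ##P^2## has all entries positive, so ##P## is regular; it is doubly stochastic, so ##\pi## is uniform, yet ##\pi_1 P_{12} = 0.2/3 \neq 0 = \pi_2 P_{21}##, so detailed balance fails.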
  43. M

    Markov Chain Monte Carlo question

    I was wondering if anyone could help me with this problem dealing with Markov chain Monte Carlo: find a regular transition matrix that is not time reversible, i.e., one that doesn't satisfy the balance equations. My understanding from Markov chain Monte Carlo is that for the transition matrix to be...
  44. B

    Solving Markov Chain Question: Closed Form Expression for ##f^{(n)}_{ij}## | PDF Attached

    This is slightly applied stuff. Please look at the attached PDF. How do we express the function ##f^{(n)}_{ij}##, found at the bottom of page 4, in closed form, in terms of the values of ##n##, ##i##, and ##j##? If we can figure this out, then, as noted halfway down page 5, ##f_{ij}## is simply the sum of ##f^{(n)}_{ij}## from ##n=1##...
  45. A

    Conditional Probability - Markov chain

    Hi, I was reading about Markov chains and came across the following statement: "The conditional distribution p(x_n|x_{n-1}) will be specified by a set of K-1 parameters for each of the K states of x_{n-1} giving a total of K(K-1) parameters." In the above we have assumed that the...
  46. dexterdev

    MATLAB How to simulate a 2nd order Markov chain (if possible, Nth order) in MATLAB

    Hi PF, I would like to simulate an N-th order Markov chain (not by means of hidden Markov models, but an ordinary Markov chain) using MATLAB. If N-th order is too heavy, at least 2nd or 3rd order will do. TIA
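
    The thread asks for MATLAB, but the idea is language-independent, and a sketch in R (matching the code elsewhere on this page) shows it: an N-th order chain is a first-order chain on N-tuples, so encode the last two states as a row index. All probabilities below are made up:

      set.seed(1)
      S <- 2
      # T2[(i - 1) * S + j, k] = P(next = k | state two ago = i, previous = j)
      T2 <- matrix(c(0.9, 0.1,    # after (1, 1)
                     0.3, 0.7,    # after (1, 2)
                     0.6, 0.4,    # after (2, 1)
                     0.2, 0.8),   # after (2, 2)
                   nrow = S^2, byrow = TRUE)
      n <- 1000
      x <- numeric(n); x[1:2] <- c(1, 1)      # arbitrary initial pair
      for (t in 3:n) {
        row <- (x[t - 2] - 1) * S + x[t - 1]  # encode the last two states
        x[t] <- sample(1:S, 1, prob = T2[row, ])
      }
      table(x) / n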
  47. R

    MHB How Does a Markov Chain Model the Distribution of Stickers Among Children?

    A teacher leaves out a box of N stickers for children to take home as treats. Children form a queue and look at the box one by one. When a child finds k ⩾ 1 stickers in the box, he or she takes a random number of stickers that is uniformly distributed on {1, 2, …, k}. 1. What is the expectation of...
  48. T

    Markov Chain Steady State (Where am I going wrong?)

    Homework Statement A subpart of the question requires me to find the steady state of the transition matrix: $$P=\begin{bmatrix} 0.1 & 0.7 & 0.2 \\ 0.1 & 0.8 & 0.1\\ 0.3 & 0.1 & 0.6 \end{bmatrix}$$ Homework Equations We thus need to find the vector ##\boldsymbol{v}## in the equation...
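
    A sketch in R of one way to get the steady state: the stationary distribution is the left eigenvector of P for eigenvalue 1, renormalized to sum to 1.

      P <- matrix(c(0.1, 0.7, 0.2,
                    0.1, 0.8, 0.1,
                    0.3, 0.1, 0.6), nrow = 3, byrow = TRUE)
      e <- eigen(t(P))                            # left eigenvectors of P
      v <- Re(e$vectors[, which.min(abs(e$values - 1))])
      v / sum(v)                                  # the stationary distribution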
  49. B

    Simulating a discrete-time Markov chain

    Hi, I'm trying to simulate a discrete-time Markov chain in MATLAB. Unfortunately I am neither a Markov chain expert nor a good MATLAB coder. My problem is simple: I have a transition probability matrix with 100 states (100×100) and I want to simulate 1000 steps beginning from state 1...
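
    The thread asks for MATLAB; here is a sketch of the same logic in R (matching the code elsewhere on this page), with a random row-stochastic matrix standing in for the real 100x100 one:

      set.seed(1)
      n_states <- 100
      P <- matrix(runif(n_states^2), n_states)
      P <- P / rowSums(P)          # normalize each row to sum to 1
      path <- numeric(1000); path[1] <- 1
      for (t in 2:1000)
        path[t] <- sample(n_states, 1, prob = P[path[t - 1], ])
      head(path, 20)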
  50. M

    General first-order Markov chain with 2 states

    Hi, I hope you can give me an answer regarding this trellis diagram: why, in this picture of a general first-order Markov chain with 2 states, should we know the probability of each state at each time? A general first-order Markov chain can be time-dependent (non-stationary), so transition probabilities can change...