How Does a Markov Chain Model the Distribution of Stickers Among Children?

In summary: by the tower rule, the expected take of the second child is E(n2) = E(E(n2 | n1)). This is only an approximation until the boundary case is handled, since the first child might take all the stickers from the box, but it gives a good starting point. For the second question, the thread looks for a way to express E_{N+1} in terms of E_1 + ⋯ + E_N.
  • #1
rachelro
A teacher leaves out a box of N stickers for children to take home as treats. Children form a queue and look at the box one by one. When a child finds k⩾1 stickers in the box, he or she takes a random number of stickers that is uniformly distributed on {1,2,…,k}.

1- What is the expectation of the number of stickers taken by the second child, as a function of the initial number of stickers N?

2- Let E_N denote the expected number of children who take at least one sticker from the box, given that it initially contained N stickers. How can I derive a formula for E_{N+1} in terms of E_1 + ⋯ + E_N? Also, how can E_N be expressed in terms of k?
 
  • #2
Hello and welcome to MHB, Sarah! :D

In order for our helpers to provide you with the best help, we ask that you show what you have tried so far, or what your thoughts are on how to begin. This way, we can show you what you may be doing wrong, or provide you with a hint or suggestion on how to proceed.
 
  • #3
Sarah said:
A teacher leaves out a box of N stickers for children to take home as treats. Children form a queue and look at the box one by one. When a child finds k⩾1 stickers in the box, he or she takes a random number of stickers that is uniformly distributed on {1,2,…,k}.

Very interesting problem!... In order to understand it correctly, however, let's do an example: the first child finds N stickers... that means that he/she takes a random number of stickers uniformly distributed on {1,2,...,N}?... and that means that, if the first child selects k stickers, the number of stickers selected by the second child is uniformly distributed on {1,2,...,N-k}?... Kind regards $\chi$ $\sigma$
 
  • #4
Sarah said:
A teacher leaves out a box of N stickers for children to take home as treats. Children form a queue and look at the box one by one. When a child finds k⩾1 stickers in the box, he or she takes a random number of stickers that is uniformly distributed on {1,2,…,k}.

1- What is the expectation of the number of stickers taken by the second child, as a function of the initial number of stickers N?

Under the assumption that what I wrote in post #3 is true, let's try to answer question 1. The mean number of objects selected among k is...

$\displaystyle E\{n\} = \sum_{n=1}^{k} \frac{n}{k} = \frac{k\,(k+1)}{2\,k} = \frac{k+1}{2}$ (1)

If $n_{1}$ is the number of objects selected by the first child, then the second child can choose among $N-n_{1}$ objects, so the mean number of objects chosen by the second child is...

$\displaystyle E \{n_{2}\} = \frac{1}{2\,N}\ \sum_{n=1}^{N} (N-n+1) = \frac{1}{2\,N}\ \frac{N\,(N+1)}{2} = \frac{N+1}{4}$ (2)
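This expectation can be checked by direct enumeration (the helper name `expected_second_take` below is just for illustration). One boundary subtlety: when $n_1 = N$ the box is already empty and the second child takes nothing, whereas the sum in (2) counts that case as $1/2$ on average, so the exact value is $(N+1)/4 - 1/(2N)$; the two agree up to an $O(1/N)$ correction.

```python
def expected_second_take(N):
    """Exact E[n2] for the model in the thread: the first child takes n1
    uniform on {1,...,N}; the second child then takes uniform on
    {1,...,N-n1}, or nothing if the box is already empty."""
    total = 0.0
    for n1 in range(1, N + 1):
        remaining = N - n1
        if remaining >= 1:
            total += (remaining + 1) / 2  # mean of Uniform{1,...,remaining}
        # if remaining == 0 the second child takes nothing
    return total / N

# Compare the exact enumeration against the closed form (N+1)/4 from (2).
for N in (4, 10, 100):
    print(N, expected_second_take(N), (N + 1) / 4)
```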

Kind regards

$\chi$ $\sigma$
 
  • #5
Thank you for your comment, but I think the distribution for the first child should be uniform on {1, ..., N}. I was thinking of proceeding with the following argument:

n1 is the number of objects taken by the first child and it is uniformly distributed on {1, ..., N}, while n2, given n1, is uniformly distributed on {1, ..., N - n1}. So we have to evaluate:
E(n2) = E( E(n2 | N - n1) )
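One way to explore question 2 numerically (a sketch with illustrative function names, not from the original posts): simulate the queue of children, and compare against the recursion E_{N+1} = 1 + (E_1 + ⋯ + E_N)/(N+1), which follows by conditioning on the first child's take (uniform on {1, ..., N+1}, leaving anywhere from 0 to N stickers behind). Numerically, the resulting values agree with the harmonic numbers 1 + 1/2 + ⋯ + 1/N.

```python
import random

def children_count_mc(N, trials=100_000, seed=1):
    """Monte Carlo estimate of E_N: the expected number of children who
    take at least one sticker, starting from N stickers."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        k = N
        while k >= 1:
            k -= rng.randint(1, k)  # this child takes Uniform{1,...,k}
            total += 1
    return total / trials

def E_by_recursion(N_max):
    """E_n = 1 + (E_1 + ... + E_{n-1})/n with E_0 = 0, i.e. the recursion
    E_{N+1} = 1 + (E_1 + ... + E_N)/(N+1) asked about in the thread."""
    E = [0.0] * (N_max + 1)
    running_sum = 0.0  # E_1 + ... + E_{n-1}
    for n in range(1, N_max + 1):
        E[n] = 1 + running_sum / n
        running_sum += E[n]
    return E

E = E_by_recursion(20)
print(children_count_mc(10), E[10])  # the two estimates should be close
```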
 

FAQ: How Does a Markov Chain Model the Distribution of Stickers Among Children?

What is probability in the context of Markov chains?

Probability in the context of Markov chains refers to the likelihood of a certain event or state occurring at a specific point in time, based on the current state of the system and the transition probabilities between states. It is a way to model and predict the behavior of a system over time.

How is a Markov chain different from other types of mathematical models?

A Markov chain is a type of mathematical model that uses probability to describe the transition from one state to another in a system. Unlike other models, it only considers the current state of the system and does not take into account any past states or future predictions.

What are the applications of Markov chains in real-world scenarios?

Markov chains have various applications in different fields such as finance, economics, biology, and computer science. They are commonly used to model and analyze systems with discrete states that change over time, such as stock market trends, weather patterns, and genetic sequences.

How are transition matrices used in Markov chains?

A transition matrix is a square matrix that represents the probabilities of transitioning from one state to another in a system. In Markov chains, these matrices are used to calculate the probabilities of different states at each time step and to predict the long-term behavior of the system.
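As a concrete illustration tied to the thread above (a sketch, not part of the original discussion): the sticker box is itself a Markov chain on the states {0, 1, ..., N} of stickers remaining, with state 0 (empty box) absorbing and transition probabilities P(k → k−j) = 1/k for j = 1, ..., k. The expected number of children is then the expected time to absorption, computable from the transient block of the transition matrix.

```python
import numpy as np

N = 6
# States 0..N = stickers left; state 0 (empty box) is absorbing.
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0
for k in range(1, N + 1):
    for taken in range(1, k + 1):
        P[k, k - taken] = 1.0 / k  # child takes Uniform{1,...,k} stickers

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a distribution

# Expected number of steps (children) to absorption from each state,
# by solving (I - Q) t = 1 with Q the transient-to-transient block.
Q = P[1:, 1:]
t = np.linalg.solve(np.eye(N) - Q, np.ones(N))
print(t)  # t[k-1] = expected number of children when k stickers remain
```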

Can Markov chains be used for continuous systems?

Markov chains are typically used for discrete systems that change over time in discrete steps. However, they can also be adapted for continuous systems by using smaller time intervals and adjusting the transition probabilities accordingly. These continuous-time Markov chains are often used in physics and chemistry to model and predict the behavior of complex systems.
