Unsolved statistics questions from other sites, part II

  • MHB
  • Thread starter chisigma
  • Start date
  • Tags
    Statistics
In summary, this conversation discusses an unsolved statistics question posted on the website Art of Problem Solving. The question involves a game where a random number generator selects an integer between 1 and n; the player wins n dollars if n is selected and loses 1 dollar otherwise, and keeps pressing the button until he is 'even' or has more money than he started with. A member proposes a solution and discusses the expected value of the number of times the button must be pressed. There is also some confusion about the wording and specifics of the problem.
  • #1
chisigma
Gold Member
MHB
Like a previous thread, this one is opened with the purpose of answering statistics questions posted on other sites that have gone unanswered for at least three days. Let's start with an interesting problem about 'random walking'…

Posted on 11 26 2011 on www.artofproblemsolving.com by the member mr10123 and not yet solved…

Daniel plays a game with a random number generator. It randomly selects an integer between 1 and n inclusive at the press of a button. If n is selected, he wins n dollars. If any other integer is selected, he loses 1 dollar. He keeps pressing the button until he has more than ‘even’ or he has more money than he begins with. Daniel’s disposable money is unlimited. What is the expected value of the number of times he must press the button?...

Kind regards

$\chi$ $\sigma$
 
  • #2
Re: Unsolved statistic questions from other sites, part II

In my opinion the solution requires the preliminary computation of the following finite sum...

$\displaystyle S_{n}= \sum_{k=1}^{n} k\ x^{k}$ (1)

Starting from the well-known formula...

$\displaystyle \sum_{k=0}^{n} x^{k} = \frac{1-x^{n+1}}{1-x}$ (2)

... after a few steps we obtain...

$\displaystyle S_{n}= x\ \frac{d}{d x}\ \frac{1-x^{n+1}}{1-x}= \frac{x}{(1-x)^{2}}\ \{1-x^{n+1} -(n+1)\ (1-x)\ x^{n}\} = \frac{x\ (1-x^{n})}{(1-x)^{2}}\ -n\ \frac{x^{n+1}}{1-x}$ (3)

Of course it follows that...

$\displaystyle \sum_{k=m}^{n} k\ x^{k}= S_{n}- S_{m-1}= \frac{x^{m}-x^{n+1}}{(1-x)^{2}} + \frac{(m-1)\ x^{m} -n\ x^{n+1}}{1-x}\ $ (4)
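As a quick numerical check of (3) and (4), a minimal Python sketch (the function names are illustrative only) compares the closed forms with the direct sums...

```python
# Minimal numerical check of (3) and (4); names and test values are illustrative.
def s_direct(n, x, m=1):
    # brute-force sum_{k=m}^{n} k x^k
    return sum(k * x ** k for k in range(m, n + 1))

def s_closed(n, x):
    # closed form (3): S_n = x(1 - x^n)/(1-x)^2 - n x^(n+1)/(1-x), valid for x != 1
    return x * (1 - x ** n) / (1 - x) ** 2 - n * x ** (n + 1) / (1 - x)

n, m, x = 12, 4, 0.7
assert abs(s_direct(n, x) - s_closed(n, x)) < 1e-12
# partial sum (4): S_n - S_{m-1}
partial = (x ** m - x ** (n + 1)) / (1 - x) ** 2 \
        + ((m - 1) * x ** m - n * x ** (n + 1)) / (1 - x)
assert abs(s_direct(n, x, m) - partial) < 1e-12
print("formulas (3) and (4) agree with the direct sums")
```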

Kind regards

$\chi$ $\sigma$
 
  • #3
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Like a previous thread, this one is opened with the purpose of answering statistics questions posted on other sites that have gone unanswered for at least three days. Let's start with an interesting problem about 'random walking'…

Posted on 11 26 2011 on www.artofproblemsolving.com by the member mr10123 and not yet solved…
Daniel plays a game with a random number generator. It randomly selects an integer between 1 and n inclusive at the press of a button. If n is selected, he wins n dollars. If any other integer is selected, he loses 1 dollar. He keeps pressing the button until he has more than ‘even’ or he has more money than he begins with. Daniel’s disposable money is unlimited. What is the expected value of the number of times he must press the button?...

The first step is the computation of the probability that the iterations end at the k-th trial. We have...

$\displaystyle \text{for}\ k=1\ \text{to}\ n+1 ,\ P_{k}= \frac{1}{n}\ (1-\frac{1}{n})^{k-1}$

$\displaystyle \text{for}\ k=n+2\ \text{to}\ 2n+2 ,\ P_{k}= \frac{1}{n^{2}}\ (1-\frac{1}{n})^{k-2}$

$\displaystyle \text{for}\ k=2n+3\ \text{to}\ 3 n+3 ,\ P_{k}= \frac{1}{n^{3}}\ (1-\frac{1}{n})^{k-3}$

... and in general...

$\displaystyle \text{for}\ k= (j-1)\ n + j\ \text{to}\ j\ (n+1) ,\ P_{k}= \frac{1}{n^{j}}\ (1-\frac{1}{n})^{k-j} = \frac{1}{(n-1)^{j}}\ (1-\frac{1}{n})^{k}$ (1)

... so that the expected value of k can be computed…

$\displaystyle E \{k\}= \sum_{k=1}^{\infty} k\ P_{k}= \sum_{j=1}^{\infty} \frac{1}{(n-1)^{j}}\ \sum_{k=(j-1)\ n + j}^{j\ (n+1) } k\ (1-\frac{1}{n})^{k}$ (2)

The second finite sum can be computed using the result of the previous post, and that will be done [possibly! (Blush)...] in later posts...
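In the meantime, a Monte Carlo sketch in Python gives a value of $E\{k\}$ to compare against. It assumes the reading of the stopping rule used above: each press pays +n with probability 1/n and -1 otherwise, and play stops as soon as the balance is back to 'even' or better after at least one press; this is only one reading of the ambiguous wording.

```python
import random

# Monte Carlo sketch of the game under the reading used above; the function
# name and the choices n = 5, 200_000 trials are illustrative only.
def presses_until_stop(n, rng):
    balance, presses = 0, 0
    while presses == 0 or balance < 0:
        presses += 1
        balance += n if rng.randrange(n) == 0 else -1
    return presses

rng = random.Random(1)
n, trials = 5, 200_000
est = sum(presses_until_stop(n, rng) for _ in range(trials)) / trials
print(f"estimated E[k] for n = {n}: {est:.3f}")
```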

Kind regards

$\chi$ $\sigma$
 
  • #4
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Like a previous thread, this one is opened with the purpose of answering statistics questions posted on other sites that have gone unanswered for at least three days. Let's start with an interesting problem about 'random walking'…

Posted on 11 26 2011 on www.artofproblemsolving.com by the member mr10123 and not yet solved…

Daniel plays a game with a random number generator. It randomly selects an integer between 1 and n inclusive at the press of a button. If n is selected, he wins n dollars. If any other integer is selected, he loses 1 dollar. He keeps pressing the button until he has more than ‘even’ or he has more money than he begins with. Daniel’s disposable money is unlimited. What is the expected value of the number of times he must press the button?...

Kind regards

$\chi$ $\sigma$

Is that the exact wording? Could you provide a link to the problem on AoPS?

I take it n is the index for the play number, but the phrase "He keeps pressing the button until he has more than 'even' or .. " is ambiguous. As I read this the stopping condition is satisfied at the beginning of the game and so D. never presses the button. Also, if he did press the button, the game would immediately stop since D. would win and again satisfy the stopping condition. So maybe we should assume that n is a fixed integer, but the stopping condition is still satisfied before the first play.

CB
 
Last edited:
  • #5
Re: Unsolved statistic questions from other sites, part II

CaptainBlack said:
Is that the exact wording? Could you provide a link to the problem on AoPS?

I take it n is the index for the play number, but the phrase "He keeps pressing the button until he has more than 'even' or .. " is ambiguous. As I read this the stopping condition is satisfied at the beginning of the game and so D. never presses the button. Also, if he did press the button, the game would immediately stop since D. would win and again satisfy the stopping condition. So maybe we should assume that n is a fixed integer, but the stopping condition is still satisfied before the first play.

CB

The original post is here...

http://www.artofproblemsolving.com/Forum/viewtopic.php?f=498&t=433457

Kind regards

$\chi$ $\sigma$
 
  • #6
Re: Unsolved statistic questions from other sites, part II

Posted on 08 17 2012 on www.mathhelpforum.com by the member pyromania and not yet solved…

A post office has 2 clerks. Alice enters the post office while 2 other customers, Bob and Claire, are being served by the 2 clerks. She is next in line. Assume that the time a clerk spends serving a customer has the exponential $(\lambda)$ distribution. What is the expected total time that Alice needs to spend at the post office?... The answer gives that the expected waiting time (waiting in line) is $\frac{1}{2}\ \frac{1}{\lambda}$ and the expected time being served is $\frac{1}{\lambda}$, so the total time is $\frac{3}{2}\ \frac{1}{\lambda}$. I don't understand why the expected waiting time (waiting in line) is $\frac{1}{2}\ \frac{1}{\lambda}$…

Kind regards

$\chi$ $\sigma$
 
  • #7
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 08 17 2012 on www.mathhelpforum.com by the member pyromania and not yet solved…

A post office has 2 clerks. Alice enters the post office while 2 other customers, Bob and Claire, are being served by the 2 clerks. She is next in line. Assume that the time a clerk spends serving a customer has the exponential $(\lambda)$ distribution. What is the expected total time that Alice needs to spend at the post office?... The answer gives that the expected waiting time (waiting in line) is $\frac{1}{2}\ \frac{1}{\lambda}$ and the expected time being served is $\frac{1}{\lambda}$, so the total time is $\frac{3}{2}\ \frac{1}{\lambda}$. I don't understand why the expected waiting time (waiting in line) is $\frac{1}{2}\ \frac{1}{\lambda}$…

Suppose we have two exponential r.v.'s X and Y with...

$\displaystyle f_{x}(t)=f_{y}(t)= \begin{cases} \lambda\ e^{- \lambda\ t}&\text{if}\ t>0 \\ 0 &\text{if}\ t<0 \end{cases}$ (1)

... and we want to find the p.d.f. of the r.v. Z= min (X,Y). Observing the following diagram...

View attachment 304

... we have...

$\displaystyle P\{ Z<z\} = 1- \lambda^{2} \int_{z}^{\infty} \int_{z}^{\infty} e^{- \lambda\ (x+y)}\ dx\ dy = 1- e^{-2\ \lambda\ z}$ (2)

... so that we have...

$\displaystyle f_{z}(z)= \begin{cases} 2 \lambda\ e^{- 2 \lambda\ z}&\text{if}\ z>0 \\ 0 &\text{if}\ z<0 \end{cases}$ (3)

... and...

$\displaystyle E\{Z\}= \int_{0}^{\infty} z\ f_{z}(z)\ dz = \frac{1}{2\ \lambda}$ (4)
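A minimal Python sketch (the value $\lambda = 1.5$ and the trial count are illustrative choices) confirms (4) numerically...

```python
import random

# Quick check of (4): the minimum of two independent Exp(lam) service times
# is Exp(2*lam), so Alice's expected wait in line is 1/(2*lam).
lam, trials = 1.5, 500_000
rng = random.Random(2)
mean_min = sum(min(rng.expovariate(lam), rng.expovariate(lam))
               for _ in range(trials)) / trials
print(mean_min, 1 / (2 * lam))  # the two values should nearly agree
```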

Kind regards

$\chi$ $\sigma$

 

  • #8
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Suppose we have two exponential r.v.'s X and Y with...

$\displaystyle f_{x}(t)=f_{y}(t)= \begin{cases} \lambda\ e^{- \lambda\ t}&\text{if}\ t>0 \\ 0 &\text{if}\ t<0 \end{cases}$ (1)

... and we want to find the p.d.f. of the r.v. Z= min (X,Y). Observing the following diagram...

View attachment 304

... we have...

$\displaystyle P\{ Z<z\} = 1- \lambda^{2} \int_{z}^{\infty} \int_{z}^{\infty} e^{- \lambda\ (x+y)}\ dx\ dy = 1- e^{-2\ \lambda\ z}$ (2)

... so that we have...

$\displaystyle f_{z}(z)= \begin{cases} 2 \lambda\ e^{- 2 \lambda\ z}&\text{if}\ z>0 \\ 0 &\text{if}\ z<0 \end{cases}$ (3)

... and...

$\displaystyle E\{Z\}= \int_{0}^{\infty} z\ f_{z}(z)\ dz = \frac{1}{2\ \lambda}$ (4)

Kind regards

$\chi$ $\sigma$


Because the exponential distribution is the distribution of the wait for the next arrival of events that arrive at random in time, it is clear that the wait for the first of two events with mean waits \(t_1\) and \(t_2\) is exponential with mean wait \(t_1 t_2/(t_1+t_2)\). So if \(t_1=t_2\) the mean wait is \(t_1/2\), and the result follows with no integrations, since expectations add.

CB
 
  • #9
Re: Unsolved statistic questions from other sites, part II

Posted on 12 21 2011 on www.artofproblemsolving.com by the member ayus_2008 and not yet solved…

A drunk person walks along the number line as follows: his starting position is x=0, and he takes a step forward or a step backward with equal probability. What is the probability that he will reach his home, which is at x=3, before falling into the pit, which is at x=-2?...

Kind regards

$\chi$ $\sigma$
 
  • #10
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 12 21 2011 on www.artofproblemsolving.com by the member ayus_2008 and not yet solved…

A drunk person walks along the number line as follows: his starting position is x=0, and he takes a step forward or a step backward with equal probability. What is the probability that he will reach his home, which is at x=3, before falling into the pit, which is at x=-2?...


Calling $X_{n}$ the sequence of states, $X_{k}=3$ is possible only if $X_{k-1}=2$ and k is an odd number, and it will be...

$\displaystyle P \{X_{2n+1}=3\ |\ X_{2n}=2\}= \frac{1}{2}$ (1)

Now if we set $p_{0}(n) = P\{X_{2n}=0\}$ and $p_{2}(n) = P\{X_{2n}=2\}$, counting the two-step paths that avoid the absorbing barriers [there are two paths from 0 back to 0, namely $0 \to 1 \to 0$ and $0 \to -1 \to 0$, but only one path each for $0 \to 2$, $2 \to 0$ and $2 \to 2$] gives...

$\displaystyle p_{0}(n+1)= \frac{1}{2}\ p_{0}(n) + \frac{1}{4}\ p_{2}(n)$

$\displaystyle p_{2}(n+1)= \frac{1}{4}\ p_{0}(n) + \frac{1}{4}\ p_{2}(n)$ (2)

The (2) is a linear system of difference equations with initial conditions $p_{0}(0)=1$ and $p_{2}(0)=0$. Setting $s_{0}= \sum_{n=0}^{\infty} p_{0}(n)$ and $s_{2}= \sum_{n=0}^{\infty} p_{2}(n)$ and summing (2) over n, we obtain...

$\displaystyle s_{0}= 1 + \frac{s_{0}}{2} + \frac{s_{2}}{4}\ ,\ s_{2}= \frac{s_{0}}{4} + \frac{s_{2}}{4} \implies s_{2}= \frac{4}{5}$ (3)

Taking into account (1), the probability that the random walker arrives at $X=3$ before falling into the pit is...

$\displaystyle P= \frac{1}{2}\ s_{2}= \frac{2}{5}$ (4)

... in agreement with the classical gambler's ruin formula $\frac{2}{2+3}$.
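A Monte Carlo sketch in Python supports the result...

```python
import random

# Monte Carlo sketch of the walk: unit steps from x = 0 with equal
# probability, absorbed at x = 3 (home) or x = -2 (pit); the estimate
# should approach 2/5.
rng = random.Random(3)
trials, home = 200_000, 0
for _ in range(trials):
    x = 0
    while -2 < x < 3:
        x += rng.choice((1, -1))
    home += (x == 3)
print(home / trials)  # ~ 0.4
```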

Kind regards

$\chi$ $\sigma$
 
  • #11
Re: Unsolved statistic questions from other sites, part II

Posted on 08 12 2012 on www.mathhelpforum.com by the member Newtonian and not yet solved…

Suppose we have independent normally distributed random variables $X_{1}, X_{2},\dots, X_{n}$, where $X_{i} \sim N(\mu_{i}, \sigma_{i}^{2})$ and all the $\mu_{i}$ and $\sigma_{i}$ are known. Define $U= \sum_{i} a_{i}\ X_{i}$ and $V= \sum_{i} b_{i}\ X_{i}$ (where all the $a_{i}$ and $b_{i}$ are known). Form the complex number $Z= U + i\ V$. What would be a good way of computing the distribution of $\arg Z$?...


Kind regards

$\chi$ $\sigma$
 
  • #12
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 08 12 2012 on www.mathhelpforum.com by the member Newtonian and not yet solved…

Suppose we have independent normally distributed random variables $X_{1}, X_{2},\dots, X_{n}$, where $X_{i} \sim N(\mu_{i}, \sigma_{i}^{2})$ and all the $\mu_{i}$ and $\sigma_{i}$ are known. Define $U= \sum_{i} a_{i}\ X_{i}$ and $V= \sum_{i} b_{i}\ X_{i}$ (where all the $a_{i}$ and $b_{i}$ are known). Form the complex number $Z= U + i\ V$. What would be a good way of computing the distribution of $\arg Z$?...

The solution of this problem [a little more difficult than most of the previously posted problems...] requires a preliminary computation on the basis of fig. 1...

View attachment 314

fig. 1

The triangle with sides a, b and c in the figure obeys the law of sines, so that…

$\displaystyle \frac {a}{\sin \alpha}=\frac{b}{\sin \beta}= \frac{c}{\sin\theta}$ (1)

… and let us suppose that $a$, $\theta$ and $\beta= \pi- \gamma$ are given, so that we can find from (1) $c$ as a function of them…

$\displaystyle c= a\ \frac{\sin \theta}{\sin(\gamma-\theta)}\ ,\ \gamma \ge \theta$ (2)

Now we suppose we have the complex r.v. $A= a + X + i\ Y = a + R\ e^{i\ \Gamma}$, where $X \sim N (0,\sigma^{2})$ and $Y \sim N(0,\sigma^{2})$ are two independent r.v.'s. The r.v.'s $R$ and $\Gamma$ have p.d.f.'s given by…

$\displaystyle f_{R} (\rho) = \frac{\rho}{\sigma^{2}}\ e^{- \frac{\rho^{2}}{2\ \sigma^{2}}}$

$\displaystyle f_{\Gamma} (\gamma) =\begin{cases} \frac{1}{2\ \pi}&\text{if}\ -\pi<\gamma<\pi \\ 0&\text{otherwise}\end{cases}$ (3)

From (2) and (3) we derive the probability that A is in the 'colored' area of fig. 1...

$\displaystyle P \{\text{arg}\ A < \theta\}=\frac{1}{2} + \int_{0}^{\theta}\int_{0}^{\infty} f_{R}(\rho)\ f_{\Gamma}(\gamma)\ d \rho\ d \gamma + \int_{\theta}^{\pi} \int_{0}^{a\ \frac{\sin\theta}{\sin (\gamma-\theta)}} f_{R}(\rho)\ f_{\Gamma} (\gamma)\ d \rho\ d\gamma =$

$\displaystyle = 1- \frac{1}{2\ \pi}\ \int_{\theta}^{\pi} e^{- \frac{a^{2}\ \sin^{2} \theta}{2\ \sigma^{2}\ \sin^{2}(\gamma-\theta)}}\ d \gamma$ (4)

Possible further developments of (4) will be examined in successive posts...

Kind regards

$\chi$ $\sigma$
 

  • #13
Re: Unsolved statistic questions from other sites, part II

chisigma said:
The solution of this problem [a little more difficult than most of the previously posted problems...] requires a preliminary computation on the basis of fig. 1...

https://www.physicsforums.com/attachments/314

fig. 1

The triangle with sides a, b and c in the figure obeys the law of sines, so that…

$\displaystyle \frac {a}{\sin \alpha}=\frac{b}{\sin \beta}= \frac{c}{\sin\theta}$ (1)

… and let us suppose that $a$, $\theta$ and $\beta= \pi- \gamma$ are given, so that we can find from (1) $c$ as a function of them…

$\displaystyle c= a\ \frac{\sin \theta}{\sin(\gamma-\theta)}\ ,\ \gamma \ge \theta$ (2)

Now we suppose we have the complex r.v. $A= a + X + i\ Y = a + R\ e^{i\ \Gamma}$, where $X \sim N (0,\sigma^{2})$ and $Y \sim N(0,\sigma^{2})$ are two independent r.v.'s. The r.v.'s $R$ and $\Gamma$ have p.d.f.'s given by…

$\displaystyle f_{R} (\rho) = \frac{\rho}{\sigma^{2}}\ e^{- \frac{\rho^{2}}{2\ \sigma^{2}}}$

$\displaystyle f_{\Gamma} (\gamma) =\begin{cases} \frac{1}{2\ \pi}&\text{if}\ -\pi<\gamma<\pi \\ 0&\text{otherwise}\end{cases}$ (3)

From (2) and (3) we derive the probability that A is in the 'colored' area of fig. 1...

$\displaystyle P \{\text{arg}\ A < \theta\}=\frac{1}{2} + \int_{0}^{\theta}\int_{0}^{\infty} f_{R}(\rho)\ f_{\Gamma}(\gamma)\ d \rho\ d \gamma + \int_{\theta}^{\pi} \int_{0}^{a\ \frac{\sin\theta}{\sin (\gamma-\theta)}} f_{R}(\rho)\ f_{\Gamma} (\gamma)\ d \rho\ d\gamma =$

$\displaystyle = 1- \frac{1}{2\ \pi}\ \int_{\theta}^{\pi} e^{- \frac{a^{2}\ \sin^{2} \theta}{2\ \sigma^{2}\ \sin^{2}(\gamma-\theta)}}\ d \gamma$ (4)

Possible further developments of (4) will be examined in successive posts...

Kind regards

$\chi$ $\sigma$
But \(X\) and \(Y\) are not independent.

CB
 
  • #14
Re: Unsolved statistic questions from other sites, part II

CaptainBlack said:
But \(X\) and \(Y\) are not independent.

CB

That's right... so my 'preliminary computation' has only an illustrative purpose and has to be modified...

Kind regards

$\chi$ $\sigma$
 
Last edited:
  • #15
Re: Unsolved statistic questions from other sites, part II

Posted on 07 16 2012 on www.artofproblemsolving.com by the member jetix and not yet solved…

Calculate the distribution of the random variable...

$\displaystyle Y(t)= \int_{0}^{t} W_{s}\ ds$

... where $W_{t}$ is Brownian motion...

Kind regards

$\chi$ $\sigma$
 
  • #16
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 07 16 2012 on www.artofproblemsolving.com by the member jetix and not yet solved…

Calculate the distribution of the random variable...

$\displaystyle Y(t)= \int_{0}^{t} W_{s}\ ds$

... where $W_{t}$ is Brownian motion...


A Wiener process [also known as Brownian motion…] is a stochastic process $W_{t}$ with the following properties…

a) $\displaystyle W_{0}=0$...

b) $\displaystyle W_{t}$ is continuous in t…

c) $\displaystyle W_{t}$ has independent increments $\displaystyle W_{t} - W_{s} \sim \mathcal {N} (0, t-s)$...

From property c), setting s=0, we derive the p.d.f. of $W_{t}$ at time t…

$\displaystyle f(x,t)=\frac{1}{\sqrt{2\ \pi\ t}}\ e^{- \frac{x^{2}}{2\ t}}$ (1)

... and also the basic properties...

$\displaystyle E \{W_{t}\}=0$ (2)

$\displaystyle E \{W_{s} \cdot W_{t}\} =\text{min}\ (s,t)$ (3)

Now if we define $\displaystyle Y(t)=\int_{0}^{t} W_{s}\ d s$ then, according to (3), we find...

$\displaystyle E \{Y^{2}\} = \int_{0}^{t}\ \int_{0}^{t} E \{W_{\sigma} \cdot W_{\tau}\} d \sigma d \tau = $

$\displaystyle = \int_{0}^{t}\ \int_{0}^{t} \text{min} (\sigma,\tau)\ d \sigma d \tau = \frac{t^{3}}{3}$ (4)

Since Y is the limit of linear combinations of jointly Gaussian variables, it is itself Gaussian with zero mean, so that the p.d.f. of Y is $\displaystyle \mathcal{N} (0, \frac{t^{3}}{3})$...
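A short Python sketch (grid size and trial count are illustrative choices) checks the variance $\frac{t^{3}}{3}$ numerically...

```python
import random

# Simulate Y(t) = integral of W_s ds on a grid (increments sqrt(dt)*N(0,1))
# and check that Var{Y} ~ t^3/3.
rng = random.Random(4)
t, steps, trials = 1.0, 500, 20_000
dt = t / steps
acc = 0.0
for _ in range(trials):
    w = y = 0.0
    for _ in range(steps):
        w += rng.gauss(0.0, 1.0) * dt ** 0.5
        y += w * dt
    acc += y * y
print(acc / trials, t ** 3 / 3)  # the two values should nearly agree
```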

Kind regards

$\chi$ $\sigma$
 
  • #17
Re: Unsolved statistic questions from other sites, part II

Posted on 08 30 2012 on www.mathhelpforum.com by the member nek and not yet solved…

Hello, I'm trying to complete a course on SDEs and I need to solve two stochastic differential equations. They are supposed to be easy, but I'm still a beginner and, to be honest, I'm quite stuck. The equations are the following...

a) $\displaystyle d X_{t} = dt + d W_{t}^{x}$

b) $\displaystyle d Y_{t} = X_{t}\ d W_{t}^{y}$

... where $W_{t}^{x}$ and $W_{t}^{y}$ are uncorrelated Brownian motions...

Kind regards

$\chi$ $\sigma$
 
  • #18
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 08 30 2012 on www.mathhelpforum.com by the member nek and not yet solved…

Hello, I'm trying to complete a course on SDEs and I need to solve two stochastic differential equations. They are supposed to be easy, but I'm still a beginner and, to be honest, I'm quite stuck. The equations are the following...

a) $\displaystyle d X_{t} = dt + d W_{t}^{x}$

b) $\displaystyle d Y_{t} = X_{t}\ d W_{t}^{y}$

... where $W_{t}^{x}$ and $W_{t}^{y}$ are uncorrelated Brownian motions...


Preliminary to the solution of SDEs is a short explanation of the so-called 'Ito integral', a 'special integral' developed by the Japanese mathematician Kiyoshi Ito. Let's suppose that $W_{t}$ is a Wiener process [or Brownian motion...] and we want to compute an integral like this...

$\displaystyle \int_{0}^{t} G(W_{t})\ dW_{t}$ (1)

The 'standard' approach $\displaystyle dW_{t}= \frac {dW_{t}}{dt}\ dt$ fails because one of the properties of the Wiener process is that it is not differentiable with respect to t. The general case is quite complex and the interested reader is referred to the specialized literature, but for our purposes it will be enough to consider the following particular case...

Let $f(x)$ be a twice differentiable function with $f^{'\ '}(x)$ continuous. In that case Ito's rule establishes that...

$\displaystyle d f(W_{t})= f^{'}(W_{t})\ dW_{t} + \frac{1}{2}\ f^{'\ '}(W_{t})\ dt$ (2)

... and integrating (2) we obtain...

$\displaystyle \int_{0}^{t} f^{'}(W_{s})\ dW_{s} = f(W_{t})- f(0) -\frac{1}{2} \int_{0}^{t} f^{'\ '} (W_{s})\ ds$ (3)

As a simple example let's consider the case $f(x)= \frac{x^{2}}{2}$, so that $f^{'}(x)=x$ and $f^{'\ '}(x)=1$...

$\displaystyle \int_{0}^{t} W_{s}\ dW_{s} = \frac{W_{t}^{2}}{2} - \frac{1}{2}\ \int_{0}^{t} ds = \frac{W_{t}^{2}}{2}- \frac{t}{2}$ (4)

Two very interesting properties of (4)...

a) with respect to 'traditional' integration there is the 'extra term' $\displaystyle - \frac{t}{2}$...

b) the integral of a 'random process' is the sum of a random variable and a deterministic term...
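A minimal Python sketch illustrates (4): the left-endpoint sums that define the Ito integral are compared, path by path, with $\frac{W_{t}^{2}}{2} - \frac{t}{2}$...

```python
import random

# Approximate the Ito integral of W dW by the left-endpoint sums
# sum W_{t_k} (W_{t_{k+1}} - W_{t_k}) and compare with (4).
rng = random.Random(5)
t, steps = 1.0, 100_000
dt = t / steps
w = ito = 0.0
for _ in range(steps):
    dw = rng.gauss(0.0, 1.0) * dt ** 0.5
    ito += w * dw      # left endpoint: the Ito choice
    w += dw
print(ito, w * w / 2 - t / 2)  # nearly equal on every run
```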

Kind regards

$\chi$ $\sigma$
 
Last edited:
  • #19
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 08 30 2012 on www.mathhelpforum.com by the member nek and not yet solved…

Hello, I'm trying to complete a course on SDEs and I need to solve two stochastic differential equations. They are supposed to be easy, but I'm still a beginner and, to be honest, I'm quite stuck. The equations are the following...

a) $\displaystyle d X_{t} = dt + d W_{t}^{x}$

b) $\displaystyle d Y_{t} = X_{t}\ d W_{t}^{y}$

... where $W_{t}^{x}$ and $W_{t}^{y}$ are uncorrelated Brownian motions...


At this point a short description of the solving procedure for linear SDEs is presented. Let's start with a 'standard' first order linear ODE...

$\displaystyle \frac{d x_{t}}{d t}= a_{t}\ x_{t} + u_{t}\ ,\ x(0)=x_{0}$ (1)

... the solution of which is...

$\displaystyle x_{t}= \varphi_{t}\ \{x_{0} + \int_{0}^{t} \varphi_{s}^{-1}\ u_{s}\ ds\}$ (2)

... where $\varphi_{t}$ is the solution of the linear ODE...

$\displaystyle \frac{d \varphi_{t}}{d t}= a_{t}\ \varphi_{t}\ ,\ \varphi_{0}=1$ (3)

All that is well known; now we examine how it can be used in the solution of a narrow-sense linear SDE, which has the form...

$\displaystyle d X_{t}= (a_{t}\ X_{t} + u_{t})\ d t + v_{t}\ dW_{t}\ ,\ X_{0}=x_{0}$ (4)

... and the solution of which is given by...

$\displaystyle X_{t}= \varphi_{t}\ \{ x_{0} + \int_{0}^{t} \varphi_{s}^{-1}\ u_{s}\ ds + \int_{0}^{t} \varphi_{s}^{-1}\ v_{s}\ dW_{s} \}$ (5)

... where $\varphi$ has been defined previously and the second integral is an Ito integral, so that [very important detail...] the solution is the sum of a deterministic term and a random variable.

Now we are able to solve the first of the SDE proposed by nek...

$\displaystyle d X_{t} = dt + d W_{t}$ (6)

It is a narrow-sense linear SDE of the form (4) with $a_{t}=0$, $u_{t}=1$ and $v_{t}=1$, so that $\varphi_{t}=1$ and the solution is...

$\displaystyle X_{t}= x_{0}+t + \int_{0}^{t} dW_{s}= x_{0}+t+ W_{t}$ (7)

... where the Ito integral reduces to the trivial $\displaystyle \int_{0}^{t} dW_{s}= W_{t}$...
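A short Euler-Maruyama sketch in Python (step count illustrative) reproduces the solution (7)...

```python
import random

# Euler-Maruyama sketch for SDE (6): dX = dt + dW, X_0 = x0. Because the
# drift and diffusion coefficients are constant, each simulated path
# reproduces (7), X_t = x0 + t + W_t, up to rounding.
rng = random.Random(6)
x0, t, steps = 2.0, 1.0, 10_000
dt = t / steps
x, w = x0, 0.0
for _ in range(steps):
    dw = rng.gauss(0.0, 1.0) * dt ** 0.5
    x += dt + dw
    w += dw
print(x, x0 + t + w)  # identical up to floating-point error
```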

Kind regards

$\chi$ $\sigma$
 
  • #20
Re: Unsolved statistic questions from other sites, part II

Posted on 06 07 2012 on www.talkstat.com by the member nounuo and not yet properly solved…

... how can I find the expectation of the distance between any two points uniformly distributed in a square of side 1?... I need a mathematical proof…

Kind regards

$\chi$ $\sigma$
 
  • #21
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 06 07 2012 on www.talkstat.com by the member nounuo and not yet properly solved…

... how can I find the expectation of the distance between any two points uniformly distributed in a square of side 1?... I need a mathematical proof…

Although at first it may seem 'easy', the solution requires several steps. The first step: given two r.v.'s X and Y uniformly distributed from 0 to 1, the p.d.f. of the r.v. Z=X-Y is...

$\displaystyle f_{Z} (t) =\begin{cases} 1+t\ \text{if}\ -1<t<0 \\ 1-t\ \text {if}\ 0<t<1 \\ 0\ \text{otherwise} \end{cases}$ (1)

... and the demonstration of that is left to the reader. The second step is to find the p.d.f. of the r.v. $U=Z^{2}$, and that is achieved considering that...

$\displaystyle P \{U<t\} = P \{|Z|<\sqrt t\}= 2\ \int_{0}^{\sqrt t} (1-z)\ dz= 2\ \sqrt t -t $ (2)

... and differentiating (2) we have...

$\displaystyle f_{U}(t)= \begin{cases}\frac{1}{\sqrt t}-1\ \text {if}\ 0<t<1 \ \\ 0\ \text{otherwise} \end{cases}$(3)

More steps will be realized in successive posts...

Kind regards

$\chi$ $\sigma$
 
  • #22
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Although at first it may seem 'easy', the solution requires several steps. The first step: given two r.v.'s X and Y uniformly distributed from 0 to 1, the p.d.f. of the r.v. Z=X-Y is...

$\displaystyle f_{Z} (t) =\begin{cases} 1+t\ \text{if}\ -1<t<0 \\ 1-t\ \text {if}\ 0<t<1 \\ 0\ \text{otherwise} \end{cases}$ (1)

... and the demonstration of that is left to the reader. The second step is to find the p.d.f. of the r.v. $U=Z^{2}$, and that is achieved considering that...

$\displaystyle P \{U<t\} = P \{|Z|<\sqrt t\}= 2\ \int_{0}^{\sqrt t} (1-z)\ dz= 2\ \sqrt t -t $ (2)

... and differentiating (2) we have...

$\displaystyle f_{U}(t)= \begin{cases}\frac{1}{\sqrt t}-1\ \text {if}\ 0<t<1 \ \\ 0\ \text{otherwise} \end{cases}$(3)

More steps will be realized in successive posts...


The r.v. U having p.d.f. (3) is the square of the distance between two random points in the interval (0,1). Now we proceed as in...

http://www.mathhelpboards.com/f23/unsolved-statistics-questions-other-sites-932/index8.html#post7039

... and...

http://www.mathhelpboards.com/f23/unsolved-statistics-questions-other-sites-932/index9.html#post7118

... and we find...

$\displaystyle \mu= E (\delta^{2})= \int_{0}^{1} t\ f_{U}(t)\ dt= \int_{0}^{1} (\sqrt t -t)\ dt = \frac{1}{6}$ (4)

Now we suppose we have two r.v.'s $U_{1}$ and $U_{2}$, each having p.d.f. (3). In this case, if $U=U_{1}+U_{2}$, it will be...

$\displaystyle E (\delta^{2})= 2\ \mu= \frac{1}{3} \implies \sqrt{E (\delta^{2})}= \frac{1}{\sqrt{3}} \approx .577$ (5)

... and (5) gives, by Jensen's inequality, only an upper bound for the proposed problem: since $\delta$ is not constant, $E(\delta) < \sqrt{E(\delta^{2})}$, and the exact mean distance in the unit square is in fact $\approx .5214$. Of course in n dimensions...

$\displaystyle E (\delta^{2})= n\ \mu= \frac{n}{6} \implies E(\delta) \le \sqrt{\frac{n}{6}}$ (6)
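A Monte Carlo sketch in Python illustrates both the exact value $E(\delta^{2})=\frac{1}{3}$ and the gap between the upper bound $\frac{1}{\sqrt{3}} \approx .577$ and the true mean distance $\approx .5214$...

```python
import random

# Mean squared distance and mean distance between two uniform points in the
# unit square: E{delta^2} = 1/3 exactly, while sqrt(1/3) is only the Jensen
# upper bound for E{delta} ~ .5214.
rng = random.Random(7)
trials = 500_000
d2 = d = 0.0
for _ in range(trials):
    dx = rng.random() - rng.random()
    dy = rng.random() - rng.random()
    s = dx * dx + dy * dy
    d2 += s
    d += s ** 0.5
print(d2 / trials, d / trials)  # ~ 0.333 and ~ 0.5214
```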

Kind regards

$\chi$ $\sigma$
 
  • #23
Re: Unsolved statistic questions from other sites, part II

Posted on 10 15 2012 on Math Help Forum - Free Math Help Forums by the user Dinkydoe and not yet solved…

Let $\displaystyle P\ \{X_{k}=1\} = P\ \{X_{k}=0\}=\frac{1}{2}$ (Bernoulli trials) and consider $\displaystyle Y=3\ \sum_{k=1}^{\infty} 4^{-k}\ X_{k}$. Apparently the p.d.f. of Y is constant on $(\frac{1}{4}\ ,\ \frac{3}{4})$, satisfies $\displaystyle f(x)=f(1-x)$ on $(0,1)$ and for $x<\frac{1}{4}$ is $\displaystyle f(x)=2\ f(\frac{x}{4})$...

I’m a bit bewildered…


Kind regards

$\chi$ $\sigma$
 
Last edited:
  • #24
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 10 15 2012 on Math Help Forum - Free Math Help Forums by the user Dinkydoe and not yet solved…

Let $\displaystyle P\ \{X_{k}=1\} = P\ \{X_{k}=0\}=\frac{1}{2}$ (Bernoulli trials) and consider $\displaystyle Y=3\ \sum_{k=1}^{\infty} 4^{-k}\ X_{k}$. Apparently the p.d.f. of Y is constant on $(\frac{1}{4}\ ,\ \frac{3}{4})$, satisfies $\displaystyle f(x)=f(1-x)$ on $(0,1)$ and for $x<\frac{1}{4}$ is $\displaystyle f(x)=2\ f(\frac{x}{4})$...

I’m a bit bewildered…

The most effective way to attack this problem is the use of the 'convolution theorem', which establishes that, if the r.v.'s U and V have p.d.f.'s $f_{U}(t)$ and $f_{V} (t)$, then the r.v. Z=U+V has p.d.f. $f_{Z}(t)= f_{U}(t)*f_{V}(t)$, where '*' means convolution. Each r.v. $Y_{k}= 3\ \frac{X_{k}}{2^{2\ k}}$ has p.d.f. $f_{k}(t)$, the Fourier transform of which is given by...

$\displaystyle \mathcal {F} \{f_{k}(t)\}= e^{- i\ \frac{3\ \omega}{2^{2k+1}}}\ \cos \frac{3\ \omega}{2^{2k+1}}$ (1)

... so that the Fourier transform of the p.d.f. of the r.v. Y is...

$\displaystyle \mathcal {F} \{f_{Y}(t)\}= \prod_{k=1}^{\infty} e^{- i\ \frac{3\ \omega}{2^{2k+1}}}\ \prod_{k=1}^{\infty} \cos \frac{3\ \omega}{2^{2k+1}} = e^{-i\ \frac{\omega}{2}}\ \prod_{k=1}^{\infty} \cos \frac{3\ \omega}{2^{2k+1}}$ (2)

Observing (2), it is evident enough that the p.d.f. $f_{Y}(t)$ is symmetric about $t=\frac{1}{2}$, but an explicit expression requires the computation of the integral...

$\displaystyle f_{Y}(t)= \frac{1}{\pi}\ \int_{0}^{\infty} \prod_{k=1}^{\infty} \cos \frac{3\ \omega}{2^{2k+1}}\ \cos \omega\ (t - \frac{1}{2})\ d \omega $ (3)
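Pending the analytic inversion of (3), a numerical sketch in Python (the truncation of the series at k=12 is an illustrative choice) exhibits the structure of the distribution: the mass vanishes on $(\frac{1}{4}, \frac{3}{4})$ [so 'constant' there, namely zero] and is symmetric about $\frac{1}{2}$...

```python
import random

# Histogram of Y = 3 * sum 4^{-k} X_k with the series truncated at k = 12
# (far finer than the bin width): the printed bin heights should vanish on
# (1/4, 3/4) and be symmetric about 1/2.
rng = random.Random(8)
bins, trials = 20, 400_000
counts = [0] * bins
for _ in range(trials):
    y = 3 * sum(rng.getrandbits(1) * 4.0 ** -k for k in range(1, 13))
    counts[min(int(y * bins), bins - 1)] += 1
print([round(c / trials * bins, 2) for c in counts])  # average density per bin
```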

... that will be performed [if possible...] in successive posts...

Kind regards

$\chi$ $\sigma$
 
  • #25
Re: Unsolved statistic questions from other sites, part II

From…

http://www.mathhelpboards.com/f28/problem-paper-3-1998-step%3B-probability-distribution-submarine-call-signs-2168/

A hostile naval power possesses a large, unknown number N of submarines. Interception of radio signals yields a small number n of identification numbers $X_{i},\ i=1,2,...,n$ which are taken to be independent and uniformly distributed over the continuous range from 0 to N. Show that $Z_{1}$ and $Z_{2}$, defined by...

$\displaystyle Z_{1}= \frac{n+1}{n}\ \text{max}\ (X_{1},X_{2},…,X_{n})$ (1)

… and…

$\displaystyle Z_{2}= \frac{2}{n} \sum_{i=1}^{n} X_{i}$ (2)

… both have means equal to N.

Calculate the variance of $Z_{1}$ and $Z_{2}$. Which estimator do you prefer, and why?...

For illustrative purposes let's start with the r.v. $\text{max}\ (X_{1},X_{2})$ in the particular case N=4 and n=2, and some aid can come from the following table of the values of $\text{max}\ (X_{1},X_{2})$...

X_1 \ X_2 :  1  2  3  4
1 :  1  2  3  4
2 :  2  2  3  4
3 :  3  3  3  4
4 :  4  4  4  4

Calling $p_{k,4,2}$ the probability that $\text{max}\ (X_{1}, X_{2})=k$, it is easy to see from the table that, if $X_{1}$ and $X_{2}$ are independent and uniformly distributed over 1 to 4, then…

$\displaystyle p_{k,4,2} = P \{X_{1} \le k, X_{2} \le k \} - P \{X_{1} \le k-1, X_{2} \le k-1 \}= \frac{k^{2}-(k-1)^{2}}{16}$ (1)

The example now illustrated makes easier to find the general formula...

$\displaystyle p_{k,N,n} = P \{X_{1} \le k, X_{2} \le k, ..., X_{n} \le k \} - P \{X_{1} \le k-1, X_{2} \le k-1, ..., X_{n} \le k-1 \}= \frac{k^{n}-(k-1)^{n}}{N^{n}}$ (2)

... that supplies the probability that $X = \text{max}\ (X_{1}, X_{2}, ..., X_{n})=k$ for samples $X_{1},\ X_{2}, ...,\ X_{n}$ independent and uniformly distributed over 1 to N.
Using (2) we can evaluate the $E \{ X \}$ as...

$\displaystyle \mu = E \{ X \} = \frac{1}{N^{n}}\ \sum_{k=1}^{N} k\ \{ k^{n}-(k-1)^{n} \}$ (3)

… that, if we take into account that…

$\displaystyle k\ \{ k^{n}-(k-1)^{n} \}= n\ k^{n} - \frac{n\ (n-1)}{2}\ k^{n-1} + … + (-1)^{n-1}\ k$ (4)

... may be written as...

$\displaystyle \mu= E \{ X \} = \frac{1}{N^{n}}\ \{ n\ \sum_{k=1}^{N} k^{n} - \frac{n\ (n-1)}{2}\ \sum_{k=1}^{N} k^{n-1} + ...\} = \frac{1}{N^{n}}\ \{ n\ \frac{B_{n+1} (N+1) - B_{n+1} (0)}{n+1} - \frac{n\ (n-1)}{2}\ \frac{B_{n} (N+1) - B_{n} (0)}{n} + ...\}$ (5)

... where the $B_{i} (*)$ are the Bernoulli polynomials of order i. Now if we take into account that $\displaystyle \lim_{N \rightarrow \infty} \frac {B_{i} (N+1)}{N^{j}}$ is 1 for i=j and 0 for i<j, we obtain that for N 'large enough'...

$\displaystyle E \{ X \} \sim \frac {n}{n+1}\ N$ (6)

With the goal of arriving at $\text{Var}\ \{X \}$, we now compute $E\ \{X^{2} \}$ as...

$\displaystyle E \{ X^{2} \} = \frac{1}{N^{n}}\ \sum_{k=1}^{N} k^{2}\ \{ k^{n}-(k-1)^{n} \}$ (7)

All we have to do is repeat the steps from (3) to (6) with $k^{2}$ instead of $k$, obtaining...

$\displaystyle k^{2}\ \{ k^{n}-(k-1)^{n} \}= n\ k^{n+1} - \frac{n\ (n-1)}{2}\ k^{n} + … $ (8)
... so that is...

$\displaystyle E \{ X^{2} \} = \frac{1}{N^{n}}\ \{ n\ \sum_{k=1}^{N} k^{n+1} - \frac{n\ (n-1)}{2}\ \sum_{k=1}^{N} k^{n} + ...\} = \frac{1}{N^{n}}\ \{ n\ \frac{B_{n+2} (N+1) - B_{n+2} (0)}{n+2} - \frac{n\ (n-1)}{2}\ \frac{B_{n+1} (N+1) - B_{n+1} (0)}{n+1} + ...\}$ (9)

... and 'pushing' N to infinity...

$\displaystyle E \{ X^{2} \} \sim \frac{n}{n+2}\ N^{2} - \frac{n\ (n-1)}{2\ (n+1)}\ N$ (10)

The requested value of $\text {Var} \{ X \}$ is therefore...

$\displaystyle \sigma^{2} = \text {Var} \{ X \} = E \{ X^{2} \} - \mu^{2} \sim \frac{n}{(n+1)^{2}\ (n+2)}\ N^{2} - \frac{n\ (n-1)}{2\ (n+1)}\ N $ (11)

As in previous posts, it would be highly preferable that someone check my work (Thinking)...

The r.v. $Z_{2}$ will be treated in a successive post...

Kind regards

$\chi$ $\sigma$
 

  • #26
Re: Unsolved statistic questions from other sites, part II

The r.v. $Z_{2}$ is of course related to the r.v. $\displaystyle X= \sum_{i=1}^{n} X_{i}$, where the $X_{i}$ are all uniformly distributed over 1 to N. Also in this case the problem is relatively easy if some approximation is allowed, so we adopt the Central Limit Theorem as in...

http://www.mathhelpboards.com/f23/unsolved-statistics-questions-other-sites-932/index9.html#post7147

Each $X_{i}$ has mean...

$\displaystyle \mu_{i}= \frac{1}{N}\ \sum_{k=1}^{N} k = \frac {N + 1}{2}$ (1)

... and variance...

$\displaystyle \sigma^{2}_{i}= \frac{1}{N}\ \sum_{k=1}^{N} k^{2} - \frac{(N+1)^{2}}{4} = \frac{N^{2}-1}{12}$ (2)

... so that $Z_{2}$ has mean...

$\displaystyle \mu_{2} \sim N+1$ (3)

... and standard deviation...

$\displaystyle \sigma_{2} = \frac{2}{n}\ \sqrt{n\ \sigma^{2}_{i}} = \sqrt {\frac{N^{2}-1}{3\ n}}$ (4)

In the previous post we found that $Z_{1}$ has mean...

$\displaystyle \mu_{1} = N$ (5)

... and standard deviation...

$\displaystyle \sigma_{1} \sim N\ \sqrt {\frac{1}{n\ (n+2)}}$ (6)

In order to establish which is the 'better estimator', we introduce a sort of 'quality factor' $\displaystyle \alpha= \frac{\sigma}{\mu}$ and obtain for $Z_{1}$...

$\displaystyle \alpha_{1} \sim \sqrt {\frac{1}{n\ (n+2)}}$ (7)

... and for $Z_{2}$...

$\displaystyle \alpha_{2} \sim \sqrt {\frac{1}{3\ n}}$ (8)

The conclusion is: $Z_{1}$ is the better estimator, since its relative spread decays like $\frac{1}{n}$ instead of $\frac{1}{\sqrt{n}}$...
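A simulation sketch in Python (N, n and the trial count are illustrative choices) makes the comparison concrete...

```python
import random

# Compare the two estimators on simulated data: n draws uniform on 1..N,
# Z1 = (n+1)/n * max, Z2 = 2/n * sum; Z1 should show the much smaller spread.
rng = random.Random(9)
N, n, trials = 1000, 10, 50_000
z1, z2 = [], []
for _ in range(trials):
    xs = [rng.randint(1, N) for _ in range(n)]
    z1.append((n + 1) / n * max(xs))
    z2.append(2 / n * sum(xs))

def mean_sd(v):
    m = sum(v) / len(v)
    return m, (sum((u - m) ** 2 for u in v) / len(v)) ** 0.5

print("Z1:", mean_sd(z1))  # mean ~ N,   sd ~ N/sqrt(n(n+2)) ~ 91
print("Z2:", mean_sd(z2))  # mean ~ N+1, sd ~ N/sqrt(3n)     ~ 183
```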

Kind regards

$\chi$ $\sigma$
 
  • #27
Re: Unsolved statistic questions from other sites, part II

Posted on www.artofproblemsolving.com on 19/27/2012 by the user newsum and not yet solved…

Let X and Y have bivariate normal distribution function with parameters $\mu_{1}=3$, $\mu_{2}= 1$, $\sigma_{1}^{2}= 16$, $\sigma_{2}^{2}= 25$ and $\rho=.6$. Determine…

a) $\displaystyle P\{ 3 < Y < 8 \}$

b) $\displaystyle P\{ 3 < Y < 8 | X < 7 \}$

c) $\displaystyle P\{ -3 < Y < 3 \}$

d) $\displaystyle P\{ -3 < Y < 3\ |\ Y = -4 \}$

Kind regards

$\chi$ $\sigma$
 
  • #28
Re: Unsolved statistic questions from other sites, part II

Posted on 12 08 2012 on www.artofproblemsolving.com by the user inakamono and not yet solved…

Find the probability that among 10000 random digits the digit 7 appears not more than 968 times…

Kind regards

$\chi$ $\sigma$
 
  • #29
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 12 08 2012 on www.artofproblemsolving.com by the user inakamono and not yet solved…

Find the probability that among 10000 random digits the digit 7 appears not more than 968 times…

That is a classical example of the cumulative binomial distribution... the probability of k events in n trials is...

$\displaystyle P_{n,k}= \binom {n}{k}\ p^{k}\ (1-p)^{n-k}$ (1)

... so that the requested probability is...

$\displaystyle P = \sum_{k=0}^{968} P_{n,k}$ (2)

... with $p=.1$ and $n=10000$. The direct computation of (2) of course requires a computer tool like...

http://www.stat.tamu.edu/~west/applets/binomialdemo.html

... that gives $P = .1467...$ . Alternatively we can approximate the binomial (1) with the normal $\displaystyle N (\mu, \sigma^{2})$, where...

$\displaystyle \mu= n\ p\ ,\ \sigma^{2}= n\ p\ (1-p)$ (3)

... so that the requested probability is...

$\displaystyle P \sim \frac{1}{2}\ \{1 + \text{erf} (\frac {968 - \mu}{\sigma\ \sqrt{2}})\}$ (4)
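As a hedged check, a short Python script (assuming SciPy is available for the exact binomial sum) evaluates both (2) and (4)...

```python
from math import erf, sqrt
from scipy.stats import binom  # assumed available for the exact sum

# Numerical check of (2) and (4); n, p, k as in the problem.
n, p, k = 10000, 0.1, 968
mu, sigma = n * p, sqrt(n * p * (1 - p))
print(binom.cdf(k, n, p))                             # exact sum (2), ~ .1467
print(0.5 * (1 + erf((k - mu) / (sigma * sqrt(2)))))  # approximation (4), ~ .1431
```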

Also in this case you need a computer tool for the computation of (4)... 'Monster Wolfram' gives $P \sim .143061$...

Kind regards

$\chi$ $\sigma$
 
  • #30
Re: Unsolved statistic questions from other sites, part II

chisigma said:
That is a classical example of the cumulative binomial distribution... the probability of k events in n trials is...

$\displaystyle P_{n,k}= \binom {n}{k}\ p^{k}\ (1-p)^{n-k}$ (1)

... so that the requested probability is...

$\displaystyle P = \sum_{k=0}^{968} P_{n,k}$ (2)

... with $p=.1$ and $n=10000$. The direct computation of (2) of course requires a computer tool like...

http://www.stat.tamu.edu/~west/applets/binomialdemo.html

... that gives $P = .1467...$ . Alternatively we can approximate the binomial (1) with the normal $\displaystyle N (\mu, \sigma^{2})$, where...

$\displaystyle \mu= n\ p\ ,\ \sigma^{2}= n\ p\ (1-p)$ (3)

... so that the requested probability is...

$\displaystyle P \sim \frac{1}{2}\ \{1 + \text{erf} (\frac {968 - \mu}{\sigma\ \sqrt{2}})\}$ (4)

Also in this case you need a computer tool for the computation of (4)... 'Monster Wolfram' gives $P \sim .143061$...

Kind regards

$\chi$ $\sigma$

In the normal approximation you have not used the continuity correction. The 968 should be replaced by 968.5, whereupon the probability becomes ~ 0.1469.

And you don't need a computer to evaluate it, tables are quite adequate.

CB
 
  • #31
Re: Unsolved statistic questions from other sites, part II

CaptainBlack said:
... and you don't need a computer to evaluate it, tables are quite adequate...

CB

Unfortunately my personal experience of more than thirty-five years in the area of telecommunications doesn't agree with this point of view. In the 'Bible' of Abramowitz and Stegun...

Abramowitz and Stegun: Handbook of Mathematical Functions

... the table of the normal distribution function...

$\displaystyle \Phi(x)= \frac{1}{\sqrt{2\ \pi}}\ \int_{- \infty}^{x} e^{- \frac{t^{2}}{2}}\ dt$ (1)

... goes up to x=5 and supplies the value $\Phi (5) \sim .9999997133 \implies 1 - \Phi(5) \sim 2.867\ 10^{-7}$. Well!... in digital transmission a standard bit error rate not greater than $10^{-6}$ is required, and that means that, in order to have the necessary 'system margin', a target bit error rate of $10^{-8} - 10^{-9}$ is often required... and even less in the case of an optical fibre link...

At this point it is clear that the use of tables was not adequate for me, so a lot of years ago I composed, with 'patient' application of the Simpson rule, the following 'little but accurate' table of the function $\log_{10} \text{erfc} (x)$, where erfc(x) is defined as...

$\displaystyle \text{erfc} (x) = 1 - \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{- t^{2}}\ dt$ (2)

View attachment 505
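A possible sketch of such a 'little computer program' (the integration cutoff and panel count are illustrative assumptions) reproduces the table's values via the composite Simpson rule, checked here against Python's math.erfc...

```python
from math import exp, pi, sqrt, log10, erfc

# erfc(x) via composite Simpson on (x, x + 12): beyond the cutoff the
# integrand e^{-t^2} is negligible at this scale.
def erfc_simpson(x, width=12.0, m=20000):  # m panels; m must be even
    h = width / m
    s = exp(-x * x) + exp(-(x + width) ** 2)
    for j in range(1, m):
        u = x + j * h
        s += (4 if j % 2 else 2) * exp(-u * u)
    return 2 / sqrt(pi) * s * h / 3

for x in (1.0, 3.0, 5.0):
    print(x, log10(erfc_simpson(x)), log10(erfc(x)))  # vs math.erfc reference
```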

Maybe, sooner or later, in a dedicated post, I will better explain the 'little accurate table' and indicate an easy way to transform it into a 'little computer program'...

Kind regards

$\chi$ $\sigma$
 

  • #32
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Unfortunately my personal experience of more than thirty-five years in the area of telecommunications doesn't agree with this point of view. In the 'Bible' of Abramowitz and Stegun...

Abramowitz and Stegun: Handbook of Mathematical Functions

... the table of the normal distribution function...

$\displaystyle \Phi(x)= \frac{1}{\sqrt{2\ \pi}}\ \int_{- \infty}^{x} e^{- \frac{t^{2}}{2}}\ dt$ (1)

... goes up to x=5 and supplies the value $\Phi (5) \sim .9999997133 \implies 1 - \Phi(5) \sim 2.867\ 10^{-7}$. Well!... in digital transmission a standard bit error rate not greater than $10^{-6}$ is required, and that means that, in order to have the necessary 'system margin', a target bit error rate of $10^{-8} - 10^{-9}$ is often required... and even less in the case of an optical fibre link...

At this point it is clear that the use of tables was not adequate for me, so a lot of years ago I composed, with 'patient' application of the Simpson rule, the following 'little but accurate' table of the function $\log_{10} \text{erfc} (x)$, where erfc(x) is defined as...

$\displaystyle \text{erfc} (x) = 1 - \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{- t^{2}}\ dt$ (2)

https://www.physicsforums.com/attachments/505

Maybe, sooner or later, in a dedicated post, I will better explain the 'little accurate table' and indicate an easy way to transform it into a 'little computer program'...

Kind regards

$\chi$ $\sigma$

Then get a better table; mine goes to \(z=9.5\) with a tail probability of \(\sim 10^{-21}\). Also, A&S give pretty good asymptotic representations for the extreme tails of the normal distribution (26.2.12 and the following sections).

Also, the suggestion of using a normal calculator may be less than useless to a student who will meet such a problem where they do not have access to calculation aids but may have an exam handbook with a table.

CB
 
Last edited:
  • #33
Re: Unsolved statistic questions from other sites, part II

CaptainBlack said:
Then get a better table; mine goes to \(z=9.5\) with a tail probability of \(\sim 10^{-21}\)...

That's not a very difficult task if we use the formula in...

Erfc -- from Wolfram MathWorld

$\displaystyle \frac{2}{\sqrt{\pi}}\ \frac{e^{- x^{2}}}{x + \sqrt{x^{2}+2}} < \text{erfc} (x) \le \frac{2}{\sqrt{\pi}}\ \frac{e^{- x^{2}}}{x + \sqrt{x^{2}+\frac{4}{\pi}}}$ (1)

... which gives an 'upper bound' and a 'lower bound' for the function. In the figure...

View attachment 506

... only the 'upper bound' is shown, because in logarithmic scale the 'lower bound' is hard to distinguish from it. Of course the only limitation in proceeding further is the size of the diagram. It seems that the agreement with my old computation is good enough...
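A two-line Python check (evaluating (1) at CB's z = 9.5, i.e. $x = z/\sqrt{2}$) shows how far these bounds reach...

```python
from math import exp, pi, sqrt

# Bounds (1) at z = 9.5: both put the normal tail Q(9.5) = erfc(x)/2
# near 1e-21, in line with CB's table.
x = 9.5 / sqrt(2)
c = 2 / sqrt(pi) * exp(-x * x)
lower = c / (x + sqrt(x * x + 2))
upper = c / (x + sqrt(x * x + 4 / pi))
print(lower / 2, upper / 2)  # bounds on Q(9.5), both ~ 1e-21
```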

Kind regards

$\chi$ $\sigma$
 

Last edited:
  • #34
Re: Unsolved statistic questions from other sites, part II

Posted on 12 12 2012 [the 'magic date' of the Mayan calendar!...] on www.mathhelpforum.com by the user asilvester635 and not yet solved…

While he was a prisoner of war during World War II, John Kerrich tossed a coin 10,000 times. He got 5067 heads. If the coin is perfectly balanced, the probability of a head is 0.5. Is there reason to think that Kerrich's coin was not balanced?... To answer this question use a normal distribution to estimate the probability that tossing a balanced coin 10,000 times would give a count of heads at least this far from 5000, that is, at least 5067 heads or no more than 4933 heads…

The problem is very similar to the one treated in...

http://www.mathhelpboards.com/f23/unsolved-statistic-questions-other-sites-part-ii-1566/index3.html

... and the requested probability is...

$\displaystyle P \sim \text {erfc} (\frac {5066.5 - \mu}{\sigma\ \sqrt{2}})$ (1)

... where $\mu = 10000\ p = 5000$, $\sigma= \sqrt{10000\ p\ (1-p)}= 50$, and the continuity correction replaces 'at least 5067' with 'more than 5066.5'. For $x = .9405$ 'MonsterWolfram' supplies $\displaystyle \text{erfc} (x) \sim .1835$: such a probability is far from negligible, so there is no strong reason to think that Kerrich's coin was unbalanced. The scope of this post however is to verify the possibility of using the approximate value of erfc(*) described in...

http://www.mathhelpboards.com/f23/unsolved-statistic-questions-other-sites-part-ii-1566/index4.html#post12076

... by the formula...

$\displaystyle \frac{2}{\sqrt{\pi}}\ \frac{e^{- x^{2}}}{x + \sqrt{x^{2}+ 2}} < \text {erfc} (x) \le \frac{2}{\sqrt{\pi}}\ \frac{e^{- x^{2}}}{x + \sqrt{x^{2}+ \frac{4}{\pi}}}$ (2)

Using a normal handheld calculator for $x = .9405$ we find...

$ \displaystyle .1765 < \text{erfc}(.9405) < .1934$

... and taking the arithmetic mean $\text{erfc}(.9405) \sim .1850$, a result 'good enough' obtained without using tables...

Kind regards

$\chi$ $\sigma$
 
  • #35
Re: Unsolved statistic questions from other sites, part II

Posted on 12 15 2012 on www.artofproblemsolving.com by the member BlackMax and not yet solved...

Three points are uniformly and independently chosen inside a given circle. What is the probability that their circumcircle lies entirely within the given circle?... a C++ program suggests that the answer is most likely to be .4 ...

Kind regards

$\chi$ $\sigma$
 
