Unsolved statistics questions from other sites....

  • #36
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Computing an integral using random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course this is not strictly a question in the area of probability; nonetheless it is a suggestive question whose solution is not trivial...

Kind regards

$\chi$ $\sigma$
This seems to describe the average distance from the origin of a randomly placed $n$-dimensional point in the unit hypercube. For what it's worth, I had Mathematica churn out the definite integrals for the first few $n$. I was able to get analytical solutions up to $n = 3$, but the closed-form expression for $n = 3$ is too unwieldy to post here. For $n > 11$ it takes a long time to produce a result:

$\begin{array}{|l|r|}
n & \mathrm{Integral} \\
\hline
1 & \frac{1}{2} \\
2 & \frac{\sqrt{2} + \sinh^{-1}{1}}{3} \approx 0.765 \\
3 & 0.961 \\
4 & 1.12 \\
5 & 1.26 \\
6 & 1.39 \\
7 & 1.50 \\
8 & 1.61 \\
9 & 1.71 \\
10 & 1.81 \\
11 & 1.90 \\
\end{array}$

It seems to grow at a decreasing rate. The integral is quite well approximated by $\sqrt{\frac{n}{3}}$, and the approximation improves as $n$ increases. This estimate is obtained by taking the square root outside the integral: since $E\{x_{1}^{2}+ \dots + x_{n}^{2}\} = \frac{n}{3}$, the approximation is $\sqrt{E\{\|x\|^{2}\}} = \sqrt{\frac{n}{3}}$, and it becomes asymptotically exact as $n$ grows large because the norm concentrates around its mean. Relatedly, the Euclidean metric is a poor choice of distance function in high-dimensional space (curse of dimensionality).

I don't see a way to nicely calculate a closed-form expression of the integral with respect to $n$, but I suspect it is possible. That's all I can say though (Sadface)
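For what it's worth, the first few table entries can be reproduced with adaptive quadrature. A minimal Python sketch, assuming SciPy is available (the cost of nquad grows quickly with $n$, which is what motivates the Monte Carlo runs below):

```python
# Reproduce the small-n table entries by adaptive cubature over the unit cube.
import numpy as np
from scipy import integrate

def box_integral(n):
    """Mean distance from the origin of a uniform point in the unit n-cube."""
    f = lambda *x: np.sqrt(sum(xi * xi for xi in x))
    value, _err = integrate.nquad(f, [(0.0, 1.0)] * n)
    return value

for n in range(1, 4):
    print(n, round(box_integral(n), 4), round(np.sqrt(n / 3.0), 4))
# prints 0.5, 0.7652, 0.9606 against the sqrt(n/3) approximation
```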
 
  • #37
Just to show the power of Monte Carlo as a method for estimating high-order multiple integrals, I attach a plot of the integral against the order $n$:

View attachment 182

Note 1: the last label on the horizontal axis has been clipped; it is 10000.

Note 2: the standard error is more or less independent of $n$, and is less than 0.01 when averaging over 1000 sample points for each $n$.

Note 3: looking at the curve in detail it is possible to detect what appears to be a deviation from the asymptotic form \( \sqrt{n/3} \); IIRC \( I_n \approx \sqrt{n/3}-0.03 \) looks like a better approximation over the range investigated.
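A minimal sketch of the Monte-Carlo estimator just described, assuming NumPy, with 1000 sample points per $n$ as in the standard-error note:

```python
# Average the distance of 1000 uniform points per n, report the sample
# standard error, and compare with the asymptote sqrt(n/3).
import numpy as np

rng = np.random.default_rng(1)

def mc_distance(n, samples=1000):
    d = np.linalg.norm(rng.random((samples, n)), axis=1)
    return d.mean(), d.std(ddof=1) / np.sqrt(samples)

for n in (2, 10, 100, 1000, 10000):
    est, se = mc_distance(n)
    print(n, round(est, 3), round(se, 4), round(np.sqrt(n / 3.0), 3))
```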
 

  • #38
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Computing an integral using random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course this is not strictly a question in the area of probability; nonetheless it is a suggestive question whose solution is not trivial...


Several years ago [see http://www.osti.gov/bridge/servlets/purl/919496-nVsPUt/919496.pdf...] D.H. Bailey, J.M. Borwein and R.E. Crandall examined the family of complex-variable functions...

$\displaystyle B_{n}(s)= \int_{0}^{1} ... \int_{0}^{1} (x_{1}^{2}+...+x_{n}^{2})^{\frac{s}{2}}\ d x_{1} ... d x_{n}$ (1)

... and they arrived at the one-dimensional integral formula...

$\displaystyle B_{n}(s)= \frac{s}{\Gamma(1-\frac{s}{2})}\ \int_{0}^{\infty} \frac{1-b(u)^{n}}{u^{1+s}}\ du$ (2)

... where...

$\displaystyle b(u)= \int_{0}^{1} e^{- u^{2}\ x^{2}}\ dx = \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (u)}{u}$ (3)

... and, as a collateral result, at the asymptotic relation...

$\displaystyle B_{n}(s) \sim (\frac{n}{3})^{\frac{s}{2}}\ \{1 + \frac{s\ (s-2)}{10\ n} + ...\}$ (4)

Kind regards

$\chi$ $\sigma$
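Assuming formulas (2)-(3) are transcribed correctly from the paper, they are easy to evaluate numerically; a Python sketch (SciPy assumed) comparing $B_{n}(1)$ with the asymptotic form (4) at $s=1$:

```python
# b(u) from (3), with its u -> 0 limit, plugged into (2) at s = 1, where
# s/Gamma(1 - s/2) = 1/sqrt(pi); compared with the asymptotic form (4).
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def b(u):
    return 1.0 if u < 1e-8 else 0.5 * np.sqrt(np.pi) * erf(u) / u

def B(n):
    """B_n(1): the mean distance integral of post #36."""
    value, _err = quad(lambda u: (1.0 - b(u) ** n) / u**2,
                       0.0, np.inf, limit=200)
    return value / np.sqrt(np.pi)

for n in (1, 2, 3, 11):
    print(n, round(B(n), 6), round(np.sqrt(n / 3.0) * (1 - 1 / (10 * n)), 6))
```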

 
  • #39
chisigma said:
Several years ago [see http://www.osti.gov/bridge/servlets/purl/919496-nVsPUt/919496.pdf...] D.H. Bailey, J.M. Borwein and R.E. Crandall examined the family of complex-variable functions...

$\displaystyle B_{n}(s)= \int_{0}^{1} ... \int_{0}^{1} (x_{1}^{2}+...+x_{n}^{2})^{\frac{s}{2}}\ d x_{1} ... d x_{n}$ (1)

... and they arrived at the one-dimensional integral formula...

$\displaystyle B_{n}(s)= \frac{s}{\Gamma(1-\frac{s}{2})}\ \int_{0}^{\infty} \frac{1-b(u)^{n}}{u^{1+s}}\ du$ (2)

... where...

$\displaystyle b(u)= \int_{0}^{1} e^{- u^{2}\ x^{2}}\ dx = \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (u)}{u}$ (3)

... and, as a collateral result, at the asymptotic relation...

$\displaystyle B_{n}(s) \sim (\frac{n}{3})^{\frac{s}{2}}\ \{1 + \frac{s\ (s-2)}{10\ n} + ...\}$ (4)

Kind regards

$\chi$ $\sigma$


Interesting, I will have to have another run of the MC, since for \(s=1\) that gives:

\(I_n \sim \sqrt{\frac{n}{3}}-\frac{1}{10\sqrt{3n}}\)

which is significantly different from the constant offset I recalled, and a correction that small would have been undetectable in my earlier run.

CB
 
  • #40
CaptainBlack said:
Interesting, I will have to have another run of the MC, since for \(s=1\) that gives:

\(I_n \sim \sqrt{\frac{n}{3}}-\frac{1}{10\sqrt{3n}}\)

which is significantly different from the constant offset I recalled, and a correction that small would have been undetectable in my earlier run.

CB

Well, increasing the sample size to 1,000,000 seems to indicate that the above asymptotic form is better still (over the range checked).

CB
 
  • #41
Posted on 06 06 2012 on www.mathhelpforum.com by tttcomrader and not yet solved...

Let X and Y be non-negative random variables with densities f(*) and g(*), and write F(*) and G(*) for their Laplace transforms. Show that...

$\displaystyle E\ \{e^{- \lambda\ X\ Y} \} = \int_{0}^{\infty} F(\lambda\ y)\ g(y)\ dy = \int_{0}^{\infty} G(\lambda\ x)\ f(x)\ dx$

Kind regards

$\chi$ $\sigma$
 
  • #42
chisigma said:
Posted on 06 06 2012 on www.mathhelpforum.com by tttcomrader and not yet solved...

Let X and Y be non-negative random variables with densities f(*) and g(*), and write F(*) and G(*) for their Laplace transforms. Show that...

$\displaystyle E\ \{e^{- \lambda\ X\ Y} \} = \int_{0}^{\infty} F(\lambda\ y)\ g(y)\ dy = \int_{0}^{\infty} G(\lambda\ x)\ f(x)\ dx$

Kind regards

$\chi$ $\sigma$

Is there anything to do? Writing out the definition of the expectation and doing one of the integrals should suffice.

CB
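A quick numerical sanity check of the identity, under the reading that f, g are densities and F, G their Laplace transforms; X and Y are taken exponential with rates 1 and 2 as purely illustrative choices:

```python
# Compare the one-dimensional integral with a Monte-Carlo estimate of the
# expectation E{exp(-lam X Y)} for X ~ Exp(1), Y ~ Exp(2).
import numpy as np
from scipy.integrate import quad

lam = 0.7                               # illustrative lambda
F = lambda s: 1.0 / (1.0 + s)           # Laplace transform of f(x) = e^{-x}
g = lambda y: 2.0 * np.exp(-2.0 * y)    # density of Y ~ Exp(2)

rhs, _err = quad(lambda y: F(lam * y) * g(y), 0.0, np.inf)

rng = np.random.default_rng(0)
x = rng.exponential(1.0, 10**6)         # X ~ Exp(1)
y = rng.exponential(0.5, 10**6)         # Y ~ Exp(2): scale = 1/rate
lhs = np.exp(-lam * x * y).mean()       # the expectation by Monte Carlo
print(lhs, rhs)                         # agreement to ~3 decimal places
```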
 
  • #43
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that I don't understand how to solve: imagine a counter that serves clients. In every 10 min interval the number of people to be served follows a Poisson distribution with expected value 2. The counter works only from 9 am till 12 noon, and serves at most 40 people. Now the question is: what is the probability of not serving all clients in one morning (9 am till 12 noon)?...

Kind regards

$\chi$ $\sigma$
 
  • #44
chisigma said:
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that I don't understand how to solve: imagine a counter that serves clients. In every 10 min interval the number of people to be served follows a Poisson distribution with expected value 2. The counter works only from 9 am till 12 noon, and serves at most 40 people. Now the question is: what is the probability of not serving all clients in one morning (9 am till 12 noon)?...

Kind regards

$\chi$ $\sigma$

The number attended in 3 hrs has a Poisson distribution with a mean of 36.

CB
 
  • #45
chisigma said:
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that I don't understand how to solve: imagine a counter that serves clients. In every 10 min interval the number of people to be served follows a Poisson distribution with expected value 2. The counter works only from 9 am till 12 noon, and serves at most 40 people. Now the question is: what is the probability of not serving all clients in one morning (9 am till 12 noon)?...


Very well!... in a time T = 180 min the expected number of people to be served is $\displaystyle \lambda=36$, and the probability of having exactly n people is...

$\displaystyle P_{n}= e^{- \lambda}\ \frac{\lambda^{n}}{n!}$ (1)

The probability of not serving all clients is...

$\displaystyle P_{ow}= 1- e^{- 36}\ \sum_{n=0}^{40} \frac{36^{n}}{n!}$ (2)

At this point the problem is the computation of the sum in (2), that can be performed, for example, using wolframalpha...

http://www.wolframalpha.com/input/?i=sum+e^(-+36)+36^j/+j!,+j=0...+40

... so that $P_{ow}= .222897574398...$. If wolframalpha isn't available, then the sum in (2) can be efficiently computed as explained in...

http://www.mathhelpboards.com/threads/426-Difference-equation-tutorial-draft-of-part-I

... using the sequence defined by the difference equation...

$\displaystyle a_{n+1}=\frac{n+1}{36}\ a_{n}+1\ ,\ a_{0}=1$ (3)

... whose 40th term gives $\displaystyle \sum_{n=0}^{40} \frac{36^{n}}{n!} = a_{40}\ \frac{36^{40}}{40!}$...

Kind regards

$\chi$ $\sigma$
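A sketch of the computation in Python: the recurrence (3) yields the partial sum of the Poisson series without any special function, and scipy.stats.poisson provides an independent cross-check:

```python
# Build a_40 from the recurrence, convert it to the partial sum, and
# compare with the Poisson survival function.
from math import exp, factorial
from scipy.stats import poisson

lam, N = 36.0, 40
a = 1.0                                # a_0 = 1
for n in range(N):
    a = (n + 1) / lam * a + 1.0        # a_{n+1} = (n+1)/lam * a_n + 1

# sum_{n=0}^{N} lam^n/n! equals a_N * lam^N / N!, hence:
p_ow = 1.0 - exp(-lam) * a * lam**N / factorial(N)
print(p_ow)                            # 0.22289757...
print(poisson.sf(N, lam))              # P{X > 40}: the same number
```

The recurrence keeps every intermediate quantity of moderate size, which is the point of the difference-equation formulation.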
 
  • #46
chisigma said:
Very well!... in a time T = 180 min the expected number of people to be served is $\displaystyle \lambda=32$, and the probability of having exactly n people is...

$\displaystyle P_{n}= e^{- \lambda}\ \frac{\lambda^{n}}{n!}$ (1)

The probability of not serving all clients is...

$\displaystyle P_{ow}= 1- e^{- 32}\ \sum_{n=0}^{40} \frac{32^{n}}{n!}$ (2)

At this point the problem is the computation of the sum in (2), that can be performed, for example, using wolframalpha...

http://www.wolframalpha.com/input/?i=sum+e%5E%28-+32%29+32%5Ej%2F+j%21%2C+j%3D0...+40

... so that $P_{ow}= .070660852878...$. If wolframalpha isn't available, then the sum in (2) can be efficiently computed as explained in...

http://www.mathhelpboards.com/threads/426-Difference-equation-tutorial-draft-of-part-I

... using the sequence defined by the difference equation...

$\displaystyle a_{n+1}=\frac{n+1}{32}\ a_{n}+1\ ,\ a_{0}=1$ (3)

... whose 40th term gives $\displaystyle \sum_{n=0}^{40} \frac{32^{n}}{n!} = a_{40}\ \frac{32^{40}}{40!}$...

Kind regards

$\chi$ $\sigma$

\( 18 \times 2=36 \)

CB
 
  • #47
Posted on 06 05 2012 on www.talkstat.com by the member Youler and not yet solved…

When cycling home at night, I notice that sometimes my rear light is switched off when I arrive home. Presumably the switch is loose and can flip from on to off or back again when I go over bumps. I suppose that the number n of flips per trip has a Poisson distribution...

$\displaystyle P \{ k=n\} = e^{-\lambda}\ \frac{\lambda^{n}}{n!}$

If the probability that the light is still on when I arrive home is p, find $\lambda$...

Kind regards

$\chi$ $\sigma$
 
  • #48
chisigma said:
Posted on 06 05 2012 on www.talkstat.com by the member Youler and not yet solved…

When cycling home at night, I notice that sometimes my rear light is switched off when I arrive home. Presumably the switch is loose and can flip from on to off or back again when I go over bumps. I suppose that the number n of flips per trip has a Poisson distribution...

$\displaystyle P \{ k=n\} = e^{-\lambda}\ \frac{\lambda^{n}}{n!}$

If the probability that the light is still on when I arrive home is p, find $\lambda$...


If we denote by $P_{on}$ and $P_{off}$ the probabilities that the light is on or off, then...

$\displaystyle P_{on}= e^{- \lambda}\ \sum_{n\ \text{even}} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \cosh \lambda= \frac{1+ e^{-2\ \lambda}}{2}$ (1)

$\displaystyle P_{off}= e^{- \lambda}\ \sum_{n\ \text{odd}} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \sinh \lambda= \frac{1- e^{-2\ \lambda}}{2}$ (2)

The problem seems to be solved... but in fact we are required to find $\lambda$ as a function of $P_{on}$, not $P_{on}$ as a function of $\lambda$, so the inversion of (1) or (2) is necessary; that will be done in a subsequent post...

Kind regards

$\chi$ $\sigma$
 
  • #49
chisigma said:
If we denote by $P_{on}$ and $P_{off}$ the probabilities that the light is on or off, then...

$\displaystyle P_{on}= e^{- \lambda}\ \sum_{n\ \text{even}} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \cosh \lambda= \frac{1+ e^{-2\ \lambda}}{2}$ (1)

$\displaystyle P_{off}= e^{- \lambda}\ \sum_{n\ \text{odd}} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \sinh \lambda= \frac{1- e^{-2\ \lambda}}{2}$ (2)

The problem seems to be solved... but in fact we are required to find $\lambda$ as a function of $P_{on}$, not $P_{on}$ as a function of $\lambda$, so the inversion of (1) or (2) is necessary; that will be done in a subsequent post...

From the practical point of view it is easier to find $\lambda$ as a function of $q=P_{off}$ and then apply, if necessary, the substitution $p=1-q$. The procedure is relatively easy...

$\displaystyle q=\frac{1-e^{-2 \lambda}}{2} \implies 1-2\ q= e^{-2\ \lambda} \implies \lambda= - \ln \sqrt{1-2\ q}= - \ln \sqrt {2\ p-1}$ (1)

... where $0< q < \frac{1}{2}$ and $\frac{1}{2}< p < 1$...

Kind regards

$\chi$ $\sigma$
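A small numerical check of the inversion (1), assuming NumPy/SciPy: pick a $\lambda$, compute $P_{on}$ from the even-$n$ Poisson terms, and recover $\lambda$:

```python
# Sum the even-n Poisson terms to get P_on, then invert with
# lambda = -ln sqrt(2 P_on - 1); the input value should come back.
import numpy as np
from scipy.special import factorial

lam_true = 0.8                              # illustrative value
n = np.arange(50)
pmf = np.exp(-lam_true) * lam_true**n / factorial(n)
p_on = pmf[::2].sum()                       # even number of flips: light on
lam_rec = -np.log(np.sqrt(2.0 * p_on - 1.0))
print(p_on, lam_rec)                        # lam_rec ~ 0.8
```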
 
  • #50
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that $f(x)= e^{-x},\ x \ge 0$, and $g(y)= 3\ e^{-3 y},\ y \ge 0$, find the probability density function of $z=x/y$. How can it be taken forward?…

Kind regards

$\chi$ $\sigma$
 
  • #51
chisigma said:
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that $f(x)= e^{-x},\ x \ge 0$, and $g(y)= 3\ e^{-3 y},\ y \ge 0$, find the probability density function of $z=x/y$. How can it be taken forward?…

In post #31 of this thread we found that, if X and Y are r.v.'s with p.d.f.'s $\displaystyle f_{x}(*)$ and $\displaystyle f_{y} (*)$, then the r.v. $U=\frac{X}{Y}$ has p.d.f. ...

$\displaystyle f_{u}(u)= \int_{- \infty}^{+ \infty} |v|\ f_{x\ y} (u v,v)\ dv$ (1)

Now for $f_{x}(x)=e^{-x},\ x>0$ and $f_{y}(y)=3\ e^{-3\ y},\ y>0$ we have...

$\displaystyle f_{u}(u)= 3\ \int_{0}^{\infty} v\ e^{-(u+3)\ v}\ dv = \frac{3}{(u+3)^{2}},\ u>0$ (2)

Kind regards

$\chi$ $\sigma$
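A Monte-Carlo check of (2), assuming NumPy: integrating the density gives the CDF $u/(u+3)$, which the empirical CDF of $X/Y$ should match:

```python
# For X ~ Exp(1), Y ~ Exp(3), compare the empirical CDF of U = X/Y with
# the exact CDF u/(u+3) implied by (2).
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(1.0, 10**6)        # X ~ Exp(1)
y = rng.exponential(1.0 / 3.0, 10**6)  # Y ~ Exp(3): scale = 1/rate
u = x / y
for t in (1.0, 3.0, 10.0):
    print(t, (u < t).mean(), t / (t + 3.0))
```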
 
  • #52
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that

$ \displaystyle f(x)= \frac{1}{\pi\ (1+x^{2})},\ |x|<1$

... and...

$\displaystyle g(y)= y\ e^{- \frac{y^{2}}{2}},\ y>0$

...find the joint density function of z and w, where z=xy and w=x…

Kind regards

$\chi$ $\sigma$
 
  • #53
chisigma said:
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that

$ \displaystyle f(x)= \frac{1}{\pi\ (1+x^{2})},\ |x|<1$

... and...

$\displaystyle g(y)= y\ e^{- \frac{y^{2}}{2}},\ y>0$

...find the joint density function of z and w, where z=xy and w=x…


In post #31 of this thread we found that, if X and Y are r.v.'s with p.d.f.'s $\displaystyle f_{x}(*)$ and $\displaystyle f_{y}(*)$, then the r.v. $U= X\ Y$ has p.d.f. ...

$\displaystyle f_{u}(u)= \int_{- \infty}^{+ \infty} \frac{1}{|v|}\ f_{x,y} (\frac{u}{v}, v)\ dv$ (1)

Now for $\displaystyle f_{x}(x)= \frac{1}{\pi\ (1+x^{2})},\ |x|<1$ and $\displaystyle f_{y}(y)= y\ e^{- \frac{y^{2}}{2}},\ y>0$ we have...

$\displaystyle f_{u} (u)= \frac{1}{\pi}\ \int_{|u|}^{\infty} \frac{e^{- \frac{v^{2}}{2}}}{1+\frac{u^{2}}{v^{2}}}\ dv$ (2)

The integral in (2) [where the lower limit $|u|$ comes from the constraint $|x|=|u/v|<1$], however, is not very comfortable, and some more study is necessary... Note also that the question as posed asks for the joint density of (z,w): with the change of variables $x=w,\ y=\frac{z}{w}$, whose Jacobian is $\frac{1}{|w|}$, it is $\displaystyle f_{z w}(z,w)= \frac{1}{|w|}\ f_{x}(w)\ f_{y}(\frac{z}{w})$.

Kind regards

$\chi$ $\sigma$
 
  • #54
Posted on 06 15 2012 on www.matematicamente.it by the member Edwavit [original in Italian…] and not yet solved…

Hello boys!... a question I can’t solve: X is a Gaussian r.v. with $\mu = 10$ and $\sigma=4$ and $P(X>C)=.05$. Find C…

Kind regards

$\chi$ $\sigma$
 
  • #55
chisigma said:
Posted on 06 15 2012 on www.matematicamente.it by the member Edwavit [original in Italian…] and not yet solved…

Hello boys!... a question I can’t solve: X is a Gaussian r.v. with $\mu = 10$ and $\sigma=4$ and $P(X>C)=.05$. Find C…

Setting $x=\frac{C-\mu}{\sigma}$, we have...

$\displaystyle P\{X<C\}= \frac{1}{2}\ \{1+ \text{erf}\ (\frac{x}{\sqrt{2}})\} = .95$ (1)

... where...

$\displaystyle \text{erf}\ (t)= \frac{2}{\sqrt{\pi}}\ \int_{0}^{t} e^{- \xi^{2}}\ d \xi$ (2)

... so that from (1) we derive...

$\displaystyle x= \sqrt{2}\ \text{erf}\ ^{-1} (.9)$ (3)

Now of course the problem is the computation of the function $\text{erf}\ ^{-1} (*)$. In...

http://www.mathhelpboards.com/threads/1223-erf

… it has been explained how to find the coefficients of the Maclaurin expansion of the function $\text{erf}\ ^{-1} (*)$, a task described as 'tedious but not very difficult'. In …

http://mathworld.wolfram.com/InverseErf.html


… we discover that the ‘tedious task’ has been done by somebody some years ago and the result is the series expansion…

$\displaystyle \text{erf}\ ^{-1} (\frac{2\ x}{\sqrt{\pi}}) = \sum_{n=0}^{\infty} a_{n}\ x^{2n+1}$ (4)

... where...

$\displaystyle a_{n}=\frac{c_{n}}{2n+1}$ (5)

... with $c_{n}$ solution of the difference equation...

$\displaystyle c_{n}= \sum_{k=0}^{n-1} \frac{c_{k}\ c_{n-k-1}}{(k+1)\ (2\ k+1)},\ c_{0}=1$ (6)

The first $a_{n}$ are $a_{0}=1$, $a_{1}= \frac{1}{3}$, $a_{2}= \frac {7}{30}$, $a_{3}= \frac{127}{630}$,... The remaining computation is relatively 'comfortable' and is left to the reader...

Kind regards

$\chi$ $\sigma$
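A sketch of the 'tedious task' in Python: build the coefficients from the recurrence (5)-(6), sum the series (4), and solve for C. Roughly 200 terms are assumed sufficient at the argument 0.9, judging from the visible convergence rate:

```python
# Maclaurin coefficients of inverse erf from (5)-(6), then the series (4).
import math

def erfinv_series(z, terms=200):
    """erf^{-1}(z) via erf^{-1}(2x/sqrt(pi)) = sum a_n x^{2n+1}, x = sqrt(pi) z / 2."""
    c = [1.0]
    for n in range(1, terms):
        c.append(sum(c[k] * c[n - 1 - k] / ((k + 1) * (2 * k + 1))
                     for k in range(n)))
    x = math.sqrt(math.pi) * z / 2.0
    return sum(c[n] / (2 * n + 1) * x ** (2 * n + 1) for n in range(terms))

mu, sigma = 10.0, 4.0
x = math.sqrt(2.0) * erfinv_series(0.9)    # ~1.6449, the 95% normal quantile
print(mu + sigma * x)                      # C ~ 16.58
```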
 
  • #56
Posted on 06 16 2012 on www.talkstat.com by the member Ramirez and not yet solved…

A light bulb manufacturer advertises that 'the average life of our new light bulb is 50,000 seconds. An immediate adjustment will be made on any bulb that does not last 50,000 seconds'. You purchased four of these bulbs. What is the probability that all four bulbs will last more than 50,000 seconds?...

Kind regards

$\chi$ $\sigma$
 
  • #57
chisigma said:
Posted on 06 16 2012 on www.talkstat.com by the member Ramirez and not yet solved…

A light bulb manufacturer advertises that 'the average life of our new light bulb is 50,000 seconds. An immediate adjustment will be made on any bulb that does not last 50,000 seconds'. You purchased four of these bulbs. What is the probability that all four bulbs will last more than 50,000 seconds?...

Assuming that the lifetime T of a manufactured article with mean lifetime $\tau$ is exponentially distributed, its p.d.f. is...

$\displaystyle f(t)= \frac{1}{\tau}\ e^{- \frac{t}{\tau}},\ t>0$ (1)

... so that the probability that the life time is greater than $\tau$ is...

$\displaystyle P\{T>\tau\} = \frac{1}{\tau}\ \int_{\tau}^{\infty} e^{-\frac{t}{\tau}}\ dt = e^{-1}$ (2)

For 4 manufactured articles with independent lifetimes, the probability that all of them exceed $\tau$ is $P=e^{-4} \approx .018$...

Kind regards

$\chi$ $\sigma$
 
  • #58
Posted on 06 21 2012 on www.matematicamente.it by the member sairaki87 [original in Italian language...] and not yet solved...

There are two fellows A and B and an urn with 50 white balls and 1 black ball. A and B alternately draw a ball, and the winner is the one who draws the black ball; A draws first. What are the probabilities for A and for B to win?...

Kind regards

$\chi$ $\sigma$
 
  • #59
chisigma said:
Posted on 06 21 2012 on www.matematicamente.it by the member sairaki87 [original in Italian language...] and not yet solved...

There are two fellows A and B and an urn with 50 white balls and 1 black ball. A and B alternately draw a ball, and the winner is the one who draws the black ball; A draws first. What are the probabilities for A and for B to win?...

Of course it is sufficient to compute the probability $P_{A}$ that A wins; the probability that B wins is then $P_{B}=1-P_{A}$. If n is the overall number of balls [n-1 white and 1 black...], the probability that the black ball is drawn at the k-th extraction is...

$\displaystyle P_{k}=\frac{n-1}{n}\ \frac{n-2}{n-1}\ ... \frac{n-k+1}{n-k+2}\ \frac{1}{n-k+1}=\frac{1}{n}$ (1)

Now there are two possibilities...

a) n is even, so that $\displaystyle P_{A}= \sum_{k\ \text{odd}} \frac{1}{n}= \frac{1}{2}$

b) n is odd, so that $\displaystyle P_{A}= \sum_{k\ \text{odd}} \frac{1}{n}= \frac{n+1}{2\ n}$

Here n=51, so that $\displaystyle P_{A}= \frac{26}{51}$ and $\displaystyle P_{B}= \frac{25}{51}$...

Kind regards

$\chi$ $\sigma$
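A simulation sketch, assuming NumPy: the position of the black ball is uniform on 1..51 and A draws at the odd positions, so the empirical frequency should approach 26/51:

```python
# Simulate the game a million times via the black ball's position.
import numpy as np

rng = np.random.default_rng(3)
pos = rng.integers(1, 52, 10**6)      # position of the black ball, 1..51
print((pos % 2 == 1).mean(), 26 / 51) # A wins at odd positions: ~0.5098
```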
 
  • #60
Posted on 04 23 2012 on http://www.scienzematematiche.it/ by the user whitefang [original in Italian…] and not yet properly solved…

We are shooting at a target over a two-dimensional plane. The horizontal and vertical distances of the hits with respect to the target are normal r.v.'s with $\mu=0$ and $\sigma=4$. R is the distance between the hit and the target. Find $E\{R\}$...

Kind regards

$\chi$ $\sigma$
 
  • #61
chisigma said:
Posted on 04 23 2012 on http://www.scienzematematiche.it/ by the user whitefang [original in Italian…] and not yet properly solved…

We are shooting at a target over a two-dimensional plane. The horizontal and vertical distances of the hits with respect to the target are normal r.v.'s with $\mu=0$ and $\sigma=4$. R is the distance between the hit and the target. Find $E\{R\}$...

This is material for a basic course in probability. If X and Y are normal r.v.'s with mean 0 and standard deviation $\sigma$, then $R=\sqrt{X^{2}+Y^{2}}$ is Rayleigh distributed, i.e. its p.d.f. is...

$\displaystyle f(r) =\frac{r}{\sigma^{2}}\ e^{- \frac{r^{2}}{2\ \sigma^{2}}}$ (1)

... and the expected value of R is...

$\displaystyle E\{R\}= \sigma\ \sqrt{\frac{\pi}{2}}$ (2)

... which for $\sigma=4$ gives $E\{R\} \approx 5.01$.

See for more details...

http://mathworld.wolfram.com/RayleighDistribution.html

Kind regards

$\chi$ $\sigma$
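A one-line Monte-Carlo check of (2) with $\sigma=4$, assuming NumPy:

```python
# E{R} for sigma = 4 should be 4*sqrt(pi/2) ~ 5.013.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0.0, 4.0, 10**6)
y = rng.normal(0.0, 4.0, 10**6)
print(np.hypot(x, y).mean(), 4.0 * np.sqrt(np.pi / 2.0))
```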
 
  • #62

Posted on 04 19 2012 on www.mathhelpforum.com by the member cjtdevil and not yet solved…

How do you find a generating function for S(n,2) {S denotes the Stirling numbers of the second kind} as a ratio of polynomials?...

That is not properly a probability question, even if it has been posted in the 'Advanced Statistics' section. Anyway it is interesting...

Kind regards

$\chi$ $\sigma$
 
  • #63
chisigma said:

Posted on 04 19 2012 on www.mathhelpforum.com by the member cjtdevil and not yet solved…

How do you find a generating function for S(n,2) {S denotes the Stirling numbers of the second kind} as a ratio of polynomials?...

That is not properly a probability question, even if it has been posted in the 'Advanced Statistics' section. Anyway it is interesting...

Kind regards

$\chi$ $\sigma$

We can start from the definition of the Stirling numbers of the second kind...

$\displaystyle S(n,k)= \frac{1}{k!}\ \sum_{i=0}^{k} (-1)^{k-i}\ \binom{k}{i}\ i^{n}$ (1)

... which obey the recurrence relation...

$\displaystyle S(n+1,k)= k\ S(n,k)+ S(n,k-1)$ (2)

From (1) we derive $\displaystyle S(n,1)= 1-\delta_{n}$ and $\displaystyle S(n,2)= 2^{n-1}-1+ \frac{\delta_{n}}{2}$, so that from (2) we have...

$\displaystyle S(n+1,2)= 2\ S(n,2)+ S(n,1) = 2^{n}-1 \implies S(n,2)= 2^{n-1}-1\ \text{for}\ n \ge 1$ (3)

... and its generating function is...

$\displaystyle f(x)= \sum_{n=1}^{\infty} (2^{n-1}-1)\ x^{n}= \frac{x}{1-2\ x} - \frac{x}{1-x} = \frac{x^{2}}{(1-x)\ (1-2\ x)}$ (4)

Kind regards

$\chi$ $\sigma$
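A small Python check of the closed form (3) and the generating function (4), straight from definition (1):

```python
# S(n,2) from definition (1) against 2^{n-1} - 1, and the generating
# function (4) checked numerically at x = 0.1 (where |2x| < 1).
from math import comb, factorial

def S(n, k):
    """Stirling numbers of the second kind, straight from definition (1)."""
    return sum((-1) ** (k - i) * comb(k, i) * i ** n
               for i in range(k + 1)) // factorial(k)

print([S(n, 2) for n in range(1, 8)])           # 0, 1, 3, 7, 15, 31, 63
print([2 ** (n - 1) - 1 for n in range(1, 8)])  # the same list

x = 0.1
print(sum(S(n, 2) * x**n for n in range(60)), x**2 / ((1 - x) * (1 - 2 * x)))
```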
 
  • #64
Posted on 05 10 2012 on www.talkstat.com by the member rogersticks and not yet solved…

A game is played as follows: a pile contains 1 dollar and a coin is flipped. Each time a head occurs, the amount in the pile is doubled, and if a tail appears, the pile is given to the player. How much money should be paid to play this game?...

Kind regards

$\chi$ $\sigma$
 
  • #65
chisigma said:
Posted on 05 10 2012 on www.talkstat.com by the member rogersticks and not yet solved…

A game is played as follows: a pile contains 1 dollar and a coin is flipped. Each time a head occurs, the amount in the pile is doubled, and if a tail appears, the pile is given to the player. How much money should be paid to play this game?...

The probability that a tail first appears after n-1 consecutive heads is $\displaystyle p_{n}= 2^{-n}$, and in that case the pile handed to the player is worth $\displaystyle c_{n}= 2^{n-1}$ dollars, so that the expected payout is...

$\displaystyle C= \sum_{n=1}^{\infty} c_{n}\ p_{n} = \sum_{n=1}^{\infty} \frac{1}{2}$ (1)

Now the series (1) diverges, so the expected payout is unlimited and no finite entry fee is 'fair'. This is precisely the 'Saint Petersburg paradox'...

Kind regards

$\chi$ $\sigma$
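A simulation sketch, assuming NumPy: the sample mean payout keeps creeping upward with the number of plays, as expected for a game whose expectation diverges:

```python
# Each play: count heads before the first tail; the pile won is 2^heads.
import numpy as np

rng = np.random.default_rng(5)
for plays in (10**3, 10**5, 10**7):
    heads = rng.geometric(0.5, plays) - 1   # heads before the first tail
    print(plays, (2.0 ** heads).mean())     # mean grows with plays
```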
 
  • #66
Posted on 06 30 2012 on www.matematicamente.it by superfox [original in Italian…] and not yet properly solved…

How does one show that, given independent r.v.'s X and Y uniformly distributed in (0,1), $\displaystyle P \{X^{2}+Y^{2} <1 \}= \frac{\pi}{4}$ ?...

Kind regards

$\chi$ $\sigma$
 
  • #67
chisigma said:
Posted on 06 30 2012 on www.matematicamente.it by superfox [original in Italian…] and not yet properly solved…

How does one show that, given independent r.v.'s X and Y uniformly distributed in (0,1), $\displaystyle P \{X^{2}+Y^{2} <1 \}= \frac{\pi}{4}$ ?...

If $X$ is uniformly distributed in (0,1), then the p.d.f. $f(x)$ of $X^{2}$ can be found as follows...

$\displaystyle P\{X^{2}<x \}= \int_{0}^{\sqrt{x}} d \xi = \sqrt{x} \implies f(x)=\begin{cases}\frac{1}{2\ \sqrt{x}} &\text{if}\ 0<x<1\\ 0 &\text{otherwise} \end{cases} $ (1)

The Laplace Transform of (1) is...

$\displaystyle F(s)= \mathcal {L}\{f(x) \}= \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (\sqrt{s})}{\sqrt{s}}$ (2)

... so that the Laplace Transform of the p.d.f. $g(x)$ of the r.v. $Z=X^{2}+Y^{2}$ is...

$\displaystyle G(s)=F^{2}(s)= \frac{\pi}{4}\ \frac{\text{erf}^{2}\ (\sqrt{s})}{s}$ (3)

Now we are interested in the integral of $g(x)$ from 0 to 1, that is...

$\displaystyle P\{Z<1\}= \mathcal{L}^{-1} \{\frac{G(s)}{s}\}_{x=1} = \frac{\pi}{4}$ (4)

... although inverting (3) in closed form is not comfortable; the value $\frac{\pi}{4}$ is confirmed by the geometric argument below...

Kind regards

$\chi$ $\sigma$
 
  • #68
chisigma said:
If $X$ is uniformly distributed in (0,1), then the p.d.f. $f(x)$ of $X^{2}$ can be found as follows...

$\displaystyle P\{X^{2}<x \}= \int_{0}^{\sqrt{x}} d \xi = \sqrt{x} \implies f(x)=\begin{cases}\frac{1}{2\ \sqrt{x}} &\text{if}\ 0<x<1\\ 0 &\text{otherwise} \end{cases} $ (1)

The Laplace Transform of (1) is...

$\displaystyle F(s)= \mathcal {L}\{f(x) \}= \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (\sqrt{s})}{\sqrt{s}}$ (2)

... so that the Laplace Transform of the p.d.f. $g(x)$ of the r.v. $Z=X^{2}+Y^{2}$ is...

$\displaystyle G(s)=F^{2}(s)= \frac{\pi}{4}\ \frac{\text{erf}^{2}\ (\sqrt{s})}{s}$ (3)

Now we are interested in the integral of $g(x)$ from 0 to 1, that is...

$\displaystyle P\{Z<1\}= \mathcal{L}^{-1} \{\frac{G(s)}{s}\}_{x=1} = \frac{\pi}{4}$ (4)

... although inverting (3) in closed form is not comfortable; the value $\frac{\pi}{4}$ is confirmed by the geometric argument below...

Kind regards

$\chi$ $\sigma$

Making hard work of a trivial problem: the question asks for the area of the part of the unit disc lying in the first quadrant, so without calculation the answer must be \(\pi/4\).

CB
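CB's geometric point, checked by simulation (NumPy assumed): the fraction of uniform points in the unit square that fall inside the unit circle tends to $\pi/4$:

```python
# Classic pi/4 estimate: uniform points in the unit square vs the unit circle.
import numpy as np

rng = np.random.default_rng(6)
x, y = rng.random(10**6), rng.random(10**6)
print((x**2 + y**2 < 1.0).mean(), np.pi / 4.0)
```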
 
  • #69
Posted on 07 05 2012 on www.mathhelpforum.com by the member Len and not yet solved…

Let $X_{1}$, $X_{2}$ and $X_{3}$ be i.i.d. r.v.'s with common probability density function...

$ f(x)=\begin{cases} e^{-x} &\text{if}\ x \ge 0\\ 0 &\text{otherwise} \end{cases} $

Find $\displaystyle P \{ X_{1}<X_{3} -X_{2} \}$...

Kind regards

$\chi$ $\sigma$
 
  • #70
chisigma said:
Posted on 07 05 2012 on www.mathhelpforum.com by the member Len and not yet solved…

Let $X_{1}$, $X_{2}$ and $X_{3}$ be i.i.d. r.v.'s with common probability density function...

$ f(x)=\begin{cases} e^{-x} &\text{if}\ x \ge 0\\ 0 &\text{otherwise} \end{cases} $

Find $\displaystyle P \{ X_{1}<X_{3} -X_{2} \}$...

The requested probability is of course $\displaystyle P \{X_{1} + X_{2} - X_{3} <0 \}$. The r.v.'s $X_{1}$ and $X_{2}$ have p.d.f. $ f(x)=\begin{cases} e^{-x} &\text{if}\ x \ge 0\\ 0 &\text{otherwise} \end{cases}$ and the r.v. $-X_{3}$ has p.d.f. $f(x)=\begin{cases} e^{x} &\text{if}\ x \le 0\\ 0 &\text{otherwise} \end{cases}$, so that the Fourier Transform of the p.d.f. of the r.v. $Z=X_{1} + X_{2} - X_{3}$, after partial-fraction expansion, is...

$\displaystyle F(i\ \omega) = \frac{1}{4}\ \frac{1}{1+i\ \omega} + \frac{1}{2}\ \frac{1}{(1+i\ \omega)^{2}} + \frac{1}{4}\ \frac{1}{1-i\ \omega}$ (1)

From (1), only the last term contributes to the density for negative arguments [it inverts to $\frac{1}{4}\ e^{z}$ for $z \le 0$], so that $\displaystyle P\{X_{1} + X_{2} - X_{3} <0 \}= \int_{-\infty}^{0} \frac{e^{z}}{4}\ dz = \frac{1}{4}$ ...

Kind regards

$\chi$ $\sigma$
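A Monte-Carlo check of the result, assuming NumPy:

```python
# For i.i.d. Exp(1) variables, P{X1 < X3 - X2} should come out at 1/4.
import numpy as np

rng = np.random.default_rng(7)
x1, x2, x3 = rng.exponential(1.0, (3, 10**6))
print((x1 < x3 - x2).mean())   # ~0.25
```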
 
