Micromass' big October challenge

In summary, the October challenge has been announced, with many of the problems suggested by participants. The rules are given, along with a set of advanced challenges. These include finding the trajectory of an object experiencing a centripetal force, proving the primitive recursiveness of certain functions, finding the number of generalized limits in ZFC, finding the curve of a pirate ship chasing a merchant vessel, determining the distribution of a random variable, finding the expected value and variance of a random variable, and proving the impossibility of reversing 2016 bells in an odd number of turns. There is also a task to find all 10-digit numbers with specific properties.
  • #71
Awesome, it appears (a) and (c) are solved then.
 
  • Like
Likes Charles Link
  • #72
And I saw that (b) was solved too!
 
  • Like
Likes Charles Link
  • #73
Ok, I'll try advanced problem number 7. First, we note that ##X=b\tan{\theta}##. To get the expected value of ##X##, we use the law of the unconscious statistician:
$$E[X(\theta)] = \int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} X(\theta) f(\theta)d\theta$$
where ##f(\theta)## is the probability density function of ##\theta##. We know that ##\theta## is uniformly distributed over ##(-\pi/2,\pi/2)##, so ##f(\theta) = 1/\pi##. Thus the expected value of ##X## is
$$E[X(\theta)] =\frac{b}{\pi} \int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} \tan{\theta} d\theta$$
Since ##\tan \theta## is an odd function and the integration is symmetric about ##\theta=0##, the expected value is ##E[X] = 0##.

For the variance, we have:
$$\sigma^2 = E[X^2]-(E[X])^2$$
so we need to evaluate ##E[X^2]##. We use the unconscious statistician again, which gives us the integral:
$$E[X^2] = \frac{b^2}{\pi} \int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} \tan^2{\theta} d\theta$$
This integral diverges to infinity as we take the bounds of integration out to ##\pm \pi/2##. This would imply that the variance is infinite ##(\sigma^2 = \infty)##. I'm not sure if this is right, but my intuition says it makes sense, since the probability that ##|X|## exceeds any given bound is nonzero. But I don't know that for certain, and I'm not sure how to make it more rigorous.
 
  • #74
TeethWhitener said:
Ok, I'll try advanced problem number 7. First, we note that ##X=b\tan{\theta}##. To get the expected value of ##X##, we use the law of the unconscious statistician:
$$E[X(\theta)] = \int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} X(\theta) f(\theta)d\theta$$
where ##f(\theta)## is the probability density function of ##\theta##. We know that ##\theta## is uniformly distributed over ##(-\pi/2,\pi/2)##, so ##f(\theta) = 1/\pi##. Thus the expected value of ##X## is
$$E[X(\theta)] =\frac{b}{\pi} \int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} \tan{\theta} d\theta$$
Since ##\tan \theta## is an odd function and the integration is symmetric about ##\theta=0##, the expected value is ##E[X] = 0##.

For the variance, we have:
$$\sigma^2 = E[X^2]-(E[X])^2$$
so we need to evaluate ##E[X^2]##. We use the unconscious statistician again, which gives us the integral:
$$E[X^2] = \frac{b^2}{\pi} \int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} \tan^2{\theta} d\theta$$
This integral diverges to infinity as we take the bounds of integration out to ##\pm \pi/2##. This would imply that the variance is infinite ##(\sigma^2 = \infty)##. I'm not sure if this is right, but my intuition says it makes sense, since the probability that ##|X|## exceeds any given bound is nonzero. But I don't know that for certain, and I'm not sure how to make it more rigorous.

Your expected value is wrong.
 
  • #75
micromass said:
Your expected value is wrong.
Did I cut too many corners? Upon closer inspection, I get
$$\int^{\frac{\pi}{2}}_{-\frac{\pi}{2}} \tan{\theta} d \theta = \infty - \infty$$
so it's an indeterminate form. I have no idea how to handle this if not by the symmetry of the function.
 
  • #76
micromass said:
Your expected value is wrong.
Really? As far as I can see, the symmetry of the problem implies that the expectation is ##0##. If we take two infinitesimal intervals ##[-x-dx, -x]## and ##[x,x+dx]## on the wall, corresponding to the angle intervals ##[-\theta-d\theta, -\theta]## and ##[\theta,\theta+d\theta]##, respectively, both having probability ##d\theta/\pi##, their contributions to the expectation are ##-x\,d\theta/\pi## and ##x\,d\theta/\pi##, respectively, so they cancel each other out. Every infinitesimal interval has such a "mirror" interval on the other side of ##x=0##, so the expectation must be ##0##.

How could it be any other way?
 
  • #77
Erland said:
Really? As far as I can see, the symmetry of the problem implies that the expectation is ##0##. If we take two infinitesimal intervals ##[-x-dx, -x]## and ##[x,x+dx]## on the wall, corresponding to the angle intervals ##[-\theta-d\theta, -\theta]## and ##[\theta,\theta+d\theta]##, respectively, both having probability ##d\theta/\pi##, their contributions to the expectation are ##-x\,d\theta/\pi## and ##x\,d\theta/\pi##, respectively, so they cancel each other out. Every infinitesimal interval has such a "mirror" interval on the other side of ##x=0##, so the expectation must be ##0##.

How could it be any other way?
Then by that same reasoning, would you say that ##\int_{-\infty}^{+\infty} xdx = 0## too?
 
  • #78
micromass said:
Then by that same reasoning, would you say that ##\int_{-\infty}^{+\infty} xdx = 0## too?
So is it just an indeterminate form, like in my post #75 above?
 
  • #79
TeethWhitener said:
So is it just an indeterminate form, like in my post #75 above?

Yes, the expectation value doesn't exist.

You can do the test here: http://www.math.uah.edu/stat/apps/CauchyExperiment.html Run it for 1000 or 10000 turns. You'll see it getting close to 0, but then suddenly it'll jump away from 0 again.
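For anyone who wants to try this offline, here is a minimal sketch (assuming NumPy is available) that samples ##\theta## uniformly on ##(-\pi/2,\pi/2)##, forms ##X=b\tan\theta##, and prints the running mean at a few checkpoints. Instead of settling down, the running mean keeps getting kicked away from 0 by occasional huge samples, which is the non-existence of the expectation showing up numerically.

[CODE=python]
import numpy as np

# theta ~ Uniform(-pi/2, pi/2), X = b*tan(theta) is Cauchy-distributed,
# so its sample mean does not converge as the sample size grows.
rng = np.random.default_rng(0)
b = 1.0
n = 10_000

theta = rng.uniform(-np.pi / 2, np.pi / 2, size=n)
x = b * np.tan(theta)

running_mean = np.cumsum(x) / np.arange(1, n + 1)
for k in (100, 1_000, 5_000, 10_000):
    print(k, running_mean[k - 1])   # wanders instead of converging
[/CODE]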
 
  • #80
micromass said:
Yes, the expectation value doesn't exist.
Yes, Ok, I agree. The integral ##\int_{-\pi/2}^{\pi/2}\tan\theta d\theta## diverges. Sorry...

But then, the variance is undefined also, since it is defined in terms of the expectation: ##V(X)=E((X-E(X))^2)##.

So the entire problem was a poser... :wink:
 
  • Like
Likes TeethWhitener
  • #81
Erland said:
So the entire problem was a poser... :wink:

Haha, yes indeed!
 
  • #82
Erland said:
But then, the variance is undefined also
Good point. I was assuming that ##E[X]=0##. Call it the "renormalized" answer :smile:
 
  • #83
Ok, I'll try the unsolved advanced number 2. Last month, I was somewhat ill-advisedly trying to show that ##dM_p/dp## was positive everywhere, but the hint this month helps out a whole bunch. We want to know the nature of the relationship:
$$\left(\sum_i {\frac{x_i^p}{n}}\right)^\frac{1}{p} \sim \left(\sum_i {\frac{x_i^q}{n}}\right)^\frac{1}{q}$$
For the moment, let us consider both terms raised to the power of ##p##. We have:
$$\sum_i {\frac{x_i^p}{n}}$$
and
$$\left(\sum_i {\frac{x_i^q}{n}}\right)^\frac{p}{q}$$
We notice that
$$\sum_i {\frac{\left(x_i^q\right)^{\frac{p}{q}}}{n}} = \sum_i {\frac{x_i^p}{n}}$$
which allows us to apply Jensen's inequality:
$$\left(\sum_i {\frac{x_i^q}{n}}\right)^\frac{p}{q} \leq \sum_i {\frac{\left(x_i^q\right)^{\frac{p}{q}}}{n}} = \sum_i {\frac{x_i^p}{n}}$$
for ##x^{p/q}## convex (2nd derivative ##> 0##) and
$$\left(\sum_i {\frac{x_i^q}{n}}\right)^\frac{p}{q} \geq \sum_i {\frac{x_i^p}{n}}$$
for ##x^{p/q}## concave (2nd derivative ##< 0##). For positive ##x##, ##x^{p/q}## is convex when ##p/q > 1## or ##p/q < 0##, and concave otherwise. (This is true because the second derivative of ##x^a## is ##a(a-1)x^{a-2}##, and ##a(a-1)## is a parabola that is negative only for ##a## between 0 and 1.)

We want to know what happens for ##p>q##. This condition breaks the problem into 3 cases: ##p## and ##q## both positive (convex), ##p## positive and ##q## negative (convex), and ##p## and ##q## both negative (concave). For the convex case, we have
$$(M_p)^p \geq (M_q)^p$$
Since ##p## is positive whenever ##x^{p/q}## is convex, raising both sides to the power ##1/p## preserves the inequality, so ##M_p \geq M_q##. For the concave case, we have
$$(M_p)^p \leq (M_q)^p$$
Since ##p## is negative whenever ##x^{p/q}## is concave, raising both sides to the power ##1/p## reverses the inequality, so again ##M_p \geq M_q##, which completes the proof.
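(Not part of the proof, but a quick numerical sanity check, sketched here under the assumption that NumPy is available: for random positive data and several pairs ##p>q##, the generalized mean ##M_p## should never come out smaller than ##M_q##.)

[CODE=python]
import numpy as np

def power_mean(x, p):
    """Generalized mean M_p = (sum(x_i^p)/n)^(1/p), for p != 0."""
    x = np.asarray(x, dtype=float)
    return np.mean(x ** p) ** (1.0 / p)

rng = np.random.default_rng(1)
x = rng.uniform(0.1, 10.0, size=50)                 # positive data
pairs = [(3.0, 2.0), (1.0, -1.0), (-0.5, -2.0), (2.0, 0.5)]

for p, q in pairs:                                   # each pair has p > q
    Mp, Mq = power_mean(x, p), power_mean(x, q)
    assert Mp >= Mq - 1e-12, (p, q, Mp, Mq)
    print(f"p={p}, q={q}:  M_p={Mp:.4f} >= M_q={Mq:.4f}")
[/CODE]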
 
  • Like
Likes micromass
  • #84
Shreyas Samudra said:
[attached image]
Is it right?
 
  • #86
Attempting Advanced Problem 6:

##X## and ##Y## are independent stochastic variables which both have uniform distribution on ##[0,1]##. This means that for every Lebesgue measurable set ##A\subseteq \mathbb R^2##, ##P((X,Y)\in A)=m(A\cap[0,1]^2)##, where ##m## is the Lebesgue measure on ##\mathbb R^2##. Also, for every Lebesgue measurable set ##B\subseteq \mathbb R##, ##P(X^Y\in B)=m(\{(x,y)\in[0,1]^2\,|\,x^y\in B\})##.

The distribution of ##X^Y## is determined by its cumulative distribution function ##F:\mathbb R \to[0,1]##, given by ##F(t)=P(X^Y\le t)=m(\{(x,y)\in[0,1]^2\,|\,x^y\le t\})##, for all ##t\in\mathbb R##. To find the distribution, it should suffice to find ##F##.

The function ##f(x,y)=x^y##, defined on ##[0,1]^2## (for definiteness, we define ##f(0,0)=0^0=1##, but this does not really matter) is increasing in ##x## and decreasing in ##y##. ##f## is continuous on all ##[0,1]^2## except at ##(0,0)##, so it is Lebesgue measurable. We have ##f(x,0)=1## and ##f(x,1)=x##, for all ##x\in[0,1]##, and ##f(0,y)=0## and ##f(1,y)=1## for all ##y\in\,]0,1]##, so the range of ##f## is ##[0,1]##.

Hence, ##F(t)=0## for ##t<0## and ##F(t)=1## for ##t>1##.
For ##t\in[0,1]## and ##(x,y)\in[0,1]\times\,]0,1]##, we have ##x^y\le t\Leftrightarrow x\le t^{\frac1y}(\le 1)##, since ##\frac1y\ge 1##. For each ##\epsilon\in\,]0,1[\,##, we then have ##m(\{(x,y)\in[0,1]\times[\epsilon,1]\,|\,x^y\le t\})=\int_\epsilon^1 t^{\frac1y}dy##. This tends to both ##F(t)## (since ##m(\{(x,0)\,|\,0\le x\le 1\})=0##) and ##\int_0^1t^{\frac1y}dy## (since this improper integral converges, because the integrand is continuous and bounded on ##\,]0,1]##) as ##\epsilon\to 0##. Hence ##F(t)=\int_0^1 t^{\frac1y} dy##.
I don't know of any closed expression for this integral, and I doubt that such an expression exists or is known.

So my answer is that the desired distribution is given by the cumulative distribution function:

[tex]F(t)=\begin{cases} 0 & t<0,\\ \int_0^1 t^{\frac1y}dy,& 0\le t \le 1, \\1,&t>1.\end{cases} [/tex]
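(A rough numerical check of this answer, not part of the argument; it assumes NumPy and SciPy are available. It compares the empirical CDF of ##X^Y## from simulation with ##\int_0^1 t^{1/y}\,dy## evaluated by quadrature.)

[CODE=python]
import numpy as np
from scipy.integrate import quad

def F(t):
    """Claimed CDF of X^Y: integral of t^(1/y) over y in (0,1]."""
    if t <= 0.0:
        return 0.0
    if t >= 1.0:
        return 1.0
    val, _ = quad(lambda y: t ** (1.0 / y) if y > 0 else 0.0, 0.0, 1.0)
    return val

rng = np.random.default_rng(2)
n = 200_000
z = rng.uniform(size=n) ** rng.uniform(size=n)   # samples of X^Y

for t in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(t, np.mean(z <= t), F(t))              # empirical vs. claimed CDF
[/CODE]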
 
  • Like
Likes micromass
  • #87
I just posted one answer, and only after replying did I notice that it said this:

micromass said:
CHALLENGES FOR HIGH SCHOOL AND FIRST YEAR UNIVERSITY:

before that problem. Whoops... Did that mean that I should not have posted that answer?
 
  • #88
jostpuur said:
Whoops... Did that mean that I should not have posted that answer?
Yes, and we've removed it.
 
  • #89
Chestermiller said:
My solution to this problem for x vs y, expressed parametrically in terms of ##\theta##, is as follows:
$$x=x_0\left[1-\frac{1}{(\sec{\theta}+\tan{\theta})^{V/v}}\right]$$
$$y=x_0\left[\int_0^{\theta}{\frac{\sec^2{\theta '}d\theta '}{(\sec{\theta '}+\tan{\theta '})^{V/v}}}-\frac{\tan {\theta}}{(\sec{\theta}+\tan{\theta})^{V/v}}\right]$$
where ##\theta '## is a dummy variable of integration.

Charles, your analytic solution to this problem should match mine, and should thus somehow provide the result of correctly integrating of my "mystery integral" in the equation for y. Could you please see if you can back out the integral evaluation? Thanks.

Chet
I finally got the integral in the parametric equation for y. @fresh_42 submitted the integrand to Wolfram Alpha, and it provided the closed-form result. Thank you so much @fresh_42. Here is the desired result:
$$y=\frac{x_0(V/v)}{\left(\frac{V}{v}\right)^2-1}\left[1-\frac{(\sec{\theta}+(V/v)\tan{\theta})}{(\sec{\theta}+\tan{\theta})^{V/v}}\right]$$
I hope I did the "arithmetic" correctly and that this result agrees with Charles Link's results.
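(A quick cross-check, not part of the derivation: assuming SciPy is available, one can integrate the original expression for y numerically and compare it with the closed form above for a few values of ##\theta## and ##V/v##.)

[CODE=python]
import numpy as np
from scipy.integrate import quad

def y_integral(theta, k, x0=1.0):
    """y from the parametric form with the unevaluated integral; k = V/v."""
    sec = lambda t: 1.0 / np.cos(t)
    integrand = lambda t: sec(t) ** 2 / (sec(t) + np.tan(t)) ** k
    val, _ = quad(integrand, 0.0, theta)
    return x0 * (val - np.tan(theta) / (sec(theta) + np.tan(theta)) ** k)

def y_closed(theta, k, x0=1.0):
    """Closed form quoted above (requires k != 1)."""
    sec, tan = 1.0 / np.cos(theta), np.tan(theta)
    return x0 * k / (k ** 2 - 1.0) * (1.0 - (sec + k * tan) / (sec + tan) ** k)

for k in (1.5, 2.0, 3.0):
    for theta in (0.3, 0.8, 1.2):
        print(k, theta, y_integral(theta, k), y_closed(theta, k))
[/CODE]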
 
Last edited:
  • Like
Likes Charles Link
  • #90
Problem 4, high school:
x coordinate = ##-1+\frac{1}{2}-\frac{1}{4}... = -1 + \frac{1}{2}(1-\frac{1}{2}+\frac{1}{3}...) = -1 + \frac{log(2)}{2}##
y coordinate = ##1-\frac{1}{3}+\frac{1}{5}..=\frac{\pi }{4}##
converges to the point ##\left ( -1+\frac{log(2)}{2},\frac{\pi }{4} \right )##
 
  • #91
MAGNIBORO said:
Problem 4, high school:
x coordinate = ##-1+\frac{1}{2}-\frac{1}{4}... = -1 + \frac{1}{2}(1-\frac{1}{2}+\frac{1}{3}...) = -1 + \frac{log(2)}{2}##
y coordinate = ##1-\frac{1}{3}+\frac{1}{5}..=\frac{\pi }{4}##
converges to the point ##\left ( -1+\frac{log(2)}{2},\frac{\pi }{4} \right )##

Read the problem more carefully. The lengths are defined recursively.
 
  • #92
micromass said:
Read the problem more carefully. The lengths are defined recursively.
my mistake,
x coordinate: the series for cosine is ##1-\frac{x^2}{2!}+\frac{x^4}{4!}-\frac{x^6}{6!}+\cdots##, so ##-1+\frac{1}{2!}-\frac{1}{4!}+\frac{1}{6!}-\cdots=-\cos(1)##
y coordinate: the series for sine is ##x-\frac{x^3}{3!}+\frac{x^5}{5!}-\frac{x^7}{7!}+\cdots##, so ##1-\frac{1}{3!}+\frac{1}{5!}-\frac{1}{7!}+\cdots=\sin(1)##

for problem 9:
##I_{n}= \int_{0}^{1}x^n\sqrt{1-x}\, dx = \beta (n+1,\frac{3}{2})=\frac{n! \, \frac{\sqrt{\pi }}{2}}{(n+\frac{3}{2})!}##
and ##I_{n-1}= \frac{(n-1)! \, \frac{\sqrt{\pi }}{2}}{(n+\frac{1}{2})!}##
so
##I_{n-1} \: \frac{2n}{2n+3}= \frac{(n-1)! \, \frac{\sqrt{\pi }}{2}}{(n+\frac{1}{2})!} \: \frac{n}{n+\frac{3} {2}} = \frac{n! \, \frac{\sqrt{\pi }}{2}}{(n+\frac{3}{2})!} = I_{n}##
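(A quick numerical check of the recursion ##I_n=\frac{2n}{2n+3}I_{n-1}##, assuming SciPy is available; not part of the argument.)

[CODE=python]
import numpy as np
from scipy.integrate import quad

def I(n):
    """I_n = integral over [0,1] of x^n * sqrt(1-x) dx, by quadrature."""
    val, _ = quad(lambda x: x ** n * np.sqrt(1.0 - x), 0.0, 1.0)
    return val

for n in range(1, 6):
    print(n, I(n), I(n - 1) * 2 * n / (2 * n + 3))   # the two columns agree
[/CODE]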
 
  • Like
Likes micromass
  • #93
For high school problem 5 (very entertaining):
##A(1,n) = A(0,A(1,n-1)) = 1 + A(1,n-1)##
##1 + A(1,n-1) = 2 + A(1,n-2) ##
##...##
## A(1,n) = n + A(1,0) = n+2##

##A(2,0) = A(1,1) = 1 + 2 = 3##
##A(2,n) = A(1,A(2,n-1)) = A(2,n-1) + 2##
##2 + A(2,n-1) = 4 + A(2,n-2)##
##...##
## A(2,n) = 2n + 3 ##

##A(3,0) = A(2,1) = 5##
##A(3,1) = A(2,A(3,0)) = 2 A(3,0) + 3 = 13##
##A(3,2) = 2 A(3,1) +3 = 29 ##
##A(3,3) = 2 A(3,2) + 3 = 61 ##
now note that
## A(3,1) - A(3,0) = 8##
## A(3,2) - A(3,1) = 16##
## A(3,3) - A(3,2) = 32##
so
##A(3,n+1) = 2 A(3,n) + 3 = A(3,n) + 2^{n+3}##
## A(3,n) = 2^{n+3} - 3##
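(A brute-force check of these formulas for small n, assuming the standard two-argument Ackermann definition used in the problem: ##A(0,n)=n+1##, ##A(m,0)=A(m-1,1)##, ##A(m,n)=A(m-1,A(m,n-1))##.)

[CODE=python]
from functools import lru_cache

@lru_cache(maxsize=None)
def A(m, n):
    """Standard two-argument Ackermann function."""
    if m == 0:
        return n + 1
    if n == 0:
        return A(m - 1, 1)
    return A(m - 1, A(m, n - 1))

for n in range(6):
    assert A(1, n) == n + 2
    assert A(2, n) == 2 * n + 3
    assert A(3, n) == 2 ** (n + 3) - 3
print("all checks passed")
[/CODE]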
 
  • Like
Likes Erland and micromass
  • #94
Problem 10:
$$\int_{-1}^{1}\frac{dx}{x^2+1}=\frac{\pi }{2}$$
$$\int_{-1}^{1}\frac{e^{x}+1}{(x^2+1)(e^{x}+1)}\,dx=\frac{\pi }{2}$$
$$\int_{-1}^{1}\frac{e^{x}}{(x^2+1)(e^{x}+1)}\,dx+\int_{-1}^{1}\frac{1}{(x^2+1)(e^{x}+1)}\,dx=\frac{\pi }{2}$$
We claim that the two integrals are equal, which is true if
$$\int_{-1}^{1}\frac{1-e^{x}}{(x^2+1)(e^{x}+1)}\,dx=0$$
That would be true if ##\frac{1-e^{x}}{(x^2+1)(e^{x}+1)}## is an odd function:
$$f(-x)=-f(x)$$
$$\frac{1-e^{-x}}{(x^2+1)(e^{-x}+1)}=-\frac{1-e^{x}}{(x^2+1)(e^{x}+1)}$$
$$\frac{-1+e^{x}}{(x^2+1)(e^{x}+1)}=-\frac{1-e^{x}}{(x^2+1)(e^{x}+1)}$$
so
$$\int_{-1}^{1}\frac{e^{x}}{(x^2+1)(e^{x}+1)}\,dx=\int_{-1}^{1}\frac{1}{(x^2+1)(e^{x}+1)}\,dx$$
$$\int_{-1}^{1}\frac{1}{(x^2+1)(e^{x}+1)}\,dx=\frac{\pi }{4}$$
This problem is very good, and it has a very beautiful relationship:
$$\int_{-1}^{1}\frac{1}{(x^2+1)(e^{-x}+1)}\,dx+\int_{-1}^{1}\frac{1}{(x^2+1)(e^{x}+1)}\,dx=\frac{\pi }{2}$$

Problem 6:
$$I=\int_{0}^{\frac{\pi }{2}}\frac{\sin(x)\cos(x)}{x+1}\,dx=\frac{1}{2}\left( \left[ \frac{\sin^2(x)}{x+1} \right]_{0}^{\frac{\pi }{2}}+\int_{0}^{\frac{\pi }{2}}\frac{\sin^2(x)}{(x+1)^2}\,dx \right)=\frac{1}{2}\left( \frac{2}{\pi+2}+ \frac{1}{2}\left( \int_{0}^{\frac{\pi }{2}}\frac{dx}{(x+1)^2}-\int_{0}^{\frac{\pi }{2}}\frac{\cos(2x)}{(x+1)^2}\,dx \right)\right)$$
$$=\frac{1}{\pi +2}+\frac{\pi}{4\pi+8}-\frac{1}{4}\int_{0}^{\frac{\pi}{2}}\frac{\cos(2x)}{(x+1)^2}\,dx$$
$$=\frac{1}{\pi +2}+\frac{\pi}{4\pi+8}-\frac{1}{4}\int_{1}^{1+\frac{\pi}{2}}\frac{\cos(2u-2)}{u^2}\,du$$
$$=\frac{1}{\pi +2}+\frac{\pi}{4\pi+8}-\frac{1}{4}\left(\cos(2)\int_{1}^{1+\frac{\pi}{2}}\frac{\cos(2u)}{u^2}\,du+\sin(2)\int_{1}^{1+\frac{\pi}{2}}\frac{\sin(2u)}{u^2}\,du \right)$$
$$I=\frac{1}{\pi +2}+\frac{\pi}{4\pi+8}-\frac{1}{2}\left(\cos(2)\int_{2}^{2+\pi}\frac{\cos(u)}{u^2}\,du+\sin(2)\int_{2}^{2+\pi}\frac{\sin(u)}{u^2}\,du \right)$$
Introduce ##M## for the expression in parentheses:
$$I=\frac{1}{\pi +2}+\frac{\pi}{4\pi+8}-\frac{M}{2}$$

Now the other integral:

$$J=\int_{0}^{\pi }\frac{\cos(x)}{(x+2)^2}\,dx=\int_{2}^{\pi +2}\frac{\cos(u-2)}{u^2}\,du=\cos(2)\int_{2}^{2+\pi}\frac{\cos(u)}{u^2}\,du+\sin(2)\int_{2}^{2+\pi}\frac{\sin(u)}{u^2}\,du$$
$$J=M$$
$$I=\frac{1}{\pi +2}+\frac{\pi}{4\pi+8}-\frac{J}{2}$$
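(Both results are easy to confirm numerically; a sketch assuming SciPy is available.)

[CODE=python]
import numpy as np
from scipy.integrate import quad

# Problem 10: integral over [-1,1] of 1/((x^2+1)(e^x+1)) should be pi/4.
p10, _ = quad(lambda x: 1.0 / ((x ** 2 + 1) * (np.exp(x) + 1)), -1.0, 1.0)
print(p10, np.pi / 4)

# Problem 6: check that I = 1/(pi+2) + pi/(4*pi+8) - J/2.
I_val, _ = quad(lambda x: np.sin(x) * np.cos(x) / (x + 1), 0.0, np.pi / 2)
J_val, _ = quad(lambda x: np.cos(x) / (x + 2) ** 2, 0.0, np.pi)
print(I_val, 1 / (np.pi + 2) + np.pi / (4 * np.pi + 8) - J_val / 2)
[/CODE]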
 
  • Like
Likes mfb
  • #95
In Problem 2) ("Let p≠0 be a real number ...") it is interesting to look at what happens as p → 0. The problem is still meaningful in that case!
 
  • #96
So what about the answer to the birds-on-a-wire problem? I'm really eager to find out what the answer and proof are. :)
 
  • #97
A clue to advanced problem 3 would also be appreciated...
 
  • #98
CHALLENGES FOR HIGH SCHOOL AND FIRST YEAR UNIVERSITY:

1) Let A, B, C, D be complex numbers with length 1. Prove that if A+B+C+D=0, then these four numbers form a rectangle.

2) On an arbitrary triangle, we erect an equilateral triangle on each side. Prove that the centroids of these three triangles form an equilateral triangle.
Are they unsolved?
 
  • #99
parshyaa said:
CHALLENGES FOR HIGH SCHOOL AND FIRST YEAR UNIVERSITY:

1) Let A, B, C, D be complex numbers with length 1. Prove that if A+B+C+D=0, then these four numbers form a rectangle.

2) On an arbitrary triangle, we erect an equilateral triangle on each side. Prove that the centroids of these three triangles form an equilateral triangle.
Are they unsolved?
Hey, I got the answer to the second of these questions. Please tell me whether it is solved or unsolved.
 
  • #100
[two attached images]
 
  • Like
Likes MAGNIBORO
  • #101
If they are not marked as solved (and if the last 2-3 posts don't cover them), then no one posted a solution yet.
 
  • #102
mfb said:
If they are not marked as solved (and if the last 2-3 posts don't cover them), then no one posted a solution yet.
It means they are unsolved, thanks.
 
  • #104
Where is micromass?
 
  • #105
Currently too busy to maintain the challenge threads.
 
  • Like
Likes parshyaa
