Using a Logarithmic Transformation for a Simpler Random Walk Model

In summary, the thread works through the product martingale ##M_n = X_1X_2\cdots X_n##: verifying the martingale property, computing ##E(\log(M_n))##, proving ##M_{\infty}=0## via the law of large numbers, using optional sampling to find the probability that ##M_n## ever reaches 64, and asking whether there is a finite ##C## such that ##\mathbb{E}[M^2_n]\leq C## for all ##n##. Along the way, several proposed answers are checked and corrected.
  • #1
WMDhamnekar
Homework Statement
Let ## X_1, X_2, . . . ## be independent, identically distributed random variables with ##\mathbb{P}\{X_j=2\} =\frac13 , \mathbb{P} \{ X_j = \frac12 \} =\frac23 ##

Let ##M_0=1 ## and for ##n \geq 1, M_n= X_1X_2... X_n ##

1. Show that ##M_n## is a martingale.

2. Explain why ##M_n## satisfies the conditions of the martingale convergence theorem.

3. Let ##M_{\infty}= \lim\limits_{n\to\infty} M_n.## Explain why ##M_{\infty}=0## (Hint: there are at least two ways to show this. One is to consider ##\log M_n ## and use the law of large numbers. Another is to note that with probability one ##M_{n+1}/M_n## does not converge.)

4. Use the optional sampling theorem to determine the probability that ## M_n## ever attains a value as large as 64.

5. Does there exist a ##C < \infty## such that ##\mathbb{E}[M_n^2] \leq C## for all ##n##?
Relevant Equations
No relevant equations
Answer to 1.
[Attachment: working for part 1 (image, not reproduced)]
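For reference, a standard verification runs as a one-line computation: since the ##X_i## are independent with ##\mathbb{E}[X_{n+1}] = 2\cdot\frac13 + \frac12\cdot\frac23 = 1##,
$$\mathbb{E}[M_{n+1}\mid X_1,\dots,X_n] = M_n\,\mathbb{E}[X_{n+1}] = M_n,$$
and ##\mathbb{E}|M_n| = 1 < \infty## for every ##n##, which is exactly the martingale property.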

Answer to 2.

[Attachment: working for part 2 (image, not reproduced)]
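For reference, the conditions needed here are that ##M_n \ge 0## and ##\sup_n \mathbb{E}|M_n| = 1 < \infty##, so the martingale convergence theorem applies and ##M_n## converges almost surely to a finite limit ##M_{\infty}##.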


How would you answer the rest of the questions, 4 and 5?
 
  • #2
Why does ##\ln(M_n)=\ln(M_{n-1})##?
 
  • #3
Office_Shredder said:
Why does ##\ln(M_n)=\ln(M_{n-1})##?
Sorry, if ##\log{M_n} =0, \log{M_{n-1}}=\frac13 \log{2}## or ##\frac23 \log{\frac12}##

Whatever it may be, we can't say ## M_{\infty} =0 ##
 
  • #4
WMDhamnekar said:
Sorry, if ##\log{M_n} =0, \log{M_{n-1}}=\frac13 \log{2}## or ##\frac23 \log{\frac12}##

Whatever it may be, we can't say ## M_{\infty} =0 ##

This feels like you're looking backwards - why are you computing ##\log(M_{n-1})##?

What is ##E(\log(M_{n+1})-\log(M_n))##?
 
  • #5
Office_Shredder said:
This feels like you're looking backwards - why are you computing ##\log(M_{n-1})##?

What is ##E(\log(M_{n+1})-\log(M_n))##?
##(\log\{E[M_{n+1}]=1\} -\log\{E[M_n=1]\}) = 0-0 =0##
 
  • #6
The expected value of the log of ##M_n## is not 1. Your answer for part three is wrong; the book is right. Try computing that expected value from the definition.
 
  • #7
Office_Shredder said:
The expected value of the log of ##M_n## is not 1. Your answer for part three is wrong; the book is right. Try computing that expected value from the definition.
##E[X_n] = 2\times \frac13 + \frac12\times \frac23= 1##
##E[M_n]=E[X_1]E[X_2]\cdots E[X_n]= 1= E[M_0]##
 
  • #8
WMDhamnekar said:
##E[X_n] = 2\times \frac13 + \frac12\times \frac23= 1##
##E[M_n]=E[X_1]E[X_2]\cdots E[X_n]= 1= E[M_0]##

##E(\log(M_n))\neq \log(E(M_n))##!
 
  • #9
Office_Shredder said:
##E(\log(M_n))\neq \log(E(M_n))##!
That means you want to say ##\lim\limits_{n\to\infty}E[\log{M_{\{n+1\}}}=0]=0 ##

##\therefore## by using the Law of large numbers ##M_{\infty}=0##
The author said another way to prove ##M_{\infty}=0## is ## \mathbb{P}[\lim\limits_{n\to\infty}\displaystyle\sum_{n=0}^{n}\frac{M_{n+1}}{M_n}=\infty]##, i.e. the sequence ##\frac{M_{n+1}}{M_n}## does not converge.

Answer to 4.
After using the optional sampling theorem, I determined that ##\mathbb{P}[M_n=64]=\frac{1}{3^6}##. Is this answer correct?
 
  • #10
WMDhamnekar said:
That means you want to say ##\lim\limits_{n\to\infty}E[\log{M_{\{n+1\}}}=0]=0 ##

##\therefore## by using the Law of large numbers ##M_{\infty}=0##
The author said another way to prove ##M_{\infty}=0## is ## \mathbb{P}[\lim\limits_{n\to\infty}\displaystyle\sum_{n=0}^{n}\frac{M_{n+1}}{M_n}=\infty]##, i.e. the sequence ##\frac{M_{n+1}}{M_n}## does not converge.

This notation doesn't make any sense to me to be honest. Maybe we can start with, what is ##E(\log(M_1))##?

Answer to 4.
After using the optional sampling theorem, I determined that ##\mathbb{P}[M_n=64]=\frac{1}{3^6}##. Is this answer correct?

That's the odds you hit 2 six times in a row at the start, so it has to be too small. Can you show your work?
 
  • #11
Office_Shredder said:
This notation doesn't make any sense to me to be honest. Maybe we can start with, what is ##E(\log(M_1))##?
That's the odds you hit 2 six times in a row at the start, so it has to be too small. Can you show your work?
Sorry, the answers to 3 and 4 are wrong. The answer to 4 is ##\frac{1}{2^6}##.

Now, let me move on to answer 3.
##M_n= X_1X_2...X_n \therefore M_1= X_1.## Now ##X_1## may be 2 or ##\frac12 \therefore \log{M_1}=0, \therefore E[M_1]=1=E[M_0]##. So my answer is still ##M_{\infty}=1##, but the author said ##M_{\infty}=0##.

How is that?😕🤔
 
  • #12
WMDhamnekar said:
Sorry, the answers to 3 and 4 are wrong. The answer to 4 is ##\frac{1}{2^6}##.

Now, let me move on to answer 3.
##M_n= X_1X_2...X_n \therefore M_1= X_1.## Now ##X_1## may be 2 or ##\frac12 \therefore \log{M_1}=0, \therefore E[M_1]=1=E[M_0]##. So my answer is still ##M_{\infty}=1##, but the author said ##M_{\infty}=0##.

How is that?😕🤔

You haven't done anything with the fact that 2 and 1/2 are not equally likely!

##E(\log(M_1))= \frac{1}{3}\log(2)+\frac{2}{3}\log(\frac{1}{2})##. This is *not* equal to 0.
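For reference, this evaluates to ##\mathbb{E}[\log M_1] = \frac13\log 2 - \frac23\log 2 = -\frac13\log 2 < 0##, and the strong law of large numbers then finishes part 3:
$$\frac1n \log M_n = \frac1n\sum_{i=1}^n \log X_i \longrightarrow -\frac13\log 2 \quad\text{almost surely},$$
so ##\log M_n \to -\infty## and hence ##M_{\infty} = 0##.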
 
  • #13
Office_Shredder said:
You haven't done anything with the fact that 2 and 1/2 are not equally likely!

##E(\log(M_1))= \frac{1}{3}\log(2)+\frac{2}{3}\log(\frac{1}{2})##. This is *not* equal to 0.
Answer to 3.

## \because \lim\limits_{n\to\infty} M_n= 2^n\times (\frac13)^n +(\frac12)^n \times (\frac23)^n =0 \therefore M_{\infty}=0##

Answer to 5.
Yes. There exists a ##(C < \infty ): \mathbb{E} [M^2_n]\leq C \forall n ##
 
  • #14
I think your answer to 4 is correct, but without seeing any work I can't say if you got it the right way.

It still looks like you're just writing random strings of symbols for number 3 (like literally, are you just putting stuff into chatgpt?) The limit you've written doesn't correspond to the limit of any object that depends on n in the problem. We don't have to cover it if you just wanted to focus on the later parts though.
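For reference, a sketch of how the optional sampling argument for part 4 usually goes: let ##T = \min\{n : M_n = 64\}##. The stopped process ##M_{T\wedge n}## is a martingale bounded by 64, so ##1 = \mathbb{E}[M_0] = \mathbb{E}[M_{T\wedge n}]## for every ##n##; letting ##n \to \infty## and using ##M_{\infty} = 0## on ##\{T = \infty\}## together with bounded convergence gives ##1 = 64\,\mathbb{P}(T < \infty)##, i.e. the probability that ##M_n## ever reaches 64 is ##\frac{1}{64} = \frac{1}{2^6}##.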
 
  • #15
Office_Shredder said:
I think your answer to 4 is correct, but without seeing any work I can't say if you got it the right way.

It still looks like you're just writing random strings of symbols for number 3 (like literally, are you just putting stuff into chatgpt?) The limit you've written doesn't correspond to the limit of any object that depends on n in the problem. We don't have to cover it if you just wanted to focus on the later parts though.
Answer to 5.
[Attachment: ChatGPT's answer to part 5 (image, not reproduced)]

[Attachment: continuation of ChatGPT's answer to part 5 (image, not reproduced)]

Is the above answer correct?
Note: This answer was provided to me by ChatGPT.

I don't understand this answer from the second step onwards. If it is correct, would any member explain it to me?
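(For reference, a computation directly from the definitions, independent of whatever the attachment argues: by independence, ##\mathbb{E}[M_n^2] = \prod_{i=1}^n \mathbb{E}[X_i^2] = \left(4\cdot\frac13 + \frac14\cdot\frac23\right)^n = \left(\frac32\right)^n##, so the question of a uniform bound ##C## comes down to whether ##(3/2)^n## stays bounded in ##n##.)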
 

  • #16
I don't agree with ChatGPT's answer that I posted in #15.
My own computed answer is as follows:
[Attachment: my own working for part 5 (image, not reproduced)]


Is my answer correct?
 
  • #17
Wouldn’t it be a lot simpler to write ##Y_i=\log_2(X_i)##, making ##\sum_{i=1}^n Y_i## a random walk?
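For reference, a sketch of how that reframing plays out: ##Y_i = \log_2(X_i)## equals ##+1## with probability ##\frac13## and ##-1## with probability ##\frac23##, so ##S_n = \sum_{i=1}^n Y_i## is a biased simple random walk with drift ##\mathbb{E}[Y_i] = -\frac13## and ##M_n = 2^{S_n}##. The law of large numbers gives ##S_n \to -\infty## almost surely, hence ##M_{\infty} = 0## (part 3), and the gambler's-ruin formula for a downward-biased walk gives ##\mathbb{P}(S_n \text{ ever reaches } 6) = \left(\frac{1/3}{2/3}\right)^6 = \frac{1}{2^6}##, matching the optional-sampling answer to part 4.

A minimal Monte Carlo check of those numbers (this assumes NumPy; the path count, horizon, and seed are arbitrary illustrative choices, not taken from the thread):

Python:
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 100_000, 100   # illustrative choices

# Each step multiplies M by 2 with probability 1/3 or by 1/2 with probability 2/3.
factors = rng.choice([2.0, 0.5], size=(n_paths, n_steps), p=[1/3, 2/3])
M = np.cumprod(factors, axis=1)   # M[k, j] is M_{j+1} along path k (M_0 = 1 is implicit)

# Part 4: fraction of paths on which M ever reaches 64 (exact answer 1/64 ~ 0.0156).
print("P(M hits 64) ~", (M.max(axis=1) >= 64.0).mean())

# Martingale property: E[M_10] = 1.  (For large n the sample mean is unreliable,
# because the mean is carried by exponentially rare, exponentially large paths.)
print("sample mean of M_10 ~", M[:, 9].mean())

# Part 3: the typical path collapses towards 0 even though every mean equals 1.
print("median of M_100 ~", np.median(M[:, -1]))

The contrast between the last two printouts is exactly the point of part 3: the mean stays at 1 while almost every individual path dies out.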
 

FAQ: Using a Logarithmic Transformation for a Simpler Random Walk Model

What is a logarithmic transformation in the context of a random walk model?

A logarithmic transformation involves applying the natural logarithm to the values in a dataset. In the context of a random walk model, this transformation can help stabilize the variance and make the data more normally distributed, which simplifies the analysis and modeling of the random walk.

Why would one use a logarithmic transformation for a random walk model?

Using a logarithmic transformation for a random walk model can be beneficial because it can linearize exponential growth patterns, stabilize variance, and make the data more homoscedastic. This transformation can also make it easier to apply statistical techniques that assume normality and constant variance.

How does a logarithmic transformation affect the properties of a random walk?

A logarithmic transformation converts the multiplicative structure of a random walk into an additive one. This means that instead of modeling the changes as multiplicative factors, the transformed model deals with additive increments, which can be more appropriate for certain types of data, such as financial time series.

Are there any limitations to using a logarithmic transformation on a random walk model?

Yes, there are limitations. One major limitation is that the logarithmic transformation is only defined for positive values. This means it cannot be applied directly to datasets that contain zero or negative values. Additionally, interpreting the results can be more complex, and the transformation might not be suitable for all types of data or all modeling purposes.

How do you interpret the results after applying a logarithmic transformation to a random walk model?

After applying a logarithmic transformation, the interpretation of the results changes. For example, if you are modeling financial returns, the transformed data represents logarithmic returns rather than absolute returns. This means that differences in the transformed data correspond approximately to percentage changes (the approximation is best for small moves) rather than absolute changes. Understanding this distinction is crucial for accurate interpretation and analysis.
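A minimal illustration of that interpretation (the prices below are made-up numbers used only to show the mechanics; it assumes NumPy):

Python:
import numpy as np

# Hypothetical price series, chosen only for illustration.
prices = np.array([100.0, 102.0, 101.0, 105.0, 104.0])

simple_returns = prices[1:] / prices[:-1] - 1   # ordinary percentage changes
log_returns = np.diff(np.log(prices))           # differences of the log-transformed series

print(simple_returns)   # close to the log returns for these small moves
print(log_returns)

# Log returns add across periods, which is the additive structure referred to above:
print(log_returns.sum(), np.log(prices[-1] / prices[0]))   # identical up to rounding

The last line shows why the transformed series behaves like a random walk with additive increments: summing the per-period log returns recovers the log of the total price ratio.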
