How to prove asymptotic properties

  • Thread starter jostpuur
  • #1
jostpuur
I'm interested in solutions of an equation

[tex]
f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}
[/tex]

with some positive initial value [itex]f(0)>0[/itex], and with positive constants [itex]A,B>0[/itex].

First question: Does an explicit formula exist? I couldn't figure it out.

Second question:

I see that [itex]f(x)>0\implies f'(x)<0[/itex], and on the other hand the constant function [itex]f=0[/itex] is a solution for all [itex]x>0[/itex]. So clearly [itex]0< f(x)<f(0)[/itex] will hold for all [itex]0<x<\infty[/itex]. Therefore for large [itex]x[/itex] we have [itex]Af(x)+Bx^2\approx Bx^2[/itex] and

[tex]
f'(x)\approx -\frac{f(x)}{Bx}
[/tex]

which implies that in some sense

[tex]
f(x) \sim x^{-\frac{1}{B}}
[/tex]

will probably hold. The second question is: how do you prove something rigorous from this approximation? The estimates

[tex]
f(x) = O\big(x^{-\frac{1}{B}}\big)
[/tex]

and

[tex]
f(x)= Cx^{-\frac{1}{B}}+ O\big(x^{-\frac{1}{B}-1}\big)
[/tex]

probably hold, but how do you prove them?
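(The guess can at least be sanity-checked numerically. This is only an illustration, not part of any argument: the constants A = B = f(0) = 1 are arbitrary test values, and the integrator is a plain hand-rolled RK4. If [itex]f(x)\sim Cx^{-1/B}[/itex], the log-log slope of [itex]f[/itex] should approach [itex]-1/B=-1[/itex] here.)

```python
import math

# ODE: f'(x) = -x f / (A f + B x^2), with arbitrary test values
A, B, F0 = 1.0, 1.0, 1.0

def rhs(x, f):
    return -x * f / (A * f + B * x * x)

def rk4(x0, f0, x1, h=0.01):
    """Integrate f from x0 to x1 with classical RK4; return f(x1)."""
    x, f = x0, f0
    n = int(round((x1 - x0) / h))
    for _ in range(n):
        k1 = rhs(x, f)
        k2 = rhs(x + h / 2, f + h * k1 / 2)
        k3 = rhs(x + h / 2, f + h * k2 / 2)
        k4 = rhs(x + h, f + h * k3)
        f += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return f

f100 = rk4(0.0, F0, 100.0)
f1000 = rk4(0.0, F0, 1000.0)
# log-log slope between x = 100 and x = 1000; the guess predicts about -1/B
slope = (math.log(f1000) - math.log(f100)) / (math.log(1000.0) - math.log(100.0))
print(slope)  # close to -1 for B = 1
```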
 
  • #2
jostpuur said:
I'm interested in solutions of an equation

[tex]
f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}
[/tex]

with some positive initial value [itex]f(0)>0[/itex], and with positive constants [itex]A,B>0[/itex].
Gee, this looks familiar. I thought we'd save this for a rainy day! :-p

jostpuur said:
First question: Does an explicit formula exist? I couldn't figure it out.

Second question:

I see that [itex]f(x)>0\implies f'(x)<0[/itex], and on the other hand the constant function [itex]f=0[/itex] is a solution for all [itex]x>0[/itex]. So clearly [itex]0< f(x)<f(0)[/itex] will hold for all [itex]0<x<\infty[/itex]. Therefore for large [itex]x[/itex] we have [itex]Af(x)+Bx^2\approx Bx^2[/itex] and

[tex]
f'(x)\approx -\frac{f(x)}{Bx}
[/tex]

which implies that in some sense

[tex]
f(x) \sim x^{-\frac{1}{B}}
[/tex]

will probably hold. The second question is: how do you prove something rigorous from this approximation? The estimates

[tex]
f(x) = O\big(x^{-\frac{1}{B}}\big)
[/tex]

and

[tex]
f(x)= Cx^{-\frac{1}{B}}+ O\big(x^{-\frac{1}{B}-1}\big)
[/tex]

probably hold, but how do you prove them?
How do you think we should start? You came to these conclusions, so you might know the path to get to them and just might not have realized it. Your intuition might be the best place to start with the rigor here.
 
  • #3
Mandelbroth said:
Gee, this looks familiar. I thought we'd save this for a rainy day! :-p

Yes, it's no secret that this comes from my attempts at the raindrop-in-mist problem from the thread "Solved Physics Challenge I: The Raindrop", solved by mfb and voko.

This ODE problem has become quite distant from the physics problem, so IMO it deserves its own thread. Also, techniques that solve this problem could turn out useful with other ODEs.

Mandelbroth said:
How do you think we should start? You came to these conclusions, so you might know the path to get to them and just might not have realized it. Your intuition might be the best place to start with the rigor here.

No! No! The most promising way to start is of course to recall the rigorous definitions of the asymptotic notations :wink:

jostpuur said:
[tex]
f(x) = O\big(x^{-\frac{1}{B}}\big)
[/tex]

This means that there exist [itex]C,R>0[/itex] such that

[tex]
x\geq R\implies f(x)\leq Cx^{-\frac{1}{B}}
[/tex]

Let's denote [itex]g(x)=Cx^{-\frac{1}{B}}[/itex]. The most obvious way to prove the desired inequality is to prove [itex]f(R)\leq g(R)[/itex] and [itex]f'(R)\leq g'(R)[/itex] which would imply it. In other words I should try to prove

[tex]
-\frac{xf(x)}{Af(x) + Bx^2} \leq -\frac{g(x)}{Bx}
[/tex]

for [itex]x\geq R[/itex]. The inequality is equivalent to

[tex]
Cx^{-\frac{1}{B}} \leq \frac{Bx^2 f(x)}{Af(x) + Bx^2}
[/tex]

How do you prove this? For large [itex]x[/itex] the right side is roughly [itex]f(x)[/itex], which contradicts the inequality we are trying to prove.

So I'm feeling a little confused now... This doesn't look like it's going to work. :frown:
 
  • #4
jostpuur said:
[tex]
Cx^{-\frac{1}{B}} \leq \frac{Bx^2 f(x)}{Af(x) + Bx^2}
[/tex]

How do you prove this? For large [itex]x[/itex] the right side is roughly [itex]f(x)[/itex], which contradicts the inequality we are trying to prove.

So I'm feeling a little confused now... This doesn't look like it's going to work. :frown:
I thought we were trying to prove ##f(x)\sim x^{-1/B}##...?

I'm honestly not seeing the logic in some of this. You say that ##f(x)=0## is a solution, but then say that ##0<f(x)##. I was hoping you'd shed some more light on your thinking behind your expressions, which may then show a way to prove them. I'm off to bed now, but hopefully I'll be able to work on this in the morning.
 
  • #5
Mandelbroth said:
I thought we were trying to prove ##f(x)\sim x^{-1/B}##...?

I made it clear that I didn't mean anything specific with "[itex]\sim[/itex]" notation there.

jostpuur said:
... which implies that in some sense

[tex]
f(x) \sim x^{-\frac{1}{B}}
[/tex]

will probably hold.

Starting with the easiest results, I'm now trying to prove this:

jostpuur said:
...there exist [itex]C,R>0[/itex] such that

[tex]
x\geq R\implies f(x)\leq Cx^{-\frac{1}{B}}
[/tex]

How did you interpret "[itex]\sim[/itex]"? Do you see it as something simpler?
 
  • #6
jostpuur said:
How did you interpret "[itex]\sim[/itex]"? Do you see it as something simpler?
Typically, in topology and related areas, ##\sim## denotes an arbitrary equivalence relation. I assumed it meant something like "asymptotically equal to."

Suppose we have $$\frac{df}{dx}=-\frac{xf(x)}{Af(x)+Bx^2}=\operatorname{RHS}(x,f(x)).$$ Observe that ##|\frac{\partial \operatorname{RHS}}{\partial f}|=|\frac{Bx^3}{(Bx^2+Af)^2}|##. What happens as ##x## grows larger? Is there an interval of ##x## where ##\operatorname{RHS}## is continuous in ##x## and Lipschitz continuous in ##f##?
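(One possible answer, for later reference — a side note using only that [itex]Af\geq 0[/itex] in the denominator:

[tex]
\Big|\frac{\partial \operatorname{RHS}}{\partial f}\Big| = \frac{Bx^3}{(Bx^2+Af)^2} \leq \frac{Bx^3}{(Bx^2)^2} = \frac{1}{Bx}
[/tex]

so on any interval [itex][x_0,\infty)[/itex] with [itex]x_0>0[/itex] the right-hand side is Lipschitz continuous in [itex]f[/itex] with constant [itex]1/(Bx_0)[/itex], and the Lipschitz constant even shrinks as [itex]x[/itex] grows. Picard–Lindelöf then gives a unique solution on such intervals.)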
 
  • #7
jostpuur said:
The most obvious way to prove the desired inequality is to prove [itex]f(R)\leq g(R)[/itex] and [itex]f'(R)\leq g'(R)[/itex] which would imply it.

Small fix: I meant [itex]f'(x)\leq g'(x)[/itex] for [itex]x\geq R[/itex].
 
  • #8
jostpuur said:
I'm interested in solutions of an equation

[tex]
f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}
[/tex]

with some positive initial value [itex]f(0)>0[/itex], and with positive constants [itex]A,B>0[/itex].

First question: Does an explicit formula exist? I couldn't figure it out.

Write it as:
[tex]y'=-\frac{xy}{ay+bx^2}[/tex]
or:
[tex](xy)dx+(ay+bx^2)dy=0[/tex]
Now, can you find an integrating factor for that?
 
Last edited:
  • #9
jostpuur said:
[tex]
x\geq R\implies f(x)\leq Cx^{-\frac{1}{B}}
[/tex]

The way [itex]f[/itex] approaches zero seems to be very mysterious. For example, one might think it would be easier to prove

[tex]
f(x)\leq Cx^{-\frac{1}{B}+\epsilon}
[/tex]

for large [itex]x[/itex], but the condition

[tex]
f'(x)\leq C\Big(-\frac{1}{B}+\epsilon\Big)x^{-\frac{1}{B}-1+\epsilon}
[/tex]

eventually turns out to be equivalent to the claim that [itex]f[/itex] remains above some positive lower bound, which contradicts the convergence to zero.
 
  • #10
I just realized I've been careless with these derivative ideas. The condition [itex]f'\leq g'[/itex] only makes sense on finite intervals, or if [itex]f\to-\infty[/itex] faster than [itex]g\to-\infty[/itex], or something of that kind. If [itex]f\to 0[/itex] from the positive side faster than [itex]g\to 0[/itex], then [itex]g'\leq f'[/itex] is eventually inevitable. Oh dear... :redface: I'll need to check some ideas with logarithms next...

jackmell said:
[tex](xy)dx+(ay+bx^2)dy=0[/tex]
Now, can you find an integrating factor for that?

No, I don't know how to find an integrating factor for that.
 
  • #11
jostpuur said:
I'm interested in solutions of an equation

[tex]
f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}
[/tex]

with some positive initial value [itex]f(0)>0[/itex], and with positive constants [itex]A,B>0[/itex].

First question: Does an explicit formula exist? I couldn't figure it out.

I can get an implicit formula valid for [itex]x > 0[/itex].

First introduce the dependent variable [itex]u = f(x)/f(0)[/itex] so that [itex]u(0) = 1[/itex]. Then introduce the independent variable [itex]t = x/\sqrt{f(0)^2A}[/itex]. Then we get the ODE
[tex]
\dot u = -\frac{ut}{u + ct^2}
[/tex]
where [itex]c = Bf(0) > 0[/itex]. Now
[tex]
-\frac{ut}{u + ct^2}
= -ut(ct^2)^{-1}\left(1 + \frac{u}{ct^2}\right)^{-1} = - \frac{u}{ct}
\left(1 + \frac{u}{ct^2}\right)^{-1}[/tex]
which suggests the substitution [itex]v = u/(ct^2)[/itex]. Then
[tex]
\dot u = -vt\frac{1}{1 + v}
[/tex]
and
[tex]
\dot v = \frac{\dot u}{ct^2} - \frac{2u}{ct^3} = \frac{\dot u}{ct^2} - \frac{2v}{t}
= -\frac{v}{ct}\left(\frac{1}{1 + v} + 2c\right)[/tex]
which is separable:
[tex]
\frac{1 + v}{v(1 + 2c + 2cv)} \dot v = -\frac{1}{ct}
[/tex]
and has general solution (for [itex]t > 0[/itex])
[tex]
v^{1/(1 + 2c)}(1 + 2c + 2cv)^{(1 - 2c)/(2c(1 + 2c))} = Dt^{-1/c}
[/tex]
Solving that for [itex]f(x) = (B/A)x^2 v(x/\sqrt{Af(0)^2})[/itex] is not attractive.

However, when [itex]t[/itex] is large, I think we have [itex]1 + 2c + 2cv = 1 + 2c + 2u/t^2 \sim 1 + 2c[/itex] so [itex]v \sim kt^{-(1 + 2c)/c}[/itex] for some constant [itex]k[/itex], and
[tex]f(x) \sim Kx^{2-(1 + 2c)/c} = Kx^{-1/(Bf(0))}[/tex]
for some other constant [itex]K[/itex].
 
Last edited:
  • #12
Now comes a successful proof!

The goal:

[tex]
f(x) = O(x^{-\frac{1}{B}+\epsilon})\quad\quad\textrm{when}\;x\to\infty
[/tex]

for all [itex]\epsilon>0[/itex]. The claim means that we must find [itex]R,C>0[/itex] such that

[tex]
R\geq x\quad\implies\quad f(x)\leq C x^{-\frac{1}{B}+\epsilon}
[/tex]

The inequality is equivalent to

[tex]
\log(f(x)) \leq \log(C) + \Big(-\frac{1}{B} +\epsilon\Big)\log(x)
[/tex]

Once [itex]R[/itex] has been fixed, [itex]C[/itex] can be chosen so that the inequality holds at [itex]x=R[/itex]. Then we need to prove

[tex]
\frac{f'(x)}{f(x)}\leq \Big(-\frac{1}{B}+\epsilon\Big)\frac{1}{x}
[/tex]

for the remaining values [itex]x>R[/itex]. Using the ODE for [itex]f[/itex] we see that this is equivalent to

[tex]
1 - \epsilon B \leq \frac{Bx^2}{Af(x) + Bx^2}
[/tex]

Since the right side approaches one, the inequality holds for all sufficiently large [itex]x[/itex], so such an [itex]R[/itex] can be found.
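(For completeness, the threshold can be made explicit. Since [itex]f[/itex] is decreasing, [itex]f(x)\leq f(0)[/itex], so

[tex]
\frac{Bx^2}{Af(x)+Bx^2} \geq \frac{Bx^2}{Af(0)+Bx^2} \geq 1-\epsilon B
\quad\textrm{whenever}\quad
x^2 \geq \frac{(1-\epsilon B)Af(0)}{\epsilon B^2}
[/tex]

so for [itex]0<\epsilon<1/B[/itex] any [itex]R\geq\sqrt{Af(0)/(\epsilon B^2)}[/itex] works, and for [itex]\epsilon\geq 1/B[/itex] the inequality is trivial.)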

The asymptotic result has been proven! :cool:
 
  • #13
jostpuur said:
Now comes a successful proof!

The goal:

[tex]
f(x) = O(x^{-\frac{1}{B}+\epsilon})\quad\quad\textrm{when}\;x\to\infty
[/tex]

for all [itex]\epsilon>0[/itex]. The claim means that we must find [itex]R,C>0[/itex] such that

[tex]
R\geq x\quad\implies\quad f(x)\leq C x^{-\frac{1}{B}+\epsilon}
[/tex]

Should this not be [itex]x > R[/itex], since we're interested in the limit [itex]x \to \infty[/itex]?

The inequality is equivalent to

[tex]
\log(f(x)) \leq \log(C) + \Big(-\frac{1}{B} +\epsilon\Big)\log(x)
[/tex]

Once [itex]R[/itex] has been fixed, [itex]C[/itex] can be chosen so that the inequality holds at [itex]x=R[/itex].

Then we need to prove

[tex]
\frac{f'(x)}{f(x)}\leq \Big(-\frac{1}{B}+\epsilon\Big)\frac{1}{x}
[/tex]
for the remaining values [itex]x>R[/itex].

I can't follow the remainder of your proof. However, there is an alternative:

We know [itex]f[/itex] is strictly decreasing, because its derivative is everywhere negative. So we know that for all [itex]R > 0[/itex], if [itex]x > R[/itex] then [itex]f(x) < f(R)[/itex]. Also log is strictly increasing, so if there exist [itex]C[/itex] and [itex]R[/itex] such that
[tex]
\log f(R) \leq \log C + (-B^{-1} + \epsilon)\log R
[/tex]
then we have immediately that if [itex]x > R[/itex] then
[tex]\log f(x) < \log f(R) \leq \log C + (-B^{-1} + \epsilon)\log R \leq \log C + (-B^{-1} + \epsilon)\log x[/tex]
as required — though note the last step needs [itex]-B^{-1}+\epsilon\geq 0[/itex]; for [itex]\epsilon < B^{-1}[/itex] monotonicity alone is not enough.

Fix [itex]R[/itex]. Then the middle inequality holds provided [itex]C \geq f(R)/R^{-B^{-1} + \epsilon}[/itex]. Such a [itex]C[/itex] can always be found since the right hand side is finite.
 
  • #14
pasmith said:
[tex]f(x) \sim Kx^{2-(1 + 2c)/c} = Kx^{-1/(Bf(0))}[/tex]
for some other constant [itex]K[/itex].

This result contradicts the result I proved, for initial values [itex]f(0)>1[/itex].
 

