MathematicalPhysicist
Let f(x) be continuously differentiable on [0, ∞) such that the derivative f'(x) is bounded. Suppose that the integral [tex]\int_{a}^{\infty}|f(x)|dx[/tex] converges. Prove that f(x) → 0 as x → ∞.
Here's what I did:
Since f'(x) is bounded, there is some M > 0 such that |f'(x)| ≤ M for every x ≥ 0.
Because the integral [tex]\int_{a}^{\infty}|f(x)|dx[/tex] converges for some a > 0, the Cauchy criterion gives: for every ε > 0 there exists B > 0 such that for all b_1 > b_2 > B, [tex]\int_{b_2}^{b_1}|f(x)|dx<\epsilon[/tex]. We can then use the inequality [tex]\left|\int_{b_2}^{b_1}f(x)dx\right|\le\int_{b_2}^{b_1}|f(x)|dx<\epsilon[/tex].
The crucial point is that I need to show: for every ε > 0 there exists M' > 0 such that |f(x)| < ε for every x > M'.
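Equivalently (my restatement, arguing by contradiction): if f(x) does not tend to 0, then there exist some ε > 0 and a sequence x_n → ∞ with [tex]|f(x_n)|\ge\epsilon[/tex], and it would be enough to show that this contradicts the Cauchy criterion above.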
I think I can use the mean value theorem for integrals here: there exists a point c in (b_2, b_1) such that [tex]\int_{b_2}^{b_1}f(t)dt=f(c)(b_1-b_2)[/tex], so for b_2 > B we get |f(c)|(b_1 − b_2) < ε, i.e. |f(c)| < ε/(b_1 − b_2). But this doesn't work, because b_1 and b_2 aren't constants. Perhaps I should instead use the fact that the derivative is bounded: by Lagrange's theorem (the mean value theorem), |f(b_1) − f(b_2)| ≤ M(b_1 − b_2). Still, I don't see how to connect these two theorems to prove the statement.
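For what it's worth, here is a sketch of how I imagine the two bounds might combine (the constants ε/2 and ε/(2M) are my own guesses, so I'm not sure this is right): if [tex]|f(x_0)|\ge\epsilon[/tex] at some point x_0 > B, then the Lagrange bound gives [tex]|f(x)|\ge|f(x_0)|-M|x-x_0|\ge\frac{\epsilon}{2}[/tex] whenever [tex]|x-x_0|\le\frac{\epsilon}{2M}[/tex], and then [tex]\int_{x_0}^{x_0+\epsilon/(2M)}|f(x)|dx\ge\frac{\epsilon}{2}\cdot\frac{\epsilon}{2M}=\frac{\epsilon^2}{4M}[/tex], which stays bounded away from 0 however large x_0 is.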
Is that the right direction? Any hints?