Which of the following statements are true? (Real Analysis question)

  • #1
CGandC

Problem:
Let ##f:[0, \infty) \rightarrow \mathbb{R}## be a positive function s.t. for all ## M > 0 ## it occurs that ## f ## is integrable on ## [0,M] ##. Which of the following statements are true?

A. If ##\lim _{x \rightarrow+\infty} f(x)=0## then ##\int_{0}^{\infty} f(x) d x## exists and is finite.
B. If ##\int_{0}^{\infty} f(x) d x## exists and is finite then ##\lim _{x \rightarrow+\infty} f(x)=0##.
C. If ##\int_{0}^{\infty} f(x) d x## exists and is finite then ##\int_{0}^{\infty} f\left(x^{2}\right) d x## exists and is finite.
D. None of the above.

Context: I'm getting ready for an exam and I'm solving past exams. The above question was part of a past exam, but I don't have any answers, so I'm asking here to be sure. Can you please help? I'm not sure about my solution.

Attempt:
I marked 'D' as the answer, here are my counter examples for A,B,C:
Counter-example for ## A##:
## f(x)=\begin{cases} 0 &\text{if}\; x \in [0,1] \\ \frac{1}{\sqrt{x}} &\text{if}\; 1<x \end{cases} ##
Counter-example for ## B##: ## f(x) = \arctan x ##
Counter-example for ## C ##: the same ## f ## as in the counter-example for A.
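A quick check of this ##f## (my own verification, since it is used for both A and C): ##f(x) \to 0## as ##x \to \infty##, yet
$$ \int_{0}^{M} f(x)\, dx = \int_{1}^{M} \frac{dx}{\sqrt{x}} = 2\sqrt{M} - 2 \longrightarrow \infty \quad (M \to \infty), $$
so the hypothesis of A holds while its conclusion fails.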
 
  • #2
A counterexample for A is a case where the first part of A (call it A1) is true and the second part (call it A2) is false. Your counterexample for A is correct.
A counterexample for C is one where C1 is true and C2 is false.
But C1 is the same as A2. So a counterexample for A has C1 false, so cannot be a counterexample for C!

Actually, C is true. To prove it, try the substitution ##u = x^2##. You'll need to use the fact that f(x) is everywhere positive.

Also, f(x) = arctan x is not a counterexample for B because it has B1 false - the integral diverges. An easy counterexample is the function that is zero everywhere except at integer values of x where it has value 1. That satisfies the conditions above. If instead it required the function to be right or left continuous, you'd need to change those zero-length blips up to 1 to have nonzero length but of decreasing length such that the sum of the infinite sequence of blip lengths is finite - easily done.
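For concreteness, here is one way to write down the nonzero-length version of those blips (a sketch; the widths ##2^{-n}## are my own choice, and any summable sequence of widths works): let ##f(x)=1## for ##x \in [n,\, n+2^{-n}]##, ##n = 1, 2, 3, \dots##, and ##f(x)=0## otherwise. Then
$$ \int_{0}^{\infty} f(x)\, dx = \sum_{n=1}^{\infty} 2^{-n} = 1 < \infty, $$
but ##f(n)=1## for every integer ##n \ge 1##, so ##f(x) \not\to 0## as ##x \to \infty##.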
 
  • #3
Thanks, I understand my mistakes now.
Currently I'm trying to prove ## C ##. I did as you proposed: ## \int_{0}^{\infty} f(x^2)\, dx = \{\, u = x^2,\ \sqrt{u} = x,\ du = 2x\, dx = 2\sqrt{u}\, dx \,\} = \frac{1}{2} \int_{0}^{\infty} \frac{f(u)}{\sqrt{u}}\, du ##.

But I think this actually disproves ## C ##; here's why:
By the integral direct comparison test: taking ## \int_{0}^{\infty} \frac{f(u)/\sqrt{u}}{1/\sqrt{u}}\, du = \int_{0}^{\infty} f(u)\, du ##, which converges to a finite, positive value, while ## \int_{0}^{\infty} \frac{1}{\sqrt{u}}\, du ## diverges, I conclude that ## \frac{1}{2} \int_{0}^{\infty} \frac{f(u)}{\sqrt{u}}\, du ## diverges.

Relevant equations:
Using the integral comparison test in the sense of comparing integrals with each other, as here: https://web.njit.edu/~bg263/Lecture notes and supplements/L19.pdf (and not in the context of sums).
 
  • #4
There's a problem in my attempted disproof above, since I can't use the integral direct comparison test when the integration has two problematic points, and I made the mistake of integrating the ratio ## \frac{f(u)/\sqrt{u}}{1/\sqrt{u}} ## instead of taking its limit.
Instead, ## \int_{0}^{\infty} f(x^2 )\, dx = \frac{1}{2} \int_{0}^{\infty} \frac{f( u )}{ \sqrt{u} }\, du = \frac{1}{2} \int_{0}^{1} \frac{f( u )}{ \sqrt{u} }\, du + \frac{1}{2} \int_{1}^{\infty} \frac{f( u )}{ \sqrt{u} }\, du ##.
## \frac{1}{2} \int_{0}^{1} \frac{f( u )}{ \sqrt{u} }\, du ## converges by the limit comparison test with ## \frac{1}{\sqrt{u}} ##.
However, I can't justify whether ##\frac{1}{2} \int_{1}^{\infty} \frac{f( u )}{ \sqrt{u} }\, du ## converges or not, and that is a problem.

But, I have a counter-example to ## C ##, here it is:
## f(x)=\frac 1 {\sqrt x} ## for ## 0 <x<1 ## and ## 0 ## for ## x \geq 1 ##. What do you think?
 
  • #5
If ##u## is big, ##f(u)/\sqrt{u}< f(u)##. So you can use a comparison test.
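Spelling that comparison out (assuming, as in C1, that ##\int_0^\infty f(u)\, du## converges): since ##f## is positive and ##\sqrt{u} \ge 1## for ##u \ge 1##,
$$ 0 \le \frac{f(u)}{\sqrt{u}} \le f(u) \quad \text{for } u \ge 1, \qquad \text{so} \qquad \int_{1}^{\infty} \frac{f(u)}{\sqrt{u}}\, du \le \int_{1}^{\infty} f(u)\, du < \infty. $$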

I like your counterexample, and I think it points out that your attempt to prove C was backwards. The part you said was easy is actually not (the integral near 0 is tricky to think about) and the part you said was hard is actually irrelevant.

This is a bit of a trick question, I think: if you focus on the convergence of the integral near infinity, you get the wrong answer.
 
  • #6
Office_Shredder said:
If u is big
But the integration limits are from 0 to infinity, so when ##u## approaches zero we have $$\frac{f(u)}{\sqrt u}>f(u)$$
 
  • #7
I think whether ## C ## is true hinges on the definition of "integrable".
In the problem I'm given that for all ## M>0 ##, ## f ## is integrable on ## [0,M] ##. Now, "integrable" can mean that ## f ## is bounded on ## [0,M] ## and its integral there exists and is finite, OR that ## f ## is not necessarily bounded on the interval but its improper integral over the interval exists and is finite.

If we accept the first definition of "integrable", then the counter-example I gave is incorrect, as it is not bounded on ## [0,1] ##, and that is the definition I use in my course.
(Under the second definition, the counter-example I gave would indeed be correct.)

So now, given that the theorem is correct, I can't see how to prove it using the substitution ## u = x^2 ##.
 
  • #8
CGandC said:
So now, given that the theorem is correct, I can't see how to prove it using the substitution ## u = x^2 ##.

Given which theorem is correct - assuming C is true?

I think it's just not true. I agree that under the technical definition of Riemann integrability, ##1/\sqrt{u}## is not integrable on ##[0,1]##, because your partitions are going to have to include 0 as an endpoint and your function is not defined there. This isn't actually a big issue, though: just define ##f(0)=0## and ##f(u)=1/\sqrt{u}## for ##u>0##.

No definition of integrability requires the function to be bounded as far as I know.
 
  • #9
Your counterexample for C in post #4 works as is. I erred in saying C is true.

I can't see the problem you allude to in post #6. Your counterexample in post #4 seems to me to work fine, except you haven't defined f(0). You can just do that as @Office_Shredder points out, without changing any of the integrals.

First show that C1 is true:
$$
\int_0^\infty f(x)\, dx =
\int_0^1 \frac{dx}{\sqrt x} =
\left[2\sqrt x\right]_0^1 = 2(1 - 0) = 2$$
So C1 is true. Then we check C2:
$$
\int_0^\infty f(x^2)\, dx =
\int_0^1 \frac{dx}{\sqrt {x^2}} =
\int_0^1 \frac{dx}{x} =
\left[\log x\right]_0^1 = 0 - (-\infty) = \infty$$
So C2 is false.
So C is false.
Note the second part of that is not fully formal. To formalise it, we instead write:
$$
\int_0^\infty f(x^2)\, dx =
\int_0^1 \frac{dx}{x} =
\lim_{h\to 0^+}\int_h^1 \frac{dx}{x} =
\lim_{h\to 0^+}\left[\log x\right]_h^1 = 0 + \lim_{h\to 0^+}(-\log h)
= \lim_{h\to 0^+}(-\log h)$$
and we can easily show that limit does not exist, as we can make ##-\log h## as large as we like just by making ##h## sufficiently small.
 
  • #10
But I've learned that if a function is Riemann-Integrable on a finite interval then it must be bounded, and
## f(x)=\begin{cases} \frac{1}{\sqrt{x}} &\text{if}\; x \in (0,1) \\ 0 &\text{if}\; x \geq 1 \lor x = 0 \end{cases} ##
is not bounded on ## [0,1] ##, so how can this apparent contradiction be resolved?

I think the resolution lies in the fact that an improper integral (as we have here) doesn't require the integrand to be bounded on a finite interval; because of that, the above theorem holds for proper integrals but not for improper ones (and the last example is of the improper kind).
 
  • #11
As always, the result depends on one's choice of a definition of "Riemann integrable". This may vary from place to place. If we consult Riemann himself, whose definition this term presumably refers to, in his famous 1854 paper on the representation of functions by means of a trigonometric series, we find both definitions. The one where the function is assumed bounded is called by him the integral "in the narrow sense", or in German, "im engern Sinne"; and the more general definition ("improper integral") is called the "extended" or "erweitert" definition. Riemann remarks that, unlike some more complicated extensions due to Cauchy, this simplest one is accepted by all mathematicians.

Thus to answer your problem you must begin with a clear definition of "Riemann integrable". I admit that before doing this research into Riemann's works, I too always thought this meant the function is assumed bounded on finite intervals, having relied on modern calculus books for my definition. Indeed if one defines the integral strictly as the limit of Riemann sums, then one must assume boundedness, as Riemann points out, in order for that limit to exist, regardless of the choice of points of evaluation. Actually most books still in my possession use Darboux's (equivalent) definition, via upper and lower sums, which clearly requires boundedness.
 
  • #12
mathwonk said:
Actually most books still in my possession use Darboux's (equivalent) definition, via upper and lower sums, which clearly requires boundedness.

Does this actually require boundedness?

Edit: sorry, I forgot what the Darboux sums were (or maybe never knew about them). The standard Riemann sums should be fine, though.
 
  • #13
A Darboux sum has as a summand the product of the length of the subinterval times the maximum (finite) value of the function f over that subinterval, hence boundedness is part of the definition. A Riemann sum has as a summand merely the product of the length of the subinterval times the value of f at an arbitrary point x* of the subinterval. So boundedness is not needed to define a Riemann sum, but asking for the Riemann sums to converge independently of the choice of intermediate point x*, i.e. to ask the function to be Riemann integrable, does imply boundedness of the function, as is pretty easy to show. So for a function to be Riemann integrable in the usual sense, over a finite interval, in the sense that the Riemann sums converge to a finite value, does imply the function is bounded. I.e. for an unbounded function, it is easy to choose the intermediate points x* so that the Riemann sums diverge, hence an unbounded function is not Riemann integrable in the usual sense.
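To sketch why (my paraphrase of the standard argument): if ##f## is unbounded on ##[a,b]##, then for any partition ##a = x_0 < x_1 < \dots < x_n = b## it is unbounded on at least one subinterval ##[x_{k-1}, x_k]##; keeping the other evaluation points fixed, one can choose ##x_k^*## in that subinterval so that
$$ \big| f(x_k^*) \big| \,(x_k - x_{k-1}) > N $$
for any prescribed ##N##, so the Riemann sums for this partition can be made arbitrarily large in absolute value and cannot all converge to a single finite value.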

I.e. one can define Riemann sums for any function f defined on [a,b], bounded or not, and define f to be Riemann integrable if the Riemann sums converge to the same finite value, for all choices of intermediate points x* of evaluation. Then one can prove that if the function f is integrable in this sense, then the function f is: 1) bounded, and 2) continuous except on a set of "measure zero". This theorem is due essentially to Riemann, who stated it however differently, the term "measure zero" being due to Lebesgue. I.e. Riemann gave a different description of the set of discontinuities which is easily shown to be equivalent to having measure zero. (Riemann showed the set of discontinuities is a countable union of sets of "content zero", and it is an easy exercise that such a set has measure zero.) This fact is overlooked in most books, whose authors have apparently not read carefully Riemann's original paper cited above.
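Stated compactly (my restatement of the criterion just described, using modern notation that is not in the post above: ##\mathcal{R}[a,b]## for the Riemann integrable functions and ##\mu## for Lebesgue measure):
$$ f \in \mathcal{R}[a,b] \iff f \text{ is bounded on } [a,b] \ \text{ and } \ \mu\big(\{x \in [a,b] : f \text{ is discontinuous at } x\}\big) = 0. $$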
 
  • #14
Alright, I yield. I got the definitions of Cauchy integrable and Riemann integrable mixed up. I stand by my new claim that this is Cauchy integrable, but I agree that for this problem you know ##f## is bounded.
 
  • #15
I didn't mean to criticize anything you said. I just meant to emphasize that before one tries to prove anything one should say what definition one is using, as they vary from place to place. Since Riemann himself included the (Cauchy?) improperly integrable definition as one "all mathematicians accept", it would be reasonable to call also such unbounded functions as you suggested "Riemann integrable" in my opinion.

Indeed, when researching this question in my sources, I had trouble finding a standard definition of "Riemann integrable". It is a bit subtle in my opinion, although I am not an analyst. One takes a function f on a finite interval [a,b], and then most books I have assume it bounded, and consider pairs of step functions S, T such that S ≤ f ≤ T.

Then they do one of two things:
either 1) assume that for every e>0 one can find S, T with |T(x)-S(x)| < e for every x (i.e. all of whose VALUES are within e of each other),
or else 2) assume that for every e>0 one can find S, T whose INTEGRALS differ by less than e.

Then in both cases, one calls f integrable, and defines its integral as the least upper bound of the integrals of the S, or as the greatest lower bound of the integrals of the T.

The difference is only in the classes of functions f which satisfy these conditions. Functions f satisfying 1) are "uniform limits of step functions"; functions f satisfying 2) are functions which are (certain restricted) L1 limits of step functions. Every function in class 1) is also in class 2), and the second class coincides with the functions whose Riemann sums converge, which are often called Riemann integrable. Functions in class 1) are often called Newton integrable.

If we also include functions such that there are a finite number of points in [a,b] which we can omit, such that if we surround them with small intervals, the resulting restricted functions will be Riemann integrable on the complement of these intervals, and these integrals have a limit as the intervals shrink to zero, then we get a larger class of functions which Riemann called "integrable in the extended sense" and which are perhaps called Cauchy integrable?

Anyway, "all I ask" is that when a definition is appealed to in a proof, it either be given explicitly, or a reference be cited where a precise definition is given. My point is simply that the problem cited by the OP is not well posed until a precise definition is given of what is meant in it by the word "integrable".
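For instance (an illustration I am adding, which I believe is the standard example separating the two classes): the bounded function
$$ f(x) = \begin{cases} \sin(1/x) & 0 < x \le 1 \\ 0 & x = 0 \end{cases} $$
is integrable in sense 2) on ##[0,1]## (it is bounded and continuous except at the single point 0), but it does not satisfy 1), since a uniform limit of step functions must have one-sided limits at every point, and ##\sin(1/x)## has no limit from the right at 0.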

cheers!
 
  • #16
Thanks very much for the quality answers.
It's really fascinating how one innocent-looking question can draw your full attention to the very fundamentals of a theory and make you realize that something you considered "not a big deal" is actually a serious matter (although I did learn what integration is, by definition, from a mathematician's viewpoint).
Up until now I simply hadn't thought there could be a difference in the definition of 'integrable', since whenever I calculated integrals on a finite interval I would say that if the value of the integral existed then the integrand was definitely integrable; but the catch is that the definition of 'integrable' also depends on the integrand itself.
 

