Discussions on the convergence of integrals and series

In summary, this thread discusses the convergence of various definite integrals and infinite series. The first integral, stated to equal 1-gamma, was shown to have a removable singularity at x=0 and to converge at infinity. The second integral was shown to converge at infinity by integration by parts and by Dirichlet's convergence test. The third integral was shown to diverge, since cos(x)/x behaves like 1/x near zero. The fourth integral was shown to converge at both endpoints using a series expansion and a substitution. The fifth item, an infinite series, was shown to converge by the Dirichlet test and identified as the Fourier series of the function 1/2(pi-x) over the interval [0,2pi].
  • #1
alyafey22
This thread will be dedicated to discussing the convergence of various definite integrals and infinite series. If you have any question to post, please don't hesitate. I hope someone makes the thread sticky.

1- \(\displaystyle \int^{\infty}_0 \left(\frac{e^{-x}}{x} \,-\,\frac{1}{x(x+1)^2}\right)\,dx\,=1-\gamma\)

Let us have some ideas
 
  • #2
$\frac{e^{-x}}{x} = \frac{1}{x} \Big( 1 - x + O(x^{2}) \Big) = \frac{1}{x} - 1 + O(x)$

$\lim_{x \to 0} \Big( \frac{e^{-x}}{x} - \frac{1}{x(x+1)^{2}} \Big) = \lim_{x \to 0} \Big( \frac{1}{x} - 1 + O(x) - \frac{1}{x} + \frac{1}{x+1} + \frac{1}{(x+1)^{2}} \Big)$

$ \lim_{x \to 0} \Big(- 1 + O(x) + \frac{1}{x+1} + \frac{1}{(x+1)^{2}} \Big) =1$

So the singularity at $x=0$ is removable.

EDIT: And there is no issue at $\infty$ since the integral can be separated into two integrals that both converge on $[\epsilon, \infty)$.
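A quick numerical sanity check (just a sketch, assuming SciPy is available; it simply feeds the integrand to `scipy.integrate.quad` and is not a proof of the value):

```python
import numpy as np
from scipy.integrate import quad

# Sketch of a numerical check (assumes SciPy): the integrand has only a
# removable singularity at 0 and decays like 1/x^3 at infinity, so quad
# handles it directly.  The result should be close to 1 - gamma ~ 0.42278.
f = lambda x: np.exp(-x) / x - 1.0 / (x * (x + 1.0) ** 2)
value, err = quad(f, 0, np.inf)
print(value, 1 - np.euler_gamma)
```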
 
  • #3
Random Variable said:
$\frac{e^{-x}}{x} = \frac{1}{x} \Big( 1 - x + O(x^{2}) \Big) = \frac{1}{x} - 1 + O(x)$

$\lim_{x \to 0} \Big( \frac{e^{-x}}{x} - \frac{1}{x(x+1)^{2}} \Big) = \lim_{x \to 0} \Big( \frac{1}{x} - 1 + O(x) - \frac{1}{x} + \frac{1}{x+1} + \frac{1}{(x+1)^{2}} \Big)$

$ \lim_{x \to 0} \Big(- 1 + O(x) + \frac{1}{x+1} + \frac{1}{(x+1)^{2}} \Big) =1$

So the singularity at $x=0$ is removable.

EDIT: And there is no issue at $\infty$ since the integral can be separated into two integrals that both converge on $[\epsilon, \infty)$.

Well, that is better. Very good.
 
  • #4
2- \(\displaystyle \int^{\infty}_0 \frac{\sin x}{x}\,dx\)
 
  • #5
ZaidAlyafey said:
2- \(\displaystyle \int^{\infty}_0 \frac{\sin x}{x}\,dx\)

The integrand has a removable singularity at the origin, so the integral converges there. Now let us examine it at infinity:

\(\displaystyle \int^{\infty}_{\frac{\pi}{2}} \frac{\sin x}{x}\,dx\)

Integrating by parts we get

\(\displaystyle \int^{\infty}_{\frac{\pi}{2}} \frac{\sin x}{x}\,dx= \frac{-\cos x}{x} \biggr]^ {\infty }_{\frac{\pi}{2}} -\int^{\infty}_{\frac{\pi}{2}} \frac{\cos x}{x^2}\,dx\)

The first term vanishes. The second integral converges absolutely, since \(\displaystyle \left|\frac{\cos x}{x^2}\right|\le\frac{1}{x^2}\) and

\(\displaystyle \int^{\infty}_{\frac{\pi}{2}} \frac{dx}{x^2}< \infty\)

Now let us look at another one:

3- \(\displaystyle \int^{\infty}_0 \frac{\cos x}{x}\,dx\)
 
  • #6
$\int_{\frac{\pi}{2}}^{\infty} \frac{\sin x}{x} \ dx$ is also convergent by Dirichlet's convergence test since $\frac{1}{x}$ is bounded, monotonic, and tends to zero, while $\int_{\frac{\pi}{2}}^{a} \sin x \ dx $ is bounded for any $a > \frac{\pi}{2}$
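A rough numerical illustration of that convergence (a sketch, assuming SciPy is available; integrating one half-period at a time keeps the quadrature well behaved):

```python
import numpy as np
from scipy.integrate import quad

# Sketch (assumes SciPy): accumulate int_0^{N*pi} sin(x)/x dx one half-period
# at a time; the running total settles near pi/2, consistent with convergence.
sinc = lambda x: np.sinc(x / np.pi)          # np.sinc(t) = sin(pi t)/(pi t), so this is sin(x)/x
total = 0.0
for n in range(2000):
    piece, _ = quad(sinc, n * np.pi, (n + 1) * np.pi)
    total += piece
    if n + 1 in (10, 100, 1000, 2000):
        print(n + 1, total)                  # drifts toward pi/2 ~ 1.5707963
```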
 
  • #7
$\frac{\cos x}{x} = \frac{1}{x} \Big( 1-\frac{x^{2}}{2!} + O(x^{4}) \Big)= \frac{1}{x} - \frac{x}{2!}+ O(x^{3})$

Since $\frac{1}{x}$ is not integrable at zero, $\frac{\cos x}{x}$ is not integrable at zero.
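A quick numerical illustration of that divergence (a sketch, assuming SciPy is available): the truncated integral tracks $-\ln \epsilon$ as $\epsilon \to 0$.

```python
import numpy as np
from scipy.integrate import quad

# Sketch (assumes SciPy): int_eps^1 cos(x)/x dx grows without bound as eps -> 0,
# at essentially the same rate as -ln(eps).
for eps in (1e-2, 1e-4, 1e-6, 1e-8):
    val, _ = quad(lambda x: np.cos(x) / x, eps, 1.0, limit=200)  # extra subdivisions near eps
    print(eps, val, -np.log(eps))
```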
 
  • #8
4-[tex]\int_{0}^{1}\frac{\ln^{2}(x)}{x^{2}+x-2}dx[/tex]
 
  • #9
ZaidAlyafey said:
4-[tex]\int_{0}^{1}\frac{\ln^{2}(x)}{x^{2}+x-2}dx[/tex]

\(\displaystyle \frac{1}{x}=\frac{1}{1+x-x} = \frac{1}{1-(1-x)}\)

\(\displaystyle \frac{1}{x}=\sum^{\infty}_{n=0}(1-x)^n\) converges \(\displaystyle \forall \, x \, : \, \,\, |1-x|<1\)

\(\displaystyle \ln(x) =-\sum^{\infty}_{n=0} \frac{(1-x)^{n+1}}{n+1}\)

At $x=1$ we have a removable singularity:

\(\displaystyle \lim_{x \to 1}\frac{ \left( (1-x) + \frac{(1-x)^2}{2}+ \cdots \right)^2 }{ (x+2) (x-1) } < \infty\)

To examine the integral near zero, let us make the substitution

\(\displaystyle \ln(x) =-t, \qquad x=e^{-t}, \qquad dx=-e^{-t}\,dt\)

which turns the integral over \(\left(0,e^{-\epsilon}\right)\) into

\(\displaystyle -\int_{\epsilon}^{\infty} \frac{t^2 \, e^{t}}{2e^{2t}-e^{t}-1}\,dt\)

This converges absolutely, because \(2e^{2t}-e^{t}-1\ge e^{2t}\) for \(t\ge 1\), so

\(\displaystyle \int_{1}^{\infty} \frac{t^2 \, e^{t}}{2e^{2t}-e^{t}-1}\,dt \le \int^{\infty}_{1} t^2 e^{-t}\,dt< \infty\)

The integral converges ...
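For what it's worth, here is a numerical sketch consistent with this (assuming SciPy is available): the truncated integral stabilises as the lower cutoff shrinks.

```python
import numpy as np
from scipy.integrate import quad

# Sketch (assumes SciPy): ln(x)^2/(x^2 + x - 2) on (eps, 1); the values
# stabilise as eps -> 0, consistent with convergence at the lower endpoint.
f = lambda x: np.log(x) ** 2 / (x ** 2 + x - 2.0)
for eps in (1e-2, 1e-4, 1e-8, 1e-12):
    val, _ = quad(f, eps, 1.0, limit=200)
    print(eps, val)
```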
 
  • #10
5- \(\displaystyle \sum^{\infty}_{n=1}\frac{\sin(nx)}{n}\)
 
  • #11
ZaidAlyafey said:
5- \(\displaystyle \sum^{\infty}_{n=1}\frac{\sin(nx)}{n}\)

I have seen a series like this before. The simple (but unsatisfying) explanation is that it must converge by the alternating series test, which may be extended to such unconventionally oscillating terms.

However, I'm sure there's a more elegant underlying structure if you use some decomposition of Euler's formula. Having suggested it, I will look into it if I have the inclination later.
 
  • #12
ZaidAlyafey said:
5- \(\displaystyle \sum^{\infty}_{n=1}\frac{\sin(nx)}{n}\)

According to the Dirichlet test the series converges, so we now have to compute its sum. Using the well known expansion...

$$ \sum_{n=1}^{\infty} \frac{z^{n}}{n} = - \ln (1-z)\ (1)$$

... we arrive to write...

$$\sum _{n=1}^{\infty} \frac{\sin n x}{n} = - \mathcal {Im} \{\ln (1-e^{i x})\} = \tan^{-1} \frac{\sin x}{1-\cos x}\ (2)$$

Kind regards

$\chi$ $\sigma$
 
  • #13
chisigma said:
According to the Dirichlet test the series converges, so we now have to compute its sum. Using the well known expansion...

$$ \sum_{n=1}^{\infty} \frac{z^{n}}{n} = - \ln (1-z)\ (1)$$

... we arrive to write...

$$\sum _{n=1}^{\infty} \frac{\sin n x}{n} = - \mathcal {Im} \{\ln (1-e^{i x})\} = \tan^{-1} \frac{\sin x}{1-\cos x}\ (2)$$

Kind regards

$\chi$ $\sigma$

Yes, I also think it can be solved using Fourier series.
 
  • #14
chisigma said:
According to the Dirichlet test the series converges, so we now have to compute its sum. Using the well known expansion...

$$ \sum_{n=1}^{\infty} \frac{z^{n}}{n} = - \ln (1-z)\ (1)$$

... we arrive to write...

$$\sum _{n=1}^{\infty} \frac{\sin n x}{n} = - \mathcal {Im} \{\ln (1-e^{i x})\} = \tan^{-1} \frac{\sin x}{1-\cos x}\ (2)$$
To take that a bit further, the expression in (2) can be simplified as $$\tan^{-1} \Bigl(\frac{\sin x}{1-\cos x}\Bigr) = \tan^{-1} \biggl(\frac{2\sin\frac x2\cos\frac x2}{2\sin^2\frac x2}\biggr) = \tan^{-1}\bigl(\cot\tfrac x2\bigr) = \tfrac\pi2 - \tfrac x2.$$ Hence $$\sum _{n=1}^{\infty} \frac{\sin n x}{n} = \tfrac12(\pi-x).$$ But that only works provided that $0<x< 2\pi$. At the endpoints of the interval, when $x=0$ or $2\pi$, the sum $\sum _{n=1}^{\infty} \frac{\sin n x}{n}$ is obviously $0$ (since each term vanishes).

As ZaidAlyafey points out, this sum is a Fourier series, namely for the function $\tfrac12(\pi-x)$ over the interval $[0,2\pi].$
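For anyone who wants to see the closed form numerically, here is a quick sketch (the partial sums converge slowly near the endpoints, so a fairly large cutoff is used):

```python
import numpy as np

# Quick check (not a proof): partial sum of sin(n x)/n against (pi - x)/2
# at a point strictly inside (0, 2*pi).
x = 1.0
n = np.arange(1, 200001)
print(np.sum(np.sin(n * x) / n), (np.pi - x) / 2)   # agree to several decimal places
```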
 
  • #15
Opalg said:
To take that a bit further, the expression in (2) can be simplified as $$\tan^{-1} \Bigl(\frac{\sin x}{1-\cos x}\Bigr) = \tan^{-1} \biggl(\frac{2\sin\frac x2\cos\frac x2}{2\sin^2\frac x2}\biggr) = \tan^{-1}\bigl(\cot\tfrac x2\bigr) = \tfrac\pi2 - \tfrac x2.$$ Hence $$\sum _{n=1}^{\infty} \frac{\sin n x}{n} = \tfrac12(\pi-x).$$ But that only works provided that $0<x< 2\pi$. At the endpoints of the interval, when $x=0$ or $2\pi$, the sum $\sum _{n=1}^{\infty} \frac{\sin n x}{n}$ is obviously $0$ (since each term vanishes).

As ZaidAlyafey points out, this sum is a Fourier series, namely for the function $\tfrac12(\pi-x)$ over the interval $[0,2\pi].$

Since the series converges for all $x$, there may be a general solution that works for all $x$, right?
 
  • #16
ZaidAlyafey said:
Since the series converges for all $x$, there may be a general solution that works for all $x$, right?
Yes, it is the $2\pi$-periodic function defined on the interval $[0,2\pi)$ by $f(x) = \begin{cases}0&(x=0),\\ \frac12(\pi-x)&(0<x<2\pi). \end{cases}$ It is an example of what is often called a sawtooth function.
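As a small numerical sketch of that description (the helper `sawtooth` below is just a hypothetical name for the $2\pi$-periodic function defined above):

```python
import numpy as np

# Hypothetical helper: the 2*pi-periodic sawtooth described above,
# i.e. (pi - (x mod 2*pi))/2, with the value 0 at multiples of 2*pi.
def sawtooth(x):
    r = np.mod(x, 2 * np.pi)
    return np.where(r == 0, 0.0, (np.pi - r) / 2)

n = np.arange(1, 100001)
for x in (-4.0, 0.5, 7.0):                  # points outside and inside (0, 2*pi)
    print(x, np.sum(np.sin(n * x) / n), sawtooth(x))   # the last two columns agree closely
```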
 
  • #17
6- \(\displaystyle \int^{\infty}_0 \frac{x}{\sqrt{e^x-1}}\, dx\)
 
  • #18
$\displaystyle \int_{0}^{\infty} \frac{x}{\sqrt{e^{x}-1}} \ dx = \int_{0}^{\infty} \frac{x e^{- \frac{x}{2}}} {\sqrt{1-e^{-x}}} \ dx $

Let $ \displaystyle u = e^{-\frac{x}{2}}$

$ \displaystyle = - 4 \int_{0}^{1} \frac{\ln{u}}{\sqrt{1-u^{2}}} \ du $

Let $v = \arcsin u$

$ \displaystyle = -4 \int^{\frac{\pi}{2}}_{0} \ln (\sin v) \ dv $

I'm sure we've all seen evaluations of that last integral, so I'm just going to argue that it converges. The only potential issue is at $x=0$.

But $\displaystyle \ln (\sin x) = \ln \Big( x \cdot \frac{\sin x}{x} \Big) = \ln(x) + \ln \Big(\frac{\sin x}{x} \Big) $.

So near $x=0$, $ \ln(\sin x)$ behaves like $\ln x$, and thus $\displaystyle \int_{0}^{\frac{\pi}{2}} \ln (\sin x) \ dx $ converges.
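As a numerical cross-check (a sketch, assuming SciPy is available; the first integrand is taken in the rewritten form $x e^{-x/2}/\sqrt{1-e^{-x}}$ from above to avoid overflow, and the classical value $2\pi\ln 2$ of that last integral is printed only for reference):

```python
import numpy as np
from scipy.integrate import quad

# Sketch (assumes SciPy): the original integral and the transformed one agree,
# and both match the classical value 2*pi*ln(2) of -4*int_0^{pi/2} ln(sin v) dv.
lhs, _ = quad(lambda x: x * np.exp(-x / 2) / np.sqrt(-np.expm1(-x)), 0, np.inf, limit=200)
rhs, _ = quad(lambda v: -4 * np.log(np.sin(v)), 0, np.pi / 2, limit=200)
print(lhs, rhs, 2 * np.pi * np.log(2))
```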
 
  • #19
7- $ \displaystyle \int_{0}^{\infty} \frac{\ln (\tan^{2} x)}{1+x^{2}} \ dx$

8- $ \displaystyle \int_{0}^{\infty} \frac{\sin (\tan x)}{x} \ dx $
 
  • #20
TheBigBadBen said:
I have seen a series like this before. The simple (but unsatisfying) explanation is that it must converge by the alternating series test, which may be extended to such unconventionally oscillating terms.

However, I'm sure there's a more elegant underlying structure if you use some decomposition of Euler's formula. Having suggested it, I will look into it if I have the inclination later.

Why do you consider the Alternating Series test not elegant? I personally find the most simple solution to be the most elegant, because it is the most likely to be understood by others...
 
  • #21
Prove It said:
Why do you consider the Alternating Series test not elegant? I personally find the most simple solution to be the most elegant, because it is the most likely to be understood by others...

I believe I said unsatisfying, not inelegant. At any rate, in my 1:30 AM internet-browsing state, I was annoyed at not being able to immediately see what the series should converge to. As evidenced by the solutions that followed, it seems that there was a concise, complete, and more satisfying answer all along.

Also, I'm not sure that this series technically falls under the purview of the alternating series test, but as $\chi\sigma$ pointed out, the Dirichlet test works here.

It should be pointed out though that simply recognizing that the series conforms to a Fourier series takes for granted that at some point, somebody had to show that Fourier series satisfy a whole bunch of nice properties, including convergence under suitable regularity conditions on the function being represented. That process itself resulted in the restructuring of some areas of mathematics, analysis in particular.
 
  • #22
No one has attempted the integrals I posted a few days ago.

Here's my attempt.

$ \displaystyle \int_{0}^{\infty} \frac{\sin (\tan x)}{x} \ dx = \int_{0}^{\frac{\pi}{2}} \frac{\sin (\tan x)}{x} \ dx + \sum_{n=1}^{\infty} \int_{(n-\frac{1}{2}) \pi}^{(n+\frac{1}{2}) \pi } \frac{\sin (\tan x)}{x} \ dx$

Since $\displaystyle \frac{\sin (\tan x)}{x}$ has a removable singularity at $x=0$ and is bounded near $x= \frac{\pi}{2}$, $ \displaystyle \int_{0}^{\frac{\pi}{2}} \frac{\sin (\tan x)}{x} \ dx$ converges.

And since $\displaystyle \frac{\sin (\tan x)}{x}$ is bounded near $(n-\frac{1}{2}) \pi$ and $(n+\frac{1}{2}) \pi$, $ \displaystyle \int_{(n-\frac{1}{2})\pi}^{(n+\frac{1}{2}) \pi} \frac{\sin (\tan x)}{x} \ dx$ converges.

So we need to show that $\displaystyle \sum_{n=1}^{\infty} \int_{(n-\frac{1}{2})\pi}^{(n+\frac{1}{2})\pi} \frac{\sin (\tan x)}{x} \ dx$ converges.

$ \displaystyle \int_{(n-\frac{1}{2})\pi}^{(n+\frac{1}{2})\pi} \frac{\sin (\tan x)}{x} \ dx = \int_{(n-\frac{1}{2})\pi }^{n \pi} \frac{\sin (\tan x)}{x} \ dx + \int^{(n+\frac{1}{2})\pi}_{n \pi } \frac{\sin (\tan x)}{x} \ dx = \int_{\frac{\pi}{2}}^{0} \frac{\sin (\tan u)}{n \pi -u} \ du + \int^{\frac{\pi}{2}}_{0} \frac{\sin (\tan v)}{n \pi + v} \ dv$

$ \displaystyle = \int_{0}^{\frac{\pi}{2}} \Big( \frac{1}{n \pi + u} - \frac{1}{n \pi -u} \Big) \sin (\tan u) \ du = 2 \int_{0}^{\frac{\pi}{2}} \frac{u \sin (\tan u)}{u^{2} - n^{2} \pi^{2}} \ du$

And $ \displaystyle \sum_{n=1}^{\infty} \Big| \int_{(n-\frac{1}{2})\pi}^{(n+\frac{1}{2})\pi} \frac{\sin (\tan x)}{x} \ dx \Big| = \sum_{n=1}^{\infty} \Big| 2 \int_{0}^{\frac{\pi}{2}} \frac{x \sin (\tan x)}{x^{2} - n^{2} \pi^{2}} \ dx \Big| \le 2 \sum_{n=1}^{\infty} \int_{0}^{\frac{\pi}{2}} \Big| \frac{x \sin (\tan x)}{x^{2} - n^{2} \pi^{2}}\Big| \ dx $

$ \displaystyle \le 2 \sum_{n=1}^{\infty} \frac{\pi}{2} \max \Big| \frac{x \sin(\tan x)}{x^{2}-n^{2} \pi^{2}} \Big|\le \pi \sum_{n=1}^{\infty} \frac{\frac{\pi}{2}}{n^{2} \pi^{2} -\frac{\pi^{2}}{4}} < \infty$

Therefore $ \displaystyle \sum_{n=1}^{\infty} \int_{(n-\frac{1}{2}) \pi}^{(n+\frac{1}{2})\pi} \frac{\sin (\tan x)}{x} \ dx $ converges by the absolute convergence test.
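A small remark on that final bound: it actually telescopes, since $\frac{\pi\cdot\frac{\pi}{2}}{n^{2}\pi^{2}-\frac{\pi^{2}}{4}}=\frac{1}{2(n^{2}-\frac14)}=\frac{1}{2n-1}-\frac{1}{2n+1}$, so the majorant series sums to exactly $1$. A one-line check (just a sketch):

```python
# Sketch: the majorant series from the last step telescopes to 1
# (the partial sum up to N is 1 - 1/(2N+1)).
print(sum(1.0 / (2.0 * (n * n - 0.25)) for n in range(1, 200001)))   # ~ 0.9999975
```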
 
  • #23
Opalg said:
To take that a bit further, the expression in (2) can be simplified as $$\tan^{-1} \Bigl(\frac{\sin x}{1-\cos x}\Bigr) = \tan^{-1} \biggl(\frac{2\sin\frac x2\cos\frac x2}{2\sin^2\frac x2}\biggr) = \tan^{-1}\bigl(\cot\tfrac x2\bigr) = \tfrac\pi2 - \tfrac x2.$$ Hence $$\sum _{n=1}^{\infty} \frac{\sin n x}{n} = \tfrac12(\pi-x).$$ But that only works provided that $0<x< 2\pi$. At the endpoints of the interval, when $x=0$ or $2\pi$, the sum $\sum _{n=1}^{\infty} \frac{\sin n x}{n}$ is obviously $0$ (since each term vanishes).

As ZaidAlyafey points out, this sum is a Fourier series, namely for the function $\tfrac12(\pi-x)$ over the interval $[0,2\pi].$
HINT: consider the Bernoulli Polynomial of order 1:

\(\displaystyle B_1(x) = x-\frac{1}{2}\)

The connection is there. :rolleyes:
 
  • #24
Given \(\displaystyle n>1\) and \(\displaystyle 0\le x\le1\), or, alternatively, \(\displaystyle n=1\) and \(\displaystyle 0<x<1\), then

\(\displaystyle B_n(x) = -\frac{2\, (n!)}{(2\pi)^n}\, \sum_{k=1}^{\infty} \frac{1 }{k^n} \cos \left(2\pi kx-\frac{\pi n}{2}\right) \)

See eqn. 9.622 on (approx) page 1628 of the Maths Bible that is Gradshteyn & Ryzhik; the squirrel's guide to life, the universe, and everything:

http://f3.tiera.ru/ShiZ/math/MRef_R...ies, and products (5ed., AP, 1996)(1762s).pdf

Nom nom nom! (Heidy)(Heidy)(Heidy)
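And a quick numerical check of the $n=1$ case of that expansion against $B_1(x)=x-\tfrac12$ (a sketch; any $x$ strictly inside $(0,1)$ will do):

```python
import numpy as np

# Sketch: n = 1 case of the expansion, which should recover B_1(x) = x - 1/2 on (0, 1).
x = 0.3
k = np.arange(1, 100001)
series = -2.0 / (2 * np.pi) * np.sum(np.cos(2 * np.pi * k * x - np.pi / 2) / k)
print(series, x - 0.5)                       # both ~ -0.2
```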
 

FAQ: Discussions on the convergence of integrals and series

What is the definition of convergence in integrals and series?

In mathematics, convergence is the property of a sequence or series of numbers that approaches a finite limit as the number of terms increases. In the context of integrals and series, convergence means that as the limits of integration or the terms in the series tend to infinity, the value of the integral or series approaches a finite number.

What are the different types of convergence in integrals and series?

There are several types of convergence that can occur in integrals and series, including pointwise convergence, uniform convergence, absolute convergence, and conditional convergence. Pointwise convergence means that a sequence or series of functions converges at each individual point, though not necessarily at the same rate everywhere. Uniform convergence means that the convergence happens at the same rate across the whole domain. Absolute convergence means that the integral or series of absolute values converges; for a series, this guarantees that rearranging the terms does not change the sum. Conditional convergence means that the integral or series converges, but does not converge absolutely.

How do you determine if an integral or series converges?

To determine the convergence of an improper integral, you can use tests such as the comparison test, the limit comparison test, the absolute convergence test, or Dirichlet's test. For series, you can use tests such as the divergence test, the integral test, the comparison and limit comparison tests, the ratio test, or the root test. These tests compare the given integral or series with functions or series whose behaviour is already known.

What is the relationship between integrals and series?

Integrals and series are closely related: for a positive, decreasing function f, the series with terms f(n) converges exactly when the integral of f over [1, ∞) converges. This is known as the integral test. Additionally, the sum of a convergent series can often be represented as an integral, and an integral can be approximated by finite sums using techniques such as Simpson's rule or the trapezoidal rule.

How are integrals and series used in real-world applications?

Integrals and series have many practical applications in fields such as physics, engineering, and economics. For example, integrals are used to calculate areas, volumes, and other quantities in real-world scenarios. Series are used to approximate functions and solve differential equations. They are also used in statistics to analyze data and make predictions. Overall, integrals and series are powerful tools for modeling and solving real-world problems.
