Does the Convergence of \(\Sigma \frac{f(x)}{x}\) Depend on \(\Sigma f(x) = 0\)?

In summary: if the sum of f over one full period is zero, the partial sums of f stay bounded and [tex]\sum\frac{f(x)}{x}[/tex] converges; if the period sum is nonzero, the series behaves like a nonzero multiple of the harmonic series and diverges.
  • #1
CalTech>MIT

Homework Statement


Let [tex]f:\mathbb{Z}\rightarrow\mathbb{R}[/tex] be periodic, i.e. f(x+a) = f(x) for some fixed integer [tex]a\geq 1[/tex].

Prove that [tex]\sum_{x=1}^{\infty}\frac{f(x)}{x}[/tex] converges if and only if [tex]\sum_{x=1}^{a}f(x)=0[/tex].


Homework Equations


n/a


The Attempt at a Solution


Ok, so I have a general idea of how to write the proof. We can do one direction by contradiction: assume the sum over one period isn't zero. Then the first series should behave like a multiple of the harmonic series, and thus diverge?
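That intuition can be checked numerically. A minimal sketch (the periodic values below are illustrative choices, not from the problem): when the period sum is zero the partial sums of [tex]\sum\frac{f(x)}{x}[/tex] settle down, and when the period sum is S ≠ 0 they grow like (S/a)·ln(n).

```python
import math

def partial_sum(values, n):
    """Partial sum of sum_{x=1}^{n} f(x)/x for f periodic with the given values."""
    a = len(values)
    return sum(values[(x - 1) % a] / x for x in range(1, n + 1))

# Case 1: the sum over one period is zero -> partial sums settle down.
zero_sum = [1.0, -2.0, 1.0]                       # 1 + (-2) + 1 = 0
print(abs(partial_sum(zero_sum, 100_000) - partial_sum(zero_sum, 10_000)))  # tiny

# Case 2: period sum S = 2, period a = 3 -> partial sums grow like (S/a)*ln(n).
nonzero = [1.0, 0.0, 1.0]
for n in (1_000, 10_000, 100_000):
    print(partial_sum(nonzero, n) / math.log(n))  # drifts toward 2/3
```

This is only evidence, of course, not a proof, but it shows which direction the harmonic-series comparison should go.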
 
  • #2
First show that the converse holds.

Then, by contradiction, suppose that [tex]\sum_{x=1}^{a-1}{f(x)}=-f(a)-b[/tex] with [tex]b\neq 0[/tex]. Then we put g(a)=f(a)+b and g(x)=f(x) for other values (while still keeping the function periodic). Then we get (since both series converge) that

[tex]\sum_{x=1}^{+\infty}{\frac{f(x)}{x}}-\sum_{x=1}^{+\infty}{\frac{g(x)}{x}}[/tex]

converges. But this difference diverges, since it is essentially a nonzero multiple of the harmonic series.
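Filling in that last step explicitly: with g as constructed, f(x) - g(x) equals -b when x is a multiple of a and 0 otherwise, so the partial sums of the difference are

[tex]\sum_{x=1}^{n}\frac{f(x)-g(x)}{x}=-b\sum_{k=1}^{\lfloor n/a\rfloor}\frac{1}{ka}=-\frac{b}{a}\sum_{k=1}^{\lfloor n/a\rfloor}\frac{1}{k},[/tex]

which is a nonzero multiple of the harmonic partial sums whenever b ≠ 0, and hence unbounded.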
 
  • #3
micromass said:
First show that the converse holds.

Then, by contradiction, suppose that [tex]\sum_{x=1}^{a-1}{f(x)}=-f(a)-b[/tex] with [tex]b\neq 0[/tex]. Then we put g(a)=f(a)+b and g(x)=f(x) for other values (while still keeping the function periodic). Then we get (since both series converge) that

[tex]\sum_{x=1}^{+\infty}{\frac{f(x)}{x}}-\sum_{x=1}^{+\infty}{\frac{g(x)}{x}}[/tex]

converges. But this difference diverges, since it is essentially a nonzero multiple of the harmonic series.

It's been a while since I took analysis, but I would be careful with this approach. It seems to me that you are using two ideas here.
First is the fact that for two convergent series, the sum of the limits is the limit of the sums. That is true here, but it is a statement about partial sums: since we do not have absolute convergence, we are not free to pair up individual terms exactly how we want, even when we know that both series converge.

Second, you are trying to conclude that the second series converges by virtue of the convergence of the first (as I read it), but you have changed the sign of an entire subsequence of the terms; the convergence of the first series no longer gives you convergence of the second because of this (at least I do not think so, not without absolute convergence).

Have you guys tried induction on a? I am not 100% sure that it works (again, because absolute convergence is not guaranteed), but it might be worth a shot.
 
  • #4
I know that the series is not absolutely convergent. And I've been very careful not to use this...

Secondly, the convergence of [tex]\sum{\frac{g(x)}{x}}[/tex] does not follow from the convergence of the first series. It follows from the reverse implication of what he's trying to prove (or alternatively from Dirichlet's criterion).
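For readers unfamiliar with it, Dirichlet's criterion applies here because the two hypotheses are easy to verify. A numeric sketch with an illustrative zero-period-sum f (the values are chosen for the example, not taken from the thread):

```python
# Hypothesis 1 of Dirichlet's criterion: the partial sums A_n = f(1)+...+f(n)
# stay bounded, because every complete period contributes 0.
values = [2.0, -1.0, -1.0]            # period a = 3, sum over one period = 0

A, max_abs = 0.0, 0.0
for x in range(1, 100_001):
    A += values[(x - 1) % len(values)]
    max_abs = max(max_abs, abs(A))
print(max_abs)                        # prints 2.0: bounded by the within-period max

# Hypothesis 2: 1/x decreases monotonically to 0 -- immediate.
# Dirichlet's criterion then gives convergence of sum f(x)/x.
```

The bound on the partial sums of f is exactly where the zero period sum is used; with a nonzero period sum, A_n would drift off to infinity and the criterion would not apply.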
 
  • #5
Ok, well, I wrote up a reply and then PF timed me out, so that reply went bye-bye. In a nutshell:

1) I apologize if I misunderstood your argument, in hindsight I had no good reason to assume that you were using absolute convergence.

2) I am convinced that, if argued carefully, your approach works just fine. I had suspected that it did before (except for point 4 below); my point was not to argue that it didn't work, just that this approach needs to be done carefully.

3) "It follows from the reverse implication of what he's trying to prove" - agreed
"or alternatively from Dirichlets criterion" -Nice!

4) "but in this case you have changed the sign of an entire subsequence of the sequence of terms" - this was wrong on my part. I had misread your construction.
 

FAQ: Does the Convergence of \(\Sigma \frac{f(x)}{x}\) Depend on \(\Sigma f(x) = 0\)?

What is a convergent series?

A convergent series is one whose partial sums approach a finite limit as the number of terms increases. In other words, the sum of the first n terms approaches a specific value as n grows.

How is a convergent series proved?

Convergence of a series can be proved using various methods, such as the ratio test, the comparison test, or the integral test. These methods analyze the behavior of the terms in the series, typically by comparing them with a series whose behavior is already known.
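As a concrete illustration of one such test, here is a minimal numeric sketch of the ratio test applied to the series with terms a_n = n/2^n (an example chosen for this sketch, not from the thread):

```python
# Ratio test sketch: for a_n = n / 2**n, the ratio a_{n+1}/a_n = ((n+1)/n)/2
# tends to 1/2 < 1, so the series sum a_n converges (its exact value is 2).
def a(n):
    return n / 2**n

ratio = a(50) / a(49)                  # (50/49) * (1/2), already close to 0.5
print(ratio)

partial = sum(a(n) for n in range(1, 60))
print(partial)                         # close to the exact sum 2.0
```

Note that the ratio test is inconclusive for the series in this thread, where the ratio of successive terms tends to 1; that is why Dirichlet-type arguments are needed there.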

What is the difference between a convergent series and a divergent series?

A divergent series is one whose partial sums do not approach a finite limit as more terms are added: they may grow without bound or oscillate forever. The sum of a convergent series, by contrast, is a well-defined finite value.

Can a convergent series have an infinite number of terms?

Yes, a convergent series can have an infinite number of terms; the sum may approach a finite limit even though infinitely many terms are added. However, the terms of a convergent series must approach zero, and even that is not sufficient on its own, as the harmonic series shows.

Why are convergent series important in mathematics?

Convergent series are important in mathematics because they allow us to determine the sum of an infinite number of terms and make predictions about the behavior of a sequence. They are also used in various mathematical applications, such as in calculus and physics, to model real-world phenomena.
