Proving convergence of a series to its generating function

  • #1
Subdot

Homework Statement


The problem asks to use the Lagrange form of the remainder in Taylor's Theorem to prove that the Maclaurin series generated by f(x) = xe^x converges to f. From the actual answer, I'm guessing it wants me to use the Remainder Estimation Theorem to accomplish this.


Homework Equations


The Lagrange form of the remainder of Taylor's Theorem, R_n(x), where a is the center of the Taylor series in an open interval I, c is some number between a and x in the interval I, f^(n+1) is the (n+1)st derivative of f, and n is a positive integer:

[tex]R_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!}(x-a)^{n+1}[/tex]
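(For context, R_n(x) is the difference between f and its nth Taylor polynomial; Taylor's Theorem writes

[tex]f(x) = \sum_{k=0}^{n}\frac{f^{(k)}(a)}{k!}(x-a)^k + R_n(x),[/tex]

so the series converges to f(x) exactly when R_n(x) [itex]\rightarrow[/itex] 0 as n [itex]\rightarrow\infty[/itex].)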


The other relevant equation, I would guess, is the Remainder Estimation Theorem. It states (word-for-word) that "if there are positive constants M and r such that |f^(n+1)(t)| [itex]\leq[/itex] Mr^(n+1) for all t between a and x, then the remainder R_n(x) in Taylor's Theorem satisfies the inequality

[tex]|R_n(x)| \leq M \frac{r^{n+1}|x-a|^{n+1}}{(n+1)!}[/tex]

If these conditions hold for every n and all other conditions of Taylor's Theorem are satisfied by f, then the series converges to f(x)."


The Attempt at a Solution


I got the series by multiplying the series for e^x by x: [tex]\sum_{n=0}^{\infty}\frac{x^{n+1}}{n!}[/tex] In my attempt to prove the convergence of this to f, I used the Remainder Estimation Theorem with r = 1, a = 0, and f^(n+1)(t) = the (n+1)st derivative of te^t, so that |f^(n+1)(t)| [itex]\leq[/itex] M. So I just needed to find a bound for the (n+1)st derivative. This is where I got confused: I can't find the bound. On the interval [0,x], the bound would be the (n+1)st derivative evaluated at x (the right end of the interval), because that derivative is positive and increasing there. So M = xe^x + (n+1)e^x.
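For reference, that (n+1)st derivative can be computed with the Leibniz product rule; since the second and higher derivatives of the factor t vanish, only two terms survive:

[tex]f^{(n+1)}(t) = \frac{d^{n+1}}{dt^{n+1}}\left(te^t\right) = te^t + (n+1)e^t = (t+n+1)e^t.[/tex]

Evaluated at t = x this gives the M above, and for t [itex]\geq[/itex] 0 it is indeed positive and increasing.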

However, M is supposed to be a positive constant, so it can't depend on n. On the interval [x,0], a similar thing occurs, except this time the bound would be (n+1), because M = 0·e^0 + (n+1)e^0 = 0 + (n+1) = n+1. Once again, this is a shifting bound.


Attempting to continue the proof by plugging this into the inequality of the Remainder Estimation Theorem despite the shifting bounds, I get

[tex]|R_n(x)| \leq \frac {xe^x + (n+1)(e^x)}{(n+1)!}|x|^{n+1} = \frac {xe^x|x|^{n+1} + (n+1)e^x|x|^{n+1}}{(n+1)!} = \frac{xe^x|x|^{n+1}}{(n+1)!} + \frac {(n+1)e^x|x|^{n+1}}{(n+1)!} = \frac {xe^x|x|^{n+1}}{(n+1)!} + \frac {e^x|x|^{n+1}}{n!} [/tex]

The n+1 cancels, and so as n [itex]\rightarrow\infty[/itex], the right side of this inequality (and thus the remainder) goes to zero, since for fixed x the factorial outgrows |x|^(n+1). This proves convergence on [0,x] (the convergence on [x,0] follows from the "bound" found earlier through this same method). Assuming there is nothing wrong with this proof, that is the only way I can see how to prove it.
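A quick numerical sanity check of this bound (a sketch, not part of the proof; the helper names are my own):

[code]
import math

# Partial sum of the Maclaurin series sum_{n=0}^{N} x^(n+1)/n!,
# which should approach x*e^x as N grows.
def partial_sum(x, N):
    return sum(x ** (n + 1) / math.factorial(n) for n in range(N + 1))

# Bound from the attempt above (for x >= 0):
# M * |x|^(n+1) / (n+1)!  with  M = x*e^x + (n+1)*e^x.
def remainder_bound(x, n):
    M = x * math.exp(x) + (n + 1) * math.exp(x)
    return M * abs(x) ** (n + 1) / math.factorial(n + 1)

x = 2.0
exact = x * math.exp(x)
for n in (5, 10, 15):
    error = abs(exact - partial_sum(x, n))
    print(f"n = {n:2d}: actual error = {error:.3e}, bound = {remainder_bound(x, n):.3e}")
# Both the actual error and the bound shrink toward 0 as n grows.
[/code]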


However, the correct answer gives this: |f^(n+1)(t)| = te^t [itex]\leq[/itex] Mr^(n+1). r = 1, so M = te^t. On [x,0], M = 0. On [0,x], M = xe^x. Thus te^t [itex]\leq[/itex] M, and the Remainder Estimation Theorem is satisfied, which proves the convergence of the series to f.

And that is why I'm confused. (1) It claims that M = xe^x on [0,x], (2) it equates te^t to |f^(n+1)(t)|, and (3) it sets M = 0 on [x,0]. There are other, similar problems in which the same thing happens (I have not attempted these again yet, though). The correct answer does all three of these confusing things for h(x) = sin^2(x) and i(x) = cos^2(x). On a third question, the correct answer sets M = 0 on [x,0] for j(x) = sin(x) - x + [itex]\frac {x^3}{6}[/itex]. However, for that last one I can find an actual bound for the (n+1)st derivative fine (it isn't M = 0, though).

I thought that there might be something similar going on with all of these problems and that by understanding the one I explained in detail, I could solve the rest.


I'd appreciate any help here. Is there something obvious I'm missing? I noticed that this question was asked before at https://www.physicsforums.com/showthread.php?t=292812&highlight=remainder+estimation+theorem (but with no attempt made and definitely not as detailed as mine), and it was asked on another website without getting an answer.
 
  • #2
It looks to me like you are only confused because the solutions are written sloppily. First let's restrict to a finite interval [-C,C] and think of C large. I think you would agree that |R_n(x)|<=|(C*e^C+(n+1)e^C)*C^(n+1)|/(n+1)!. I've replaced the x and c in x^(n+1) and f^(n+1)(c) with C, since both of these functions take on their maximum absolute value at the right endpoint (+C). That's easy enough to show. Now you just want to show lim n->infinity of |R_n(x)| is zero. Can you show that? If so then the series converges on [-C,C]. But C could be anything, so you've actually shown it converges for all x.
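To spell out that last limit, one way is to compare consecutive terms of the bound. Writing a_n = (C+n+1)e^C*C^(n+1)/(n+1)! (the same bound, after factoring out e^C), we get

[tex]\frac{a_{n+1}}{a_n} = \frac{C+n+2}{C+n+1}\cdot\frac{C}{n+2} \rightarrow 0,[/tex]

so a_n [itex]\rightarrow[/itex] 0, and since |R_n(x)| <= a_n for every x in [-C,C], the remainder goes to zero on all of [-C,C].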
 
  • #3
Can you show that? If so then the series converges on [-C,C]. But C could be anything, so you've actually shown it converges for all x.
Yes, I can show that. If I understand you right, then I did prove it right (albeit very informally) by using M = Ce^C + (n+1)e^C, except I used x instead of C, correct? So, if that proof is correct, this means that the constant M can "shift" its value? M can depend on n?

It looks to me like you are only confused because the solutions are written sloppily.
You said "sloppily" instead of saying they were wrong. This implies that they are correct in a sense. I don't see how. Will you or someone else please explain this to me?
 
  • #4
I'm not really sure what the "Remainder Estimation Theorem" is, so I'm not sure what 'M shifting its value' means. Sorry, maybe somebody else does. But if you can show that the remainder term goes to zero, then you have shown that the series converges. You quoted lines from the solution like |f^(n+1)(t)| = t*e^t. That's not right. That's what I meant by 'sloppily'.
 
  • #5
I'm not really sure what the "Remainder Estimation Theorem" is, so I'm not sure what 'M shifting its value' means.
I quoted the theorem word-for-word in the first post. Sorry, I guess I wasn't clear on that. All I meant by 'M shifting its value' is that although M is a constant, its value in this case depends on n. For example (taking x = n in my bound), its value for n = 1 is M = e + 2e = 3e, and its value for n = 2 is M = 2e^2 + 3e^2 = 5e^2. So M's value will change with n as n goes to infinity. However, M is supposed to be a constant. That is why I asked if it could 'shift its value' and whether M could depend on n.

You quoted lines from the solution like |f^(n+1)(t)| = t*e^t. That's not right. That's what I meant by 'sloppily'.
Ah, I see now. I'd better solve those other ones then so that the solutions will be correct for future students! Thanks for clearing that up.

But if you can show that the remainder term goes to zero, then you have shown that the series converges.
This implies, then, that M can 'shift its value' and 'depend on n' defined as I explained them in this post. If that's the case, then you have answered that question too.

Thanks for all the help!
 
  • #6
Yes, the whole point is showing that the remainder term approaches 0. The numerator of the remainder term does depend on n. But it still approaches 0. That's all.
 
  • #7
Okay. I get it now. Thanks for clearing that up, and thanks again for helping!
 

FAQ: Proving convergence of a series to its generating function

What is a generating function?

A generating function is a mathematical tool that represents a sequence of numbers as a power series: the terms of the sequence appear as the coefficients of the series. It is typically written as a function of a variable x and encodes information about the sequence, such as its growth as n approaches infinity.
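For example, the exponential function generates the coefficient sequence 1/n!:

[tex]e^x = \sum_{n=0}^{\infty}\frac{x^n}{n!}[/tex]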

What does it mean for a series to converge?

A series is said to converge if the sum of its terms approaches a finite value as the number of terms increases. In other words, as more terms are added, the total value of the series approaches a constant value rather than increasing without bound.
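Formally, a series [itex]\sum a_n[/itex] converges to a sum S when its partial sums do:

[tex]\lim_{N\to\infty}\sum_{n=0}^{N}a_n = S.[/tex]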

How do I prove convergence of a series to its generating function?

To prove that a series converges to its generating function, one typically uses the definition of convergence: the partial sums of the series must approach a finite limit as the number of terms increases, and that limit must be shown to equal the generating function.
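For a Taylor series, as in the thread above, the standard route is through the remainder term: the series converges to f(x) precisely when

[tex]\lim_{n\to\infty}R_n(x) = 0,[/tex]

which is what the Remainder Estimation Theorem is designed to verify.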

What are some common techniques used to prove convergence of a series to its generating function?

Some common techniques include the ratio test, the root test, and comparison tests such as the limit comparison test and the direct comparison test. These involve examining the behavior of the series' terms and comparing them to known convergent or divergent series. Note that such tests only establish that the series converges; identifying its sum as the generating function takes a further argument, such as the remainder estimate used in the thread above.
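Applied to the series in this thread, for instance, the ratio test gives

[tex]\lim_{n\to\infty}\left|\frac{x^{n+2}/(n+1)!}{x^{n+1}/n!}\right| = \lim_{n\to\infty}\frac{|x|}{n+1} = 0 < 1,[/tex]

so the series converges absolutely for every x; the remainder argument is then what identifies the sum as xe^x.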

Why is it important to prove convergence of a series to its generating function?

Proving convergence of a series to its generating function is important because it allows us to understand the behavior of the series and its relationship to its generating function. It also allows us to use the generating function to compute values of the series, which can be useful in a variety of mathematical and scientific applications.
