Taylor Series: Exploring Properties & Applications

In summary: the facts that a sum of continuous functions is continuous and a sum of differentiable functions is differentiable apply only to finite sums; for an infinite series such as a Taylor series they require further justification.
  • #1
OhMyMarkov
Hello Everyone!

Suppose $f(x)$ can be written as $f(x)=P_n(x)+R_n(x)$ where the first term on the RHS is the Taylor polynomial and the second term is the remainder.
If the sum $\sum _{n=0} ^{\infty} = c_n x^n$ converges for $|x|<R$, does this mean I can freely write $f(x)=\sum _{n=0} ^{\infty} = c_n x^n$?

Can I also use the fact that the sum of continuous functions over a domain (in this case, $|x|<R$) is continuous, and that the sum of differentiable functions over a domain is differentiable?
 
  • #2
Could you rewrite your question? For example:

OhMyMarkov said:
Hello Everyone! If the sum $\sum _{n=0} ^{\infty} = c_n x^n$

What does the left side mean? What is $c_n$?

Thanks.
 
  • #3
$P_n(x)=\sum _{k=0} ^n c_k x^k$ is the Taylor polynomial. The $c_n$ are the coefficients.
EDIT: Ah, that was an honest typo, I meant the following:

Can I freely write $f(x)=\sum _{n=0} ^{\infty} c_n x^n$?
 
  • #4
OhMyMarkov said:
... I meant the following...

... can I freely write $f(x)=\sum _{n=0} ^{\infty} c_n x^n$?

If the series converges inside and diverges outside a circle of radius $R$, then for $|x|<R$ the answer is YES!... An interesting further question is: what about $|x|=R$?... In that case the answer is: YES at the points of the circle where the series converges!...

Kind regards

$\chi$ $\sigma$
 
  • #5
chisigma said:
If the series converges inside and diverges outside a circle of radius $R$, then for $|x|<R$ the answer is YES!... An interesting further question is: what about $|x|=R$?... In that case the answer is: YES at the points of the circle where the series converges!...

How important the concept is, is demonstrated by the following 'beautiful example'. In order to 'complete' [improbable!...] the discussion started in...

http://www.mathhelpboards.com/f13/never-ending-dispute-2060/

... why not try to answer the question: what is $\varphi(x)=x^{x}$ at $x=0$?... Avoiding wasting time on assertions like '$0^{0}$ is nonsense' or something like that, we first use the identity $\displaystyle x^{x}= e^{x\ \ln x}$, so that we have to find the value of the function $x\ \ln x$ at $x=0$. At first it seems that we are in the same situation, because $0 \cdot (- \infty)$ is 'indeterminate', but we don't get discouraged and look for the Taylor expansion of the function $x\ \ln x$ around $x=1$. Without great effort we find...

$\displaystyle \psi (x)= x\ \ln x = (x-1) + \sum_{n=2}^{\infty} (-1)^{n}\ \frac{(x-1)^{n}}{n\ (n-1)}$ (1)

Very well!... It is relatively easy to 'discover' that (1) converges inside a circle of radius $R=1$, but what happens at $x=0$, which lies on the circle?... In little time we obtain...

$\displaystyle \psi(0)= -1 + \frac{1}{2 \cdot 1} + \frac{1}{3 \cdot 2} + ... + \frac{1}{n\ (n-1)} + ...$ (2)

Now we have...

$\displaystyle \sum_{n=2}^{\infty} \frac{1}{n\ (n-1)} = \sum_{n=2}^{\infty} \left(\frac{1}{n-1}-\frac{1}{n}\right) = 1$ (3)

... so that $\psi(0)=0 \implies \varphi(0)= 1$... one more dart on target!...

Kind regards

$\chi$ $\sigma$
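As a side note, the telescoping argument and the limit $x^x \to 1$ as $x \to 0^+$ can be checked numerically; the following is a minimal Python sketch (the helper name `psi0_partial` is purely illustrative, not from the thread):

```python
# Partial sums of psi(0) = -1 + sum_{n>=2} 1/(n(n-1)); the series
# telescopes to -1 + 1 = 0, matching lim_{x->0+} x*ln(x) = 0.
def psi0_partial(N):
    return -1.0 + sum(1.0 / (n * (n - 1)) for n in range(2, N + 1))

for N in (10, 100, 1000):
    print(N, psi0_partial(N))   # tends to 0 (the partial sum is exactly -1/N)

# The same conclusion seen directly: x**x = exp(x*ln x) -> 1 as x -> 0+.
for x in (1e-2, 1e-4, 1e-8):
    print(x, x ** x)            # tends to 1
```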
 
  • #6
OhMyMarkov said:
Suppose $f(x)$ can be written as $f(x)=P_n(x)+R_n(x)$ where the first term on the RHS is the Taylor polynomial and the second term is the remainder.
If the sum $\sum _{n=0} ^{\infty} c_n x^n$ converges for $|x|<R$, does this mean I can freely write $f(x)=\sum _{n=0} ^{\infty} c_n x^n$?
In order to be able to write $f(x)=\sum _{n=0} ^{\infty} c_n x^n$ (where the $c_n$ are the Taylor series coefficients, $c_n = f^{(n)}(0)/n!$), you need to know that $R_n(x)$ (in the formula $f(x)=P_n(x)+R_n(x)$) tends to 0 as $n\to\infty.$ It is not sufficient to know that the series $\sum _{n=0} ^{\infty} c_n x^n$ converges, because unfortunately it can sometimes happen that the series converges to a sum different from $f(x).$

The classic example of this is the function $f(x) = \begin{cases}e^{-1/x}&\text{if }x>0, \\ 0 &\text{if }x\leqslant 0.\end{cases}$ It can be shown that this function has the property that its Taylor series about the point 0 converges to the identically zero function. But the function itself is not identically zero. What happens in that case is that all the coefficients in the Taylor series are 0. So in the formula $f(x)=P_n(x)+R_n(x)$, $P_n(x)$ is always the zero polynomial and $R_n(x)$ is always equal to $f(x).$ Thus $R_n(x)$ does not tend to 0 as $n\to\infty$, and hence $P_n(x)$ does not converge to $f(x).$
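As a hedged illustration (not part of the original post), one can check with SymPy that all one-sided derivatives of $e^{-1/x}$ at $0$ vanish, so every Taylor coefficient is $0$ even though the function is not identically zero:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1/x)   # the x > 0 branch of the piecewise function above

# Each one-sided derivative at 0 vanishes, so every Taylor coefficient
# c_n = f^(n)(0)/n! is 0 and P_n(x) is the zero polynomial for every n.
for n in range(5):
    print(n, sp.limit(sp.diff(f, x, n), x, 0, dir='+'))   # all 0

# Yet the function is not identically zero for x > 0:
print(f.subs(x, sp.Rational(1, 2)))   # exp(-2), not 0
```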

OhMyMarkov said:
Can I also use the fact that the sum of continuous functions over a domain (in this case, $|x|<R$) is continuous, and that the sum of differentiable functions over a domain is differentiable?
Yes, those things are both true, provided that you mean a finite sum. If you allow infinite sums then those statements can both go wrong.
 
  • #7
Opalg said:
In order to be able to write $f(x)=\sum _{n=0} ^{\infty} c_n x^n$ (where the $c_n$ are the Taylor series coefficients, $c_n = f^{(n)}(0)/n!$), you need to know that $R_n(x)$ (in the formula $f(x)=P_n(x)+R_n(x)$) tends to 0 as $n\to\infty.$ It is not sufficient to know that the series $\sum _{n=0} ^{\infty} c_n x^n$ converges, because unfortunately it can sometimes happen that the series converges to a sum different from $f(x).$

The classic example of this is the function $f(x) = \begin{cases}e^{-1/x}&\text{if }x>0, \\ 0 &\text{if }x\leqslant 0.\end{cases}$ It can be shown that this function has the property that its Taylor series about the point 0 converges to the identically zero function. But the function itself is not identically zero...

I'm afraid I cannot agree with this statement!... As explained in...

Analytic Function -- from Wolfram MathWorld

... a function $f(z)$ is analytic at $z=z_{0}$ if and only if it is differentiable at $z=z_{0}$ and, because in this case $f(z)$ has derivatives of all orders at $z=z_{0}$, it admits a Taylor expansion at $z=z_{0}$ and the series converges to $f(z)$ everywhere inside a circle with centre $z_{0}$ and radius $R$. The 'classical example'...

$f(z) = \begin{cases}e^{-1/ z^{2}}&\text{if } z \ne 0, \\ 0 &\text{if } z=0 \end{cases}$ (1)

... doesn't have a complex derivative at $z=0$, so its Taylor expansion doesn't exist [in fact what exists is the Laurent expansion around $z=0$...]. Regarding the 'mathematical material' contained in Wikipedia, in my opinion it is not all reliable, and I strongly recommend 'Monster Wolfram'...

Kind regards

$\chi$ $\sigma$
 
  • #8
Now I'm thinking: what if the $c_n$ don't really represent coefficients "extracted" by Taylor's formula, and $\sum c_n x^n$ is just an ordinary power series that converges for $x$ inside the radius of convergence? Then we do not have to worry about the remainder, given that the power series converges, right?
 
  • #9
chisigma said:
I'm afraid I cannot agree with this statement!... As explained in...

Analytic Function -- from Wolfram MathWorld

... a function $f(z)$ is analytic at $z=z_{0}$ if and only if it is differentiable at $z=z_{0}$ and, because in this case $f(z)$ has derivatives of all orders at $z=z_{0}$, it admits a Taylor expansion at $z=z_{0}$ and the series converges to $f(z)$ everywhere inside a circle with centre $z_{0}$ and radius $R$. The 'classical example'...

$f(z) = \begin{cases}e^{-1/ z^{2}}&\text{if } z \ne 0, \\ 0 &\text{if } z=0 \end{cases}$ (1)

... doesn't have a complex derivative at $z=0$, so its Taylor expansion doesn't exist [in fact what exists is the Laurent expansion around $z=0$...]. Regarding the 'mathematical material' contained in Wikipedia, in my opinion it is not all reliable, and I strongly recommend 'Monster Wolfram'...

Kind regards

$\chi$ $\sigma$
There is of course a major difference between real differentiability and complex differentiability. The OP does not say whether the function in this thread is defined on the real or the complex numbers. If it is defined on the complex numbers then it is true that a function that is differentiable at some point will always have a Taylor series in some neighbourhood of that point, and the Taylor series will converge to the function. I was answering the question on the assumption that it referred to a function of a real variable, where the situation is very different. As I explained above, a function on the real line can be (infinitely) differentiable there, with a convergent Taylor series whose sum is not equal to the function.
 
  • #10
OhMyMarkov said:
Now I'm thinking: what if the $c_n$ don't really represent coefficients "extracted" by Taylor's formula, and $\sum c_n x^n$ is just an ordinary power series that converges for $x$ inside the radius of convergence? Then we do not have to worry about the remainder, given that the power series converges, right?

Right!... It is important to restate what was written in your original post...

$\displaystyle f(x)= P_{n} (x) + R_{n} (x)$ (1)

... where $\displaystyle P_{n} (x) = \sum_{k=0}^{n} \frac{f^{(k)} (0)}{k!}\ x^{k}$ and $R_{n} (x)$ is a function that can take several different forms [Peano, Lagrange, Cauchy, etc...] but in any case tends to 0 as n tends to infinity. That shows clearly that proposing as a 'counterexample' a Taylor expansion of f(x) that 'doesn't converge to f(x)' [i.e. where $R_{n} (x)$ doesn't tend to 0 as n tends to infinity...] is at least inappropriate...

Kind regards

$\chi$ $\sigma$
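For a concrete case where the Lagrange form of the remainder mentioned above really does tend to 0, here is a minimal Python sketch for $f(x)=e^{x}$ expanded about $0$, using the standard estimate $|R_n(x)| \le e^{|x|}|x|^{n+1}/(n+1)!$ (the helper names are illustrative only):

```python
import math

# Lagrange remainder for f(x) = e^x about 0: |R_n(x)| <= e^{|x|} |x|^{n+1} / (n+1)!,
# which tends to 0 as n grows, so here P_n(x) really does converge to f(x).
def taylor_partial(x, n):
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

def lagrange_bound(x, n):
    return math.exp(abs(x)) * abs(x) ** (n + 1) / math.factorial(n + 1)

x = 2.0
for n in (2, 5, 10, 20):
    actual = abs(math.exp(x) - taylor_partial(x, n))
    print(n, actual, lagrange_bound(x, n))   # actual error stays below the bound
```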
 
  • #11
A remarkable contribution to this argument comes from an old acquaintance of ours... Pauls Online Notes : Calculus II - Taylor Series

We report...

To determine a condition that must be true in order for a Taylor series to exist for a function, let's first define the nth degree Taylor polynomial of $f(x)$ as... $\displaystyle T_{n}(x)= \sum_{i=0}^{n} \frac{f^{(i)} (a)}{i!}\ (x-a)^{i}$ (1)

Next the remainder is defined as...

$\displaystyle R_{n} (x) = f(x) - T_{n} (x)$

... so that...

$\displaystyle f(x) = T_{n} (x) + R_{n} (x)$ (2)

We now have the following...

Theorem

Suppose that (2) holds. Then if...

$\displaystyle \lim_{n \rightarrow \infty} R_{n} (x)=0$

... for $|x-a|<r$, then...

$\displaystyle f(x)= \sum_{i=0}^{\infty} \frac{f^{(i)} (a)}{i!}\ (x-a)^{i}$

... for $|x-a|<r$.

Kind regards

$\chi$ $\sigma$
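A quick numerical illustration of the theorem quoted above, using $f(x)=\sin x$ and $a=0$: a minimal Python sketch, where `sin_taylor` is an illustrative helper rather than anything from the thread.

```python
import math

# T_n(x) for sin(x) about 0: only odd powers appear, with alternating signs.
def sin_taylor(x, n):
    return sum((-1) ** (i // 2) * x ** i / math.factorial(i)
               for i in range(n + 1) if i % 2 == 1)

# R_n(x) = sin(x) - T_n(x) shrinks to 0 for every fixed x, so by the theorem
# the Taylor series converges to sin(x) on the whole real line.
x = 3.0
for n in (3, 7, 15, 25):
    print(n, abs(math.sin(x) - sin_taylor(x, n)))   # remainder tends to 0
```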
 
  • #12
OhMyMarkov said:
Now I'm thinking: what if the $c_n$ don't really represent coefficients "extracted" by Taylor's formula, and $\sum c_n x^n$ is just an ordinary power series that converges for $x$ inside the radius of convergence? Then we do not have to worry about the remainder, given that the power series converges, right?
That is correct. If a function is defined by a power series, $f(x) = \sum c_nx^n$, then the pathology I mentioned in previous comments cannot occur. So long as you stay inside the radius of convergence, the series will converge to the function. What is more, the function will be differentiable, and you can find its derivative by differentiating the power series term by term, $f'(x) = \sum nc_nx^{n-1}$, and the power series for $f'(x)$ will have the same radius of convergence as the power series for $f(x).$
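A hedged illustration of the last point, using the geometric series $\sum_{n\ge 0} x^n = 1/(1-x)$ (an example chosen here, not taken from the thread): differentiating term by term inside $|x|<1$ reproduces $1/(1-x)^2$.

```python
# Term-by-term differentiation of f(x) = sum_{n>=0} x^n = 1/(1-x), |x| < 1:
# f'(x) should equal sum_{n>=1} n x^(n-1) = 1/(1-x)^2 inside the radius.
def f_prime_series(x, terms=1000):
    return sum(n * x ** (n - 1) for n in range(1, terms + 1))

for x in (0.1, 0.5, 0.9):
    exact = 1.0 / (1.0 - x) ** 2
    print(x, exact, f_prime_series(x))   # the two agree inside |x| < 1
```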
 

FAQ: Taylor Series: Exploring Properties & Applications

What is a Taylor series?

A Taylor series is a mathematical representation of a function as an infinite sum of terms. It is used to approximate a function with a polynomial, making it easier to work with and understand.

What are the properties of a Taylor series?

The main properties of a Taylor series include the fact that it is centred around a specific point, that it converges within a certain radius of convergence (and equals the original function there when the remainder tends to zero), and that it can be used to approximate the function to any desired degree of accuracy.

How is a Taylor series calculated?

A Taylor series is calculated using the derivatives of the original function at the center point. These derivatives are then used to determine the coefficients of each term in the infinite sum.
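For example, in Python with SymPy (a minimal sketch, using $e^x$ expanded about $0$):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)

# Coefficients c_n = f^(n)(0)/n! computed directly from the derivatives:
coeffs = [sp.diff(f, x, n).subs(x, 0) / sp.factorial(n) for n in range(5)]
print(coeffs)                    # [1, 1, 1/2, 1/6, 1/24]

# The same expansion via SymPy's built-in series:
print(sp.series(f, x, 0, 5))     # 1 + x + x**2/2 + x**3/6 + x**4/24 + O(x**5)
```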

What are some applications of Taylor series?

Taylor series have many applications in mathematics and physics. They are used to approximate functions, solve differential equations, and analyze the behavior of systems. They are also used in computer graphics to create smooth curves and surfaces.

What are the limitations of a Taylor series?

One limitation of a Taylor series is that it can only approximate a function within a certain radius of the center point. Outside of this radius, the approximation becomes less accurate. Additionally, Taylor series may not converge for certain functions, making it difficult to use in those cases.
