How to use Taylor series to represent any well-behaved f(x)

In summary, the conversation discusses the use of Taylor series to represent a function as an infinite series. It explains how a Taylor series approximation is obtained by truncating the series after a finite number of terms, which makes calculations easier. The conversation also mentions Taylor's Theorem, which bounds the error of the approximation at any point. The use of Taylor series is illustrated with the example of $y = \sin(x)$, and the question is raised of how it applies to polynomials.
  • #1
DeusAbscondus
Does one evaluate $x$ at $x=0$ for the entire series? (If so, wouldn't that have the effect of "zeroing" all the coefficients when one computes?)
Does one only raise the value of $k$ by $1$ at each iteration,
thereby raising the order of the derivative at each iteration?
$$\sum_{k=0}^{\infty}\frac{f^{(k)}(0)}{k!} x^k= f(0)+\frac{df}{dx}\bigg|_0 \ x + \frac{1}{2!}\frac{d^2f}{dx^2}\bigg|_0 \ x^2+ \frac{1}{3!}\frac{d^3f}{dx^3}\bigg|_0 \ x^3+ \cdots$$

I have no experience with series or sequences, so, I know I have to remedy this gap in my knowledge.
In the interim, however, I am currently enrolled in a Math course that looks at Calculus by beginning with Taylor series.
I am an adult beginner at Math, having done an introductory crash course in Calculus last year; I wanted to flesh this out: hence my current enrolment.

But I am at a loss to know how to manipulate the notation above and would appreciate a worked solution for some simple $f(x)$ (I won't nominate one, so as to preclude the possibility of my cheating on some set work)
I just need to see this baby in action with a "well-behaved function" of someone else's choosing, with some notes attached if that someone would be so kind.

Thanks,
Deo Abscondo
 
Last edited:
  • #2
Re: How to use Taylor series to represent any well-behaved $f(x)$

Yes, the idea is that as you add more and more terms to the Taylor series, the series approximation becomes better and better and fits the function more closely for values farther away from $x = 0$.

For instance, let's take the venerable $y = \sin(x)$ function. Let's plot it:

[Plot: $y = \sin(x)$]


The first term of the Taylor series (i.e. the Taylor series approximation at $k = 0$) is just $f(0) = 0$, hence:

[Plot: $\sin(x)$ and the $k = 0$ approximation $y = 0$]


Of course, this approximation sucks. Let's try $k = 1$. Then the Taylor series approximation for $k = 1$ is:
$$\sum_{k = 0}^1 \frac{f^{(k)} (0)}{k!} x^k = \sin(0) + x \cos(0) = x$$
And our "first-order approximation" for $\sin(x)$ is the curve $y = x$, as illustrated below:

[Plot: $\sin(x)$ and the first-order approximation $y = x$]


Still not an awesome approximation, but it works pretty well for $x$ close to zero. In fact, this is known as the small-angle approximation, which says that $\sin(x) \approx x$ for small $x$.
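For a concrete sense of how good it is (numbers mine, not from the original post):
$$\sin(0.1) = 0.0998334\ldots \approx 0.1, \qquad |\sin(0.1) - 0.1| < 2 \times 10^{-4}.$$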

What about the second-order approximation $k = 2$, which is given by:
$$\sum_{k = 0}^2 \frac{f^{(k)} (0)}{k!} x^k = \sin(0) + x \cos(0) - \frac{\sin(0)}{2!} x^2 = x$$
It turns out that the new term becomes zero. Ok.. fine.. that happens, so what about $k = 3$? Now we see that:
$$\sum_{k = 0}^3 \frac{f^{(k)} (0)}{k!} x^k = x - \frac{\cos(0)}{3!} x^3 = x - \frac{x^3}{3!}$$
And let's plot this:

[Plot: $\sin(x)$ and the third-order approximation $y = x - x^3/3!$]


Wow! That's a really good approximation for all $|x| < 1$. And we see a pattern: repeatedly differentiating $\sin(x)$ gives you $\cos(x)$, $-\sin(x)$, $-\cos(x)$, $\sin(x)$, endlessly. Every even-$k$ term becomes zero because the even-order derivatives are $\pm\sin(x)$, which vanish at $0$, so only the odd-order terms actually matter. We conclude that:
$$\sin(x) = \sum_{k = 0}^\infty \frac{f^{(k)} (0)}{k!} x^k = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots$$
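Equivalently, indexing over just the surviving odd powers (a standard compact form, not written out in the post above):
$$\sin(x) = \sum_{n = 0}^{\infty} \frac{(-1)^n}{(2n+1)!}\, x^{2n+1}$$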
Here's the Taylor series approximation for $k = 7$. Check it out:

[Plot: $\sin(x)$ and the Taylor approximation of order $k = 7$]


And here's the one for $k = 9$:

[Plot: $\sin(x)$ and the Taylor approximation of order $k = 9$]


As you add more and more terms, the approximation tends towards the function for larger and larger values of $x$. In the limit, the Taylor series is equal to the original function.
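If you'd like to check this numerically rather than graphically, here is a quick Python sketch (my own illustration; the helper name taylor_sin is made up for this example, it is not from any library):

Code:
import math

def taylor_sin(x, order):
    # Truncated Taylor series of sin about 0, keeping powers up to x**order.
    # Only the odd powers survive: x - x^3/3! + x^5/5! - ...
    total = 0.0
    sign = 1.0
    for k in range(1, order + 1, 2):
        total += sign * x**k / math.factorial(k)
        sign = -sign
    return total

# The approximation improves as the order grows, even away from x = 0.
for order in (1, 3, 7, 9):
    print(order, taylor_sin(2.0, order), math.sin(2.0))

Running it, the order-9 value already agrees with math.sin(2.0) to about three decimal places.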

So in other words, a Taylor series is another representation of a function, as an infinite series (a sum of infinitely many terms). A Taylor series approximation is the Taylor series truncated at a finite number of terms, which has the nice property of approximating the function around $x = 0$, and is often easier to calculate and work with, especially in physics where approximations are often used.

There is in fact a theorem that bounds the error of the Taylor series approximation at any point $x$ in terms of $k$; as $k$ tends to infinity, the error tends to zero. This is Taylor's Theorem.
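One standard way to write that bound (the Lagrange form of the remainder; the notation $M_{k+1}$ is mine) is
$$\left| f(x) - \sum_{j = 0}^{k} \frac{f^{(j)}(0)}{j!} x^j \right| \le \frac{M_{k+1}}{(k+1)!} |x|^{k+1},$$
where $M_{k+1}$ is any bound on $|f^{(k+1)}|$ between $0$ and $x$. For $\sin(x)$ every derivative is bounded by $1$, so the error of the order-$k$ approximation is at most $|x|^{k+1}/(k+1)!$.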

EDIT: uploaded images to imgur for perennity. W|A does strange things to externally linked images.
 
Last edited:
  • #3
Re: How to use Taylor series to represent any well-behaved $f(x)$

Beautiful to read!
Thanks kindly Bacterius.

But how would this work for a polynomial?
I mean, wouldn't the evaluation at $x=0$ lead to "zeroing" all the coefficients, resulting in nonsense?

If I take the case of $y=x^2+2x$ is this amenable to being described by the same series?
$$\sum_{k=0}^{\infty}\ \frac{f^k (0)}{k!}=0+ 2(0) + \frac{1}{2!}(2\cdot 0) ...$$
This doesn't seem to work! All I get is an infinite string of 0s!
 
Last edited:
  • #4
Re: How to use Taylor series to represent any well-behaved $f(x)$

Taylor series of polynomials have the interesting property that they are in fact equal to the polynomial itself, and all extra terms are zero. In other words, the Taylor series for $x^2 + 2x$ is $x^2 + 2x + 0 + 0 + \cdots$. To see why:
$$f(x) = x^2 + 2x$$
$$f'(x) = 2x + 2$$
$$f''(x) = 2$$
$$f'''(x) = 0$$
... and any further differentiation still gives zero.

So:
$$\sum_{k = 0}^\infty \frac{f^{(k)} (0)}{k!} x^k = \frac{0^2 + 2 \cdot 0}{0!} x^0 + \frac{2 \cdot 0 + 2}{1!} x^1 + \frac{2}{2!} x^2 + 0 + \cdots = 0 + 2x + x^2 + 0 + \cdots = x^2 + 2x$$
So, yes, it still works, although it would seem to be less useful for polynomials than for other functions (polynomials are already pretty simple and easy to compute; I don't think "reducing" them to infinite series generally helps).

To calculate Taylor series, I recommend you write down the iterated derivatives of your function $f(x)$ and then plug in the numbers. Remember, $f^{(k)} (0)$ means "the $k$th derivative of $f$ evaluated at $x = 0$". The $x$ in your Taylor series is a different $x$ and is *not* equal to zero.
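If you want to let a computer do the plugging and chugging, here's a small sketch using the sympy library (my own illustration of the recipe above, not something from the thread):

Code:
import sympy as sp

x = sp.symbols('x')
f = x**2 + 2*x              # try sp.sin(x) here as well

# Coefficient of x**k is the k-th derivative of f evaluated at 0, divided by k!
series = sum(sp.diff(f, x, k).subs(x, 0) / sp.factorial(k) * x**k for k in range(6))
print(sp.expand(series))    # prints x**2 + 2*x: the extra terms are all zero

With f = sp.sin(x) and a larger range you recover the x - x**3/6 + x**5/120 - ... pattern from earlier.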

Also, there is a generalization of the Taylor series which is centered on an arbitrary value of $x$ instead of $x = 0$. I'll let you work out the general expression, though; it's an interesting exercise.
 
  • #5
Re: How to use Taylor series to represent any well-behaved $f(x)$

That has nailed it for me Bacterius!
Mightily obliged to you.

As per usual, the problem has vanished under the gaze of fresh eyes.

(Going now to chew on this strong meat with a cup of medicinal wine "to aid the digestion")

Cheers,
D'abs

Bacterius said:
Taylor series of polynomials have the interesting property that they are in fact equal to the polynomial itself, and all extra terms are zero. In other words, the Taylor series for $x^2 + 2x$ is $x^2 + 2x + 0 + 0 + \cdots$. To see why:
$$f(x) = x^2 + 2x$$
$$f'(x) = 2x + 2$$
$$f''(x) = 2$$
$$f'''(x) = 0$$
... and any further differentiation still gives zero.
 
  • #6
Re: How to use Taylor series to represent any well-behaved $f(x)$

Bacterius said:
The $x$ in your Taylor series is a different $x$ and is *not* equal to zero.

Okay, basically, I'm in the clear ...
but just what is this "different x"?

If it is distinct from the $x$ of my polynomial, by what principle/rule do I distinguish the two when I come to compute the polynomial?

D'abs
 
  • #7
Re: How to use Taylor series to represent any well-behaved $f(x)$

DeusAbscondus said:
Okay, basically, I'm in the clear ...
but just what is this "different x"?

If it is distinct from the $x$ of my polynomial, by what principle/rule do I distinguish the two when I come to compute the polynomial?

D'abs

The $x$ in the Taylor series is the $x$ at which you are evaluating the Taylor series (or your truncated approximation of it). The derivatives are always evaluated at $0$ for this version of the Taylor series (sorry, I agree it was a bit confusing: there is only one $x$; it's just that the derivatives are evaluated at a constant).

Essentially you are numerically evaluating the derivatives of your original function at $0$ and using the resulting values as coefficients for your Taylor series (this is not a particularly deep way to think of it, but it may help you keep track of what is what).
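For a concrete example of which number goes where (values mine): using the third-order approximation at $x = 0.5$,
$$\sin(0.5) \approx 0.5 - \frac{0.5^3}{3!} = 0.5 - 0.02083\ldots \approx 0.4792,$$
while the true value is $\sin(0.5) = 0.4794\ldots$; the derivatives were all taken at $0$, but the series itself was evaluated at $x = 0.5$.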
 
  • #8
Re: How to use Taylor series to represent any well-behaved $f(x)$

That makes sense.
Meanwhile, I've been plugging and chugging a few simple polynomial functions through the T. series: hehe! what fun! it works! (*broadly grins)

You made my day: not understanding was getting me down (as usual);
and, again as usual, once some fresh light comes and the aha! moment arrives, one feels exuberant rather than dispirited.

Have a great weekend, Bacterius!

Bacterius said:
Essentially you are numerically evaluating the derivatives of your original function at $0$ and using the resulting values as coefficients for your Taylor series (this is not a particularly deep way to think of it, but it may help you keep track of what is what).
 

FAQ: How to use Taylor series to represent any well-behaved f(x)

What is a Taylor series?

A Taylor series is a representation of a well-behaved function as an infinite sum of terms. Each term is a power of $x$ (or of $x - a$ for a series centered at $a$) of increasing degree, with a coefficient determined by the function's derivatives at that point.

How do you find the Taylor series for a function?

The Taylor series for a function is found by computing the function's derivatives at the chosen center point and plugging them in as the coefficients of the series.
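Explicitly, for a series centered at a point $a$ (the thread above works with $a = 0$, the Maclaurin case):
$$f(x) = \sum_{k = 0}^{\infty} \frac{f^{(k)}(a)}{k!} (x - a)^k$$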

Can any function be represented by a Taylor series?

No, only well-behaved functions can be represented by a Taylor series. The function must be infinitely differentiable at the center point, and the series must actually converge to the function on some interval around it, i.e. the radius of convergence must be nonzero (it can even be infinite, as it is for $\sin(x)$).

What is the radius of convergence for a Taylor series?

The radius of convergence is the distance from the center point within which the Taylor series converges to the function. It is determined by the behavior of the function and can be found using various methods, such as the ratio test.

How accurate is a Taylor series representation of a function?

The accuracy of a Taylor series representation depends on the number of terms used and the behavior of the function. As more terms are included, the approximation becomes more accurate. However, for functions with discontinuities or singularities, the Taylor series may not accurately represent the function.
