Derive using Taylor series/Establish error term

In summary, the thread is about deriving a finite-difference formula using Taylor series and establishing its error term. The formula in question is f'(x) ≈ (1/(2h))[4f(x+h) - 3f(x) - f(x+2h)]. The original poster is unsure how to start, since the professor gave no examples and did not require a textbook. A responder asks whether they know what a Taylor series is and suggests applying it to f(x+h) and f(x+2h). The poster replies that it has been several years since they worked with Taylor series and that the form of the formula confuses them, and the thread ends with a request for further help.
  • #1
trouty323

Homework Statement



Derive the following formula using Taylor series and then establish the error terms for each.

Homework Equations



f'(x) ≈ (1/(2h)) [4*f(x + h) - 3*f(x) - f(x + 2h)]

The Attempt at a Solution



I honestly have no idea how to go about deriving this. The professor did not require a book for this class, and he never did an example. Any help would be greatly appreciated.
 
  • #2
trouty323 said:

Homework Statement



Derive the following formula using Taylor series and then establish the error terms for each.

Homework Equations



f'(x) ≈ (1/(2h)) [4*f(x + h) - 3*f(x) - f(x + 2h)]

The Attempt at a Solution



I honestly have no idea how to go about deriving this. The professor did not require a book for this class, and he never did an example. Any help would be greatly appreciated.

Do you know what a Taylor series is? If so, apply it to f(x+h) and f(x+2h), keeping just a few terms in each expansion.
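
For reference, keeping terms through h^3, the expansions about x are
$$f(x+h) = f(x) + h f'(x) + \frac{h^2}{2} f''(x) + \frac{h^3}{6} f'''(x) + \cdots$$
$$f(x+2h) = f(x) + 2h f'(x) + 2h^2 f''(x) + \frac{4h^3}{3} f'''(x) + \cdots$$
Then combine them so that the terms you don't want cancel.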

RGV
 
  • #3
Ray Vickson said:
Do you know what a Taylor series is? If so, apply it to f(x+h) and f(x+2h), keeping just a few terms in each expansion.

RGV

Honestly, it's been several years since I've worked with Taylor series. The class is Numerical Methods. I'm confused by how the formula is set up. Everything I've looked up online looks nothing like this.
 
  • #4
Anybody?
 

FAQ: Derive using Taylor series/Establish error term

What is a Taylor series?

A Taylor series is a representation of a function as an infinite sum of terms computed from the function's derivatives at a single point. It is used to approximate the value of the function near that point using only the derivative values at that point.
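
In symbols, the Taylor series of f about a point a is
$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots$$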

How do you use a Taylor series to derive a formula?

To approximate a function with a Taylor series, first compute its derivatives at the expansion point, then substitute them into the Taylor series formula and keep as many terms as the desired accuracy requires. To derive a finite-difference formula like the one above, expand each evaluation point (such as f(x+h) and f(x+2h)) about x and combine the expansions with coefficients chosen so that the unwanted terms cancel and the desired derivative remains.
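
For the formula in this thread, the combination 4f(x+h) - 3f(x) - f(x+2h) is chosen so that the f(x) and f''(x) terms cancel:
$$4f(x+h) - 3f(x) - f(x+2h) = 2h\,f'(x) - \frac{2h^3}{3} f'''(x) + \cdots$$
Dividing by 2h recovers f'(x) plus terms of order h^2.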

What is the error term in a Taylor series?

The error term in a Taylor series is the difference between the actual value of the function at a particular point and the value obtained by using the Taylor series approximation. It represents the accuracy of the approximation and decreases as more terms in the series are added.
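
For the three-point forward-difference formula above, carrying the expansions one term further and writing the remainder in Lagrange (mean value) form gives
$$f'(x) = \frac{4f(x+h) - 3f(x) - f(x+2h)}{2h} + \frac{h^2}{3} f'''(\xi), \qquad \xi \in (x,\, x+2h),$$
so the truncation error is O(h^2).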

Why is the error term important in a Taylor series?

The error term is important because it allows us to determine the accuracy of our Taylor series approximation. By calculating the error term, we can determine how many terms we need to include in the series to get a desired level of accuracy.
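
As a quick sanity check, a few lines of Python confirm the O(h^2) behavior of the formula; the test function exp and the step sizes here are arbitrary choices, not part of the original problem.

Code:
from math import exp

def fd(f, x, h):
    # three-point forward-difference approximation of f'(x)
    return (4*f(x + h) - 3*f(x) - f(x + 2*h)) / (2*h)

x = 1.0
for h in (0.1, 0.05, 0.025, 0.0125):
    err = abs(fd(exp, x, h) - exp(x))   # exact derivative of exp is exp
    print(f"h = {h:<7g} error = {err:.3e}")

# Halving h should cut the error by roughly a factor of 4, consistent with O(h^2).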

Can a Taylor series be used to approximate any function?

No. A Taylor series exists only for functions that are infinitely differentiable at the expansion point, and even then the series is guaranteed to reproduce the function only where the function is analytic. If the required derivatives do not exist, the series cannot be formed at all, and a non-analytic function may not be well approximated by its Taylor polynomials.
