How Does the Differential df Relate to Function Approximation?

  • Thread starter ck00
In summary, the conversation discusses the definition of the total derivative, or differential, and its relation to the chain rule. The proof of this relationship is based on the multivariable Taylor expansion and a δ-ε argument. The conversation also notes that the function can be approximated to any desired degree of accuracy by working within a suitable ball.
  • #1
ck00
let f(x,y)=0
Why df=(∂f/∂x)dx + (∂f/∂y)dy?
 
  • #2
This is the definition of the total differential (also called the total derivative), as I know it.

Was that your question? Here df gives you the equation of the tangent plane, which approximates the change of the function near a point.
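Concretely (this formula is not in the thread itself, but it is the standard tangent-plane expression at a point (a,b)):

[tex]z = f(a,b) + \frac{\partial f}{\partial x}(a,b)\,(x-a) + \frac{\partial f}{\partial y}(a,b)\,(y-b)[/tex]

and df = (∂f/∂x)dx + (∂f/∂y)dy is exactly the change in z along this plane when x changes by dx and y changes by dy.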
 
  • #3
If x and y are themselves functions of a parameter, say, t, then
we can think of f(x, y)= f(x(t), y(t)) as a function of the single variable t and, by the chain rule:
[tex]\frac{df}{dt}= \frac{\partial f}{\partial x}\frac{dx}{dt}+ \frac{\partial f}{\partial y}\frac{dy}{dt}[/tex]

And then, the usual definition of the "differential" as df= (df/dt)dt gives
[tex]df= \frac{\partial f}{\partial x}\frac{dx}{dt}dt+ \frac{\partial f}{\partial y}\frac{dy}{dt}dt= \frac{\partial f}{\partial x}dx+ \frac{\partial f}{\partial y} dy[/tex]
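As a quick sanity check (my own addition, not part of the post above), the identity can be verified symbolically with sympy; the particular choices f(x,y) = x²y + sin(y), x(t) = cos t, y(t) = t² below are arbitrary examples.

[code]
# Symbolic sanity check of the multivariable chain rule with sympy.
# The specific f, x(t), y(t) are arbitrary example choices.
import sympy as sp

t = sp.symbols('t')
x, y = sp.symbols('x y')

f = x**2 * y + sp.sin(y)   # example f(x, y)
xt = sp.cos(t)             # example x(t)
yt = t**2                  # example y(t)

# Left-hand side: differentiate f(x(t), y(t)) directly with respect to t.
lhs = sp.diff(f.subs({x: xt, y: yt}), t)

# Right-hand side: (df/dx) dx/dt + (df/dy) dy/dt, evaluated along the curve.
rhs = (sp.diff(f, x) * sp.diff(xt, t)
       + sp.diff(f, y) * sp.diff(yt, t)).subs({x: xt, y: yt})

print(sp.simplify(lhs - rhs))  # prints 0 for this example
[/code]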
 
  • #4
HallsofIvy said:
If x and y are themselves functions of a parameter, say, t, then we can think of f(x, y)= f(x(t), y(t)) as a function of the single variable t and, by the chain rule:
[tex]\frac{df}{dt}= \frac{\partial f}{\partial x}\frac{dx}{dt}+ \frac{\partial f}{\partial y}\frac{dy}{dt}[/tex]

And then, the usual definition of the "differential" as df= (df/dt)dt gives
[tex]df= \frac{\partial f}{\partial x}\frac{dx}{dt}dt+ \frac{\partial f}{\partial y}\frac{dy}{dt}dt= \frac{\partial f}{\partial x}dx+ \frac{\partial f}{\partial y} dy[/tex]

But how can I prove this [tex]\frac{df}{dt}= \frac{\partial f}{\partial x}\frac{dx}{dt}+ \frac{\partial f}{\partial y}\frac{dy}{dt}[/tex]?

I know the basic operations of partial differentiation, but I don't quite understand the theory behind them. Are there any proofs?
 
  • #5
A very informal (and possibly incorrect) proof I just thought of:

[tex]\Delta f = f(x(t+h),y(t+h))-f(x(t),y(t)) = \big[f(x(t+h),y(t+h))-f(x(t),y(t+h))\big] + \big[f(x(t),y(t+h))-f(x(t),y(t))\big][/tex]

Dividing by h and letting h → 0, the first bracket tends to [tex]\frac{\partial f}{\partial x}\frac{dx}{dt}[/tex] and the second to [tex]\frac{\partial f}{\partial y}\frac{dy}{dt}[/tex], so

[tex]\frac{df}{dt}=\frac{\partial f}{\partial x}\frac{dx}{dt}+\frac{\partial f}{\partial y}\frac{dy}{dt}\quad\Leftrightarrow\quad df=\frac{\partial f}{\partial x}dx+\frac{\partial f}{\partial y}dy[/tex]

For something more formal, see http://math.uc.edu/~halpern/Calc.4/Handouts/Proofchainrule2dim.pdf
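To see the telescoping argument numerically (my own sketch; f, x(t), y(t) and the point t0 are arbitrary example choices), divide each bracketed difference by h: the first tends to (∂f/∂x)(dx/dt) and the second to (∂f/∂y)(dy/dt).

[code]
# Numerical illustration of the telescoping-sum argument for the chain rule.
import math

def f(x, y): return x**2 * y + math.sin(y)   # example f
def x(t):    return math.cos(t)              # example x(t)
def y(t):    return t**2                     # example y(t)

t0 = 0.7
for h in (1e-1, 1e-2, 1e-3, 1e-4):
    # Change due to x alone, with y held at its updated value:
    d1 = (f(x(t0 + h), y(t0 + h)) - f(x(t0), y(t0 + h))) / h
    # Change due to y alone:
    d2 = (f(x(t0), y(t0 + h)) - f(x(t0), y(t0))) / h
    print(h, d1, d2, d1 + d2)   # d1 + d2 approaches df/dt as h shrinks
[/code]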
 
  • #6
ck00 said:
But how can I prove this [tex]\frac{df}{dt}= \frac{\partial f}{\partial x}\frac{dx}{dt}+ \frac{\partial f}{\partial y}\frac{dy}{dt}[/tex]?

I know the basic operations of partial differentiation, but I don't quite understand the theory behind them. Are there any proofs?

It's the chain rule, which is used all over calculus. You can find a proof on Wikipedia.

let f(x,y)=0
Why df=(∂f/∂x)dx + (∂f/∂y)dy?

That can be deduced by writing f(x,y) as a Taylor series (for multivariable functions) and keeping terms up to the first-order (gradient) term. For this, f only has to be twice differentiable in a neighbourhood of (a,b).

[tex]f(x,y) \approx f(a,b) + \left(\frac{\partial f}{\partial x}(a,b),\ \frac{\partial f}{\partial y}(a,b)\right)\cdot (x-a,\ y-b)[/tex]

Putting,
[tex]x-a=\Delta x[/tex]
[tex]y-b=\Delta y[/tex]
[tex]f(x,y)-f(a,b)=\Delta f[/tex]

When [tex]\Delta x \to 0[/tex] and [tex]\Delta y \to 0[/tex], the higher-order remainder vanishes faster than the increments and you get that expression.
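A concrete case (my own addition) where the remainder is visible explicitly is f(x,y) = xy:

[tex]\Delta f = (a+\Delta x)(b+\Delta y) - ab = b\,\Delta x + a\,\Delta y + \Delta x\,\Delta y[/tex]

Since ∂f/∂x = y = b and ∂f/∂y = x = a at (a,b), the first two terms are exactly (∂f/∂x)Δx + (∂f/∂y)Δy, while the leftover ΔxΔy vanishes faster than the increments themselves, which is why df = (∂f/∂x)dx + (∂f/∂y)dy in the limit.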
 
  • #7
Now put the two previous answers together (they did the hard work, I am just chiming in), and do a δ-ε proof, showing that you can approximate the value of your function to within any ε > 0 by using the right value of δ. That is for real-valued functions. Otherwise, i.e., for R^n-valued maps, show:

i) The differential df is a linear map;

ii) ||f(x+h) - f(x) - L(x)h|| / ||h|| → 0

as ||h|| → 0 is satisfied only by the differential L(x) = df(x).

In your case, you want to show that your function can be approximated to any degree of accuracy ε > 0 by working within a ball B(x,δ), as all the other posters said.
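For instance (an illustrative sketch of my own, not from the thread), condition (ii) can be checked numerically for an example map from R^2 to R^2 by taking L(x) to be its Jacobian:

[code]
# Numerical check of (ii): ||f(x+h) - f(x) - L(x)h|| / ||h|| -> 0,
# where L(x) is the Jacobian of an example map f: R^2 -> R^2.
import numpy as np

def f(v):
    x, y = v
    return np.array([x**2 + y, np.sin(x * y)])

def jacobian(v):
    x, y = v
    return np.array([[2 * x,             1.0],
                     [y * np.cos(x * y), x * np.cos(x * y)]])

x0 = np.array([0.5, -1.2])
direction = np.array([1.0, 2.0]) / np.linalg.norm([1.0, 2.0])

for eps in (1e-1, 1e-2, 1e-3, 1e-4):
    h = eps * direction
    ratio = np.linalg.norm(f(x0 + h) - f(x0) - jacobian(x0) @ h) / np.linalg.norm(h)
    print(eps, ratio)   # the ratio shrinks toward 0 as eps -> 0
[/code]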
 

FAQ: How Does the Differential df Relate to Function Approximation?

1. What is the meaning of df=(∂f/∂x)dx + (∂f/∂y)dy?

The notation df=(∂f/∂x)dx + (∂f/∂y)dy represents the total differential of a multivariable function f. It indicates the change in the function f due to small changes in the variables x and y.

2. Why is the partial derivative (∂f/∂x)dx used in the equation?

The term (∂f/∂x)dx multiplies the partial derivative ∂f/∂x, which is the rate of change of f with respect to x while all other variables are held constant, by the small change dx in x. It appears in the equation to account for the contribution to df that comes from the change in x.

3. How is (∂f/∂y)dy different from (∂f/∂x)dx?

Both terms are built from partial derivatives, but with respect to different variables. (∂f/∂y)dy measures the contribution to df from the change in y, with all other variables held constant, while (∂f/∂x)dx measures the contribution from the change in x.

4. Can you give an example of how to use the equation df=(∂f/∂x)dx + (∂f/∂y)dy?

Sure, let's say we have the function f(x,y) = x^2 + y^2 and we are at the point (x,y) = (1,1). To estimate the change in f when x increases by 1 and y increases by 2, we use df = (∂f/∂x)dx + (∂f/∂y)dy = (2x)dx + (2y)dy. With x = 1, y = 1, dx = 1, and dy = 2, this gives df = (2)(1) + (2)(2) = 6. This is the first-order (linear) estimate of the change in f; the exact change is f(2,3) - f(1,1) = 13 - 2 = 11, which is larger because these increments are not small. That is why the differential is intended for small dx and dy.
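A quick check of this example (the point (1,1) and the comparison with the exact change are my additions for illustration):

[code]
# f(x, y) = x^2 + y^2 at the point (1, 1), with dx = 1 and dy = 2.
def f(x, y):
    return x**2 + y**2

x0, y0 = 1.0, 1.0
dx, dy = 1.0, 2.0

df = 2 * x0 * dx + 2 * y0 * dy           # linear estimate: 6
exact = f(x0 + dx, y0 + dy) - f(x0, y0)  # exact change: 13 - 2 = 11
print(df, exact)

# The estimate improves, relative to the increment size, as dx and dy shrink:
for s in (1.0, 0.1, 0.01):
    d = 2 * x0 * (s * dx) + 2 * y0 * (s * dy)
    e = f(x0 + s * dx, y0 + s * dy) - f(x0, y0)
    print(s, d, e)
[/code]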

5. In what areas of science is the equation df=(∂f/∂x)dx + (∂f/∂y)dy commonly used?

This equation is commonly used in fields such as physics, engineering, economics, and statistics where multivariable functions are used to model real-world phenomena. It is particularly useful in calculating rates of change and optimization problems.
