Why is the Derivative of e^x e^x?

In summary: the derivative of e^x is e^x itself essentially by definition, since exp is the function satisfying f'(x) = f(x) with f(0) = 1. This can also be shown from the power series, or by comparing areas under the curves of ln x and e^x. The equation f'(x) = f(x), together with that initial condition, has a unique solution; this follows from the Lipschitz condition, can be proven through linear algebra (the set of solutions forms a vector space), or can be obtained by integrating dt/t and taking the inverse of the resulting function.
  • #1
Aki
I don't understand why the derivative of e^x is e^x itself.
 
  • #2
Have you tried applying the definition of the derivative to [itex]f(x) = e^x[/itex]?

[tex]\frac{d}{dx}f(x) = \lim_{h\to 0}\frac{f(x+h) - f(x)}{h}[/tex]

--J
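Justin's hint is easy to check numerically: the difference quotient for e^x settles down to e^x itself. A minimal Python sketch (the function name `diff_quotient` is my own, for illustration):

```python
import math

def diff_quotient(f, x, h=1e-6):
    """Symmetric difference quotient approximating f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# At several points, the numerical derivative of e^x matches e^x itself.
for x in [0.0, 1.0, 2.5]:
    print(x, diff_quotient(math.exp, x), math.exp(x))
```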
 
  • #3
There are two types of answer to this.

1. Because it's the definition: exp is the function satisfying the differential equation f' = f. The question then becomes: why does it have this form, i.e. an exponential? If you were to work out the slope using Justin's hint, then you'd see that, assuming some limits exist, all exponential functions have derivatives proportional to themselves, and e is the number for which the proportionality constant is 1.

2. Assume the definition of exp(x) is its power series, and abuse analysis by differentiating term by term.
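Point 1 can be seen numerically: for a general base a, the proportionality constant (a^h - 1)/h tends to ln(a) as h shrinks, and it equals 1 exactly when a = e. A small Python sketch (the function name is mine, for illustration):

```python
import math

def slope_at_zero(a, h=1e-8):
    """Approximate the limit (a^h - 1)/h, which is the
    proportionality constant in d/dx a^x = (const) * a^x."""
    return (a**h - 1) / h

# The constant comes out as ln(a); it is 1 only for a = e.
for a in [2.0, math.e, 10.0]:
    print(a, slope_at_zero(a), math.log(a))
```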
 
  • #4
first consider the following:

the integral of ln(x) is equal to x ln x - x.
the area under the graph of e^x matches a region attached to the ln x graph: plug e^x in for x in x ln x - x, giving x e^x - e^x as the area under the ln x curve. that is the area below the curve, but we are trying to find the area above the curve on the ln x graph. the enclosing rectangle has area x(e^x), so the area of the region is x e^x - (x e^x - e^x) = e^x

I hope this logic was right.
 
  • #5
tongos said:
first consider the following

the integral of ln(x) is equal to x ln x - x.
the area under the graph of e^x matches a region attached to the ln x graph: plug e^x in for x in x ln x - x, giving x e^x - e^x as the area under the ln x curve. that is the area below the curve, but we are trying to find the area above the curve on the ln x graph. the enclosing rectangle has area x(e^x), so the area of the region is x e^x - (x e^x - e^x) = e^x

I hope this logic was right.

Not really. I fail to see the connection between the area under either of the two curves and the derivative of e^{x}.

Daniel.
 
  • #6
perhaps we should ask what the questioner knows about the function e^x.

i.e. the responders are observing that if you know two things:

1) the definition of a derivative

2) the property e^(x+h) = e^x e^h,

then the result is essentially forced on you.
 
  • #7
my post wasn't very clear, but the region of area above the ln x graph resembles the area under the curve in the e^x graph. it creates a rectangle.
 
Last edited:
  • #8
Maybe mine wasn't very clear. What does that fact (the resemblance of the areas beneath the graphs) have to do with
[tex] (e^{x})'=e^{x} [/tex] ?

Daniel.
 
  • #9
the antiderivative of e^x is equal to e^x+C. The derivative of the antiderivative is equal to e^x.
 
  • #10
I have a related question.

How can you show that the equation f'(x) = f(x) has a unique solution?
 
  • #11
Guess what, it DOES NOT... Oops, i didn't notice Matt forgot about the initial condition f(0) = 1 (which would select e^{x} from the infinity of solutions of the ODE).

Daniel.
 
  • #12
first i thought he was asking why the integral of e^x was e^x. Then i reread it and saw that he was actually asking why e^x = (e^x)'.
After this i did the pathetic "the antiderivative of a function is equal to the integral, which is equal to the area under the curve" business.
But then i thought of the method of similarities in inverse curves for finding the integral, which i thought was pretty. I was asked to find the area under the curve of arctan on the AP test, so i had to make something up.
 
  • #13
i proved, using the ln x graph, that the integral of e^x is e^x + C.
 
  • #14
However, there's a long way between the derivative of e^{x} and the area under the graph of a function... almost infinitely long...

Daniel.
 
  • #15
one can confuse derivatives and integrals only when dealing with e^x. I shouldn't drop the C, which is the constant of integration. I have a question that probably most of you guys will think is easy, but is there any other function where the derivative of the function is equal to the function itself? prove or disprove.
 
Last edited:
  • #16
No. e^{x} is the only one. Matt suggested this via the ODE, and i added the initial condition [itex] y(0)=1 [/itex].

Daniel.
 
  • #17
assume f' = f, and take the derivative of f/e^x. see what happens. then use the MVT.
 
  • #18
how about -e^(x)?


or d/dx (y·e^(x)), where y is an arbitrary constant
 
  • #19
Differential equations have the full complement of solutions that are uniquely determined according to whether they satisfy the Lipschitz condition and some other hypotheses. I'm sure Wolfram will provide the answer as ever.
 
  • #20
How can you show that the equation f'(x) = f(x) has a unique solution?

This is not an easy task, but in analysis it essentially involves making precise the intuitive argument that "if we take small steps and adjust our course according to the DE then the path we walk will be uniquely determined by where we start."

I believe the uniqueness theorem for linear ODEs can be proven through linear algebra; the set of solutions forms a vector space, yadda yadda.
 
  • #21
i just proved it above. let f, g be any two solutions of F' = F.

Then (f/g)' = [f'g-g'f]/g^2 = [fg-gf]/g^2 = 0. hence by the MVT, f/g is constant.

so f = cg, where c is constant. hence if f(a) = g(a) = 1, then f = g.

there is nothing difficult about it. or did you mean existence?

that is also easy. i.e. just integrate dt/t from 1 to x, which defines a function called L(x), monotone and differentiable on the positive real line, with non-zero derivative 1/x, hence having a differentiable inverse, called E(x), and satisfying E' = E.


done. this is basic first year calculus, (I hope). It is in my class. these are questions on the tests I give.
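The construction above can be sketched numerically: approximate L(x) = ∫₁ˣ dt/t by the trapezoid rule and invert it by bisection; the inverse behaves like exp. A rough Python illustration (the names L and E follow the post; the quadrature and bisection details are my own):

```python
import math

def L(x, n=1000):
    """L(x) = integral of dt/t from 1 to x, via the trapezoid rule."""
    h = (x - 1.0) / n
    total = 0.5 * (1.0 + 1.0 / x)          # half-weighted endpoints
    for i in range(1, n):
        total += 1.0 / (1.0 + i * h)       # interior sample points
    return total * h

def E(y, lo=1e-6, hi=100.0):
    """Inverse of L by bisection: the value E(y) with L(E(y)) = y."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if L(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# L behaves like the natural log, and its inverse E like exp:
print(L(math.e))   # close to 1
print(E(1.0))      # close to e
```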
 
Last edited:
  • #22
exercise: prove the only solutions to f' = cf are ae^(cx), where a and c are constants.


then prove the only solutions to f^(n) + a1 f^(n-1) + ... + a(n-1) f' + an f = 0,

where X^n + a1 X^(n-1) + ... + a(n-1) X + an = (X-c1)(...)(X-cn), and the ci are distinct constants, are linear combinations of e^(c1 x), ..., e^(cn x).

there is nothing deep here, just linear algebra, and the fact that the differential operators (D-c)f = f' - cf commute for different values of c.
 
  • #23
what about ln x? would the derivative still be ln x?
 
  • #24
No, compute it using the definition of the derivative and the definition of "e"...

Daniel.
 
  • #25
that doesn't do anything robert. you are assuming the result you are supposed to prove.

i.e. when you write the integral of dy/y and assume it equals ln(y), that is equivalent to assuming that e^x is the only function (up to a constant) that equals its own derivative.

oh no, my mistake: no doubt you mean to invoke the MVT to prove the only antiderivative of dy/y is ln(y) + c.

thus you are making the nice observation that proving there is only one function, up to a constant multiple, that equals its own derivative is equivalent to showing there is only one function, up to an additive constant, that has derivative 1/x. indeed this is true, and easier to prove, since most people are aware of that way of using the MVT.
 
Last edited:
  • #26
Here is the approach in Howard Anton's Calculus...

First define ln(x) as [itex]\int_1^x \frac{1}{t}\,dt[/itex].

Then by the Fundamental Theorem of Calculus (part 2), d/dx(ln(x)) = 1/x.

Now define e^x as the inverse function of ln(x).

The derivative is then

[tex]\frac{1}{d/du(\ln u)|_{u=e^x}} = e^x[/tex]
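Anton's inverse-function step can be checked numerically: the derivative of e^x is the reciprocal of the derivative of ln evaluated at u = e^x. A small Python sketch (function name illustrative):

```python
import math

def dln(u, h=1e-6):
    """Numerical derivative of ln at u (symmetric quotient)."""
    return (math.log(u + h) - math.log(u - h)) / (2 * h)

# Derivative of the inverse function: (e^x)' = 1 / (ln)'(e^x) = e^x.
for x in [0.0, 1.0, 2.0]:
    u = math.exp(x)
    print(x, 1.0 / dln(u), u)
```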
 
Last edited:
  • #27
does this seem similar to post 21 agro?
 
Last edited:
  • #28
It does, which would only mean that Howard Anton and mathwonk think of the exponential and the natural logarithm in the same way...

Daniel.

P.S. Is that a good thing, or a bad thing...?
 
  • #29
you really know how to hurt a guy.

actually i learned the content of my post from Courant. it is not so common in most calc books to see Courant's argument that f' = f only if f = ce^x, using the MVT directly, without using the inverse function theorem.

the problem for me with Anton et al. is that they use, but ordinarily do not prove, the inverse function theorem.
 
Last edited:
  • #30
tongos said:
one can confuse derivatives and integrals only when dealing with e^x. I shouldn't drop the C, which is the constant of integration. I have a question that probably most of you guys will think is easy, but is there any other function where the derivative of the function is equal to the function itself? prove or disprove.

Oooh, that's a loaded question. It depends on what you mean by a derivative, and what you mean by a function.

There is at least one trivial disproof using elementary calculus: the function f(x) = 0.
 
  • #31
the only functions which equal their own derivatives are the functions of form ce^x, with c constant. this is proved above in several ways, all ultimately relying on the mean value theorem.

If one also assumes that f(0) = 1, then the only such function is e^x, as also observed above.

If one asks where the constant function f = 0, fits in here, it is ce^x with c = 0.
 
  • #32
Aki said:
I don't understand why the derivative of e^x is e^x itself.

Hello Aki,

It's been a while since I did this myself, so I am going to do it again, for you as well as for me. The goal is to find a power series in one unknown x, if there is one, whose derivative is equal to itself.

If there is such a power series, we can then reserve the special symbolism [tex] e^x [/tex] for the series. So in other words, if you can find a power series whose derivative with respect to x is equal to the original undifferentiated power series, you have the reason that:

[tex] \frac{d}{dx} e^x = e^x [/tex]

The answer would then be verbalized as saying, "There happens to be an analytic function whose derivative is equal to itself, and there is only one such function. Mankind is denoting this function using the symbolism e^x, but in order to find the function we must look for its power series expansion."

Discussion

Suppose that some function f(x) can be expressed as a power series.

[tex] f(x) = C_0 + C_1x + C_2x^2 + C_3x^3 + ... [/tex]

We introduce summation notation to simplify the RHS of the equation above.

[tex] f(x) = \sum_{n=0}^{n= \infty} C_n x^n [/tex]

Now, take the derivative with respect to x of both sides of the equation above to obtain:

[tex] f^\prime (x) = \sum_{n=0}^{n= \infty} n C_n x^{n-1} [/tex]

Which is none other than

[tex] df/dx = C_1 + 2C_2x + 3C_3x^2 + ... = \sum_{n=1}^{n= \infty} n C_n x^{n-1} = \sum_{n=0}^{n= \infty} (n+1) C_{n+1} x^n
[/tex]

Now, take the derivative with respect to x again, to obtain:

[tex] f^{\prime\prime} (x) = \sum_{n=0}^{n= \infty} n(n-1) C_n x^{n-2} [/tex]

Which is none other than

[tex] (d^2f/dx^2) = f^{\prime\prime} (x) = \sum_{n=2}^{n= \infty} n(n-1) C_n x^{n-2} = \sum_{n=0}^{n= \infty} (n+2)(n+1) C_{n+2} x^{n} [/tex]

Now, take the derivative with respect to x again, to obtain

[tex] f^{\prime\prime\prime} (x) = \sum_{n=0}^{n= \infty} n(n-1)(n-2) C_n x^{n-3} [/tex]

Which is equivalent to:

[tex] f^{\prime\prime\prime} (x) = \sum_{n=3}^{n= \infty} n(n-1)(n-2) C_n x^{n-3}= \sum_{n=0}^{n= \infty} (n+3)(n+2)(n+1) C_{n+3} x^{n} [/tex]

At this point, you have enough information to find a simple formula for the unknown constants in terms of n. Look at the formula for the third derivative of f(x). Suppose that x =0 in that formula. You thus have:

[tex] f^{\prime\prime\prime} (0) = \sum_{n=0}^{n= \infty} (n+3)(n+2)(n+1) C_{n+3} 0^{n}[/tex]

Keeping in mind that [tex] 0^0 =1, 0^1 =0, 0^2 =0, etc [/tex] you should now see that you can deduce that:

[tex] f^{\prime\prime\prime} (0) = 3*2*1*C_3 [/tex]

And if you check the second derivative of f(x), evaluated at x=0, you will find that:

[tex] f^{\prime\prime} (0) = 2*1*C_2 [/tex]

And if you check the first derivative of f(x), evaluated at x=0, you will find that:

[tex] f^\prime(0) = 1*C_1 [/tex]

And of course

[tex] f(0) = C_0 [/tex]

Thus, we have the following formula for [tex] C_n [/tex]:

[tex] C_n = \frac{f^n(0)}{n!} [/tex]

Where

[tex] n! = n(n-1)(n-2)(n-3)...*2*1 [/tex]

Now, we can now re-write the formula for f(x), using the formula for Cn. We have:

[tex] f(x) = \sum_{n=0}^{n= \infty} C_n x^n = \sum_{n=0}^{n= \infty} \frac{f^n(0)}{n!} x^n [/tex]

We now have only to answer the question, "is there or isn't there a power series whose derivative is equal to itself." Suppose that there is. Let f(x) denote the series. Therefore we must have:

[tex] f(x) = \sum_{n=0}^{n= \infty} \frac{f^n(0)}{n!} x^n = f^\prime (x) = \sum_{n=0}^{n= \infty} \frac{f^n(0)}{n!} n x^{n-1}

[/tex]

So, if there is a function f(x), whose derivative with respect to x is equal to f(x), then the power series expansion of f(x) must be such that:

[tex] \sum_{n=0}^{n= \infty} \frac{f^n(0)}{n!} x^n = \sum_{n=0}^{n= \infty} \frac{f^n(0)}{n!} n x^{n-1} = \sum_{n=1}^{n= \infty} \frac{f^n(0)}{n!} n x^{n-1} = \sum_{n=0}^{n= \infty} \frac{f^{n+1}(0)}{n!} x^n

[/tex]

Thus, we can see that the statement above is true, provided that:

[tex] f^n(0) = f^{n+1}(0)

[/tex]

Thus, the constraint for there to be a function f(x) that is equal to its derivative is that its nth derivative evaluated at x=0 must equal its (n+1)th derivative evaluated at x=0. This isn't enlightening enough, so let us write things more explicitly. We have to have:

[tex] \frac{d^nf}{dx^n}|_{x=0} = \frac{d^{n+1}f}{dx^{n+1}}|_{x=0} [/tex]

Now, since we are looking for a power series expression for f(x), we have:

[tex] f(x) = C_0 + C_1x + C_2x^2 + C_3x^3 + C_4x^4+ ... [/tex]
[tex] f^\prime(x) = C_1 + 2C_2x + 3C_3x^2 + 4C_4x^3+ ... [/tex]
[tex] f^{\prime\prime}(x) = 2C_2 + 3*2C_3x + 4*3C_4x^2+ ... [/tex]
[tex] f^{\prime\prime\prime}(x) = 3*2C_3 + 4*3*2C_4x+ ... [/tex]
[tex] f^{\prime\prime\prime\prime}(x) = 4*3*2C_4+ ... [/tex]

and so on.

And when the expressions above are evaluated at x=0, only the first term in each of them remains, and that first term is a constant. And we know that the constraint dictates that the (n+1)th derivative evaluated at x=0 must equal the nth derivative evaluated at x=0, hence the constraint is:

[tex] C_0 = C_1 = 2C_2 = 3*2C_3 = 4*3*2*1C_4 [/tex] and so on.

Suppose that C_0 = 1. In that case, we must have:

[tex] 1 = C_1 = 2C_2 = 3*2C_3 = 4*3*2*1C_4 [/tex]

So the formula we need for C_n, in order to have a power series which is equal to its derivative, is:

[tex] C_n = \frac{1}{n!} [/tex]

Because it will now follow that:

[tex] 1 = C_1 = 2/2! = 3*2*1/3! = 4*3*2*1/4! =5*4*3*2*1/5![/tex]

And the above equations are all true.

Thus, a power series which is equal to its own derivative is:

[tex] \sum_{n=0}^{n= \infty} \frac{x^n}{n!} = 1+x+x^2/2!+x^3/3!+x^4/4!+... [/tex]

Which we can check directly:

[tex] \frac{d}{dx} \sum_{n=0}^{n= \infty} \frac{x^n}{n!} = \sum_{n=0}^{n= \infty} n \frac{x^{n-1}}{n!} = \sum_{n=1}^{n= \infty} n \frac{x^{n-1}}{n!} = \sum_{n=1}^{n= \infty} \frac{x^{n-1}}{(n-1)!}

=\sum_{n=0}^{n= \infty} \frac{x^n}{n!}

[/tex]

QED
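The verification above can also be done numerically with partial sums of the series. A short Python sketch (function names are mine, for illustration):

```python
import math

def exp_series(x, terms=30):
    """Partial sum of sum_{n>=0} x^n / n!."""
    return sum(x**n / math.factorial(n) for n in range(terms))

def exp_series_deriv(x, terms=30):
    """Term-by-term derivative: sum_{n>=1} n x^(n-1) / n!."""
    return sum(n * x**(n - 1) / math.factorial(n) for n in range(1, terms))

# The partial sums match exp, and the formal derivative matches the series.
print(exp_series(1.0), math.e)
print(exp_series_deriv(2.0), exp_series(2.0))
```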

And this is not the only function which is equal to its own derivative, since multiplying it by any constant gives another function equal to its own derivative.

Let us thus define e^x as follows:

[tex] e^x = \sum_{n=0}^{n= \infty} \frac{x^n}{n!} [/tex]

It will follow that [tex] e^x = \frac{de^x}{dx} [/tex]

It will also follow that for any constant B we have:

[tex] Be^x = \frac{dBe^x}{dx} [/tex]

And hence there is a class of functions (not just one, as I said in the beginning) that are equal to their own derivatives. The class of functions is given by:

[tex] f(x) = B\sum_{n=0}^{n= \infty} \frac{x^n}{n!} [/tex]

Where B is an arbitrary constant.

Regards,

Guru
 
Last edited:
  • #33
this is very beautiful, and no doubt guided the early fathers of calculus. as such it is likely helpful here as well.

however, it does not quite respond to the question by modern standards of proof. for one thing it assumes your power series defines a differentiable function, to get existence of a solution of f' = f, this way.

one can of course prove this, e.g. by showing the local uniform convergence of both the series and the derived series. you still have then a little way to go to relate this series to the exponential function you somewhat cavalierly redefine.

i.e. then you should prove that any solution of the equation f' = f, and f(0) = 1, does satisfy the relation f(x+y) = f(x)f(y), which then does make it an exponential function.


this is not hard, as follows: let a be given and consider the function g(x) = f(a+x)/f(x).

then g'(x) = [f'(a+x)f(x) - f(a+x)f'(x)]/f^2(x) = [f(a+x)f(x) - f(a+x)f(x)]/f^2(x) = 0 (using f' = f), so f(a+x)/f(x) is a constant, which for x=0 equals f(a), so f(a+x) = f(x)f(a). at least whenever f(x) ≠ 0.

Aha! But both sides of the equation are continuous, and hence it suffices to show that they are equal on a dense set. But since, by your definition, f is an analytic function, its zeroes are isolated, hence nowhere dense. Thus the equation holds everywhere, and then it follows that f(x) is never zero, or else it would always be zero, and yet we have assumed that f(0) = 1.
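The functional equation f(a+x) = f(a)f(x) can be checked numerically for the series solution of f' = f with f(0) = 1. A minimal Python sketch (names illustrative):

```python
import math

def f(x, terms=40):
    """Truncated series solution of f' = f, f(0) = 1."""
    return sum(x**n / math.factorial(n) for n in range(terms))

# The ODE solution satisfies the exponential law f(a + b) = f(a) * f(b):
a, b = 0.7, 1.3
print(f(a + b), f(a) * f(b))
```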
 
  • #34
wow Physicsguru, thank you, I'm sure you put lots of time in answering my question. Thanks
 
  • #35
remember, aki, also to ask yourself: when does a series actually define a function? when does it define a differentiable function? if so, can the derivative be obtained by formal differentiation of the series?

but i admit these questions about series maybe did not much concern the early and great mathematicians like Euler, who pioneered their use.
 

FAQ: Why is the Derivative of e^x e^x?

Why is the derivative of e^x equal to e^x?

The derivative of e^x is equal to e^x because e^x is its own derivative. This means that the rate of change of e^x is equal to its value at any given point, making it a very special function in calculus.

What makes e^x different from other exponential functions?

E^x is different from other exponential functions because it is, up to constant multiples, the only function whose derivative is equal to itself. This is due to the unique properties of the mathematical constant e, which is approximately equal to 2.71828.

How is the derivative of e^x derived?

The derivative of e^x follows from the general rule for exponentials, d/dx(a^x) = a^x * ln(a), which can itself be derived from the limit definition of the derivative. When applied to e^x, the derivative becomes e^x * ln(e), which simplifies to e^x since ln(e) = 1.
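This rule is easy to check numerically. A small Python sketch (illustrative, not from the thread):

```python
import math

def num_deriv(f, x, h=1e-6):
    """Symmetric difference quotient approximating f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# For a general base a, the derivative of a^x is a^x * ln(a);
# the factor ln(a) collapses to 1 precisely when a = e.
x = 1.5
for a in [2.0, math.e, 10.0]:
    print(a, num_deriv(lambda t: a**t, x), a**x * math.log(a))
```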

What is the significance of e^x in calculus?

E^x is significant in calculus because it is the basis for many important mathematical concepts, such as exponential growth and decay, compound interest, and the natural logarithm. It also plays a crucial role in solving differential equations, which are used to model many real-world phenomena.

Can the derivative of e^x be negative?

No, the derivative of e^x cannot be negative. Since e^x is always positive, its derivative must also be positive. This means that the graph of e^x is always increasing and never decreasing, making it a useful tool in analyzing the behavior of various functions in calculus.
