Paradoxical definition of the Derivative

In summary, the derivative approximates the change in a function over unit time. This is done by extrapolating the change over a small interval: if delta(t) is small, this extrapolation gives a good approximation to the change over a unit time interval.
  • #1
danne89
A common definition I've read: the derivative of an arbitrary function is its change at one given instant.

It's hard to think about. I mean, movement, which is a change of position, cannot be defined for zero time.

Does Zeno's paradox have something to do with this? If I'm not mistaken, Zeno's paradox is about how the sum of infinitely many parts can be finite.

Very confusing...
 
  • #2
Or it could be that your "common definition" is just an illustrative explanation and not actually a rigorous definition at all.
 
  • #3
Ahh. I see. But how does Zeno's paradox relate?
 
  • #4
danne89 said:
Ahh. I see. But how does Zeno's paradox relate?
It doesn't.

Note that in the "ordinary" interpretation of Zeno's paradox, that paradox is RESOLVED by noting the fact that, say, an infinite number of terms may add up to something finite.
 
  • #5
That's exactly WHY we need a rigorous definition for the derivative!

In fact, that's exactly what led Newton to develop the calculus! He wanted to show that the gravitational force was dependent on the distance from the mass (and, of course, the acceleration due to that force). But the distance could (in theory anyway) be measured at any instant, while neither speed nor acceleration could, without calculus, be DEFINED at a given instant. The fact that "speed at a given instant" (and, therefore, "acceleration at a given instant") could not even be defined was really Zeno's point, and the calculus was a way to do that.

Arildno, Zeno had several different "paradoxes". You, I think, are thinking of the one about "You can't cross this distance because you would first have to cross 1/2 of it, then 1/4 of it, then 1/8 of it, etc." danne89 was probably thinking of the "arrow" paradox: "At any given instant, the arrow is AT a specific point and therefore not moving! Since it is not moving at any given instant, it can't be moving at all."
 
  • #6
Nice. But as I've heard it, Newton's definition wasn't rigorous.
 
  • #7
Not in terms of modern analysis. At the time, IT WAS RIGOROUS ENOUGH TO DELIVER THE CORRECT THEORETICAL RESULTS... namely, explaining the laws of Kepler.

Daniel.
 
  • #8
So today's rigorous definition may turn out to be non-exact tomorrow, when new physics demands it?
 
  • #9
danne89 said:
So today's rigorous definition may turn out to be non-exact tomorrow, when new physics demands it?

Yes, mathematicians like to think they haven't discovered everything... :biggrin: Anyway, the basics cannot change. I suspect point-set topology will be the same in the next 5000 years...

Daniel.
 
  • #10
The derivative is not, even in a rough intuitive sense, the change of the function in zero time. Rather, it is an approximation to the change of the function in UNIT time.

To compute by extrapolation the change of the function in unit time, you take the change of the function over a small time interval delta(t) and divide that change by delta(t).

If delta(t) is very small, this result extrapolates the change over a small interval to a change over a unit time interval.

We do this for a sequence of smaller and smaller intervals, and then try to guess what number these results are tending toward.
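The procedure described above can be sketched numerically. This is a minimal illustration, not from the thread; the function f(t) = t^2 and the point t = 1 are arbitrary choices for demonstration.

```python
def difference_quotient(f, t, delta_t):
    """Change of f over [t, t + delta_t], extrapolated to unit time:
    divide the change over the small interval by delta_t."""
    return (f(t + delta_t) - f(t)) / delta_t

f = lambda t: t ** 2  # illustrative function; the true derivative at t = 1 is 2

# Shrink delta_t and watch what number the quotients tend toward.
for delta_t in (0.1, 0.01, 0.001, 0.0001):
    print(delta_t, difference_quotient(f, 1.0, delta_t))
```

The printed quotients (roughly 2.1, 2.01, 2.001, 2.0001) visibly tend toward 2, which is the guessed limit the post describes; note that delta_t = 0 itself would give the undefined expression 0/0.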
 
  • #11
No, today's rigorous definitions are rigorous definitions within the current rigorous definition of "rigorous definition". Newton's weren't rigorous at the time. But this is a sign of the way the philosophy of mathematics has changed. What constituted a proof to Gauss, Euler, and even Galois often wouldn't pass muster in modern mathematics. That isn't to say their proofs were incorrect, but that there were some gaps, small ones, that they glossed over or ignored.
 
  • #12
Conclusion: I'll look it up in a more rigorous book...
 

FAQ: Paradoxical definition of the Derivative

What is the paradoxical definition of the derivative?

The paradoxical definition of the derivative is a mathematical concept that describes the instantaneous rate of change of a function at a particular point. It is defined as the limit of the average rate of change of the function as the interval between two points approaches zero.
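Written out, the limit described in the answer above is the standard definition (stated here for reference, with a short worked example using f(x) = x^2 as an arbitrary illustration):

$$f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}$$

For f(x) = x^2, the quotient simplifies before the limit is taken:

$$\frac{(a+h)^2 - a^2}{h} = \frac{2ah + h^2}{h} = 2a + h \;\longrightarrow\; 2a \quad (h \to 0)$$

Note that setting h = 0 directly would give the undefined expression 0/0; the limit sidesteps this, which is the source of the apparent paradox.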

How is the paradoxical definition of the derivative different from the standard definition?

The paradoxical definition of the derivative differs from the standard average-rate-of-change calculation in that it focuses on the instantaneous rate of change at a specific point rather than the average rate of change over an interval. It does this by taking the limit as the interval approaches zero, rather than evaluating the quotient at any specific nonzero interval.

Why is it called a paradoxical definition?

The definition is considered paradoxical because it assigns a rate of change at a single point, where the difference quotient itself reduces to the undefined expression 0/0, by means of a limit: the value the quotient approaches as the interval shrinks is perfectly well defined even though the quotient at zero interval is not.

How is the paradoxical definition of the derivative used in real-world applications?

The paradoxical definition of the derivative is used in real-world applications to calculate instantaneous rates of change in various fields such as physics, economics, and engineering. It is also used in optimization problems to find the maximum or minimum values of a function.

What are the limitations of the paradoxical definition of the derivative?

The paradoxical definition of the derivative has limitations in cases where the function is not continuous or differentiable at a specific point. It also requires a deep understanding of limits and may not be easy to grasp for some individuals.
