Newton's Calculus: Formulation, Fun & Paradoxes

  • Thread starter dalcde
  • Start date
  • Tags
    Calculus
In summary, Newton's original formulation of calculus looked like a code that was kept hidden from the general public. It wasn't until the late 1800s that people started teaching calculus using the epsilon/delta formulation. There are still teachers who use infinitesimals, but the majority of calculus instruction now uses limits.
  • #1
dalcde
What did Newton's original formulation of calculus look like? I've heard that it was more intuitive and fun than function-based calculus. I've also heard that there were paradoxes that arise from Newton's calculus, but I'm not sure about that because I can't find the source again and I can't see them mentioned anywhere else. If there are paradoxes, can you please show me some?
 
  • #2
Before about 1900, everybody used infinitesimals to do calculus. (If you have infinitesimals, you also have infinities, since you can invert an infinitesimal.) A classic calc text using this approach is Silvanus Thompson's Calculus Made Easy, which you can find for free on the web. Nobody knew how to formalize infinitesimals, and you just had to sort of get a feel for what kinds of manipulations were OK. Because of this, it became stylish ca. 1900-1960 to teach calc using limits rather than infinitesimals, although practitioners in fields like physics never stopped using infinitesimals. Ca. 1960, Abraham Robinson proved that you could do all of analysis using infinitesimals, and laid out specific rules for what manipulations were OK. A well known text using this approach is Elementary Calculus by Keisler, which the author has made freely available on the web.

A typical paradox would be something like this. Infinity plus one is still infinite, so ∞+1=∞. Subtracting ∞ from both sides gives 1=0. Before Robinson, you just had to know that this kind of manipulation didn't smell right. After Robinson, there are more clearly defined rules that you can learn, and those rules tell you that this manipulation is bogus. The basic rule is called the transfer principle, which you can learn about in Keisler's book.
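
As an illustration of how the post-Robinson rules block that manipulation, here is a short sketch in modern hyperreal notation (the symbols ε and H below are my own illustrative choices, not taken from Keisler or from this thread). If ε is a positive infinitesimal, then H = 1/ε is an infinite hyperreal number, but it is still a specific number, and ordinary algebra applies to it:

$$H + 1 \neq H, \qquad (H + 1) - H = 1.$$

The bogus argument needs a single symbol ∞ satisfying ∞ + 1 = ∞, and no hyperreal number satisfies that equation, so "subtracting ∞ from both sides" is never a legal step.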
 
  • #3
I've heard Newton's original calculus was very "code-like," since he was very secretive about his work, and that you pretty much need a PhD to start reading it.
 
  • #4
bcrowell said:
Before about 1900, everybody used infinitesimals to do calculus.

I don't think that's a fair summary of the state of 19th century math. Many mathematicians made a heroic effort to finally come to a rigorous definition of the real numbers and limits, resulting in the modern epsilon/delta formulation.
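
For reference, the epsilon/delta definition of a limit that came out of that nineteenth-century work can be stated as follows (my paraphrase in modern notation, not a quotation from any of those authors):

$$\lim_{x \to a} f(x) = L \quad\Longleftrightarrow\quad \text{for every } \varepsilon > 0 \text{ there is a } \delta > 0 \text{ such that } 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon,$$

which replaces talk of infinitesimal quantities with statements about ordinary real numbers.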

I understand that you are a specialist in the use of infinitesimals, but the history of the work done by Cauchy, Weierstrass, Dedekind, Cantor, and others should not be ignored in an objective response to the OP's question.

Even Newton understood that his use of infinitesimals was not on solid ground, and he struggled over the years to rework his foundations.

What you call the "stylish" use of limits, I would call "mathematically correct."
 
  • #6
SteveL27 said:
I don't think that's a fair summary of the state of 19th century math.

Without claiming to be a mathematical historian, I agree.

For a person interested in investigating this himself, I recommend Augustus De Morgan's big book "The Differential And Integral Calculus". He covers such interesting topics as fractional derivatives and, to the extent that an expositor of calculus can be witty and entertaining, he is.


It's available online:

http://books.google.com/books?id=rV...ugustus+de+Morgan&hl=es&source=gbs_navlinks_s

Amazon sells photographic copies of the original in two volumes.

https://www.amazon.com/dp/1432616889/?tag=pfamazon01-20

The typography looks slightly muddy. (It looked a little muddy in an original printed copy of the work that I saw.) A person who already knows calculus won't have any problem understanding it. There is a list of errata beginning on page 778, which is in volume 2 of the Amazon offerings.
 
  • #7
I got a little sick and tired of how calculus is explained to people, so I decided to write a little book and give it away for free online. Please take a look at www.thegistofcalculus.com and feel free to tell me what you think. I'm sure this approach to calculus won't please everybody, but it's short, interesting, and geometric. I'm not really trying to make money from this at the moment, but it would be nice to someday have enough to finance a 3D movie that could explain things even better.
 

FAQ: Newton's Calculus: Formulation, Fun & Paradoxes

What is Newton's Calculus?

Newton's Calculus is a branch of mathematics developed by Sir Isaac Newton in the 17th century. It is a mathematical system for finding the instantaneous rate of change of a function, as well as the area under a curve. It is one of the foundations of modern mathematics and has numerous applications in science and engineering.

What is the formulation of Newton's Calculus?

The formulation of Newton's Calculus consists of two main concepts: differentiation and integration. Differentiation is the process of finding the instantaneous rate of change of a function, while integration is the process of finding the area under a curve. These two concepts are closely related and form the basis of Newton's Calculus.
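
As a quick sketch of how the two concepts are related (the symbols f and F here are generic placeholders, not from the FAQ), the fundamental theorem of calculus says that differentiation and integration are inverse operations:

$$\frac{d}{dx}\int_a^x f(t)\,dt = f(x), \qquad \int_a^b F'(x)\,dx = F(b) - F(a).$$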

What makes Newton's Calculus fun?

Many people find Newton's Calculus fun because it allows them to solve complex problems and understand the world around them in a deeper way. It also involves puzzles and paradoxes, which can be both challenging and entertaining to solve.

What are some common misconceptions about Newton's Calculus?

One common misconception is that Newton's Calculus is only used in advanced mathematics and has no practical applications. In reality, it has numerous real-world applications in fields such as physics, engineering, and economics. Another misconception is that it is only for geniuses, when in fact it can be learned and applied by anyone with a basic understanding of algebra and geometry.

What are some famous examples of paradoxes in Newton's Calculus?

One famous example is Zeno's "paradox of Achilles and the tortoise," in which the Greek hero Achilles can seemingly never overtake a tortoise that has a head start; calculus resolves it by showing that the infinitely many catch-up steps sum to a finite distance and time. The objections aimed most directly at Newton's own formulation concerned infinitesimals themselves, most famously Bishop Berkeley's complaint that Newton's vanishing quantities were "ghosts of departed quantities." The "Banach-Tarski paradox," which says that a solid object can be split into a finite number of pieces and reassembled into two identical copies of the original, is a much later result of set theory and measure theory rather than of Newton's calculus.
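
As a sketch of how calculus dissolves the Achilles paradox (the head start and speed ratio below are illustrative assumptions, not part of the original story): if Achilles runs ten times as fast as the tortoise and gives it a 100 m head start, the distances he must cover to reach each of the tortoise's successive positions form a geometric series,

$$100 + 10 + 1 + \tfrac{1}{10} + \cdots = \sum_{n=0}^{\infty} 100\left(\tfrac{1}{10}\right)^n = \frac{100}{1 - \tfrac{1}{10}} = \frac{1000}{9} \approx 111.1 \text{ m},$$

so the infinitely many catch-up stages add up to a finite distance, and hence a finite time.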
