- #1
milesyoung
I'm currently taking several physics courses (mechanics, thermodynamics, etc.) and common to them all is their frequent use of infinitesimals.
I'll just give a short recap of how I was taught calculus, and this is how my math teacher would word it:
[calculus training]
[tex]\frac{dy}{dx}[/tex] is not a fraction. [tex]\frac{d}{dx}[/tex] is a differential operator, y is a function.
The so-called differentials *sneer* can be defined for a function [tex]f(x)[/tex] as [tex]dy=f'(x)\,dx[/tex].
[/calculus training]
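For what it's worth, that definition is easy to check numerically: [tex]dy=f'(x)\,dx[/tex] is a linear approximation whose error vanishes faster than dx. A minimal Python sketch (my own illustration, using f = sin, not from any course):

```python
import math

# Differential dy = f'(x) dx as a linear approximation of the
# actual change delta_y = f(x + dx) - f(x), here for f = sin.
f, fprime = math.sin, math.cos
x = 1.0

for dx in (0.1, 0.01, 0.001):
    dy = fprime(x) * dx             # the "differential"
    delta_y = f(x + dx) - f(x)      # the actual change
    error = abs(delta_y - dy)
    print(f"dx={dx:<6} error={error:.2e}  error/dx^2={error / dx**2:.3f}")
```

The error/dx² column stays roughly constant (about |f''(x)|/2), which is exactly the sense in which dy "is" the change in y to first order.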
As I was taught calculus, I never really heard much about differentials, let alone infinitesimals.
In physics it seems to be a whole other world. Differentials, as in infinitesimal quantities (and not just some watered-down form of linear approximation), seem to occur naturally in any mathematical argument, as if the concept of a limit weren't really needed.
In the textbook I got for my university calculus course, they go to great lengths to point out that we're not operating with these apparently distasteful infinitesimals. We use limits like proper men... and then, some pages later, with regard to integrals, infinitesimals are used as a "useful heuristic device for setting up integrals".
It's like "THIS IS WRONG, IT MIGHT BREAK! ... but here, use it anyway".
And they make no effort to tell you when it might break or how it should be used to produce correct results.
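To be concrete about the heuristic in question: to get the volume of a sphere, you pretend it's built from shells of area [tex]4\pi r^2[/tex] and infinitesimal thickness dr, and "sum" them into [tex]\int_0^R 4\pi r^2\,dr = \frac{4}{3}\pi R^3[/tex]. That sum with a small but finite Δr already lands on the right answer — a Python sketch (my own illustration, not from the textbook):

```python
import math

# "Sum of infinitesimal shells": approximate the volume of a sphere
# of radius R by stacking n thin shells of thickness dr, each with
# volume ~ 4*pi*r^2 * dr (evaluated at the shell's midpoint radius).
R = 1.0
n = 100_000
dr = R / n

volume = sum(4 * math.pi * ((i + 0.5) * dr) ** 2 * dr for i in range(n))
exact = 4 / 3 * math.pi * R**3

print(volume, exact)  # agree to many decimal places
```

This is just a Riemann sum in disguise; the "infinitesimal" bookkeeping is shorthand for the limit Δr → 0, which is why the heuristic works whenever that underlying limit exists.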
In the current physics courses I'm taking they throw differentials around like it's nothing, and it's frustrating to me because I see no logic in it. It's not the way they taught me calculus.
Is there any book out there with the title "This is how you bridge the gap between calculus in math and physics"?