How do physicists know if limits exist, things are integrable, etc?

In summary, physicists determine the existence of limits and integrability through mathematical analysis and rigorous definitions. They utilize tools such as calculus, particularly the concepts of continuity and differentiability, to assess whether functions behave predictably within given boundaries. Techniques like the epsilon-delta definition for limits, Riemann integration for integrability, and the use of convergence tests for sequences and series provide essential criteria. These approaches enable physicists to ascertain the behavior of physical systems and ensure that mathematical models accurately reflect real-world phenomena.
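For reference, here is a compact statement of the two definitions the summary leans on (standard real-analysis formulations, not tied to any particular textbook):

$$\lim_{x\to a} f(x) = L \iff \forall\,\varepsilon>0\ \exists\,\delta>0:\ 0<|x-a|<\delta \implies |f(x)-L|<\varepsilon$$

and a bounded function ##f## on ##[a,b]## is Riemann integrable if and only if for every ##\varepsilon>0## there is a partition ##P## of ##[a,b]## whose upper and lower sums satisfy ##U(f,P)-L(f,P)<\varepsilon##.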
  • #1
DrBanana
My last thread had too many questions, so I was told to make a new one. The question in this thread does not seem the same as the ones in the last thread, but I supposed that this is the root of my problems, so I started here. Also I wasn't sure whether to put this in the physics section or the maths section, sorry if I made a mistake.

So in a standard undergraduate mechanics textbook, a lot of limits may be taken, a lot of things may be differentiated, and some things might be integrated. How do we know that it is always safe to do this and that we don't get the wrong expression for a physical quantity? For example the practice of 'breaking up something into small parts and then summing' is common, how do we know that this integral we're taking doesn't diverge? When we 'divide by a small change in time ##\Delta t## and take the limit as ##\Delta t## goes to zero', how do we know that that limit exists?
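To make that second question concrete, here is the kind of numerical experiment I have in mind (a minimal Python sketch; the force profile ##F(t)=\sin t## and the function name are made up purely for illustration): shrink ##\Delta t## and watch whether the Riemann sum settles down.

```python
import math

def riemann_sum(f, t0, t1, n):
    """Left Riemann sum of f over [t0, t1] with n subintervals of width dt."""
    dt = (t1 - t0) / n
    return sum(f(t0 + i * dt) for i in range(n)) * dt

# Made-up force profile, just for illustration: F(t) = sin(t).
F = math.sin

# Shrink dt by increasing n; if the limit exists, the sums should converge.
for n in (10, 100, 1000, 10000):
    print(n, riemann_sum(F, 0.0, math.pi, n))
# The printed values approach 2, the exact integral of sin(t) over [0, pi].
```

Of course, watching the numbers settle is not a proof that the limit exists, which is exactly what I'm asking about.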
 
  • #2
Are you speaking physically or mathematically?
 
  • #3
Or in practice? o0) Because in my experience, most of the physicists I have met didn't care that much about it. That always drove me crazy, which is why I ended up specializing in mathematical physics.
 
  • #4
Well, I’ve never derived an expression for a physics problem, or any problem, without fretting over it. Do its limits make sense? Do the units check out? On and on. Certainly not epsilon-delta proofs, but I don’t just accept the result. I’m just too fallible.
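A textbook-style illustration of the kind of check I mean (a standard example, not one of mine): for a mass falling with linear drag ##F_d=-bv## one derives ##v(t)=\frac{mg}{b}\left(1-e^{-bt/m}\right)##. Sanity checks: as ##b\to 0##, expanding the exponential gives ##v\to gt##, the free-fall result, as it must; as ##t\to\infty##, ##v\to mg/b##, a finite terminal velocity; and ##mg/b## indeed carries units of velocity. If a derived expression failed any of these, I'd know to hunt for a mistake.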
 
  • #5
One can argue that Fourier series opened up a whole series of questions about the foundations of calculus that mathematicians spent the next century resolving, culminating in Lebesgue integration. These concepts are addressed in a real analysis course.
 
  • #6
Orodruin said:
Are you speaking physically or mathematically?
Both, I suppose, but with more emphasis on whether the derived mathematical formula correctly predicts the physics (well, as far as you can go without resorting to experiment). I guess if you study more advanced physics you of course have to get experimental results, but my main focus here is on kinematics and Newtonian mechanics.

Paul Colby said:
Well, I’ve never derived an expression for a physics problem, or any problem, without fretting over it. Do its limits make sense? Do the units check out? On and on. Certainly not epsilon-delta proofs, but I don’t just accept the result. I’m just too fallible.
Can you expand on making sense of the limits?

Frabjous said:
One can argue that Fourier series opened up a whole series of questions about the foundations of calculus that mathematicians spent the next century resolving, culminating in Lebesgue integration. These concepts are addressed in a real analysis course.
I don't know why, but I always thought real analysis was detached from reality (well, the most exposure I've had to it is reading the first four chapters of Spivak's Calculus, so that may explain it). Can you recommend books that eventually come back to proving the physics formulas?
 
  • #7
Partially, I guess, by adjusting and by trial and error as you go. However tempting, a strict bottom-up development is overall very inefficient, imo.
 
  • #8
DrBanana said:
Both, I suppose, but with more emphasis on whether the derived mathematical formula correctly predicts the physics (well, as far as you can go without resorting to experiment). I guess if you study more advanced physics you of course have to get experimental results, but my main focus here is on kinematics and Newtonian mechanics.
You cannot go anywhere in physics without resorting to experiments. That Newtonian physics is more intuitive to you does not give it carte blanche. The limits that you are talking about are part of the mathematical model, and ultimately observation and experimentation are the judge. If the model describes observation well, then it is a good model. If it doesn't, then it isn't.
 
  • #9
DrBanana said:
Can you expand on making sense of the limits?
Problems in physics and engineering are usually approached with a general understanding of how things work. For example if I work out a lumped element circuit model impedance and I want a sanity check on the result I might look at the limits of the expression for infinite frequency. In this limit all capacitors can be replaced with conductors and all inductors removed. If the derived expression doesn’t agree with the limiting circuit, it’s wrong. I know I’ve made a mistake.
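For a concrete toy example (hypothetical, just to illustrate the check): a resistor ##R## in series with a capacitor ##C## has impedance ##Z(\omega)=R+\frac{1}{j\omega C}##. As ##\omega\to\infty## this gives ##Z\to R##, exactly what replacing the capacitor with a conductor predicts; as ##\omega\to 0## it gives ##|Z|\to\infty##, matching an open circuit. A derived impedance that failed either limit would be wrong on its face.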
 
  • #10
Paul Colby said:
Problems in physics and engineering are usually approached with a general understanding of how things work. For example if I work out a lumped element circuit model impedance and I want a sanity check on the result I might look at the limits of the expression for infinite frequency. In this limit all capacitors can be replaced with conductors and all inductors removed. If the derived expression doesn’t agree with the limiting circuit, it’s wrong. I know I’ve made a mistake.
I know this is straying from the topic of the thread, but aren't there times when you need to predict things that can't be tested beforehand but are mission-critical?
 
  • #11
DrBanana said:
I know this is straying from the topic of the thread, but aren't there times when you need to predict things that can't be tested beforehand but are mission-critical?
Fortunately, I’m retired, so nothing is mission-critical anymore. During my working life, several things I worked on fit your description. The approach I used was to first build a computational model, based on assumptions and approximations, that could be applied directly to the full problem. I then developed laboratory tests and fixtures whose results could also be predicted with this computational model. This allowed direct verification of the modeling.
 
  • #12
DrBanana said:
I know this is straying from the topic of the thread, but aren't there times when you need to predict things that can't be tested beforehand but are mission-critical?
Sure, but we already know from extensive experimentation that the continuum limit is a very good description. So good that we are willing to trust it with mission critical predictions.
 
  • #13
Paul Colby said:
Well, I’ve never derived an expression for a physics problem, or any problem, without fretting over it. Do its limits make sense? Do the units check out? On and on. Certainly not epsilon-delta proofs, but I don’t just accept the result. I’m just too fallible.

Certain things in physics are mostly assumed, like the ability to expand a function in a power series.

This whole issue is examined in some fascinating lectures by Carl Bender.

The bottom line is that problems can become intractable if you don't make such assumptions.
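A classic illustration of the trade-off (the standard Stieltjes/Euler example, quoted from memory rather than from the lectures): the integral $$I(x)=\int_0^\infty \frac{e^{-t}}{1+xt}\,dt$$ has the formal expansion ##\sum_{n=0}^\infty (-1)^n\, n!\, x^n##, which diverges for every ##x\neq 0##, yet its low-order truncations approximate ##I(x)## remarkably well for small ##x##. The series is asymptotic rather than convergent, and physics is full of such series that work beautifully despite failing every convergence test.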

However, there are mathematical physicists who are very rigorous.

They are two different styles, each with its own insights.

Thanks
Bill
 