Some sins in physics didactics - comments

  • Thread starter: vanhees71
  • Tags: Physics
In summary, vanhees71's PF Insights post discusses some sins in physics didactics, particularly with regard to the photoelectric effect and its role in the development of quantum theory. The conversation also touches on Einstein's Nobel Prize, with some speculation about what he could have been awarded it for and the role of light quanta in his nomination, and it mentions Lise Meitner and her overlooked contributions to the discovery of nuclear fission.
  • #176
A. Neumaier said:
... Which mental picture we form is a different matter - samalkhaiat probably cannot form a mental picture of the mind, as mind is as unobservable as the electron (we cannot see, hear, feel, smell or taste it), but we other mortals have our own mental pictures of it, which may or may not differ a lot from the scientific picture based on the physics we know.
You are entitled to your opinions. Samalkhaiat, like almost everybody else, distinguishes mental picturing from mental construction. Mathematics and mathematical models are abstract mental constructions of which we (i.e., our brains) cannot form spatial and/or temporal pictures. :smile: Good for you, if you can SEE mathematics or SEE the mathematical representation of the electron.
 
  • #177
vanhees71 said:
I'd rather call it SUACM=Shut up, calculate, and measure! That, closed to a circle, is physics ;-).
Ah, but I never knew a physicist who really did that. It sounds too much like the "messenger" I alluded to above-- imagine there really was an "Einstein" program that took all the available data and used it to test a search protocol of various theories, ordered by complexity. The program throws out theories that fail, and adjusts the parameters of theories that succeed, and then outputs new experimental tests that are needed to push the theories into new domains. Then you the physicist set up the experiments that the Einstein program suggests, and report the outcomes to the program, which further culls its theories and suggests new tests. Progress in physics rapidly accelerates, as the program is capable of searching a vast space of possibilities very quickly.

Then you decide to further increase efficiency by creating a "Faraday" program that takes the Einstein outputs directly and assembles robotic experiments per the Einstein requirements, and feeds the outcomes right back into the Einstein code. You the physicist just sit back and watch the outcome, which is a set of theoretical equations and models ordered by complexity and accuracy. After a while you find the equations and models have become too difficult for you to understand what they are saying, so you create a "Scientific American" program to generate pedagogical explanations of the Einstein outputs, in some sense "dumbed down" to translate them from the syntactic machine language into a semantic human language, but without the deeper understanding necessary to come up with the theory in the first place, because it is actually derived in a different language. You sit back with great pride in your accomplishment-- a fully automated SUACAM system!

But then you realize you have exactly the same relationship to that system as non-scientists have with our current system. You have turned yourself into a non-physicist, in the name of doing SUACAM as efficiently as possible. So there has to be something more than SUACAM in physics!
 
  • #178
I think I know the answer, but why is the bound electron quantized as a matter of course?

Is the point of the article that the discrete scattering probability of the quantized bound electron, when bombarded with light (EM radiation), can be explained without a priori quantization of that radiation into "photons"?
 
  • #179
Yes, that's the point of the article. Einstein's famous formula is derived entirely from a model where light is described by a classical em. wave, not by the quantum field. At Einstein's time the only observed fact that made the quantized field necessary was the Planck radiation law, which contradicts the classical equipartition theorem and leads to the UV catastrophe of the older theories of black-body radiation.
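(To make the semiclassical point concrete -- a standard textbook sketch in my own notation, not a quotation from the article: treat the radiation as a classical em. wave of angular frequency ##\omega## and only the bound electron quantum mechanically. First-order perturbation theory then gives photoemission only for ##\hbar\omega > W##, reproducing Einstein's formula
$$E_{\rm kin} = \hbar\omega - W,$$
where ##W## is the work function; no quantization of the radiation field is assumed anywhere.)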

The electron is "quantized" because a bound state belongs, by definition, to the discrete spectrum of the Hamilton operator. E.g., take the hydrogen atom as a simple but very important example, which can be solved exactly (neglecting radiative corrections, for which you also need the quantization of the em. field, leading to the Lamb shift, which can be calculated very accurately using perturbation theory).
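(For concreteness, the hydrogen bound-state energies -- the standard nonrelativistic textbook result, quoted here for reference -- form the discrete spectrum
$$E_n = -\frac{m_e e^4}{2(4\pi\varepsilon_0)^2\hbar^2 n^2} \approx -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \dots,$$
neglecting the radiative corrections mentioned above.)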
 
  • #180
Ken G said:
But then you realize you have exactly the same relationship to that system as non-scientists have with our current system. You have turned yourself into a non-physicist, in the name of doing SUACAM as efficiently as possible. So there has to be something more than SUACAM in physics!
I don't understand what you mean. It's the very foundation of the scientific method to have a model (or even a theory) of nature (or of a certain part of it), leading to quantitative predictions for the outcome of experiments. Then you plan your experiment to check whether the prediction is right. Either it is, and you haven't learned anything new, or there is a discrepancy, and you have to refine your model, leading to new predictions and new experiments to check them. Science is a process, and I'm not sure whether it will ever stop, culminating in a final "theory of everything".
 
  • #181
vanhees71 said:
Yes, that's the point of the article. Einstein's famous formula is derived entirely from a model where light is described by a classical em. wave, not by the quantum field. At Einstein's time the only observed fact that made the quantized field necessary was the Planck radiation law, which contradicts the classical equipartition theorem and leads to the UV catastrophe of the older theories of black-body radiation.

The electron is "quantized" because a bound state belongs, by definition, to the discrete spectrum of the Hamilton operator. E.g., take the hydrogen atom as a simple but very important example, which can be solved exactly (neglecting radiative corrections, for which you also need the quantization of the em. field, leading to the Lamb shift, which can be calculated very accurately using perturbation theory).

Just two other questions:
The Lamb shift was unknown at the time?

Does the specific point here translate to scattering probability amplitudes of, say, bound neutrons or protons bombarded with the EM field, instead of electrons? In other words, it's free vs. bound that matters, not the energy scales or forces involved. IOW the point is general: you don't have to posit quantization of the free field a priori to get quantized probability amplitudes for outcomes when that wave interacts with a bound system, which is by definition quantized?
 
  • #182
vanhees71 said:
I don't understand what you mean. It's the very foundation of the scientific method to have a model (or even a theory) of nature (or of a certain part of it), leading to quantitative predictions for the outcome of experiments.
That part is true of non-scientists-- they "have" those things too. I'm saying that if all these things are, to us, just a syntactic algorithm for predicting experimental outcomes, then we have no closer connection to the physics than a non-physicist does. Where in SUACAM does it matter whether it is our minds that are involved in that process, or someone else's?
Then you plan your experiment to check whether the prediction is right.
And that is neither calculating nor measuring, it requires some idea of what you wish to test. What part of the theory bothers you? Where is your doubt centered? These are crucial issues in science, but do not represent "shutting up", they represent a discussion about what our goals are for our science, and where we regard the key payoffs. One particular example of this is when Poincare and Lorentz were trying to understand the Lorentz transform in terms of physical effects happening to rulers and clocks, causing them to seek experiments that could identify what that physical effect was, and then Einstein came along and said just make the speed of light a fundamental law and remove any need to find a physical effect on rulers and clocks. The experimental question shifted from seeking evidence for some physical effect, to simply testing the predictions of asserting that the speed of light is a law. Or another example, also involving Einstein, was the EPR paradox, where Einstein felt quantum mechanics was making absurd predictions, motivating experiments to test those predictions, leading to Bell's theorem.

These advances were the targets of specific thinking about laws, not just in terms of what calculations they allow, but also in terms of what they mean. Thinking about the de Broglie-Bohm versus the Copenhagen interpretation might also motivate new experiments, just as it motivated experiments on watching decoherence occur, or weak measurements. It seems to me the physicist is always up to his/her ears in their own interpretation of what these laws mean; this is central not only to the pedagogy of physics (which is nonunique), but also to the motivations for what direction to take in future tests (which are also nonunique).
Either it is, and you haven't learned anything new, or there is a discrepancy, and you have to refine your model, leading to new predictions and new experiments to check them. Science is a process, and I'm not sure whether it will ever stop, culminating in a final "theory of everything".
I completely agree there, I'm just saying that the process is something more than SUACAM. If it weren't, it shouldn't matter to us who is doing the calculating and measuring, as long as we are privy to the outcomes. But we want to be privy to more than the theories and the observations that test them; we want to be privy to some kind of sense of what it all means, something we could call understanding, that goes deeper than being able to make successful predictions using a syntactic algorithm. This will be a more personal connection, and will be non-unique, but it is relevant to what we would regard as a "didactic sin" and what isn't.
 
  • #183
Ken G said:
That part is true of non-scientists-- they "have" those things too. I'm saying that if all these things are to us is a syntactic algorithm for predicting experimental outcomes, then we have no closer connection to the physics than a non-physicist does. Where in SUACAM does it matter if it is our minds that are involved in that process, or someone else's?

There is no difference between a physicist and a non-physicist. There are only differences between platonists and non-platonists. For example, take the tribe or whatever that counts 1, 2, 3, infinity. Are we any different? Has any computer counted to infinity, or is all of science consistent with the manipulation of finite strings? Only people like Goedel, who believe in the natural numbers, are different.

Bohmian mechanics has a cut, and Copenhagen has a cut. It just depends on how accurate one thinks that map is. A really accurate map should contain a tiny version of itself in the map which contains a tiny version of the map in itself etc. Bohmian mechanics is the belief that our map should at least contain a tiny version of ourselves.
 
  • #184
Spinnor said:
They would have been good lectures, I'm sure. Have you given any talks on this material that were recorded? Maybe it's time for one?
This term I am giving a course on quantum mechanics for mathematicians, but again not recorded; sorry.
 
  • #185
samalkhaiat said:
Good for you, if you can SEE mathematics or SEE the mathematical representation of the electron.
I am a mathematician. As one can easily infer (''see'') by looking at typical mathematics textbooks and articles, mathematicians SEE everything they understand! And (except for strict Bourbakists) it is all about forming the right mental pictures. No constructions but Anschauung.
 
  • #186
Spinnor said:
Maybe Mr. Neumaier has something that is already written up that outlines his research, on the level of Scientific American?
A simple introduction is perhaps Optical models for quantum mechanics.

Starting from the discovery that everything known today about a single qubit was already known to Stokes in 1852 in completely deterministic terms, long before the advent of quantum mechanics, I go on to discuss elements of my thermal interpretation.
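(As a rough numerical illustration of the Stokes description -- my own sketch, not code from the paper; the function name and conventions are mine, and the signs of ##S_2, S_3## vary between textbooks:

```python
import math

def stokes(ex: complex, ey: complex) -> tuple:
    """Stokes parameters (S0, S1, S2, S3) of a fully polarized
    plane wave with complex field amplitudes (ex, ey)."""
    c = complex(ex).conjugate() * complex(ey)
    s0 = abs(ex) ** 2 + abs(ey) ** 2   # total intensity
    s1 = abs(ex) ** 2 - abs(ey) ** 2   # horizontal vs. vertical linear
    s2 = 2.0 * c.real                  # +45 vs. -45 degree linear
    s3 = 2.0 * c.imag                  # circular polarization
    return (s0, s1, s2, s3)

# Horizontal linear polarization: all intensity sits in S1.
print(stokes(1.0, 0.0))        # (1.0, 1.0, 0.0, 0.0)
# Circular polarization: intensity sits in S3.
print(stokes(1 / math.sqrt(2), 1j / math.sqrt(2)))
```

This deterministic, classical-optics description already carries all the single-qubit structure usually presented as quantum.)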
 
  • #187
atyy said:
There is no difference between a physicist and a non-physicist.
If you hold that to be true, then it is natural to be a SUACAM type. I would say there is an important difference, which is those who wish to have a deep understanding of physics, versus those content to simply use the benefits of physics-- like someone who wants to understand electrodynamics, versus someone who just wants to use an iPhone. Note the distinction I draw there is not between practicing physicists and armchair physicists, it is between those who gain some degree of understanding from the theories, and those who are content that algorithms exist to predict outcomes. SUACAM should be happy with algorithms, but physicists generally are not-- even those who claim to be SUACAM types!
There are only differences between platonists and non-platonists. For example, take the tribe or whatever that counts 1, 2, 3, infinity. Are we any different? Has any computer counted to infinity, or is all of science consistent with the manipulation of finite strings? Only people like Goedel, who believe in the natural numbers, are different.
I agree that SUACAM types would be less likely to be platonists, but even non-platonic physicists generally seek a level of understanding of what they are doing, and are not content with purely syntactic algorithms for predicting outcomes. There are no "didactic sins" at all if our only goal is syntactic success, indeed we have no need to explain anything other than what equation to use and how to solve it. It's certainly true that physics starts with this, we have to teach people what equations to use when, how to solve them, and how to set up the experiments that test them. But it rarely ends there-- physics pedagogy almost always goes beyond the rules of what equations to use and how to solve them, and experimental acumen almost always goes beyond how to set up the experiment. Physics pedagogy attempts to inspire a deeper understanding, which will guide thinking toward the next theory by looking at essentially the philosophy of the current set of equations, and experimental acumen attempts to inspire what new experiments to try and what would be the most insightful way to get nature to reveal some new secret. These elements underpin SUACAM, they make it work better and produce a more satisfying result, though they come at the cost of producing some variance of opinion (as any forum can attest!). Vive la difference, it promotes varied pathways of exploration.
Bohmian mechanics has a cut, and Copenhagen has a cut. It just depends on how accurate one thinks that map is.
Yet to even assert this is to go beyond SUACAM, because in SUACAM, there are no cuts, there is only the syntax of the testable predictions, and that syntax is the same in Bohm, Bohr, or Everett. Maybe that won't always be true, as our technology allows us access to new tests, but when that's no longer true, then those will be separate theories rather than separate interpretations of the same theory. Hence what I am saying boils down to the reasons that we have interpretations of our theories in the first place-- it's not that we need to marry one interpretation or another, it's that we like to have them at all. But SUACAM never includes them, as they violate the "SU" part.

Let me pose that differently. Imagine you had access to an iPhone app that would allow you to input any experimental apparatus, and the app would output the result of the experiment. Would you then consider yourself empowered to be the greatest physicist ever, based on the complete mastery of the SUACAM approach you now have? We could call it the "nature app". But in a sense physics begins with the nature app; it doesn't end there, because nature will already provide us with the syntactic output of any experiment we can set up. What we want from physics is more than that-- we also want a semantic content, a kind of lesson extracted from a theory that can provide an insightful shortcut to the output of the "nature app." Without that, we don't really have anything we can call physics, we just have a more convenient means for asking nature questions.

A really accurate map should contain a tiny version of itself in the map which contains a tiny version of the map in itself etc. Bohmian mechanics is the belief that our map should at least contain a tiny version of ourselves.
That sounds both profound and impossible at the same time!
 
  • #188
Ken G said:
If you hold that to be true, then it is natural to be a SUACAM type. I would say there is an important difference, which is those who wish to have a deep understanding of physics, versus those content to simply use the benefits of physics-- like someone who wants to understand electrodynamics, versus someone who just wants to use an i-phone. Note the distinction I draw there is not between practicing physicists and armchair physicists, it is between those who gain some degree of understanding from the theories, and those who are content that algorithms exist to predict outcomes. SUACAM should be happy with algorithms, but physicists generally are not-- even those who claim to be SUACAM types!

Well, but if this non-physicist believes that electric fields and spacetime etc. really exist in reality (not just as a mathematical model), then he will be indistinguishable from the physicist who has reality, mathematical model, and syntax.
 
  • #189
Ken G said:
If you hold that to be true, then it is natural to be a SUACAM type. I would say there is an important difference, which is those who wish to have a deep understanding of physics, versus those content to simply use the benefits of physics-- like someone who wants to understand electrodynamics, versus someone who just wants to use an i-phone. Note the distinction I draw there is not between practicing physicists and armchair physicists, it is between those who gain some degree of understanding from the theories, and those who are content that algorithms exist to predict outcomes. SUACAM should be happy with algorithms, but physicists generally are not-- even those who claim to be SUACAM types!

Let's say we do Euclidean geometry. Then (under some circumstances) lines and points are dual. Then there are also real lines and real points. Since lines and points are dual, the real line can be modeled as a mathematical point. So what is real? I think the lines and points are not real, only the correspondence between reality and syntax.

Also, does "Skolem's paradox" have any relevance here? https://en.wikipedia.org/wiki/Skolem's_paradox

Or can we escape it by using second-order logic? http://lesswrong.com/lw/g7n/secondorder_logic_the_controversy/

We should see what A. Neumaier is using in his robots :biggrin:
 
  • #190
A. Neumaier said:
One can repeat the experiment many times only for microscopic systems, since the assumption underlying the statistical interpretation is that one can prepare a system independently and identically many times. It is impossible to do this for a macroscopic system, let alone for a quantum field that extends from the Earth to the Sun.

Yes, in QFT everything is deterministic; God doesn't play dice, since the world was created according to a QFT. The randomness lies in the inability to reproduce identical quantum conditions for a macroscopic system, together with the inherent chaoticity of the kinetic, hydrodynamic and elasticity equations for macroscopic matter.

For the system under discussion in the main part of this thread, it is the randomness in the photodetector that is responsible for the indeterminism.

Really enjoying this thread.

Where does the "inherent chaoticity of the kinetic, hydrodynamic, and elasticity equations for macroscopic matter" come from? If macroscopic objects are composed of quanta which behave deterministically, even if their behavior is unpredictable, there must be some cause?
 
  • #191
Spinnor said:
[...] Maybe Mr. Neumaier has something that is already written up that outlines his research, on the level of Scientific American?
The book is more detailed than anything else I've seen.

BTW, I think you mean "O. Univ-Prof. Dr. Neumaier", not "Mr. ..." :-)
 
  • #192
Jimster41 said:
Where does the "inherent chaoticity of the kinetic, hydrodynamic, and elasticity equations for macroscopic matter" come from?
Almost every nonlinear deterministic system is chaotic, in a precise mathematical sense of ''almost'' and ''chaotic''. It ultimately comes from the fact that already for the simplest differential equation ##\dot x = ax## with ##a>0##, the result at time ##t\gg 0## depends very sensitively on the initial value at time zero, together with the fact that nonlinearities scramble up things. Look up the baker's map if this is new to you.
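A minimal numerical sketch of this sensitivity, using the baker's map (my own illustration; the function names are hypothetical):

```python
def baker(x: float, y: float) -> tuple:
    """One step of the baker's map on the unit square:
    stretch horizontally by 2, cut in half, and stack."""
    if x < 0.5:
        return 2.0 * x, y / 2.0
    return 2.0 * x - 1.0, (y + 1.0) / 2.0

def x_separation(x0: float, delta: float, steps: int) -> float:
    """Horizontal distance between two trajectories started
    a distance delta apart, after the given number of steps."""
    xa, ya = x0, 0.3
    xb, yb = x0 + delta, 0.3
    for _ in range(steps):
        xa, ya = baker(xa, ya)
        xb, yb = baker(xb, yb)
    return abs(xa - xb)

# The initial separation of 1e-9 doubles at every step:
print(x_separation(0.1, 1e-9, 0))    # ~1e-9
print(x_separation(0.1, 1e-9, 25))   # ~1e-9 * 2**25, i.e. ~0.03
```

After roughly 30 steps the two trajectories are effectively uncorrelated, even though the dynamics is perfectly deterministic.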
 
  • #193
strangerep said:
One stumbling block is that Arnold's book does not discuss Bell's theorem nor its cousins, so all the standard objections about hidden variables flood into my mind when I hear an interpretation that sounds deterministic.

All arguments I have seen against hidden variable theories - without exception - assume a particle picture; they become vacuous for fields.

Indeed, already the simplest deterministic fields - plane waves - behave in precisely the nonlocal way that is responsible for Bell's theorem. This is the reason I don't discuss the latter in my book. For my book is supposed to be free of all weird quantum stuff (which is weird only because of an inappropriate interpretation of the phenomena) - without didactical sins in the sense of the present thread.

But I had written a paper on Bell inequalities quant-ph/0303047 = Int. J. Mod. Phys. B 17 (2003), 2937-2980, which is in fact the (at that time still embryonic) origin of my thermal interpretation. I also discuss this stuff in my lecture on Classical and quantum field aspects of light and in my paper A simple hidden variable experiment, though without direct reference to my thermal interpretation.

It follows that quantum field theory is not affected by the extended literature on hidden variables.
(Further discussion of this please in this thread on randomness.)

The problems of few-particle detection arise because their traditional treatment idealizes the detector (a complex quantum field system) to a simple classical object with a discrete random response to very low intensity field input. It is like measuring the volume of a hydrodynamic system (a little pond) in terms of the number of buckets you need to empty the pond - it will invariably yield integral values unless you model the measuring device (the bucket) in sufficient detail to get a continuously defined response.
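(A toy numerical version of the bucket analogy - my own sketch, with hypothetical function names:

```python
import math

def bucket_count(volume: float, bucket_size: float = 1.0) -> int:
    """Idealized measurement: the number of bucket loads needed
    to empty the pond. The answer is invariably an integer."""
    return math.ceil(volume / bucket_size)

def modeled_response(volume: float, bucket_size: float = 1.0) -> float:
    """Modeling the measuring device in more detail (here: keeping
    track of the partial fill of the last bucket) recovers a
    continuously defined response."""
    return volume / bucket_size

print(bucket_count(7.3))      # 8   - looks like discrete counts
print(modeled_response(7.3))  # 7.3 - the underlying continuum
```

The integral "clicks" are an artifact of the idealized measuring device, not evidence that the pond's water comes in buckets.)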

Maybe this will act as a dam against the metaphysical flood.
 
  • #194
A. Neumaier said:
All hidden variable theory arguments I have seen - without exception - assume a particle picture; they become vacuous for fields. Indeed, already the simplest deterministic fields - plane waves - behave precisely the nonlocal way that is responsible for Bell's theorem. This is the reason I don't discuss the latter in my book.
It would be nice (understatement of the year perhaps) if one could demonstrate by means of a simulation, even with a "toy model", that a field model can produce the results that with particle models look like "spooky action at a distance".

For example, just now I found a paper from a few years ago by Matzkin, http://arxiv.org/abs/0808.2420v2. At first sight the model presented there looks like a hidden variables model (but using field theory), and it looks simple enough to be tested with numerical simulations. Regrettably, it appears that that paper has not been peer reviewed.
[..]But I had written a paper on Bell inequalities quant-ph/0303047 = Int. J. Mod. Phys. B 17 (2003), 2937-2980, which is in fact the (at that time still embryonic) origin of my thermal interpretation. I also discuss this stuff in my lecture on Classical and quantum field aspects of light and in my paper A simple hidden variable experiment, though without direct reference to my thermal interpretation.[..]
Thanks :smile:
 
  • #195
strangerep said:
BTW, I think you mean "O. Univ-Prof. Dr. Neumaier", not "Mr. ..." :-)

Did not know, no disrespect intended.
 
  • #196
A. Neumaier said:
Almost every nonlinear deterministic system is chaotic, in a precise mathematical sense of ''almost'' and ''chaotic''. It ultimately comes from the fact that already for the simplest differential equation ##\dot x = ax## with ##a>0##, the result at time ##t\gg 0## depends very sensitively on the initial value at time zero, together with the fact that nonlinearities scramble up things. Look up the baker's map if this is new to you.
P. 32 of your lecture:
"Thus the QED photon is a global state of the whole space, a time-dependent solution of the Maxwell equation. It acts as a carrier of photon particles, which are extended but localized lumps of energy moving with the speed of light along the beam defined by a QED photon state. It is interesting to note that Colosi & Rovelli 2006 arrived at a similar conclusion from a completely different perspective. They argue from quantum general relativity, starting with the Unruh effect."

So the global state of the whole (H space of states) spans what space-times? I am trying to connect this notion to the AdS/CFT correspondence, where entanglement seems important. Your perspective is that "entanglement" is captured by this "global" field state. Does that global EM field state "do" anything to relate space-like separated space-times, or is it totally irrelevant? I am reading you as saying, "yes, of course it does". But I'm not exactly sure what you are saying it does. What does it mean for a global EM field state to connect space-like points?

I am totally intrigued by the connection you are pointing to with the Unruh Effect. This makes me think there is some synergy or reconciliation between your theory and those theories of quantum GR.

I often hear "determinism" wrongly, because of the way deterministic chaotic dynamics both are and are not the same as "randomness" and "unpredictability". I totally buy local deterministic dynamics leading to chaotic, unpredictable outcomes (the baker's map). What confuses me, at the level of detail, is why nature should be model-able in this simultaneously recursive and diffusive way, and whether or not the idea of non-locality is involved; I'm trying to understand your take on that. Also, the relation of chaotic dynamics to periodic structure, or "self-similarity", in chaotic systems seems relevant to this problem. Nature is clearly not just a stirring process, but rather a strange mixture of stirring and self-organization, right?

Note: this conversation does seem to connect to Prof. N's Insights article on Causal Perturbation Theory and the discussion of the same, so I hope I don't get in trouble for careless thread logistics.
 
  • #197
Saying much more here on the thermal interpretation would go too far from the topic of this thread - "didactic sins". Could you perhaps open a new thread, and cut down your answer here to a link to there? Then I'll answer in the new thread.
 
  • #198
A. Neumaier said:
A simple introduction is perhaps Optical models for quantum mechanics.

Just a question that has been bugging me for a while. On p. 19, point 3: "Traditional quantum mechanics does not answer this. But it provides formulas for the computation of the mean position ##\langle q\rangle## and the mean momentum ##\langle p\rangle## of each quantum object which can be prepared as an individual ... provided that one assigns a state to each individual object. Those strictly adhering to a statistical interpretation may find this a forbidden use. But how else shall we encode into quantum mechanics the knowledge that, at a particular time, a particular object is at a particular place in the experimental setup?"
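(For reference, the means referred to in the quoted passage are the usual expectation values of traditional quantum mechanics,
$$\langle q\rangle = \langle\psi|\hat q|\psi\rangle, \qquad \langle p\rangle = \langle\psi|\hat p|\psi\rangle,$$
for a system in the normalized state ##|\psi\rangle##.)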

Would it be possible to have something like a quadrupole -- any lensing-type effect on the microscale, or a field that has that effect?
[Attachment: img046.jpg]
The black circle is a particle, x is the (assumed) true position of the particle, and the four x's are the locations of the projections or critical points. Sorry for the crappy illustration.
 
  • #199
atyy said:
I think the lines and points are not real, only the correspondence between reality and syntax.
I agree; indeed, I feel what you mean by "reality" here is what I mean by "meaning" or "semantics" when applied to the syntax of objective perceptions. So I would say that what meaning we can give to what is real is whatever correspondence we can find between semantics (by which I mean the syntax of a metalanguage we create to talk about reality) and the syntax of objective perceptions. Then we also have the syntax of our physics theory, which can serve as a kind of simplified replacement for the syntax of the objective perceptions. Testing the connections between those three syntaxes is what we call science.

Since the testing process itself requires another model to say when a test has been passed, we need another model of the scientific process itself, and when we want to know what that means, we need yet another model-- so we find that it is models all the way up. Each model has a metalanguage syntax that supplies meaning to the model below it, but requires its own model to supply meaning to it. Usually we imagine the models going downward, from rocks to atoms to quantum fields, etc., but it seems to me the models go up as well, because a syntactic manipulation of a model that gives us a sense of meaning, which we then call an interpretation, is like an "upward" model rather than a downward one. When we interpret quantum mechanics we "lift" the formal QM syntax up into a more everyday language, one capable of attributing meaning, but that lifting is not unique. Each such lifting can then spawn its own downward set of models, so the interpretation process can be used to find new paths to new theories.

All we mean by "reality" is the meaning we attribute to our models, i.e., the "upward" modeling process, but SUACAM is always looking downward, considering only the syntax of the theory and the syntax of the objective perceptions, never the metalanguage syntax that provides semantic meaning to either. The sense of understanding we get comes from that upward modeling process, so from the interpretations we find-- not from simply finding a successful simplification from the syntax of objective perceptions to the syntax of some theory. That's my issue with SUACAM, and is the reason I claim no physicists (or physics thinkers like yourself) actually do that.
Also, does "Skolem's paradox" have any relevance here? https://en.wikipedia.org/wiki/Skolem's_paradox
As I understand that paradox, it says that we can use a syntactic system to prove the existence of things that our model of the semantic system cannot give any meaning to. So it is in some sense the opposite of Goedel's proof: Goedel showed that truth-by-meaning can extend beyond truth-by-proving, but here we have that truth-by-proving can extend beyond truth-by-meaning. I guess the bottom line is that what we regard as true, and what we can prove to be true, are just not the same things in many important situations.
Or can we escape it by using second-order logic? http://lesswrong.com/lw/g7n/secondorder_logic_the_controversy/
That's a remarkable blog entry, hard to follow but it seems to make the case that first-order logic, first-order set theory, and second-order logic, form a sequence of increasing proving power but also increasing uneasiness around their soundness. It seems mathematicians are free to choose their own personal comfort level in how far down that rabbit hole they wish to go!
We should see what A. Neumaier is using in his robots :biggrin:
Indeed!
 
  • #200
atyy said:
We should see what A. Neumaier is using in his robots :biggrin:
Neither paradox nor magic; everything is nicely decidable or remains undecided. Just a large and detailed semantic memory, together with algorithms to automatically expand it with new, useful content, with heuristics to decide what falls under this category (to avoid learning didactical sins) and heuristics to clean up older information (to unlearn what turned out to be a didactical sin). The heuristics are derived from the sort professional mathematicians use. The implementation is at its very beginning - it is already a lot of work to impart to a computer program the implicit knowledge needed to read a single sentence of a math textbook. Also, we are not creating robots - hands and feet are irrelevant for an automatic mathematician.
 
  • #202
Nice post, together with the comprehensive mathematical treatment. Although I am a physics graduate, I am having a hard time grasping the mathematical part, since my quantum mechanics and classical mechanics are a bit rusty. What should I particularly revise to get this?

Thanks
 
  • #203
Septim said:
Nice post, together with the comprehensive mathematical treatment. Although I am a physics graduate, I am having a hard time grasping the mathematical part, since my quantum mechanics and classical mechanics are a bit rusty. What should I particularly revise to get this?

For the QM bit - Ballentine - Quantum Mechanics - A Modern Development.

For Classical Mechanics - Landau - Mechanics.

Be amazed at the rock bottom of what a lot of physics is about - symmetry.

Thanks
Bill
 
  • #204
bhobba said:
For the QM bit - Ballentine - Quantum Mechanics - A Modern Development.

For Classical Mechanics - Landau - Mechanics.

Be amazed at the rock bottom of what a lot of physics is about - symmetry.

Thanks
Bill

Leon Lederman's book "Symmetry and the Beautiful Universe" makes your point (symmetry being at the foundation of most physical concepts) very well. It's far less mathematically demanding than the other sources you've offered... which, of course, is why I was better able to understand what he was saying. It also gave me a much greater appreciation for the work of Emmy Noether.
 