EPR/Bohm/Bell & Localism vs Universalism

  • Thread starter glengarry
In summary, the conversation discusses the progression from EPR to Bohmian mechanics to the Bell theorem and how it challenges the common sense notion of locality in understanding the constitution of physical reality. The speaker questions whether non-localism implies universalism and if the concepts of semi-localism or semi-universalism can be rigorously defined. They also discuss the use of the term non-local in quantum mechanics and suggest that a choice between localism and universalism could help eliminate certain theories.
  • #36
glengarry said:
Let me just make the picture very, very simple.

Fair enough.

Just take a bunch of guitar strings and pluck them.

Okay, I’ll use my own villainous technique: plunk-ploink-saw-smack-hack-slash-clutch-slap!

(... is it okay if I use my dreadful wawa fuzz pedal, also ... ?:rolleyes:?)

You'll want some of them to be fast and feeble to model "matter".

I think that patent belongs to Metallica...

And others should be slow and intense in order to model "gravity fields".

That would be Jose’s 4-tonne masterpiece.

https://www.youtube.com/watch?v=pC-49GnjdTE <-- INSIDE the thing!


Now bend them into closed loops and superimpose all of them atop each other, such that their closure points occupy various locations. Now imagine this situation in higher dimensional contexts, if you dare.

Definitely Jimi in hazy dimensions! Are You Experienced?

https://www.youtube.com/watch?v=rNtuyQkh7tE
 
  • #37
The desire to dismiss these concepts as "philosophical hand wavery" or to make jokes out of my attempted illustrations is a fairly predictable response, and I would expect nothing less. I just hope that you guys will refrain from too much more unnecessary joke-making in the future. It's not a big deal to me, but I'm concerned that the rest of the readers of this thread will have trouble following along.

Anyhoo, the fact still remains that the Bell line of reasoning points to the notion that reality consists of mechanisms that are fundamentally spatial. There are really no other attempts to offer rational, explicit formulations of these mechanisms as far as I am aware.

Again, all that I'm doing here is just applying the central mathematical device of Quantum Mechanics (the wavefunction) directly to the most general notion of spatiality that we have available to us (the "universe").

By the way, the picture that I'm trying to paint is no different from the basic "particle in a box" picture that exists at the very beginning of any course on QM. It is in this way that the common notion of the particle in free space is transformed into that of the harmonically oscillating standing wave.

Regardless, the whole point of this thread is that real progress can be made in theoretical physics if we just stop holding firm to the notion that scale has any fundamental standing in terms of an objective description of nature. Since time immemorial, there has been the problem of how to resolve the granular picture of the world with the continuous picture of the world. From my perspective, it seems that the highest caliber "thinkers of physical reality" (call them theoretical physicists or natural philosophers, either term works for me) hold firm to the idea that there is no difference between a localized material body and the field that surrounds it.

So, my theoretical model of a Bohm-like universe attempts to resolve the discrete/continuous conundrum precisely in terms of simple topological transformations of waveforms that are defined in Euclidean space, so that they may inhabit a singular universal domain in which relative translational motions are possible.

The fundamental causal principle for this model is simply that the universal hyperspherical surface must remain as flat as possible at all times. That is, the difference between the global minima and maxima of the composite waveform must be minimized. Logically speaking, this is no easy task in any sufficiently complex universe in which there are countless numbers of oscillators that have huge disparities in their waveform geometries, intensities (maximum amplitudes), and frequencies. So, in terms of being able to come up with trivially solvable mathematical equations in order to satisfy "chalkboard theorists", this model will offer very little in the way of satisfaction.

But in terms of offering an effectively infinite array of thoroughly dynamic -- yet fundamentally stable -- arrangements of the composite manifold so that life-filled worlds such as ours at least have a chance of evolving, then this model should be something of a breath of fresh air.

Let's just think of the basic notion of heat for a moment. I am not aware of any theoretical paradigm that includes this concept at its very core. That is, heat is fundamentally understood to consist of relative oscillatory motions between atoms. But the question of why these oscillations occur is not an easy one to answer. In my model, however, the elementary constituents are simply harmonic oscillations of the universal spatial manifold. So in order to satisfy the fundamental principle that calls for a composite waveform that is as flat as possible, every element must adapt itself to constantly changing configurations of the manifold at the same time that it directly contributes to these changes.

In terms of models of reality that give reasonable chances for life-as-we-know-it to develop, the inclusion of heat at this most fundamental of levels is a no-brainer. The way I see it, this perpetual oscillatory motion of matter is the determining factor in terms of whether the evolution of life is possible. The idea of the existence of a substance that dependably regulates heat (such as water) is a close second in this regard. So, I would say that the theoretical Bohm-like universe that I've constructed is of a fundamentally thermodynamic character. The idea that we are seeking to "smooth out" the universal waveform is highly correlated to the second law of thermodynamics.

But it is also important to develop a reasonable model of signal propagation. This will allow us to understand the mechanisms underlying the way in which electromagnetic waves traverse seemingly vacant spaces. My understanding in this regard is just that signal propagation consists of the necessarily time-consuming process of delivering effective fronts of organized waveform oscillations to various regions of the universe. This just means that there is a necessary delay (due to the thermodynamic nature of the universe) between when a "light switch" is turned on and when the elements that constitute the light source have been able to organize their oscillatory motions so that they can collectively be detected at some arbitrary distance from their effective locations.

The difference between this model and the current model (based on independently traveling photons) is profound. In the current model, there is a definite distinction between a source event and a reception event. But in this model, there is no distinction because there is truly no separation between the elements at the source and those at the receiver. This just means simply: once the wavefront has reached the receiver, all events are seen without delay. The corollary is: if the wavefront has not reached the receiver, no events will ever be seen.

I realize that there is an obvious dichotomy between this theoretical notion of "no delay seeing" and the everyday experience we have with the delays in technology-based signal transmissions that bridge non-trivial distances. The best explanation I can offer is that, in terms of signal propagation processes that involve real world components (e.g., satellite links), there are many underlying, non-trivial thermodynamical mechanisms taking place that must all be coordinated in order to allow for the technology to work in the first place. In terms of what we are seeing when we are looking at any celestial object with our naked eyes, my model unabashedly dictates that we are seeing its current, immediately given state (minus any delays that occur within our own brains, of course).
 
  • #38
glengarry said:
In terms of what we are seeing when we are looking at any celestial object with our naked eyes, my model unabashedly dictates that we are seeing its current, immediately given state (minus any delays that occur within our own brains, of course).

Brain delays seem like a troublesome topic. Nevertheless, I have to ask you about this picture:

[Image: WMAP nine-year CMB sky map (ILC)]


According to all known facts these photons from the CMB show us the state of the surface of last scattering, when universe was about 370,000 years old, i.e. roughly 13.8 billion years before our time.

Are you saying that this is “its current, immediately given state”?

This must be a joke.
 
  • #39
I'm not understanding your model for cosmic distance signal propagation, especially basing it on thermodynamics, and even more so the "no delay seeing" after the arrival of the first wavefront.

Thermodynamics is applicable to small systems that can change quickly, not large slow systems, and not communication between distant large systems. In other words, I think thermodynamics may be a "local" application, not appropriate for a "universal" application.

I'm thinking that a wavefront is not a unique part of a propagating wave; there may be a leading wavefront that initiates an observation, but subsequent observation is still seeing subsequent wavefronts. What are you thinking is special about the leading wavefront that would invoke "no delay seeing"?

If the light is switched on, and then after a delay the leading wavefront arrives, you say that "no delay seeing" begins... but if the light was switched off 10 seconds after being switched on, and the leading wavefront took 30 seconds to arrive, what is this "no delay seeing" going to see for the first 20 seconds after the arrival of the leading wavefront: an on light or an off light?
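For contrast, the arithmetic of the conventional finite-propagation-speed picture is easy to sketch (a toy model using the illustrative 10-second pulse and 30-second travel time from the question above):

```python
# Conventional picture: the receiver at time t sees the source state emitted at t - travel_time.
# The 10 s "on" interval and 30 s one-way travel time are the illustrative numbers from the post.

def receiver_sees_light(t, on_interval=(0.0, 10.0), travel_time=30.0):
    """True if the state arriving at the receiver at time t was emitted while the light was on."""
    emitted_at = t - travel_time
    return on_interval[0] <= emitted_at < on_interval[1]

# In this picture the receiver sees "on" only during [30, 40) s: the 10 s pulse, delayed by 30 s.
assert not receiver_sees_light(29.0)   # wavefront has not arrived yet
assert receiver_sees_light(35.0)       # within the delayed pulse
assert not receiver_sees_light(45.0)   # the delayed pulse has passed
```

Any "no delay seeing" model would need to account for the same bookkeeping in ordinary observations.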
 
  • #40
glengarry said:
It is commonly understood that the progression from EPR to Bohmian mechanics to the Bell theorem relates to the idea that there is a common sense notion of the constitution of physical reality that is in need of being reevaluated. This notion is simply that the fundamental elements of physical reality are to be understood as causally isolated material points (ie particles). The name we can give to this notion is the principle of locality, or more simply, localism.
I had never seen locality defined in this way. But I guess it makes sense that locality implies point-particle-type objects, even if it is often mentioned in the context of fields. Maybe this is what you mean by semi-localism below.
glengarry said:
In this post, I will finally attempt to explicitly answer the question "What is meant by the term universalism?" by way of a theoretical construction of a Bohm-like universe residing in classical spacetime.
I'm not sure how the notion of universalism you present is related to anything Bohm-like.
glengarry said:
The way to begin thinking about this is that a "body" must be represented as some kind of a field, such that all of the values of the field are necessarily related to one another.
If you are here basically trying to highlight the particle-view/field-view duality, and saying the field view is more complete, I for one agree with you (see my posts in the "quantum incompleteness" thread), and probably many others do. But that's a long way from any "new" model, IMO. For that you first need to concentrate on testable predictions rather than the "philosophy" of that way of looking at things.
glengarry said:
I think it is fairly intuitive that the "matter like" waveforms will appear to clump together due to some kind of intrinsic attractive force, but given that we know that gravity can best be described as the curvature of the spatial manifold, the easiest thing to do is to just imagine these curvatures as fundamental elements of physical reality.
I think there are some unwarranted leaps of faith here. Gravity is related to spacetime curvature rather than spatial curvature.

glengarry said:
For some reason, the idea of creating yet another particle in order to "explain" what the force of gravity is all about just doesn't do it for me.
Agreed.

glengarry said:
From my perspective, it seems that the highest caliber "thinkers of physical reality" (call them theoretical physicists or natural philosophers, either term works for me) hold firm to the idea that there is no difference between a localized material body and the field that surrounds it.
Good point, perhaps you could post some quotes from those thinkers where this is manifestly seen.
glengarry said:
So, my theoretical model of a Bohm-like universe attempts to resolve the discrete/continuous conundrum precisely in terms of simple topological transformations of waveforms that are defined in Euclidean space, so that they may inhabit a singular universal domain in which relative translational motions are possible.
The fundamental causal principle for this model is simply that the universal hyperspherical surface must remain as flat as possible at all times.
This is confusing: is the universal surface hyperspherical or Euclidean? Both are compatible with an FRW spacetime.

The discrete/continuous conundrum is very deep and important, so it is laudable that you are so interested in it and want to tackle it. But seriously, much more than some apparently "good-looking" ideas is needed; then again, if you want to go about this rigorously, a forum is probably not the best place for it.

glengarry said:
That is, the difference between the global minima and maxima of the composite waveform must be minimized. Logically speaking, this is no easy task in any sufficiently complex universe in which there are countless numbers of oscillators that have huge disparities in their waveform geometries, intensities (maximum amplitudes), and frequencies. So, in terms of being able to come up with trivially solvable mathematical equations in order to satisfy "chalkboard theorists", this model will offer very little in the way of satisfaction.
Sorry, but in physics: no equations = no model.
 
  • #41
glengarry said:
...The difference between this model and the current model (based on independently traveling photons) is profound. In the current model, there is a definite distinction between a source event and a reception event. But in this model, there is no distinction because there is truly no separation between the elements at the source and those at the receiver. This just means simply: once the wavefront has reached the receiver, all events are seen without delay. The corollary is: if the wavefront has not reached the receiver, no events will ever be seen.
...

Actually, I wasn't joking as much as it might have seemed*. I was asking how to distinguish one interpretation from another. And this touches upon a point I have asked about before (in other threads), and never received a satisfactory response one way or the other. Simply stated: do free photons exist?

Some have stated that photons are abstract elements which convey the EM force. In effect, this holds that ALL photons are virtual in a sense. If that were the case, which I believe your model implies, then one could conduct an experiment to confirm it. I believe this to be quite feasible. If I shine a beam of light toward a dark area of the sky, it can NEVER reach anything to be absorbed.** Therefore, the light cannot be emitted in that direction! I can think of a couple of setups which could exploit this peculiar attribute. *Well, maybe I was a little bit. :smile:

**The reason being that we are in an expanding universe; most of the light emitted by the sun can never reach most regions of the universe. So presumably, if there are free photons, they are sailing out into the vast empty and expanding regions between galactic clusters where they will continue their lonely existence forever.
 
  • #42
Again, this post took a loooooong time to write. Hope you all enjoy...

DevilsAvocado said:
Brain delays seem like a troublesome topic.

Yes, very troublesome given the ridiculously complicated machinery that underlies a single human brain. My only point is that, just like the talk about the "real world components" that allow for the existence of satellite links between various parts of the world, the brain also consists of such components that allow for signals to be transferred between the optical nerves and the regions of the brain that are responsible for visualization. Indeed, these are not easy processes to understand.

According to all known facts these photons from the CMB show us the state of the surface of last scattering, when universe was about 370,000 years old, i.e. roughly 13.8 billion years before our time.

Are you saying that this is “its current, immediately given state”?

This must be a joke.

Strange indeed, but true. By the way, it's not technically me that is saying it. I personally don't care what the actual, deep truth of the universe is. I'm just saying that if a person happens to be interested in the idea of theoretical models of Bohm-like universes, then that person might be interested in understanding the theory underlying the model that I've developed. My point is simply: according to my model, the explanatory picture that consists of some kind of independently given "particle" that moves in a linear trajectory through space at some universally given velocity simply doesn't apply.

You even used the word "photons" in the buildup to your question. But I clearly anticipated your argument when I wrote the following in the paragraph immediately preceding the one you quoted:

glengarry said:
The difference between this model and the current model (based on independently traveling photons) is profound. In the current model, there is a definite distinction between a source event and a reception event. But in this model, there is no distinction because there is truly no separation between the elements at the source and those at the receiver.

Now there should be no theoretical controversy here, as I have taken pains to ensure that my model is internally logically consistent. The next obvious question, therefore, is how the picture of the CMBR can be explained by my model. So here it goes...

First of all, I've made reference to the idea that some of the elemental waveforms in the universe should be fast and feeble in order to model matter, while others should be slow and intense in order to model gravity fields. But I wasn't quite explicit enough, because what I meant to say is that there should be a strict inverse relationship between the maximum amplitudes (aka "intensities") and the frequencies of any given waveform.

This relationship can be trivially satisfied by the equation:

frequency × maximum amplitude = a universal constant

But don't worry if this doesn't seem like how nature works, because I'm not fundamentally making a commentary about any empirically determined wave phenomena. That is, I am fully aware of the ability to, for example, amplify sound waves and make laser beams more intense without changing the frequencies involved. I'm rather making a fairly simple commentary on the fact that gravity fields obviously do not oscillate very quickly (e.g., the Earth has been around for billions of years) and that atoms obviously do (given the frequencies of atomic spectral emission/absorption lines). The next step is to realize that we just think of it as "natural" that very powerful oscillators are relatively slow, and that very feeble oscillators are relatively quick. This should just make "sense," and I could offer additional illustrations to show this, but for now, I'm just going to assume that we can all agree on its reasonability.
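As a sketch only, the posited relation can be written in a couple of lines; the constant K and the sample frequencies are purely illustrative values, not measurements:

```python
# Posited relation from the post: frequency * max_amplitude = K, a universal constant.
# K = 1.0 and the sample frequencies below are made-up illustrative values.

K = 1.0  # hypothetical universal constant

def max_amplitude(frequency):
    """Maximum amplitude implied by the posited inverse relation."""
    return K / frequency

# "Gravity-like" waveform: very low frequency, hence very large amplitude.
# "Matter-like" waveform: very high frequency, hence very small amplitude.
assert max_amplitude(1e-15) > max_amplitude(1e15)
assert abs(max_amplitude(2.0) * 2.0 - K) < 1e-12
```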

Now, the next point of agreement must simply be that, in terms of the model that I've constructed, the theory of independently propagating particles (i.e., photon theory) simply cannot possibly apply. All that this Bohm-like universe consists of is universally defined waveforms that are always collectively trying to realize a state of maximum smoothness of the composite waveform. In other words, this universe is already completely "occupied". There simply isn't any room for things like photons. I don't care how many photons come knocking at the door of this universe: the sign outside says NO VACANCY!

So, my previous post attempted to clearly describe the mechanism that would allow for something like electromagnetic energy to traverse effectively isolated regions of the universe. And the keyword to note from the above sentence is "effectively", because the reality of the universe is that all of the elements in it are universally defined. The question at hand, therefore, is concerning the nature of this "effective isolation" between regions of the universe. That is: why are we not being constantly bombarded with sensations from every part of the universe at once? The answer is simply due to the way in which each elemental waveform must be morphed into a hyperspherical form in order to allow for the possibility of relative translations between other waveforms. In the morphed state, its spatial density increases squarewise as we approach the central boundary point. This is just the natural geometric consequence of morphing a three-dimensional ball into a hypersphere.

So, now let us imagine a universe with two very slow, powerful [gravity field like] waveforms that have settled at polar opposite sides of the universe. Now let's randomly fill the universe with trillions of very quick, feeble [atom like] waveforms. We are going to see what looks like two big spheres of atoms ("worlds") as the atoms seek to fill the "holes" that are created by the gravity fields.

Now, let's consider a thought experiment in order to develop a crude model for the sensation of sight. Let's say there is a group of a million atoms on one of the worlds to be used as a light source. And on the other world, there is another group of a million atoms to be used as a receiver. In terms of the receiver, the atoms begin in a particular configuration that we can call "no reception". But once the configuration has been suitably rearranged, we can say that it is in a state called "signal received". In terms of the source, the particular configuration doesn't matter. All we want to do is coordinate the relative motions of the source elements so that there is a singular, suitably intense waving of the spatial manifold at the region of the receiver.

In the "off" state, the atoms at the light source are indeed moving, but we can say that their motions are fairly calm and disorganized. Given the fact that the local amplitudes of the individual atom-like waveforms get increasingly "spread out" through space as we move from their central boundary points, the ability for each one to offer any substantial changes to the composite universal waveform consequently diminishes. But if relatively large numbers of these elements can oscillate in tandem, then they will collectively be able to effectively alter further depths into the universe.

All that we really need to know about this thought experiment is that there is one of two situations:

1) The elements at the source are not oscillating with enough collective energy in order to change the configuration of the elements at the receiver to the "signal received" state.

2) The elements at the source are oscillating with enough collective energy in order to change the configuration of the elements at the receiver to the "signal received" state.

What this means is that for every level of energy transferred to the collection of source elements, there is a corresponding distance for which a receiver located at that distance will be affected by the oscillations of the composite waveform such that it will just barely cross over the threshold from the "no signal" state to the "signal received" state. And the crucial thing to realize in this picture is that the fundamentally thermodynamic character of the universe dictates that there is a necessary delay between any two given source energy states. That is, the process of inputting energy into the elements at the source necessarily requires some positive length of time.

So, when we flip the switch at the source to the "on" position, we can see that there will be a moving "surface of possible signal reception" whose precise location is determined by the following factors:

1) the amount of energy that has been input into the source
2) the level of sensitivity of the receiver to detect whether the arrangement of its elements has been suitably altered

In this theoretical universe, it is precisely the speed of this surface that determines what, in our universe, is called the speed of light, or c.

Now, the only reason I went through the preceding explanatory process was to show that the frequency of oscillation that is detected at the receiver is fundamentally correlated to the maximum amplitudes (intensities) of the elemental waveforms. That is, the only mechanism we have available to us in order to cause relative translatory oscillations (i.e., the motions that occur between elements) is nothing other than absolute elemental waveform oscillations (i.e., the motions of the spatial manifold that are intrinsic to the elemental waveforms themselves). All of these considerations point simply to the following: the location of a given source's "surface of possible signal reception" is proportional to the intrinsic intensities of the source elements.

This just means that source elements of larger maximum amplitude will tend to have two characteristics:

1) smaller frequencies
2) the ability to generate signals that effectively penetrate deeper into space
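A minimal numerical sketch of this "surface of possible signal reception", under assumptions of my own rather than anything stated above (an inverse-square spreading law, collective amplitude proportional to input energy, and a fixed receiver threshold; all numbers illustrative):

```python
# Toy model: the source's collective amplitude spreads out with distance, and the
# receiver trips when the local amplitude crosses a fixed threshold.
# The 1/r**2 spreading law and the threshold value are assumptions for illustration.

def signal_amplitude(energy, r):
    """Assumed local amplitude at distance r for a given input energy."""
    return energy / r**2

def reception_radius(energy, threshold=1e-6):
    """Largest distance at which the receiver still crosses its threshold."""
    return (energy / threshold) ** 0.5

# More input energy pushes the "surface of possible signal reception" deeper into space.
assert reception_radius(4.0) == 2 * reception_radius(1.0)
# At the reception radius, the amplitude just meets the threshold.
assert abs(signal_amplitude(9.0, reception_radius(9.0)) - 1e-6) < 1e-18
```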

Now, let's just take this idea to an extreme level. Let's imagine that we can live forever and wait for gravity field waveforms to oscillate many times. Each of these waveforms obviously has very significant amplitudes at extremely far depths into space. So, for every oscillation, the elements at the receiver will likewise undergo relatively large changes in configuration. In this case, however, the "configuration change" is much better understood as the translation of the encompassing "spatial frame" of the entire receiver rather than the mere rearrangement of its individual elements. Nevertheless, we'll say that the receiver can somehow magically detect changes in its absolute location within the universe. In this case, it is obvious that the two characteristics outlined above will hold. That is, by virtue of being able to detect whether it has changed its absolute location in the universe, we can see that the receiver has detected a signal of very low frequency at a very large distance.

In terms of a way to understand the "map" of the Cosmic Microwave Background, we can thus say the following. In the first place, we must assume a sufficiently large universe filled with a sufficiently large number of signal sources that cover a sufficiently wide spectrum of frequencies. At any given location in the universe, there will be so many "lines of sight" ending in signal sources of relatively low frequency that the density of these sources will appear practically homogeneous.

The end result is that the CMB map is nothing other than the constantly bright sky of Olbers' paradox.

https://en.wikipedia.org/wiki/Olbers'_paradox said:
The paradox is that a static, infinitely old universe with an infinite number of stars distributed in an infinitely large space would be bright rather than dark.

The only difference is that the model I've developed understands signal propagation in terms of a thermodynamic process that depends upon a fundamental relationship between the wavelength detected at the receiver and the effective penetrative power of the elements at the signal source, as determined by the maximum amplitudes of the waveforms of the source elements. Also, it is not necessary for there to be any "moment of creation" in my model. I simply make the assumption that it exists in an eternal state of thermodynamic equilibrium. I take the CMB to simply represent the natural limit of our immediately given field of vision within an infinitely old universe of a practically infinite -- although theoretically finite -- size.
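For reference, the classical shell-counting argument behind Olbers' paradox can be sketched directly; the stellar density n, luminosity L, and shell thickness dr below are arbitrary illustrative constants:

```python
import math

# Olbers' shell argument: a shell at radius r contains about n * 4*pi*r**2 * dr stars,
# each contributing flux L / (4*pi*r**2), so every shell adds the same flux n * L * dr.
# n, L, and dr are arbitrary illustrative constants.

n, L, dr = 1.0, 1.0, 1.0  # stellar number density, luminosity per star, shell thickness

def shell_flux(r):
    stars_in_shell = n * 4 * math.pi * r**2 * dr
    flux_per_star = L / (4 * math.pi * r**2)
    return stars_in_shell * flux_per_star  # equals n * L * dr, independent of r

# Each shell contributes equally, so the total grows without bound as shells are added.
assert abs(shell_flux(1.0) - shell_flux(1e6)) < 1e-9
assert abs(sum(shell_flux(r) for r in range(1, 1001)) - 1000.0) < 1e-6
```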
 
  • #43
glengarry, I think you missed the most ‘troublesome’ property of CMB – redshift – which is not compatible with your “immediately given state”.

[Image: http://background.uchicago.edu/~whu/beginners/expansion_tran.gif]
 
  • #44
DrChinese said:
Simply stated: do free photons exist?

If every photon emission requires absorption, then we will have trouble. The CMB delivers roughly 10 trillion photons per second per square centimeter, and Ω = 1 (a flat universe) gives about one hydrogen atom per cubic meter, so there is not enough baryonic matter in space to absorb every photon emitted.
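A back-of-envelope check of these figures (the ~411 photons per cubic centimeter CMB number density is the standard textbook value; the one-hydrogen-atom-per-cubic-meter figure is the one used in the post):

```python
# Order-of-magnitude check: CMB photon flux and the photon-to-atom ratio.
# n_photon ~ 411 / cm^3 is the standard CMB photon number density;
# one hydrogen atom per m^3 is the figure quoted in the post.

n_photon_per_cm3 = 411.0                       # CMB photon number density
c_cm_per_s = 3.0e10                            # speed of light in cm/s
flux = n_photon_per_cm3 * c_cm_per_s / 4.0     # isotropic flux n*c/4, photons/s/cm^2

n_photon_per_m3 = n_photon_per_cm3 * 1e6
n_hydrogen_per_m3 = 1.0

ratio = n_photon_per_m3 / n_hydrogen_per_m3

# Trillions of photons per second per square centimeter...
assert 1e12 < flux < 1e13
# ...and hundreds of millions of CMB photons per hydrogen atom.
assert 1e8 < ratio < 1e10
```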

Some have stated that photons are abstract elements which convey the EM force. In effect, this holds that ALL photons are virtual in a sense.

In a sense, maybe, but CMB polarization and redshift (from wavelengths of millionths of a meter stretched into microwaves at centimeter wavelengths) are a bit hard to classify as only ‘virtual’.

If I shine a beam of light to a dark area of the sky, it can NEVER reach anything to be absorbed.** Therefore, the light cannot be emitted in that direction! I can think of a couple of setups which could exploit this peculiar attribute.

**The reason being that we are in an expanding universe; most of the light emitted by the sun can never reach most regions of the universe. So presumably, if there are free photons, they are sailing out into the vast empty and expanding regions between galactic clusters where they will continue their lonely existence forever.

Q: How do we know “we are in an expanding universe”?
A: We look at distant objects and calculate the redshift.
Q: The same type of redshift as in CMB?
A: Yup.
Q: Wait... wouldn’t that indicate that if we accept the reality of an expanding universe, we must also accept the reality of CMB redshift??
A: Exactly.
Q: Would it be wise to refute empirical data in favor of the "dippy process" of renormalization in QED?
A: Nope.


(And my personal guess is that the sun will explode from overheating, with all that trapped energy unable to find a suitable absorber out there... :smile:)
 
  • #45
glengarry said:
But don't worry if this doesn't seem like how nature works...

I stopped worrying back about a half dozen things you said that don't seem like how nature works. :)
 
  • #46
glengarry said:
The end result is that the CMB map is nothing other than the constantly bright sky of Olbers' paradox. The only difference is that the model I've developed understands signal propagation in terms of a thermodynamic process that depends upon a fundamental relationship between the wavelength detected at the receiver and the effective penetrative power of the elements at the signal source, as determined by the maximum amplitudes of the waveforms of the source elements. Also, it is not necessary for there to be any "moment of creation" in my model. I simply make the assumption that it exists in an eternal state of thermodynamic equilibrium. I take the CMB to simply represent the natural limit of our immediately given field of vision within an infinitely old universe of a practically infinite -- although theoretically finite -- size.
If the model "you've developed" is that of an infinitely old, spatially hyperspherical, static universe, in a QM version, in which you have tried to accommodate the CMB, then I have to tell you: sorry, but it was already developed by Einstein in 1917, and it doesn't work. It is called the Einstein universe, and it was seen a few years later to have big flaws; for example, it is unstable, and as DevilsAvocado said, there is a little problem with redshift.
 
Last edited:
  • #47
DrChinese said:
And this touches upon a point I have asked about before (in other threads), and never received a satisfactory response one way or the other. Simply stated: do free photons exist?

The more I think about that one the more I realize how deep it is.

Sorry no answer to give - except my head hurts and I now have another one for the too hard basket.

Good one Dr Chinese - no very good one.

Thanks
Bill
 
  • #49
bhobba said:
The more I think about that one the more I realize how deep it is.

Sorry no answer to give - except my head hurts and I now have another one for the too hard basket.

Good one Dr Chinese - no very good one.

Thanks
Bill

It is deep, but if we want to take that path, aren't free electrons (free particles in general) also a practical abstraction, in the sense that, like photons, we only know about them through their interactions, and then ultimately just as a form of field (as photons are a form of EM field)?
 
  • #50
TrickyDicky said:
It is deep, but if we want to take that path, aren't free electrons (free particles in general) also a practical abstraction, in the sense that, like photons, we only know about them through their interactions, and then ultimately just as a form of field (as photons are a form of EM field)?


Maybe you can monitor them continuously:

Phys. Rev. A 87, 052141 (2013)
http://pra.aps.org/abstract/PRA/v87/i5/e052141


 
  • #51
bhobba said:
The more I think about that one the more I realize how deep it is.

Sorry no answer to give - except my head hurts and I now have another one for the too hard basket.

Good one Dr Chinese - no very good one.

Thanks
Bill

Thanks. It is an ongoing puzzler to me. I have seen a lot of folks who are well versed in QFT repeat the comment that photons (and other intermediate particles) are mathematical constructs (in the sense that one never truly encounters a free photon in the lab). Often, I see commentary around "excitations" and the like. Clearly, the CMB photons we detect have both a beginning and an end point consistent with that. Is that a firm requirement? Is it an explicit assumption of the model?

Because light emitted today into the depth of space will likely NEVER encounter something to scatter against. Assuming an ever-expanding model, which is looking more likely all the time. So that assumption, I think, might be testable. But again, I am insufficiently versed to really follow some of the nuance. Maybe you can think it through further on the theoretical side. I believe I can explain how a test could be constructed around the idea.
 
  • #52
I was a little short before, but I'd like to give some further feedback to the OP on the signalling idea presented earlier:

glengarry said:
So, when we flip the switch at the source to the "on" position, we can see that there will be a moving "surface of possible signal reception" whose precise location is determined by the following factors:

1) the amount of energy that has been input into the source
2) the level of sensitivity of the receiver to detect whether the arrangement of its elements has been suitably altered

In this theoretical universe, it is precisely the speed of this surface that determines what, in our universe, is called the speed of light, or c. (my boldings)

So, if I understand this correctly (which I admit I am definitely not certain I do), this idea predicts that the speed of light is variable? That's what I get out of it, since:

A moving "surface of possible signal reception" (like a light cone but spherical) whose precise location is determined by 1) energy 2) the receiver's sensitivity - clearly implies a variable c. But the problem is we don't see this effect in our universe. Higher (or lower) energy does not mean faster light. Nor do we see that the speed depends on the receiver's sensitivity (I suppose sensitivity ≈ energy levels of atoms which finally absorbs the light (photons)?).

So in short, I sense that this idea does not fit well with observations, cosmology, astrophysics, special relativity or general relativity. And may I allow myself to be somewhat philosophical, since it's getting late at my place? :smile:

A person cannot make the Universe agree with a theory. But a person can make a theory agree with the Universe.

But it's sure not easy.
 
Last edited:
  • #53
This is also a refreshing clip IMHO; who can resist hearing Richard Feynman describing the scientific method in one clear and concise minute?

https://www.youtube.com/watch?v=OL6-x0modwY
 
  • #54
DennisN said:
... clearly implies a variable c. But the problem is we don't see this effect in our universe. Higher (or lower) energy does not mean faster light. Nor do we see that the speed depends on the receiver's sensitivity (I suppose sensitivity ≈ energy levels of atoms which finally absorbs the light (photons)?).

So in short, I sense that this idea does not fit well with observations, cosmology, astrophysics, special relativity or general relativity.

A variable speed of light is something you can make compatible with observation easily. All you need is to combine it with a variable speed of clocks and variable length of rulers. You do not even have to change the definition of the meter to obtain the constant c as measured in m/s, that means, with distorted rulers and distorted clock time. In undistorted terms, as defined by a system of preferred coordinates, the speed of light would be variable.
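As a toy numeric illustration of this point (my own construction, with one particular choice of distortions, not Ilja's actual model): let the coordinate speed of light in the preferred coordinates be C0/n in some region, and let local metre sticks be contracted by the same factor 1/n while local clocks are left undistorted. The speed read off the local instruments then comes out constant:

```python
C0 = 299_792_458.0  # m/s, the constant value everyone measures

def measured_speed(n):
    """Speed of light as measured with local instruments in a region
    where the coordinate ('true') speed is C0/n, local metre sticks
    are contracted by 1/n, and local clocks are undistorted (one
    consistent choice; slowing the clocks instead works just as well)."""
    true_speed = C0 / n        # coordinate speed, preferred coordinates
    ruler = 1.0 / n            # length of a local metre stick, coord. units
    clock = 1.0                # duration of a local second, coord. units
    # metres-per-second as read off the distorted instruments:
    return (true_speed / ruler) * clock

# the distortions cancel exactly, whatever n is:
for n in (1.0, 1.5, 2.0, 10.0):
    assert abs(measured_speed(n) - C0) < 1e-6
```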
 
  • #55
DennisN said:
This is also a refreshing clip IMHO; Who can resist hearing Richard Feynman describing the scientific method in one clear and concise minute?

Yea - if it disagrees with experiment it's wrong.

That's it, that's all, end of story - or maybe not:


Feynman was actually involved with some of those 'guesses' that were preferred due to their beauty.

The point though is that if experimental data kept coming in against it, out the window it would go - it's just that sometimes something is so beautiful some people will give it a bit of leeway - for a while anyway.

Thanks
Bill
 
Last edited by a moderator:
  • #56
DennisN said:
This is also a refreshing clip IMHO; Who can resist hearing Richard Feynman describing the scientific method in one clear and concise minute?

Great clip, thanks!
 
  • #57
bhobba said:
Yea - if it disagrees with experiment it's wrong.

That's it, that's all, end of story - or maybe not:


Feynman was actually involved with some of those 'guesses' that were preferred due to their beauty.

The point though is that if experimental data kept coming in against it, out the window it would go - it's just that sometimes something is so beautiful some people will give it a bit of leeway - for a while anyway.

Thanks
Bill


Good points, and I don't disagree. The Gell-Mann clip is a good one too. I remember having seen it before.
 
Last edited by a moderator:
  • #58
Ilja said:
A variable speed of light is something you can make compatible with observation easily. All you need is to combine it with a variable speed of clocks and variable length of rulers. You do not even have to change the definition of the meter to obtain the constant c as measured in m/s, that means, with distorted rulers and distorted clock time. In undistorted terms, as defined by a system of preferred coordinates, the speed of light would be variable.

Well, I was trying to emphasize the scientific method and get the OP's idea at least a bit closer to experiments in our observable world. Neither the speed of stationary clocks ("ticking", I presume) nor the length of stationary rulers has hitherto been observed to change significantly in the stationary frame of reference in a lab, if I'm not mistaken? Otherwise it would be devastating to relativity, I think.

Anyway, that wasn't the whole of it; the OP was heading in the direction of a connection between the speed of light and energy (or energies). To show that such an idea would have some merit, experiments would have to support that the speed of light is dependent on the energy (of senders) and the sensitivity* (of receivers; but, of course, "sensitivity" would have to be clearly defined).

To get anywhere with it, there would obviously have to be some proposed mathematical equation(s) that include energy/energies, distance and time. Without any equations, the idea cannot be evaluated in more detail (at least not by me :biggrin:). So I won't try to interpret further what the OP meant.

* The OPERA neutrino anomaly was actually dependent on the sensitivity of the receiver - it turned out to be due to a faulty timing system (a loose fibre-optic connection at the Gran Sasso end). :wink:
 
  • #59
DevilsAvocado said:
glengarry, I think you missed the most ‘troublesome’ property of CMB – redshift – which is not compatible with your “immediately given state”.

http://background.uchicago.edu/~whu/beginners/expansion_tran.gif

The last time I attempted to respond to this post, the result was about 6 hours wasted effort (the response was sent down the rabbit hole), because I tried to offer some "speculation" about what might cause redshift. Needless to say, I got in trouble for speculating. So I got mad and tried to forget about everything...

But then I saw the light. Today, I finally realized something very basic:

E = hν

The definition of a photon contains no spatial component. That is, it is a pure mathematical point. What I realized is that we are arguing in the QM forum here, and the rules of classical mechanics simply do not apply. It is only by way of continuous functions that we can speak about changes in wavelengths between signal sources and receivers. But the entire foundation of QM came from the theoretical failure of applying continuous functions to the interchange of energy between remote bits of matter. The result of this "failure" was that the energy of EM radiation has to be transferred in its entirety, without delay. This just means that photons are simply disembodied bundles of frequency. There is no concept of a "wavy substance" whose spatially measurable wavelengths can possibly change size, depending on relative speeds between source and receiver.

So, from the perspective of standard QM, the concept of redshift as a result of spatial expansion simply doesn't apply.

But this whole notion of "the perspective of standard QM" is the precise reason why I wanted to develop this thread. That is, standard QM, as I see it, is not a direct theory of physical reality, but only indirectly so. It is rather a theory of the measurements that are conducted on physical reality.

So the point of this thread is simply that de Broglie's matter waves were an attempt to construct a direct theory of physical reality, which was later taken up by Bohm. But seeing as this is a much more difficult thing to do than simply talking about the statistical outcomes of infinite experimental trials, there are not bound to be many theoretical physicists focusing on de Broglie/Bohm-like models.

Granted, there are issues with the simple model that I've developed when it comes to trying to explain every single phenomenon in existence. But when it comes to developing a theoretical unification of the major issues of GR (the nature of gravity fields) and QM (the nature of light and matter), I think this little model does quite well.

Again, I need to reiterate these points:

1) The axioms of standard QM do not allow for a classical mechanism of energy exchange that allows for shifts in frequency. Photons are defined as being perfectly discrete units that are delivered discontinuously (i.e., in moments of zero duration).

2) Standard QM is only an indirect theory of physical reality.

3) de Broglie/Bohm-like models are attempts to develop direct theories of physical reality. The model outlined in this thread is such an attempt. Granted, people might not like it... but then again, I dare anyone to find such a model that they do like.​
 
Last edited by a moderator:
  • #60
glengarry said:
The axioms of standard QM do not allow for a classical mechanism of energy exchange that allows for shifts in frequency. Photons are defined as being perfectly discrete units that are delivered discontinuously (i.e., in moments of zero duration).

They come about naturally in QFT. You start out with nothing but fields and particles emerge without being put in from the start.

glengarry said:
Standard QM is only an indirect theory of physical reality.

I think arguments like that would hold more water if you could actually get people to agree what 'reality' is. Philosophers have been arguing that one since time immemorial with zero agreement.

I prefer the definition of physical reality as what physical theories tell us. For example, there is disagreement about how to interpret QM but very little disagreement about the theory itself - it's the most accurately verified theory ever devised.

glengarry said:
de Broglie/Bohm-like models are attempts to develop direct theories of physical reality. The model outlined in this thread is such an attempt. Granted, people might not like it... but then again, I dare anyone to find such a model that they do like.

de Broglie/Bohm-like models are attempts at an interpretation that gels with some people's intuition about how the world should be. It's more a comment on their underlying world view than anything objective - there is no sense in which the world MUST be like that, because quite obviously it needn't be.

My favorite basis for QM these days is the following - and is a model I do like:
http://arxiv.org/pdf/0911.0695v1.pdf

But it leaves others cold. That doesn't make it right or wrong. In science correspondence with experiment is the only arbiter - not what people like or dislike.

This is why philosophers hardly ever agree - but scientists agree on many things - just not all.

Thanks
Bill
 
Last edited:
  • #61
glengarry said:
Standard QM is only an indirect theory of physical reality.
bhobba said:
I think arguments like that would hold more water if you could actually get people to agree what 'reality' is. Philosophers have been arguing that one since time immemorial with zero agreement.
Only God knows what the underlying reality of the universe consists of, but I will say that theories that attempt to directly model reality make use of differential equations that operate over continuous manifolds/fields. Things like "action at a distance", "quantum jumps", and other types of discontinuity are not allowed in these kinds of models. The goal is to try to develop deeply intelligible notions of causation rather than relying upon the "crutch of abstraction" inherent in theories that are based on Born's statistical interpretation of the wavefunction.
 
  • #62
glengarry said:
Only God knows what the underlying reality of the universe consists of, but I will say that theories that attempt to directly model reality make use of differential equations that operate over continuous manifolds/fields. Things like "action at a distance", "quantum jumps", and other types of discontinuity are not allowed in these kinds of models. The goal is to try to develop deeply intelligible notions of causation rather than relying upon the "crutch of abstraction" inherent in theories that are based on Born's statistical interpretation of the wavefunction.

Yea - but how do you know that reality isn't really like that? If the model agrees with observation it's as good as any other. Think about it - there is no a-priori reason it must conform to your intuition about these things.

Thanks
Bill
 
  • #63
glengarry said:
But then I saw the light. Today, I finally realized something very basic:

E=hv

The definition of a photon contains no spatial component. That is, it is a pure mathematical point.

That's no definition, it's an energy formula - one which doesn't include electric charge (0), mass (0) or spin (1).

glengarry said:
The definition of a photon contains no spatial component. That is, it is a pure mathematical point.

Really? How come the photon carries momentum? The relativistic energy is
[tex]E^{2} = p^{2}c^{2} + m^{2}c^{4}[/tex]
The mass of a photon is m=0, so this reduces to
[tex]p = \frac {E}{c} = \frac {h \nu}{\lambda \nu} = \frac {h}{\lambda}[/tex]
where p is the momentum of a photon. Furthermore, if the photon has no spatial component, how come there's a thing called polarization?
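The two relations above are easy to check numerically; a minimal sketch using the exact SI values of h and c (the 500 nm wavelength is just an illustrative choice):

```python
h = 6.62607015e-34   # Planck constant, J*s (exact by SI definition)
c = 299_792_458.0    # speed of light, m/s (exact by SI definition)

def photon_energy(wavelength):
    """E = h*nu, with nu = c/wavelength."""
    return h * c / wavelength

def photon_momentum(wavelength):
    """p = h/wavelength, i.e. E/c for a massless particle."""
    return h / wavelength

lam = 500e-9                     # green light, 500 nm
E = photon_energy(lam)           # ~3.97e-19 J
p = photon_momentum(lam)         # ~1.33e-27 kg*m/s
assert abs(E - p * c) < 1e-30    # E^2 = p^2 c^2 + m^2 c^4 with m = 0
```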

glengarry said:
What I realized is that we are arguing in the QM forum here, and the rules of classical mechanics simply do not apply.

E.g. conservation of energy and conservation of momentum still apply, AFAIK.

glengarry said:
Photons are defined as being perfectly discrete units that are delivered discontinuously (i.e., in moments of zero duration) (my bolding).
In moments of zero duration? I do not understand what that means. Individual photons travel at max c. See this: paper, article1, article2. Anyway, you can't have a frequency without a time component. My regards.
 
