Are Photons Actually Infinitely Small Particles?

  • Thread starter sophiecentaur
  • Start date
  • Tags
    Photon
In summary, most people seem to think of photons as little bullets when considering light across the whole em spectrum. When challenged about the 'extent' of a photon, they will say 'It's a wavelength long / wide / big'. This is problematic: assigning a photon any definite spatial extent leads to contradictions, and sits uneasily with the idea that a photon is a single quantum excitation of the electromagnetic field.
  • #1
sophiecentaur
Science Advisor
Gold Member
Most people seem to think of photons as little bullets, and this is often OK when you are considering light (amongst the whole em spectrum). When challenged about the 'extent' of a photon, they will say "It's a wavelength long / wide / big" or some such arm-waving statement. Because the wavelength of visible em is so short, it is possible to gloss over the details and, indeed, to visualise little bullets ('corpuscles', as they used to be called).

Of course, there is an instant objection to the notion of a photon being just one wavelength long, on the grounds that a single cycle of a sinusoid (which is what you'd have if just one such photon were to be traveling through space in isolation) would have an infinitely wide spectrum, full of harmonics that would be detectable. We don't see these - ever, for photons of any wavelength.
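The spectral-spread objection above can be checked numerically. Below is a quick sketch (my own illustration, not from the thread): take one isolated cycle of a 200 kHz sinusoid, zero-pad it, and see how little of its energy actually sits near the nominal frequency. The sample rate and the ±10% band are arbitrary illustrative choices.

```python
import numpy as np

# One isolated cycle of a sinusoid, zero-padded so the FFT has fine
# frequency resolution. All numbers are illustrative choices.
f0 = 200e3                      # nominal frequency, Hz
fs = 64 * f0                    # sample rate, well above Nyquist
n_cycle = int(round(fs / f0))   # samples in one cycle (64)

pulse = np.zeros(64 * n_cycle)  # mostly silence...
t = np.arange(n_cycle) / fs
pulse[:n_cycle] = np.sin(2 * np.pi * f0 * t)  # ...plus one cycle

spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(len(pulse), d=1 / fs)

# Fraction of the spectral energy within +/-10% of f0: far from 100%,
# i.e. a one-cycle pulse is anything but monochromatic.
band = (freqs > 0.9 * f0) & (freqs < 1.1 * f0)
frac_in_band = float(np.sum(spectrum[band] ** 2) / np.sum(spectrum ** 2))
print(frac_in_band < 0.5)
```

Most of the energy lies outside the ±10% band, consistent with the point above (strictly, the spectrum falls off rather than terminating, but it is nowhere near a single line).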

BUT, what about when we are dealing with low Radio Frequency em? Consider a photon with an 'extent' of just one wavelength. For a 200kHz transmission, that represents a wavelength of 1500m. Now take a very simple transmitter with, say, the collector of a transistor connected to a short wire. Take an equally simple receiver, with a short wire connected to the base of a transistor. Separate them by 10m. The receiver will receive photons that the transmitter is sending it. These photons, if they were to have the proposed extent, would have to extend from the transmitter to a region 150 times as far away as the receiver input or they would somehow need to extend ('coiled up?' somehow) from within the transmitter to somewhere within the nearby receiver. This just has to be a nonsense model. In fact you just can't allow a photon to have any extent at all or there will be some circumstance like the above that spoils the model.
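The 1500 m figure follows from simple arithmetic; a one-line check (my own, with the standard value of c):

```python
# Wavelength of a 200 kHz transmission: lambda = c / f.
c = 299_792_458.0   # speed of light, m/s
f = 200e3           # frequency, Hz
wavelength = c / f
print(round(wavelength))   # 1499, i.e. the ~1500 m quoted above
```

A wave packet this long dwarfs the 10 m transmitter-receiver spacing by two orders of magnitude, which is the heart of the objection.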

The 'energy burst' model is also a problem if you consider the mechanism that generated any particular photon. All photons of a particular energy are assumed to be identical (there is only one parameter with which to describe them). That would imply that the systems generating these little identical bursts of energy would all need to have identical characteristics. There will be a range of charge systems that can generate em of any given frequency, but they would all need to generate a burst of em with an identical pulse shape. In the classical sense, that would mean that the resonances within any system would all need to have the same Q and to make the transition within exactly the same time interval. This is asking a lot: for all 2.23900000000 eV transitions to be identical, whether they were the result of a single, discrete, atomic gas transition or a transition within a continuous energy band in a solid. The only way round this is for the transition time to be irrelevant and for the 'arrival' or 'departure' of a photon to be in the form of an impulse. Hence the spatial and temporal extent of a photon must be considered to be zero.

These 'little bullets' all have to be infinitely small. But that's the least of our problems with QM, when we try to force it to lie within our conventional ideas.
 
  • #2
The electromagnetic field is a quantum field, and the photon is a single quantum excitation of that field. Just as for all particles in quantum mechanics, a photon does not have a unique shape, rather it is a wave packet with a finite size and frequency spread. Even a 'pure' frequency photon such as Hydrogen alpha has a certain line width and a certain spatial extent, related by the uncertainty principle. The size and shape of a photon depends on the circumstances which created it.

No, the size of a photon is not one wavelength. The size corresponds to the lifetime of the atomic state that emitted it. For example if the lifetime is roughly a nanosecond, then the size of the photon is about 30 cm. How does this avoid a contradiction? You'd think for a photon that large you might be able to detect just part of it. No, because just as for all other quantum particles, the amplitude of the photon is a probability amplitude - it's the probability of detecting the photon at that point. Photons, like all quantum particles, interact at a single point, regardless of how big their wavepacket is.
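The 30 cm figure in the post above is just c multiplied by the lifetime; a small check (my own), with the matching natural linewidth from the time-bandwidth relation Δν ≈ 1/(2πτ):

```python
import math

# Wave-packet length and linewidth for a ~1 ns emitter lifetime,
# the example figure used in the post above.
c = 299_792_458.0                    # speed of light, m/s
tau = 1e-9                           # state lifetime, s
length = c * tau                     # spatial extent of the packet
linewidth = 1 / (2 * math.pi * tau)  # natural linewidth, Hz

print(round(length, 2))              # 0.3 (metres), i.e. about 30 cm
print(round(linewidth / 1e6))        # 159 (MHz)
```

The longer the lifetime, the longer the packet and the narrower the line, which is the uncertainty-principle pairing mentioned above.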
 
  • #3
I have a problem with the fact that nearly everyone restricts the discussion of photons to those involved in optical / atomic transitions. Once a beam of light / radio waves has been launched, how can anyone 'know' where it originated? By that I mean how would a "pure" frequency photon from a Hydrogen atom be distinguished from one which came from an LED? In what way could it be different? What other parameters are used to characterise photons other than their energy?
Would a Hydrogen atom be 'blind' to photons produced from an LED? I don't think so.
 
  • #4
What other parameters are used to characterise photons other than their energy?
Their spread in energy. We are so used to writing down plane waves that we forget that in real life everything is a superposition. There are two equivalent ways of looking at a wave packet: a) it's a single photon with a finite size and a spread in frequency. b) it's a probabilistic superposition of photons, each one a plane wave with completely sharp frequency and infinite extent. Which description you use amounts to a choice of basis. Plane waves are mathematically clean, but being infinite they make it difficult to ask "how big is it", and "where and when was it created"?
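Description (b) can be illustrated numerically. The sketch below (my own, in arbitrary units) sums many plane waves, each with a perfectly sharp wavenumber and infinite extent, using Gaussian weights; the result is a single localized packet.

```python
import numpy as np

# Gaussian-weighted superposition of plane waves exp(i k x).
# k0 and sigma_k are arbitrary illustrative values.
k0, sigma_k = 50.0, 2.0
k = np.linspace(k0 - 5 * sigma_k, k0 + 5 * sigma_k, 801)
weights = np.exp(-((k - k0) ** 2) / (2 * sigma_k ** 2))

x = np.linspace(-5.0, 5.0, 2001)
# Each term in the sum is an infinite plane wave; the weighted total
# is a wave packet of finite width ~ 1 / sigma_k centred at x = 0.
packet = (weights * np.exp(1j * k * x[:, None])).sum(axis=1)
envelope = np.abs(packet)

# Localized: the envelope at the edges is negligible next to the peak.
ratio = float(envelope[0] / envelope.max())
print(ratio < 1e-6)
```

A narrower spread in k (sharper frequency) gives a wider packet, and vice versa, which is the basis-choice point made above.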
 
  • #5
Bill_K said:
Their spread in energy.

That is to do with the statistics of a lot of photons and what you say confirms that there really is confusion about this. The question I ask is the nature of an individual photon. That is, after all, what interacts with a charge system when it is released or absorbed.

There is a Catch-22 situation here. A single photon, if one insists that it actually 'exists' in the em wave and travels anywhere, needs to have infinite extent. I have no problem with this - it makes eminent sense, as it explains diffraction, for a start. So why do people insist on it as having a very limited (little bullet) extent?
It seems to me that, in the rest of your post, you are giving more of a description of the effect of a large number of photons than a description of an individual photon. I can't see how your model 'a' can really hold because it would have to imply that different photons would need to interact differently with a given system. Don't they have to be identical?
Also, if there needs to be a 'probabilistic superposition' (as in b) then, for a very low intensity source, turned on for a very short while, how could a small number of photons turn up in an identifiable interference pattern before the others had even been emitted from the source? How do they 'know'?

This is why I feel happier with a wave treatment, with the photon just being there and identifiable at the actual time of interaction with the systems at each end. It's all a bit abstract because you only know they're there when they do actually interact. Hummm.
 
  • #6
sophiecentaur said:
A single photon, if one insists that it actually 'exists' in the em wave and travels anywhere, needs to have infinite extent. I have no problem with this - it makes eminent sense, as it explains diffraction, for a start. So why do people insist on it as having a very limited (little bullet) extent?
Maybe because of the photoelectric effect - one always gets just pinprick spots on the screen.

In #2 Bill K writes: "The electromagnetic field is a quantum field, and the photon is a single quantum excitation of that field... just as for all other quantum particles, the amplitude of the photon is a probability amplitude - it's the probability of detecting the photon at that point. Photons, like all quantum particles, interact at a single point, regardless of how big their wavepacket is." This leaves me somewhat confused. Is the field itself taken to be a continuous E, B, field, so the photon is 'comprised' of such? Or do we take 'field' to mean an abstract probability amplitude space, the photon itself considered as a point particle? Otherwise, if the interaction is truly always at a point, would this not imply instantaneous collapse of an extended wavepacket 'field' that could be light years in spatial extent?
 
  • #7
sophiecentaur said:
A single photon, if one insists that it actually 'exists' in the em wave and travels anywhere, needs to have infinite extent.

Ehm, this is not true unless we are purely discussing theory here. Sometimes a monochromatic mode of the em-field is termed photon and that would indeed have infinite extent as it also has a perfectly defined energy. However, this is a pathological case that never occurs in reality.

Real single photons are defined by a state with fixed photon number of one. This state does not necessarily have to (and in fact will never) be monochromatic, but will be spectrally broadened at least due to the finite duration of the emission process. This is already the case at the single photon level.
 
  • #8
Cthugha said:
Real single photons are defined by a state with fixed photon number of one.
The point is well taken, that's what the photon is in our theory for talking about it, but note the deeper problem here if we are to interpret the "photon" concept ontologically-- we are defining the thing by its state! That's why I would prefer to say there is no such thing as a "real single photon", it is entirely a conceptual language, an element of a theory, which borrows its ontology from mathematics (like all physical notions that sound ontological) but which never transcends or exits that mathematical source. There's no such thing as a real photon, and this is also made clear by the basic indistinguishability of photons. We can distinguish their states, but not the things that have these states, so the "things" themselves are largely indeterminate. We must not confuse our language about things for the things themselves, or we get all worried about "what is a photon really." A photon is really whatever we choose to say it is, we made it up and it does not even have to be viewed as a thing with a unique definition, it can be a set of definitions useful in different contexts but related in some important way. This is all perfectly normal in physics, we do it all the time even though we often don't realize it.

The relevance to the OP question is that once we recognize that the ontology of the photon is entirely contextual, then we can see that "how big it is" is also a contextual issue. For the purposes of diffraction, the deBroglie wavelength is what matters, for interference, it is the coherence length (related to the size of the wave function, as some pointed to above). These are both also true of electrons, but electrons have (at least) two other relevant "sizes"-- the absence of internal structure gets the electron defined to be a point object, and light interacts with free electrons as if they had a "Thomson cross section" related to the "classical radius of the electron." The same object can come in many shapes and sizes, so we are best recognizing that it has no unique ontology at all.
 
  • #9
Ken G said:
The point is well taken, that's what the photon is in our theory for talking about it, but note the deeper problem here if we are to interpret the "photon" concept ontologically-- we are defining the thing by its state! That's why I would prefer to say there is no such thing as a "real single photon", it is entirely a conceptual language, an element of a theory, which borrows its ontology from mathematics (like all physical notions that sound ontological) but which never transcends or exits that mathematical source. There's no such thing as a real photon, and this is also made clear by the basic indistinguishability of photons. We can distinguish their states, but not the things that have these states, so the "things" themselves are largely indeterminate. We must not confuse our language about things for the things themselves, or we get all worried about "what is a photon really." A photon is really whatever we choose to say it is, we made it up and it does not even have to be viewed as a thing with a unique definition, it can be a set of definitions useful in different contexts but related in some important way. This is all perfectly normal in physics, we do it all the time even though we often don't realize it.

The relevance to the OP question is that once we recognize that the ontology of the photon is entirely contextual, then we can see that "how big it is" is also a contextual issue. For the purposes of diffraction, the deBroglie wavelength is what matters, for interference, it is the coherence length (related to the size of the wave function, as some pointed to above). These are both also true of electrons, but electrons have (at least) two other relevant "sizes"-- the absence of internal structure gets the electron defined to be a point object, and light interacts with free electrons as if they had a "Thomson cross section" related to the "classical radius of the electron." The same object can come in many shapes and sizes, so we are best recognizing that it has no unique ontology at all.

I can go along with most of that.
So why is it that the photon is treated by all and sundry as something with the same sort of 'reality' as a cannon ball? It seems to me that it only serves to confuse. Isn't it time to make it more plain to the World that photons are not like that at all?
How many times do we read that the Photoelectric Effect 'proves' that photons are particles?
Q-reeus made the comment a few posts ago. All the photoelectric effect shows is that E = hf and that energy interactions with em waves are quantised. Can't we, as the relatively well-informed, do the World a favour and start putting things a bit more accurately?
 
  • #10
Ken G said:
The point is well taken, that's what the photon is in our theory for talking about it, but note the deeper problem here if we are to interpret the "photon" concept ontologically-- we are defining the thing by its state! That's why I would prefer to say there is no such thing as a "real single photon", it is entirely a conceptual language, an element of a theory, which borrows its ontology from mathematics (like all physical notions that sound ontological) but which never transcends or exits that mathematical source.

The terminology of real photon was used to distinguish between the concept of single modes of the em field which is in no way assignable to experimental studies and the concept of photon number states which can be assigned to experimental studies. There is no ontological implication in that wording, just a distinction between two different concepts which happen to share the same name for historical reasons. You can go ahead and call one experimental photon instead or whatever if you find the terminology appropriate.

Ken G said:
The relevance to the OP question is that once we recognize that the ontology of the photon is entirely contextual, then we can see that "how big it is" is also a contextual issue. For the purposes of diffraction, the deBroglie wavelength is what matters, for interference, it is the coherence length (related to the size of the wave function, as some pointed to above).

"How big is a photon" is simply an ill-defined question as the concept of size itself cannot be extrapolated to photons without clarifying its meaning.

sophiecentaur said:
SO why is it that the Photon is treated by all and sundry as something with the same sort of 'reality' as a cannon ball? It seems to me that it only serves to confuse. Isn't it time to make it more plain to the World that photons are not like that at all?

This is rather a problem of the term particle. The usage in QM and the everyday usage of that word are very different. Any qm particle is different from a cannon ball.

sophiecentaur said:
How many times do we read that the Photoelectric Effect 'proves' that photons are particles?
Q reeus made the comment a few posts ago. All the photoelectric effect shows is that E =hf and that energy interactions with em waves are Quantised. Can't we, as the relatively well-informed, do the World a favour and start putting things a bit more accurately?

Two comments:

1) Having quantized interactions IS a large portion of the meaning of having a particle in qm. The coincidence that the term particle is used for something rather different in classical physics is a rather unfortunate historical development. However, I do not think it is possible to have people use other terms instead.

2) It is in fact a common misconception that the photoelectric effect demonstrates quantization. It does not, and could be explained without quantized interaction. One needs to demonstrate antibunching instead.
 
  • #11
Cthugha said:
"How big is a photon" is simply an ill-defined question as the concept of size itself cannot be extrapolated to photons without clarifying its meaning.

Haha. It may be an ill-defined question but it keeps getting asked. I have a feeling that we are a bit stuck with that one - rather in the same way that electrical current is described as electrons moving down a wire at high speed.

Common (mis?)use of words seems to be a problem all through Science.
 
  • #12
Cthugha said:
2) ...It is in fact a common misconception that the photoelectric effect demonstrates quantization. It does not, and could be explained without quantized interaction. One needs to demonstrate antibunching instead.
you mean 'without quantized field' surely - there has to be some quantization going on - e.g. electron energy levels in detector surface. A.Neumaier argued just that some time back in a long thread. Problem I had with that is when it gets down to infrequent single photon emission hitting a distant small screen. If the screen area is very small relative to wavefront area (assuming spherically expanding wave past slit in 2-slit experiment) [let's make that a single slit experiment - no complications with interference fringes], probability of ejecting a single photoelectron surely is infinitesimal, as only a tiny fraction of a single photon's energy impinges on the screen. With the point particle viewpoint, ejection probability is simply always proportional to screen area.
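For what it's worth, the point-particle version of 'ejection probability proportional to screen area' is easy to put numbers on. This is my own illustration with assumed values (10 m distance, 1 cm² screen, isotropic source), not figures from the thread:

```python
import math

# Point-particle picture: hit probability is the screen's share of the
# full sphere's solid angle (small-screen approximation).
r = 10.0             # source-to-screen distance, m (assumed)
screen_area = 1e-4   # 1 cm x 1 cm screen, m^2 (assumed)
p_hit = screen_area / (4 * math.pi * r ** 2)
print(p_hit < 1e-7)  # roughly 8e-8 per emitted photon: tiny but finite
```

Tiny, but well defined - each detection is then an all-or-nothing event rather than a fractional energy deposit.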
 
  • #13
sophiecentaur said:
Haha. It may be an ill-defined question but it keeps getting asked. I have a feeling that we are a bit stuck with that one - rather in the same way that electrical current is described as electrons moving down a wire at high speed.

Common (mis?)use of words seems to be a problem all through Science.

Whereas you can't assign a "size" to a photon, it IS possible to sometimes find a connection with classical EM in that the wavelength matters even for a single photon number state.
A good example is cavity QED, where the cavities are entirely "classical" in that they are designed using conventional EM even though they are used in the QM regime.
Specifically, if you have a lambda/2 microwave cavity/resonator with a single photon in it and you want that photon to interact with another system (say an atom with a suitable transition, or a qubit of some sort), the coupling strength has its maximum value in the centre of the cavity where - classically - you would expect the E field to have its maximum value.
Hence, in cases where photons are confined in a certain space (which tends to be a pretty common situation experimentally) they have a "size" related to their (classical) wavelength.

I should perhaps point out that it IS possible to quantize everything (the cavity, transmission lines etc) and use QM even in the design, but it is usually pretty pointless since the results agree with classical EM.
 
  • #14
f95toli said:
Whereas you can't assign a "size" to a photon, it IS possible to sometimes find a connection with classical EM in that the wavelength matters even for a single photon number state.
A good example is cavity QED, where the cavities are entirely "classical" in that they are designed using conventional EM even though they are used in the QM regime.
Specifically, if you have a lambda/2 microwave cavity/resonator with a single photon in it and you want that photon to interact with another system (say an atom with a suitable transition, or a qubit of some sort), the coupling strength has its maximum value in the centre of the cavity where - classically - you would expect the E field to have its maximum value.
Hence, in cases where photons are confined in a certain space (which tends to be a pretty common situation experimentally) they have a "size" related to their (classical) wavelength.

I should perhaps point out that it IS possible to quantize everything (the cavity, transmission lines etc) and use QM even in the design, but it is usually pretty pointless since the results agree with classical EM.

That's an interesting way of looking at it and it's like the 'electron in a box' model. But I think it's just semantics to say the photon is 'that size' rather than its probability function - which is how the electron in a box model is treated.

I have a feeling that I am falling into the trap of trying to keep a foot in both camps though.
 
  • #15
Q-reeus said:
you mean 'without quantized field' surely - there has to be some quantization going on - e.g. electron energy levels in detector surface. A.Neumaier argued just that some time back in a long thread. Problem I had with that is when it gets down to infrequent single photon emission hitting a distant small screen. If the screen area is very small relative to wavefront area (assuming spherically expanding wave past slit in 2-slit experiment) [let's make that a single slit experiment - no complications with interference fringes], probability of ejecting a single photoelectron surely is infinitesimal, as only a tiny fraction of a single photon's energy impinges on the screen. With the point particle viewpoint, ejection probability is simply always proportional to screen area.

But tiny probabilities are indeed appropriate here.

Note that many real life situations that actually happen have in fact extremely tiny probabilities.

For example, the probability that a given book unknown to you contains exactly the characters it contains in exactly that arrangement is incredibly small, far smaller than the ratio of surface area of the smallest and the largest things currently known to exist.
 
  • #16
sophiecentaur said:
...BUT, what about when we are dealing with low Radio Frequency em? Consider a photon with an 'extent' of just one wavelength. For a 200kHz transmission, that represents a wavelength of 1500m. Now take a very simple transmitter with, say, the collector of a transistor connected to a short wire. Take an equally simple receiver, with a short wire connected to the base of a transistor. Separate them by 10m. The receiver will receive photons that the transmitter is sending it. These photons, if they were to have the proposed extent, would have to extend from the transmitter to a region 150 times as far away as the receiver input or they would somehow need to extend ('coiled up?' somehow) from within the transmitter to somewhere within the nearby receiver. This just has to be a nonsense model. In fact you just can't allow a photon to have any extent at all or there will be some circumstance like the above that spoils the model...

You are going from the collector of a transistor connected directly to the base of another transistor. Is this really a "transmission" of a photon?
 
  • #17
edguy99 said:
You are going from the collector of a transistor connected directly to the base of another transistor. Is this really a "transmission" of a photon?

Well, the transmission is using em and can you avoid using photons? What else would you suggest?
 
  • #18
sophiecentaur said:
How many times do we read that the Photoelectric Effect 'proves' that photons are particles?
Q-reeus made the comment a few posts ago. All the photoelectric effect shows is that E = hf and that energy interactions with em waves are quantised. Can't we, as the relatively well-informed, do the World a favour and start putting things a bit more accurately?
In my view, the whole issue is resolved by noticing what the phrase "photons are particles" actually should mean to a scientist. It should mean nothing other than "in many contexts, we find it useful to imagine that a photon is a particle. This means, we borrow the ontological element 'particle' from a geometric/mathematical idealization that gives meaning to the concept, and apply the ontological concept to help us picture and understand observational outcomes via some particular theory that can manipulate the particle concept, often involving wave mechanics along the way." Now, that is rather long-winded, so we end up saving time and saying "a photon is a particle", but we really should mean the long-winded version. The only real problem appears when we forget that we should mean that, which does in fact happen all the time!
 
  • #19
sophiecentaur said:
Well, the transmission is using em and can you avoid using photons? What else would you suggest?

A density of electrons that varies on a regular basis over time (called the frequency). By default, this includes a density of electrons that varies from more dense to less dense, on a regular basis over space (called the wavelength).
 
  • #20
Cthugha said:
There is no ontological implication in that wording, just a distinction between two different concepts which happen to share the same name for historical reasons. You can go ahead and call one experimental photon instead or whatever if you find the terminology appropriate.
Yes, that resolves the problem. My pointing to the ontological issues was to show where the question "how big is a photon" comes from, which is an overdependence on the reliance of ontological thinking (in the sense that anything that is physically real should have a unique physical size associated with it).
"How big is a photon" is simply an ill-defined question as the concept of size itself cannot be extrapolated to photons without clarifying its meaning.
Yes, that is certainly the crux of the issue. So why do people imagine that "how big is a photon" is a well defined question? Ontological thinking. But if the "size" of a photon is contextual, then perhaps the whole notion of a photon is also contextual, and if that's true of our dearest friend the photon, perhaps it is true of all forms of scientific ontology.
This is rather a problem of the term particle. The usage in QM and the everyday usage of that word are very different. Any qm particle is different from a cannon ball.
Quite so, yet the re-use of the term is not coincidental. The "particle" ontology has certain attributes, yet the cannon ball and the photon borrow different ones. The confusion is resolved when we recognize that all these attributes exist only in the conceptual structure, not the real versions that are borrowing elements from that structure.
1) Having quantized interactions IS a large portion of the meaning of having a particle in qm. The coincidence that the term particle is used for something rather different in classical physics is a rather unfortunate historical development. However, I do not think it is possible to have people use other terms instead.
I often use "quantum" instead of "particle" for just that reason. Still, we must recognize that any word that gets used must not be confused with some kind of true ontology, they are all just going to label a set of borrowed ontologies.
2) It is in fact a common misconception that the photoelectric effect demonstrates quantization. It does not, and could be explained without quantized interaction. One needs to demonstrate antibunching instead.
That sounds like it's worth a thread of its own.
 
  • #21


My problem with this thread is the misconception of the photon as a single entity. Quantum mechanics is a theory which ONLY addresses the description of an ensemble, not individual systems. The rules of QM are statistical, and in order to use them to describe a system one must have a system comprised of a statistically relevant set. Whether the system described is an ensemble comprised of several similarly prepared experiments yielding individual observations, or an ensemble comprised of one experimental preparation yielding several observations, either way the ensemble must total a statistically relevant set.

Ergo, consideration of an individual photon does not make sense from a QM perspective.

Another obvious flaw in the question is the assumption that "a photon" can be described. Even if the system has a field strength only capable of producing an individual photon, it still does not make sense to address what the ontological existence of a photon (or several photons) looks like, because such observations are never made in physics. Physicists must stick to describing the observations made in physics experiments at the time of measurement, not the intermediate apparent events that take place during the experiments at times when real observations/measurements are specifically not being made.

Although it's always good food for thought.
 
  • #22
A. Neumaier said:
Originally Posted by Q-reeus:
"you mean 'without quantized field' surely - there has to be some quantization going on - e.g. electron energy levels in detector surface. A.Neumaier argued just that some time back in a long thread. Problem I had with that is when it gets down to infrequent single photon emission hitting a distant small screen. If the screen area is very small relative to wavefront area (assuming spherically expanding wave past slit in 2-slit experiment) [let's make that a single slit experiment - no complications with interference fringes], probability of ejecting a single photoelectron surely is infinitesimal, as only a tiny fraction of a single photon's energy impinges on the screen. With the point particle viewpoint, ejection probability is simply always proportional to screen area."

But tiny probabilities are indeed appropriate here.
Perhaps I am being naive, but there seems to be a big problem with the spreading wave model of field quanta (I know you dislike 'photon'). You said back then, as I recall it, that the detector screen simply steadily accumulates whatever partial energy per field quantum it can, for however long it takes, until there is sufficient for a 'trigger event' - i.e. photoemission. But this implies something special about that energy storage - why would it not simply leak away as heat into the environment, especially for very low level light emission through the slit? For that matter, how could photoemission not be very sensitive to environmental temperature - do the detector screen's electrons really care about where the overall energy shared between them originates? Can't think of a reference offhand, but I'm fairly sure photoemission works just as well down at near zero temperatures as at elevated ones. And that a powerful heat-sink would also not affect emission probability. Which is in keeping with the photon-as-point-particle photoemission model.

But how to reconcile this with the spreading wave model? There is something special about energy absorbed (and amazingly not lost through conduction/radiation/convection) from infrequent and partially absorbed light field quanta that environmental heat could not equally supply? Surely, whatever the source, energy is very rapidly partitioned amongst conduction electrons - there is rapidly no memory of the source. Yet the photoemission rate will always be proportional to just the light incidence rate, and almost totally independent of environmental temperature, even though the latter could easily supply many orders of magnitude more thermal energy to the detector screen than a very weak light source, where for instance it may take on average 10 minutes for a single field quantum to arrive at the screen, and only 1% of that single quantum's energy can be absorbed by the screen (owing to the screen's size). Seems like asking for miracles - but maybe I'm missing something basic here.
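One way to put numbers on why ambient heat cannot stand in for the light: a visible photon carries roughly a hundred times kT at room temperature, so thermal fluctuations essentially never supply a work-function-scale energy in one go. A rough check (my own, using standard constants; the 500 nm wavelength is an illustrative choice):

```python
# Compare the energy of one visible photon with the typical thermal
# energy scale kT at room temperature.
h = 6.62607015e-34        # Planck constant, J s
c = 299_792_458.0         # speed of light, m/s
k_B = 1.380649e-23        # Boltzmann constant, J/K
wavelength = 500e-9       # green light, m (illustrative)
T = 300.0                 # room temperature, K

E_photon = h * c / wavelength
E_thermal = k_B * T
print(round(E_photon / E_thermal))   # 96: photon energy >> kT
```

That factor of ~100 in a Boltzmann exponent makes thermally driven emission astronomically rarer than photon-driven emission, consistent with the temperature-independence being described.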
Note that many real-life situations that actually happen in fact have extremely tiny probabilities.
For example, the probability that a given book unknown to you contains exactly the characters it contains, in exactly that arrangement, is incredibly small - far smaller than the ratio of the surface areas of the smallest and largest things currently known to exist.
Maybe this is a jest! Otherwise failing to see the connection. :zzz:
 
  • #23


al onestone said:
Ergo, consideration of an individual photon does not make sense from a QM perspective.

Sure it does. A "single photon" just refers to a number (Fock) state with one photon in it. It is very well defined.
Moreover, if "single photon" does not make sense, then what exactly is a single photon detector detecting?

The idea that QM only deals with ensembles is simply wrong, there are a lot of experiments where we deal with single quantum systems.
 
  • #24
Q-reeus said:
Perhaps I am being naive, but there seems to be a big problem with the spreading-wave model of field quanta (I know you dislike 'photon'). You said back then, as I recall it, that the detector screen simply steadily accumulates whatever partial energy per field quantum it can, for however long it takes, until there is sufficient for a 'trigger event' - i.e. photoemission. But this implies something special about that energy storage - why would it not simply leak away as heat into the environment,
Heat is also energy, and the screen _is_ the environment!
The heat intrinsic to the screen will already suffice for an occasional trigger event.
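For a sense of scale on thermally driven trigger events, one can sketch the Boltzmann suppression factor for an electron thermally acquiring a work function's worth of energy. The 2.1 eV work function (roughly that of caesium) and the 300 K temperature are illustrative assumptions:

```python
import math

k_B = 8.617e-5       # Boltzmann constant, eV/K

work_function = 2.1  # eV, assumed (roughly caesium)
T = 300.0            # K, room temperature

# Boltzmann factor for thermally reaching the work-function energy;
# this sets the scale of purely thermal 'dark' trigger events
boltzmann_factor = math.exp(-work_function / (k_B * T))

print(f"exp(-W/kT) at 300 K ~ {boltzmann_factor:.1e}")
```

The factor comes out of order 10⁻³⁶, so single-step thermal triggering over a full work function is fantastically suppressed at room temperature; real detectors nonetheless show a nonzero dark count from other mechanisms, which is part of what the ensuing posts dispute.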
Q-reeus said:
But how to reconcile with spreading wave model? There is something special about energy absorbed (and amazingly not lost through conduction/radiation/convection) by infrequent and partially absorbed light field quanta that environmental heat could not equally supply? Surely whatever the source, energy is very rapidly partitioned amongst conduction electrons - there is rapidly no memory of the source. Yet the photoemission rate will always be proportional to just the light incidence rate, and almost totally independent of environmental temperature, even though the latter could easily supply many orders of magnitude more thermal energy to detector screen than a very weak light source, where for instance it may take on average 10 minutes for a single field quanta to arrive at the screen, and only 1% of that single quanta's energy can be absorbed by the screen (owing to the screen's size). Seems like asking for miracles - but maybe I'm missing something basic here.

Maybe this is a jest! Otherwise failing to see the connection. :zzz:

It was supposed to demonstrate that a tiny probability doesn't mean that something will not happen. As long as many tiny probabilities sum up to one, one of the very unlikely events (and hence a corresponding miracle) is bound to happen, though it is as unlikely as any other. (But we wouldn't call it a miracle unless it is a very conspicuous event.)

Now, once probabilities are tiny enough, any observable effect disappears in the unavoidable noise.
Thus experiments can be conclusive only if the relevant probabilities are large enough to be statistically meaningful. In your scenario of a single photon spread over a spherical wave at a large distance, this is simply not the case.

In quantum mechanics, conservation laws are anyway valid only in the mean, and the unavoidable fluctuations may cause an occasional photoemission even without an external stimulus.

Thus your argument is empty.
 
  • #25
A. Neumaier said:
Heat is also energy, and the screen _is_ the environment!
Agree entirely about heat as energy, but by 'environment' I meant a larger space than the screen, into which any energy accumulated by the screen from incident light can leak - and at a rate such that sufficient accumulation via light is impossible to reconcile with the photoemission rate.
Now, once probabilities are tiny enough, any observable effect disappears in the unavoidable noise.
Thus experiments can be conclusive only if the relevant probabilities are large enough to be statistically meaningful. In your scenario of a single photon spread over spherical wave at large distance, this is simply not the case.
How so? The point made is that the average photoemission rate is always proportional to the light incidence level - even when that level is way too low to allow significant accumulation of energy at the screen. Hence the emission rate cannot be explained by a gradual energy-capture + trigger mechanism - only by a single point-ejection event where the entire energy is delivered in one blow, i.e. a point-particle photon.
In quantum mechanics, conservation laws are anyway valid only in the mean, and the unavoidable fluctuations may cause an occasional photoemission even without an external stimulus.
Not everyone here agrees with that claim re energy conservation, but regardless, random emission has no bearing on the fact that photoemission will be directly proportional to light incidence when the screen is cooled by e.g. liquid helium - and thus no reasonable energy-accumulation mechanism can be invoked.
Thus your argument is empty.
I think my hand-waving arguments so far are at least as good as yours!
 
  • #26
Q-reeus said:
How so? The point made is that the average photoemission rate is always proportional to the light incidence level - even when that level is way too low to allow significant accumulation of energy at the screen. Hence the emission rate cannot be explained by a gradual energy-capture + trigger mechanism - only by a single point-ejection event where the entire energy is delivered in one blow, i.e. a point-particle photon.

Was it not this peak / mean power ratio argument that clinched the quantisation idea? Until the integrated incident light power can raise the overall temperature to 'white heat' you won't get a 'significant number' of electrons from the metal surface because of the energy distribution. There has to be interaction of a photon with an individual electron. But 'photon' is only a word which describes what was observed. The deeper nature of the photon is not included in the description of the event but it still involves an interaction at a localised part of space. From what has been written in this thread, this suffices to call the photon a particle - but it's only a name and it doesn't imply strict bulletlikeness.
 
  • #27
sophiecentaur said:
Was it not this peak / mean power ratio argument that clinched the quantisation idea? Until the integrated incident light power can raise the overall temperature to 'white heat' you won't get a 'significant number' of electrons from the metal surface because of the energy distribution.
Might have you wrong here, but this sounds like maybe a confusion of thermionic emission (http://www.virginia.edu/ep/SurfaceScience/thermion.html) with photoemission (http://en.wikipedia.org/wiki/Photoelectric_effect). The latter reference agrees that greater light intensity means a greater emission rate - provided the threshold frequency is met or exceeded. But there is direct proportionality. And as argued earlier, this cannot be consistent with photon-as-spreading-wave when one takes into account the full spectrum of scenarios - e.g. a small screen, a low-temperature screen, very low incident light intensity, or combinations of these.
There has to be interaction of a photon with an individual electron. But 'photon' is only a word which describes what was observed. The deeper nature of the photon is not included in the description of the event but it still involves an interaction at a localised part of space. From what has been written in this thread, this suffices to call the photon a particle - but it's only a name and it doesn't imply strict bulletlikeness.
I used to think in terms of photon = soliton-like wavepacket: a corpuscle of maybe one to several cubic wavelengths in effective extent - and to make any sense that would mean bounded in all three spatial dimensions, with an extended 'tail' in the propagation direction to allow for finite frequency spread. That seems to be a common viewpoint. The trouble here again came when considering how that model works under SR. Suppose in a lab frame an antenna puts out photons of some 'effective wavelength' λ = c/f at a steady rate. Each such photon will have an effective cross-section ~ λ², so that if aimed at a screen with pinhole perforations, the probability of transmission should be tiny unless the pinhole diameters are at least comparable to λ.

But now consider the case where the source of photons is an antenna moving relativistically towards the screen - let's say with a gamma factor of 10. In order to produce photons having the same λ as before in the lab frame, in the frame of the antenna the wavelength would be ~10λ. It is a fact of SR transformations that lateral dimensions are unaffected by relative velocity. Hence the cross-section should in this case be ~(10λ)² - in the lab frame. So the nominally identical photon wrt frequency f now has a vanishingly small probability of transmission, owing to the hugely increased cross-section. And conversely, if the antenna were moving away from the screen with gamma = 10, the same requirement of λ photons in the lab frame would lead to 'identical' photons with cross-section ~(λ/10)² in the lab frame - and thus presumably a dramatically enhanced transmission rate.

Surely none of this can be true - afaik synchrotron radiation has no such odd characteristics. From this I conclude that the notion of a bounded, soliton-like wavepacket photon is inconsistent with experience. No doubt some expert in QED will vehemently disagree, but if so, kindly explain how to reconcile these matters. While I would personally be far more comfortable with a spreading-wave model if reconciliation with 2-slit interference were the sole criterion, it is not the only one. Maybe a 'pilot-wave' + particle concept fits it all - I don't pretend to know.
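As a numeric check on the Doppler reasoning above: the post takes the wavelength ratio to be the gamma factor, whereas for head-on longitudinal motion the exact relativistic factor is γ(1+β), about 2γ for large γ. The exact number only strengthens the scaling argument. A sketch:

```python
import math

gamma = 10.0
beta = math.sqrt(1.0 - 1.0 / gamma**2)

# Longitudinal relativistic Doppler factor lambda_source / lambda_lab
# for a source approaching the observer head-on...
doppler_approach = gamma * (1.0 + beta)
# ...and for a source receding
doppler_recede = gamma * (1.0 - beta)

# If the claimed cross-section scales as wavelength squared, the
# frame-dependent mismatch in the argument goes as the factor squared
cross_section_ratio = doppler_approach**2

print(f"approach factor : {doppler_approach:.2f}")   # ~19.95
print(f"recede factor   : {doppler_recede:.4f}")     # ~0.0501
print(f"sigma ratio     : {cross_section_ratio:.0f}")
```

The two factors are exact reciprocals (γ²(1−β²) = 1), so a fixed-extent photon of the same lab-frame λ would carry a source-frame transverse size differing by a factor of ~400 in cross-section between the approaching and receding cases - the inconsistency the post is driving at.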
 
  • #28
Q-reeus said:
Agree entirely about heat as energy, but by environment was meant a larger space than the screen, into which can leak any energy accumulated by the screen owing to incident light. And at a rate such that sufficient accumulation via light is impossible to reconcile with photoemission rate.
As long as the energy input is below the thermal noise level (and this is the case in your scenario), the details of balance and rates do not matter as no statistics can tell the difference. The heat delivered by the screen to the environment doesn't change this, as usually the two are in equilibrium, and tiny bits of energy received and/or lost don't make a statistical difference.
Q-reeus said:
How so? The point made is that average photoemission rate is always proportional to light incidence level - even when such level is way too low to allow significant accumulation of energy at the screen. Hence emission rate cannot be explained by gradual energy capture + trigger mechanism. Only by single point ejection event where entire energy is delivered in one blow - i.e. point particle photon.
An emission rate below what can be statistically ascribed to emission (rather than thermal noise) in a reproducible way needs no explanation.
Q-reeus said:
Not everyone here agrees with that claim re energy conservation,
That energy is conserved in the mean is a simple consequence of the Schroedinger equation. And energy conservation beyond that cannot be demonstrated from first principles, hence assuming it is dangerous in an area where experiments are lacking.
Q-reeus said:
but regardless, random emission has no bearing on that photoemission will be directly proportional to light incidence when the screen is coooled by e.g. liquid helium - and thus no reasonable energy accumulation mechanism can be envoked.
If one observes a few events only, one cannot say reliably whether they are due to one or the other of two very unlikely mechanisms operating simultaneously - in this case, triggering due to impact (unlikely because of a spread-out wave), or due to thermal effects (unlikely due to suppression by cooling).
I think one would have to do detailed calculations to see how big the two effects are, which probabilities are actually predicted for each effect, and how many observations are needed to decide with high confidence between the mechanisms.
Q-reeus said:
I think my hand-waving arguments so far are at least as good as yours!

So one must go from hand-waving to a more serious analysis to decide the issue.
 
  • #29
A. Neumaier said:
Originally Posted by Q-reeus:
"but regardless, random emission has no bearing on the fact that photoemission will be directly proportional to light incidence when the screen is cooled by e.g. liquid helium - and thus no reasonable energy-accumulation mechanism can be invoked."

If one observes a few events only, one cannot say reliably whether they are due to one or the other of two very unlikely mechanisms operating simultaneously - in this case, triggering due to impact (unlikely because of a spread-out wave), or due to thermal effects (unlikely due to suppression by cooling).
I think one would have to do detailed calculations to see how big the two effects are, which probabilities are actually predicted for each effect, and how many observations are needed to decide with high confidence between the mechanisms.
Sure, if choosing between those two it could be detailed and difficult, but a third option not mentioned is the partculate photon. The probability here is simply directly proportional to screen area and incident intensity, and the 'dark count' should be essentially zero in such circumstances. Hard to find specific experimental confirmation of my low-temperature claim, but there is no need to go to extremes here. For any regime where the energy dumped on the screen by incident light-as-spreading-wave drains away faster than it accumulates, there is an overall energy-deficit problem. Do you really think slapping a heat sink behind the screen would actually make a whit of difference to the count? Well, it is true there is a weak dependence of photoemission rate on temperature - but weak is the word.

And here's a home grown nugget re photoejection delay - https://www.physicsforums.com/showpost.php?p=1556442&postcount=36
At any rate there are other compelling reasons to believe in a particulate photon imo:
Photoionization: http://en.wikipedia.org/wiki/Photoelectrochemical_processes#Photoionization
Compton scattering: http://en.wikipedia.org/wiki/Compton_scattering
Gamma ray detection (Compton scattering is one means): http://imagine.gsfc.nasa.gov/docs/science/how_l2/gamma_detectors.html
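The all-or-nothing threshold behaviour cited from the photoelectric-effect reference above can be sketched in a few lines. The 2.1 eV work function (roughly that of caesium) and the two test wavelengths are illustrative assumptions:

```python
h = 6.626e-34     # Planck constant, J*s
c = 2.998e8       # speed of light, m/s
eV = 1.602e-19    # joules per electron-volt

def photon_energy_eV(wavelength_m):
    """Energy of a single photon, in electron-volts."""
    return h * c / wavelength_m / eV

def can_photoemit(wavelength_m, work_function_eV):
    """Emission is possible only if one photon's energy exceeds the
    work function; intensity sets the rate, not the threshold."""
    return photon_energy_eV(wavelength_m) >= work_function_eV

W = 2.1  # eV, assumed work function (roughly caesium)

print(can_photoemit(500e-9, W))  # 2.48 eV photon -> True
print(can_photoemit(700e-9, W))  # 1.77 eV photon -> False
```

However intense the 700 nm beam, no electrons come off in this model, while even very faint 500 nm light ejects them at a rate proportional to intensity - the threshold behaviour that the per-quantum (particle-like) energy delivery explains directly.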
 
  • #30
Q-reeus said:
Might have you wrong here but this sounds like maybe a confusing of thermionic emission

My last answer went somewhere - so here is another one.

Absolutely not. I was making the point that you would have to put enough energy into the metal to get it to white heat (thermionic, if you like) to do the job, for one electron that just one optical photon can do.
 
  • #31
Q-reeus said:
Sure if choosing between those two it could be detailed and difficult, but a third option not mentioned is partculate photon.

What is a particulate photon?
 
  • #32
sophiecentaur said:
Absolutely not. I was making the point that you would have to put enough energy into the metal to get it to white heat (thermionic, if you like) to do the job, for one electron that just one optical photon can do.
OK, agreed - I hadn't understood your point properly earlier. So, going back to your concluding remarks in #26, should one conclude we have to get by with a mathematical model having no conceptually clear physical structure? It appears so from most participants' remarks.
 
  • #33
A. Neumaier said:
What is a particulate photon?
I had misspelt it, but I'm sure you know what was meant - a highly localized entity assumed to contain the entire energy/momentum of said field quantum. No chance that was a loaded question? :rolleyes:
 
  • #34
Q-reeus said:
I had misspelt it, but I'm sure you know what was meant - a highly localized entity assumed to contain the entire energy/momentum of said field quantum. No chance that was a loaded question? :rolleyes:

If it is a single photon, spread out over a large sphere, then there is essentially no chance to detect it experimentally except by putting very sensitive detectors on a large fraction of the sphere. And if you do get a recording event somewhere, how do you know that it came from your source and not from somewhere else, unless you ensure that the whole huge sphere you were entertaining in your thought experiment is completely dark - an impossibility on a large scale.

Thus your 'particulate photon' assumption is quite infeasible to test.
 
  • #35
Q-reeus said:
OK, agreed - I hadn't understood your point properly earlier. So, going back to your concluding remarks in #26, should one conclude we have to get by with a mathematical model having no conceptually clear physical structure? It appears so from most participants' remarks.

I think so. It's too much of a luxury to expect everything in Science to relate back to what we already know in a comfortable way. There was the same problem in appreciating the consequences of SR, surely, and we have (mostly) got over that by now.
 
