Gezstarski
A recent preprint (https://arxiv.org/abs/2112.15157) seems to imply that the focussing properties of an optical system can depend on the bandwidth of the sensor, and even on that of the associated electronics! It is argued there that if the 'frame rate' of the sensor is very high, photons taking paths of different (optical) lengths through the optics will no longer be recorded in the same frame and so cannot interfere with each other.
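Before going further, it is worth putting a timescale on that premise. The arrival-time spread between two paths differing by an optical path length ΔL is ΔL/c, so a 'frame' would have to be shorter than that spread for the preprint's argument to even apply. A quick sketch, with path-difference values I have chosen purely for illustration (not taken from the preprint):

```python
# Arrival-time spread between two paths differing by optical path
# length dL is dL/c.  The dL values below are illustrative, not
# taken from the preprint.
c = 3e8  # speed of light, m/s
for dL in (0.6e-6, 10e-6, 1e-3):     # wavelength-, micron-, mm-scale
    print(f"dL = {dL*1e6:8.1f} um  ->  spread = {dL/c*1e15:8.1f} fs")
# ~2 fs, ~33 fs and ~3.3 ps respectively: the sensor 'frame' would
# need to be shorter than this before two paths could fail to share
# a frame.
```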
I am not comfortable with the talk of photons interfering with one another – one should calculate the propagation as a wave and then evaluate the probability that a photon is absorbed, or interacts, at a given point, e.g. in a detector in the image plane. But the argument could be framed in terms of 'radiation' instead of 'photons', and it makes one think. Consider the following series of thought experiments:
Suppose we have a system with a light source, a chromatic lens, and an imaging detector in the focal plane.
1) If the source of illumination were pulsed on the femtosecond timescale, the spectrum would be broadened and the focal spot would be blurred by chromatic aberration (some rough numbers follow the list).
2) If the light source were monochromatic and continuous but a shutter acting on the same timescale were introduced in the light path before the lens, then the effect should presumably be the same.
3) What if the shutter is after the lens?
4) What if instead of a shutter the detector is active only for very brief intervals?
5) What if it is continuously active but able to record the time of detection of every photon with femtosecond precision?
6) In (5), does the form of the recorded image depend on how you select the photons to include in the analysis – e.g. only those arriving in short intervals, mimicking the pulses of a pulsed light source, versus all of them?
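To put rough numbers on (1) and (2): a pulse or gate of duration Δt is transform-limited to a spectral width Δν ≈ 1/Δt, i.e. Δλ ≈ λ²/(cΔt), and for a thin singlet with f ∝ 1/(n−1) the chromatic focal shift is Δf ≈ −f (dn/dλ) Δλ/(n−1). A minimal sketch, assuming a 100 fs gate at 600 nm and BK7-like values that I have filled in for illustration:

```python
# Order-of-magnitude estimate for (1)/(2).  All values are illustrative
# assumptions (BK7-like singlet), not taken from the preprint.
c   = 3.0e8        # speed of light, m/s
lam = 600e-9       # centre wavelength, m
dt  = 100e-15      # pulse / shutter duration, s

# Fourier (time-bandwidth) limit: a gate of duration dt imposes a
# spectral width of roughly 1/dt, i.e. dlam ~ lam^2 / (c * dt).
dlam = lam**2 / (c * dt)
print(f"spectral width ~ {dlam*1e9:.0f} nm")        # ~12 nm

# Thin singlet: f(lam) = K/(n(lam)-1), so df = -f*(dn/dlam)*dlam/(n-1).
f, n, dn_dlam = 0.1, 1.516, -4.0e4   # m, -, per metre (approx. BK7)
df = -f * dn_dlam * dlam / (n - 1.0)
print(f"chromatic focal shift ~ {df*1e6:.0f} um")   # ~90 um
```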
I believe that from (4) onwards the image is no longer blurred by chromatic aberration, which, if correct, implies that the logic in the preprint is ill-founded; the toy wave-optics sketch at the end of this post illustrates the distinction.
There are other issues with the remainder of the paper, but it would be good to get this one clear first.
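Finally, here is a toy 1-D model of the distinction between (2) and (4), in the spirit of 'propagate the wave, then read off the intensity at the detector'. It is only a sketch of what standard linear scalar diffraction predicts: every parameter is an assumption of mine, the lens dispersion is exaggerated roughly twentyfold over BK7 purely so the blur shows up on a coarse grid, and it takes no position on photon-level subtleties.

```python
import numpy as np

# Toy 1-D scalar-diffraction model of thought experiments (2) and (4).
# All parameters are illustrative assumptions (not from the preprint);
# the lens dispersion is exaggerated ~20x over BK7 purely so that the
# chromatic blur is visible on this coarse grid.
c, lam0, f0, dt = 3e8, 600e-9, 0.1, 100e-15
nu0 = c / lam0
n0, dndl = 1.516, -8.0e5           # n(lam) = n0 + dndl*(lam - lam0)

def focal(lam):
    """Thin-lens focal length, f proportional to 1/(n(lam) - 1)."""
    return f0 * (n0 - 1.0) / (n0 + dndl * (lam - lam0) - 1.0)

# spatial grid and a 5 mm (1/e) Gaussian beam filling the lens
N, dx = 32768, 1e-6
x  = (np.arange(N) - N // 2) * dx
fx = np.fft.fftfreq(N, dx)
A  = np.exp(-(x / 5e-3) ** 2)

def detector_intensity(lam):
    """Intensity at the fixed plane z = f0 for one wavelength:
    thin-lens quadratic phase, then a Fresnel transfer-function step."""
    E = A * np.exp(-1j * np.pi * x**2 / (lam * focal(lam)))
    H = np.exp(-1j * np.pi * lam * f0 * fx**2)
    return np.abs(np.fft.ifft(np.fft.fft(E) * H)) ** 2

def rms_width(I):
    p = I / I.sum()
    m = (x * p).sum()
    return np.sqrt(((x - m) ** 2 * p).sum())

# (2) shutter BEFORE the lens: gating the CW field in time broadens the
# spectrum entering the lens to ~1/dt.  Over a long exposure the cross
# terms between different frequencies average out, so the recorded
# pattern is the incoherent (intensity) sum over the gate's spectrum.
nus = nu0 + np.linspace(-1.0, 1.0, 41) / dt
wts = np.exp(-(np.pi * (nus - nu0) * dt) ** 2)   # Gaussian gate spectrum
I_shutter = sum(w * detector_intensity(c / nu) for w, nu in zip(wts, nus))

# (4) gated DETECTOR, CW illumination: in this model the field at the
# detector plane is monochromatic and stationary, so a fast gate merely
# rescales the exposure; the pattern is the ungated monochromatic one.
I_gated = detector_intensity(lam0)

print(f"(4) gated-detector spot:      {rms_width(I_gated) * 1e6:6.1f} um rms")
print(f"(2) shutter-before-lens spot: {rms_width(I_shutter) * 1e6:6.1f} um rms")
```

In this classical model the shutter-before-lens spot comes out several times wider than the gated-detector spot, which is the distinction between (2) and (4) that I have in mind; whether a photon-counting analysis as in (5)/(6) changes that is exactly the question.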