With laser pointers being so ubiquitous, everyone is familiar with the sight of interference patterns on paper, ground glass, and other surfaces (not to mention subtler experiments like the Quantum Eraser, which has been discussed recently in other threads). We take it for granted that interference patterns on the ground glass effectively behave like glowing phosphor and cast corresponding images on our retinas.
But then, the actual "detectors" in such experiments are obviously not on the ground glass surface, but on the human retina. It is on the retina that waves add and cancel and produce the interference fringes. Yet we see these fringes as if the paper were coated with some kind of phosphor, which was absorbing incident photons and re-emitting new photons towards the observer's eye, according to the local detection probability on the screen surface.
On the one hand, it seems intuitively reasonable to expect that more photons would scatter from surfaces where more photons would have been detected IF there had been detectors there. On the other hand, it seems like a bit of a coincidence that those randomly scattered laser waves should create interference maxima on the retina at those very places where the detector would have cast its phosphorescent image, if there had been a tiny detector on the surface of the ground glass.
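The closest I can get to convincing myself is a back-of-envelope sketch. It assumes the eye simply images the diffuser surface, and it treats the ground glass as a thin random phase screen (my own simplifications, not anything established here). The glass multiplies the incident field by a unit-modulus random transmittance,

$$E_{\text{scat}}(x) = t(x)\,E_{\text{inc}}(x), \qquad t(x) = e^{i\phi(x)}, \quad |t(x)| = 1,$$

and the eye's lens maps each screen point $x$ to a conjugate retinal point $x'$, so

$$I_{\text{ret}}(x') \;\propto\; |E_{\text{scat}}(x)|^2 \;=\; |t(x)|^2\,|E_{\text{inc}}(x)|^2 \;=\; I_{\text{screen}}(x).$$

The random phase $\phi(x)$ drops out of the intensity, so on average the retinal image tracks the local intensity on the glass just as a phosphor coating would, with the finite pupil adding only speckle on top. At least, that is how I picture it; corrections welcome.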
So my question is: is there an explanation for this that can be understood without too much mathematics? And secondly, are there exceptions to this -- i.e. can the image on the retina have a less simple relation to the local intensity on the screen surface?
Thanks,
S T