alikim
Suppose there are two ways for a photon to travel from point A to point B, and a detector at B detects photons with a probability determined by interference.
Mathematically, this is calculated by carrying a phase along each path and then adding the amplitudes, regardless of how long the travel along each path takes or when the interference appears.
But in practice, if one path is much longer than the other, how long do you have to wait after a photon passes point A until the moment of detection at B?
Is there a formula for the speed of this process between A and B?
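To make the phase-summing part of the question concrete, here is a minimal sketch (not from the original post; the wavelength and normalization are assumptions) of how the two-path detection probability is computed: each path contributes an amplitude with phase $kL$ set by its length, the amplitudes are added, and the result is squared.

```python
import numpy as np

# Assumed photon wavelength (m); any value works for illustration.
wavelength = 500e-9
k = 2 * np.pi / wavelength   # wavenumber

def detection_probability(dL):
    """Relative detection probability at B for a path-length
    difference dL, with the common phase of path 1 factored out
    and the result normalized so that dL = 0 gives 1."""
    amplitude = 1 + np.exp(1j * k * dL)   # path 1 + path 2
    return abs(amplitude) ** 2 / 4

print(round(detection_probability(0.0), 6))             # equal paths: constructive
print(round(detection_probability(wavelength / 2), 6))  # half-wave difference: destructive
```

Note that this calculation only involves the path-length difference, not the travel times, which is exactly why it says nothing about how long one must wait for the detection at B, the point the question is asking about.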