- #1
mathman44
Hi all. I am asking for some help with getting a physical intuition for the following situation:
Consider a source and an observer, both at rest relative to the ground. The source emits sound waves of a particular frequency, 'f'. If the medium (air, for example) between the source and the observer moves with some velocity, why is it that the frequency of the emitted waves, as detected by the observer, remains unchanged?
It is intuitively clear that the speed of the emitted wave relative to the ground increases as the medium gains speed, but it is less clear why the frequency detected by the observer remains unchanged.
We all know from experience that sounds in the distance, traveling through moving air, aren't modified in frequency. But why must this be so, physically?
Any insight would be greatly appreciated!
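The cancellation being asked about can be checked with a quick numeric sketch. The numbers below (speed of sound, wind speed, emitted frequency) are illustrative assumptions, with the wind taken to blow straight from the source toward the observer: the wind speeds up the wave, but it stretches the wavelength by the same factor, so the rate at which crests arrive is unchanged.

```python
c = 343.0   # speed of sound in still air, m/s (assumed)
w = 10.0    # wind speed from source toward observer, m/s (assumed)
f = 440.0   # frequency emitted by the source, Hz (assumed)

# In the ground frame the wave travels at c + w. Each crest is also
# carried downwind, so the spacing between crests is stretched:
v_ground = c + w
wavelength = v_ground / f

# The observer counts crests passing per second: speed / spacing.
f_observed = v_ground / wavelength
print(f_observed)  # the two wind effects cancel, recovering f
```

The same cancellation happens for a headwind (replace `w` with `-w`): the wave is slower but the crests are packed closer together, and the observed frequency is again `f`.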