How Does the Phase Difference at Detector D Arise from Sources A and B?

  • #1
pkielkowski
Sources A and B lie on the horizontal x-axis and both emit a long-range radio wave of wavelength 400 m, with the phase of emission from A ahead of that from B by 90 degrees. The distance r(A) from source A to the detector D on the y-axis is greater than r(B) by 100 m. What is the phase difference of the waves at D? (Both waves are directed toward point D.)

So far I have:

path difference: $\Delta x = (m+1)\frac{\lambda}{2}$

so then

I got $\Delta \phi = \frac{2\pi}{\lambda}\Delta x$

$\Delta x = \frac{\Delta \phi\,\lambda}{2\pi}$

I'm not sure where to go from here
 
  • #2
pkielkowski said:
Sources A and B lie on the horizontal x-axis and both emit a long-range radio wave of wavelength 400 m, with the phase of emission from A ahead of that from B by 90 degrees. The distance r(A) from source A to the detector D on the y-axis is greater than r(B) by 100 m. What is the phase difference of the waves at D? (Both waves are directed toward point D.)

So far I have:

path difference: $\Delta x = (m+1)\frac{\lambda}{2}$

so then

I got $\Delta \phi = \frac{2\pi}{\lambda}\Delta x$

$\Delta x = \frac{\Delta \phi\,\lambda}{2\pi}$

I'm not sure where to go from here

Hey pkielkowski! Welcome to MHB!

You've got $\Delta \phi = \frac{2\pi}{\lambda}\Delta x$.
That's the contribution to the phase difference from the extra path length that A's wave travels.
Just substitute $\Delta x = 100\text{ m}$ and $\lambda = 400\text{ m}$.
Since A's emission is ahead by 90 degrees, subtract those 90 degrees from the path contribution to get the actual phase difference at D.
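
To check with the numbers from the problem (this is just the substitution spelled out):

$$\Delta \phi_{\text{path}} = \frac{2\pi}{\lambda}\,\Delta x = \frac{2\pi}{400\ \text{m}} \cdot 100\ \text{m} = \frac{\pi}{2}\ \text{rad} = 90^\circ$$

The extra 100 m of travel puts A's wave 90° behind, which exactly offsets its 90° head start at emission, so the two waves arrive at D in phase.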
 