skibum143
Homework Statement
Two antennas located at points A and B are broadcasting radio waves of frequency 96.0 MHz, perfectly in phase with each other. The two antennas are separated by a distance d=12.40m. An observer, P, is located on the x axis, a distance x=55.0m from antenna A, so that APB forms a right triangle with PB as hypotenuse. What is the phase difference between the waves arriving at P from antennas A and B? Use units of "rad" for the answer. (If you are stuck, read the hint.)
Homework Equations
1 wavelength = 2π radians
mλ/d = y/L (double slit: d = slit separation, y = distance between maxima, L = distance to screen)
λ = c/f
The Attempt at a Solution
First, I converted 96.0 MHz to a wavelength: λ = c/f = (3.00×10^8 m/s)/(96.0×10^6 Hz) = 3.125 m.
Then I wanted to find the path lengths AP and BP, but I don't know how to do this without knowing the distance between maxima (y). If I had y, I could find the order m, and then I would know the path-length difference. How do I solve for this?
Finally, I will convert the path-length difference to radians using the relation above (1 wavelength = 2π rad). I just don't understand how to get the path-length difference.
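For what it's worth, since APB is a right triangle with the right angle at A, the path lengths follow directly from the geometry (AP = x is given, BP is the hypotenuse via the Pythagorean theorem), so no slit/maxima formula is needed. A minimal numeric sketch of that approach (variable names are my own):

```python
import math

# Given values from the problem statement
f = 96.0e6   # frequency in Hz
c = 3.00e8   # speed of light in m/s
d = 12.40    # antenna separation AB in m
x = 55.0     # distance AP in m

wavelength = c / f  # = 3.125 m

# Right angle at A: AP = x directly, BP is the hypotenuse
AP = x
BP = math.sqrt(x**2 + d**2)

path_diff = BP - AP  # extra distance the wave from B travels
phase_diff = 2 * math.pi * path_diff / wavelength  # 1 wavelength = 2*pi rad

print(f"BP = {BP:.4f} m, path difference = {path_diff:.4f} m")
print(f"phase difference = {phase_diff:.3f} rad")
```

The key step is that the path-length difference here comes from the triangle geometry, not from a fringe-spacing relation as in a double-slit screen setup.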