Homework Statement
Suppose that A', B', and C' are at rest in frame S', which moves with respect to S at speed v in the +x direction. Let B' be located exactly midway between A' and C'. At t' = 0, a light flash occurs at B' and expands outward as a spherical wave. (A', B', and C' are all on the +x axis, with A' having the smallest x coordinate and C' having the largest x coordinate. Assume A'B' = B'C' = Lp.)
What is the difference between the time it takes the wave front to reach A' and the time it takes to reach C'? (Use the following as necessary: v, c, and Lp.)
Homework Equations
$$ \begin{align}
L & = \frac{L_\text{p}}{\gamma} \\
\Delta t & = \gamma \Big( \Delta t' + \frac{v}{c^2}\,\Delta x' \Big) \\
\end{align} $$
The Attempt at a Solution
I think an observer in S' would see the two events as simultaneous, so the time interval should be 0. But this is not the correct answer, so I will present an alternate attempt at rationalizing the situation.
An observer in S measures the separations AB = BC, which are contracted from the proper lengths A'B' = B'C' according to:
$$ \text{AB} = \text{BC} = \frac{\text{A'B'}}{\gamma} = \frac{L_\text{p}}{\gamma} $$
Since the wave is electromagnetic, it travels at c in S as well, while A' moves toward the oncoming wave front at v and C' moves away from it at v, so the times taken to reach them are
$$ \Delta t_\text{AB} = \Bigg( \frac{L_\text{p}}{\gamma} \Bigg) \Bigg(\frac{1}{c + v} \Bigg) = \frac{L_\text{p}}{\gamma (c + v)} \\
\Delta t_\text{BC} = \Bigg( \frac{L_\text{p}}{\gamma} \Bigg) \Bigg(\frac{1}{c - v} \Bigg) = \frac{L_\text{p}}{\gamma (c - v)} \\
\implies \Delta t = \Delta t_\text{BC} - \Delta t_\text{AB} = \frac{L_\text{p}}{\gamma}\Bigg(\frac{2v}{c^2 - v^2} \Bigg)
$$
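As a quick numerical sanity check of the algebra above (the sample values v = 0.6c and Lp = 1 light-second are my own choices, not part of the problem):
```python
# Numerical sanity check of the S-frame intervals above.
# v = 0.6c and Lp = 1 (light-second) are sample values I picked,
# not part of the problem statement.
from math import sqrt

c = 1.0           # units where c = 1
v = 0.6 * c       # assumed speed of S' relative to S
Lp = 1.0          # assumed proper length A'B' = B'C'

gamma = 1.0 / sqrt(1.0 - v**2 / c**2)

dt_AB = Lp / (gamma * (c + v))   # front reaches A' (closing speed c + v)
dt_BC = Lp / (gamma * (c - v))   # front reaches C' (closing speed c - v)

print(dt_BC - dt_AB)                           # direct difference: 1.5
print((Lp / gamma) * 2 * v / (c**2 - v**2))    # the boxed expression: also 1.5
```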
Now, I think the time interval being asked for is the one measured in S' (which would not be the proper time, right?), so I'll apply the inverse Lorentz transformation:
$$ \begin{align}
\Delta t' & = \gamma \big(\Delta t + \frac{v}{c^2} \Delta x \big) \\
& = \gamma \Bigg(\Delta t + \frac{v}{c^2} v \Delta t \Bigg) \\
& = \gamma \Delta t \Bigg( 1 + \frac{v^2}{c^2} \Bigg) \\
& = \gamma \frac{L_\text{p}}{\gamma}\Bigg(\frac{2v}{c^2 - v^2} \Bigg) \Bigg( 1 + \frac{v^2}{c^2} \Bigg) \\
& = L_\text{p} \Bigg(\frac{2v}{c^2 - v^2} \Bigg) \Bigg( 1 + \frac{v^2}{c^2} \Bigg)
\end{align}$$
But this is incorrect. The correct answer is:
$$ \Delta t = L_\text{p} \Bigg(\frac{2v}{c^2 - v^2} \Bigg) $$
I would get this numerically by treating ##\Delta x##, above, as 0; but I feel like that doesn't make physical sense, since ##\Delta x## is the distance B' travels in the time between the wave-front arrivals.
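For comparison, here is the same numerical check (same assumed sample values as above) carried through my transformation attempt and set against the stated correct answer; it reproduces the extra factor of ##1 + v^2/c^2## and shows that I only recover the book's number when ##\Delta x## is set to 0:
```python
# Same assumed sample values as above (c = 1, v = 0.6c, Lp = 1).
from math import sqrt

c, v, Lp = 1.0, 0.6, 1.0
gamma = 1.0 / sqrt(1.0 - v**2 / c**2)

dt_S = (Lp / gamma) * 2 * v / (c**2 - v**2)     # S-frame difference from my attempt

my_result   = gamma * dt_S * (1 + v**2 / c**2)  # my Delta t' using Delta x = v * Delta t
book_answer = Lp * 2 * v / (c**2 - v**2)        # the stated correct answer
with_dx_0   = gamma * dt_S                      # my Delta t' if Delta x is taken as 0

print(my_result)     # 2.55  -- too large by the factor (1 + v^2/c^2)
print(book_answer)   # 1.875
print(with_dx_0)     # 1.875 -- matches the book only when Delta x = 0
```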