Samama Fahim
Consider a homogeneous free scalar field ##\phi## of mass ##m## which has a potential
$$V(\phi) = \frac{1}{2}m^2\phi^2.$$
Show that, for ##m \gg H##, the scalar field undergoes oscillations with frequency given by ##m## and that its energy density dilutes as ##a^{-3}##.
This is from Modern Cosmology, Scott Dodelson, Chapter 6.
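For context, the equation of motion for a homogeneous scalar field in an FRW background (the standard Klein–Gordon equation with Hubble friction) is
$$\ddot{\phi} + 3H\dot{\phi} + m^2\phi = 0.$$
For ##m \gg H## the friction term is subdominant over one oscillation period, so the solution is approximately ##\phi(t) \approx A(t)\cos(mt + \varphi)## with a slowly varying amplitude ##A(t)##; this is the sense in which the field oscillates with frequency ##m##.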
For the part "Show that its energy density dilutes as ##a^{-3}##," the following is my attempt:
In the equation ##\frac{\partial \rho}{\partial t} = -3H(P+\rho)##, put ##P = \frac{1}{2} \dot{\phi}^2-V(\phi)## and ##\rho=\frac{1}{2} \dot{\phi}^2+V(\phi)## to get
$$\frac{\partial \rho}{\partial t} = -3\frac{\dot{a}}{a}\dot{\phi}^2,$$
where ##H = \dot{a}/a##. I am not sure how to proceed, or whether this is the correct approach. Should I use Friedmann's equation instead? But it involves the densities of other species as well, and there is no assumption here that one species dominates. Or should I convert ##d\rho/dt## to ##d\rho/da##?
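One possible continuation (a sketch on my part, not necessarily the intended route): for ##m \gg H##, use the oscillatory solution above, ##\phi(t) \approx A(t)\cos(mt)## with ##A## slowly varying. Averaging ##\dot{\phi}^2## over one oscillation period gives
$$\langle \dot{\phi}^2 \rangle = \frac{1}{2}m^2A^2 = \rho,$$
so the equation above becomes ##\langle \dot{\rho} \rangle \approx -3H\rho##, whose solution is ##\rho \propto a^{-3}##.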
Kindly provide a hint as to whether the sketch above is on the right track, or how else to proceed.
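Independently of the analytic argument, here is a quick numerical sanity check of the claimed ##a^{-3}## scaling (a minimal sketch: it assumes a matter-dominated toy background ##a \propto t^{2/3}## so that ##H = 2/(3t)##, with parameter values chosen only so that ##m \gg H## holds throughout; none of this comes from the problem itself):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy background (an assumption for this check, not from the problem):
# matter domination, a(t) = t^(2/3), so H(t) = adot/a = 2/(3t).
def a(t):
    return t ** (2.0 / 3.0)

def H(t):
    return 2.0 / (3.0 * t)

m = 50.0  # field mass; chosen so m >> H(t) over the whole run

# Homogeneous Klein-Gordon equation: phi'' + 3 H phi' + m^2 phi = 0
def rhs(t, y):
    phi, phid = y
    return [phid, -3.0 * H(t) * phid - m**2 * phi]

t0, t1 = 1.0, 20.0  # H(t0) = 2/3, so m/H >= 75 throughout
sol = solve_ivp(rhs, (t0, t1), [1.0, 0.0],
                rtol=1e-10, atol=1e-12, dense_output=True)

ts = np.linspace(t0, t1, 4000)
phi, phid = sol.sol(ts)
rho = 0.5 * phid**2 + 0.5 * m**2 * phi**2

# If rho ~ a^{-3}, then rho * a^3 should stay (nearly) constant,
# apart from small oscillatory corrections of order H/m.
q = rho * a(ts) ** 3
print(f"rho*a^3: min={q.min():.5f}, max={q.max():.5f}, "
      f"max/min={q.max()/q.min():.5f}")
```

If the ##a^{-3}## scaling holds, the printed max/min ratio should stay close to 1, up to corrections of order ##H/m##.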