zdream8
I was wondering how to prove a result about an expanding black body.
There is a black body at a given temperature. All lengths are expanded by a factor of 2. Then it should still be a black body, but at a lower temperature.
I understand why this should happen, but I was wondering if anyone could show me how the proof works.
I found Planck's law for the spectral radiance,
I(\lambda, T) = \frac{2hc^2}{\lambda^5}\,\frac{1}{e^{hc/\lambda kT} - 1},
(sorry, I just copied it and that looks bad, but it's easy to find online)
but I wasn't really sure what to do with it. The wavelengths are obviously going to increase and the photon density is going to go down by appropriate factors...I'm just not sure how it all fits together.
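One way to see how it fits together is to check the scaling numerically. This is not a full proof, just a sketch under the assumption that every wavelength doubles while the temperature halves: the exponent hc/(λkT) is then unchanged, so the spectrum maps onto another Planck spectrum, reduced by an overall factor of 1/2⁵ = 1/32 from the λ⁵ in the denominator.

```python
import math

h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck(lam, T):
    """Planck spectral radiance: 2hc^2/lam^5 * 1/(exp(hc/(lam k T)) - 1)."""
    return (2 * h * c**2 / lam**5) / (math.exp(h * c / (lam * k * T)) - 1)

T = 5000.0   # original temperature in K (arbitrary choice for the check)
s = 2.0      # expansion factor

# After expansion each wavelength becomes s*lam; if T -> T/s the exponent
# hc/(lam k T) is unchanged, so the ratio of spectra is the same 1/s^5
# at every wavelength -- i.e. the spectrum is still a black-body shape.
for lam in (3e-7, 5e-7, 1e-6, 2e-6):
    ratio = planck(s * lam, T / s) / planck(lam, T)
    print(lam, ratio)   # each ratio is 1/32 = 0.03125
```

Since the ratio is wavelength-independent, the expanded radiation field has exactly the shape of a Planck spectrum at T/2, which is the claim in the problem.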
Thanks. :)