I was attempting to solve a problem from IPhO 2012 but ran into trouble with some concepts. I'd appreciate it if someone could help me.
Let us model the formation of a star as follows. A spherical cloud of sparse interstellar gas, initially at rest, starts to collapse due to its own gravity. The initial radius of the ball is r0 and the mass is m. The temperature of the surroundings (much sparser than the gas) and the initial temperature of the gas is uniformly T0. The gas may be assumed to be ideal. The average molar mass of the gas is μ and its adiabatic index is γ > 4/3. Assume that Gmμ/r0 ≫ RT0, where R is the gas constant and G is the gravitational constant.
a) "During much of the collapse, the gas is so transparent that any heat generated is immediately radiated away, i.e. the ball stays in thermodynamic equilibrium with its surroundings (...)"
b) At some radius r3 ≪ r0, the gas becomes dense enough to be opaque to the heat radiation. Calculate the amount of heat Q radiated away during the collapse from the radius r0 down to r3.
I solved b) like this: Q = ΔU + W, where ΔU = ΔEt + ΔEp (Et is the thermal energy and Ep is the gravitational potential energy of the system).
As T = constant, ΔEt = 0, so Q = ΔEp + W.
As the gas is "sparse", P is very low, so W ≪ |ΔEp| (the assumption Gmμ/r0 ≫ RT0 gives nRT0 ≪ Gm²/r0, so the pressure work is negligible next to the potential-energy change).
That way Q = ΔEp = (3/5)Gm²(1/r0 − 1/r3) ≈ −(3/5)Gm²/r3.
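For reference, the 3/5 factor here is the gravitational self-energy of a uniform-density sphere; a minimal sketch of where it comes from, assuming the cloud stays uniform (assemble the sphere shell by shell, with interior mass m(s) = m s³/r³ and shell mass dm = (3m/r³)s² ds):

$$E_p(r) = -\int_0^r \frac{G\,m(s)}{s}\,dm = -\frac{3Gm^2}{r^6}\int_0^r s^4\,ds = -\frac{3}{5}\,\frac{Gm^2}{r},$$

so ΔEp = Ep(r3) − Ep(r0) = (3/5)Gm²(1/r0 − 1/r3).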
But the solution is the following:
As T = constant, ΔU = 0, so Q = W = nRT ln(V3/V0) = −3nRT ln(r0/r3)
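The logarithm step is just the quasistatic isothermal work for n = m/μ moles, assuming the compression is slow enough that p = nRT0/V holds throughout, with V ∝ r³:

$$W = \int_{V_0}^{V_3} p\,dV = nRT_0 \ln\frac{V_3}{V_0} = 3nRT_0 \ln\frac{r_3}{r_0} = -3nRT_0 \ln\frac{r_0}{r_3}.$$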
Why did they assume ΔU = 0? As far as I know, the only quantity that depends only on the temperature (when n is constant) is Et, not U. For systems in which ΔEp = 0 we have ΔEt = ΔU, but in our case ΔEp ≠ 0, so why should we say ΔU = 0?