Thread moved from the technical forums to the schoolwork forums
TL;DR Summary: A weak Fe I line with a lower-level excitation energy of 2 eV has an equivalent width twice as large in star A as in star B. Find the temperature difference between the line-forming layers of the two stars, under the assumptions stated below.
A weak spectral line connecting levels in neutral iron has been observed for a number of solar-type stars.
Its lower level has an excitation energy of 2 eV. If the line’s equivalent width is twice as large for star A
as for star B, how great is the difference in temperature (in the layers where the line is formed) between
the two stars? Assume that no hydrogen is ionized, nearly all iron is singly ionized, H− is responsible for
all the continuous opacity, the partition functions are independent of temperature, and both stars have the
same iron abundance. The dissociation energy of H− is 0.75 eV.
Solution:
Let the temperature difference be dT = T_A - T_B.
Setting x = T_A/T_B and dW_A = 2 dW_B, and using both the Boltzmann equation and the Saha equation, I get (after a long derivation):
2 = x^{3/2} * e^{(2 eV / (k*T_B)) * (x − 1)}.
Finally, taking the natural logarithm of both sides, I obtain:
ln(2) = (3/2)*ln(x) + (2 eV / (k*T_B))*(x − 1).
At this step I am stuck: how am I supposed to find x from this equation? Or have I used the wrong method for this exercise?
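The equation above is transcendental in x, so it has no closed-form solution; one way to make progress is to solve it numerically for an assumed value of T_B. A minimal sketch with simple bisection, assuming a solar-like T_B = 5800 K (a value not given in the problem, chosen only to illustrate the numerical step):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def f(x, T_B):
    """Residual of ln(2) = (3/2) ln(x) + (2 eV / (k T_B)) (x - 1)."""
    return 1.5 * math.log(x) + (2.0 / (K_B * T_B)) * (x - 1.0) - math.log(2.0)

def solve_x(T_B, lo=1.0001, hi=2.0, tol=1e-10):
    """Bisection root-finder: f(x, T_B) is increasing for x >= 1,
    so there is a single root in [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid, T_B) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

T_B = 5800.0                 # assumed, solar-like; not given in the problem
x = solve_x(T_B)             # x = T_A / T_B
dT = (x - 1.0) * T_B         # dT = T_A - T_B
print(x, dT)                 # x ≈ 1.13, dT ≈ 740 K for this assumed T_B
```

The same bisection works for any other choice of T_B; the point is only that once T_B is fixed (or estimated for "solar-type" stars), x and hence dT follow numerically from the equation.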