RightFresh
Hi all, I have a question from a tutorial sheet that I'm stuck on. The question is:
Estimate the pressure at which a gas of argon atoms, at a temperature of 300 K, will begin to show deviations from the ideal gas behaviour due to the finite size of the atoms. Answer: Of order 10^9 Pa.
So I tried Taylor expanding the hard-sphere equation of state, P'(V - b) = NkT, to get P' = P(1 + b/V) to first order, where P = NkT/V is the ideal-gas pressure. However, I don't know whether this is the right approach, or what to do next if it is. Could someone point me in the right direction, please?
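One way to turn the expansion into a number: the correction term b/V becomes of order 1 when the volume per atom is comparable to the atom's own volume, so the pressure scale is P ~ kT / v_atom. A minimal sketch of that estimate, assuming a rough argon radius of about 1e-10 m (an assumed value, not from the problem sheet):

```python
import math

# Criterion: deviations set in when b/V ~ 1, i.e. when the volume per atom
# V/N shrinks to roughly the volume of one atom. Then P ~ N k T / V ~ k T / v_atom.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # temperature, K
r = 1.0e-10             # assumed rough atomic radius of argon, m
v_atom = (4.0 / 3.0) * math.pi * r**3   # volume of a single atom, m^3
P = k_B * T / v_atom    # pressure scale at which b/V ~ 1, Pa
print(f"P ~ {P:.1e} Pa")
```

With these inputs the estimate comes out around 10^9 Pa, matching the quoted answer; the exact prefactor depends on the radius you assume, so only the order of magnitude is meaningful.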